Bridging Virtual Worlds and Human Interaction

Welcome to my portfolio. I am a Ph.D. student specializing in XR, AR, and VR technologies, passionate about creating immersive and realistic virtual experiences. My work focuses on bridging the gap between technology and human interaction through innovative projects in virtual environments, photogrammetry, and 3D modeling. Explore my journey, skills, and accomplishments, and see how I can contribute to your team's success.

Filmmaking Meets VR: Location Scouting

Currently, I am working on a project that simulates the location scouting process for the film industry in downtown Lafayette. I am developing a VR experience that lets filmmakers explore and scout locations virtually. The project aims to streamline scouting by offering a cost-effective, efficient alternative to visiting potential filming sites in person.

Published in ISMAR 2024

You can also explore our Location Scouting Checklist to experience a practical tool we developed for filmmakers.

VR Nursing Visit Simulator

One standout project I have worked on is a community visit simulator, a VR application that trains future nurses by simulating visits to patients’ homes. By immersing trainees in realistic virtual environments, the simulator helps them identify potential hazards and sharpen their skills in a controlled, safe setting.

VR Visual Acuity Test

A research study on visual acuity testing in VR, exploring the use of Snellen charts within virtual environments. The work compared VR-based tests with traditional methods and highlights VR's potential as an accessible tool for preliminary vision screening. Published in IEEE VR 2024.
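
For context, the sizing math behind an in-VR Snellen chart is straightforward: a 20/20 optotype subtends 5 arcminutes of visual angle, and larger lines scale proportionally. The sketch below is my own illustration of that geometry, not the study's actual code; it computes the physical letter height needed at a given simulated viewing distance:

    import math

    ARCMIN = math.pi / (180 * 60)  # one arcminute in radians

    def optotype_height(distance_m: float, snellen_denominator: float) -> float:
        """Letter height (m) so a 20/X Snellen optotype subtends its
        standard visual angle (5 arcmin * X/20) at the viewing distance."""
        angle = 5 * (snellen_denominator / 20) * ARCMIN
        return 2 * distance_m * math.tan(angle / 2)

    # Letter heights for a virtual chart placed 6 m (~20 ft) away.
    for denom in (200, 100, 70, 50, 40, 30, 20):
        print(f"20/{denom}: {optotype_height(6.0, denom) * 1000:.1f} mm")

At 6 m this gives the familiar ~8.7 mm height for the 20/20 line, scaling up to ~87 mm for 20/200.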

EyeVR Evolves to AR

Building on our VR-based visual acuity tests, I am now exploring Mixed Reality (MR) and Augmented Reality (AR) with the Quest Pro's spatial capabilities for improved accuracy. This project integrates eye-tracking to study gaze patterns in low vision and examines VR headset optical limitations for accurate vision testing. The work is ongoing and unpublished.
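
One way to frame the optical-limitation question is pixel density: a display needs roughly 60 pixels per degree to present one-arcminute detail, the threshold for 20/20 vision. The back-of-envelope sketch below assumes one pixel per detail, and the per-headset PPD figures are approximate, commonly cited specs rather than measurements from this project:

    def snellen_limit(ppd: float, pixels_per_detail: float = 1.0) -> float:
        """Approximate best Snellen denominator a display can present.
        Smallest detail = pixels_per_detail * 60 / ppd arcminutes, and a
        20/X optotype has detail of X/20 arcminutes, so X = 20 * detail."""
        detail_arcmin = pixels_per_detail * 60.0 / ppd
        return 20.0 * detail_arcmin

    # Approximate pixels-per-degree for some consumer headsets.
    for name, ppd in [("Quest 2", 20), ("Quest Pro", 22), ("Varjo Aero", 35)]:
        print(f"{name} (~{ppd} PPD): display-limited acuity ~ 20/{snellen_limit(ppd):.0f}")

Under these assumptions a ~22 PPD headset bottoms out around the 20/55 line, which is why headset optics matter for vision testing.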

Other Projects

XR/VR experiments, collaborations, and technical demos.

SFG-VR: Distributed VR Renderer

A distributed VR rendering system that offloads visual computation to two networked PCs. The Meta Quest (master) sends headset transforms to slave PCs, which render separate halves of the view. Their outputs are stitched in real time and displayed in VR.

This is an extension of the SFG architecture from ATC’24. It features Unity + WebRTC networking, stereo occlusion rendering, and live UI-based IP configuration.
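
The system itself is built in Unity with WebRTC; as a language-neutral sketch of the data flow described above (master pose broadcast, per-node half-view rendering, row-wise stitching), here is a minimal Python illustration. The message fields and function names are my assumptions, not the SFG-VR wire format:

    import json
    from dataclasses import dataclass

    @dataclass
    class HeadPose:
        frame_id: int
        position: tuple   # (x, y, z) in metres, world space
        rotation: tuple   # quaternion (x, y, z, w)

    def pose_message(pose: HeadPose) -> str:
        """Serialize the headset transform for the data channel to each node."""
        return json.dumps({"frame": pose.frame_id,
                           "pos": pose.position, "rot": pose.rotation})

    def assign_viewports(width: int, height: int) -> dict:
        """Split the eye buffer into left/right halves, one per render node."""
        half = width // 2
        return {"node_left":  (0,    0, half, height),   # (x, y, w, h)
                "node_right": (half, 0, half, height)}

    def stitch(left: bytes, right: bytes, height: int) -> bytes:
        """Interleave the two rendered halves row by row into one frame."""
        lw, rw = len(left) // height, len(right) // height
        return b"".join(left[y * lw:(y + 1) * lw] + right[y * rw:(y + 1) * rw]
                        for y in range(height))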

View on GitHub →

VR Nursing Simulator Featured in La Louisiane Magazine

Our interdisciplinary research on the VR Nursing Visit Simulator was spotlighted in the Spring 2024 issue of La Louisiane, UL Lafayette’s official magazine.

The article, titled "Virtual Reality Puts Health Care’s Future in Focus", covers how students and faculty from Nursing, Psychology, and Computer Science collaborated to simulate home visits and improve nursing education through VR.

Roberto Salazar helped adapt the system using Unity and Meta Quest to allow immersive, interactive home safety evaluations.

📖 Read the article in La Louisiane →

AR Motivation for Reading (HoloLens + GPT)

This HoloLens 2 prototype uses AR to boost motivation during difficult readings. A mounted external webcam enhances document capture, and OCR is applied in real time to detect and extract text from the user’s view.

The recognized text is corrected and passed to GPT-4 to generate voice-based explanations, motivational insights, or fun facts, adapting dynamically to the user's educational level, from elementary school to Ph.D.
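
As a rough desktop analogue of that capture-OCR-GPT pipeline (the HoloLens 2 app itself is a Unity build; the prompt wording and reading-level parameter below are illustrative assumptions on my part), the flow looks roughly like this:

    import cv2                  # pip install opencv-python
    import pytesseract          # pip install pytesseract (requires Tesseract)
    from PIL import Image
    from openai import OpenAI   # pip install openai

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    def capture_text() -> str:
        """Grab one webcam frame and extract visible text with OCR."""
        ok, frame = cv2.VideoCapture(0).read()
        if not ok:
            raise RuntimeError("webcam capture failed")
        image = Image.fromarray(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        return pytesseract.image_to_string(image)

    def explain(text: str, level: str = "undergraduate") -> str:
        """Ask GPT-4 to correct OCR errors and explain the passage
        at the requested educational level."""
        response = client.chat.completions.create(
            model="gpt-4",
            messages=[
                {"role": "system",
                 "content": "Fix OCR errors in the passage, then explain it "
                            f"and add one motivating fun fact, at a {level} level."},
                {"role": "user", "content": text},
            ],
        )
        return response.choices[0].message.content

    if __name__ == "__main__":
        print(explain(capture_text(), level="elementary"))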

Future versions aim to reduce cognitive load by integrating non-intrusive motivational feedback, such as memes, GIFs, or visual highlights that enrich the reading experience without distraction.

3D Scan of the LITE Center

A detailed photogrammetry model of the LITE building in Lafayette, LA, created using a DJI Mini 2 drone and RealityCapture. This project explores drone-based 3D scanning for digital preservation and integration into virtual environments.

Tools: DJI Mini 2, RealityCapture
Location: Louisiana Immersive Technologies Enterprise (LITE), Lafayette, LA

#Photogrammetry #RealityCapture #DroneMapping

Urgent, Hard Problems I Solve Fast

I help labs and startups build functional AI/VR prototypes fast.

I turn ideas into Unity/GPT demos in days, not months. Ideal for MVPs and pilots.

GPT apps that automate domain tasks like summarization or planning.

Immersive training apps with feedback and logging for safety/health.

I help labs quickly build what they proposed in grant submissions.

Need something fast? Let’s connect.

Roberto Salazar

“I am passionate about creating realistic and immersive virtual worlds, exploring the boundaries of XR, AR, and VR. By leveraging technologies like point clouds, photogrammetry, 3D modeling, and Unity, I strive to bridge the gap between virtual environments and human interaction, building innovative connections between the digital and real worlds.”

LinkedIn: linkedin.com/in/sroberto27

GitHub: github.com/sroberto27

Location

301 E Lewis St, Room 363
Lafayette, LA 70503

Contact

EMAIL ME

🎉 New Post: “Hollywood Meets Bayou—VR Location Scouting” selected as a winner at GSAW 2025!