XR Virtual Reality Game

Rebuild NYC

What if rebuilding history could help us remember it better?
Rebuild NYC is an immersive VR experience in which users reconstruct iconic New York landmarks in 3D space while an adaptive narrative engine responds to their behavior in real time. This project explores how embodied cognition, spatial interaction, and AI-driven storytelling can increase engagement and knowledge retention compared to passive museum experiences.


Project Details

Role:

XR Product Developer | UX Architect | Interaction Designer

Timeline:

Oct 2025 - Nov 2025 (Production-ready MVP)

Tech Stack:

Unity3D, XR Interaction Toolkit, Meta Quest 3, C#

Mentors:

Prof. Lisa Lokshina

Overview

Museums and digital learning platforms often rely on passive consumption: plaques, flat screens, audio guides. Retention is low. Emotional engagement is shallow. Interaction is minimal.

Opportunity:
Can spatial computing transform historical learning from passive observation into active reconstruction?

An Adaptive Spatial Learning Platform Built for the Future of Immersive Computing

The Product Vision

Transform users from observers into rebuilders.

By physically reconstructing fragmented landmarks, users:

• Engage motor memory
• Activate spatial reasoning
• Experience contextual storytelling
• Build emotional connection

This aligns with research in embodied cognition, which suggests that physical interaction strengthens memory encoding.

Core Experience Design

I designed a modular interaction system built around 6DOF spatial alignment.

  1. The user grabs a monument fragment using physics-based interaction

  2. Inspects its geometry and spatial cues

  3. Aligns the fragment in 3D space

  4. Correct placement triggers a contextual narrative beat

  5. The system adapts difficulty and storytelling depth

This loop was optimized for cognitive flow and reduced friction in VR interaction.
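The placement check in step 4 can be sketched as a small Unity component. This is a minimal, hypothetical sketch: `FragmentSocket`, the tolerance values, and the `OnPlaced` event are illustrative names, assuming the release callback is wired up through the XR Interaction Toolkit's grab events.

```csharp
using UnityEngine;

// Hypothetical sketch: a fragment counts as "placed" when it is released
// close enough to its target pose, at which point it snaps and fires an event.
public class FragmentSocket : MonoBehaviour
{
    [SerializeField] Transform targetPose;             // correct pose for this fragment
    [SerializeField] float positionTolerance = 0.05f;  // metres (illustrative value)
    [SerializeField] float angleTolerance = 10f;       // degrees (illustrative value)

    public event System.Action<FragmentSocket> OnPlaced; // narrative engine subscribes here

    // Called from the grab interactable's release (select-exited) event.
    public void CheckPlacement(Transform fragment)
    {
        bool closeEnough =
            Vector3.Distance(fragment.position, targetPose.position) < positionTolerance &&
            Quaternion.Angle(fragment.rotation, targetPose.rotation) < angleTolerance;

        if (closeEnough)
        {
            // Snap to the exact pose, then notify listeners (narrative, SFX, haptics).
            fragment.SetPositionAndRotation(targetPose.position, targetPose.rotation);
            OnPlaced?.Invoke(this);
        }
    }
}
```

Keeping the placement rule in one component means tolerances can be loosened per fragment, which is one lever for the difficulty adaptation in step 5.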

Adaptive Narrative System

Instead of linear storytelling, I built a behavior-responsive narrative engine. The system tracks:

• Time to placement
• Alignment accuracy
• Exploration behavior
• Idle time

Based on these signals, the experience dynamically adjusts narrative tone and depth. Fast users receive architectural insights; exploratory users receive cultural and emotional context. This was implemented as a rule-based state engine, structured so its rules can later be replaced with reinforcement-learning logic.
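The rule-based selection described above can be sketched as plain C# with no Unity dependency, which keeps the rules easy to unit-test and swap out later. The type names and thresholds here are hypothetical, tuned only for illustration.

```csharp
// Hypothetical sketch of the behavior-responsive rule engine.
public enum NarrativeTone { Architectural, Cultural, Guided }

public struct PlacementStats
{
    public float SecondsToPlace;    // time to placement
    public float AlignmentError;    // residual error at snap time (0 = perfect)
    public float ExplorationRatio;  // share of session spent inspecting vs. placing
    public float IdleSeconds;       // time with no interaction
}

public static class NarrativeRules
{
    // Rule-based selection; threshold values are illustrative placeholders.
    public static NarrativeTone Select(PlacementStats s)
    {
        if (s.IdleSeconds > 20f) return NarrativeTone.Guided;         // stuck users get scaffolding
        if (s.ExplorationRatio > 0.5f) return NarrativeTone.Cultural; // explorers get context
        return NarrativeTone.Architectural;                           // fast builders get detail
    }
}
```

Because `Select` is a pure function of the tracked stats, a learned policy could later replace it behind the same interface without touching the rest of the experience.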

Interaction & UX Architecture: Spatial UI Strategy

A common rookie mistake: flat 2D menus break immersion in XR. I replaced traditional UI with:

• Holographic radial menu anchored to non-dominant hand
• Gaze-assisted highlighting
• Color-coded alignment feedback
• Spatialized, adjustable audio guidance

Design decisions were guided by XR ergonomics and arm fatigue studies.
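The hand-anchored radial menu can be sketched as a Unity behaviour that follows the non-dominant hand with smoothing and billboards toward the headset. `HandAnchoredMenu` and its serialized fields are assumptions for illustration, not the production component.

```csharp
using UnityEngine;

// Hypothetical sketch: the menu root follows the non-dominant hand with
// smoothing and rotates to face the user so radial items stay readable.
public class HandAnchoredMenu : MonoBehaviour
{
    [SerializeField] Transform nonDominantHand;  // e.g. left controller anchor
    [SerializeField] Transform head;             // main camera / HMD transform
    [SerializeField] Vector3 localOffset = new Vector3(0f, 0.08f, 0.05f);
    [SerializeField] float followSpeed = 12f;    // higher = stiffer follow

    void LateUpdate()
    {
        // Smoothed follow reduces jitter from hand-tracking noise.
        Vector3 target = nonDominantHand.TransformPoint(localOffset);
        transform.position = Vector3.Lerp(transform.position, target,
                                          1f - Mathf.Exp(-followSpeed * Time.deltaTime));

        // Billboard toward the headset at any hand angle.
        transform.rotation = Quaternion.LookRotation(
            transform.position - head.position, Vector3.up);
    }
}
```

Anchoring to the non-dominant hand keeps the dominant hand free for grabbing fragments, and the frame-rate-independent smoothing term avoids the menu snapping rigidly to tracked motion.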

Usability Testing

Tested with 10 users from both VR and non-VR backgrounds.

Quantitative Results

• 82% puzzle completion without guidance
• 28% faster task performance after 5 minutes
• 40% improvement in historical recall

Qualitative Feedback

“It feels like I’m holding history.”
“The story responding to how I build makes it feel alive.”

Got a project in mind? Tell me about it

Hi, I’m Priyam Joshi, a Designer who turns curiosity into meaningful experiences. I bridge human emotion, technology, and strategy to craft experiences that don’t just look good; they work beautifully. As a UX Designer and Researcher, my passion lies in uncovering the “why” behind user behavior and transforming insights into inclusive, data-driven design systems that create real business impact.