Augmented Reality Agent: Proxemics

Radeeyah Ade

XR Design
User Researcher
AR/VR Developer
Trello
Unity
Visual Studio Code

Project Overview

Developed an Augmented Reality (AR) agent that dynamically adapts its distance from the user, respecting cultural norms and personal preferences. The project integrates proxemic theory with AR technology to enhance user interaction in immersive environments.
This project tackled critical AR/VR challenges such as grounding virtual agents, ensuring natural interaction, and maintaining real-time responsiveness. User testing validated its effectiveness, with insights highlighting the transformative potential of this technology.
Scroll to the end to see the video demo.

Key Features

Implementation of Proxemics

Integrated Edward T. Hall’s proxemics theory into the AR agent's behavior, enabling recognition of four primary proxemic zones: intimate (0–0.45m), personal (0.46–1.2m), social (1.3–3.6m), and public (>3.6m).
Proxemic zones (in meters)
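To make the zone boundaries concrete, the sketch below shows one way a measured distance could be mapped to a zone; the ProxemicZone enum and ClassifyDistance helper are illustrative names rather than the project's actual code.
public enum ProxemicZone { Intimate, Personal, Social, Public }

public static ProxemicZone ClassifyDistance(float distanceInMeters)
{
    // Thresholds mirror the zone ranges listed above.
    if (distanceInMeters <= 0.45f) return ProxemicZone.Intimate;
    if (distanceInMeters <= 1.2f) return ProxemicZone.Personal;
    if (distanceInMeters <= 3.6f) return ProxemicZone.Social;
    return ProxemicZone.Public;
}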
Programmed the agent to calculate user distance in real-time and adjust its position dynamically, maintaining appropriate spacing for each zone.
Defined logic for boundary detection:
If the user moved closer or farther than the active zone's limits, the agent recalculated its target distance and repositioned itself smoothly, as in the snippet below.
private void MaintainProxemicDistance()
{
    float distanceToUser = Vector3.Distance(transform.position, user.transform.position);

    if (distanceToUser < minDistance || distanceToUser > maxDistance)
    {
        Vector3 directionToUser = (user.transform.position - transform.position).normalized;

        if (distanceToUser < minDistance)
        {
            // Too close: back away to the zone's average distance from the user.
            targetPosition = user.transform.position - directionToUser * averageDistance;
            navMeshAgent.SetDestination(targetPosition);
            SetAnimationState(false, false, false, true); // Walking backward
        }
        else if (distanceToUser > maxDistance)
        {
            // Too far: approach until the zone's average distance is reached.
            targetPosition = user.transform.position - directionToUser * averageDistance;
            navMeshAgent.SetDestination(targetPosition);
            SetAnimationState(true, false, false, false); // Walking forward
        }
    }
    else
    {
        // Within the zone: stop moving and keep facing the user.
        navMeshAgent.ResetPath();
        SetAnimationState(false, false, false, false); // Idle
        StartCoroutine(TurnToFaceUserCoroutine());
    }
}
Implemented orientation tracking: The agent maintained eye contact with the user by dynamically adjusting its head and body alignment based on user movement.
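The code above calls TurnToFaceUserCoroutine(); a minimal sketch of what that coroutine could look like is shown below (the turnSpeed field and the 1-degree stopping threshold are assumptions).
// Assumes: using System.Collections;
private IEnumerator TurnToFaceUserCoroutine()
{
    // Rotate only around the vertical axis so the agent stays upright.
    Vector3 toUser = user.transform.position - transform.position;
    toUser.y = 0f;
    Quaternion targetRotation = Quaternion.LookRotation(toUser);

    while (Quaternion.Angle(transform.rotation, targetRotation) > 1f)
    {
        transform.rotation = Quaternion.RotateTowards(
            transform.rotation, targetRotation, turnSpeed * Time.deltaTime);
        yield return null;
    }
}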

Advanced Animation and Behavior Control

Developed a fully rigged 3D humanoid agent using Unity, Animancer, and Inverse Kinematics (IK) for lifelike movements and interactions.
Image of the AR agent
Integrated a Finite State Machine (FSM) to manage state transitions (idle, walking, turning) based on user proximity.
private void SetAnimationState(bool isWalking, bool isTurningRight, bool isTurningLeft, bool isWalkingBackward)
{
    animator.SetBool("isWalking", isWalking);
    animator.SetBool("isTurningRight", isTurningRight);
    animator.SetBool("isTurningLeft", isTurningLeft);
    animator.SetBool("isWalkingBackward", isWalkingBackward);
}
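A minimal sketch of how the FSM states could drive the animator flags above; the AgentState enum and EnterState method are illustrative, not the project's exact implementation.
private enum AgentState { Idle, Walking, TurningRight, TurningLeft, WalkingBackward }

private void EnterState(AgentState state)
{
    switch (state)
    {
        case AgentState.Walking: SetAnimationState(true, false, false, false); break;
        case AgentState.TurningRight: SetAnimationState(false, true, false, false); break;
        case AgentState.TurningLeft: SetAnimationState(false, false, true, false); break;
        case AgentState.WalkingBackward: SetAnimationState(false, false, false, true); break;
        default: SetAnimationState(false, false, false, false); break; // Idle
    }
}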

Grounding and Navigation

Tackled fragmented spatial meshes by using raycasting against HoloLens spatial mapping to detect the floor dynamically (a minimal raycast sketch follows the navigation code below).
Fragmented spatial meshes generated
Anchored a pre-baked NavMesh on the detected floor, ensuring accurate grounding and navigation.
A prompt popping up after a spatial mesh was detected
Utilized NavMesh and the A* algorithm for efficient pathfinding, enabling the agent to move smoothly within defined boundaries.
private void PositionAgentBehindUser(float distance)
{
    Vector3 behindPosition = user.transform.position - user.transform.forward * distance;
    behindPosition.y = 0; // Keep the sample point on the floor plane

    if (NavMesh.SamplePosition(behindPosition, out NavMeshHit hit, 1.0f, NavMesh.AllAreas))
    {
        navMeshAgent.Warp(hit.position);
        AdjustYPosition();
        StartCoroutine(TurnToFaceUserCoroutine());
    }
    else
    {
        Debug.LogError("Failed to position agent on NavMesh.");
    }
}
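The floor-detection workaround mentioned above can be sketched as a downward raycast against the spatial mapping mesh; the "Spatial Awareness" layer name and the navMeshPlane field are assumptions.
private void DetectFloor()
{
    // Cast a ray straight down from the headset camera onto the spatial mesh.
    int spatialMeshLayer = LayerMask.GetMask("Spatial Awareness");
    Ray downRay = new Ray(Camera.main.transform.position, Vector3.down);

    if (Physics.Raycast(downRay, out RaycastHit hit, 5.0f, spatialMeshLayer))
    {
        // Anchor the pre-baked NavMesh plane at the detected floor height.
        navMeshPlane.transform.position = new Vector3(
            navMeshPlane.transform.position.x, hit.point.y, navMeshPlane.transform.position.z);
    }
}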

User Interface (UI) for Customization

Designed an intuitive UI allowing users to select proxemic zones and customize the agent's behavior dynamically.
Menu buttons
User interacting with the interface
Incorporated MRTK’s speech recognition for hands-free interaction, enabling users to confirm floor placement or switch zones using voice commands.
Speech input handler
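Below is a minimal sketch of handling MRTK voice commands in code; the keyword strings and logged actions are assumptions, and the project may instead wire keywords to methods through the SpeechInputHandler component shown above.
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

public class VoiceCommandHandler : MonoBehaviour, IMixedRealitySpeechHandler
{
    // Register globally so keywords are recognized regardless of focus.
    private void OnEnable() => CoreServices.InputSystem?.RegisterHandler<IMixedRealitySpeechHandler>(this);
    private void OnDisable() => CoreServices.InputSystem?.UnregisterHandler<IMixedRealitySpeechHandler>(this);

    public void OnSpeechKeywordRecognized(SpeechEventData eventData)
    {
        // Keywords must also be listed in the MRTK Speech Commands profile.
        switch (eventData.Command.Keyword.ToLower())
        {
            case "confirm floor":
                Debug.Log("Floor placement confirmed"); // hypothetical: confirm floor placement
                break;
            case "personal zone":
                Debug.Log("Switching to personal zone"); // hypothetical: switch proxemic zone
                break;
        }
    }
}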

Challenges and Solutions

Challenge: Fragmented spatial meshes disrupted NavMesh generation.
Solution: Implemented a static plane overlay on the floor, detected via raycasting, as a workaround for real-time spatial mapping.
Challenge: Ensuring agent realism in mixed environments.
Solution: Used Animancer and IK to dynamically adjust animations and maintain a natural appearance during interactions (a small IK sketch follows this list).
Challenge: Proxemics is rarely implemented in AR, leaving few reference implementations to build on.
Solution: Integrated dynamic zone recognition and real-time behavior adjustment based on user positioning and movement.
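The eye-contact part of this solution can be sketched with Unity's built-in humanoid IK, which requires "IK Pass" enabled on the animator layer; the animator and user fields and the 0.8 weight are assumptions, and the project's actual Animancer setup differs.
private void OnAnimatorIK(int layerIndex)
{
    animator.SetLookAtWeight(0.8f);                      // how strongly the head follows the target
    animator.SetLookAtPosition(user.transform.position); // look toward the user
}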

User Testing

Setup and Methodology

Conducted user testing with 6 participants aged 21–30 in a controlled environment.
Participants interacted with the AR agent by selecting different proxemic zones (intimate, personal, social, public) through a customized user interface.
Evaluated the agent’s ability to maintain consistent proximity, adapt to user movement, and respond to boundary-breaking actions.

Feedback Mechanism

Participants completed a post-experiment questionnaire featuring Likert scale ratings and open-ended questions.
Metrics included ease of interaction, realism, immersion, and responsiveness.

Key Insights from Testing

Strengths:
Users reported high satisfaction with the agent’s ability to maintain proxemic distances dynamically.
Rating of the agent's responsiveness
Ranking of the zones
The animations (walking, turning, and eye contact) were praised for their natural feel and realism.
The interactive UI and speech recognition were noted as intuitive and user-friendly.
Challenges Identified:
Occasionally, the agent struggled with abrupt movements when users exited the NavMesh boundaries.
There was a minor delay in recalibrating proximity when participants moved too quickly.

Future Scope

1. Improved Dynamic Environments
Enhance real-time adaptability by integrating runtime NavMesh updates to allow the agent to navigate expanding or shifting spaces seamlessly.
2. Advanced Proxemics Customization
Introduce features for users to define custom proximity zones tailored to their cultural or individual preferences.
3. Expanding Applications
Healthcare: Deploy proxemics-aware agents as interactive therapy assistants or virtual nurses.
Education: Integrate with AR-based learning platforms to provide personalized tutoring in immersive environments.
Retail: Use agents for enhanced customer service by guiding users while respecting their personal space.
4. AI Enhancements
Incorporate machine learning algorithms to enable agents to predict user preferences over time, refining interactions through contextual learning.
Implement gesture recognition for more intuitive interactions beyond speech and touch.

Impact and Significance

Validated Feasibility: The user testing demonstrated the practicality of proxemics-aware agents in improving user experience.
Real-World Relevance: This work bridges gaps in AR/VR design by addressing user comfort and cultural considerations, paving the way for personalized AR applications.
Future Potential: By enhancing the flexibility and scalability of the system, the technology can revolutionize industries such as healthcare, retail, education, and entertainment.

Video Demo

Tools & Skills Demonstrated

Unity, Animancer, MRTK
AI techniques: FSM, IK, NavMesh
AR/VR hardware: Microsoft HoloLens 2
Raycasting, 3D modeling, animation control

Thank you!!!
