
Inclusive AR Navigation & Travel Guide

Paul Fadayo


Project Overview

Project Type: Mobile Application (iOS/Android)
Role: Lead Mobile Engineer & Product Architect
Objective: Create a travel navigation tool that uses Augmented Reality to make city exploration safer and more intuitive for users of all abilities.
Scope: AR Implementation, Geolocation Services, Accessibility Engine, UI/UX Design.

The Challenge

Traditional 2D maps are often abstract and difficult to interpret in real-time, especially for users with cognitive or visual impairments. Navigating a bustling city like Lagos or New York requires more than just a blue dot on a screen.
The challenge was to build a navigation system that wasn't just visually impressive but functionally inclusive, allowing users to customize the interface to their specific visual, auditory, or motor needs.

The Solution: AR + Adaptive Accessibility

I architected a mobile solution that merges camera-based navigation with a robust personalization engine.
Augmented Reality (AR) Pathfinding

Instead of a flat map, the app uses the phone's camera to overlay a green navigation path onto the real world. I implemented ARKit/ARCore to anchor 3D directional arrows to real-world coordinates, helping users visualize exactly where to turn.
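Anchoring an arrow at a GPS coordinate means converting that coordinate into the device's local AR frame. The sketch below shows one common approach, an equirectangular approximation that turns a latitude/longitude delta into east/north offsets in metres; the function and type names are illustrative, not the app's actual API.

```typescript
// Convert a target GPS coordinate into a local east/north offset (in metres)
// relative to the device's current position, suitable for placing an AR anchor.
// The equirectangular approximation is accurate over the short distances
// (tens to hundreds of metres) that turn-by-turn AR navigation cares about.
const EARTH_RADIUS_M = 6_371_000;

interface LatLon { lat: number; lon: number; }

function gpsToLocalOffset(origin: LatLon, target: LatLon): { east: number; north: number } {
  const toRad = (deg: number) => (deg * Math.PI) / 180;
  const dLat = toRad(target.lat - origin.lat);
  const dLon = toRad(target.lon - origin.lon);
  const meanLat = toRad((origin.lat + target.lat) / 2);
  return {
    east: EARTH_RADIUS_M * dLon * Math.cos(meanLat), // longitude shrinks with latitude
    north: EARTH_RADIUS_M * dLat,
  };
}

// A waypoint 0.001° of latitude further north is roughly 111 m due north.
const offset = gpsToLocalOffset(
  { lat: 6.5244, lon: 3.3792 },  // device position (Lagos)
  { lat: 6.5254, lon: 3.3792 },  // next turn
);
```

In an ARKit/ARCore session the resulting east/north pair maps onto the session's world axes (after aligning the AR frame to compass north), giving each 3D arrow a stable world-space position.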
The Accessibility Engine

This is the core differentiator. Unlike standard apps, this platform begins with a "Needs Assessment." I engineered a settings module that allows users to toggle:
Visual Aids: High contrast modes, dyslexia-friendly fonts, and color blindness corrections (Deuteranopia support).
Haptic & Audio Feedback: Navigation cues via vibration patterns and text-to-speech for users who cannot rely on the screen.
Input Control: Voice command integration for hands-free usage.
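The toggles above can be modeled as a single preferences object that every screen derives its styling from, so changing one option re-themes the whole app at once. This is a minimal sketch of that idea; all names and values here are illustrative, not the production schema.

```typescript
// A single source of truth for the user's "Needs Assessment" answers.
type ColorVisionMode = "default" | "deuteranopia";

interface AccessibilityPrefs {
  highContrast: boolean;
  dyslexiaFont: boolean;
  colorVision: ColorVisionMode;
  hapticCues: boolean;
  textToSpeech: boolean;
  voiceControl: boolean;
  fontScale: number; // 1.0 = system default
}

const DEFAULT_PREFS: AccessibilityPrefs = {
  highContrast: false,
  dyslexiaFont: false,
  colorVision: "default",
  hapticCues: false,
  textToSpeech: false,
  voiceControl: false,
  fontScale: 1.0,
};

// Derive concrete theme values from the preference profile; UI components
// read this instead of hard-coding colors and font sizes.
function resolveTheme(prefs: AccessibilityPrefs) {
  return {
    fontFamily: prefs.dyslexiaFont ? "OpenDyslexic" : "System",
    background: prefs.highContrast ? "#000000" : "#FFFFFF",
    foreground: prefs.highContrast ? "#FFFF00" : "#1A1A1A",
    baseFontSize: Math.round(16 * prefs.fontScale),
  };
}

const theme = resolveTheme({ ...DEFAULT_PREFS, highContrast: true, fontScale: 1.5 });
// → high-contrast yellow-on-black palette with a 24 pt base font
```

Because every component resolves its style through one function, a toggle flips the entire interface consistently rather than patching individual screens.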
Location-Aware Context

The app dynamically adapts to the user's city. Whether in Lagos or New York, the backend fetches real-time POI (Point of Interest) data to categorize locations into "Nightlife," "Shopping," or "Restaurants," with crowd-density indicators.
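Client-side, the POI feed can be bucketed by category and annotated with a crowd level derived from occupancy. The field names and thresholds below are assumptions made for illustration, not the real backend contract.

```typescript
// Illustrative shape of a POI record as returned by the backend.
type Category = "Nightlife" | "Shopping" | "Restaurants";

interface Poi { name: string; category: Category; occupancy: number; capacity: number; }

type CrowdLevel = "low" | "moderate" | "busy";

// Map an occupancy ratio to the indicator shown in the city view.
function crowdLevel(poi: Poi): CrowdLevel {
  const ratio = poi.occupancy / poi.capacity;
  if (ratio < 0.4) return "low";
  if (ratio < 0.75) return "moderate";
  return "busy";
}

function groupByCategory(pois: Poi[]): Record<Category, (Poi & { crowd: CrowdLevel })[]> {
  const grouped: Record<Category, (Poi & { crowd: CrowdLevel })[]> = {
    Nightlife: [], Shopping: [], Restaurants: [],
  };
  for (const poi of pois) grouped[poi.category].push({ ...poi, crowd: crowdLevel(poi) });
  return grouped;
}

const groups = groupByCategory([
  { name: "Waterfront Cafe", category: "Restaurants", occupancy: 30, capacity: 40 },
  { name: "Lekki Market", category: "Shopping", occupancy: 10, capacity: 100 },
]);
```

Grouping on the client keeps the backend payload simple: the same feed serves any city, and only the category buckets and thresholds live in app code.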
[Image: High-fidelity screens showing the New York interface and the AR navigation view in action]

Key Technical Implementations

Geolocation Accuracy: Implemented sensor fusion (GPS + Compass + Accelerometer) to ensure AR overlays remain stable and accurate while walking.
State Management: Managed complex user preference states (font size, color schemes, audio settings) globally to ensure the entire app adapts instantly to the user's profile.
Performance: Optimized 3D rendering to run at 60fps on mobile devices without excessive battery drain.
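One standard way to fuse compass and motion-sensor data for a stable AR heading is a complementary filter: trust the gyroscope for fast changes and the noisy but drift-free compass for the long-term average. The sketch below shows that general technique under assumed parameter values; it is not the app's actual fusion code.

```typescript
// Normalise an angle to [0, 360).
function wrapDeg(angle: number): number {
  return ((angle % 360) + 360) % 360;
}

// Complementary filter for device heading.
function fuseHeading(
  prevHeading: number,    // previous fused heading, degrees
  gyroRate: number,       // angular rate from the gyroscope, deg/s
  dt: number,             // time step, seconds
  compassHeading: number, // magnetometer heading, degrees
  alpha = 0.98,           // weight given to the gyro integration
): number {
  // Integrate the gyro for a fast, smooth short-term estimate.
  const gyroEstimate = prevHeading + gyroRate * dt;
  // Correct slowly toward the compass, along the shortest angular path
  // so the 359° → 1° wrap-around doesn't spin the arrow the long way.
  let diff = wrapDeg(compassHeading - gyroEstimate);
  if (diff > 180) diff -= 360;
  return wrapDeg(gyroEstimate + (1 - alpha) * diff);
}

// Stationary device with the compass reading 90°: the fused heading
// converges toward 90° instead of jittering with the raw magnetometer.
let heading = 80;
for (let i = 0; i < 200; i++) heading = fuseHeading(heading, 0, 0.02, 90);
```

The same blend keeps AR overlays steady while walking: gyro integration absorbs hand shake frame to frame, and the compass term cancels the gyro's slow drift.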

The Results

This project demonstrates the future of inclusive technology:
Universal Design: A single codebase serving both standard users and those with disabilities.
Immersive Tech: Successfully bridging the gap between digital maps and physical reality.
Global Scalability: Architecture ready to support navigation data for any major city worldwide.

Posted Dec 20, 2025

Architected an accessible-first AR navigation app. Features augmented reality pathfinding, voice commands, and adaptive UI for visually impaired travelers.


Timeline

Jul 21, 2024 - Aug 30, 2024

Clients

Nickelfox