Overview
A self-driving vehicle prototype featuring real-time LIDAR-based obstacle detection, sensor fusion, and an Android companion app for telemetry visualization.
The project demonstrates embedded systems principles: real-time constraints, sensor integration, and hardware-software coordination.
Technical Architecture
Sensor System
- LIDAR: Primary obstacle detection with 360° scanning
- Ultrasonic sensors: Supplementary distance measurement
- Kalman filtering: Sensor fusion for stable navigation decisions
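The LIDAR and ultrasonic distances can be fused with a scalar Kalman filter, weighting each reading by its noise variance. A minimal sketch of the idea (the struct name and variance values are illustrative, not the project's actual tuning):

```cpp
// One-dimensional Kalman filter fusing two range sensors into one
// distance estimate. Each update() weights a measurement by its
// variance r: low-noise LIDAR pulls the estimate harder than the
// noisier ultrasonic reading.
struct Kalman1D {
    double x;  // estimated distance (m)
    double p;  // estimate variance
    double q;  // process noise added per prediction step

    void predict() { p += q; }

    void update(double z, double r) {
        double k = p / (p + r);  // Kalman gain: trust in this sensor
        x += k * (z - x);        // pull estimate toward measurement
        p *= (1.0 - k);          // shrink uncertainty
    }
};
```

In a fusion loop, each control cycle would call `predict()` once and then `update()` per sensor, e.g. `r = 0.01` for LIDAR and `r = 0.09` for ultrasonic, so the fused estimate stays close to the more trustworthy LIDAR value while the ultrasonic reading still corrects it.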
Control System
- Arduino Mega: Main control unit running real-time control loop
- Response time: under 100 ms from obstacle detection to motor command
- PWM motor control: Smooth acceleration and steering
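Smooth acceleration under PWM usually comes down to slew-rate limiting: the duty cycle moves toward its target by at most a fixed step per control tick. A sketch of that idea (function and parameter names are hypothetical):

```cpp
#include <cstdint>

// Slew-rate-limited PWM update: moves the current duty cycle toward
// `target` by at most `step` per control tick, so the motors ramp
// smoothly instead of jumping to full power.
uint8_t rampPwm(uint8_t current, uint8_t target, uint8_t step) {
    if (target > current) {
        uint8_t diff = target - current;
        return current + (diff < step ? diff : step);
    } else {
        uint8_t diff = current - target;
        return current - (diff < step ? diff : step);
    }
}
```

On an Arduino the returned value would feed `analogWrite()` each cycle; at a 20 Hz loop with `step = 25`, a 0-to-255 ramp completes in about half a second.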
Communication
- Bluetooth Low Energy: Android connectivity
- Real-time telemetry: Sensor readings, motor states, navigation decisions
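Because a BLE 4.x notification payload is small (20 bytes with the default ATT MTU), telemetry like this is typically packed into a compact fixed-layout binary frame rather than sent as text. A sketch of such an encoder (the field layout and names are illustrative, not the project's actual wire format):

```cpp
#include <cstdint>
#include <vector>

// Hypothetical telemetry frame, little-endian:
// [0]   frame type (0x01 = telemetry)
// [1-2] fused obstacle distance, mm
// [3]   left motor PWM duty
// [4]   right motor PWM duty
// [5]   navigation state enum
struct Telemetry {
    uint16_t distance_mm;
    uint8_t  pwm_left;
    uint8_t  pwm_right;
    uint8_t  nav_state;
};

std::vector<uint8_t> encode(const Telemetry& t) {
    std::vector<uint8_t> buf(6);
    buf[0] = 0x01;                        // frame type
    buf[1] = t.distance_mm & 0xFF;        // distance, low byte
    buf[2] = (t.distance_mm >> 8) & 0xFF; // distance, high byte
    buf[3] = t.pwm_left;
    buf[4] = t.pwm_right;
    buf[5] = t.nav_state;
    return buf;
}
```

A 6-byte frame at 20 Hz is only 120 B/s of payload, comfortably within a BLE notification stream, and the Android side can decode it with the mirrored byte layout.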
Android Companion App
The mobile app provides:
- Live sensor data visualization
- Navigation state monitoring
- Manual override controls
- Diagnostic logging
Performance Metrics
- Obstacle detection latency: <100 ms
- Navigation update rate: 20 Hz
- Bluetooth throughput: sufficient to sustain the 20 Hz telemetry stream
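A 20 Hz navigation rate means one update every 50 ms. Scheduling it by advancing a deadline, rather than sleeping for a fixed delay, keeps the rate drift-free even when the loop body's runtime varies. A minimal sketch of that pattern (the function name is hypothetical; `now_ms` stands in for an Arduino-style `millis()` reading):

```cpp
#include <cstdint>

// Drift-free 20 Hz tick: fires when the current time reaches the
// deadline, then advances the deadline by exactly one 50 ms period.
// The signed-difference comparison stays correct across millis()
// counter wraparound.
bool tick20Hz(uint32_t now_ms, uint32_t& next_ms) {
    if ((int32_t)(now_ms - next_ms) >= 0) {
        next_ms += 50;  // 1000 ms / 20 Hz
        return true;
    }
    return false;
}
```

The main loop calls this every iteration and runs the sensor-read / navigate / actuate step only when it returns true, leaving the rest of each period for BLE servicing.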
Learning Outcomes
This project solidified understanding of:
- Real-time embedded systems constraints
- Sensor fusion algorithms
- Hardware-software integration patterns
- BLE protocol implementation