Ultrasound Guidance via Q-Learning
Project Overview
Motivation
- The project was inspired by the idea of a robot learning to guide a human to a goal as quickly and efficiently as possible.
- It addresses scenarios where users operate personal medical equipment and the robot must guide them to attach the equipment in the correct position.
What Was Done
Project Description
Developed a one-dimensional ultrasound guidance system that uses Q-learning with a reward function to improve its guidance over time.
Physical Setup
Built a movable device consisting of a breadboard, a PIC microcontroller, a motor driver, an ultrasound sensor, and an RS232 connector, mounted on a 5-28 inch track.
Electrical Components
Used an HC-SR04 ultrasonic sensor for position detection and an L293B motor driver to apply PWM signals to the motors.
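The source does not reproduce the sensor-reading firmware, so the sketch below only illustrates the standard HC-SR04 timing (a 10 µs trigger pulse, then measuring the echo pulse width). The pin indices and the gpio_write/gpio_read/delay_us/micros helpers are hypothetical placeholders for whatever the PIC firmware actually used.

```c
/* Minimal HC-SR04 distance-read sketch (not the project's actual PIC firmware).
 * gpio_write(), gpio_read(), delay_us(), and micros() are assumed platform
 * helpers; real PIC code would use the device's own registers and timers. */
#include <stdint.h>

#define TRIG_PIN 0   /* hypothetical trigger pin index */
#define ECHO_PIN 1   /* hypothetical echo pin index    */

extern void     gpio_write(int pin, int level);
extern int      gpio_read(int pin);
extern void     delay_us(uint32_t us);
extern uint32_t micros(void);          /* free-running microsecond counter */

/* Trigger a measurement and convert the echo pulse width to centimeters. */
float read_distance_cm(void)
{
    /* A 10 us high pulse on TRIG starts a measurement */
    gpio_write(TRIG_PIN, 1);
    delay_us(10);
    gpio_write(TRIG_PIN, 0);

    /* Wait for the echo pulse and time how long it stays high */
    while (gpio_read(ECHO_PIN) == 0) { /* wait for rising edge */ }
    uint32_t start = micros();
    while (gpio_read(ECHO_PIN) == 1) { /* wait for falling edge */ }
    uint32_t width_us = micros() - start;

    /* Per the HC-SR04 datasheet, the round trip takes ~58 us per cm */
    return width_us / 58.0f;
}
```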
Q-Learning Implementation
Initialized a Q-table with random values, with states as rows and actions as columns, and updated it using the Bellman equation over 50 training episodes.
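The firmware itself is not included in the source; the following is a minimal sketch of a Q-table laid out with states as rows and actions as columns, plus the standard Q-learning (Bellman-style) update. The table dimensions, learning rate, and discount factor are illustrative values, not taken from the project.

```c
/* Minimal Q-table sketch: states as rows, actions as columns.
 * N_STATES, N_ACTIONS, ALPHA, and GAMMA are illustrative, not project values. */
#include <stdlib.h>

#define N_STATES  24          /* discretized positions along the track */
#define N_ACTIONS 3           /* e.g. move left, stay, move right      */
#define ALPHA     0.1f        /* learning rate                         */
#define GAMMA     0.9f        /* discount factor                       */

static float Q[N_STATES][N_ACTIONS];

/* Fill the table with random values in [0, 1] before training. */
void q_init(void)
{
    for (int s = 0; s < N_STATES; s++)
        for (int a = 0; a < N_ACTIONS; a++)
            Q[s][a] = (float)rand() / RAND_MAX;
}

/* One Q-learning update:
 * Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a)) */
void q_update(int s, int a, float r, int s_next)
{
    float best_next = Q[s_next][0];
    for (int a2 = 1; a2 < N_ACTIONS; a2++)
        if (Q[s_next][a2] > best_next)
            best_next = Q[s_next][a2];

    Q[s][a] += ALPHA * (r + GAMMA * best_next - Q[s][a]);
}
```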
Program Flow (a code sketch follows this list):
- Initialized the Q-table
- Read ultrasonic sensor data
- Selected an action based on the Q-table
- Performed the action and measured the reward
- Updated the Q-table based on the reward and state transition
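A rough sketch of how these five steps fit into a training loop is shown below. It reuses q_init(), q_update(), and read_distance_cm() from the earlier sketches; the epsilon-greedy action selection, the distance-to-state mapping, and the reward values are assumptions, since the source only states that a reward was measured.

```c
/* Training-loop sketch mirroring the program flow above. The helper
 * functions marked "hypothetical" stand in for project code not shown. */
#include <stdlib.h>

#define N_EPISODES 50         /* matches the 50 training episodes     */
#define EPSILON    0.1f       /* exploration rate (illustrative)      */

extern void  q_init(void);                        /* from the Q-table sketch  */
extern void  q_update(int s, int a, float r, int s_next);
extern float read_distance_cm(void);              /* from the HC-SR04 sketch  */
extern int   state_from_distance(float cm);       /* hypothetical discretizer */
extern void  perform_action(int a);               /* hypothetical PWM command */
extern int   choose_best_action(int s);           /* argmax over Q[s][.]      */
extern int   at_goal(int s);                      /* hypothetical goal test   */

void train(void)
{
    q_init();                                               /* 1. initialize Q-table */
    for (int ep = 0; ep < N_EPISODES; ep++) {
        int s = state_from_distance(read_distance_cm());    /* 2. read sensor        */
        while (!at_goal(s)) {
            /* 3. epsilon-greedy action selection (3 actions, as in the sketch) */
            int a = ((float)rand() / RAND_MAX < EPSILON)
                        ? rand() % 3
                        : choose_best_action(s);

            perform_action(a);                              /* 4. act ...            */
            int s_next = state_from_distance(read_distance_cm());
            float r = at_goal(s_next) ? 1.0f : -0.01f;      /* ... and measure reward */

            q_update(s, a, r, s_next);                      /* 5. update the Q-table */
            s = s_next;
        }
    }
}
```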
What Was Learned
Efficient Guidance
The robot learned to guide the user to the goal efficiently, improving its decision-making over successive episodes.
Algorithm Adaptation
The Q-learning algorithm demonstrated its ability to adapt and refine the guidance system through repeated training episodes.
Real-world Applications
The system highlighted the potential of reinforcement learning in real-world applications, particularly in scenarios requiring precise guidance.
What Was Achieved
- Successful implementation of the Q-learning algorithm
- An effective guidance system with 90% accuracy