Project Description
(From the Project Definition)
Senior care and living centers are becoming very large, with many of their areas (recreation, pool, health, dining, and so on) separated and difficult to find without assistance. Some centers are training mobile robots to guide residents to the different areas. The robots use pre-taught navigational markers to help them follow the correct course.
(My Solution)
I will be "teaching" an iRobot Create (like the Roomba vacuum cleaner) to guide a person to a predefined location. I will actually now be using a Turtlebot and the Robot Operating System (ROS) to accomplish this task. The Turtlebot uses a Kinect sensor (like that found on the XBOX 360) along with SLAM (Simultaneous Localization and Mapping) to create a map of an area. Then you can click on where you would like the robot to go and it will go there. I am working on adding other features to the Turtlebot.
LINKS
Final Presentation: Presentation
Walk-through: Walk-through
Documentation
Turtlebot Installation
Turtlebot Start-up
Turtlebot Map Explanation
VIDEO DEMOS
Creating a Map of Cofrin 15
Navigating the Map in Cofrin 15
Here are a couple of experiments I did dealing with what the robot "sees." I placed random objects in an area and photographed both what the robot saw and what a human would see. It was very interesting!
On the right is what is directly in front of the robot: I placed the recycling bin right in front of it to see what would happen. As the left photo shows, the bin prevents the robot from seeing much of anything in front of it; in fact, it registers objects that are not actually present and that appear to sit in front of the recycling bin. Lesson: an object right in front of the robot may cause problems with navigation.
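To catch this situation in the data rather than by eye, a node can watch the Kinect-derived laser scan and warn when something is sitting very close in front of the robot. This is a rough sketch under assumptions (the /scan topic name is the usual Turtlebot default, and the 0.5 m threshold is arbitrary), not code from the project:

#!/usr/bin/env python
# Sketch: warn when an obstacle sits very close directly ahead of the robot.
import rospy
from sensor_msgs.msg import LaserScan

CLOSE = 0.5  # meters -- arbitrary "right in front of the robot" threshold

def on_scan(scan):
    # Readings straight ahead are near the middle of the ranges array.
    mid = len(scan.ranges) // 2
    ahead = [r for r in scan.ranges[mid - 10:mid + 10]
             if scan.range_min < r < scan.range_max]   # drop invalid readings
    if ahead and min(ahead) < CLOSE:
        rospy.logwarn('Obstacle %.2f m ahead -- navigation may have trouble',
                      min(ahead))

rospy.init_node('front_obstacle_check')
rospy.Subscriber('/scan', LaserScan, on_scan)
rospy.spin()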
On the right is what I am looking at. It is kind of hard to tell, but the robot does see the laptop case (it is the little bump-out near the back wall). The robot can also see the recycling bin on the left-hand side. (Note: the white lines hovering above are what the robot sees from the laser; the dark and light blue colors are what the robot interprets as objects.)
Here is an example screenshot of the "map" produced when using SLAM to map a room or area. The lighter gray area is what the robot sees as open space. The black lines (they look like blocks) are what the robot sees as boundaries (walls, objects, etc.). The data the robot is currently receiving is marked with the little green blocks, drawn on top of the orange and blue blocks. The orange blocks are obstacles seen directly by the laser sensor; the blue blocks are the inflated obstacles (the software pads the real obstacle based on what its sensors see and an inflation parameter that the user sets). This map gets updated in real time.
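Under the hood, that gray/black map is published as a ROS OccupancyGrid message: each cell is just a number meaning unknown, free, or occupied. As a rough sketch (the /map topic name is the usual SLAM default in ROS, not necessarily this exact setup), a small node can summarize what the robot has mapped so far:

#!/usr/bin/env python
# Sketch: count how the SLAM map's cells break down into free space,
# obstacles, and unexplored area.
import rospy
from nav_msgs.msg import OccupancyGrid

def on_map(grid):
    # Each cell holds -1 (unknown), 0 (free), or up to 100 (occupied).
    free = sum(1 for c in grid.data if c == 0)
    occupied = sum(1 for c in grid.data if c > 50)
    unknown = sum(1 for c in grid.data if c < 0)
    rospy.loginfo('map %dx%d at %.2f m/cell: %d free, %d occupied, %d unknown',
                  grid.info.width, grid.info.height, grid.info.resolution,
                  free, occupied, unknown)

rospy.init_node('map_summary')
rospy.Subscriber('/map', OccupancyGrid, on_map)
rospy.spin()

The inflated (blue) obstacles in the screenshot come from the navigation stack's costmap rather than the map itself; the inflation radius is one of the user-set parameters mentioned above.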