The End
4/29/2013
It has been quite an experience working with robotic guidance and the Turtlebot. This project definitely had its ups and downs, but it was an enjoyable and educational one. I had my defense this afternoon, and I feel it went pretty well. I am almost done with the binder and CD, and then this project will be completely wrapped up. In addition, all of the documents on my webpage are up to date. I can't believe how fast the time went by! I hope that others in the future are able to follow the guides I have created and further expand upon this project. It was a great experience, and I am actually sad it is over. Good luck and have fun to all who get the opportunity to work on this project. Only 12 more days until I am officially an alum!!
Almost Done
4/22/2013
The presentation is now done! Unfortunately, the Turtlebot decided not to cooperate for the actual presentation, although it worked just a few hours earlier (Mary can testify to this), so I ended up using the videos of the robot actually working. Next I will be preparing for the defense, which is next week, and working on my binder. I also uploaded my presentation to the "My Project" page. The end is in sight!!
Presentation Time
4/21/2013
Well, the presentations are tomorrow. I think I am all set to go. Mary and I are going to go over our presentations tonight to make sure everything will work as it should. I sure hope the robot does not fail tomorrow; if it does, I have a backup plan. Can't wait for this to be over!! I still need to put together a binder and get ready for the defense next week.
Tasks for the week
- Presentation - DONE
- Defense Prep - DONE
- Work on Binder - DONE
The End is in Sight
4/14/2013
A week from tomorrow I will be presenting my project; it is crazy how fast the time has gone by. My project is basically done now. Everything seems to work as it should: I can successfully navigate around an area and adapt to new areas by creating new maps. This past week I put the finishing touches on the Turtlebot and did a lot of testing. I also wrote a lot of documentation so the next person who gets this project will have a good starting point. In addition, I did some videotaping (thanks to the help of Danielle Berchmans) of my robot in action, both creating a map of an area and navigating in it. I did this both in JMS and in Cofrin 15, where our presentations will be held (I created a map of Cofrin 15 that I will use for the live demo). I will use the video of creating the map during the presentation, and the video of navigating the map as a backup plan in case the live demo doesn't work. Finally, I just finished putting together my presentation; I will just need to rehearse it and everything should be good to go. Everything seems to be working well...3 more weeks!!
Tasks for the week
- Finish documentation. - DONE
- Rehearse presentation. - DONE
Things are coming together
4/7/2013
Well, it has been a couple of weeks since I last updated my blog, and a lot has happened. Last Thursday I met with Dr. Pankratz and showed him some of what was working with the robot. He instructed me to run some tests, placing random objects in an area to get a feel for what the robot "sees." I did just that and found that, using its laser scanner, the iRobot sometimes cannot see things that could be potential problems. In particular, there is a height below which the robot cannot see, so it will not know to avoid some objects in its way (like a shoe). I also found that the robot has a wide-angle view, so when making a map using SLAM it should be able to see a hallway in one pass without having to look from side to side. It was very interesting to see how the robot views an area, and I attached some example images to the My Project page.
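For anyone repeating these experiments, the scan data is easy to inspect from code. Here is a minimal sketch, assuming the scan is published on the standard /scan topic as a sensor_msgs/LaserScan, of a node that prints the nearest obstacle the robot currently sees; anything below the scanner's plane (like that shoe) simply never appears in these ranges:

    #!/usr/bin/env python
    # Minimal sketch: print the nearest obstacle the Turtlebot "sees."
    # Assumes the scan is published on /scan (sensor_msgs/LaserScan).
    import rospy
    from sensor_msgs.msg import LaserScan

    def on_scan(scan):
        # Keep only valid readings; NaN and out-of-range values are dropped.
        valid = [r for r in scan.ranges if scan.range_min < r < scan.range_max]
        if valid:
            rospy.loginfo("nearest obstacle: %.2f m", min(valid))

    rospy.init_node('nearest_obstacle')
    rospy.Subscriber('/scan', LaserScan, on_scan)
    rospy.spin()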
In addition to talking about testing the robot's vision, Dr. Pankratz lent me a wireless router to use for this project. I had been having a lot of latency problems, which I attributed to using an ad hoc network between the two laptops. What a difference this router has made!! Everything is much snappier: there is no lag whatsoever, and I can see the robot's sensors update in real time in response to environmental changes. Experiments are much easier to conduct with no latency, and navigation now works great!
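One rough way to put a number on the latency claim, a sketch that assumes the scan messages carry stamped headers and the two laptops' clocks are reasonably synchronized, is to compare each message's timestamp with its arrival time on the workstation:

    #!/usr/bin/env python
    # Rough latency check: how old is each scan message when it arrives?
    # Assumes stamped headers and roughly synchronized clocks.
    import rospy
    from sensor_msgs.msg import LaserScan

    def on_scan(scan):
        age = (rospy.Time.now() - scan.header.stamp).to_sec()
        rospy.loginfo("scan is %.3f s old on arrival", age)

    rospy.init_node('latency_check')
    rospy.Subscriber('/scan', LaserScan, on_scan)
    rospy.spin()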
I was also given a fourth wheel to use on the robot, as I was having issues with balance; this has resolved the problem completely. I also got a couple of bungee cords to secure the laptop that rides on top of the robot. Everything seems to be great now as far as balance is concerned.
One other thing I worked on over this past weekend is documentation. I wrote a couple of documents: one covers installing ROS on two laptops to use with the Turtlebot (for someone starting from scratch), and the other explains how to start up the Turtlebot and create/use a map with the Turtlebot's navigation capabilities. This next week will be a fine-tuning week, with the presentation only two weeks away!
Tasks for the week
- Create a video to use for the presentation if I cannot get the live demo working. - DONE
- Finish documentation. - DONE
- Do more testing of random objects/taking pictures. - DONE
- Start working on the presentation. - DONE
Almost There
3/25/2013
I have had some bigger successes over the past week. The robot is almost able to carry out the task set forth in the project description. I can successfully tele-operate the Turtlebot and create a map using SLAM. The problem I was having was actually due to the network: if I have the screen showing what the Turtlebot sees open, there is too much information being sent over the ad hoc network between the two laptops. However, if I close the screen showing the mapping in real time, I am able to drive the Turtlebot with a keyboard and create a virtual map of an area or room. I talked to Dr. Pankratz, and hopefully I will have permission to use the SNC network for a more reliable connection between the two PCs. In addition, after creating the map, I can successfully save it and use it to navigate that same area in the future. When opening a saved map, I have to tell the robot where it is on the map and then tell it where I want it to go, and it moves to that spot. I still have to do some calibration with the gyroscope to make the robot more accurate. I also have to find a way to lower the speed of the robot, as it moves much too quickly to make a good tour guide. In addition, I will be doing investigative work over the week to look at how it uses the map to navigate. Tomorrow I will be showing what I have done to Dr. Pankratz; I sure hope it decides to work for me.
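On the speed question, my current thinking is that the navigation stack's top speed is just a parameter. Here is a sketch assuming the default move_base setup, where the local planner reads max_vel_x; the exact parameter path can vary between versions, so treat the name as an assumption:

    #!/usr/bin/env python
    # Sketch: cap the navigation speed to a tour-guide pace.
    # The parameter path below is an assumption; it can differ by version.
    import rospy

    rospy.init_node('slow_down')
    rospy.set_param('/move_base/TrajectoryPlannerROS/max_vel_x', 0.2)  # m/s

Note that move_base typically reads its parameters at startup, so this would need to run before launching navigation (or the value could be changed live with dynamic_reconfigure).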
Turtlebot Mapping
3/17/2013
More good news over this past week: I was able to navigate the robot using the Navigation tutorial. All of the sensors on the Kinect seem to work correctly, as I am able to generate a map using this tutorial. I can click on a spot and the robot navigates to that place correctly. I still have to calibrate the robot to gain more accuracy. However, I cannot yet tele-operate the robot in order to generate the map, which is a problem. Also, I cannot save the map, so I am unable to use a known map to navigate an area, which is a key requirement of this project. I also have to see if there are any improvements I can make to the Turtlebot software. Over the past week, I looked at all of the stacks and packages used for the Turtlebot to try to further understand which parts work together to make the Turtlebot work, and I have been preparing for the walk-through. This next week, I also want to see if I can use my Android tablet to navigate the Turtlebot.
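Clicking a spot in the visualization is, as far as I can tell, just publishing a goal pose. Here is a minimal sketch of doing the same from code, assuming the standard move_base setup, which listens on /move_base_simple/goal; the coordinates are made-up examples:

    #!/usr/bin/env python
    # Sketch: send the robot to a pose on the map, like the RViz click does.
    # Assumes move_base is listening on /move_base_simple/goal.
    import rospy
    from geometry_msgs.msg import PoseStamped

    rospy.init_node('send_goal')
    pub = rospy.Publisher('/move_base_simple/goal', PoseStamped)
    rospy.sleep(1.0)  # give the publisher time to connect

    goal = PoseStamped()
    goal.header.frame_id = 'map'
    goal.header.stamp = rospy.Time.now()
    goal.pose.position.x = 2.0   # example coordinates in the map frame
    goal.pose.position.y = 1.0
    goal.pose.orientation.w = 1.0  # face along the map's x axis
    pub.publish(goal)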
Tasks for the week
- Save a map using the SLAM technology (using tele-operating). - DONE
- Continue investigating the Turtlebot stacks. - DONE
- Look at improvements I can make. - DONE
The Turtlebot Can See!
3/10/2013
What a week it has been... We are currently on spring break, and I will be busy this week working on the Turtlebot (as we have a walk-through the week we get back from break). The week didn't start off too well, as I couldn't even start up the robot. It turned out this was because a new release of the Turtlebot software had introduced a bug. After another update, I was able to start up the Turtlebot. After this success, I tried controlling the robot from a remote keyboard again, and it didn't work; however, this past Friday I was able to control the robot from another keyboard, putting me back to where I was before. The real test came yesterday, dealing with the Kinect sensors: I was able to see an image from the Kinect's camera while controlling the robot! This is great news, because now I know the Kinect sensor works. My next hurdle is going to be a latency issue. With the camera sensor up and running, controlling the robot was very slow (it sometimes took up to 15 seconds to respond to a keystroke). This is going to be a problem when building my map, which I hope to do sometime this week, but at least the robot works now. In addition, I plan on creating a presentation for the walk-through and preparing for that demonstration, as well as looking at how the Turtlebot stacks all work together.
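A quick way to confirm the camera is alive without streaming the full image view (which is what seems to saturate the network) is to print just the image metadata. A minimal sketch, assuming the driver publishes on /camera/rgb/image_color; the topic name varies between driver versions:

    #!/usr/bin/env python
    # Sketch: confirm the Kinect is publishing by printing image metadata.
    # The topic name below is an assumption; it varies by driver version.
    import rospy
    from sensor_msgs.msg import Image

    def on_image(img):
        rospy.loginfo("got a %dx%d image (%s)", img.width, img.height, img.encoding)

    rospy.init_node('kinect_check')
    rospy.Subscriber('/camera/rgb/image_color', Image, on_image)
    rospy.spin()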
Tasks for the week
- Create a map using the SLAM technology. - DONE
- Look at the Turtlebot stacks and how they work together. - DONE
- Prepare for walk-through. - DONE
No Progress
3/3/2013
As stated in the previous post, I have been having problems with the Kinect sensor (probably a driver-related issue). I talked to Dr. Pankratz over the past week and emailed Alex Popov about his experiences (both have been very helpful). I decided it would be best to start over with the Turtlebot laptop (reinstall Ubuntu) in case I had messed something up earlier. I did so late last week/over the weekend using the USB disk that came with the Turtlebot kit, but I was unsuccessful. I tried the version on the USB drive as well as the "latest" version from the Turtlebot website, but that version is not actually the latest: all of the tutorials for setting up the robot use a much later version than these two (a lot has been deprecated). I then reinstalled the latest version of Ubuntu along with the ROS and Turtlebot stacks, as I had done to start out. After following everything exactly as written, I currently cannot start up the Turtlebot software (and dashboard) or tele-operate the robot, so I am back to where I was a couple of weeks ago. I am not quite sure what is going on; I will have to do more research into the different error messages I have been receiving. Hopefully by the end of the week I will at least be able to remotely control the Turtlebot again. Then I will try to tackle the Kinect sensor error. In addition, I need to start looking at what I can modify. Here's to a good week...
Tasks for the week
- Get the Turtlebot to work. - DONE
Unexpected Excitement
2/26/2013
In my last update, I was having a lot of trouble logging into the Turtlebot laptop from another laptop. Well, yesterday I was successful in logging in. In addition, I was able to move the Turtlebot using the remote laptop's keyboard! The next issue facing me is the Kinect sensors and camera: I am unable to get data from the Kinect, which is a main part of the Turtlebot system. After reading forums and other information, I think it may be an issue with the driver for the Kinect sensor. This will be my next task to figure out; hopefully that will happen by the end of the week.
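Under the hood, the remote keyboard is just publishing velocity commands. Here is a minimal sketch that nudges the robot forward and stops, assuming the base subscribes on a cmd_vel topic (the exact topic name varies by driver version):

    #!/usr/bin/env python
    # Sketch: drive the robot forward briefly by publishing velocity
    # commands, which is all the keyboard teleop does under the hood.
    # Assumes the base listens on cmd_vel; topic names vary by version.
    import rospy
    from geometry_msgs.msg import Twist

    rospy.init_node('nudge_forward')
    pub = rospy.Publisher('cmd_vel', Twist)
    cmd = Twist()
    cmd.linear.x = 0.1  # m/s, gentle
    rate = rospy.Rate(10)
    for _ in range(20):  # roughly two seconds of motion
        pub.publish(cmd)
        rate.sleep()
    pub.publish(Twist())  # stop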
I just added how I assembled the Turtlebot to the My Project page. Check it out!!
Turtlebot Assembly/Frustration
2/24/2013
The Turtlebot arrived this past week, and I assembled it this past weekend. I also tried to get it actually working, with one computer on the Turtlebot and one as the workstation. There has been a lot of frustration this weekend in trying to do this. First of all, there is a tutorial on exactly how to get the Turtlebot to work. I followed it exactly as written; however, I am not much closer to accomplishing this task than when I began the weekend. My first big setback was the wireless network. It appears the SNC network does not work the way the Turtlebot needs, so I began exploring other options. One alternative was to use an ad hoc network between the two laptops. One of the laptops, however, could never connect to the network. After searching for hours, it appeared that the default Ubuntu driver for that laptop's wireless card is not compatible with an ad hoc network, so I installed the correct driver and was able to create a network between the two laptops. My next setback, which I still have no idea how to resolve, is SSHing between the two laptops to log in to the Turtlebot from the workstation computer. It may have something to do with environment variables or something else altogether; I am not exactly sure what is going on. Hopefully this week will turn out much better than this past weekend.
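For anyone who hits the same wall: the two machines have to agree on where the ROS master lives, which is controlled by a few environment variables. A tiny sanity-check sketch (the variable names are standard ROS; what they should contain depends on your network):

    #!/usr/bin/env python
    # Sanity check for the two-laptop setup: both machines must point
    # ROS_MASTER_URI at the same master (usually the robot laptop), and
    # each must advertise a hostname/IP the other machine can reach.
    import os

    for var in ('ROS_MASTER_URI', 'ROS_HOSTNAME', 'ROS_IP'):
        print('%s = %s' % (var, os.environ.get(var, '<not set>')))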
Tasks for the week
- Get the Turtlebot to work. - DONE
Time for a Turtlebot
2/17/2013
Well, Dr. Pankratz has ordered a Turtlebot kit that can be used with the iRobot. It includes a shelving-type system, the Kinect sensor, and the other accessories needed to convert the iRobot into a Turtlebot. The Turtlebot should be able to map out an area and then go to a location in that area while avoiding obstacles. The Turtlebot uses the open source ROS and has step-by-step instructions on how to set it up correctly. Hopefully I will be able to get everything up and running once the kit arrives (it shipped this past week). I will also need to talk to Dr. Pankratz and ask what he wants me to do as far as modifications/additions (after I know exactly how the Turtlebot works).
This past week I didn't do much that was "new"; however, I continued looking at the Turtlebot stacks and how everything works together (the sensors, robot drivers, and navigation) so that I can start deciding what I want to modify/add.
Tasks for the week
- Continue looking at the Turtlebot stacks. - DONE
- Hopefully the Turtlebot will arrive so I can assemble it and make it work. - DONE
Robot Operating System
2/10/2013
Well, it has been quite a week, and a lot has changed since I last posted. First of all, it looks as though I will be scrapping the whole SURF algorithm. I received an email from Alex Popov (who currently works in the Robotics Department at the University of Minnesota) that led me in a whole new direction. I will now be using the open source ROS, or Robot Operating System (link here). As the website says, it provides tools and libraries to help software developers create robot applications. It is based on nodes, where one node can represent an iRobot controller, another a wrapper for a camera, etc., and the different nodes can subscribe to and send/receive messages from each other. While researching ROS, I found a project that seemed to do exactly what I needed, called the Turtlebot. The Turtlebot itself comes as a package which includes everything you need, including the iRobot. However, there is also a kit that can be purchased to add on to an existing iRobot. It includes a "shelving" type system as well as a Kinect sensor (the same one used with the Xbox) and other connectors. Using the Kinect sensor and a couple of other sensors, the robot uses a technique called SLAM (simultaneous localization and mapping) to create a virtual map of its surroundings; this technology has been used in developing the Google self-driving car. After creating the map, the robot can drive to a specified point within the mapped area. Dr. Pankratz will be ordering this kit for use on this project; I am VERY excited! The Turtlebot uses a ROS stack to operate (one laptop rides on the robot and another controls it, so the robot uses the wireless network to communicate between the two PCs). Since the Turtlebot uses ROS, I will be able to modify the code to add new features and play around with the operating system. Once again, this is a total redirect of what I had been thinking of doing. Thank you, Alex!
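To make the node idea concrete, here is a minimal sketch of two nodes passing messages over a topic; the node and topic names are my own, and this just mirrors the standard ROS talker/listener tutorial:

    #!/usr/bin/env python
    # talker.py: a minimal node that publishes messages on a topic.
    import rospy
    from std_msgs.msg import String

    rospy.init_node('talker')
    pub = rospy.Publisher('chatter', String)
    rate = rospy.Rate(1)  # one message per second
    while not rospy.is_shutdown():
        pub.publish(String('hello from the talker'))
        rate.sleep()

and a matching listener that subscribes to the same topic:

    #!/usr/bin/env python
    # listener.py: a minimal node that prints whatever the talker sends.
    import rospy
    from std_msgs.msg import String

    rospy.init_node('listener')
    rospy.Subscriber('chatter', String, lambda msg: rospy.loginfo(msg.data))
    rospy.spin()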
This past week I have been busy learning ROS. I installed Ubuntu on my laptop (since ROS runs in Linux) and then installed ROS. I have been exploring the many tutorials and looking at the inner workings of ROS. There is quite a bit to the system, a lot of which still doesn't quite make sense. Also, it seems that most of the code used is in Python, so that will be something interesting to learn when I start looking at the Turtlebot code. I was able to successfully install the Turtlebot stack and can now control the iRobot using ROS and a keyboard. This consumed most of my weekend, as I encountered many difficulties using ROS with the USB cable to the iRobot (I ran into permission problems with the USB cable in Linux that I had not encountered in Windows 7). Anyway, I was successful and will be showing this to the class tomorrow in the demonstration (I sure hope it works). Oh, and I also got a laptop this week from Dr. Pankratz which I will use for the project.
Tasks for the week
- Continue looking at ROS and how I may be able to use other packages/stacks. - DONE
- Look at the Turtlebot stack to understand how it works and all of its components. - DONE
Some Success!
2/2/2013
I received the robot this past week along with a webcam. I looked at the Open Interface Manual for the iRobot Create and mostly understand how the robot can be controlled and made to do different things by sending different byte codes. I was able to use Bluetooth and Alex Popov's project to drive the robot from my laptop, although it took quite some time (I found out the Bluetooth does not work with Windows 8, so I have to use Windows 7 instead). It was great to see the robot "running" around my room under my control (too bad the vacuum part isn't attached). I also wrote my philosophy statement, and my website is now live! In addition, I created a Gantt chart that is also on the website, and I updated my Timeline.
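For the curious, the Open Interface really is just opcode bytes over the serial link. Here is a hedged sketch using pyserial; the port name is an assumption for my setup, and the opcodes (128 start, 131 safe mode, 137 drive) come from the Open Interface Manual:

    #!/usr/bin/env python
    # Sketch: drive the Create over serial with Open Interface opcodes.
    # The port name is an assumption; the Create's default baud is 57600.
    import serial
    import struct
    import time

    ser = serial.Serial('/dev/ttyUSB0', 57600)
    ser.write(struct.pack('B', 128))  # START: enter the Open Interface
    ser.write(struct.pack('B', 131))  # SAFE mode: allow drive commands
    time.sleep(0.2)

    # DRIVE (opcode 137): signed 16-bit velocity (mm/s) and radius (mm),
    # high byte first; the special radius 0x8000 means "drive straight."
    ser.write(struct.pack('>Bhh', 137, 200, -32768))  # 200 mm/s, straight
    time.sleep(2)
    ser.write(struct.pack('>Bhh', 137, 0, -32768))    # stop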
This week I also emailed Alex Popov and received some great information back about using SLAM with the Kinect (to map out the robot's environment), about ROS (Robot Operating System), which I may be able to use for the control drivers and the "boring" parts of using the robot, and other useful tidbits. I will be researching quite a bit this week to find the best way to accomplish my project. I also need to look into possibly using the Kinect; we will see what I find out this week.
Tasks for the week
- Research the SLAM and SURF algorithms as well as ROS to see what would be best to use on this project. - DONE (will use ROS and a Turtlebot)
- Decide whether or not to use the Kinect with SLAM. - DONE using Kinect with SLAM and a Turtlebot
- Look for a new laptop? (my screen is starting to give out on me). - DONE
The Project Begins...
1/27/2013
I received my project this past Tuesday (1/22/2013) and have started to research different aspects of the requirements. I am also starting to create some type of outline to follow as this semester progresses. This website is supposed to be live on Thursday, so I should be good to go. I have a current resume available, but I still need to write a philosophy statement.
Tasks for this week:
- Create a "Hello World" project so I actually have something started that I can use. (I am thinking of using a Windows Forms application) - STARTED
- Look at Alex's project to see how to interact with the robot. - DONE
- Finish the first iteration of my website and write my philosophy statement. - DONE
- Continue researching the SURF algorithm and how I might be able to use it. - DONE (not using)
- Get the robot/sensors. - DONE
- Create an outline. - DONE