Original Activity Bot 360 setup

Activity Bot 360 Info

The Activity Bot 360 is a modular robot from Parallax. The robot's main component is the Propeller Activity Board, which lets the robot send signals to and receive signals from various sensors, so the robot can be controlled through programming. When we attach sensors, they can update different variables in our program so that the wheels can adapt to the surroundings. This is similar to how self-driving cars work, but at a much simpler scale.
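That sense-then-drive pattern is the core of every program on this page. The following is not one of our test programs, just a minimal sketch of the idea: a sensor reading updates a variable, and that variable sets the wheel speed (it assumes the ultrasonic sensor on pin 8, the same wiring used in the programs below).

// ------ Libraries and Definitions ------
#include "simpletools.h"
#include "abdrive360.h"
#include "ping.h"

// ------ Main Program ------
int main()
{
  while(1)
  {
    int distance = ping_cm(8);                    // sensor reading updates a variable...
    int speed = (distance > 64) ? 64 : distance;  // ...and the variable decides the wheel speed
    drive_speed(speed, speed);                    // the closer the obstacle, the slower the robot
    pause(50);
  }
}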
Ultrasonic Distance Sensor Test
The ultrasonic distance sensor measures the time between sending an ultrasonic pulse and receiving its echo to calculate distance. This sensor gives fast and accurate distance measurements of whatever is in front of it, and it let us test a type of sensor that is actually used on real self-driving cars (mostly for close-range information). The program we used to create the test video runs a loop of forward movement until something comes within 30 centimeters. When something is that close, the robot turns in a random direction until it gets an ultrasonic reading that is far enough away.
Video below: https://youtu.be/h8xeGOR889s
Program:
// ------ Libraries and Definitions ------
#include "simpletools.h"
#include "abdrive360.h"
#include "ping.h"

// ------ Global Variables and Objects ------
int turn;

// ------ Main Program ------
int main()
{
  drive_setAcceleration(FOR_SPEED, 300);
  while(1)
  {
    drive_speed(64, 64);                 // drive forward
    while ((ping_cm(8) >= 30))           // until something is within 30 cm
    {
      pause(5);
    }
    drive_speed(0, 0);                   // stop at the obstacle
    turn = (random(1, 2));               // pick a random turn direction
    if (turn == 1)
    {
      drive_speed(64, -64);              // spin right
    }
    else
    {
      drive_speed(-64, 64);              // spin left
    }
    while ((ping_cm(8) < 30))            // keep turning until the path is clear
    {
    }
    drive_speed(0, 0);
  }
}
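The distance that ping_cm() reports comes from timing the echo. The following is not part of our test program, just a minimal sketch of that calculation, assuming the library's ping() call returns the round-trip echo time in microseconds; it prints the timed echo, the distance computed from the speed of sound, and the library's own ping_cm() value for comparison.

// ------ Libraries and Definitions ------
#include "simpletools.h"
#include "ping.h"

// ------ Main Program ------
int main()
{
  while(1)
  {
    int tEcho = ping(8);               // round-trip echo time in microseconds
    int cm = tEcho * 343 / 20000;      // distance = time * speed of sound / 2
                                       // (343 m/s = 0.0343 cm per microsecond)
    print("echo = %d us, computed = %d cm, ping_cm = %d cm\n",
          tEcho, cm, ping_cm(8));
    pause(200);
  }
}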
ColorPAL Color Sensor Test

The ColorPAL color sensor uses an RGB LED together with a broad-spectrum light-to-voltage converter to determine an approximate color value. This sensor is slow and must be very close to the surface for an accurate reading, so it had to be mounted underneath the board, only a few inches from the ground. Overall, the sensor is not really practical for real self-driving conditions, but it does let us demonstrate how color is another essential piece of information for self-driving. In the program we used, different colors on the ground/road change how the robot moves.
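The match thresholds in the program below (150 to 170) depend on the lighting and the exact colors used, so it helps to see what the sensor actually reports first. The following is not one of our test programs, just a minimal calibration sketch that reuses the same colorpal/colormath calls, pin, and reference colors as the main program and prints the raw readings alongside the match scores.

// ------ Libraries and Definitions ------
#include "simpletools.h"
#include "colorpal.h"
#include "colormath.h"

// ------ Global Variables and Objects ------
colorPal *cpal2;

// ------ Main Program ------
int main()
{
  cpal2 = colorPal_open(2);                      // same signal pin as the main program
  int r = 0, g = 0, b = 0;
  while(1)
  {
    colorPal_getRGB(cpal2, &r, &g, &b);          // raw red/green/blue readings
    int color = colorPalRRGGBB(r, g, b);         // pack into a 0xRRGGBB value
    print("R=%d G=%d B=%d  red=%d green=%d yellow=%d\n",
          r, g, b,
          compareRRGGBB(color, 0xcc0000),        // how closely the reading matches red
          compareRRGGBB(color, 0x009900),        // ...green
          compareRRGGBB(color, 0xcccc00));       // ...yellow
    pause(500);
  }
}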
Red = Stop, Yellow = ½ Speed, Green = Start/Speed Up
Video below: https://youtu.be/m1sjJe9geJs
Program:
// ------ Libraries and Definitions ------
#include "simpletools.h"
#include "abdrive360.h"
#include "colorpal.h"
#include "colormath.h"

// ------ Global Variables and Objects ------
colorPal *cpal2;
int cpRR = 0;
int cpGG = 0;
int cpBB = 0;
int Color;
int Left;
int Right;
int Pause;

// ------ Function Declarations ------
void Set_Speed();
void Color_Detection();

// ------ Main Program ------
int main()
{
  cpal2 = colorPal_open(2);
  drive_setAcceleration(FOR_SPEED, 300);
  while(1)
  {
    Color_Detection();
    Set_Speed();
  }
}

// ------ Functions ------
void Set_Speed()
{
  drive_speed(Left, Right);
  pause(Pause);
}

void Color_Detection()
{
  colorPal_getRGB(cpal2, &cpRR, &cpGG, &cpBB);
  Color = colorPalRRGGBB(cpRR, cpGG, cpBB);
  if ((compareRRGGBB(Color, 0x009900)) > 160)        // Green = Speed Up/Start
  {
    Left = 40;
    Right = 40;
    Pause = 1000;
  }
  else if ((compareRRGGBB(Color, 0xcc0000)) > 170)   // Red = Stop
  {
    Left = 0;
    Right = 0;
    Pause = 1000;
  }
  else if ((compareRRGGBB(Color, 0xcccc00)) > 150)   // Yellow = Slow to 1/2 speed
  {
    Left = 20;
    Right = 20;
    Pause = 1000;
  }
}
Multi-Sensor Test

Using both sensors, we can give the robot feedback about when to turn as well as the ability to adjust its speed based on color. Real self-driving technology usually relies on three main sensor types (cameras, radar, and lidar) plus many supporting sensors, which makes the ability to combine multiple sensors vital. In our program, the distance sensor tells the robot when to initiate a turn and the color sensor tells it which direction to turn. This allows the robot to make a full loop around the “road”.
Green = right turn, Red = left turn
Video below: https://youtu.be/qmW0UoP2DEw
Program:

// ------ Libraries and Definitions ------
#include "simpletools.h"
#include "abdrive360.h"
#include "colorpal.h"
#include "ping.h"
#include "colormath.h"

// ------ Global Variables and Objects ------
colorPal *cpal2;
int cpRR = 0;
int cpGG = 0;
int cpBB = 0;
int color;

// ------ Main Program ------
int main()
{
  cpal2 = colorPal_open(2);
  drive_setAcceleration(FOR_SPEED, 300);
  while(1)
  {
    drive_speed(40, 40);                              // drive forward
    while ((ping_cm(8) >= 20))                        // until something is within 20 cm
    {
      colorPal_getRGB(cpal2, &cpRR, &cpGG, &cpBB);    // keep reading the road color while driving
      color = colorPalRRGGBB(cpRR, cpGG, cpBB);
      pause(5);
    }
    drive_speed(0, 0);                                // stop at the obstacle
    if ((compareRRGGBB(color, 0x00ff00)) > 165)       // green patch = turn right
    {
      drive_speed(80, -80);
    }
    else if ((compareRRGGBB(color, 0xcc0000)) > 150)  // red patch = turn left
    {
      drive_speed(-80, 80);
    }
    pause(200);
    while ((ping_cm(8) < 20))                         // keep turning until the path is clear
    {
    }
    drive_speed(0, 0);
  }
}

Final Activity Bot 360

Robot Versus Real-Life

In the real world, sensors need to work at high speeds, on icy roads, during foggy nights, and when the sun is shining directly into a camera lens. The number of scenarios a car's program would need to adapt to, along with the unpredictability of millions of miles of road, shows how complex reaching full autonomy is. The Activity Bot 360 allowed us to explore how this works on a smaller scale. Both self-driving cars and the robot use sensors that interact with the environment to give feedback to a computer, which, through code, controls the wheels/movement. At the end of the day, that is all Level 1 self-driving is.