Journals

2/4

Website is beginning to come together in a primitive form.  I met with a student who did work on the Create last semester to get a better understanding of the robot's capabilities.  I also searched online for possible accessories for the Create.  The only sensor from the manufacturer I was able to find was a light sensor, and there was little documentation on how effective it is.  It seems others who have used this robot for projects have attached third-party sensors and were able to communicate with them.  My ultimate goal is to NOT have laptops attached to the two robots; however, as of right now I am not sure how to manage that.

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

2/8

I was finally able to read data from the sensors on the robot. I learned that the robot only sends data back when it is told to. Once the robot is told to send data, I just wait for the event to be received on the COM port and then I can interpret it. I have not spent much time deciphering the messages yet. They have a distinct pattern to them; it will just take time to determine what it all means.
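
For reference, the request itself is only a couple of bytes.  Here is a minimal sketch of the "tell the robot to send data" step, assuming a hypothetical sendByte() wrapper around whatever serial write the host program uses; the opcode and packet ID come from the Create Open Interface manual.

  sendByte(142);   // Open Interface "Sensors" opcode: request one sensor packet
  sendByte(7);     // packet ID to return (7 = bumps and wheel drops, as one example)

The robot then answers with the bytes of that packet, which is what shows up as the COM port event.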

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

2/9

I continued work decoding the information I was receiving. The data the robot sends back is not as consistent as I had hoped. The one bright side today was that I was able to convert the data into integer form, which allows me to check what it is saying against the chart in the manual. It updates what it sends every 15 ms and I was checking every 500 ms. Even when the robot was stationary these values were extremely dynamic. I feel like the timing of when I interpret these events is going to be crucial, and most likely the first response received after polling will be the most important. Only time will tell.
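
As a note on the "integer form" conversion: many of the Open Interface sensor packets are two bytes long, sent high byte first, so the bytes have to be stitched back together before they mean anything.  A minimal sketch, assuming the two bytes have already been read into variables named highByte and lowByte (names are mine, purely illustrative):

  int16_t value = (int16_t)(((uint16_t)highByte << 8) | lowByte);   // OI sends the high byte first

Single-byte packets can be used as-is; it is the multi-byte ones where the order matters.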

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

2/15

Spent a great deal of time trying to get the light sensors to work. The command module comes with an example program that uses the light sensors; if I can first get that program to work, then I can easily expand upon it. I believe I had the correct light sensor, but it also requires a transistor to work properly. I used what I thought would be an acceptable part, but no matter what I tried I did not get the correct program output. The example asked for a 10K transistor and I used an 8K. I do not have much expertise in this field, so everything is learn-as-you-go. I'll figure out exactly what I need during my next meeting.

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

2/16

I completely redid my website today. I finally have a nice template and can easily update it as I go. My web development skills are not very extensive; however, what little skill I have is quickly growing. This is still not going to be the final design. As I learn new tricks I'll certainly implement them as I go.

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

2/17

I continued work on my website today. There were many minor features that I assumed would take a short amount of time but ended up taking a great deal of it. I am learning more and more that it is important to finish things as soon as possible; the bonus to finishing quickly is the ability to go back and add better features. The small victories may not be noticeable, but they have come with a lot of time and, more importantly, a great deal of thought. I also edited my Gantt chart. I put more thought into how long everything will actually take and how things would realistically follow each other. I plan on updating it quite often as I learn my abilities.

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

2/22

After many long hours spent in the small room on the top floor of the PAC, I was finally able to get the light sensors to work properly.  This was a major breakthrough in my project.  I needed to get past this hump before I could go any further.  Now I can move on to the fun stuff: digging into the code, placing more than one sensor, and so on.  The next task will be to get two or even three sensors working independently of each other, but all contributing to a master algorithm that will give the robot the correct heading.  The code is all done in C, which I haven't had much exposure to, but it is similar enough to C++ that I should make huge strides forward in a short amount of time.  Another step to be taken soon will be to find the limitations and capabilities of the sensors.  Ideally I want to be able to follow a flashlight and eventually another robot with a light beacon attached to it.  With a greater understanding of exactly how the light sensors work, I can already imagine moving on to other sensors such as IR or ultrasonic.
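
As a first pass, the "master algorithm" could be as simple as turning toward whichever sensor sees the most light.  The sketch below is only to capture that idea; the sensor-reading and driving helpers (readLightSensor, driveStraight, turnToward) and the FRONT/LEFT/RIGHT ids are placeholder names I made up, not functions that exist in the project yet.

  // Sketch only: pick a heading from three sensors spaced 120 degrees apart.
  void headTowardLight(void)
  {
      uint8_t front = readLightSensor(FRONT);
      uint8_t left  = readLightSensor(LEFT);
      uint8_t right = readLightSensor(RIGHT);

      if (front >= left && front >= right)
          driveStraight();        // brightest reading is ahead
      else if (left > right)
          turnToward(LEFT);       // brightest reading is to the left
      else
          turnToward(RIGHT);      // brightest reading is to the right
  }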

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

3/1

The next big step for my project is to communicate with the robot via the command module.  It can be done in one of two ways: either the USB port on the command module or the built-in serial connector on the Create is capable of sending information back and forth.  The major problem I'm running into is that the loaded program does not work properly when the robot is tethered.  I have searched the code and can't find anything that tells it to behave this way.

Although it is not very smooth, I have tried letting the program run and then abruptly plugging in the USB cable.  It works for a little while and DOES appear to be sending information back; however, shortly after that it goes into some infinite loop and nothing works properly.  This is obviously not the solution.  I could possibly send the light signal back in beeps, meaning every 10 units of light could be one beep from the Create's speaker.  This is more laborious than seeing real-time readouts of the light sensors, but it is the most foolproof.  It's getting closer and closer to the point where I will give in and debug this way.  It is important to know the readings of the light sensors, but this task is drastically hampering my progress.  If no large improvements are made in the next few days, this is the route I'm going to take.
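
If it comes to that, the beep debugging wouldn't take much code.  Below is a rough sketch of the idea, assuming the byteTx() helper from the command module example code; the opcodes (140 to define a one-note song, 141 to play it) come from the Open Interface manual, and the delay helper's exact name is an assumption on my part.

  // Sketch: sound out a light reading as one beep per 10 units of light.
  void beepValue(uint8_t light)
  {
      uint8_t beeps = light / 10;
      uint8_t i;
      for (i = 0; i < beeps; i++)
      {
          byteTx(140);    // Song opcode: define a song...
          byteTx(0);      // ...stored as song number 0...
          byteTx(1);      // ...that is one note long
          byteTx(72);     // the note to play (note number from the OI manual)
          byteTx(16);     // note duration, in 1/64ths of a second
          byteTx(141);    // Play opcode
          byteTx(0);      // play song number 0
          delayMs(500);   // assumed delay helper, so the beeps stay countable
      }
  }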

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

3/8

I began testing different settings for the serial port. There were recommendations in the command module manual that appeared to be foolproof. What I found was the complete opposite: I ran into the same kinds of problems with these settings as I did with the default ones. Because of these frustrations I am going to move forward with the robot tuning. The next step is going to be to have the robot move toward an extremely intense light source. It will not be as accurate as I intend, but it will at least give me a way to move forward. Here are the results for the serial port settings.

  UBRR0 = 19;                                          // baud-rate divisor: 57600 baud with the command module's 18.432 MHz clock
  UCSR0B = (_BV(RXCIE0) | _BV(TXEN0) | _BV(RXEN0));    // enable the receive-complete interrupt, the transmitter, and the receiver
  UCSR0C = (_BV(UCSZ00) | _BV(UCSZ01));                // 8 data bits (no parity, 1 stop bit by default)

Using the above default code does not allow the USB cord to be plugged in.  As soon as it is plugged in, the program does not run at all.  The initial series of lights that indicates the proper program is loaded immediately changes when the cord is attached.

  UBRR0 = 19;       // same baud-rate divisor as above
  UCSR0B = 0x18;    // 0x18 = TXEN0 | RXEN0: transmitter and receiver on, but no receive interrupt
  UCSR0C = 0x06;    // 0x06 = UCSZ01 | UCSZ00: 8 data bits

This setting did nothing either.  The lights changed again, this time in a different way: some of the indicator lights remained the same, but two key lights on the command module turned off.

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

3/11

I began working on the prototype design for the light sensor mounts.  I did a lot of cutting, gluing, and taping.  It may not look pretty, but it gets the job done.  I now have the light sensors mounted at three distinct points 120° apart from each other.  There are cones surrounding the light sensors in an attempt to amplify the light caught by each sensor.

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

3/12

A milestone was reached today.  I can finally run the program on the robot with the USB attached, which gives me the ability to debug (send data back).  Dr. Pankratz and I solved many little issues as well.  As great a hurdle as this was, I still have two more major hurdles in the near future.  The first will be to achieve communication with the two other ports that now have light sensors.  The second will be to CORRECTLY interpret the data being sent back.  It is great to have data flowing in, but if I can't understand it, it is as bad as having no information.

p.s. - I am now getting appropriate light levels.  This was a huge milestone.  I now know that light levels come back in the range of 0-255: I was consistently getting readings in the 40s and 50s in low light and the 120s in high light.

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

3/22

After a nice break, I finally came back to my robot.  I began dissecting the readings and figuring out what kind of light conditions produce what intensity of readings.  I ran into a problem when trying to compare light readings: I was not able to do the comparisons when the numbers were 8-bit ints, but I was able to when they were 16-bit ints.  The new problem I am running into is that the light readings are consistent, but the starting value is not.  For example, I tested the light reading at one point and it was consistently between 20 and 30.  An hour later, in the same room, with the same lights on, I had consistent readings between 190 and 210.  This is going to make writing an algorithm difficult.  I have not finalized my design for the light sensor apparatus, and I feel like that may make a difference as well.  If worst comes to worst, I may need separate algorithms that are tweaked for low-light and high-light conditions.  Either way I will have a demonstration ready for Thursday that will show my control of the robot.  It will be primitive yet powerful.
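
If the separate low-light and high-light algorithms turn out to be necessary, one simple way to pick between them would be to sample the ambient level once at startup.  This is only a sketch of that idea; readLightSensor(), FRONT, AMBIENT_SPLIT, and the threshold numbers are placeholders that would have to come out of testing.

  uint8_t ambient = readLightSensor(FRONT);   // sample the room before moving
  uint8_t turnThreshold;

  if (ambient < AMBIENT_SPLIT)
      turnThreshold = 30;    // low-light tuning: small differences matter
  else
      turnThreshold = 80;    // high-light tuning: require a bigger jump over ambient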

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

3/24

I am having lots of problems getting a simple "if" statement to work.  I am comparing numbers such as 120 and 50: I ask if 120 is greater than 50, and if so, do some data transfer.  The problem is that the "if" statement is always true, so it does the data transfer regardless of whether the number is greater or less than 50.  I have tried 8-bit and 16-bit ints.  I have not messed around with unsigned vs. signed ints, so that's the next task.

I have not tried to get the other light sensors going recently.  The demo I am preparing for Thursday will only involve one anyway, so I'm focusing my energy where it is most needed.

That's my plan as of now:  Gain control of the comparison statement and get multiple light sensors working correctly.

(short break)

I am getting slightly more control over the comparison statement now.  I have tried both signed and unsigned ints and both seem to behave in the same fashion, so I do not believe that was my problem.  I am able to do one bitwise operation on the number and it works correctly, but as soon as I do another operation on the same variable it breaks.  I am at a complete loss on this one; I hope I am missing something obvious.  For now, I'm shifting my focus to getting multiple light sensors working.

I made big progress with the multiple sensors.  I discovered a typo in the manual that had me hung up for a while.  I now know the correct bits to turn on and off for the individual light sensors to fire.

As my knowledge of the pins and I/O increases, I am finding shortcuts.  For example, the example program only reads from one input located on the top center ePort, yet it initializes every pin as an input.  The remaining seven pins are never used, but they are still set as inputs.  I was struggling to figure out exactly which bits I should set as inputs when I could have left them all as inputs and moved on.  The more I learn, the easier things are becoming.  In quick summary, what I need to do is A) leave every bit set as an input and B) tell the processor which pin to read from just before I need a reading.

Note to myself: start in the Initialize function; everything has to do with DDRC, ADMUX, or ADCSRA, or something in that vicinity.  Damn close!  Working on the top right ePort pin 2, processor pin C2!
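
For my own reference, the "tell the processor which pin to read from" step should look roughly like the fragment below.  This is pieced together from the ATmega datasheet rather than copied from the project code; channel 2 is just the example that matches processor pin C2.

  // Sketch: select an ADC channel right before taking a reading.
  ADMUX = (ADMUX & 0xF0) | (2 & 0x0F);   // keep the reference/adjust bits, select channel 2 (pin C2)
  ADCSRA |= _BV(ADSC);                   // start a single conversion
  while (ADCSRA & _BV(ADSC))             // wait for the conversion to finish
      ;
  uint8_t reading = ADCH;                // with left-adjusted results this is the 0-255 value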

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

3/30

Finally got the "if" statement working properly.  Getting it to work required some odd manipulation on my part.  For example, the "if" statement was always true when I did

if (light_now > 100){
//code
}

Whether the light coming in was over 100 or under 100, it still ran the code inside the statement.  However, when I declared a variable it worked correctly:

uint8_t temp = 100;
if (light_now > temp){
//code
}

The if statement worked as it should.  The code ran when the reading was above 100 and did not when it was under 100.  These were the ONLY changes I made to the code.  Apparently, if the compiler doesn't know what type the number should be, it accepts anything.

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

4/2

Finally got all three sensors working at once.  I meticulously followed every command related to reading a sensor and found that I had been testing the correct pin the entire time; my problem was that the pin was in a different physical location than I thought.  I had to change the configuration of the transistor and light sensor.  It was a hardware issue, not a software issue.  This was a huge step in my project.  I can now start building the algorithm that will be my final project.

I have many designs in my head, but I won't know anything for sure until I start experimenting with the code.  Another thing I can now start is the final build of the robot, meaning final positions for all light sensors.  The current robot is a prototype for the final build.  The final robot will be similar to the prototype, but cleaned up considerably.

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

4/7

Had trouble concentrating today.  Got a few modules written and did a lot of commenting.  That was about all my brain wanted to do today.

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

4/8

Finally got all the hardware I need for my final build and began constructing the final mount.  I still moved ahead in the code and did more testing of the DCP-inspired algorithm.  I wrote a few more modules as well.  Everything seems to be going great at this point.  Now I need to find the limitations of the light sensors, meaning how reliable they are given the lighting conditions.  From this point on, there will be a lot of trial and error.

Right now I am running into errors similar to before.  I am having difficulty using the standard operators (+, -, *, /).  The algorithm is correct, but I can't get it into the computer correctly.  It does not like an expression such as variable = a + (b/(b+c))*120.  There is nothing wrong with my syntax; it is something with the language itself.
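
My current guess (not confirmed yet) is integer division: with 8- or 16-bit ints, b/(b+c) truncates to 0 whenever b is smaller than b+c, so the whole term disappears.  Doing the multiplication before the division keeps everything in whole numbers; a, b, and c below just stand in for the real sensor values.

  // Sketch: same formula with the multiply done first, so the ratio is not
  // truncated to zero by integer division.  The cast keeps b * 120 from
  // overflowing an 8-bit type.
  uint16_t result = a + ((uint16_t)b * 120) / ((uint16_t)b + c);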

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

4/16

It has been a while since I wrote a journal entry, but that does not mean I have not been working on the robot.  I have put several hours into coding and debugging every day for the past week and overcame many hurdles in the process.  The majority of my problems came in the form of mismatched types, which I solved with a whole slew of type casting.  With those problems solved I was able to go full force on my algorithm.  I have some awesome news...  the robot successfully found a stationary light source!  There are a few things that still get it hung up.  It functions best when there are no objects around aside from the light source; surrounding objects can cast reflected light, which the sensors love to pick up.

The next step is to fine-tune my algorithm.  With the speed at which progress is being made, I feel I will have this done as early as tomorrow.  After that I need to obtain a light, hopefully a wireless one, and a VERY bright one.  The brighter the light source, the more easily the robot picks up the difference in light; the greater the contrast, the better the algorithm works.  For it to work efficiently in a brightly lit room, the attached light would have to be of very high intensity.  I'll go shopping tomorrow and see what I come up with.

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

4/19

Continued work on my algorithm. It's extremely close to working correctly; I'm just ironing out the last details. I also had to change how the sensors are mounted to the robot. I no longer use the cones around the sensors, because they were blocking too much light. The three sensors are still positioned at 120-degree intervals, but now stand on their own, pointed outward and slanted toward the middle. This allows two sensors to be hit directly by the light. I got much more consistent results with this modification.

I also went out and bought a battery-powered lantern, but it wasn't bright enough. I believe I am now going to have to use a corded light and make sure the cord doesn't get in the way of the robot. It's the most foolproof way to run a test.

I have the bump sensor working in the algorithm as well. The part I am struggling with the most is when I tell the robot to scan for the brightest light source; I have not found a consistent way to find the maximum. I will figure this little bug out through a lot of testing.
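
The way I keep picturing the scan is to spin slowly, sample over and over, and remember when the brightest reading happened.  Below is only a sketch of that idea; every helper in it (spinInPlace, readLightSensor, stopDriving, turnForMs, delayMs) and the step counts are placeholders, not the real code.

  // Sketch: rotate in place, track the brightest reading and when it was seen,
  // then turn back to that heading.
  void scanForBrightest(void)
  {
      uint8_t best = 0;
      uint16_t bestStep = 0;
      uint16_t step;

      spinInPlace();                        // start a slow rotation
      for (step = 0; step < 200; step++)    // roughly one full turn's worth of samples
      {
          uint8_t reading = readLightSensor(FRONT);
          if (reading > best)
          {
              best = reading;               // remember the brightest level seen...
              bestStep = step;              // ...and when in the spin it was seen
          }
          delayMs(15);                      // short pause between samples
      }
      stopDriving();
      turnForMs(bestStep * 15);             // rotate back toward where the max was seen
  }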

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

4/23

I finally got the robot detecting bumps.  I had to use a combination of the Open Interface manual, the command module manual, and the existing code.  At first I thought there was a bug in the existing code (there still may be, but I'm not completely sure).  Once I set that possibility aside, I worked exclusively from the Open Interface manual for my numbers.

Existing code: I used a few modified functions from the existing code and called them in places similar to the old version (I made a new delay-and-update-sensors routine).

Command module manual: found out I needed the function byteRx(); it is used to read the data after I request it.

Open Interface manual: this is where I got all my reference numbers, like packet IDs and opcodes.

Here's the code that matters...


// Note: timer_on, timer_cnt, byteTx(), byteRx(), and the CmdSensors opcode
// constant all come from the existing command module example code.
uint8_t sensevalue;

void delayAndUpdateSensors2(uint16_t time_ms);

int main(void)
{
    delayAndUpdateSensors2(20);

    if (sensevalue > 1)
    {
        sensevalue = 0;
        // do algorithm
    }
    else
    {
        // do other part of algorithm
    }
} // end main

void delayAndUpdateSensors2(uint16_t time_ms)
{
    timer_on = 1;              // the example code's timer interrupt counts timer_cnt down and clears timer_on
    timer_cnt = time_ms;
    while (timer_on)
    {
        byteTx(CmdSensors);      // opcode to request sensor info
        byteTx(7);               // packet ID 7: bumps and wheel drops
        sensevalue = byteRx();   // store the returned byte in the global sensevalue
    }
}
