Final Post

5/11/2021

Well, this is it. This is the last blog post. A lot has happened since my last post. For one, I have completed my project: all of the functionality I wanted to get into it is implemented. There is of course still more that I could do if I were to continue working on this project, but that will always be the case. For another, I gave the actual presentation of my project. I thought that it went very well! My classmates were very supportive, as were the faculty, and I got a lot of good questions and kind words of encouragement from everyone.

I don’t have much more to say. This capstone class has probably been my favorite course. The opportunity to build an application (and an interesting one for that matter) from the ground up by myself was awesome. The support that I had not only from my peers but also from the incredible faculty was amazing. Special shoutout to Dr. Pankratz for his incredible Mona Lisa and his interest in my progress, to Dr. Diederich for being the one who suggested I get this project and helping me throughout, and of course to Dr. McVey for being an unending well of support and knowledge. Thanks for everything. I can’t wait to see what comes next.

Things are Coming Together

4/17/2021

The method I’m using for saving and loading settings is fully implemented at this point. I have a few more things to add to the UI, such as a quick view of the current user’s settings in the main menu and a rework of some of the button sizes.

The UI with the current user labeled and a dropdown menu to change it
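For the curious, the save/load mechanism boils down to something like the following sketch. The field names and the key=value file format here are illustrative, not necessarily my exact code: a small settings class that writes its values to a file and reads them back, falling back to defaults if no file exists.

```csharp
using System;
using System.IO;

public class UserSettings
{
    // Illustrative settings matching the app's features.
    public int Sensitivity { get; set; } = 5;
    public bool DarkMode { get; set; } = false;
    public string ClickKey { get; set; } = "Space";

    public void Save(string path)
    {
        File.WriteAllLines(path, new[]
        {
            $"Sensitivity={Sensitivity}",
            $"DarkMode={DarkMode}",
            $"ClickKey={ClickKey}",
        });
    }

    public static UserSettings Load(string path)
    {
        var settings = new UserSettings();
        if (!File.Exists(path)) return settings; // fall back to defaults

        foreach (var line in File.ReadAllLines(path))
        {
            var parts = line.Split('=');
            if (parts.Length != 2) continue; // skip malformed lines
            switch (parts[0])
            {
                case "Sensitivity": settings.Sensitivity = int.Parse(parts[1]); break;
                case "DarkMode": settings.DarkMode = bool.Parse(parts[1]); break;
                case "ClickKey": settings.ClickKey = parts[1]; break;
            }
        }
        return settings;
    }
}
```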

The next thing I really need to work on is developing test applications that will evaluate the precision and accuracy of my application. This is something I have been putting off in favor of working on the application itself but, seeing as we are getting pretty close to presentations at this point, I really should get some of these tests developed.
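To give an idea of what I have in mind, here is a rough and entirely hypothetical sketch of such a test: show a target at a known position, collect gaze samples while the user stares at it, then report accuracy (mean offset from the target) and precision (spread of the samples around their own centroid).

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class TrackingTest
{
    // Returns (accuracy, precision) in the same units as the input points.
    public static (double Accuracy, double Precision) Evaluate(
        (double X, double Y) target, List<(double X, double Y)> samples)
    {
        // Accuracy: average distance from each sample to the known target.
        double accuracy = samples.Average(s =>
            Math.Sqrt(Math.Pow(s.X - target.X, 2) + Math.Pow(s.Y - target.Y, 2)));

        // Precision: root-mean-square distance of samples from their centroid,
        // i.e. how tightly the samples cluster regardless of where they landed.
        double cx = samples.Average(s => s.X);
        double cy = samples.Average(s => s.Y);
        double precision = Math.Sqrt(samples.Average(s =>
            Math.Pow(s.X - cx, 2) + Math.Pow(s.Y - cy, 2)));

        return (accuracy, precision);
    }
}
```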

Walkthroughs

4/1/2021

Today was my day to present a walkthrough of the progress I have made so far to my fellow classmates and professors. In preparation for this presentation, I added the most important feature imaginable to my project… a dark mode setting. Finally, now it may actually be usable!

The all important dark mode setting
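For anyone wondering how a dark mode setting might look in WinForms, here is a minimal sketch. The helper is hypothetical, not my exact implementation: walk the control tree and swap the colors.

```csharp
using System.Drawing;
using System.Windows.Forms;

public static class Theme
{
    // Recursively applies dark or light colors to a form and all its controls.
    public static void Apply(Control root, bool dark)
    {
        root.BackColor = dark ? Color.FromArgb(32, 32, 32) : SystemColors.Control;
        root.ForeColor = dark ? Color.WhiteSmoke : SystemColors.ControlText;
        foreach (Control child in root.Controls)
            Apply(child, dark);
    }
}
```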

I thought the walkthrough went very well. I really liked seeing what everyone else had accomplished so far, as well as showing everyone the progress I had made. I got a lot of great feedback on adjustments and improvements I could make to my project. I really appreciate the suggestions and am looking forward to implementing them soon.

I’m ‘Click’ed off

3/25/2021

I met with Dr. McVey, Dr. Pankratz, and Dr. Diederich today to show what I had developed so far. While they enjoyed using it for themselves, and I did get a great picture of some art Dr. Pankratz created, I was unsatisfied with how the click functionality worked (or, more precisely, didn’t work). They also had notes for me regarding the UI, which I will certainly have to work on, but I am prioritizing getting the click functionality to work well.

Dr. Pankratz’s ‘Mona Lisa’

I was running into a lot of issues with getting signed out of Windows when I clicked on the Windows Form. I was very confused as to why this was occurring. After looking into some documentation, I think the issue is that I was handling mouse interrupts from within the form, which causes conflicts because Windows has its own way of dealing with mouse input. I still don’t fully understand the issue, but I found that most people avoided it by running the Windows Form on its own thread. To do that, I had to look into threads and how to use them. I finally got it working and now, from my preliminary testing, the clicking works much better than it had.
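In case it helps anyone hitting the same problem, the fix boils down to something like this sketch (MainForm is a stand-in name for my form): start the form on a dedicated STA thread with its own message loop, separate from the thread doing the mouse and gaze processing.

```csharp
using System.Threading;
using System.Windows.Forms;

// Stand-in for the project's actual form.
class MainForm : Form { }

class Program
{
    static void Main()
    {
        // Run the form on its own thread with its own message loop.
        var uiThread = new Thread(() => Application.Run(new MainForm()));
        uiThread.SetApartmentState(ApartmentState.STA); // WinForms requires STA
        uiThread.Start();

        // ... gaze tracking and mouse handling continue on this thread ...

        uiThread.Join(); // wait for the form to close before exiting
    }
}
```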

Feeling Sensitive

3/16/2021

Since the last update, I have implemented a sensitivity adjustment module in both my tracking code and my Windows Forms application. To accomplish this, I had to reimplement my smoothing logic as a class so I could more easily adjust the bin size and, with it, the sensitivity of the tracking. The tracking sensitivity is directly linked to the size of the bin: the bigger the bin, the less the averaged position reacts to any single new data point.
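Roughly, the refactored smoother looks something like this sketch (class and member names are illustrative, not my exact code): a queue of recent samples whose maximum size, the bin size, can be changed at any time.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public class GazeSmoother
{
    private readonly Queue<(float X, float Y)> bin = new Queue<(float X, float Y)>();
    private int binSize;

    public GazeSmoother(int initialBinSize)
    {
        binSize = Math.Max(1, initialBinSize);
    }

    // Bigger bin = smoother but less responsive; smaller bin = more sensitive.
    public int BinSize
    {
        get => binSize;
        set
        {
            binSize = Math.Max(1, value);
            while (bin.Count > binSize) bin.Dequeue(); // shrink immediately
        }
    }

    // Add a new gaze sample and return the smoothed (averaged) position.
    public (float X, float Y) AddSample(float x, float y)
    {
        bin.Enqueue((x, y));
        if (bin.Count > binSize) bin.Dequeue();
        return (bin.Average(p => p.X), bin.Average(p => p.Y));
    }
}
```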

I also added a slider to the Windows Form so the user can adjust how sensitive the tracking is on the fly. Next I want to work on clicking: letting the user set a button to click with, as well as a way for them to save their settings.
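Wiring the slider to the smoother is then just a matter of handling the TrackBar’s value changes. This snippet builds on the GazeSmoother sketch above; the control and field names are stand-ins for mine.

```csharp
using System.Windows.Forms;

public partial class MainForm : Form
{
    private readonly GazeSmoother smoother = new GazeSmoother(5);
    private readonly TrackBar sensitivityTrackBar = new TrackBar();

    private void InitSensitivitySlider()
    {
        sensitivityTrackBar.Minimum = 1;   // most responsive
        sensitivityTrackBar.Maximum = 20;  // smoothest
        sensitivityTrackBar.Value = smoother.BinSize;
        sensitivityTrackBar.ValueChanged += (sender, e) =>
            smoother.BinSize = sensitivityTrackBar.Value; // adjust on the fly
        Controls.Add(sensitivityTrackBar);
    }
}
```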

A look at what my WinForms application currently looks like

I Can Click

3/10/2021

After meeting with my professors, we have come to the conclusion that I will be allowed to use one button on the keyboard to accomplish clicking tasks. There are a couple of reasons for this. First, it would be very difficult to implement clicking using only the technology at my disposal. The first idea was to use blinking, which would be ineffective because when a person blinks, their gaze moves downward (or at least that is how the tracker registers it). That would make it very difficult to click accurately. Another idea was to use winks. Unfortunately, not everybody is able to wink and, even if we did try to implement that, the API I am using does not let me track individual eyes, only the combined gaze. The second reason we decided to allow a single button is that there are already many interfaces that let users with motor disabilities press at least one button.

After we decided on that, I got to work on actually implementing it in my project. I eventually figured out a way to watch for a specific key press and send a click at the mouse’s current position. There was a lot of debugging to do, but I am now able to set a keyboard hotkey to use as a left mouse click instead of the actual left mouse button.
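The mechanism looks roughly like the following sketch (not my exact code; the space bar is just an example key): register a global hotkey with the Win32 RegisterHotKey function, watch for WM_HOTKEY in the form’s WndProc, and synthesize a left click at the cursor’s current position with mouse_event.

```csharp
using System;
using System.Runtime.InteropServices;
using System.Windows.Forms;

public class HotkeyClickForm : Form
{
    [DllImport("user32.dll")]
    static extern bool RegisterHotKey(IntPtr hWnd, int id, uint fsModifiers, uint vk);
    [DllImport("user32.dll")]
    static extern bool UnregisterHotKey(IntPtr hWnd, int id);
    [DllImport("user32.dll")]
    static extern void mouse_event(uint dwFlags, uint dx, uint dy, uint dwData, UIntPtr dwExtraInfo);

    const int WM_HOTKEY = 0x0312;
    const uint MOUSEEVENTF_LEFTDOWN = 0x0002;
    const uint MOUSEEVENTF_LEFTUP = 0x0004;
    const int HOTKEY_ID = 1;

    public HotkeyClickForm()
    {
        // 0x20 is the virtual-key code for the space bar; no modifier keys.
        RegisterHotKey(Handle, HOTKEY_ID, 0, 0x20);
    }

    protected override void WndProc(ref Message m)
    {
        if (m.Msg == WM_HOTKEY && (int)m.WParam == HOTKEY_ID)
        {
            // Press and release the left button at the cursor's current position.
            mouse_event(MOUSEEVENTF_LEFTDOWN, 0, 0, 0, UIntPtr.Zero);
            mouse_event(MOUSEEVENTF_LEFTUP, 0, 0, 0, UIntPtr.Zero);
        }
        base.WndProc(ref m);
    }

    protected override void OnHandleDestroyed(EventArgs e)
    {
        UnregisterHotKey(Handle, HOTKEY_ID);
        base.OnHandleDestroyed(e);
    }
}
```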

Smooth as Butter

3/1/2021

After a good amount of work, I successfully implemented a smoothing algorithm that helps the eye tracking stick to the user’s gaze more closely. The eye naturally makes sudden jitters, so tracking the precise point the eye is looking at and always mapping the mouse to that position causes the cursor to jump around, making the application very difficult to use. That is why I had to implement a smoothing algorithm.

Implementing this algorithm was somewhat difficult. Instead of setting the mouse to wherever the user is currently looking, I keep a rolling average of the last five positions the user was looking at and move the mouse there. After debugging this implementation, I immediately noticed the improved user experience. I will continue to brainstorm better algorithms that work efficiently and consistently.
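The core idea fits in a few lines. In this sketch (names are illustrative, and the conversion from gaze data to pixel coordinates is assumed to happen elsewhere), each new sample goes into a queue of five and the cursor moves to the average:

```csharp
using System;
using System.Collections.Generic;
using System.Runtime.InteropServices;

public static class GazeSmoothing
{
    [DllImport("user32.dll")]
    static extern bool SetCursorPos(int x, int y);

    const int WindowSize = 5;
    static readonly Queue<(int X, int Y)> recent = new Queue<(int X, int Y)>();

    // Called once per gaze sample (already converted to pixel coordinates).
    public static void OnGazePoint(int x, int y)
    {
        recent.Enqueue((x, y));
        if (recent.Count > WindowSize)
            recent.Dequeue(); // drop the oldest sample

        int sumX = 0, sumY = 0;
        foreach (var p in recent) { sumX += p.X; sumY += p.Y; }
        SetCursorPos(sumX / recent.Count, sumY / recent.Count);
    }
}
```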

Fun with Forms

2/23/2021

Over the last week I created my PERT diagram, which gives me a basic outline for the flow of my project and some goals to try to meet. I also started working with Windows Forms a bit to learn how to use it and decide whether I could use it to build the UI for my project. After some experimentation, I was able to migrate my project over and get a basic UI layout that lets the user toggle the eye tracking on or off with a button. Additionally, I have another button that opens an options menu, which I have not done much work on yet. Getting this UI up and running (even though it is incredibly basic) makes me feel like I am making a lot of progress, as I now have something very clear to show for my work.
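The toggle itself is about as simple as WinForms gets. A minimal sketch, with control names standing in for mine:

```csharp
using System;
using System.Windows.Forms;

public class TrackerForm : Form
{
    private readonly Button toggleButton = new Button { Text = "Start Tracking" };
    private bool trackingEnabled;

    public TrackerForm()
    {
        toggleButton.Click += ToggleButton_Click;
        Controls.Add(toggleButton);
    }

    private void ToggleButton_Click(object sender, EventArgs e)
    {
        trackingEnabled = !trackingEnabled;
        toggleButton.Text = trackingEnabled ? "Stop Tracking" : "Start Tracking";
        // This is where tracking would actually be started or stopped
        // (e.g., subscribing to or unsubscribing from the gaze stream).
    }
}
```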

BIG Progress

2/17/2021

We have liftoff! I have spent the last few days experimenting with the Tobii Stream Engine API and reading the documentation online (which has been invaluable, by the way; it is very well written and easy to understand). The engine works mostly off an event-based system, so you subscribe and unsubscribe to whichever streams of information are most useful to you at the time. I have not done much work with events before, but I looked at some tutorials and I think I understand them pretty well now.

Anyway, after some initial difficulty with properly configuring the libraries, I was able to get a message to pop up on screen saying my device had been recognized. That felt great, but I didn’t stop there.

Recognizing my connected device for the first time

I went on to subscribe to the gaze_point data stream and, after some fiddling, I got the coordinates of where I was currently looking to show up on screen! Now, THAT felt awesome! Tomorrow I plan on looking into how to move the mouse programmatically. We also have our mini-poster sessions tomorrow, so I am going to plan for those a bit and then call it a night.

Coordinates of where I am looking on screen as I move my eyes around
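For anyone curious what this looks like in code, here is a rough sketch of the gaze_point subscription, P/Invoking the Stream Engine C library from C#. The entry points follow the documented C API, but exact signatures vary between Stream Engine versions (newer ones add a field-of-use argument to tobii_device_create, for example), so treat this as an outline rather than my exact code:

```csharp
using System;
using System.Runtime.InteropServices;
using System.Threading;

public static class GazeDemo
{
    [StructLayout(LayoutKind.Sequential)]
    struct tobii_gaze_point_t
    {
        public long timestamp_us;
        public int validity;      // 1 == TOBII_VALIDITY_VALID
        public float position_x;  // normalized 0..1 across the display
        public float position_y;
    }

    delegate void url_receiver_t(string url, IntPtr userData);
    delegate void gaze_point_callback_t(ref tobii_gaze_point_t gazePoint, IntPtr userData);

    [DllImport("tobii_stream_engine")]
    static extern int tobii_api_create(out IntPtr api, IntPtr customAlloc, IntPtr customLog);
    [DllImport("tobii_stream_engine")]
    static extern int tobii_enumerate_local_device_urls(IntPtr api, url_receiver_t receiver, IntPtr userData);
    [DllImport("tobii_stream_engine")]
    static extern int tobii_device_create(IntPtr api, string url, out IntPtr device);
    [DllImport("tobii_stream_engine")]
    static extern int tobii_gaze_point_subscribe(IntPtr device, gaze_point_callback_t callback, IntPtr userData);
    [DllImport("tobii_stream_engine")]
    static extern int tobii_device_process_callbacks(IntPtr device);

    // Held in a field so the garbage collector doesn't free the delegate
    // while native code still holds a pointer to it.
    static gaze_point_callback_t gazeCallback;

    public static void Main()
    {
        tobii_api_create(out IntPtr api, IntPtr.Zero, IntPtr.Zero);

        // Grab the URL of the first connected tracker.
        string deviceUrl = null;
        tobii_enumerate_local_device_urls(api, (url, _) => deviceUrl = url, IntPtr.Zero);

        tobii_device_create(api, deviceUrl, out IntPtr device);

        gazeCallback = (ref tobii_gaze_point_t p, IntPtr _) =>
        {
            if (p.validity == 1)
                Console.WriteLine($"Gaze: ({p.position_x:F3}, {p.position_y:F3})");
        };
        tobii_gaze_point_subscribe(device, gazeCallback, IntPtr.Zero);

        while (true)
        {
            // Real code would block on tobii_wait_for_callbacks instead of polling.
            tobii_device_process_callbacks(device);
            Thread.Sleep(10);
        }
    }
}
```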

Delay After Delay

2/14/2021

Today I finally got to start work on my project. Over the past few days I have had lots of setbacks. First, I got the Tobii Eye Tracker 5 on Tuesday the 9th but wasn’t able to use it with my MacBook. After a day, I was able to get a lab machine that I can use for the rest of the semester. Unfortunately, when I tried to actually use the eye tracker, the Microsoft Store (which I needed in order to download a related piece of software) would not let me install anything due to a permissions issue. Finally, after taking the problem to the IT service desk, I am able to start experimenting with my project today, and I’m very excited about that.

Fresh out of the package

Laying Some Groundwork

2/7/2021

I worked a lot on my website today. I used WordPress on Knight Domains to accomplish this. The website is not completely done yet, but it gives me a spot to post these journal entries at least. Over the weekend, Dr. McVey and Dr. Pankratz ordered the Tobii Eye Tracker 5, which I plan on using as my eye-tracking hardware. While I am waiting for it to be delivered, I have been looking into the APIs that Tobii provides for interacting with their hardware. There is an SDK made for Unity, which I plan on looking into a lot because I already have some experience with Unity. There is also the Stream Engine API, a library that I could use from my choice of a couple of different languages. I will continue to look into both. I am excited to learn the ins and outs of this technology once I get my hands on the hardware, which should arrive in a couple of days.

First Post

2/3/2021

Today, at approximately 10:00 pm, we received our problem statements from Dr. McVey and Dr. Pankratz. My problem requires me to take eye-tracking technology and develop software that allows it to control the position of the mouse on screen. There are more requirements, including creating tests to measure how effective my solution is, but those are the basics.

From some basic research, the applications of eye-tracking technology mostly lie in the realms of accommodations for physically impaired individuals, tools for medical and psychological research, and potentially virtual reality technology.

My first impressions of this project are apprehension and nervous excitement. This is a monumental task for me to conceptualize, and I have many questions that need clarification before I can fully get underway. There are many potential roadblocks on the path forward, and the terrain looks difficult to navigate since I have next to no experience in this field. But I will take the problem apart piece by piece and try to forge my way forward. For now, I hope a fresh mind in the morning will bring a renewed sense of confidence and some new ideas. Goodnight.
