SNC Augmented Reality Virtual Tour
Project updates
My plan for actually implementing the augmented reality technology is to use the phone's camera: while the user points it at a nearby building, information about that building will pop up.
The time has come for me to start working with the camera. The first step was learning how to place buttons on top of the map, which ended up being as easy as switching the layout to a FrameLayout. The next step was figuring out how to access the camera from the app. That turned out to be pretty straightforward, and the documentation online proved helpful. My code appears to be working, as far as I can tell. Thanks to my logs, I know it creates a Camera object and assigns it to the layout, and I know it releases the camera when onPause is called. But the preview itself? I can't quite get it to display. I plan on taking the time to go through my CameraView class and make sure it is actually doing everything it needs to do. I will also be checking the layout to make sure the CameraView is being added correctly.
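For my own reference while debugging, here is a minimal sketch of what a preview class usually needs with the legacy android.hardware.Camera API. This is not my app's code (the class and field names are my own placeholders); it just shows the pieces that, if missing, leave you with exactly this symptom: a camera that opens and releases cleanly but never displays.

```java
import android.content.Context;
import android.hardware.Camera;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import java.io.IOException;

// Sketch of a camera preview view. The two easy things to miss are
// (1) registering the SurfaceHolder callback, and (2) only calling
// setPreviewDisplay()/startPreview() once the surface actually exists.
// A preview started before surfaceCreated() fires renders nothing.
public class CameraView extends SurfaceView implements SurfaceHolder.Callback {
    private final Camera camera;

    public CameraView(Context context, Camera camera) {
        super(context);
        this.camera = camera;
        getHolder().addCallback(this); // without this, the callbacks below never run
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        try {
            camera.setPreviewDisplay(holder); // bind camera output to this surface
            camera.startPreview();            // nothing displays until this call
        } catch (IOException e) {
            // the surface may be unavailable, or the camera held elsewhere
        }
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
        // restart the preview after rotation or resize
        camera.stopPreview();
        camera.startPreview();
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        // camera.release() happens in the Activity's onPause
    }
}
```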
It wasn't until I tried creating a geofence that I realized I had no way to test it. For some reason, my location would never update, so I could never tell whether I had entered or exited a geofence unless I started right inside one.
I am happy to say that has now been fixed. My app knows where you are, as long as you grant it permission. I have also added code that lets the app actively ask the user for permission, instead of requiring the permission without ever telling the user it needed it. I'm still trying to get my entrance events to display the way I want them to, but a lot of progress has been made.

My original plan for identifying buildings on campus was to grab the coordinates of each of the corners of the buildings. From there, I would be able to determine that anywhere between those points was part of the building. This seemed like a rather daunting task, and I wasn't sure how it was going to work.
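One thing that helped with the testing problem: the enter/exit decision behind a circular geofence is just a distance check, so it can be sanity-checked off-device with plain math before involving live location updates. This is a standalone sketch using the standard haversine formula; the coordinates and the class name are hypothetical, not taken from the app.

```java
// Off-device sanity check: is a point inside a circular geofence?
public class GeofenceCheck {
    static final double EARTH_RADIUS_M = 6_371_000.0;

    // Great-circle distance in meters between two lat/lng points (haversine).
    static double distanceMeters(double lat1, double lng1, double lat2, double lng2) {
        double dLat = Math.toRadians(lat2 - lat1);
        double dLng = Math.toRadians(lng2 - lng1);
        double a = Math.sin(dLat / 2) * Math.sin(dLat / 2)
                 + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                 * Math.sin(dLng / 2) * Math.sin(dLng / 2);
        return 2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(a));
    }

    // True if (lat, lng) falls within radiusMeters of the fence center.
    static boolean insideFence(double lat, double lng,
                               double centerLat, double centerLng, double radiusMeters) {
        return distanceMeters(lat, lng, centerLat, centerLng) <= radiusMeters;
    }

    public static void main(String[] args) {
        // Hypothetical 100 m fence around an arbitrary campus coordinate.
        double cLat = 44.5192, cLng = -88.0730;
        System.out.println(insideFence(44.5192, -88.0730, cLat, cLng, 100)); // true: center itself
        System.out.println(insideFence(44.5300, -88.0730, cLat, cLng, 100)); // false: ~1.2 km north
    }
}
```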
The student who worked on this project last year used geofences to create an area of effect around each building. I decided to look into it, since from what I remembered, it would be a lot easier than what I was planning. To my surprise, geofencing is available for free as part of the Android location APIs. And instead of requiring multiple coordinates for each building, I will only need one, plus a radius to determine how big the area of effect is. This requires me to restructure my database a little bit, but it should not be a problem. Geofences also let me trigger an event whenever the user enters or leaves an area of effect, which should allow me to pop up information about a building whenever the user gets close. It does require that I ask the user for permission to use their location, but that turned out to be an easy thing to add. My current goal is to get one working around my dorm building. From there, I will need to see what I can do to trigger an event when I enter or leave the area.

Last week, I was granted access to a database on the server. The next step was to design my tables. At this point, I have three tables: one to hold admin account information, one to hold building information, and one to hold building coordinates. The admin table is one I do not foresee changing much. The other two might change or even end up becoming one table. Currently, my plan for recognizing buildings is to store the coordinates of the corners of each building, and to treat everything within those coordinates as the building. However, I am looking into geofences to see if that would be an easier solution. The solution I choose will impact the structure of my tables. But for now, I will create the tables I have designed, insert some test data, and try to connect the Android app to the database.
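Coming back to the geofence plan: with one center coordinate and a radius stored per building, registering a fence with the Play Services geofencing API looks roughly like the sketch below. The building ID, coordinates, and radius are placeholders of mine, and the PendingIntent wiring is left out; this is a shape to aim for, not the app's actual code.

```java
import com.google.android.gms.location.Geofence;
import com.google.android.gms.location.GeofencingRequest;

public class BuildingFences {
    // Builds a circular geofence from one stored center coordinate plus
    // a radius -- exactly the two values the restructured table will hold.
    static Geofence buildFence(String buildingId, double lat, double lng, float radiusMeters) {
        return new Geofence.Builder()
                .setRequestId(buildingId)                    // keys back into the database row
                .setCircularRegion(lat, lng, radiusMeters)
                .setTransitionTypes(Geofence.GEOFENCE_TRANSITION_ENTER
                        | Geofence.GEOFENCE_TRANSITION_EXIT) // fire on both enter and exit
                .setExpirationDuration(Geofence.NEVER_EXPIRE)
                .build();
    }

    static GeofencingRequest requestFor(Geofence fence) {
        return new GeofencingRequest.Builder()
                // also fire if the user starts out already inside the fence
                .setInitialTrigger(GeofencingRequest.INITIAL_TRIGGER_ENTER)
                .addGeofence(fence)
                .build();
    }

    // Registration then goes through GeofencingClient.addGeofences(request,
    // pendingIntent), which requires ACCESS_FINE_LOCATION to be granted first.
}
```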
After I spent some time prioritizing my goals and doing some research, I put together a PERT chart to help plan the project timeline. You can view the chart below (you may need to zoom in a bit). The current estimates for each design goal build in some time for setbacks. The overall project also finishes a bit earlier than expected, so if need be, deadlines can be pushed back a little. Creating this chart has helped me define which goals go in each of the project's Scrum cycles. Now that the initial planning is done, it is time to dive in!
After three and a half years, the time has finally come: it is the capstone semester. Once this project is complete, it is on to graduation, and then on to the rest of my life. Pretty scary, right?
The project itself is the Augmented Reality SNC Tour application, but there is a good chance that if you found this blog, you already knew that. Getting this application started is the part I fear the most, but the ball is already rolling. I have decided to use Android Studio for this project. This is for a couple of reasons. For one, I am already familiar with the interface. I also know that there are a lot of packages available for Android Studio, and that might end up helping me out with the project. The next step is doing a little more research so I will be able to sketch out a rough battle plan. My goal is to be rolling by Monday.
Author: Anastasia Montavon is a senior Computer Science student at St. Norbert College, expecting to graduate in May 2018.

Archives: March 2018