Since my last update two weeks ago, I have learned a lot about how UI is done in VR applications within Unity. I was able to create the entire main menu, including difficulty adjustments and song selection. I have yet to create a pause menu, an in-game score display, and an end-of-song menu, which will be my next goal. I have fallen a little behind schedule since midterms were last week; because of them, I wasn't able to work on the project as much as I would have liked. On the bright side, my schedule is now two classes lighter, which gives me plenty of time to catch up. Below is a video showing in-game song selection and the main menu being used to tailor the game to the song.
On another note, I met with Dr. Pankratz on Thursday, March 11th to discuss beat detection. Everywhere you look for beat detection online, you're sure to encounter the Fast Fourier transform (FFT). We discussed this and the concept of breaking audio up into different frequency bands, then using the energy in those bands to determine what constitutes a beat. I told DCP about the beat detection algorithm I found and incorporated into my project. I expressed that I wasn't super pleased with how it worked, but I was happy that I had something. After this, we talked about having two separate modes: one for beat detection, which may or may not work well for certain songs, and one for beats per minute (BPM), which will spawn cubes at a constant rate for the entire song. Once I finish the rest of the UI and start working more on beat detection, adding this would be as easy as putting two radio buttons on the main menu screen and polling a game mode flag when the game runs.
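For anyone curious what the FFT/frequency-band idea looks like in practice, here's a rough sketch in Python. This is illustrative only, not the algorithm my project actually uses: the window size, the choice of band, and the 1.4x threshold are all assumptions, and real implementations tune these per song.

```python
import numpy as np

def detect_beats(samples, sample_rate, window=1024, history=43, threshold=1.4):
    """Energy-based beat detection sketch: FFT each window, sum the
    energy in a low-frequency band (roughly the bass/kick range), and
    flag a beat when that energy spikes above the recent average."""
    beats = []      # beat timestamps in seconds
    energies = []   # rolling history of recent band energies
    for start in range(0, len(samples) - window, window):
        chunk = samples[start:start + window]
        spectrum = np.abs(np.fft.rfft(chunk))
        # Lowest 1/8th of the bins: an assumed "bass" band.
        band_energy = float(np.sum(spectrum[: len(spectrum) // 8] ** 2))
        if len(energies) == history and band_energy > threshold * np.mean(energies):
            beats.append(start / sample_rate)
        energies.append(band_energy)
        if len(energies) > history:
            energies.pop(0)  # keep roughly one second of history
    return beats
```

The idea is the same one we discussed: a beat isn't a loud moment in absolute terms, it's a moment that is loud *relative to its recent neighborhood* in a particular frequency band.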
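The two-mode plan is simple enough to sketch as well. Again in illustrative Python rather than my actual Unity C#, and with the names (`GameMode`, `bpm_spawn_times`) made up for this example: BPM mode just schedules a cube every 60/BPM seconds.

```python
from enum import Enum

class GameMode(Enum):
    """Hypothetical game mode flag the menu's radio buttons would set."""
    BEAT_DETECTION = 1
    BPM = 2

def bpm_spawn_times(bpm, song_length_seconds):
    """Constant-rate spawn schedule for BPM mode:
    one cube every 60/bpm seconds for the whole song."""
    interval = 60.0 / bpm
    count = int(song_length_seconds / interval)
    return [i * interval for i in range(count + 1)]
```

At run time the game would poll the flag once: `BEAT_DETECTION` feeds cubes from the detector, `BPM` feeds them from this fixed schedule.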