Weekly Blog

The documentation of every step along my journey to create this project.

Week 01 2/2/2026

Getting Started

The first week of work consisted of a few things. First, I needed to do some housekeeping and install software on the MacBook I was given in order to get started on my project. Second, I completed my capstone website so that I can post updates and blog my work. Lastly, I focused on finding documentation and tutorials to learn how to program in Swift for visionOS. This gave me a good idea of what to work on and learn first so that I can get rolling on the main project.

Week 02 2/8/2026

Learning Swift

This week was spent working on small projects and going through documentation to learn Swift and get familiar with the simulator and UI in Xcode. I completed two small applications: a chatbot app that dealt with simple UI and text boxes on the screen, and a small weather-display app; both helped me understand variables, syntax, properties, and more about Swift. Once that was done, I started working through visionOS documentation, following a tutorial on a visionOS app with spatial computing. This helped me get familiar with Swift's syntax, with developing a visionOS application, and with the documentation Apple has available.
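To give a flavor of the kind of thing I was practicing, here is a minimal SwiftUI sketch in the spirit of the weather app. The names and values are all illustrative, not the actual app I wrote; the point is how an @State property drives the UI.

```swift
import SwiftUI

// A minimal sketch in the spirit of the weather app: changing an @State
// property causes SwiftUI to re-render the view automatically.
struct WeatherView: View {
    @State private var temperature = 72
    @State private var condition = "Sunny"

    var body: some View {
        VStack(spacing: 12) {
            Text(condition)
                .font(.title)
            Text("\(temperature)°F")
                .font(.largeTitle.bold())
            Button("Refresh") {
                // Placeholder for a real data fetch; just nudges the value here.
                temperature = Int.random(in: 60...80)
            }
        }
        .padding()
    }
}
```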

Week 03 2/15/2026

Learning Swift Pt. 2

I spent my time this week working through the visionOS sample applications covering ornaments, multiple windows, and 3D models in the reality space. These projects really helped me round out a general idea of developing applications for the Apple Vision Pro. I was then able to actually hook the headset up to my laptop and get a real look at what one of my simple applications looked like on the actual hardware. This gives me a good starting point going into next week, when I begin my project.
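As a quick illustration of what an ornament looks like in code, here is a hedged sketch using SwiftUI's ornament modifier on visionOS. The view names and button labels are made up; the shape of the API is the part that matters.

```swift
import SwiftUI

// visionOS ornament sketch: a small control strip attached to the bottom
// edge of the window, floating just outside its bounds.
struct OrnamentDemo: View {
    var body: some View {
        Text("Main window content")
            .ornament(attachmentAnchor: .scene(.bottom)) {
                HStack {
                    Button("First") { }
                    Button("Second") { }
                }
                .padding()
                .glassBackgroundEffect() // standard visionOS glass look
            }
    }
}
```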

Week 04 2/21/2026

Identifying Walls

Very awesome and cool progress this week. I started by working on Kyle's old capstone project and "successfully" transferred it from an iOS project to a visionOS project. It wasn't an exact port: I was able to transfer over the UI, but the main function of the app still seemed to have some missing pieces. In parallel, I made sure all of the visual permissions were set up and created the ARKit virtual space. Using the debugger, I was able to see the wireframe around everything, including the built-in surface and object detection, which let me see the walls and floors it identified as well as a wireframe mapping out the room. Once I did this, I took the walls it detected and created a red ball that marked the anchor and center point of each wall. I was then able to use some math to paint a transparent plane over the area it detected as the wall. This part was super finicky and did not do a great job of covering the entire wall, so I think a next step will be letting the user manually adjust where the wall is.
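The wall-detection flow above can be sketched with visionOS's ARKit API. This is a simplified reconstruction, not my exact code: it only handles newly added anchors, and it ignores the extent's own offset transform within the anchor for brevity.

```swift
import ARKit
import RealityKit

// Sketch of the wall-detection step, assuming a visionOS immersive space.
// PlaneDetectionProvider streams PlaneAnchor updates for vertical surfaces;
// each wall gets a translucent red plane sized to the detected extent.
@MainActor
final class WallDetector {
    let session = ARKitSession()
    let planeDetection = PlaneDetectionProvider(alignments: [.vertical])
    let root = Entity() // add this to the RealityView content

    func run() async throws {
        try await session.run([planeDetection])
        for await update in planeDetection.anchorUpdates {
            guard update.event == .added,
                  update.anchor.classification == .wall else { continue }
            let anchor = update.anchor

            // Translucent plane roughly covering the detected wall area.
            let mesh = MeshResource.generatePlane(
                width: anchor.geometry.extent.width,
                height: anchor.geometry.extent.height)
            var material = SimpleMaterial(color: .red, isMetallic: false)
            material.color = .init(tint: .red.withAlphaComponent(0.3))

            let wall = ModelEntity(mesh: mesh, materials: [material])
            // Place the plane at the anchor's pose in world space.
            wall.transform = Transform(matrix: anchor.originFromAnchorTransform)
            root.addChild(wall)
        }
    }
}
```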

Week 05 3/1/2026

Selecting Walls and Working on UI

Below is a video of my current progress with this application. I wanted to get started on the UI and on being able to select walls, with some way of differentiating them from one another. To do this, I chose to display the world-space coordinates of the walls when I selected them (this actually ended up being a little lamer than I expected, because the values were pretty similar to each other). Nonetheless, I was able to see the difference between walls and how each of them was created. I used the headset's ability to track where the user is looking as my way of choosing walls: the user just pinches while looking at a wall, which is highlighted as a visual indicator of which wall they are looking at, and that selects it. I have made a little more progress since this video, and I am now polishing up the UI so that I can eventually change the colors of the walls.
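For reference, gaze-plus-pinch selection on visionOS roughly takes this shape. This is a hedged sketch rather than my actual implementation: the entity needs collision and input-target components to receive spatial taps, and HoverEffectComponent provides the system-drawn look-at highlight.

```swift
import SwiftUI
import RealityKit

// Sketch of look-and-pinch wall selection. A tappable entity must carry
// CollisionComponent + InputTargetComponent; HoverEffectComponent makes the
// system highlight it while the user is looking at it.
struct WallSelectionView: View {
    var body: some View {
        RealityView { content in
            let wall = ModelEntity(
                mesh: .generatePlane(width: 1, height: 1),
                materials: [SimpleMaterial(color: .white, isMetallic: false)])
            wall.components.set(InputTargetComponent())
            wall.components.set(CollisionComponent(
                shapes: [.generateBox(width: 1, height: 1, depth: 0.01)]))
            wall.components.set(HoverEffectComponent())
            content.add(wall)
        }
        .gesture(
            SpatialTapGesture()
                .targetedToAnyEntity()
                .onEnded { value in
                    // Show the selected wall's world-space position,
                    // like the coordinate readout in the video.
                    print(value.entity.position(relativeTo: nil))
                }
        )
    }
}
```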

Week 06 3/8/2026

Updated UI and Stuck Fixing the Walls

This week was good progress, but not the progress I was planning for at the beginning of the week. An issue that has come up, and that you can briefly see in the video, is that the painted walls cover up objects in front of them and aren't really "there" up against the real wall: the plane is drawn over the video feed rather than placed as an object in the world. This is a problem, because how would people know how the wall color looks with their furniture? I struggled with this and did not reach a fix that looked good, so I decided to set it aside for now and work on the UI. I was able to recolor walls, hide modifiers, and pause ARKit updates so that the scene doesn't keep resetting. I thought this was good progress, but I was still frustrated with the main issue. I did fix a bug where the immersive space would not open when the app ran standalone on the Vision Pro (it worked while running from the laptop via cable), so now it runs no matter what.
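The recoloring step that did work is simple at its core. Here is a hedged sketch, assuming the wall is the ModelEntity painted over the detected plane (as in earlier weeks); the function name and alpha value are illustrative.

```swift
import RealityKit
import UIKit

// Sketch of the wall-recoloring step: replace the painted plane's material
// with a new translucent tint so the chosen color shows over the real wall.
func recolor(_ wall: ModelEntity, to color: UIColor) {
    var material = SimpleMaterial(color: color, isMetallic: false)
    // Keep some transparency so the real wall's texture reads through.
    material.color = .init(tint: color.withAlphaComponent(0.6))
    wall.model?.materials = [material]
}
```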