Reflecting on the VRTK project, I accomplished two of my three goals (teleportation and object interaction) within the timeframe I had set. The third goal, triggering sound effects, remains unfinished. I wired sound to interaction actions but ran into trouble syncing the audio with the event that triggered it. Audio effects could be the focus of a future quick sprint dedicated to this one element of interaction. Pairing visual selection feedback (the object's effect changing when it is selected) with a matching audio response helps reinforce immersion in the environment. The obstacle was ultimately time rather than implementation: audio work in Unity becomes more involved when sounds and animations need to line up with event triggers.
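As a rough sketch of the direction I had in mind, the snippet below (assuming VRTK 3.x, where `VRTK_InteractableObject` exposes interaction events such as `InteractableObjectGrabbed`) plays a clip in the same frame the interaction event fires, so the audio lands together with the visual selection effect. The class name, the `grabClip` field, and the component setup are my own illustrative choices, not code from the project:

```csharp
using UnityEngine;
using VRTK;

// Hypothetical example: trigger audio from a VRTK interaction event
// so the sound stays in step with the visual selection feedback.
[RequireComponent(typeof(VRTK_InteractableObject))]
[RequireComponent(typeof(AudioSource))]
public class InteractionSound : MonoBehaviour
{
    public AudioClip grabClip;               // assigned in the Inspector
    private AudioSource source;
    private VRTK_InteractableObject interactable;

    private void OnEnable()
    {
        source = GetComponent<AudioSource>();
        interactable = GetComponent<VRTK_InteractableObject>();
        // Subscribing to the event means the sound plays when the
        // interaction is registered, rather than polling object state.
        interactable.InteractableObjectGrabbed += OnGrabbed;
    }

    private void OnDisable()
    {
        // Unsubscribe to avoid dangling handlers when the object is disabled.
        interactable.InteractableObjectGrabbed -= OnGrabbed;
    }

    private void OnGrabbed(object sender, InteractableObjectEventArgs e)
    {
        // PlayOneShot lets overlapping interactions layer their sounds
        // instead of cutting off a clip that is still playing.
        source.PlayOneShot(grabClip);
    }
}
```

Driving the audio from the event callback, instead of checking the object's state in `Update()`, is what keeps the sound and the selection effect synchronized.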
Object effects and audio in Unity are two areas I would like to explore in a future sprint.