We knew that we wanted to work with VR and include physical objects as part of the interactive experience. The physical object ended up being a wooden structure resembling a cannon on a stand. To allow interaction with the cannon while wearing a VR headset, a VIVE Tracker was fitted to it; the tracking worked surprisingly well and was very accurate.
Our overarching goal was to create a graphically appealing interactive experience that embraced full-body movement. We wanted to put effort into the graphics while also making sure that the interaction felt natural and enjoyable. To accomplish this, we chose to finish the interaction part as early as possible, which meant building the cannon and setting up the wireless VIVE Tracker. The rest of the time was spent on the graphics, which included water, particle effects, and modeling, among other things.
The 3D models used in this project are a mixture of our own work and free assets from the Unity Asset Store. Due to the short time frame, we decided to use an existing water shader, which we then adjusted to meet our needs. We experimented with the wave speed and foam to create calm and stormy appearances of the water. The colours were also adjusted, together with a changing skybox, to achieve a realistic and coherent look of time passing. Four different skyboxes are cycled through during the game, and to hide the drastic changes and add ambience, we created a fog effect that faded in and out.
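The timing logic behind the fade can be sketched roughly as follows. This is not the project's Unity code, just a minimal illustration of the idea; the skybox names and the cycle and fade durations are assumptions.

```python
# Sketch of the skybox cycle: fog ramps up just before each skybox
# switch and back down just after it, so the abrupt change happens
# behind full fog. Durations and names below are illustrative only.

CYCLE_SECONDS = 60.0   # how long each skybox is shown (assumed)
FADE_SECONDS = 5.0     # fog fade-in/out duration around a switch (assumed)
SKYBOXES = ["dawn", "noon", "dusk", "night"]

def active_skybox(t: float) -> str:
    """Which of the four skyboxes is visible at time t (seconds)."""
    return SKYBOXES[int(t // CYCLE_SECONDS) % len(SKYBOXES)]

def fog_density(t: float) -> float:
    """Fog density in 0..1: full at the moment of a switch, zero in
    the middle of a cycle, linear ramps in between."""
    into_cycle = t % CYCLE_SECONDS
    if into_cycle < FADE_SECONDS:                  # fading out after a switch
        return 1.0 - into_cycle / FADE_SECONDS
    if into_cycle > CYCLE_SECONDS - FADE_SECONDS:  # fading in before a switch
        return 1.0 - (CYCLE_SECONDS - into_cycle) / FADE_SECONDS
    return 0.0
```

In Unity the returned density would simply be written to the scene's fog setting every frame; the shape of the ramp (linear here) is a design choice, and a smoother curve would work just as well.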
Particle effects were used for fire, smoke, and explosions. They gave the feeling of a realistic torch used to light the cannon, which in turn fired a cannonball that left a smoke trail and triggered an explosion effect upon impact with a target.
We wanted to fully embrace the theme of full-body movement with this project. Because the wooden structure was meant to work together with a VR environment without any visible representation of the user's hands, the virtual environment had to be aligned with the physical structure with utmost care. This way we were able to create an interactive and immersive experience where the user could grab the handle of the physical structure by reaching out and grabbing the handle of the virtual cannon.
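The core of this alignment can be sketched as a simple calibration step: the tracker reports the physical cannon's pose, and a fixed offset, measured once during setup, maps the tracker's position onto the virtual cannon's handle. The sketch below is a simplified illustration (not the project's Unity code): positions are plain 3D vectors and rotation is ignored to keep it short.

```python
# Calibration sketch: record the constant tracker->handle offset once,
# then place the virtual handle at (tracker position + offset) every
# frame so the virtual handle coincides with the physical one.
# All coordinates below are made-up example values.

def calibrate(tracker_pos, handle_pos):
    """Measure the fixed offset from tracker to handle during setup."""
    return tuple(h - t for t, h in zip(tracker_pos, handle_pos))

def virtual_handle(tracker_pos, offset):
    """Per-frame: position the virtual handle relative to the tracker."""
    return tuple(t + o for t, o in zip(tracker_pos, offset))

# Setup: the tracker sits 25 cm below and behind the wooden handle.
offset = calibrate((0.0, 1.0, 0.0), (0.0, 1.25, 0.25))
# At runtime the physical cannon (and thus the tracker) has moved:
print(virtual_handle((0.5, 1.0, 0.5), offset))  # (0.5, 1.25, 0.75)
```

In the real project the tracker's rotation would have to be applied to the offset as well (in Unity, parenting the virtual cannon to the tracked object handles this automatically), but the principle of a one-time measured offset is the same.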
When integrating the water shader into the project containing the cannon, ships, and targets, we realised that they had been made with different rendering pipelines. Since the ocean shader was only compatible with Unity's built-in render pipeline, we had to import everything else into that project.
The lack of debugging tools for shader code was a big challenge during development. Finding information on geometry shaders (the type of shader used for the ocean) was especially hard: although Unity supports them, there is little official documentation on what they are or how they work.
There was even less information on how to make them work in VR. We first tried adding macros from forum posts to define inputs and outputs for multiple displays, without success. After some time we looked at the stereo rendering modes and found that the shader rendered correctly to both eyes once we switched from single-pass instanced rendering to multi-pass.
We found that PhyShare: Sharing Physical Interaction in Virtual Reality by Zhenyi He, Fengyuan Zhu, and Ken Perlin (2017), and Substitutional Reality: Using the Physical Environment to Design Virtual Reality Experiences by L. Beever and N. W. John (2022), were relevant to this project in how they combine interaction between physical and virtual space.