

summer '23

Proximity Oscillation is the development of two VR environments in Unity that explore the same underlying technical ideas, contextualized in two different experiences. The exercise was prompted by my desire to learn new scripting concepts around dynamic mesh construction in Unity. The script I developed constructs meshes whose vertices react to the proximity of the nearest target, designated in a list of Transforms. From this small technical exercise, I asked how the script could be extended and applied to VR experiences, which opened up a far greater design exploration.

About a year ago, my first exposure to VR development explored the potential of experiencing animated sculptures in VR. Those animations were generated procedurally outside Unity and captured as static meshes for playback, meaning that within an actual runtime environment they had no potential to react dynamically. The Proximity Oscillation script begins to solve this problem, and as a more experienced developer I now wonder how I can realize those earlier design concepts with greater sophistication.

Initial tests for developing mesh oscillation given the proximity of designated target objects

The Proximity Oscillation script is attached to any GameObject with a predetermined MeshFilter, allowing any externally modeled mesh to be manipulated. The goal of the script is to determine whether a given vertex on the mesh is within a radial proximity of a target point and, if so, to oscillate it along a given vector direction as a sine function of time. The script initializes by building three Vector3 arrays:

  • Original Vertices: An array tracking the original position of each vertex, used for calculating distances to target Transforms and potential displacements based on the oscillation parameters.

  • Displaced Vertices: The updated positions of the vertices after any oscillation, used for rendering the visible mesh at runtime.

  • Vertex Normals: An array storing the normal of each vertex, calculated at runtime as the normalized sum of the normals of the vertex's adjacent faces. This vertex normal is used as the direction for oscillating the vertex.
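The actual project is a Unity C# script, but the vertex-normal calculation described above can be sketched in plain Python (function names and data layout here are illustrative, not from the original script):

```python
import math

def vertex_normals(vertices, triangles):
    """Per-vertex normals: the normalized sum of each vertex's adjacent face normals.

    vertices: list of (x, y, z) tuples; triangles: list of (i0, i1, i2) index triples.
    """
    normals = [[0.0, 0.0, 0.0] for _ in vertices]
    for i0, i1, i2 in triangles:
        ax, ay, az = (vertices[i1][k] - vertices[i0][k] for k in range(3))
        bx, by, bz = (vertices[i2][k] - vertices[i0][k] for k in range(3))
        # Face normal via the cross product of two edge vectors.
        face = (ay * bz - az * by, az * bx - ax * bz, ax * by - ay * bx)
        # Accumulate this face's normal onto each of its three vertices.
        for i in (i0, i1, i2):
            for k in range(3):
                normals[i][k] += face[k]
    # Normalize each accumulated sum to unit length.
    result = []
    for n in normals:
        length = math.sqrt(sum(c * c for c in n)) or 1.0
        result.append(tuple(c / length for c in n))
    return result
```

In Unity, `Mesh.RecalculateNormals()` performs an equivalent computation, but storing the result in a separate array keeps the oscillation direction stable while the displaced vertices move.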

With this setup, each frame loops over the vertices, checking whether a vertex is in proximity to a target and, if so, computing its displacement from its distance to the target and the current sinusoidal phase. Glossing over the nitty gritty, this is the general gist of the script, which allows different characterizable mesh-oscillation effects to be parameterized.
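The per-frame loop can be sketched as follows. This is a hedged Python approximation of the idea, not the Unity C# source; the falloff curve and parameter names are assumptions:

```python
import math

def displace_vertices(original, normals, targets, radius, amplitude, frequency, t):
    """Return displaced vertex positions for one frame.

    A vertex within `radius` of its nearest target oscillates along its normal
    as a sine of time, scaled by `amplitude` and attenuated with distance.
    """
    displaced = []
    for v, n in zip(original, normals):
        # Distance from this vertex to the nearest target point.
        d = min(math.dist(v, target) for target in targets)
        if d < radius:
            falloff = 1.0 - d / radius          # 1 at the target, 0 at the radius edge
            offset = amplitude * falloff * math.sin(frequency * t)
            displaced.append(tuple(vi + offset * ni for vi, ni in zip(v, n)))
        else:
            # Out of range: the vertex rests at its original position.
            displaced.append(v)
    return displaced
```

In Unity, the resulting array would be written back to `mesh.vertices` each frame to render the oscillating surface.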

Shader Graph used to help distinguish the vertex displacement via color gradients given vertex position and relative UV

The key to visually understanding the mesh manipulation is a set of custom Shader Graphs developed for the corresponding geometries. The shaders work dynamically per vertex, determining each vertex's color from its displacement along a planar direction for a flat surface, or from the center of the sphere. A secondary coloring condition is driven by the vertex's UV, so that each vertex has a unique color describing both its own displacement and its location on the mesh.
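The coloring logic lives in node-based Shader Graphs rather than code, but the idea can be sketched in Python. The gradient endpoint colors and blending choices below are illustrative assumptions, not values from the actual shaders:

```python
def lerp(a, b, t):
    """Linear interpolation between two same-length tuples."""
    return tuple(ai + (bi - ai) * t for ai, bi in zip(a, b))

def vertex_color(displacement, max_disp, uv):
    """Displacement picks a point on a two-color gradient; UV tints the result.

    Mirrors the Shader Graph concept: one condition from normalized displacement,
    a secondary condition from the vertex's UV, so each vertex reads as unique.
    """
    base_color = (0.1, 0.2, 0.8)   # resting color (assumed)
    peak_color = (1.0, 0.4, 0.1)   # fully displaced color (assumed)
    t = min(abs(displacement) / max_disp, 1.0)
    r, g, b = lerp(base_color, peak_color, t)
    u, v = uv
    # Secondary condition: modulate two channels by the UV coordinates.
    return (r, g * (0.5 + 0.5 * u), b * (0.5 + 0.5 * v))
```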

Initial tech demo that starts to explore interactive and visual concepts of the mesh oscillation script

The lack of perceivable real-world constraints in VR is what makes it fun to develop for, and with this script I found myself well set up to explore that. To ground the experience in VR immersion, my criteria were, first, that the user controls the mesh manipulation with their hands, and second, that the effect of the mesh oscillation is largely spatial. I was inspired by work like Wooorld, which plays with a manipulable tabletop scale that also translates to 1:1 environmental scale. I became fascinated with the idea that the user could manipulate a miniature tabletop version of the oscillating mesh and thereby affect the true-to-scale version. This formed the ideation of two environment design concepts: the first a spherical object, the second a topographic plane.
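The tabletop-to-environment coupling above amounts to a simple scalar mapping: a point manipulated on the miniature maps to the same local offset, multiplied by a scale factor, on the full-size copy. A minimal sketch, with hypothetical names (the Unity version would instead work with Transform hierarchies):

```python
def to_full_scale(point_on_miniature, miniature_origin, full_origin, scale):
    """Map a point touched on the tabletop miniature to the full-scale copy.

    The offset from the miniature's origin is scaled up and reapplied
    from the full-scale environment's origin.
    """
    return tuple(fo + (p - mo) * scale
                 for p, mo, fo in zip(point_on_miniature, miniature_origin, full_origin))
```

Driving the oscillation targets through a mapping like this means a hand hovering over the miniature acts as a proximity target for the true-to-scale mesh as well.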

Scalar relationship between interfaceable mesh and its larger scale copy

In designing the aesthetics and spatial quality of the environments, I wanted to continue the theme of 'anomaly containment' drawn from my earlier work, but with heightened thematic detail. For Environment 1, my concept places the user as a '3D virtual DJ': using their futuristic deck, they manipulate and control the parameters of the sculptural sphere, which could serve as a centerpiece for larger virtual gatherings. For Environment 2, I spent more time 3D modeling and designing the aesthetic experience, playing more with sound design, post-processing, and particle effects; the style I aimed to capture was a cross between minimal futuristic design and industrial architecture. Overall, this exercise helped me realize how technical explorations can be an incredibly inspiring seed for designing new environments and experiences.

Environment 1

Environment 2

When it comes to developing for VR, I think the best approaches come from considering how embodying an avatar in virtual space lets us extend our interactions with the perceivable world in ways we normally cannot. It is an expansive field ripe for creative exploration, one that can welcome any approach or idea that helps extend our imagination to new places. VR is a great medium for exploring the infinite 'what ifs' we have as engineers, designers, and simply as people.

Environment 2 orthographic section

Environment 2 orthographic worm's eye
