Chakra Garden VR

Sacral Chakra landscape

The second chakra is known as the Sacral, or Svadhishthana. In the body it is located in the pelvic area. Its associated color is orange, and it is ruled by the element of water. Energies associated with this chakra are flow, flexibility, creativity, and pleasure.

The player begins this level in a watery cave. The VR Pawn player camera is placed inside a post-processing volume that makes it seem as if the player is underwater. Soft music and burbling water sound effects provide audio cues.

volumetric post-processing box and VR Pawn “cursor” with a gaze teleportation trigger capsule surrounding the sphere near the entrance.
Post Processing Settings

A volumetric post-processing shape was placed inside the cave, and its settings were tweaked to give the blurriness and green-blue lighting associated with being underwater. Bloom intensity was set to 2.68 and threshold to 1.57, a blue-tinted lens flare was added, and the Gaussian Depth of Field settings were tweaked. A small pulsating spotlight adds to the twinkling water-reflection effect.
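For reference, here is a rough C++ sketch of the same idea, assuming the values are applied to a PostProcessVolume from code. In the project they were simply set in the editor's Details panel, and SceneColorTint is my guess at how the green-blue tint could be expressed; the lens flare and Gaussian Depth of Field fields vary by engine version, so I have left them out.

// Illustrative only: applies the bloom values mentioned above to a
// PostProcessVolume. Field names are from UE4's FPostProcessSettings.
#include "Engine/PostProcessVolume.h"

void ConfigureUnderwaterVolume(APostProcessVolume* Volume)
{
    FPostProcessSettings& Settings = Volume->Settings;

    // Bloom gives the soft, glowing highlights on the water.
    Settings.bOverride_BloomIntensity = true;
    Settings.BloomIntensity = 2.68f;
    Settings.bOverride_BloomThreshold = true;
    Settings.BloomThreshold = 1.57f;

    // A green-blue tint helps sell the underwater look (assumed values).
    Settings.bOverride_SceneColorTint = true;
    Settings.SceneColorTint = FLinearColor(0.6f, 0.85f, 0.9f);

    // The effect should only apply while the player is inside the volume.
    Volume->bUnbound = false;
}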

The player moves around the level using the Gaze Teleportation system (gazing for 2 seconds at one of the lighted spheres). In this way the player travels around the environment in an upward spiral path. In the center is a column of water falling into a pond. Other watery symbols, such as a seashell, remind the player of the chakra's associations.

Gaze Navigation Complete

First, I’d like to send a huge shout out to Brantly McCord, Purdue grad student and fantastic game dev instructor, for showing me how to create this gaze navigation Blueprint. I couldn’t have done it without him!

VR Pawn in “Muladhara, the Root Chakra”

In this game, there is a VR Pawn that is basically a camera with a very long cube sticking out of the front of it. When the player gazes around the scene while wearing the VR headset, their gaze is mirrored by the camera, which lets the cube sweep around the scene as a sort of pointer. (This happens invisibly in the game, as you can see in the inset VR_Pawn picture above.) When the player's gaze sweeps over a trigger capsule, the pointer overlaps it and certain events fire.

One of those events is the gaze navigation system, which allows the player to "teleport" to the different spheres in the game simply by gazing at them for 2 seconds. This eliminates the need for controllers – the player simply looks where they want to go next. A particle system that looks like a glowing Spirograph lights up around that sphere, indicating that the player's gaze has landed on it. The player must keep their gaze steady for two seconds, and then they warp to that spot.
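For readers who prefer C++, here is a minimal sketch of how a pawn like this might be assembled: a camera that follows the HMD, with a long, thin, invisible box parented to it so the box sweeps wherever the player looks. The actual project builds this as a Blueprint, so the class and member names below (AGazePawn, GazeCursor, and so on) are only illustrative.

// Sketch of a gaze-pointer pawn. Member declarations for Root, Camera and
// GazeCursor are assumed to live in the header.
#include "GameFramework/Pawn.h"
#include "Camera/CameraComponent.h"
#include "Components/BoxComponent.h"

AGazePawn::AGazePawn()
{
    Root = CreateDefaultSubobject<USceneComponent>(TEXT("Root"));
    RootComponent = Root;

    // The camera follows the HMD, so it rotates with the player's head.
    Camera = CreateDefaultSubobject<UCameraComponent>(TEXT("Camera"));
    Camera->SetupAttachment(Root);
    Camera->bLockToHmd = true;

    // The long, invisible "cursor": stretched forward so it reaches out in
    // front of the camera and can overlap trigger capsules in the scene.
    GazeCursor = CreateDefaultSubobject<UBoxComponent>(TEXT("GazeCursor"));
    GazeCursor->SetupAttachment(Camera);
    GazeCursor->SetBoxExtent(FVector(2000.f, 5.f, 5.f));
    GazeCursor->SetRelativeLocation(FVector(2000.f, 0.f, 0.f));
    GazeCursor->SetHiddenInGame(true);
    GazeCursor->SetCollisionEnabled(ECollisionEnabled::QueryOnly);
    GazeCursor->SetGenerateOverlapEvents(true);
}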

Gaze Navigation Level Blueprint

Above is the Level Blueprint for the entire gaze navigation mechanism, along with event triggers for particles and sound. What we have is a basic flow of logic that is copy/pasted for each of the 7 sphere triggers. When the user’s gaze overlaps the trigger surrounding that sphere, the logic checks to see if the gaze remained steady on the trigger for 2 seconds, and if so, teleports or “warps” the user’s position (via the VR Pawn) to that sphere.

Teleport logic
Check Position

So, for instance, the game checks to see if the VR Pawn is looking at the trigger. If true, a timer is set for 2 seconds. When that time elapses (the time remaining is less than or equal to zero), SetActorLocation moves the VR Pawn to the trigger the player was gazing at.

Time to warp

This logic is repeated for each of the 7 sphere triggers.
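For anyone curious how that per-trigger flow might look outside of Blueprints, here is a rough C++ translation, assuming each trigger capsule is its own actor that warps the gazing pawn to itself; the names AGazeTeleportTrigger, GazingPawn, and GazeTimerHandle are hypothetical.

// Start a 2-second timer when the gaze cursor enters, cancel it if the gaze
// drifts away, and warp the pawn when the timer fires.
void AGazeTeleportTrigger::NotifyActorBeginOverlap(AActor* OtherActor)
{
    Super::NotifyActorBeginOverlap(OtherActor);

    if (OtherActor && OtherActor->IsA<APawn>())
    {
        GazingPawn = OtherActor;
        GetWorldTimerManager().SetTimer(
            GazeTimerHandle, this, &AGazeTeleportTrigger::Warp,
            2.0f, /*bLoop=*/false);
    }
}

void AGazeTeleportTrigger::NotifyActorEndOverlap(AActor* OtherActor)
{
    Super::NotifyActorEndOverlap(OtherActor);

    // Gaze left the trigger before two seconds elapsed: cancel the warp.
    if (OtherActor == GazingPawn)
    {
        GetWorldTimerManager().ClearTimer(GazeTimerHandle);
        GazingPawn = nullptr;
    }
}

void AGazeTeleportTrigger::Warp()
{
    // Equivalent of the SetActorLocation node: move the pawn to this trigger.
    if (GazingPawn)
    {
        GazingPawn->SetActorLocation(GetActorLocation());
    }
}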

Spawn Particle and sound

There is also a system for spawning the Spirograph Particle and sound cues when the VR Pawn gazes at the sphere trigger. Sound cues include a brief verbal description of the current chakra’s name, a positive affirmation to assist in meditation, and a subtle “gong” sound effect when the player teleports.
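A hedged C++ sketch of that feedback step might look like the following, where SpirographFX and ChakraNarrationCue stand in for whatever particle system and Sound Cue assets are actually assigned in the project.

// Spawn the glowing ring and play the narration at the trigger's location.
#include "Kismet/GameplayStatics.h"
#include "Particles/ParticleSystem.h"
#include "Sound/SoundCue.h"

void AGazeTeleportTrigger::PlayGazeFeedback()
{
    const FVector Here = GetActorLocation();

    // The "Spirograph" particle system lights up around the sphere.
    UGameplayStatics::SpawnEmitterAtLocation(this, SpirographFX, Here);

    // Chakra name and affirmation; the gong plays separately on teleport.
    UGameplayStatics::PlaySoundAtLocation(this, ChakraNarrationCue, Here);
}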

Once the player has ascended through all seven spheres in a chakra level, they move on to the next chakra level. There are seven chakras in all.

Adding sound cues to gaze navigation

I want sound to play a big role in my environment, so I set out to make it so that when you gaze at a sphere to teleport to that location, it also plays a sound cue. I downloaded several royalty-free sound effects and music tracks, including some deep bell and gong sounds.

I then imported them into Unreal as .wav files. You have to convert them to a Sound Cue by simply right-clicking and choosing "Create Cue."

Then you select the trigger capsule that should play the sound, right-click, and choose "Add Event / OnActorBeginOverlap."

Sound cues

At this point, Unreal switches you to the Level Blueprint, where you can begin to work with the Event. Pull out a wire to Play Sound at Location, then near the Sound input, select the sound cue you want to play from the pull-down menu. It’s that simple!
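In C++ terms, the same hookup would look roughly like this, with ASoundTrigger and SphereCue as placeholder names for the trigger actor and whichever Sound Cue you pick from the pull-down.

// Bind the overlap event (the "Add Event / OnActorBeginOverlap" step) and
// play the cue (the "Play Sound at Location" node). HandleOverlap must be
// declared as a UFUNCTION in the header for AddDynamic to work.
#include "Kismet/GameplayStatics.h"
#include "Sound/SoundCue.h"

void ASoundTrigger::BeginPlay()
{
    Super::BeginPlay();
    OnActorBeginOverlap.AddDynamic(this, &ASoundTrigger::HandleOverlap);
}

void ASoundTrigger::HandleOverlap(AActor* OverlappedActor, AActor* OtherActor)
{
    UGameplayStatics::PlaySoundAtLocation(this, SphereCue, GetActorLocation());
}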


Animating the salamander – issues with spline

I found a great tutorial from Hangry Bunnies From Mars on how to get an animated mesh to move along a spline. I created the Blueprints and everything worked great, except for one problem – the salamander moves sideways along the spline. As you can see in the image below, the salamander is rotated 90 degrees from the spline's direction vector, which causes him to walk sideways and look very silly.

I tried rotating the mesh in the Blueprint viewport, but no luck. The rotation may look fine in the viewport, but when you play the animation, it reverts to its original orientation.

This looks correct, but orientation doesn’t stay

One very handy thing about Unreal Blueprints is that in the Details panel, the slots where you plug in the Animation and Mesh are color-coded green and magenta, so you know which part of the mesh goes where. Pretty handy once I noticed it!

Below is the Event Graph Blueprint. Everything works splendidly except the orientation.

After spending quite a bit of time on this without an answer, I went to Plan B, which was to animate it using Cinematics.

Salamander Cinematic

This method worked beautifully, and kept the proper scaling and rotation.

There are a lot of tutorials out there on how to attach an animated mesh to a spline using Blueprints, which makes me wonder why that method might be preferred over using a Cinematic.


Introduction Tutorial Level

I decided it would be helpful to have a “pre-level” in my game that would introduce new players to the concept of Gaze Teleportation, so they know how to move around the scene.

I imported some assets I had created last semester, and set up a stage of sorts.

I also learned how to create Billboards, which are incredibly useful floating text boxes that always face the player.
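I won't reproduce the exact Billboard setup here, but the "always face the player" behavior can be sketched in C++ by re-aiming a text component at the camera every tick; AFloatingLabel and Label are hypothetical names, not assets from the project.

// Rotate the floating label toward the player's camera each frame so it is
// always readable. Requires PrimaryActorTick.bCanEverTick = true.
#include "Components/TextRenderComponent.h"
#include "Kismet/GameplayStatics.h"
#include "Kismet/KismetMathLibrary.h"

void AFloatingLabel::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    if (APlayerCameraManager* Cam = UGameplayStatics::GetPlayerCameraManager(this, 0))
    {
        const FRotator LookAt = UKismetMathLibrary::FindLookAtRotation(
            GetActorLocation(), Cam->GetCameraLocation());
        Label->SetWorldRotation(LookAt);
    }
}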

Intro Level

I also wanted to make the spheres representing the chakras light up one by one, to simulate what will happen metaphorically in the game. I found a great, easy-to-follow tutorial by Brigid Costello that was invaluable for learning this quickly.

Chakra lights

To do this, I created a Cinematic. I assigned each sphere to a new “Empty Actor Group” then gave each group a Visibility track. You can keyframe when things are Visible (Show) or Invisible (Hide). I timed it so they each become visible a second after the previous one, so they light up in sequence.

I then created a small Cone shape in the scene and assigned a Movement track to it. I keyframed it so it flies backwards away from the figure towards the teleportation sphere, thus drawing the viewer's attention. Once the viewer gazes at that sphere, they are instantly teleported to that spot.

Directional cone

Overall this was a good lesson for me in Cinematics and Billboards.


User testing

Recently I began user testing on my prototype. I have been able to sample a variety of ages and levels of VR experience. A few things have surfaced repeatedly:

  • “Climbing the tree” (moving upwards via gaze teleportation to each sphere up and around the tree root) causes more than half the people to experience a feeling of vertigo and anxiety about heights. Users literally feel like they could fall.
    • To address this, I plan to insert visual “platforms” in the shape of tree fungus that allow users to feel they are standing on something as they teleport.
  • Some people prefer to stand, while three people aged 60+ who also experience balance problems in real life prefer to sit.
    • I will revise the Protocol research document to indicate that people will be asked whether they prefer to sit or stand.
  • Switching to the Oculus platform means that during development I need to be mindful that there is now a cord coming from the headset, which could cause issues if users are expected to continuously turn around (the spiral pattern moving upwards).
  • Almost everyone responded positively to the music used in the scene.
    • I plan to add additional musical elements, like bells or chimes, as users interact with the scene.
  • Most people seemed confused by the method of gaze navigation; it had to be verbally explained to them.
    • I plan to add an introductory level that demonstrates gaze navigation, and also add animated assets that guide the user by drawing their gaze to the appropriate area.
  • Not everyone is comfortable enough with VR to even want to try it. People who are uncomfortable with “video games” or those who have motion sickness concerns will most likely choose not to participate.

Teleportation navigation

This week I continued to develop the gaze navigation system. I created an Emissive (glowing) material for each of the seven spheres and placed them in an upward spiral pattern around the tree root. Near the base of the roots you can see the VR Pawn Actor, which has a long “cursor” attached. When the player moves their head and the cursor encounters one of the spheres, the player teleports to that location.
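The emissive material itself was authored in the Material Editor, but as an aside, if you wanted to light a sphere up from code, a dynamic material instance sketch might look like this; AChakraSphere, SphereMesh, and the EmissiveColor parameter name are assumptions, not the project's actual assets.

// Swap the sphere's material for a dynamic instance and push a glow color
// into an (assumed) EmissiveColor parameter.
#include "Materials/MaterialInstanceDynamic.h"
#include "Components/StaticMeshComponent.h"

void AChakraSphere::LightUp()
{
    UMaterialInstanceDynamic* Glow =
        SphereMesh->CreateAndSetMaterialInstanceDynamic(0);
    if (Glow)
    {
        // Example glow color; each of the seven spheres could get its own.
        Glow->SetVectorParameterValue(TEXT("EmissiveColor"),
                                      FLinearColor(1.0f, 0.5f, 0.1f));
    }
}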

I updated the tree to a more realistic looking bark texture, and tweaked some of the lighting settings from last time. I also added the mushrooms.

After tweaking the lighting, I used the Build Lighting feature, and received the error that I needed to set a “Lightmass Importance Volume,” which I did after looking up a brief tutorial. As you can see below, there are thousands of light calculations going on, which tends to slow down your game frame rate.

By placing a Lightmass Importance Volume, you can control where Unreal Engine focuses most of its effort when calculating static lighting, which improves performance. Below you can see far fewer calculation instances.

Documentation for developing for Oculus, Gear VR and plugins for Unreal Engine

Unreal Engine has a section in its Gear VR documentation on how to set up cameras, Gear VR touchpad functions, and the motion controller. While these are helpful, it would be great if there were more support for Gear VR built into Unreal instead of developers having to custom-build things. Also, be aware that Unreal's documentation is sometimes not up to date, and you may need to cross-reference it against Oculus's developer documentation as well.

Note: if you are developing for Oculus, especially for Gear VR, there are certain plugins that need to be turned on in order to get the Blueprints you need.

Oculus has a section in their documentation on Blueprints, and how to turn on the plugin. So far the documentation is a bit vague and sketchy, but I hope to continue deciphering it.

“The Input Blueprint provides a control interface for Touch, Oculus remote, and the Gear VR Controller…. Gear VR Controller clickpad events are reported in the Input Blueprint as thumbstick events.”

What would be really helpful would be an option for the Gear VR touchpad or motion controller built into the Input/action mappings. So far that does not seem to be an option. When you can use the Action and Axis bindings in the Input section, they work across the entire game and all levels, instead of having to custom-build a Blueprint each time.
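For comparison, this is roughly what a project-wide binding looks like in C++ once an Action Mapping exists under Project Settings / Input; the "GazeConfirm" action name and AGazePawn are hypothetical, and the point above is that Gear VR touchpad input does not currently surface through this system.

// Bind a project-wide Action Mapping. Because the mapping lives in project
// settings, it works in every level without a per-level Blueprint.
#include "Components/InputComponent.h"

void AGazePawn::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
{
    Super::SetupPlayerInputComponent(PlayerInputComponent);

    PlayerInputComponent->BindAction(
        "GazeConfirm", IE_Pressed, this, &AGazePawn::OnGazeConfirm);
}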


With over 5 million Gear VR headsets sold worldwide, further support within Unreal Engine is a real need for the developer community.

Blueprints for gaze teleportation

In this section, we added to our Blueprint, replacing the PrintScreen placeholder with a function that moves the VR_Pawn (the player's point of view) to the trigger capsule's location when the VR_Pawn pointer overlaps the trigger capsule. In other words, when you look at something, you instantly teleport to it. Below is a pic of the player standing on top of the cauldron after gazing at it.

Teleport to cauldron

Below is an extreme close-up of the salamander – once the player gazed at it, the player was teleported directly to the trigger capsule surrounding the salamander.

Obviously the placement of the capsules needs to be fine-tuned so the player doesn't land so close to the object, but the basic interactions are now functional.

Here is the Blueprint that creates this action. Notice the Get Actor Location and Set Actor Location functions.