Chakra Garden VR
Findings from Guided Meditation Research Group
As part of my study on whether VR combined with meditation can alleviate symptoms of anxiety, stress and discomfort, it was suggested that I also study a control group that used a similar meditation without VR. I created a guided meditation using imagery similar to the environments being created in VR.
For instance, the root chakra begins:
We begin with the Root Chakra, Muladhara. Imagine a red ball of energy situated at the base of your spine near your tailbone. Inside this glowing red ball of light are things that remind you of the earth. Think of all the things rooted in the earth – crystals underground, roots, creatures that live underground and help plants thrive, minerals and soil… Allow yourself to feel secure, grounded and connected to the earth.
For each of the seven levels of the chakras there is an appropriate description.
I invited people via social media (acquaintances and members of a local yoga studio) to volunteer for the study. There were two sessions – one took place in a yoga studio, the other in a home. The study was briefly explained, along with all necessary information about voluntary consent to participate. Participants were given a preliminary questionnaire that asked basic questions about how they were feeling and whether they had ever done guided meditation before. Their heart rates were measured using cell phone sensors and apps or wrist-mounted fitness trackers.
After the meditation, participants were asked a nearly identical set of questions about how they were feeling, to gauge whether their feelings had changed. Their heart rates were also measured again post-meditation. The findings are below.
Pre-test

| Participant | A. Current anxiousness | B. Overall health & wellbeing | C. Hopefulness (if worried) | D. Current mood | Heart rate |
|---|---|---|---|---|---|
| 1 | 1 | 3 | 3 | – | 84 |
| 2 | 2 | 3 | 3 | 1 | 91 |
| 3 | 1 | 3 | 3 | – | – |
| 4 | 2 | 3 | 3 | 3 | 71 |
| 5 | 3 | 1 | 2 | 1 | – |
| 6 | 3 | 1 | 2 | 2 | 65 |
| 7 | 2 | 1 | 3 | 2 | 85 |
| 8 | 1 | 1 | 2 | 3 | 72 |
| 9 | 2 | 3 | 3 | 4 | 89 |
| 10 | 2 | 1 | 4 | 1 | 59 |
| 11 | 2 | 2 | 2 | 2 | 86 |
| 12 | 3 | 0 | 1 | 4 | 84 |
| Mean | 2.00 | 1.83 | 2.50 | 2.42 | 78.6 |

(– = not recorded)
Post-test

| Participant | A. Current anxiousness | B. Overall health & wellbeing | C. Hopefulness (if worried) | D. Current mood | Heart rate |
|---|---|---|---|---|---|
| 1 | 0 | 3 | 4 | – | 80 |
| 2 | 1 | 4 | 3 | 4 | 86 |
| 3 | 0 | 3 | 4 | – | – |
| 4 | 2 | 3 | 3 | 4 | 65 |
| 5 | 2 | 2 | 2 | 1 | – |
| 6 | 2 | 2 | 2 | 3 | 75 |
| 7 | 1 | 2 | 3 | 3 | 84 |
| 8 | 1 | 2 | 2 | 3 | 70 |
| 9 | 2 | 4 | 3 | 3 | 89 |
| 10 | 1 | 3 | 4 | 3 | 54 |
| 11 | 1 | 3 | 3 | 3 | 76 |
| 12 | 1 | 1 | 1 | 4 | 70 |
| Mean | 1.17 | 2.67 | 2.60 | 3.25 | 74.9 |
| Change vs. pre-test | −0.83 | +0.83 | +0.10 | +0.83 | −3.7 |
Feelings of anxiousness decreased by 0.83 points, overall sense of health and well-being increased by 0.83, hopefulness improved by 0.1, current mood increased by 0.83, and average heart rate decreased by 3.7 bpm.
Although this was a small sample size of only 12 people, it did seem to show that anxiousness and mood improved, while heart rates lowered, indicating that meditation had a beneficial, calming effect.
The next step will be to finish the VR simulation and run a similar study to compare how participants respond to the VR simulation versus the audio-only guided meditation.
Gaze navigation – destroying emitters after they have been triggered
One of the issues in the game was that if a player’s gaze briefly crossed a glowing sphere (“OnActorBeginOverlap”), it would trigger the “light up” emitter called the spirograph, but if the player did not hold their gaze on the trigger for the full two seconds, the spirograph remained visible instead of disappearing. This caused confusion, especially when multiple spirograph emitters began overlapping.
In the picture below, you can see the gaze cursor (rectangular bar) intersecting with the trigger capsule and the spirograph lights up.

To solve this problem, the Level Blueprint was modified to add an “OnActorEndOverlap” event wired to “DestroyComponent.” This destroys the emitter (it disappears) whenever the gaze cursor stops overlapping the trigger.

To see the Level Blueprint in its entirety, watch the video below.
Now all the spirograph “light up” emitters disappear if the player is not looking at them.
Particle effect – sacral symbol
This particle system was made up of three different emitters. One emitter controls the sacral symbol shape, another controls the dotted ring that shrinks in towards the center, and the third controls the “bubbles” that float upward from the center.
Each emitter uses a custom material to give it a unique appearance. This material is selected in the “Required” node for each emitter. (see video).

Sacral Chakra landscape
The second chakra is known as the Sacral, or Svadhishthana. In the body it is located in the pelvic area. Its color association is orange, and it is ruled by the element of water. Energies associated with this chakra are flow, flexibility, creativity and pleasure.
The player begins this level in a watery cave. The VR Pawn player camera is placed inside a post-processing volume that makes it seem as if the player is underwater. Soft music and burbling water sound effects are audio cues.


A volumetric post-processing shape was placed inside the cave and its settings were tweaked to give the blurry, green-blue look of being underwater. Bloom intensity was set to 2.68 with a threshold of 1.57, a blue-tinted lens flare was added, and the Gaussian Depth of Field settings were adjusted. A small pulsating spotlight adds to the twinkling water-reflection effect.
The player moves around the level by using the Gaze Teleportation system (gazing for 2 seconds at one of the lighted spheres). In this way the player travels around the environment in an upward spiral path. In the center is a column of water falling into a pond. Other watery symbols such as the sea shell remind the player of the chakra’s associations.
Gaze Navigation Complete
First, I’d like to send a huge shout out to Brantly McCord, Purdue grad student and fantastic game dev instructor, for showing me how to create this gaze navigation Blueprint. I couldn’t have done it without him!

In this game, there is a VR Pawn that is basically a camera with a very long cube stuck out the front of it. When the player gazes around the scene while wearing the VR headset, their gaze is mirrored by the camera, which allows the cube to sweep around the scene as a sort of pointer. (This happens invisibly in the game, as you can see in the inset VR_Pawn picture above.) This pointer may overlap a trigger capsule when the player’s gaze sweeps over it. If the gaze pointer does overlap the trigger capsule, certain events are triggered. One of those events that can be triggered is the gaze navigation system, which allows the player to “teleport” to the different spheres in the game simply by gazing at them for 2 seconds. This eliminates the need for controllers – the player simply looks where they want to go next. A particle system that looks like a glowing Spirograph lights up around that sphere, indicating that the player’s gaze has landed on it. The player must keep their gaze steady for two seconds, and then they warp to that spot.

Above is the Level Blueprint for the entire gaze navigation mechanism, along with event triggers for particles and sound. What we have is a basic flow of logic that is copy/pasted for each of the 7 sphere triggers. When the user’s gaze overlaps the trigger surrounding that sphere, the logic checks to see if the gaze remained steady on the trigger for 2 seconds, and if so, teleports or “warps” the user’s position (via the VR Pawn) to that sphere.


So, for instance, the game checks whether the VR Pawn is looking at the trigger. If true, a two-second timer starts. When that time elapses (time remaining is less than or equal to zero), SetActorLocation moves the VR Pawn to the trigger the player was gazing upon.

This logic is repeated for each of the 7 sphere triggers.

There is also a system for spawning the Spirograph Particle and sound cues when the VR Pawn gazes at the sphere trigger. Sound cues include a brief verbal description of the current chakra’s name, a positive affirmation to assist in meditation, and a subtle “gong” sound effect when the player teleports.
Once the player has traveled through all seven spheres in a chakra level, they ascend to the next chakra level. There are seven chakras in all.
Adding sound cues to gaze navigation
I want sound to play a big role in my environment, so I set out to make it so when you gaze at the spheres to teleport to that location, it also plays a sound cue. I downloaded several royalty-free sound effects and music from PurplePlanet.com and Dig.CCMixter.com including some deep bell and gong sounds.
I then imported them into Unreal as .wav files. You then convert each one to a Cue by simply right-clicking and choosing “Create Cue.”
Then select the trigger capsule you want to play the sound from, right-click, and choose “Add Event / OnActorBeginOverlap.”

At this point, Unreal switches you to the Level Blueprint, where you can begin to work with the Event. Pull out a wire to Play Sound at Location, then near the Sound input, select the sound cue you want to play from the pull-down menu. It’s that simple!

Animating the salamander – issues with spline
I found a great tutorial from Hangry Bunnies From Mars on how to get an animated mesh to move along a spline. I created the blueprints and everything worked great, except for one problem – the salamander moves sideways to the spline. You can see in the image below, the salamander is 90 degrees rotated from the vector of the spline, which causes him to walk sideways, looking very silly.
I tried rotating the mesh in the Blueprint viewport, but no luck. The rotation may look fine in the viewport, but when you play the animation it reverts to its original orientation.

One thing that’s very nice and handy about Unreal Blueprints is that in the Details panel where you plug in the Animation and Mesh, the slots are color-coded green and magenta so you know which part of the mesh to plug in where. Pretty handy once I noticed it!
Below is the Event Graph blueprint. Everything works splendidly, except the proper orientation.
After spending quite a bit of time on this without an answer, I went to Plan B, which was to animate it using Cinematics.

This method worked beautifully, and kept the proper scaling and rotation.
There are a lot of tutorials out there on how to attach an animated mesh to a spline using Blueprints, which makes me wonder why that method would be preferred over using a Cinematic.

Introduction Tutorial Level
I decided it would be helpful to have a “pre-level” in my game that would introduce new players to the concept of Gaze Teleportation, so they know how to move around the scene.
I imported some assets I had created last semester, and set up a stage of sorts.
I also learned how to create Billboards, which are incredibly useful floating text boxes that always face the player.

I also wanted to make the spheres representing the chakras light up one by one, to simulate what will happen metaphorically in the game. I found a great, easy to follow tutorial by Brigid Costello that was so valuable to learning this quickly.

To do this, I created a Cinematic. I assigned each sphere to a new “Empty Actor Group” then gave each group a Visibility track. You can keyframe when things are Visible (Show) or Invisible (Hide). I timed it so they each become visible a second after the previous one, so they light up in sequence.
I then created a small Cone shape in the scene and assigned a Movement track to it. I keyframed it so it flies backwards away from the figure towards the teleportation sphere, thus drawing the viewer’s attention. Once the viewer gazes at that sphere, they are instantly teleported to that spot.

Overall this was a good lesson for me in Cinematics and Billboards.