One issue in the game was that if a player’s gaze briefly crossed a glowing sphere (firing “OnActorBeginOverlap”), it would trigger the “light up” emitter called the spirograph. But if the player did not hold their gaze on the trigger for the full two seconds, the spirograph would remain instead of disappearing. This caused confusion, especially when multiple spirograph emitters began overlapping.
In the picture below, you can see the gaze cursor (the rectangular bar) intersecting the trigger capsule, causing the spirograph to light up.
To solve this problem, the Level Blueprint was modified to add an “OnActorEndOverlap” event wired to a “DestroyComponent” node. This destroys the emitter (making it disappear) whenever the gaze cursor stops overlapping the trigger.
To see the Level Blueprint in its entirety, watch the video below.
Now all the spirograph “light up” emitters disappear if the player is not looking at them.
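The begin/end overlap fix can be sketched outside of Blueprints as a tiny state machine. The Python below is a minimal, illustrative model only: the class and method names are hypothetical stand-ins for the Blueprint events and nodes (SpawnEmitterAtLocation, DestroyComponent), not Unreal API calls.

```python
class SphereTrigger:
    """Minimal model of the spirograph lifetime logic in the Level Blueprint.

    spirograph_active stands in for the spawned "light up" emitter;
    the two methods mirror OnActorBeginOverlap / OnActorEndOverlap.
    """

    def __init__(self):
        self.spirograph_active = False

    def on_actor_begin_overlap(self):
        # Gaze cursor enters the trigger capsule: light up the spirograph.
        self.spirograph_active = True

    def on_actor_end_overlap(self):
        # Gaze cursor leaves before the 2-second dwell completes:
        # destroy the emitter so stale spirographs never pile up.
        self.spirograph_active = False


trigger = SphereTrigger()
trigger.on_actor_begin_overlap()   # gaze lands on the sphere
trigger.on_actor_end_overlap()     # gaze breaks off early
assert not trigger.spirograph_active
```

The key point is that every begin-overlap must be paired with an end-overlap handler; before the fix, only the first half existed, so the emitters accumulated.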
This particle system was made up of three different emitters. One emitter controls the sacral symbol shape, another controls the dotted ring that shrinks in towards the center, and the third controls the “bubbles” that float upward from the center.
Each emitter uses a custom material to give it a unique appearance. This material is selected in the “Required” node of each emitter (see video).
The second chakra is known as the Sacral, or Svadhishthana. In the body it is located in the pelvic area. Its color association is orange, and it is ruled by the element of water. Energies associated with this chakra are flow, flexibility, creativity, and pleasure.
The player begins this level in a watery cave. The VR Pawn player camera is placed inside a post-processing volume that makes it seem as if the player is underwater. Soft music and burbling water sound effects are audio cues.
A volumetric post-processing shape was placed inside the cave and its settings were tweaked to give the blurriness and green-blue lighting associated with being underwater. Bloom Intensity was set to 2.68 and Threshold to 1.57, a blue-tinted lens flare was added, and the Gaussian Depth of Field settings were adjusted. A small pulsating spotlight adds to the twinkling water-reflection effect.
The player moves around the level by using the Gaze Teleportation system (gazing for 2 seconds at one of the lighted spheres). In this way the player travels around the environment in an upward spiral path. In the center is a column of water falling into a pond. Other watery symbols such as the sea shell remind the player of the chakra’s associations.
In this game, there is a VR Pawn that is basically a camera with a very long cube sticking out of the front of it. When the player gazes around the scene while wearing the VR headset, the camera mirrors their gaze, which allows the cube to sweep around the scene as a sort of pointer. (This happens invisibly in the game, as you can see in the inset VR_Pawn picture above.) This pointer may overlap a trigger capsule when the player’s gaze sweeps over it. If the gaze pointer does overlap the trigger capsule, certain events are triggered. One of those events is the gaze navigation system, which allows the player to “teleport” to the different spheres in the game simply by gazing at them for 2 seconds. This eliminates the need for controllers: the player simply looks where they want to go next. A particle system that looks like a glowing Spirograph lights up around the sphere, indicating that the player’s gaze has landed on it. The player must keep their gaze steady for two seconds, and then they warp to that spot.
Above is the Level Blueprint for the entire gaze navigation mechanism, along with event triggers for particles and sound. What we have is a basic flow of logic that is copy/pasted for each of the 7 sphere triggers. When the user’s gaze overlaps the trigger surrounding that sphere, the logic checks to see if the gaze remained steady on the trigger for 2 seconds, and if so, teleports or “warps” the user’s position (via the VR Pawn) to that sphere.
So, for instance, the game checks whether the VR Pawn is looking at the trigger. If true, a timer is set for 2 seconds. When that time elapses (time remaining is less than or equal to zero), SetActorLocation moves the VR Pawn to the trigger the player was gazing at.
This logic is repeated for each of the 7 sphere triggers.
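The per-sphere dwell-timer logic above can be sketched as ordinary code. This is a minimal Python model, assuming a per-frame tick with a delta time; the class and its members are hypothetical stand-ins for the Blueprint nodes (the timer, the overlap events, and SetActorLocation), not Unreal API.

```python
DWELL_SECONDS = 2.0  # how long the gaze must stay steady before teleporting


class GazeTeleporter:
    """Model of one sphere's gaze-navigation logic: teleport the pawn
    only after the gaze stays on the trigger for a full two seconds."""

    def __init__(self, sphere_location):
        self.sphere_location = sphere_location
        self.pawn_location = (0.0, 0.0, 0.0)
        self.time_remaining = None  # None means the gaze is not on the trigger

    def on_begin_overlap(self):
        self.time_remaining = DWELL_SECONDS   # gaze landed: start the timer

    def on_end_overlap(self):
        self.time_remaining = None            # gaze broke off: cancel the timer

    def tick(self, delta_seconds):
        if self.time_remaining is None:
            return
        self.time_remaining -= delta_seconds
        if self.time_remaining <= 0.0:        # timer elapsed
            # Equivalent of SetActorLocation on the VR Pawn.
            self.pawn_location = self.sphere_location
            self.time_remaining = None


nav = GazeTeleporter((100.0, 0.0, 50.0))
nav.on_begin_overlap()
nav.tick(1.0)          # only one second of steady gaze...
nav.on_end_overlap()   # ...then the gaze breaks off: no teleport
assert nav.pawn_location == (0.0, 0.0, 0.0)

nav.on_begin_overlap()
nav.tick(2.0)          # full 2-second dwell: the warp fires
assert nav.pawn_location == (100.0, 0.0, 50.0)
```

In the actual Level Blueprint this block of logic is simply duplicated once per sphere trigger, with each copy pointing at its own sphere location.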
There is also a system for spawning the Spirograph Particle and sound cues when the VR Pawn gazes at the sphere trigger. Sound cues include a brief verbal description of the current chakra’s name, a positive affirmation to assist in meditation, and a subtle “gong” sound effect when the player teleports.
Once the player has traveled through all seven spheres in a chakra level, they ascend to the next chakra level. There are seven chakras in all.
To mark the transition from this level (root chakra) to the next (a water element), I wanted to introduce a rain effect that is triggered when the player teleports to the 6th sphere. I also added some post-processing volumes to make the atmosphere seem more blue and foggy at that area. At this point the player is literally “going out on a limb” to get ready for the next level.
In the level blueprint, I created OnActorBeginOverlap, which connects to Spawn Emitter at Location. This causes the Rainfall_P particle emitter I created to begin making it rain when the 6th sphere trigger capsule is overlapped.
Here is an image of the Rainfall_P particle emitter system. As you can see, I set the Initial Location very large: 20,000 units along X, Y, and Z.
The trigger works: it rains when the player encounters it. The problem is that the rain only occurs inside a box-shaped area rather than over the entire scene. Instead of feeling like it’s raining in the scene, it seems to just be raining inside a transparent box. The player is actually standing outside the box at this point, so they observe it from afar. I have “Set Fixed Bounds” turned on, because I was told you need it for performance/frame-rate reasons.
I tried dragging the Rainfall_P particle emitter into the scene without tying it to the trigger, and it does indeed rain over the entire area. The problem with that is that it rains all the time, and I only want the rain to begin once the 6th sphere is triggered. I also tried disconnecting the Spawn Emitter node from Get Actor Location, which only made things worse: the rain covered an even smaller area, further away.
I want sound to play a big role in my environment, so I set out to make it so when you gaze at the spheres to teleport to that location, it also plays a sound cue. I downloaded several royalty-free sound effects and music from PurplePlanet.com and Dig.CCMixter.com including some deep bell and gong sounds.
I then imported them as .wav files into Unreal. You have to convert them to a Sound Cue, by simply right-clicking and choosing “Create Cue.”
Then select the trigger capsule you want to play the sound, right-click, and choose “Add Event / OnActorBeginOverlap.”
At this point, Unreal switches you to the Level Blueprint, where you can begin to work with the Event. Pull out a wire to Play Sound at Location, then near the Sound input, select the sound cue you want to play from the pull-down menu. It’s that simple!
To package a project for Windows from Unreal, you must have the proper software installed, including Visual Studio 2015. (See the documentation here.)
I wanted to package my game to send to others, but when I tried File/Package Project/Windows, it packaged for a while and then gave me a fatal error. Several forum posts I read said to delete your Saved and Intermediate folders. This created a tremendous problem when I realized I had started building my introductory level in the Default level, which got deleted when I threw away those folders! Recovering them from the trash only restored a portion of the scene, so I had to rebuild much of it. Lesson learned: don’t build your levels in the Default level! Start a new one.
Another thing I did to simplify the Package process was to turn off all the supported platforms I wouldn’t be using, such as Linux and iOS.
By reading through the lines and lines of text in the Error log, I found this:
UATHelper: Packaging (Windows (64-bit)): Cook: LogTexture: Warning: Cannot retrieve source data for mip 0 of texture Mushroom3-texture
This could mean the image file was corrupt. I deleted the Mushroom mesh and texture altogether from my content file, and then my game packaged successfully! Next steps will be to re-save the files, re-import and try again to see if I can use the mushroom mesh.
After the game packaged successfully, the next step was to test the .exe file in VR. When I launched the game, it loaded, but only showed up on the computer monitor, not in the VR headset. After more digging through forums, I found some advice on Blueprints, so I added a Delay of 2 seconds, then an Enable HMD node, and also an Execute Console Command node (although that last part doesn’t seem to work automatically like it should):
If your game doesn’t start in VR automatically, you can press the Tilde key on your keyboard while in game to open the command console, then type “stereo on”. This will launch it in the headset.
The thing that seemed to work best, though, is a very small setting that is easy to miss and not very well documented. In the Project Settings under Description, look for a checkbox that says “Start in VR” under Settings, towards the bottom.
After doing all this, my game launched in VR successfully.
I found a great tutorial from Hangry Bunnies From Mars on how to get an animated mesh to move along a spline. I created the blueprints and everything worked great, except for one problem: the salamander moves sideways along the spline. You can see in the image below that the salamander is rotated 90 degrees from the spline’s direction vector, which causes him to walk sideways, looking very silly.
I tried rotating the mesh in the Blueprint viewport, but no luck. The rotation may look fine in the viewport, but when you play the animation, it reverts to its original orientation.
One very nice and handy thing about Unreal Blueprints is that in the Details panel where you plug in the Animation and the Mesh, the slots are color-coded green and magenta, so you know which part to plug in where. Pretty handy once I noticed it!
Below is the Event Graph blueprint. Everything works splendidly, except the proper orientation.
After spending quite a bit of time on this without an answer, I went to Plan B, which was to animate it using Cinematics.
This method worked beautifully, and kept the proper scaling and rotation.
There are a lot of tutorials out there on how to attach an animated mesh to a spline using Blueprints, which leads me to wonder why that method might be preferred over using a Cinematic.
I decided it would be helpful to have a “pre-level” in my game that would introduce new players to the concept of Gaze Teleportation, so they know how to move around the scene.
I imported some assets I had created last semester, and set up a stage of sorts.
I also learned how to create Billboards, which are incredibly useful floating text boxes that always face the player.
I also wanted to make the spheres representing the chakras light up one by one, to simulate what will happen metaphorically in the game. I found a great, easy to follow tutorial by Brigid Costello that was so valuable to learning this quickly.
To do this, I created a Cinematic. I assigned each sphere to a new “Empty Actor Group” then gave each group a Visibility track. You can keyframe when things are Visible (Show) or Invisible (Hide). I timed it so they each become visible a second after the previous one, so they light up in sequence.
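The staggered Visibility keyframes amount to a simple timing table. The helper below is purely illustrative (the function name is my own, not an Unreal API): it computes the show time for each sphere under the one-second offset described above.

```python
def visibility_keyframes(num_spheres, interval_seconds=1.0):
    """Return (sphere_index, show_time) pairs: each sphere's Visibility
    track flips to Show one interval after the previous sphere's."""
    return [(i, i * interval_seconds) for i in range(num_spheres)]


# Seven chakra spheres lighting up in sequence, one second apart:
# the first shows at t=0.0 and the last at t=6.0 seconds.
for index, show_time in visibility_keyframes(7):
    print(f"Sphere {index}: Show at {show_time:.1f}s")
```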
I then created a small Cone shape in the scene and assigned a Movement track to it. I keyframed it so it flies backwards away from the figure towards the teleportation sphere, drawing the viewer’s attention. Once the viewer gazes at that sphere, they are instantly teleported to that spot.
Overall this was a good lesson for me in Cinematics and Billboards.