I want sound to play a big role in my environment, so I set out to make each teleport sphere play a sound cue when you gaze at it. I downloaded several royalty-free sound effects and music tracks from PurplePlanet.com and Dig.CCMixter.com, including some deep bell and gong sounds.
I then imported them into Unreal as .wav files. You have to convert each one to a Cue by simply right-clicking it and choosing “Create Cue.”
Then select the trigger capsule you want to play the sound, right-click, and choose “Add Event / OnActorBeginOverlap.”
At this point, Unreal switches you to the Level Blueprint, where you can begin to work with the Event. Drag out a wire to a Play Sound at Location node, then use the pull-down menu next to the Sound input to select the sound cue you want to play. It’s that simple!
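For readers who prefer code, the same overlap-plays-sound behavior can be sketched in Unreal C++. This is only a hedged equivalent of the Blueprint above, not my actual setup; the class name ATeleportTrigger and the TeleportCue property are hypothetical, and the snippet assumes an Unreal project build environment.

```cpp
#include "GameFramework/Actor.h"
#include "Kismet/GameplayStatics.h"
#include "Sound/SoundCue.h"

UCLASS()
class ATeleportTrigger : public AActor
{
    GENERATED_BODY()

public:
    // The imported .wav converted to a Sound Cue, assigned in the editor.
    UPROPERTY(EditAnywhere, Category = "Audio")
    USoundCue* TeleportCue;

    virtual void BeginPlay() override
    {
        Super::BeginPlay();
        // Equivalent of the OnActorBeginOverlap event node.
        OnActorBeginOverlap.AddDynamic(this, &ATeleportTrigger::HandleOverlap);
    }

    UFUNCTION()
    void HandleOverlap(AActor* OverlappedActor, AActor* OtherActor)
    {
        // Equivalent of the Play Sound at Location node.
        if (TeleportCue)
        {
            UGameplayStatics::PlaySoundAtLocation(this, TeleportCue, GetActorLocation());
        }
    }
};
```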
I found a great tutorial from Hangry Bunnies From Mars on how to get an animated mesh to move along a spline. I created the Blueprints and everything worked great, except for one problem: the salamander moves sideways along the spline. You can see in the image below that the salamander is rotated 90 degrees from the spline’s direction vector, which causes him to walk sideways and look very silly.
I tried rotating the mesh in the Blueprint viewport, but no luck. The rotation may look fine in the viewport, but when you play the animation, it reverts to its original orientation.
One thing that’s very nice and handy about Unreal Blueprints is that in the Details panel where you plug in the Animation and the Mesh, the slots are color-coded green and magenta, so you know which part of the mesh to plug in where. Pretty handy once I noticed it!
Below is the Event Graph blueprint. Everything works splendidly, except the proper orientation.
After spending quite a bit of time on this without an answer, I went to Plan B, which was to animate it using Cinematics.
This method worked beautifully, and kept the proper scaling and rotation.
There are a lot of tutorials out there on how to attach an animated mesh to a spline using Blueprints, which makes me wonder why that method would be preferred over using a Cinematic.
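For reference, the spline-follow logic from those tutorials can be sketched in Unreal C++, including a yaw offset that, in hindsight, might have corrected the sideways orientation. This is a hedged sketch, not the tutorial’s code: AFollowerActor is hypothetical, and Spline, Speed, DistanceAlongSpline, and MeshYawOffset are assumed member variables.

```cpp
// Assumed members: USplineComponent* Spline; float Speed;
// float DistanceAlongSpline; float MeshYawOffset;
void AFollowerActor::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    DistanceAlongSpline += Speed * DeltaSeconds;

    const FVector Location = Spline->GetLocationAtDistanceAlongSpline(
        DistanceAlongSpline, ESplineCoordinateSpace::World);
    FRotator Rotation = Spline->GetRotationAtDistanceAlongSpline(
        DistanceAlongSpline, ESplineCoordinateSpace::World);

    // A mesh authored 90 degrees off the spline direction walks sideways;
    // adding a constant yaw offset re-aligns it with the path.
    Rotation.Yaw += MeshYawOffset; // e.g. -90.0f

    SetActorLocationAndRotation(Location, Rotation);
}
```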
I decided it would be helpful to have a “pre-level” in my game that would introduce new players to the concept of Gaze Teleportation, so they know how to move around the scene.
I imported some assets I had created last semester, and set up a stage of sorts.
I also learned how to create Billboards, which are incredibly useful floating text boxes that always face the player.
I also wanted to make the spheres representing the chakras light up one by one, to simulate what will happen metaphorically in the game. I found a great, easy-to-follow tutorial by Brigid Costello that was valuable for learning this quickly.
To do this, I created a Cinematic. I assigned each sphere to a new “Empty Actor Group” then gave each group a Visibility track. You can keyframe when things are Visible (Show) or Invisible (Hide). I timed it so they each become visible a second after the previous one, so they light up in sequence.
I then created a small Cone shape in the scene and assigned a Movement track to it. I keyframed it so it flies backwards away from the figure towards the teleportation sphere, thus drawing the viewer’s attention. Once the viewer gazes at that sphere, they are instantly teleported to that spot.
Overall this was a good lesson for me in Cinematics and Billboards.
Recently I began user testing on my prototype. I have been able to sample a variety of ages and levels of VR experience. A few things have surfaced repeatedly:
“Climbing the tree” (moving upward via gaze teleportation to each sphere up and around the tree root) causes more than half the testers to experience vertigo and anxiety about heights. Users literally feel like they could fall. To address this, I plan to insert visual “platforms” in the shape of tree fungus so users feel they are standing on something as they teleport.
Some people prefer to stand, while three people aged 60+ who also experience balance problems in real life prefer to sit. I will revise the Protocol research document to indicate that people will be asked whether they prefer to sit or stand.
Switching to the Oculus platform means that during development I need to be mindful that there is now a cord coming from the headset, which could cause issues if users are expected to continuously turn around (the spiral pattern moving upwards).
Almost everyone responded positively to the music used in the scene. I plan to add additional musical elements, like bells or chimes, as users interact with the scene.
Most people seemed confused by the method of gaze navigation; it had to be verbally explained to them. I plan to add an introductory level that demonstrates gaze navigation, and also animated assets that guide users by drawing their gaze to the appropriate area.
Not everyone is comfortable enough with VR to even want to try it. People who are uncomfortable with “video games” or those who have motion sickness concerns will most likely choose not to participate.
This week I continued to develop the gaze navigation system. I created an Emissive (glowing) material for each of the seven spheres and placed them in an upward spiral pattern around the tree root. Near the base of the roots you can see the VR Pawn Actor, which has a long “cursor” attached. When the player moves their head and the cursor encounters one of the spheres, the player teleports to that location.
I updated the tree to a more realistic looking bark texture, and tweaked some of the lighting settings from last time. I also added the mushrooms.
After tweaking the lighting, I used the Build Lighting feature, and received the error that I needed to set a “Lightmass Importance Volume,” which I did after looking up a brief tutorial. As you can see below, there are thousands of light calculations going on, which tends to slow down your game frame rate.
By placing a Lightmass Importance Volume, you can control where Unreal Engine puts most of its energy when calculating static lights, which improves performance. Below you can see far fewer calculation instances.
Unreal Engine has a section in its Gear VR documentation on how to build cameras and set up functions for the Gear VR touchpad and the motion controller. While these are helpful, it would be great if there were more support for Gear VR built into Unreal instead of developers having to custom-build things. Also, be aware that sometimes Unreal’s documentation is not up to date, and you may need to cross-reference Oculus’s Developer documentation as well.
Note: if you are developing for Oculus, especially for Gear VR, there are certain plugins that need to be turned on in order to get the Blueprints you need.
Oculus has a section in their documentation on Blueprints, and how to turn on the plugin. So far the documentation is a bit vague and sketchy, but I hope to continue deciphering it.
“The Input Blueprint provides a control interface for Touch, Oculus remote, and the Gear VR Controller… Gear VR Controller clickpad events are reported in the Input Blueprint as thumbstick events.”
What would be really helpful would be an option to use the Gear VR touchpad or motion controller in the built-in Inputs/Action Mappings. So far that does not seem to be available. When you can use the Action and Axis bindings in the Input section, they work across the entire game and all levels, instead of having to custom-build a Blueprint each time.
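To illustrate why project-wide bindings are appealing: once an action mapping exists under Project Settings > Input, a pawn only needs one binding call for it to work in every level. This is a hedged sketch, assuming a hypothetical AVRPawn class, a hypothetical “Teleport” action mapping, and a hypothetical OnTeleportPressed handler.

```cpp
// Binds the "Teleport" action mapping (defined once in Project Settings)
// to a handler; whatever hardware event the platform maps to that action
// then triggers it game-wide, with no per-level Blueprint needed.
void AVRPawn::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
{
    Super::SetupPlayerInputComponent(PlayerInputComponent);
    PlayerInputComponent->BindAction("Teleport", IE_Pressed, this, &AVRPawn::OnTeleportPressed);
}
```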
In this section, we added to our Blueprint, replacing the PrintString node with a function that moves the VR_Pawn (the player’s point of view) to the trigger capsule’s location when the VR_Pawn pointer overlaps the trigger capsule. In other words, when you look at something, you instantly teleport to it. Below is a pic of the player standing on top of the cauldron after gazing at it.
Below is an extreme close-up on the salamander – once the player gazed at it, the player was teleported directly to the trigger capsule surrounding the salamander.
Obviously the placement of the capsules needs to be fine-tuned so the player doesn’t land so close to the object, but the basic interactions are now functional.
Here is the Blueprint that creates this action. Notice the Get Actor Location and Set Actor Location functions.
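The Get Actor Location / Set Actor Location pattern can be sketched in Unreal C++ as an overlap handler on the trigger capsule. This is only a hedged equivalent of the Blueprint, not my actual code: ATeleportCapsule, HandleOverlap, and the “VRPawn” actor tag are hypothetical names.

```cpp
// When the pawn's gaze cursor overlaps this trigger capsule,
// move the pawn to the capsule's location.
UFUNCTION()
void ATeleportCapsule::HandleOverlap(AActor* OverlappedActor, AActor* OtherActor)
{
    if (OtherActor && OtherActor->ActorHasTag(TEXT("VRPawn")))
    {
        // Get Actor Location (this trigger) feeding Set Actor Location (the pawn).
        OtherActor->SetActorLocation(GetActorLocation());
    }
}
```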
In this part of the Pluralsight tutorial “Making a VR Experience in Unreal Engine” we get into setting up a trigger capsule for our objects, then using Blueprints to create a PrintString when our gaze encounters the capsule surrounding the object.
We went into the Blueprint for the VR_Pawn and clicked the Viewport tab at the top of the Blueprint editor to see the actual object, then created a very long Cube parented to the camera. This allows us to touch, or “Overlap,” objects in the scene when the camera points at them.
When the gaze/pointer encounters the object, the PrintString prints the word “Hit” instead of “Miss”.
Next, we built the Blueprints to make the program print different words on the screen depending on which object it overlaps.
WAIT – there is a problem. When the gaze encounters the trigger, it prints the wrong description.
I literally had my wires crossed. I corrected the PrintString outputs.
Now the Cauldron string is connected to the Cauldron trigger, same for salamander.
We have successfully demonstrated creating an event (PrintString) when the VR_Pawn camera pointer overlaps a Trigger Capsule.
Today I’m following a Pluralsight tutorial on creating a first person camera and teleportation system using Blueprints in Unreal Engine. In this Blueprint, we are creating a way for the player to “teleport” from VR_Pawn to VR_Pawn1 when the number 1 key is pressed on the keyboard, and VR_Pawn2 when the number 2 key is pressed, allowing the player to move around the scene.
Here are the views as I click between the original, 1 and 2 keys. (Also, I updated the tree a bit from the previous image, but have not tested it in VR yet to see if the geometry “explodes.”)
The Blueprint allowed me to successfully move around the scene using keyboard keys while experimenting in the VR Preview.
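The keyboard version of this Blueprint can be sketched in Unreal C++ as key bindings that snap the pawn to preset locations. A hedged sketch, not the tutorial’s code: AVRPawn, the handler names, and the editor-assigned TeleportTargets array are all hypothetical.

```cpp
// Assumed member, set in the editor:
// UPROPERTY(EditAnywhere) TArray<AActor*> TeleportTargets;
void AVRPawn::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
{
    Super::SetupPlayerInputComponent(PlayerInputComponent);
    // Pressing 1 or 2 jumps the player to the corresponding target pawn.
    PlayerInputComponent->BindKey(EKeys::One, IE_Pressed, this, &AVRPawn::TeleportToFirst);
    PlayerInputComponent->BindKey(EKeys::Two, IE_Pressed, this, &AVRPawn::TeleportToSecond);
}

void AVRPawn::TeleportToFirst()
{
    if (TeleportTargets.IsValidIndex(0))
    {
        SetActorLocation(TeleportTargets[0]->GetActorLocation());
    }
}

void AVRPawn::TeleportToSecond()
{
    if (TeleportTargets.IsValidIndex(1))
    {
        SetActorLocation(TeleportTargets[1]->GetActorLocation());
    }
}
```

For Gear VR, the same handlers could instead be bound to an action mapping rather than hard-coded keys.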
This was helpful for understanding the concept, but in order for it to work in the Gear VR headset, I will need to understand how to map inputs from Gear VR instead of keyboard keys.