Unity Gaming: Infinite Running with Oculus and Kinect

Infinite Runner – Unity 5 Using Your Body as the Controller

Hey Everyone! NEW ANNOUNCEMENT!

As you know (or are just now learning), the Unity Gaming series on this blog is stepping through how to create a 3D Infinite Runner with Oculus integration and Kinect controls! Here’s a video explaining it below.

Well, there is good news: UNITY 5.2 was released! And as such, I’ve decided that this series would be perfect for exploring 5.2’s new features. Thus, the series has been revamped for Unity 5! (Well, 5.2.)

This post will highlight the major changes so you can integrate Kinect and Oculus into your project!

So let’s get coding!

Unity Changes

This section will walk through all the changes that affect the Infinite Runner I’m showing you how to create. This way, when you follow along with the Unity Gaming series, the code will work in Unity 5.2.

Visual Studio

So if you download the new Unity 5.2 you’ll notice that Visual Studio is now included! This is perfect for development, debugging, and building for Windows 10.

Oculus Integration AND Windows 10

The awesome thing about Unity 5.2 is its optimization for Windows 10, Oculus Rift, and other VR/AR devices!

Unity5Oculus

The Code

Create Your Own

The Unity Gaming series was meant to walk you through creating an infinite runner from scratch, teaching good coding practices along the way. You can go back to the beginning with the first post: Unity Gaming: Good Practices Unity.

Completed Infinite Runner Game

If you’re only interested in the Kinect and Oculus portion, you can start from a completed infinite runner game. The code for the game is on my GitHub here: Base Unity 5 Infinite Runner Game

Here are the step by step guides to integrate:

Completed Game with Oculus and Kinect

If none of those appeal to you and you just want to download the whole thing, the link to the completed repo is here: Gravity Infinite Runner

Happy Coding!

-TheNappingKat

Unity Gaming: Shooting/Aiming (part3) – Oculus

Last time on Unity Gaming – Getting a hold of learning how to aim and shoot…lol.

We created a scene where we aimed by getting the orientation of the camera and where we were looking. But then, when we tried using this same technique with the Oculus it didn’t work! Now our heroes are stuck. How will they get around their null reference exception and get their shooting ability working!?!?!

Well I mentioned it before.

“The other [method] draws a line from an object, like a gun, or hand out to a target.”

So let’s continue on!

Vector/Object Raycasting

There is no main camera in the scene, so now we need to implement the second common way of aiming and shooting, which is from an object. Because we can no longer rely on the camera, we create a ray from an object right in front of our character, at the angle created by the mouse, which will still aim straight to where we are looking. =)

Create an empty GameObject and call it ShotSpawner (this object will act as the origin of our raycast). Child it to the OVRCameraRig.

Note: Remember this can work without the Oculus as well. So you can do this for shooting a gun; just put the ShotSpawner on the tip of the gun!

ShotSpawnerOVR

In my scene I have moved the TargetCube to (0, 0, 15) to get it out of the way while still giving me a reference. And instead of drawing a line for debugging, I’m going to log the tag of whatever my raycast hits.

So let’s write the script that will do this.

On ShotSpawner add a new C# script component and name it OVRShoot. Unlike the other scripts, we will be saving the GameObject our ray intersects, so we need a RaycastHit field and a GameObject field.

    // Stores information about whatever the ray hits
    RaycastHit hit;
    // The GameObject our ray intersected
    GameObject hitObject;

The Update method will then look like this:

    void Update()
    {
        // Cast a ray from this object straight forward, up to 10 units out
        if (Physics.Raycast(transform.position, transform.forward, out hit, 10))
        {
            // Grab the GameObject that owns the collider we hit and log its tag
            hitObject = hit.collider.gameObject;
            Debug.Log(hitObject.tag);
        }
    }

So if the raycast returns true, whatever it hit will be stored in the hit out parameter. We can then get the hit object, IF it has a collider! So remember, if you want to be able to interact with an object, its topmost parent object needs to have a collider we can interact with! (<<– this is super important, and causes a lot of people grief, because the raycast will hit a child object’s collider instead of the parent’s and then the code breaks. So avoid that grief and remember this tidbit, TRUST ME! Avoid common mistakes and developing will go a lot more smoothly.)
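
If you do end up with colliders on child objects, one possible workaround (this is my own sketch, not part of the series’ OVRShoot script) is to walk up to the topmost parent before reading the tag, using Unity’s transform.root:

using UnityEngine;

// Sketch only: a variant of OVRShoot that reads the tag from the topmost parent.
public class OVRShootRoot : MonoBehaviour
{
    RaycastHit hit;
    GameObject hitObject;

    void Update()
    {
        if (Physics.Raycast(transform.position, transform.forward, out hit, 10))
        {
            // transform.root is the highest transform in the hierarchy, so this
            // returns the parent object even when the collider lives on a child.
            hitObject = hit.collider.transform.root.gameObject;
            Debug.Log(hitObject.tag);
        }
    }
}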

Before we run the Script now we need to add tags to the cubes in the scene. I’ve added the Enemy tag to them in the Tag drop down in the Inspector.

OculusRaycastWorking

And there you have it! It’s printing the Enemy tag! So now you can use these methods in your own code. I’ll eventually show you how to use it in the Infinite Runner, because I use a slightly different method there due to some assets I use.
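
As a quick example of using this in your own code, here’s a minimal sketch (my own extension, not the Infinite Runner code) that combines the forward raycast with the Fire1 button to destroy whatever Enemy-tagged object you’re looking at:

using UnityEngine;

// Sketch: fire at whatever the forward raycast hits and destroy it if it's tagged Enemy.
public class OVRShootAndDestroy : MonoBehaviour
{
    RaycastHit hit;

    void Update()
    {
        // Only fire when the player presses Fire1 (left mouse button by default)
        if (Input.GetButtonDown("Fire1") &&
            Physics.Raycast(transform.position, transform.forward, out hit, 10))
        {
            GameObject hitObject = hit.collider.gameObject;
            if (hitObject.CompareTag("Enemy"))
            {
                Destroy(hitObject);
            }
        }
    }
}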

Happy Coding

-TheNappingKat

Unity Gaming: Shooting/Aiming (part 2) – Oculus

So there are two main types of aiming. One uses the camera’s view to shoot a ray directly out in front of you; this is great if you want to look at things to target them. The other draws a line from an object, like a gun or hand, out to a target.

Let’s do the camera one first.

Camera Raycasting

For this I am using the same scene I made in part 1, but I’ve added some cubes in the air to aim at, to test out my targeting.

So I’m going to create a cube called TargetObject and attach it to my FirstPersonCamera as a child GameObject. Then I’ll change the position to (0, 0, 10), and set its box collider to Is Trigger so it doesn’t interfere with my character’s movement. I do this so I can get a good idea of where my ray is going to point. In Unity you can’t see the ray a raycast emits unless you draw it with a Line Renderer or call Debug.DrawLine (but that only draws it in the Scene view). If I were to play the game now, I should see a cube 10 units away from where I’m looking at all times.
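
If you’d rather do that same setup from code, here’s a minimal sketch, assuming a script attached to the FirstPersonCamera (the class name and values are just for illustration, not part of the series):

using UnityEngine;

// Sketch: builds the TargetObject reference cube in code instead of in the editor.
public class TargetObjectSetup : MonoBehaviour
{
    void Start()
    {
        // Create the cube and child it to whatever this script is on (the camera)
        GameObject target = GameObject.CreatePrimitive(PrimitiveType.Cube);
        target.name = "TargetObject";
        target.transform.SetParent(transform, false);

        // 10 units straight ahead of the camera
        target.transform.localPosition = new Vector3(0f, 0f, 10f);

        // Mark the box collider as a trigger so it doesn't block the character
        target.GetComponent<BoxCollider>().isTrigger = true;
    }
}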

TargetObject

 

Great, so now let’s write the raycast like before. Create a new C# script called RaycastShoot.

In its Update method add the following lines:

void Update () {
    // Build a ray from the main camera out through the current mouse position
    Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
    RaycastHit hit;
    // If the ray hits something within 100 units, draw a debug line to the hit point
    if (Physics.Raycast(ray, out hit, 100))
        Debug.DrawLine(ray.origin, hit.point);
}

What the script is doing is creating a Ray whose origin is the main camera and whose direction points from the camera through the mouse position. Then, if the raycast hits anything, we draw a line. In our scene the line should always be drawn, since our target object is in front of our character.

DebugRaycast

It’s difficult to see the line, but you can tell in the pic it’s slightly darker than the others.

Now let’s try with the Oculus! You’ll see why we have to change our methods of aiming in a second.

Oculus and Camera Aiming

If you haven’t set up your environment to integrate the Oculus, don’t worry; I’ve posted about it before, here: Oculus setup! Again, you don’t need the hardware to develop for VR.

First let’s disable our main Character by clicking on the checkbox in the upper left-hand corner of the Inspector; when disabled there should no longer be a check, and the object should be greyed out in the Hierarchy.

Cool now click and drag in the Oculus Player Controller.

OVRStartScene

Then create another target cube so we have a reference for where we are looking. I moved it under the OVRCameraRig to position (0, 0, 5).

Next we need to add the Raycast script like before. I’ve added it to the OVRCameraRig.

OVRTargetCube

Now when we run it, the game still plays, but we don’t see the line in the Scene view like before. This is because we get a NullReferenceException from the RaycastShoot script.

OculusError

The error above is referring to this line of code:

void Update () {
    Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition); //this one
    RaycastHit hit;
    if (Physics.Raycast(ray, out hit, 100))
        Debug.DrawLine(ray.origin, hit.point);
}

There is no main camera in the scene. So now we need to implement the second common way of aiming and shooting, which is from an object. Because we can no longer rely on the camera, we must create a ray from an object right in front of our character, at the angle created by the mouse. And since the mouse dictates the orientation of the head, we will still aim straight to where we are looking. =)

Correct Oculus shooting and the second type of shooting will be in part 3…

Happy Coding!

-TheNappingKat

Unity Gaming: Shooting (part 1)

Okay so this is a little jump ahead of the Unity Gaming Infinite Runner Series. I’ve gotten a lot of questions about this so I’m writing about it now. Because it’s out of order, the examples are in a blank scene, not the Infinite Runner main scene.

Okay, so, most likely when you are shooting something, it’s either coming right from the center of the screen or it’s, for the most part, originating from another object like a hand or gun. This gets kinda complicated when all of a sudden you are using two cameras to judge the “center” of what you are looking at, like when using an Oculus…

But never fear, learning the general method of shooting isn’t too bad; and then adding the Oculus bit will be easy to understand.

Raycasting

So when you aim for or align something in real life, do you draw an invisible line from where you are to the thing you are trying to hit? Yes? Great! If not, well, cool, but that’s what raycasting is.
There are 5 possible parameters for the Raycast method in Unity:

  1. Vector3 origin – the position in 3D space where you want the ray to start
  2. Vector3 direction – the direction you want the ray to point; the first two parameters make a Ray
  3. *RaycastHit hitInfo – gets whatever the ray hits first and stores it in this parameter
  4. *float maxDistance – the magnitude of the ray, aka how far out you want the ray to reach
  5. *int layerMask – what the ray can hit

* Indicates that these parameters are optional

There are also two main ways the Raycast method is written.

  • Raycast(Vector3 origin, Vector3 direction, float maxDistance, int layerMask)
  • Raycast(Vector3 origin, Vector3 direction, out RaycastHit hitInfo, float maxDistance, int layerMask)

In Unity the Raycast method returns a Boolean value: true if the ray hit something, and false if nothing was hit. I should also mention that in Unity, raycasts do not detect colliders that the ray starts inside of. Also, if you are animating or moving colliders, you should keep the Raycast call in a FixedUpdate method so that the physics engine can update its data structures before the raycast is tested against a collider at its new position.
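
Here’s a quick sketch of both forms called from FixedUpdate; the “Enemy” layer and the distances are just example values of mine, not something from the series:

using UnityEngine;

// Sketch: both Raycast overloads, restricted to a hypothetical "Enemy" layer.
public class RaycastExamples : MonoBehaviour
{
    void FixedUpdate()
    {
        // Only consider colliders on the "Enemy" layer
        int layerMask = LayerMask.GetMask("Enemy");

        // Form 1: just ask whether anything was hit within 100 units
        bool hitSomething = Physics.Raycast(transform.position, transform.forward, 100f, layerMask);
        Debug.Log("Hit anything? " + hitSomething);

        // Form 2: also store what was hit in the out parameter
        RaycastHit hitInfo;
        if (Physics.Raycast(transform.position, transform.forward, out hitInfo, 100f, layerMask))
        {
            Debug.Log("Hit " + hitInfo.collider.name + " at distance " + hitInfo.distance);
        }
    }
}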

Aiming

Now there are 2 types of aiming. One is simply shooting in the direction the player is facing, and one is shooting in the direction the player is looking.
Let’s do the easy one first – aiming where the player is facing, aka not really aiming, just shooting.
Cool, so in my scene I have the regular First Person Character from the Unity Sample Assets package, and I created some ground by scaling a cube to dimensions (10, 1, 30).

StartScene

Now let’s create a shoot script called simpleShoot.

CreateSimpleScript

In the Update function we want to spawn a new sphere every time I click, and shoot it in the “forward direction”. So let’s add the following lines.

void Update () {
    if (Input.GetButtonDown("Fire1"))
    {
        // Create a sphere "bullet" at the character's position and orientation
        GameObject clonedBullet = GameObject.CreatePrimitive(PrimitiveType.Sphere);
        clonedBullet.transform.position = transform.position;
        clonedBullet.transform.rotation = transform.rotation;

        // Give it physics and push it in the character's forward direction
        clonedBullet.AddComponent<Rigidbody>();
        clonedBullet.GetComponent<Rigidbody>().AddForce(clonedBullet.transform.forward * 2000);
    }
}

Cool and if you check it out I’m shooting and the bullets go in the forward direction of the transform of my character! YAY!

SimpleShootScene

Mouse Aim

Okay, but that doesn’t shoot the ball upward when I’m looking up, you say to me. Yes, I know; that’s the next thing we are going to do. =) …in part two!

Happy Coding! Part 2 coming soon =D

-TheNappingKat

Oculus Headset Unity Error: No HMD Detected

Hi All,

I solved the NO HMD DETECTED, Tracker Connected error you get when trying to do extended screen while using the headset with the Unity Editor.

NoHMD

Well I got it working, not necessarily solved.

Specs:

  • Lenovo X1 Carbon
  • Intel Core i7-3667U CPU @ 2.0GHz
  • 8GB RAM
  • Windows 8.1
  • 64-bit
  • Intel HD Graphics 4000
  • Oculus DK2
  • SDK 5.0.1

To start, I detached my Oculus from the computer, reattached it, and made sure it was working in the normal Direct HMD Access mode. It was.

1) Then I hit “Windows + P” and made sure my projection setting was on Extend.

2) Then I switched the mode to Extended in the Oculus Configuration Utility.

3) Then on the desktop I right-clicked and hit Screen Resolution.

4) I selected the second screen and hit “Detect”; a message came up saying “Another display not detected.”

5) I made the Oculus screen primary, then switched back to the main computer being primary, and it worked. The screen now appeared in the Oculus, but the orientation was off, so I just adjusted it in the Screen Resolution window.

DisplaySettings

 

Now the Oculus Configuration Utility looks like this, but it works.

AttachedNoTracker

 

 

In the Unity Editor I can move the Game tab to the headset screen and maximize it. I can still see the awkward black rim around the screen, but it’s better than nothing. Hopefully the Oculus team can fix this soon.

 

OculusExtendedScreenFull

 

Hope this helps, Happy Coding!

 

-TheNappingKat

Unity Gaming: Integrating Oculus

Setting Up

GREAT NEWS! In 2015 you no longer need Unity Pro Edition to integrate Oculus into your projects. YAY!

Things you’ll need for integration:

  • An Oculus (really you don’t need one to develop but how else will you test it out?)
  • Oculus SDK
  • Oculus Runtime
  • Unity 4 Integration

Okay so, the first thing you’ll need is to have your game all set up and running in Unity. If you’ve been following my blog, then you should have the bulk of the game running.

Cool. Next we need to grab the Oculus pieces from their site.

https://developer.oculus.com/downloads/

Now if you have a Mac or Linux machine, download from those links.

WebsiteDownloads

 

After you download the files, you need to install the runtime. Then restart your computer.

Integrating into Unity

Extract the files from the Unity Integration package you downloaded. In Unity, go to Assets > Import Package > Custom Package.

Find where you extracted the files and navigate to the Unity Plugin.

ImportingPackage2

 

Then hit import.

ImportingPackage3

Now you should have a new folder in your Assets called OVR

AssetsOVR

Cool, so now that it’s integrated, let’s start using the Oculus camera in the game.

Using Oculus Cameras

Now using the Oculus Package is super easy. Oculus has already created prefabs for developers to use. They have a prefab with just the camera rig as well as one with the rig and a character motor.

OVRPrefabs

To use them, just do what you would normally do with prefabs: click and drag one into your scene. I created a test scene called OVRTest to make sure everything would work without worrying about the infinite platforms generating around me.

I placed the OVRPlayerController at 0, 2, 0.

OVRinScene

Cool! Now try running the game. You should have something that looks like this:

OVRGame

 

YAY! See, super easy. The double circle screen is what will be fed to your Oculus, and with the lenses and the headset it should become one image with a 3-dimensional feel.

Now that you have the basic character installed you can add it to the main game scene and try it with the infinite platforms.

 

Happy Coding!

-TheNappingKat