BUILDING VR
APPLICATIONS FOR
GOOGLE CARDBOARD
Mark Billinghurst
mark.billinghurst@unisa.edu.au
January 20th 2017
Mark Billinghurst
▪ Director, Empathic Computing Lab
University of South Australia
▪ Past Director of HIT Lab NZ,
University of Canterbury
▪ PhD Univ. Washington
▪ Research on AR, mobile HCI,
Collaborative Interfaces
▪ More than 300 papers in AR, VR,
interface design
What You Will Learn
• Definitions of VR, Brief History of VR
• Introduction to Mobile VR/Google Cardboard
• Introduction to Unity3D
• Complete 7 projects
• 1 Building a Unity Scene
• 2 Immersive 360 Panorama
• 3 Creating a 3D VR Scene
• 4 Adding Movement
• 5 Gaze based interaction
• 6 Menu input
• 7 Moving Menus
• Cardboard interface design guidelines
• Resources for learning more
Introduction to Virtual Reality
Virtual Reality
Computer generated multi-sensory simulation of an
artificial environment that is interactive and immersive.
What is Virtual Reality?
Virtual reality is..
a computer technology that replicates an
environment, real or imagined, and simulates a
user's physical presence and environment to
allow for user interaction. (Wikipedia)
• Defining Characteristics
• Environment simulation
• Presence
• Interaction
Key Technologies
• Autonomy
• Head tracking, body input
• Intelligent systems
• Interaction
• User input devices, HCI
• Presence
• Graphics/audio/multisensory output
• Multisensory displays
• Visual, audio, haptic, olfactory, etc
Types of VR
Brief History of Virtual Reality
https://immersivelifeblog.files.wordpress.com/2015/04/vr_history.jpg
Desktop VR - 1995
• Expensive - $150,000+
• 2 million polys/sec
• VGA HMD – 30 Hz
• Magnetic tracking
Desktop VR 2016
• Graphics Desktop
• $1,500 USD
• >4 Billion poly/sec
• $600 HMD
• 1080x1200, 90Hz
• Optical tracking
• Room scale
Oculus Rift
Sony Morpheus
HTC/Valve Vive
2016 - Rise of Consumer HMDs
Google Cardboard - Mobile VR
Computer Based vs. Mobile VR
Mobile VR: Google Cardboard
• Released 2014 (Google 20% project)
• >5 million shipped/given away
• Easy to use developer tools
Version 1.0 vs Version 2.0
• Version 1.0 – Android focused, magnetic switch, small phone
• Version 2.0 – Touch input, iOS/Android, fits many phones
Many Different Cardboard Viewers
Multiple Mobile VR Viewers Available
• In 2016 – 46 million possible desktop VR users vs. 400 million mobile VR users
• https://thoughts.ishuman.co/vr-will-be-mobile-11529fabf87c#.vfcjzy1vf
Mobile VR Applications
Types of VR Experiences
• Immersive Spaces
• 360 Panoramas/Movies
• High visual quality
• Limited interactivity
• Changing viewpoint orientation
• Immersive Experiences
• 3D graphics
• Lower visual quality
• High interactivity
• Movement in space
• Interact with objects
Immersive Panorama
• High quality 360 image or video surrounding user
• User can turn head to see different views
• Fixed position
Cardboard Camera (iOS/Android)
• Capture 360 panoramas
• Stitch together images on phone
• View in VR on Cardboard
Example Panorama Applications
• Within
• http://with.in
• High quality 360 VR content
• New York Times VR Experience
• NYTVR application
• Documentary experiences
• YouTube 360 Videos
• Collection of 360 videos
Google Cardboard App
• 7 default experiences
• Earth: Fly on Google Earth
• Tour Guide: Visit sites with guides
• YouTube: Watch popular videos
• Exhibit: Examine cultural artifacts
• Photo Sphere: Immersive photos
• Street View: Drive along a street
• Windy Day: Interactive short story
100s of Google Play Cardboard apps
Building VR Experiences
What You Need
• Cardboard Viewer/VR Viewer
• https://vr.google.com/cardboard/
• Smart phone
• Android/iOS
• Authoring Tools/SDK
• Google VR SDK
• Unity/Unreal game engine
• Non programming tools
• Content
• 3D models, video, images, sounds
Software Tools
• Low level SDKs
• Need programming ability
• Java, C#, C++, etc
• Example: Google VR SDK (iOS, Android)
• https://developers.google.com/vr/
• Game Engines
• Powerful, need scripting ability
• Unity - https://unity3d.com/
• Unreal - https://www.unrealengine.com/vr
• Combine with VR plugins (HMDs, input devices)
• Google VR Unity plugin
Unity 3D Game Editor
Tools for Non-Programmers
• Focus on Design, ease of use
• Visual Programming, content arrangement
• Examples
• Insta-VR – 360 panoramas
• http://www.instavr.co/
• Vizor – VR on the Web
• http://vizor.io/
• A-frame – HTML based
• https://aframe.io/
• ENTiTi – Both AR and VR authoring
• http://www.wakingapp.com/
• Eon Creator – Drag and drop tool for AR/VR
• http://www.eonreality.com/eon-creator/
Google VR SDK for Unity
Free Download
https://developers.google.com/vr/unity/download/
Features:
1. Lens distortion correction
2. Head tracking
3. 3D calibration
4. Side-by-side rendering
5. Stereo geometry configuration
6. User input event handling
7. VR emulation mode, etc..
Unity Google VR SDK
INTRODUCTION TO UNITY
Unity Overview (see www.unity3d.com)
• Created in 2005
• Tool for creating games and 2D/3D applications
• Advanced graphics support
• Support for multiplayer, analytics, performance, ads, etc
• Cross Platform Game Engine
• One of the most popular (> 1.5 million developers)
• 27 platforms (iOS, Android, Windows, Mac, etc)
• Multiple license models
• Free for personal use/small business
• Large developer community
• Tutorials, support
• User generated content/assets
SETUP
Download and Install (for Android)
• Go to unity3d.com/download
• Use Download Assistant – pick components you want
• Make sure to install Android components
• Also install Android Studio (https://developer.android.com/studio/)
Getting Started
• First time running Unity you’ll be asked to create a project
• Specify project name and location
• Can pick asset packages (pre-made content)
Unity Interface
• Toolbar, Scene, Hierarchy, Project, Inspector
Customizable Interface
Building Scenes
• Use GameObjects:
• Containers that hold different components
• Eg 3D model, texture, animation
• Use Inspector
• View and edit object properties and other settings
• Use Scene View
• Position objects, camera, lights, other GameObjects etc
• Scripting
• Adding interaction, user input, events, etc
GameObjects
• Every object in Scene is a GameObject
• GameObjects contain Components
• Eg Transform Component, Camera Components
• Clicking on object will show values in Inspector panel
Adding 3D Content
• Create 3D asset using modeling package, or download
• FBX, OBJ file formats for 3D models
• Add file to Assets folder in Project
• When project opened 3D model added to Project View
• Drag mesh from Project View into Hierarchy or Scene View
• Creates a game object
Positioning/Scaling Objects
• Click on object and choose transform
Unity Prefabs
• When you download assets, they often include Prefabs (blue squares)
• Use them by dragging and dropping into the scene hierarchy
• Prefab is a way of storing a game object with properties and
components already set
• Prefab is a template from which you can create new object
instances in the scene
• Changes to a prefab asset will change all instances in the scene
Unity Asset Store
• Download thousands of models, scripts, animations, etc
• https://www.assetstore.unity3d.com/
PROJECT 1: BUILDING A
UNITY SCENE
Making a Simple Scene - Key Steps
1. Create New Project
2. Create Game Object
3. Moving main camera position
4. Adding lights
5. Adding more objects
6. Adding physics
7. Changing object materials
8. Adding script behaviour
Create Project
• Create new folder and project
New Empty Project
Create GameObject
• Load a Sphere into the scene
• GameObject -> 3D Object -> Sphere
Moving main camera
• Select Main Camera
• Select translate icon
• Move camera
Add Light
• GameObject -> Light -> Directional Light
• Use inspector to modify light properties (colour, intensity)
Add Physics
• Select Sphere
• Add Rigidbody component
• Add Component -> Physics -> RigidBody
• or Component -> Physics -> RigidBody
• Modify inspector properties (mass, drag, etc)
Add More Objects
• Add several cubes
• GameObject -> 3D Object -> Cube
• Move cube
• Add Rigid Body component (uncheck gravity)
Add Material
• Assets -> Create -> Material
• Click Albedo colour box in inspector
• Select colour
• Drag asset onto object to apply
Add Script
• Assets -> Create -> C# script
• Edit script using MonoDevelop
• Drag script onto Game Object
Example C# Script
GameObject Rotation
using UnityEngine;
using System.Collections;
public class spin : MonoBehaviour {
    // Use this for initialization
    void Start () {
    
    }
    
    // Update is called once per frame
    void Update () {
        this.gameObject.transform.Rotate(Vector3.up*10);
    }
}
Scripting C# Unity 3D
• void Awake():
• Called when the scene loads and the game object is active
• void Start():
• Called before the first frame update
• void FixedUpdate():
• Called before physics calculations are made
• void Update():
• Called every frame before rendering
• void LateUpdate():
• Once per frame after update finished
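A minimal skeleton (not from the original slides) showing where each of these callbacks sits in a MonoBehaviour:

using UnityEngine;

public class LifecycleExample : MonoBehaviour {
    void Awake () {
        // Runs once when the object loads, before Start()
    }
    void Start () {
        // Runs once, just before the first frame update
    }
    void FixedUpdate () {
        // Runs at a fixed timestep, before each physics step
    }
    void Update () {
        // Runs once per frame, before rendering
    }
    void LateUpdate () {
        // Runs once per frame, after all Update() calls have finished
    }
}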
Final Spinning Cube Scene
PROJECT 2:
IMMERSIVE 360 PANORAMA
Key Steps
1. Create a new project
2. Load the Google VR SDK
3. Load a panorama image asset
4. Create a Skybox
5. Add to VR scene
6. Deploy to mobile phone
New Project
Load Google VR SDK
• Assets -> Import Package -> Custom Package
• Navigate to GoogleVRForUnity.unitypackage
• Uncheck iOS (for Android build)
Load Cardboard Main Camera
• Drag GvrViewerMain prefab into Hierarchy
• Assets -> GoogleVR -> Prefabs
• Keep Main Camera
Panorama Image Asset
• Find/create suitable panorama image
• Ideally 2K or higher resolution image in cubemap layout
• Google “Panorama Image Cubemap”
Capturing Panorama
• Stitching photos together
• Image Composite Editor (Microsoft)
• AutoPano (Kolor)
• Using 360 camera
• Ricoh Theta-S
• Fly360
Image Composite Editor (Microsoft)
• Free panorama stitching tool
• http://research.microsoft.com/en-us/um/redmond/projects/ice/
AutoPano (Kolor)
• Finds overlapping images and stitches them together into a panorama
• http://www.kolor.com/autopano/
Add Image Asset to Project
• Assets -> Import Asset
• Select desired image
• In Inspector
• Set Texture Type to Cubemap
• Set mapping to Latitude-Longitude (Cylindrical)
• Hit Apply button
Create Skybox Material
• Assets -> Create -> Material
• Name material - e.g. 'Sky'
• Set Shader to Skybox -> Cubemap
• Drag texture to cubemap
Create Skybox
• Window -> Lighting
• new window pops up
• Drag Skybox material into
Skybox field
Panorama Image Appears in Unity
One Last Thing..
• Check Clear Flags on Camera is set to Skybox
• Select Main Camera
• Look at Camera in Inspector
• Clear Flags -> Skybox
Test It Out
• Hit play button
• Use alt/option key + mouse to look around
Deploying to Phone (Android)
1. Plug phone into USB
• Put phone into debug mode
2. Open Build Settings
3. Change Target platform to Android
4. Resolution and Presentation
• Default Orientation -> Landscape Left
5. Under Player Settings
• Edit Bundle Identifier – eg com.UniSA.cubeTest
• Minimum API level
6. Build and Run
• Select .apk file name
Setting Path to Android
• You may need to tell Unity
where the Android SDK is
• Set the path:
• Edit -> Preferences ->
External Tools
Running on Phone
• Droid@Screen View on Desktop
Making Immersive Movie
• Create movie texture
• Convert 360 video to .ogg or .mp4 file
• Add video texture as asset
• Make Sphere
• Equirectangular UV mapping
• Inward facing normals
• Move camera to centre of sphere
• Texture map video to sphere
• Easy Movie Texture ($65)
• Apply texture to 3D object
• For 3D 360 video
• Render two Spheres
• http://bernieroehl.com/360stereoinunity/
PROJECT 3:
CREATING A 3D VR SCENE
Key Steps
1. Creating a new project
2. Load Google VR SDK
3. Add GvrViewerMain to scene
4. Loading in 3D asset packages
5. Loading a SkyDome
6. Adding a plane floor
New Project
• GvrViewerMain added to Hierarchy
Download Model Package
• Magic Lamp from 3dFoin
• Search on Asset store
Load Asset + Add to Scene
• Assets -> Import Package -> Custom Package
• Look for MagicLamp.unitypackage (If not installed already)
• Drag MagicLamp_LOD0 prefab into Hierarchy
• Assets -> MagicLamp -> MagicLamp_LOD0
• Position and rotate
Import SkySphere package
• SkySphere Volume1 on Asset store
Add SkySphere to Scene
• Drag Skyball_WithoutCap into Hierarchy
• SkySphere_V1 -> Meshes
• Rotate and Scale as needed (using Inspector)
Add Ground Plane
• GameObject -> 3D Object -> Plane
• Set Scale X to 3.0, Z to 3.0
Testing View
• Use alt/option key plus mouse to rotate view
Adding More Assets
• Load from Asset store – look for free assets
PROJECT 4:
ADDING MOVEMENT
Moving Through VR Scenes
• Move through looking
• Look at target to turn on/off moving
• Button/tapping screen
• Being in a vehicle (e.g. Roller Coaster)
Adding Movement Through Looking
Goal: Move in direction user is looking when button
on VR display pressed or screen touched
• Key Steps
1. Start with static scene
2. Create player body
3. Create movement script
4. Add movement script to player body
Key Steps
1. Create New Project
2. Import GoogleVRforUnity Package
3. Create objects in scene
4. Add player body
5. Include collision detection
6. Add player movement script
Create New Project
• Include GoogleVRforUnity
• Assets->ImportPackage->Custom Package
Add GvrViewerMain to Project
• Drag GvrViewerMain into Hierarchy
• from Asset->GoogleVR->Prefabs
Add Ground Plane and Objects
• Create simple scene of Ground Plane and objects
• GameObject -> 3D Object -> Plane/Cube/Sphere/Cylinder
• Scale and position as you like, add materials
• Add rigidbody components to objects (not plane) to enable collisions
• Select object -> Add Component -> Rigidbody
• Fix position of object: Constraints -> Freeze Position -> check x,y,z (Freeze Rotation)
Add Player Body
• Select Main Camera
• Add Component->Mesh Filter
• Click on circle icon on right ->
Select Capsule mesh
Make the Body Visible
• Select Main Camera
• Add component -> Mesh Renderer
• Create a material and drag onto capsule mesh
Add Collision Detection
• Allow player to collide with objects
• Select Main Camera
• Add Component -> Capsule Collider
• Add Component -> RigidBody
• Fix player to ground
• In RigidBody component
• Uncheck “Use Gravity”
• Uncheck “Is Kinematic”
• Check Constraints -> Freeze Position -> Y axis
Add Movement Script
• Select Main Camera
• Create new script called PlayerMovement
• Add component -> New Script
• Key variables - speed, rigidbody
public float speed = 3.0f;
Rigidbody rbody;
• Define fixedupdate movement function (move in direction looking)
void FixedUpdate () {
    if (Input.touchCount > 0 || Input.GetMouseButton(0))
        rbody.MovePosition(transform.position + transform.forward
            * Time.deltaTime * speed);
}
PlayerMovement Script
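The full script appears on the slide only as a screenshot; a minimal sketch that matches the snippets above is below (the GetComponent call in Start() is an assumption about how rbody gets initialised):

using UnityEngine;

public class PlayerMovement : MonoBehaviour {
    public float speed = 3.0f;
    Rigidbody rbody;

    void Start () {
        // Grab the Rigidbody that was added to the Main Camera
        rbody = GetComponent<Rigidbody>();
    }

    void FixedUpdate () {
        // Move forward (in the gaze direction) while the screen is
        // touched or the left mouse button is held down
        if (Input.touchCount > 0 || Input.GetMouseButton(0))
            rbody.MovePosition(transform.position + transform.forward
                * Time.deltaTime * speed);
    }
}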
Run Demo
• Use left mouse button to move in direction looking
• Button press/screen tap on mobile phone
Demo Problem
• Wait! I'm bouncing off objects
• Moving body hits fixed objects and gets
negative velocity
Stopping Camera Motion
• When camera collides it's given momentum
• velocity and angular velocity
• Need to set velocity and angular velocity to zero
• In player movement script
• Set rbody velocity components to zero
Revised PlayerMovement Script
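The revised script is shown only as a screenshot; a sketch of the likely change, zeroing the Rigidbody's velocity and angular velocity each physics step before moving, is:

using UnityEngine;

public class PlayerMovement : MonoBehaviour {
    public float speed = 3.0f;
    Rigidbody rbody;

    void Start () {
        rbody = GetComponent<Rigidbody>();
    }

    void FixedUpdate () {
        // Cancel any momentum picked up from collisions
        rbody.velocity = Vector3.zero;
        rbody.angularVelocity = Vector3.zero;

        if (Input.touchCount > 0 || Input.GetMouseButton(0))
            rbody.MovePosition(transform.position + transform.forward
                * Time.deltaTime * speed);
    }
}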
Final Demo
• Move in direction camera looking
• Collide with objects and stop moving
PROJECT 5:
GAZE INTERACTION
Gaze Interaction
• Cause events to happen when looking at objects
• E.g look at a target to shoot at it
Key Steps
1. Begin with VR scene from Project 4
2. Add physics ray caster
• Casts a ray from camera (gaze ray)
3. Add function to object to respond to gaze
• E.g. when gaze ray hits target cause particle effect
4. Add event trigger to target object
5. Add event system to target object
Adding Physics Raycaster
• Aim: To send a virtual ray from camera view
• Process
• Select Main Camera
• Add GvrPointerPhysicsRaycaster Component to Main
Camera
• Add component -> GvrPointerPhysicsRaycaster
Add Gaze Function
• Select target object (the cube model)
• Add component -> new script
• Call script CubeInteraction
• Add OnGazeEnter(), OnGazeExit() public functions
• Decide what happens when gaze enters/exits Cube model
• Complete this later
Add Event Trigger
• Select Target Object (Cube)
• Add component
• Event Trigger
• Add New Event Type -> Pointer Enter
• Add object to event
• Hit the ‘+’ button
• Drag Cube object to box under Runtime Only
• Select Function to run
• Select function list -> scroll to CubeInteraction -> OnGazeEnter
• Repeat for OnGazeExit
Adding Event System
• Need to use an Event System for the trigger to work
• Looks for gaze events occurring with the Cube object
• Add Event System to Hierarchy
• Game Object -> UI -> Event System
• Add gazeInputModule to Event System
• Add component -> Gaze Input Module
Add Collider to Object
• Need to detect when target object is being looked at
• Select target Object
• Add Collider (eg Box)
• Add component -> Box Collider
• Adjust position and size of Collider if needed
• Make sure it covers the target area
Making Gaze Point Visible
• In current system can't see user's Gaze point
• Add viewing reticle
• Drag GvrReticlePointer prefab onto main camera
• Assets -> GoogleVR -> Prefabs -> UI
• Reticle changes shape when on active object
• Change reticle material to make it more visible
• Set color in GvrReticleMaterial (e.g. to Red)
Demo
• Reticle changes shape when gazing at an object
that responds to gaze events
Add Gaze Event
• Add code to the gaze functions
• Change cube colour when gazed at
• Get initial cube material
• Add code to gaze functions
Final CubeInteraction Script
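The final script is shown only as a screenshot; a minimal sketch consistent with the demo described below (blue on gaze enter, white on gaze exit; the Renderer/material handling is an assumption) is:

using UnityEngine;

public class CubeInteraction : MonoBehaviour {
    Material cubeMaterial;

    void Start () {
        // Keep a reference to this cube's material so we can recolour it
        cubeMaterial = GetComponent<Renderer>().material;
    }

    // Wired to the Event Trigger's Pointer Enter event
    public void OnGazeEnter () {
        cubeMaterial.color = Color.blue;
    }

    // Wired to the Event Trigger's Pointer Exit event
    public void OnGazeExit () {
        cubeMaterial.color = Color.white;
    }
}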
Final Demo
• Cube changes to blue colour when gazed at
• Cube changes to white colour when gazed away from
PROJECT 6:
MENU INTERACTION
Menu Placement
• Different types of menu placement
• Screen aligned - always visible on screen
• World aligned - attached to object or location in VR scene
• Camera aligned - moves with the user
• This project shows a world aligned menu
Interacting with VR Menus
• Touch input
• Tap screen to select menu button
• Suitable for handheld applications
• Head/Gaze pointing
• Look at menu button, click to select
• Ideal for menus in VR display
Key Steps
1. Create New Scene and gaze support
2. Create User Interface menu object
3. Add buttons to user interface
4. Add button scripts
5. Add gaze interaction
6. Object interaction scripts
7. Make the menu disappear and reappear
Create New Scene
• Create scene with cube and plane
• Add materials
• Import GoogleVRforUnity package
• Drag GvrViewerMain into project hierarchy
Setup Gaze Pointing
• Drag GvrReticlePointer to Main Camera
• Assets -> GoogleVR -> Prefabs -> UI
• Add Gvr Pointer Physics Raycaster to Main Camera
• Add component -> GvrPointerPhysicsRaycaster
Menu Functionality
• Want to set up a menu that changes cube colour
• Menu fixed in space
• Located near the object it affects
• Two buttons (white/blue)
• Look at blue button to set cube colour to blue
• Look at white button to set cube colour to white
Menu Implementation
• Create a 2D canvas plane
• Place canvas in VR scene where it is needed
• Add buttons to the plane
• Add scripts to the buttons
• Triggered based on gaze input
Setting up Menu Canvas
• Create an Empty Object and name it UserInterface
• Create image object under UserInterface
• Right click UserInterface -> UI -> Image
• Set the canvas to world space
• Move image until visible and resize
• Change image colour
Menu Canvas
Add Buttons
• Add two buttons to UI image
• Colour one blue (Image script colour)
• Remove button scripts
• We'll add our own
• Add sphere collider same size as button
Add Button Scripts
• Create identical scripts for Blue and White buttons
• Different names
• BlueButton, WhiteButton
• Include OnLook() Function
• Gaze function
Blue Button Script
Add Event Triggers
• Add event triggers to each button
• Add component -> Event Trigger
• Event trigger type as Pointer Enter
• Set target object as button
• Set target function as OnLook()
• Add Event System to Hierarchy
• Add component Gaze Input Module
Testing
• Reticle changes style over buttons
Add Cube Behaviour
• Add new script to cube, CubeActions
• Add component -> New Script
• Script that can change cube colour
• Define local materials, copy existing materials
• Create functions that can change colours
• SetColorWhite(), SetColorBlue()
CubeActions Script
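The CubeActions script is shown only as a screenshot; a minimal sketch based on the bullet points above (the Renderer/material access is an assumption) is:

using UnityEngine;

public class CubeActions : MonoBehaviour {
    Material cubeMaterial;

    void Start () {
        // Local copy of the cube's material, so colour changes affect only this cube
        cubeMaterial = GetComponent<Renderer>().material;
    }

    public void SetColorWhite () {
        cubeMaterial.color = Color.white;
    }

    public void SetColorBlue () {
        cubeMaterial.color = Color.blue;
    }
}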
Add Gaze Behaviour
• Edit button scripts to add cube colour changing
• Add public CubeActions object
• public CubeActions m_cube;
• Call set colour function in OnLook function
• m_cube.SetColorBlue();
• Drag Cube object to script form
Final BlueButton Script
• White button same, but use m_cube.SetColorWhite();
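The final script appears only as a screenshot; from the steps above, a minimal sketch is:

using UnityEngine;

public class BlueButton : MonoBehaviour {
    // The Cube (with its CubeActions script) is dragged onto this field in the Inspector
    public CubeActions m_cube;

    // Called via the button's Event Trigger (Pointer Enter)
    public void OnLook () {
        m_cube.SetColorBlue();
    }
}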
Testing It Out
• Cube changes colour depending on button looked at
Making the Menu Disappear
• Don't want menu visible all the time
• Right click with mouse to appear/disappear
• Double tap with VR headset to appear/disappear
• Create menu script
• ToggleMenu function - turns menu on and off
• Note: Add script to User Interface object
• Add menu image as argument
Menu Script
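The menu script is shown only as a screenshot; a sketch of what it could look like (the class name, the menuImage field, and the right-click check in Update() are assumptions) is:

using UnityEngine;

public class MenuControl : MonoBehaviour {
    // The menu Image object, dragged in via the Inspector
    public GameObject menuImage;

    void Update () {
        // Right mouse click in the editor (double tap on the headset is handled similarly)
        if (Input.GetMouseButtonDown(1))
            ToggleMenu();
    }

    // Turns the menu on and off
    public void ToggleMenu () {
        menuImage.SetActive(!menuImage.activeSelf);
    }
}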
Testing it Out
PROJECT 7:
MOVING MENU
Moving a Menu with the User
• World aligned menus good for actions on objects
• e.g. select to change colour
• However you may want to move a menu with the user
• e.g. menu for user navigation
• This project shows how to add a menu to the camera
• Menu moves with the user as they move through the VR scene
Key Steps
1. Start with scene from Project 6
2. Create canvas object
3. Add button to canvas
4. Create player
5. Add player movement script
6. Add script for canvas movement
User Experience
• Have a walk button on the ground
• When player looks down they can toggle button on and off
• Look at walk button, click to toggle walking on and off
Create MoveButton Canvas
• Create canvas object
• UI->Canvas
• Set render mode to world space
• Resize and reposition
• Put flat on plane, a little in front of camera
Add Image to Canvas
• Create image on canvas
• Right click canvas
• UI -> image
• Set image to transparent
• Set image size to smaller than canvas
Add Button to Image
• Right click image
• UI -> button
• Resize and move to fill image
• Set colour and pressed colour
• Set text to “Walk”
• Expand button to see text object
Create Player Object
• Create empty object
• Rename it Player
• Create empty child
• Rename it LocalTrans
• Move Canvas under LocalTrans
• Move Main Camera under Player
Add PlayerMove Script
• Add script to Main Camera
• ToggleWalk function that toggles walking
• If walking on then move camera
PlayerMove Script
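The PlayerMove script is shown only as a screenshot; a minimal sketch based on the description above (the speed value and the direct transform move are assumptions) is:

using UnityEngine;

public class PlayerMove : MonoBehaviour {
    public float speed = 1.5f;
    bool walking = false;

    // Wired to the Walk button's On Click () event
    public void ToggleWalk () {
        walking = !walking;
    }

    void Update () {
        // While walking is on, move in the direction the camera is facing
        if (walking)
            transform.position += transform.forward * Time.deltaTime * speed;
    }
}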
Connect Player Moving to Button
• Select Button Object
• In the Button Script On Click () Action
• Set target object as Main Camera
• Set target function as ToggleWalk
• PlayerMove -> ToggleWalk
Event System
• Make sure project has event system
• Add at same level as Player
• GameObject -> UI -> EventSystem
• Add Gaze Input Module component
• Add Component -> Gaze Input Module
• Remove Standalone Input Module script
• or deactivate it by unchecking its checkbox
Testing
• Look at Walk button and click
• Player moves, but button doesn't!
Moving Menu with Camera
• Add a script to the LocalTrans object
• CanvasMovement script
• Script does the following:
• finds the current camera position
• sets LocalTrans to that position
• rotates LocalTrans about y axis the same as camera
• Outcome:
• Menu moves with camera.
• User can look down to click on button
CanvasMovement Script
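The CanvasMovement script is shown only as a screenshot; a sketch following the three bullet points above (using Camera.main to find the camera is an assumption) is:

using UnityEngine;

public class CanvasMovement : MonoBehaviour {
    void Update () {
        Transform cam = Camera.main.transform;

        // Move LocalTrans to the current camera position
        transform.position = cam.position;

        // Rotate LocalTrans about the y axis only, matching the camera,
        // so the menu stays level in front of the user
        transform.rotation = Quaternion.Euler(0f, cam.eulerAngles.y, 0f);
    }
}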
Final Result
• Menu follows camera/player
• Note: You may have to experiment with different
canvas position and scale settings for it to appear
DESIGN GUIDELINES
Google Design Guidelines
• Google’s Guidelines for good VR experiences:
• Physiological Considerations
• Interactive Patterns
• Setup
• Controls
• Feedback
• Display Reticle
• From http://www.google.com/design/spec-vr/designing-for-google-cardboard/a-new-dimension.html
Physiological Considerations
• Factors to Consider
• Head tracking
• User control of movement
• Use constant velocity
• Grounding with fixed objects
• Brightness changes
Interactive Patterns - Setup
• Setup factors to consider:
• Entering and exiting
• Headset adaptation
• Full Screen mode
• API calls
• Indicating VR apps
System Control
• Issuing a command to change system state or mode
• Examples
• Launching application
• Changing system settings
• Opening a file
• Etc.
• Key points
• Make commands visible to user
• Support easy selection
Example: GearVR Interface
• 2D Interface in 3D Environment
• Head pointing and click to select
Interactive Patterns - Controls
• Use fuse buttons for selection in VR
Interactive Patterns - Feedback
• Use audio and haptic feedback
• Reduce visual overload
• Audio alerts
• 3D spatial sound
• Phone vibrations
Interactive Patterns - Display Reticle
• Easier for users to target objects with a display reticle
• Can display reticle only when near target object
• Highlight objects (e.g. with light source) that user can target
Use Ray-casting technique
• “Laser pointer” attached
to virtual hand or gaze
• First object intersected by
ray may be selected
• User only needs to control
2 DOFs
• Proven to perform well
for remote selection
• Variants:
• Cone casting
• Snap-to-object rays
Gaze Directed Steering
• Move in direction that you are looking
• Very intuitive, natural navigation
• Can be used on simple HMDs (e.g. Google Cardboard)
• But: Can’t look in different direction while moving
Cardboard Design Lab Application
• Use Cardboard Design Lab app to explore design ideas
Cardboard Design Lab Video
https://www.youtube.com/watch?v=2Uf-ru2Ndvc
RESOURCES
Books
• Unity Virtual Reality Projects
• Jonathan Linowes
• Holistic Game Development
with Unity
• Penny de Byl
User Experiences for VR Website
• www.uxofvr.com
Useful Resources
• Google Cardboard main page
• https://www.google.com/get/cardboard/
• Developer Website
• https://vr.google.com/cardboard/developers/
• Building a VR app for Cardboard
• http://www.sitepoint.com/building-a-google-cardboard-vr-app-in-unity/
• Creating VR game for Cardboard
• http://danielborowski.com/posts/create-a-virtual-reality-game-for-google-cardboard/
• Moving in VR space
• http://www.instructables.com/id/Prototyping-Interactive-Environments-in-Virtual-Re/
Resources
• Unity Main site
• http://www.unity3d.com/
• Holistic Development with Unity
• http://holistic3d.com
• Official Unity Tutorials
• http://unity3d.com/learn/tutorials
• Unity Coder Blog
• http://unitycoder.com
www.empathiccomputing.org
@marknb00
mark.billinghurst@unisa.edu.au

More Related Content

PDF
Lecture 5: 3D User Interfaces for Virtual Reality
PDF
Using Interaction Design Methods for Creating AR and VR Interfaces
PDF
Comp 4010 2021 - Snap Tutorial-1
PDF
Comp4010 lecture6 Prototyping
PDF
Developing AR and VR Experiences with Unity
PDF
Comp4010 2021 Lecture2-Perception
PDF
Comp4010 Lecture10 VR Interface Design
PDF
Building AR and VR Experiences
Lecture 5: 3D User Interfaces for Virtual Reality
Using Interaction Design Methods for Creating AR and VR Interfaces
Comp 4010 2021 - Snap Tutorial-1
Comp4010 lecture6 Prototyping
Developing AR and VR Experiences with Unity
Comp4010 2021 Lecture2-Perception
Comp4010 Lecture10 VR Interface Design
Building AR and VR Experiences

What's hot (20)

PDF
Advanced Methods for User Evaluation in Enterprise AR
PDF
Comp 4010 2021 Lecture1-Introduction to XR
PPTX
Virtual reality ppt
PDF
COMP 4010 Lecture9 AR Interaction
PPTX
Microsoft Hololens
PDF
COMP 4010 - Lecture 4: 3D User Interfaces
PDF
Designing Usable Interface
PDF
2022 COMP4010 Lecture2: Perception
PDF
UI/UX Tips & Tricks for developers
PDF
COMP 4010 - Lecture 2: VR Technology
PDF
Lecture 4: VR Systems
PDF
2022 COMP4010 Lecture 6: Designing AR Systems
PDF
Introduction to Digital humanities
PDF
COMP 4010 Lecture 9 AR Interaction
PDF
Virtual Reality: Sensing the Possibilities
PDF
Mixed Reality
PDF
Visualising Data with Code
PDF
Comp4010 Lecture8 Introduction to VR
PDF
Lecture 8 Introduction to Augmented Reality
PPT
Ux team organization
Advanced Methods for User Evaluation in Enterprise AR
Comp 4010 2021 Lecture1-Introduction to XR
Virtual reality ppt
COMP 4010 Lecture9 AR Interaction
Microsoft Hololens
COMP 4010 - Lecture 4: 3D User Interfaces
Designing Usable Interface
2022 COMP4010 Lecture2: Perception
UI/UX Tips & Tricks for developers
COMP 4010 - Lecture 2: VR Technology
Lecture 4: VR Systems
2022 COMP4010 Lecture 6: Designing AR Systems
Introduction to Digital humanities
COMP 4010 Lecture 9 AR Interaction
Virtual Reality: Sensing the Possibilities
Mixed Reality
Visualising Data with Code
Comp4010 Lecture8 Introduction to VR
Lecture 8 Introduction to Augmented Reality
Ux team organization
Ad

Viewers also liked (17)

PDF
Designing Outstanding AR Experiences
PDF
Fifty Shades of Augmented Reality: Creating Connection Using AR
PDF
COMP 4010: Lecture 6 Example VR Applications
PDF
Easy Virtual Reality
PDF
COMP 4010: Lecture8 - AR Technology
PDF
COMP 4010 - Lecture 7: Introduction to Augmented Reality
PDF
COMP 4010 - Lecture10: Mobile AR
PDF
COMP 4010 - Lecture11 - AR Applications
PDF
COMP 4010: Lecture 5 - Interaction Design for Virtual Reality
PDF
Collaborative Immersive Analytics
PDF
COMP 4010 Lecture 3 VR Input and Systems
PDF
Create Your Own VR Experience
PDF
COMP 4010 Lecture12 - Research Directions in AR and VR
PDF
Beyond Reality (2027): The Future of Virtual and Augmented Reality
PDF
COMP 4010 - Lecture1 Introduction to Virtual Reality
PDF
COMP 4010: Lecture2 VR Technology
PDF
COMP 4010: Lecture 4 - 3D User Interfaces for VR
Designing Outstanding AR Experiences
Fifty Shades of Augmented Reality: Creating Connection Using AR
COMP 4010: Lecture 6 Example VR Applications
Easy Virtual Reality
COMP 4010: Lecture8 - AR Technology
COMP 4010 - Lecture 7: Introduction to Augmented Reality
COMP 4010 - Lecture10: Mobile AR
COMP 4010 - Lecture11 - AR Applications
COMP 4010: Lecture 5 - Interaction Design for Virtual Reality
Collaborative Immersive Analytics
COMP 4010 Lecture 3 VR Input and Systems
Create Your Own VR Experience
COMP 4010 Lecture12 - Research Directions in AR and VR
Beyond Reality (2027): The Future of Virtual and Augmented Reality
COMP 4010 - Lecture1 Introduction to Virtual Reality
COMP 4010: Lecture2 VR Technology
COMP 4010: Lecture 4 - 3D User Interfaces for VR
Ad

Similar to Building VR Applications For Google Cardboard (20)

PDF
Cardboard VR: Building Low Cost VR Experiences
PDF
Mobile AR Tutorial
PDF
Developing VR Experiences with Unity
PPTX
Mixed reality for Windows 10
PDF
Rapid Prototyping for XR: Lecture 4 - High Level Prototyping.
PDF
STEM Camp Virtual Reality
PDF
Rapid Prototyping for XR: Lecture 5 - Cross Platform Development
PPTX
Philipp Nagele (Wikitude): What's Next with Wikitude
PDF
How to Use WebVR to Enhance the Web Experience
PDF
Mobile AR Lecture6 - Introduction to Unity 3D
PPTX
Philipp Nagele (CTO, Wikitude) An Insider Deep-Dive into the Wikitude SDK
PPTX
Augmented Reality Application - Final Year Project
PPTX
Introduction to daydream for AnDevCon DC - 2017
PDF
Getting started with Verold and Three.js
PPT
IEEE VR-SEARIS 2014 Keynote - MiddleVR - Philosophy and architecture
PPTX
Introduction to Unity
PDF
Mobile AR Lecture 7 - Introduction to Vuforia
PDF
Comp4010 Lecture7 Designing AR Systems
PDF
Rapid Prototyping for XR: Lecture 6 - AI for Prototyping and Research Directi...
PPTX
COMIT Sept 2016 - Experium (Vin Sumner)
Cardboard VR: Building Low Cost VR Experiences
Mobile AR Tutorial
Developing VR Experiences with Unity
Mixed reality for Windows 10
Rapid Prototyping for XR: Lecture 4 - High Level Prototyping.
STEM Camp Virtual Reality
Rapid Prototyping for XR: Lecture 5 - Cross Platform Development
Philipp Nagele (Wikitude): What's Next with Wikitude
How to Use WebVR to Enhance the Web Experience
Mobile AR Lecture6 - Introduction to Unity 3D
Philipp Nagele (CTO, Wikitude) An Insider Deep-Dive into the Wikitude SDK
Augmented Reality Application - Final Year Project
Introduction to daydream for AnDevCon DC - 2017
Getting started with Verold and Three.js
IEEE VR-SEARIS 2014 Keynote - MiddleVR - Philosophy and architecture
Introduction to Unity
Mobile AR Lecture 7 - Introduction to Vuforia
Comp4010 Lecture7 Designing AR Systems
Rapid Prototyping for XR: Lecture 6 - AI for Prototyping and Research Directi...
COMIT Sept 2016 - Experium (Vin Sumner)

More from Mark Billinghurst (20)

PDF
Empathic Computing: Creating Shared Understanding
PDF
Reach Out and Touch Someone: Haptics and Empathic Computing
PDF
Rapid Prototyping for XR: Lecture 3 - Video and Paper Prototyping
PDF
Rapid Prototyping for XR: Lecture 2 - Low Fidelity Prototyping.
PDF
Rapid Prototyping for XR: Lecture 1 Introduction to Prototyping
PDF
Research Directions in Heads-Up Computing
PDF
IVE 2024 Short Course - Lecture18- Hacking Emotions in VR Collaboration.
PDF
IVE 2024 Short Course - Lecture13 - Neurotechnology for Enhanced Interaction ...
PDF
IVE 2024 Short Course Lecture15 - Measuring Cybersickness
PDF
IVE 2024 Short Course - Lecture14 - Evaluation
PDF
IVE 2024 Short Course - Lecture12 - OpenVibe Tutorial
PDF
IVE 2024 Short Course Lecture10 - Multimodal Emotion Recognition in Conversat...
PDF
IVE 2024 Short Course Lecture 9 - Empathic Computing in VR
PDF
IVE 2024 Short Course - Lecture 8 - Electroencephalography (EEG) Basics
PDF
IVE 2024 Short Course - Lecture16- Cognixion Axon-R
PDF
IVE 2024 Short Course - Lecture 2 - Fundamentals of Perception
PDF
Research Directions for Cross Reality Interfaces
PDF
The Metaverse: Are We There Yet?
PDF
Human Factors of XR: Using Human Factors to Design XR Systems
PDF
IVE Industry Focused Event - Defence Sector 2024
Empathic Computing: Creating Shared Understanding
Reach Out and Touch Someone: Haptics and Empathic Computing
Rapid Prototyping for XR: Lecture 3 - Video and Paper Prototyping
Rapid Prototyping for XR: Lecture 2 - Low Fidelity Prototyping.
Rapid Prototyping for XR: Lecture 1 Introduction to Prototyping
Research Directions in Heads-Up Computing
IVE 2024 Short Course - Lecture18- Hacking Emotions in VR Collaboration.
IVE 2024 Short Course - Lecture13 - Neurotechnology for Enhanced Interaction ...
IVE 2024 Short Course Lecture15 - Measuring Cybersickness
IVE 2024 Short Course - Lecture14 - Evaluation
IVE 2024 Short Course - Lecture12 - OpenVibe Tutorial
IVE 2024 Short Course Lecture10 - Multimodal Emotion Recognition in Conversat...
IVE 2024 Short Course Lecture 9 - Empathic Computing in VR
IVE 2024 Short Course - Lecture 8 - Electroencephalography (EEG) Basics
IVE 2024 Short Course - Lecture16- Cognixion Axon-R
IVE 2024 Short Course - Lecture 2 - Fundamentals of Perception
Research Directions for Cross Reality Interfaces
The Metaverse: Are We There Yet?
Human Factors of XR: Using Human Factors to Design XR Systems
IVE Industry Focused Event - Defence Sector 2024

Recently uploaded (20)

PDF
cuic standard and advanced reporting.pdf
PDF
How UI/UX Design Impacts User Retention in Mobile Apps.pdf
PPTX
A Presentation on Artificial Intelligence
PDF
Encapsulation_ Review paper, used for researhc scholars
PDF
Build a system with the filesystem maintained by OSTree @ COSCUP 2025
PDF
Network Security Unit 5.pdf for BCA BBA.
PDF
Unlocking AI with Model Context Protocol (MCP)
PDF
Peak of Data & AI Encore- AI for Metadata and Smarter Workflows
PPTX
Understanding_Digital_Forensics_Presentation.pptx
PDF
Modernizing your data center with Dell and AMD
PPTX
Effective Security Operations Center (SOC) A Modern, Strategic, and Threat-In...
PDF
Advanced methodologies resolving dimensionality complications for autism neur...
PDF
Architecting across the Boundaries of two Complex Domains - Healthcare & Tech...
PDF
Diabetes mellitus diagnosis method based random forest with bat algorithm
PDF
Agricultural_Statistics_at_a_Glance_2022_0.pdf
PDF
Review of recent advances in non-invasive hemoglobin estimation
PPTX
Big Data Technologies - Introduction.pptx
PPTX
Detection-First SIEM: Rule Types, Dashboards, and Threat-Informed Strategy
PDF
Machine learning based COVID-19 study performance prediction
PDF
Spectral efficient network and resource selection model in 5G networks
cuic standard and advanced reporting.pdf
How UI/UX Design Impacts User Retention in Mobile Apps.pdf
A Presentation on Artificial Intelligence
Encapsulation_ Review paper, used for researhc scholars
Build a system with the filesystem maintained by OSTree @ COSCUP 2025
Network Security Unit 5.pdf for BCA BBA.
Unlocking AI with Model Context Protocol (MCP)
Peak of Data & AI Encore- AI for Metadata and Smarter Workflows
Understanding_Digital_Forensics_Presentation.pptx
Modernizing your data center with Dell and AMD
Effective Security Operations Center (SOC) A Modern, Strategic, and Threat-In...
Advanced methodologies resolving dimensionality complications for autism neur...
Architecting across the Boundaries of two Complex Domains - Healthcare & Tech...
Diabetes mellitus diagnosis method based random forest with bat algorithm
Agricultural_Statistics_at_a_Glance_2022_0.pdf
Review of recent advances in non-invasive hemoglobin estimation
Big Data Technologies - Introduction.pptx
Detection-First SIEM: Rule Types, Dashboards, and Threat-Informed Strategy
Machine learning based COVID-19 study performance prediction
Spectral efficient network and resource selection model in 5G networks

Building VR Applications For Google Cardboard

  • 1. BUILDING VR APPLICATIONS FOR GOOGLE CARDBOARD Mark Billinghurst mark.billinghurst@unisa.edu.au January 20th 2017
  • 2. Mark Billinghurst ▪ Director, Empathic Computing Lab University of South Australia ▪ Past Director of HIT Lab NZ, University of Canterbury ▪ PhD Univ. Washington ▪ Research on AR, mobile HCI, Collaborative Interfaces ▪ More than 300 papers in AR, VR, interface design
  • 3. What You Will Learn • Definitions of VR, Brief History of VR • Introduction to Mobile VR/Google Cardboard • Intoduction to Unity3D • Complete 7 projects • 1 Building a Unity Scene • 2 Immersive 360 Panorama • 3 Creating a 3D VR Scene • 4 Adding Movement • 5 Gaze based interaction • 6 Menu input • 7 Moving Menus • Cardboard interface design guidelines • Resources for learning more
  • 5. Virtual Reality Computer generated multi-sensory simulation of an artificial environment that is interactive and immersive.
  • 7. What is Virtual Reality? Virtual reality is.. a computer technology that replicates an environment, real or imagined, and simulates a user's physical presence and environment to allow for user interaction. (Wikipedia) • Defining Characteristics • Environment simulation • Presence • Interaction
  • 8. Key Technologies • Autonomy • Head tracking, body input • Intelligent systems • Interaction • User input devices, HCI • Presence • Graphics/audio/multisensory output • Multisensory displays • Visual, audio, haptic, olfactory, etc
  • 10. Brief History of Virtual Reality https://immersivelifeblog.files.wordpress.com/2015/04/vr_history.jpg
  • 11. Desktop VR - 1995 • Expensive - $150,000+ • 2 million polys/sec • VGA HMD – 30 Hz • Magnetic tracking
  • 12. Desktop VR 2016 • Graphics Desktop • $1,500 USD • >4 Billion poly/sec • $600 HMD • 1080x1200, 90Hz • Optical tracking • Room scale
  • 13. Oculus Rift Sony Morpheus HTC/Valve Vive 2016 - Rise of Consumer HMDs
  • 14. Google Cardboard - Mobile VR
  • 15. Computer Based vs. Mobile VR
  • 16. MobileVR:Google Cardboard • Released 2014 (Google 20% project) • >5 million shipped/given away • Easy to use developer tools + =
  • 17. Version 1.0 vs Version 2.0 • Version 1.0 – Android focused, magnetic switch, small phone • Version 2.0 – Touch input, iOS/Android, fits many phones
  • 19. Multiple Mobile VR Viewers Available
  • 20. • In 2016 – 46m possible desktop VR users vs. 400 m mobile VR users • https://thoughts.ishuman.co/vr-will-be-mobile- 11529fabf87c#.vfcjzy1vf
  • 22. Types of VR Experiences • Immersive Spaces • 360 Panorama’s/Movies • High visual quality • Limited interactivity • Changing viewpoint orientation • Immersive Experiences • 3D graphics • Lower visual quality • High interactivity • Movement in space • Interact with objects
  • 23. Immersive Panorama • High quality 360 image or video surrounding user • User can turn head to see different views • Fixed position
  • 24. Cardboard Camera (iOS/Android) • Capture 360 panoramas • Stitch together images on phone • View in VR on Cardboard
  • 25. Example Panorama Applications • Within • http://with.in • High quality 360 VR content • New York Times VR Experience • NYTVR application • Documentary experiences • YouTube 360 Videos • Collection of 360 videos
  • 26. Google Cardboard App • 7 default experiences • Earth: Fly on Google Earth • Tour Guide: Visit sites with guides • YouTube: Watch popular videos • Exhibit: Examine cultural artifacts • Photo Sphere: Immersive photos • Street View: Drive along a street • Windy Day: Interactive short story
  • 27. 100’s of Google Play Cardboard apps
  • 29. What You Need • Cardboard Viewer/VR Viewer • https://vr.google.com/cardboard/ • Smart phone • Android/iOS • Authoring Tools/SDK • Google VR SDK • Unity/Unreal game engine • Non programming tools • Content • 3D models, video, images, sounds
  • 30. Software Tools • Low level SDKs • Need programming ability • Java, C#, C++, etc • Example: Google VR SDK (iOS, Android) • https://developers.google.com/vr/ • Game Engines • Powerful, need scripting ability • Unity - https://unity3d.com/ • Unreal - https://www.unrealengine.com/vr • Combine with VR plugins (HMDs, input devices) • Google VR Unity plugin
  • 31. Unity 3D Game Editor
  • 32. Tools for Non-Programmers • Focus on Design, ease of use • Visual Programming, content arrangement • Examples • Insta-VR – 360 panoramas • http://www.instavr.co/ • Vizor – VR on the Web • http://vizor.io/ • A-frame – HTML based • https://aframe.io/ • ENTiTi – Both AR and VR authoring • http://www.wakingapp.com/ • Eon Creator – Drag and drop tool for AR/VR • http://www.eonreality.com/eon-creator/
  • 33. Google VR SDK for Unity Free Download https://developers.google.com/vr/unity/download/ Features: 1. Lens distortion correction 2. Head tracking 3. 3D calibration 4. Side-by-side rendering 5. Stereo geometry configuration 6. User input event handling 7. VR emulation mode, etc.. Unity Google VR SDK
  • 35. Unity Overview (see www.unity3d.com) • Created in 2005 • Tool for creating games and 2D/3D applications • Advanced graphics support • Support for multiplayer, analytics, performance, ads, etc • Cross Platform Game Engine • One of the most popular (> 1.5 million developers) • 27 platforms (iOS,Android, Windows, Mac, etc) • Multiple license models • Free for personal use/small business • Large developer community • Tutorials, support • User generated content/assets
  • 37. SETUP
  • 38. Download and Install (for Android) • Go to unity3d.com/download • Use Download Assistant – pick components you want • Make sure to install Android components • Also install Android studio (https://developer.android.com/studio/)
  • 39. Getting Started • First time running Unity you’ll be asked to create a project • Specify project name and location • Can pick asset packages (pre-made content)
  • 40. Unity Interface • Toolbar, Scene, Hierarchy, Project, Inspector
  • 42. Building Scenes • Use GameObjects: • Containers that hold different components • Eg 3D model, texture, animation • Use Inspector • View and edit object properties and other settings • Use Scene View • Position objects, camera, lights, other GameObjects etc • Scripting • Adding interaction, user input, events, etc
  • 43. GameObjects • Every object in Scene is a GameObject • GameObjects contain Components • Eg Transform Component, Camera Components • Clicking on object will show values in Inspector panel
  • 44. Adding 3D Content • Create 3D asset using modeling package, or download • Fbx, Obj file format for 3D models • Add file to Assets folder in Project • When project opened 3D model added to Project View • Drag mesh from Project View into Hierarchy or Scene View • Creates a game object
  • 45. Positioning/Scaling Objects • Click on object and choose transform
  • 46. Unity Prefabs • When download assets, often download Prefabs (blue squares) • Use by dragging and dropping into scene hierachy • Prefab is a way of storing a game object with properties and components already set • Prefab is a template from which you can create new object instances in the scene • Changes to a prefab asset will change all instances in the scene
  • 47. Unity Asset Store • Download thousands models, scripts, animations, etc • https://www.assetstore.unity3d.com/
  • 49. Making a Simple Scene - Key Steps 1. Create New Project 2. Create Game Object 3. Moving main camera position 4. Adding lights 5. Adding more objects 6. Adding physics 7. Changing object materials 8. Adding script behaviour
  • 50. CreateProject • Create new folder and project
  • 52. Create GameObject • Load a Sphere into the scene • GameObject -> 3D Object -> Sphere
  • 53. Moving main camera • Select Main Camera • Select translate icon • Move camera
  • 54. Add Light • GameObject -> Light -> Directional Light • Use inspector to modify light properties (colour, intensity)
  • 55. Add Physics • Select Sphere • Add Rigidbody component • Add Component -> Physics -> RigidBody • or Component -> Physics -> RigidBody • Modify inspector properties (mass, drag, etc)
  • 56. Add More Objects • Add several cubes • GameObject -> 3D Object – Cube • Move cube • Add Rigid Body component (uncheck gravity)
  • 57. Add Material • Assets -> Create -> Material • Click Albedo colour box in inspector • Select colour • Drag asset onto object to apply
  • 58. Add Script • Assets -> Create -> C# script • Edit script using Mono • Drag script onto Game Object
  • 59. Example C# Script GameObject Rotation using UnityEngine; using System.Collections; public class spin : MonoBehaviour {     // Use this for initialization     void Start () {          }          // Update is called once per frame     void Update () {         this.gameObject.transform.Rotate(Vector3.up*10);     } }
  • 60. Scripting C# Unity 3D • void Awake(): • Is called when the first scene is loaded and the game object is active • void Start(): • Called on first frame update • void FixedUpdate(): • Called before physics calculations are made • void Update(): • Called every frame before rendering • void LateUpdate(): • Once per frame after update finished
  • 63. Key Steps 1. Create a new project 2. Load the Google VR SDK 3. Load a panorama image asset 4. Create a Skymap 5. Add to VR scene 6. Deploy to mobile phone
  • 65. Load Google VR SDK • Assets -> Import Package -> Custom Package • Navigate to GoogleVRForUnity.unitypackage • Uncheck iOS (for Android build)
  • 66. Load Cardboard Main Camera • Drag GvrViewerMain prefab into Hierarchy • Assets -> GoogleVR -> Prefabs • Keep Main Camera
  • 67. Panorama Image Asset • Find/create suitable panorama image • Ideally 2K or higher resolution image in cubemap layout • Google “Panorama Image Cubemap”
  • 68. Capturing Panorama • Stitching photos together • Image Composite Editor (Microsoft) • AutoPano (Kolor) • Using 360 camera • Ricoh Theta-S • Fly360
  • 69. Image Composite Editor (Microsoft) • Free panorama stitching tool • http://research.microsoft.com/en-us/um/redmond/projects/ice/
  • 70. AutoPano (Kolor) • Finds image from panoramas and stitches them together • http://www.kolor.com/autopano/
  • 71. Add Image Asset to Project • Assets -> Import Asset • Select desired image • In Inspector • Set Texture Type to Cubemap • Set mapping to Latitude- Longitude (Cylindrical) • Hit Apply button
  • 72. Create Skybox Material • Assets -> Create -> Material • Name material - e.g. 'Sky' • Set Shader to Skybox -> Cubemap • Drag texture to cubemap
  • 73. Create Skybox • Window -> Lighting • new window pops up • Drag Skybox material into Skypebox form
  • 75. One Last Thing.. • Check Clear Flags on Camera is set to Skybox • Select Main Camera • Look at Camera in Inspector • Clear Flags -> Skybox
  • 76. Test It Out • Hit play button • Use alt/option key + mouse to look around
  • 77. Deploying to Phone (Android) 1. Plug phone into USB • Put phone into debug mode 2. Open Build Settings 3. Change Target platform to Android 4. Resolution and Presentation • Default Orientation -> Landscape Left 5. Under Player Settings • Edit Bundle Identifier – eg com.UniSA.cubeTest • Minimum API level 6. Build and Run • Select .apk file name
  • 78. Setting Path to Android • You may need to tell Unity where the Android SDK is • Set the path: • Edit -> Preferences -> External Tools
  • 79. Running on Phone • Droid@Screen View on Desktop
  • 80. Making Immersive Movie • Create movie texture • Convert 360 video to .ogg or ,mp4 file • Add video texture as asset • Make Sphere • Equirectangular UV mapping • Inward facing normals • Move camera to centre of sphere • Texture map video to sphere • Easy Movie Texture ($65) • Apply texture to 3D object • For 3D 360 video • Render two Spheres • http://bernieroehl.com/360stereoinunity/
  • 81. PROJECT 3: CREATING A 3D VR SCENE
  • 82. Key Steps 1. Creating a new project 2. Load Google VR SDK 3. Add GvrViewerMain to scene 4. Loading in 3D asset packages 5. Loading a SkyDome 6. Adding a plane floor
  • 83. New Project • GvrViewerMain added to Hierachy
  • 84. Download Model Package • Magic Lamp from 3dFoin • Search on Asset store
  • 85. Load Asset + Add to Scene • Assets -> Import Package -> Custom Package • Look for MagicLamp.unitypackage (If not installed already) • Drag MagicLamp_LOD0 prefab into Hierarchy • Assets -> MagicLamp -> MagicLamp_LOD0 • Position and rotate
  • 86. Import SkySphere package • SkySphere Volume1 on Asset store
  • 87. Add SkySphere to Scene • Drag Skyball_WithoutCap into Hierarchy • SkySphere_V1 -> Meshes • Rotate and Scale as needed (using Inspector)
  • 88. Add Ground Plane • GameObject -> 3D Object -> Plane • Set Scale X to 3.0, Z to 3.0
  • 89. Testing View • Use alt/option key plus mouse to rotate view
  • 90. Adding More Assets • Load from Asset store – look for free assets
  • 92. Moving Through VR Scenes • Move through looking • Look at target to turn on/off moving • Button/tapping screen • Being in a vehicle (e.g. Roller Coaster)
  • 93. Adding Movement Through Looking Goal: Move in direction user is looking when button on VR display pressed or screen touched • Key Steps 1. Start with static scene 2. Create player body 3. Create movement script 4. Add movement script to player body
  • 94. Key Steps 1. Create New Project 2. Import GoogleVRforUnity Package 3. Create objects in scene 4. Add player body 5. Include collision detection 6. Add player movement script
  • 95. Create New Project • Include GoogleVRforUnity • Assets->ImportPackage->Custom Package
  • 96. Add GvrViewerMain to Project • Drag GvrViewerMain into Hierarchy • from Asset->GoogleVR->Prefabs
  • 97. Add Ground Plane and Objects • Create simple scene of Ground Plane and obects • GameObject -> 3D Object -> Plane/Cube/Sphere/Cylinder • Scale and position as you like, add materials • Add rigidbody components to objects (not plane) to enable collisions • Select object -> Add Component -> Rigidbody • Fix position of object: Constraints -> Freeze Position -> check x,y,z (Freeze Rotation)
  • 98. Add Player Body • Select Main Camera • Add Component->Mesh Filter • Click on circle icon on right -> Select Capsule mesh
  • 99. Make the Body Visible • Select Main Camera • Add component -> Mesh Renderer • Create a material and drag onto capsule mesh
  • 100. Add Collision Detection • Allow player to collide with objects • Select Main Camera • Add Component -> Capsule Collider • Add Component -> RigidBody • Fix player to ground • In RigidBody component • Uncheck “Use Gravity” • Uncheck “Is Kinematic” • Check Constraints -> Freeze Position -> Y axis
  • 101. Add Movement Script • Select Main Camera • Create new script called PlayerMovement • Add component -> New Script • Key variables - speed, rigidbody public float speed = 3.0f; Rigidbody rbody; • Define fixedupdate movement function (move in direction looking) void FixedUpdate () { if(Input.touchCount>0||Input.GetMouseButton(0)) rbody.MovePosition(transform.position+transform.forward * Time.deltaTime*speed); }
  • 104. Run Demo • Use left mouse button to move in direction looking • Button press/screen tap on mobile phone
  • 105. Demo Problem • Wait! I'm bouncing off objects • Moving body hits fixed objects and gets negative velocity
  • 106. Stopping Camera Motion • When camera collides it's given momentum • velocity and angular velocity • Need to set velocity and angular velocity to zero • In player movement script • Set rbody velocity components to zero
  • 108. Final Demo • Move in direction camera looking • Collide with objects and stop moving
  • 110. Gaze Interaction • Cause events to happen when looking at objects • E.g look at a target to shoot at it
  • 111. Key Steps 1. Begin with VR scene from Project 4 2. Add physics ray caster • Casts a ray from camera (gaze ray) 3. Add function to object to respond to gaze • E.g. when gaze ray hits target cause particle effect 4. Add event trigger to target object 5. Add event system to target object
  • 112. Adding Physics Raycaster • Aim: To send a virtual ray from camera view • Process • Select Main Camera • Add GvrPointerPhysicsRaycaster Component to Main Camera • Add component -> GvrPointerPhysicsRaycaster
  • 113. Add Gaze Function • Select target object (the cube model) • Add component -> new script • Call script CubeInteraction • Add OnGazeEnter(), OnGazeExit() public functions • Decide what happens when gaze enters/exits Cube model • Complete this later
  • 114. Add Event Trigger • Select Target Object (Cube) • Add component • EventTriger • Add New Event Type -> PointerEntry • Add object to event • Hit ‘+’ tag • Drag Cube object to box under Runtime Only • Select Function to run • Select function list -> scroll to CubeInteraction -> OnGazeEnter • Repeat for OnGazeExit
  • 115. Adding Event System • Need to user Event System for trigger to work • Looks for gaze events occuring with Cube object • Add Event System to Hierachy • Game Object -> UI -> Event System • Add gazeInputModule to Event System • Add component -> Gaze Input Module
  • 116. Add Collider to Object • Need to detect when target object is being looked at • Select target Object • Add Collider (eg Box) • Add component -> Box Collider • Adjust position and size of Collider if needed • Make sure it covers the target area
  • 117. Making Gaze Point Visible • In current system can't see user's Gaze point • Add viewing reticle • Drag GvrReticlePointer prefab onto main camera • Assets -> GoogleVR -> Prefabs -> UI • Reticle changes shape when on active object • Change reticle material to make it more visible • Set color in GvrReticleMaterial (e.g. to Red)
  • 118. Demo • Reticle changes shape when gazing at an object that responds to gaze events
  • 119. Add Gaze Event • Add code to the gaze functions • Change cube colour when gazed at • Get initial cube material • Add code to gaze functions
  • 121. Final Demo • Cube changes to blue colour when gazed at • Cube changes to white colour when gazed away from
  • 123. Menu Placement • Different types of menu placement • Screen aligned - always visible on screen • World aligned - attached to object or location in VR scene • Camera aligned - moves with the user • This project shows a world aligned menu
  • 124. Interacting with VR Menus • Touch input • Tap screen to select menu button • Suitable for handheld applications • Head/Gaze pointing • Look at menu button, click to select • Ideal for menus in VR display
  • 125. Key Steps 1. Create New Scene and gaze support 2. Create User Interface menu object 3. Add buttons to user interface 4. Add button scripts 5. Add gaze interaction 6. Object interaction scripts 7. Make the menu disappear and reappear
  • 126. Create New Scene • Create scene with cube and plane • Add materials • Import GoogleVRforUnity package • Drag GvrViewerMain into project hierachy
  • 127. Setup Gaze Pointing • Drag GvrReticlePointer to Main Camera • Assets -> GoogleVR -> Prefabs -> UI • Add Gvr Pointer Physics Raycaster to Main Camera • Add component -> GvrPointerPhysicsRaycaster
  • 128. Menu Functionality • Want to set up a menu that changes cube colour • Menu fixed in space • Located near the object it affects • Two buttons (white/blue) • Look at blue button to set cube colour to blue • Look at white button to set cube colour to white
  • 129. Menu Implementation • Create a 2D canvas plane • Place canvas in VR scene where it is needed • Add buttons to the plane • Add scripts to the buttons • Triggered based on gaze input
  • 130. Setting up Menu Canvas • Create Empty Object, name it UserInterface • Create image object under UserInterface • Right click UserInterface -> UI -> Image • Set the canvas to world space • Move image until visible and resize • Change image colour
  • 132. Add Buttons • Add two buttons to UI image • Colour one blue (Image script colour) • Remove button scripts • We'll add our own • Add sphere collider same size as button
  • 133. Add Button Scripts • Create identical scripts for Blue and White buttons • Different names • BlueButton, WhiteButton • Include OnLook() Function • Gaze function
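  A minimal stub for the BlueButton script, as a sketch (WhiteButton is identical apart from the class name); OnLook() is completed in the gaze-behaviour step that follows:

      using UnityEngine;

      // Attached to the blue button; OnLook() is called by the button's Event Trigger.
      public class BlueButton : MonoBehaviour
      {
          public void OnLook()
          {
              // Completed later: tell the cube to turn blue
          }
      }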
  • 135. Add Event Triggers • Add event triggers to each button • Add component -> Event Trigger • Event trigger type as Pointer Enter • Set target object as button • Set target function as OnLook() • Add Event System to Hierarchy • Add component Gaze Input Module
  • 136. Testing • Reticle changes style over buttons
  • 137. Add Cube Behaviour • Add new script to cube, CubeActions • Add component -> New Script • Script that can change cube colour • Define local materials, copy existing materials • Create functions that can change colours • SetColorWhite(), SetColorBlue()
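  A sketch of what CubeActions might look like (this simplified version changes the colour of the cube's instanced material rather than swapping in copied materials, but the public functions match those named on the slide):

      using UnityEngine;

      // Attached to the Cube; exposes colour-changing functions the menu buttons can call.
      public class CubeActions : MonoBehaviour
      {
          private Material m_material;

          void Start()
          {
              // Cache the cube's material so its colour can be changed at runtime
              m_material = GetComponent<Renderer>().material;
          }

          public void SetColorWhite()
          {
              m_material.color = Color.white;
          }

          public void SetColorBlue()
          {
              m_material.color = Color.blue;
          }
      }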
  • 139. Add Gaze Behaviour • Edit button scripts to add cube colour changing • Add public CubeActions object • public CubeActions m_cube; • Call set colour function in OnLook function • m_cube.SetColorBlue(); • Drag Cube object to script form
  • 140. Final BlueButton Script • White button same, but use m_cube.SetColorWhite();
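  A sketch of the final BlueButton script under these assumptions (the Cube, with its CubeActions component, is dragged onto m_cube in the Inspector):

      using UnityEngine;

      public class BlueButton : MonoBehaviour
      {
          public CubeActions m_cube;   // drag the Cube object here in the Inspector

          // Called by the button's Event Trigger (Pointer Enter) via gaze input
          public void OnLook()
          {
              m_cube.SetColorBlue();   // WhiteButton calls m_cube.SetColorWhite() instead
          }
      }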
  • 141. Testing It Out • Cube changes colour depending on button looked at
  • 142. Making the Menu Disappear • Don't want menu visible all the time • Right click with mouse to appear/disappear • Double tap with VR headset to appear/disappear • Create menu script • ToggleMenu function - turns menu on and off • Note: Add script to User Interface object • Add menu image as argument
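  A sketch of such a menu script (the class name MenuScript is an assumption; the right-click check works when testing in the editor, and an on-device double-tap would need the Cardboard trigger check from the Google VR SDK in its place):

      using UnityEngine;

      // Attached to the UserInterface object; the menu image is assigned in the Inspector.
      public class MenuScript : MonoBehaviour
      {
          public GameObject m_menu;   // the menu image to show/hide

          void Update()
          {
              // Right mouse button toggles the menu when testing in the editor
              if (Input.GetMouseButtonDown(1))
              {
                  ToggleMenu();
              }
          }

          public void ToggleMenu()
          {
              m_menu.SetActive(!m_menu.activeSelf);
          }
      }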
  • 146. Moving a Menu with the User • World aligned menus good for actions on objects • e.g. select to change colour • However you may want to move a menu with the user • e.g. menu for user navigation • This project shows how to add a menu to the camera • Menu moves with the user as they move through the VR scene
  • 147. Key Steps 1. Start with scene from Project 6 2. Create canvas object 3. Add button to canvas 4. Create player 5. Add player movement script 6. Add script for canvas movement
  • 148. User Experience • Have a walk button on the ground • When player looks down they can toggle button on and off • Look at walk button, click to toggle walking on and off
  • 149. Create MoveButton Canvas • Create canvas object • UI->Canvas • Set render mode to world space • Resize and reposition • Put flat on plane, a little in front of camera
  • 150. Add Image to Canvas • Create image on canvas • Right click canvas • UI -> image • Set image to transparent • Set image size to smaller than canvas
  • 151. Add Button to Image • Right click image • UI -> button • Resize and move to fill image • Set colour and pressed colour • Set text to “Walk” • Expand button to see text object
  • 152. Create Player Object • Create empty object • Rename it Player • Create empty child • Rename it LocalTrans • Move Canvas under LocalTrans • Move Main Camera under Player
  • 153. Add PlayerMove Script • Add script to Main Camera • ToggleWalk function that toggles walking • If walking on then move camera
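  A sketch of the PlayerMove script (the speed value is an assumption; while walking is toggled on it moves the camera along its horizontal forward direction):

      using UnityEngine;

      // Attached to the Main Camera; the Walk button's On Click () action calls ToggleWalk().
      public class PlayerMove : MonoBehaviour
      {
          public float m_speed = 1.0f;      // movement speed in metres per second (assumed value)
          private bool m_walking = false;

          public void ToggleWalk()
          {
              m_walking = !m_walking;
          }

          void Update()
          {
              if (m_walking)
              {
                  // Move in the direction the camera is facing, staying on the ground plane
                  Vector3 forward = transform.forward;
                  forward.y = 0.0f;
                  transform.position += forward.normalized * m_speed * Time.deltaTime;
              }
          }
      }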
  • 155. Connect Player Moving to Button • Select Button Object • In the Button Script On Click () Action • Set target object as Main Camera • Set target function as ToggleWalk • PlayerMove -> ToggleWalk
  • 156. Event System • Make sure project has event system • Add at same level as Player • GameObject -> UI -> EventSystem • Add Gaze Input Module component • Add Component -> Gaze Input Module • Remove Standalone Input Module script • or deactivate it by unchecking its checkbox
  • 157. Testing • Look at Walk button and click • Player moves, but the button doesn't!
  • 158. Moving Menu with Camera • Add a script to the LocalTrans object • CanvasMovement script • Script does the following: • finds the current camera position • sets LocalTrans to that position • rotates LocalTrans about y axis the same as camera • Outcome: • Menu moves with camera. • User can look down to click on button
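  A sketch of the CanvasMovement script (m_camera is assigned by dragging the Main Camera onto it in the Inspector):

      using UnityEngine;

      // Attached to LocalTrans; keeps the menu canvas at the camera's position
      // and turned to match the camera's heading (y-axis rotation only).
      public class CanvasMovement : MonoBehaviour
      {
          public Transform m_camera;   // drag the Main Camera here in the Inspector

          void Update()
          {
              // Follow the camera's position
              transform.position = m_camera.position;

              // Match only the camera's heading so the menu stays level with the ground
              Vector3 euler = transform.eulerAngles;
              euler.y = m_camera.eulerAngles.y;
              transform.eulerAngles = euler;
          }
      }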
  • 160. Final Result • Menu follows camera/player • Note: You may have to experiment with different canvas position and scale settings for it to appear
  • 162. Google Design Guidelines • Google’s Guidelines for good VR experiences: • Physiological Considerations • Interactive Patterns • Setup • Controls • Feedback • Display Reticle • From http://www.google.com/design/spec-vr/designing-for-google-cardboard/a-new-dimension.html
  • 163. Physiological Considerations • Factors to Consider • Head tracking • User control of movement • Use constant velocity • Grounding with fixed objects • Brightness changes
  • 164. Interactive Patterns - Setup • Setup factors to consider: • Entering and exiting • Headset adaptation • Full Screen mode • API calls • Indicating VR apps
  • 165. System Control • Issuing a command to change system state or mode • Examples • Launching application • Changing system settings • Opening a file • Etc. • Key points • Make commands visible to user • Support easy selection
  • 166. Example: GearVR Interface • 2D Interface in 3D Environment • Head pointing and click to select
  • 167. Interactive Patterns - Controls • Use fuse buttons for selection in VR
  • 168. Interactive Patterns - Feedback • Use audio and haptic feedback • Reduce visual overload • Audio alerts • 3D spatial sound • Phone vibrations
  • 169. Interactive Patterns - Display Reticle • Easier for users to target objects with a display reticle • Can display reticle only when near target object • Highlight objects (e.g. with light source) that user can target
  • 170. Use Ray-casting technique • “Laser pointer” attached to virtual hand or gaze • First object intersected by ray may be selected • User only needs to control 2 DOFs • Proven to perform well for remote selection • Variants: • Cone casting • Snap-to-object rays
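  As an illustration of the basic technique, a hypothetical stand-alone example using Unity's Physics.Raycast (separate from the GvrPointerPhysicsRaycaster used in the projects above):

      using UnityEngine;

      // Gaze ray-casting sketch: each frame, cast a ray from this object (e.g. the camera)
      // along its forward direction and report the first object the ray hits.
      public class GazeRaycaster : MonoBehaviour
      {
          public float m_maxDistance = 100.0f;   // assumed maximum selection range

          void Update()
          {
              Ray gazeRay = new Ray(transform.position, transform.forward);
              RaycastHit hit;

              if (Physics.Raycast(gazeRay, out hit, m_maxDistance))
              {
                  // hit.collider.gameObject is the first object intersected by the gaze ray
                  Debug.Log("Gazing at: " + hit.collider.gameObject.name);
              }
          }
      }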
  • 171. Gaze Directed Steering • Move in the direction that you are looking • Very intuitive, natural navigation • Can be used on simple HMDs (e.g. Google Cardboard) • But: can't look in a different direction while moving
  • 172. Cardboard Design Lab Application • Use Cardboard Design Lab app to explore design ideas
  • 173. Cardboard Design Lab Video https://www.youtube.com/watch?v=2Uf-ru2Ndvc
  • 175. Books • Unity Virtual Reality Projects • Jonathan Linowes • Holistic Game Development with Unity • Penny de Byl
  • 176. User Experiences for VR Website • www.uxofvr.com
  • 177. Useful Resources • Google Cardboard main page • https://www.google.com/get/cardboard/ • Developer Website • https://vr.google.com/cardboard/developers/ • Building a VR app for Cardboard • http://www.sitepoint.com/building-a-google-cardboard-vr-app-in-unity/ • Creating VR game for Cardboard • http://danielborowski.com/posts/create-a-virtual-reality-game-for-google-cardboard/ • Moving in VR space • http://www.instructables.com/id/Prototyping-Interactive-Environments-in-Virtual-Re/
  • 178. Resources • Unity Main site • http://www.unity3d.com/ • Holistic Development with Unity • http://holistic3d.com • Official Unity Tutorials • http://unity3d.com/learn/tutorials • Unity Coder Blog • http://unitycoder.com