LECTURE 10: RESEARCH DIRECTIONS IN AR AND VR
COMP 4010 – Virtual Reality
Semester 5 – 2018
Bruce Thomas, Mark Billinghurst
University of South Australia
October 30th 2018
Key Technologies for VR Systems
• Visual Display
• Stimulate visual sense
• Audio/Tactile Display
• Stimulate hearing/touch
• Tracking
• Changing viewpoint
• User input
• Input Devices
• Supporting user interaction
Many Directions for Research
• Research in each of the key technology areas for VR
• Display, input, tracking, graphics, etc.
• Research in the phenomena of VR
• Psychological experience of VR, measuring Presence, etc.
• Research in tools for VR
• VR authoring tools, automatic world creation, etc.
• Research in many application areas
• Collaborative/social, virtual characters, medical, education, etc.
Future Visions of VR: Ready Player One
• https://www.youtube.com/watch?v=LiK2fhOY0nE
Today vs. Tomorrow
• Graphics: high quality (2018) → photo-realistic (2045)
• Display: 110-150 degrees (2018) → total immersion (2045)
• Interaction: handheld controller (2018) → full gesture/body/gaze (2045)
• Navigation: limited movement (2018) → natural movement (2045)
• Multiuser: few users (2018) → millions of users (2045)
Augmented Reality
• Defining Characteristics [Azuma 97]
• Combines Real and Virtual Images
• Both can be seen at the same time
• Interactive in real-time
• The virtual content can be interacted with
• Registered in 3D
• Virtual objects appear fixed in space
Azuma, R. T. (1997). A survey of augmented reality. Presence, 6(4), 355-385.
Future Vision of AR: Iron Man
• https://www.youtube.com/watch?v=Y1TEK2Wf_e8
Key Enabling Technologies
1. Combines Real and Virtual Images
Display Technology
2. Registered in 3D
Tracking Technologies
3. Interactive in real-time
Interaction Technologies
Future research can be done in each of these areas
DISPLAY
Evolution in Displays
• Past
• Bulky head-mounted displays
• Current
• Handheld, lightweight head-mounted displays
• Future
• Projected AR
• Wide FOV see-through
• Retinal displays
• Contact lenses
Wide FOV See-Through (3+ years)
• Waveguide techniques
• Wider FOV
• Thin see-through
• Socially acceptable
• Pinlight Displays
• LCD panel + point light sources
• 110 degree FOV
• UNC/Nvidia
Lumus DK40
Maimone, A., Lanman, D., Rathinavel, K., Keller, K., Luebke, D., & Fuchs, H. (2014). Pinlight displays: wide
field of view augmented reality eyeglasses using defocused point light sources. In ACM SIGGRAPH 2014
Emerging Technologies (p. 20). ACM.
Pinlight Display Demo
https://www.youtube.com/watch?v=tJULL1Oou9k
Light Field Displays
https://www.youtube.com/watch?v=J28AvVBZWbg
Retinal Displays (5+ years)
• Photons scanned into eye
• Infinite depth of field
• Bright outdoor performance
• Overcome visual defects
• True 3D stereo with depth modulation
• Microvision (1993-)
• Head mounted monochrome
• Magic Leap (2013-)
• Projecting light field into eye
Contact Lens (10-15+ years)
• Contact lens only
• Unobtrusive
• Significant technical challenges
• Power, data, resolution
• Babak Parviz (2008)
• Contact lens + micro-display
• Wide FOV
• Socially acceptable
• Innovega (innovega-inc.com)
http://spectrum.ieee.org/biomedical/bionics/augmented-reality-in-a-contact-lens/
TRACKING
Evolution of Tracking
• Past
• Location based, marker based, magnetic/mechanical
• Present
• Image based, hybrid tracking
• Future
• Ubiquitous
• Model based
• Environmental
Model Based Tracking (1-3 yrs)
• Track from known 3D model
• Use depth + colour information
• Match input to model template
• Use CAD model of targets
• Recent innovations
• Learn models online
• Tracking from cluttered scene
• Track from deformable objects
Hinterstoisser, S., Lepetit, V., Ilic, S., Holzer, S., Bradski, G., Konolige, K., & Navab, N. (2013).
Model based training, detection and pose estimation of texture-less 3D objects in heavily
cluttered scenes. In Computer Vision–ACCV 2012 (pp. 548-562). Springer Berlin Heidelberg.
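As a toy illustration of the template-matching idea (not the pipeline of the paper above, which matches quantized depth and gradient templates and recovers a full 6-DoF pose), here is a minimal OpenCV sketch; the file names and acceptance threshold are assumptions:

```python
# Toy template-matching sketch: slide a known object view over the
# scene and report the best match. Real model-based trackers match
# depth + colour templates and estimate a full 6-DoF pose.
import cv2

def find_object(scene_gray, template_gray):
    """Return (top_left_xy, score) of the best template match."""
    result = cv2.matchTemplate(scene_gray, template_gray, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    return max_loc, max_val

# Hypothetical usage (file names are stand-ins):
# scene = cv2.imread("scene.png", cv2.IMREAD_GRAYSCALE)
# template = cv2.imread("target_view.png", cv2.IMREAD_GRAYSCALE)
# loc, score = find_object(scene, template)
# print("match at", loc)   # accept if score > ~0.8 (assumed threshold)
```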
Deformable Object Tracking
https://www.youtube.com/watch?v=KThSoK0VTDU
Environmental Tracking (3+ yrs)
• Environment capture
• Use depth sensors to capture scene & track from model
• InfinitAM (www.robots.ox.ac.uk/~victor/infinitam/)
• Real time scene capture on mobiles, dense or sparse capture
• Dynamic memory swapping allows large environment capture
• Cross platform, open source library available
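At the core of such systems is volumetric depth fusion: each depth frame is integrated into a truncated signed distance function (TSDF) voxel grid. Below is a minimal single-frame NumPy sketch of that update rule (camera at the origin, pinhole intrinsics; the volume placement and all constants are assumptions, and this is not InfinitAM's actual API):

```python
# Minimal TSDF integration sketch (toy version of the volumetric
# fusion used by systems like InfinitAM; not its actual API).
import numpy as np

def integrate_tsdf(tsdf, weights, depth, fx, fy, cx, cy,
                   voxel_size=0.01, trunc=0.05):
    """Fuse one depth frame (camera at the origin) into a TSDF volume."""
    nx, ny, nz = tsdf.shape
    # World coordinates of every voxel centre (volume placed in front of camera).
    xs, ys, zs = np.meshgrid(np.arange(nx), np.arange(ny), np.arange(nz),
                             indexing="ij")
    X = (xs - nx / 2) * voxel_size
    Y = (ys - ny / 2) * voxel_size
    Z = zs * voxel_size + 0.3       # 0.3 m offset is an assumption
    # Project voxel centres into the depth image (pinhole model).
    u = np.round(fx * X / Z + cx).astype(int)
    v = np.round(fy * Y / Z + cy).astype(int)
    h, w = depth.shape
    valid = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    d = np.where(valid, depth[v.clip(0, h - 1), u.clip(0, w - 1)], 0.0)
    valid &= d > 0
    # Signed distance along the ray, truncated and normalised to [-1, 1].
    sdf = np.clip(d - Z, -trunc, trunc) / trunc
    update = valid & (d - Z > -trunc)
    # Running weighted average: the standard TSDF update rule.
    w_new = weights + update
    tsdf[update] = (tsdf[update] * weights[update] + sdf[update]) / w_new[update]
    weights[:] = w_new
    return tsdf, weights

# Hypothetical usage:
# vol = np.zeros((128, 128, 128)); w = np.zeros_like(vol)
# vol, w = integrate_tsdf(vol, w, depth_frame, fx=525, fy=525, cx=319.5, cy=239.5)
```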
InfinitAM Demo
https://www.youtube.com/watch?v=47zTHHxJjQU
Fusion4D (2016)
• Shahram Izadi (Microsoft + perceptiveIO)
• Real-time capture and dynamic reconstruction
• RGBD sensors + incremental reconstruction
Fusion4D Demo
https://www.youtube.com/watch?v=rnz0Kt36mOQ
Wide Area Outdoor Tracking (5+ yrs)
• Process
• Combine panoramas into a point-cloud model (offline)
• Initialize camera tracking from point cloud
• Update pose by aligning camera image to point cloud
• Accurate to 25 cm and 0.5 degrees over a very wide area
Ventura, J., & Hollerer, T. (2012). Wide-area scene mapping for mobile visual tracking. In Mixed
and Augmented Reality (ISMAR), 2012 IEEE International Symposium on (pp. 3-12). IEEE.
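Once image features have been matched to points in the offline cloud, the per-frame pose update reduces to a Perspective-n-Point (PnP) problem. A hedged OpenCV sketch of that step (the feature matching itself is elided; the array contents are stand-ins, not the paper's code):

```python
# Pose update from 2D-3D correspondences via RANSAC PnP: a standard
# tool for the "align camera image to point cloud" step.
import cv2
import numpy as np

def update_pose(points_3d, points_2d, K):
    """Estimate camera pose from matched model and image points.

    points_3d: (N, 3) points from the offline point-cloud model
    points_2d: (N, 2) matched feature locations in the current frame
    K:         (3, 3) camera intrinsic matrix
    """
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        points_3d.astype(np.float32),
        points_2d.astype(np.float32),
        K, distCoeffs=None)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)   # rotation vector -> 3x3 rotation matrix
    return R, tvec, inliers
```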
Wide Area Outdoor Tracking
https://www.youtube.com/watch?v=8ZNN0NeXV6s
Outdoor Localization using Maps
• Use 2D building footprints and approximate heights
• Process
• Sensor input gives an initial position and orientation
• Estimate camera orientation from straight line segments
• Estimate camera translation from façade segmentation
• Use the pose estimate to initialise SLAM tracking
• Results: 90% of estimates < 4 m position error, < 3° angular error
Arth, C., Pirchheim, C., Ventura, J., Schmalstieg, D., & Lepetit, V. (2015). Instant outdoor localization and SLAM initialization from 2.5D maps. IEEE Transactions on Visualization and Computer Graphics, 21(11), 1309-1318.
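The line-segment extraction that seeds the orientation estimate can be illustrated with a standard Hough transform; the paper then groups segments into vanishing directions and registers them against the 2.5D map, which is omitted here. Parameter values below are assumptions:

```python
# Straight-line-segment extraction, the first step of the orientation
# estimate (illustrative only; the paper clusters segments into
# vanishing directions and matches them against the 2.5D map).
import cv2
import numpy as np

def extract_segments(image_bgr):
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)                    # thresholds assumed
    segments = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                               threshold=80, minLineLength=40, maxLineGap=5)
    return [] if segments is None else segments[:, 0]  # (N, 4): x1, y1, x2, y2
```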
Demo: Outdoor Tracking
https://www.youtube.com/watch?v=PzV8VKC5buQ
INTERACTION
Evolution of Interaction
• Past
• Limited interaction
• Viewpoint manipulation
• Present
• Screen based, simple gesture, tangible interaction
• Future
• Natural gesture, Multimodal
• Intelligent Interfaces
• Physiological/Sensor based
Natural Gesture (2-5 years)
• Freehand gesture input
• Depth sensors for gesture capture
• Move beyond simple pointing
• Rich two handed gestures
• E.g., Microsoft Research Hand Tracker
• 3D hand tracking, 30 fps, single sensor
• Commercial Systems
• Meta, Microsoft HoloLens, Oculus, Intel, etc.
Sharp, T., Keskin, C., Robertson, D., Taylor, J., Shotton, J., Kim, D., Rhemann, C., Leichter, I., ... & Izadi, S. (2015). Accurate, Robust, and Flexible Real-time Hand Tracking. In Proceedings of CHI 2015. ACM.
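Richer gestures can be layered on top of a hand tracker's joint output with simple geometric rules. A minimal pinch-detection sketch (the joint dictionary format and threshold are hypothetical stand-ins, not the output format of the tracker cited above):

```python
# Pinch detection from tracked 3D fingertips: a toy rule on top of a
# hand tracker's output (the joint layout here is a made-up stand-in).
import numpy as np

PINCH_THRESHOLD_M = 0.025   # 2.5 cm, an assumed threshold

def is_pinching(joints: dict) -> bool:
    """joints maps names like 'thumb_tip' to (x, y, z) positions in metres."""
    thumb = np.asarray(joints["thumb_tip"])
    index = np.asarray(joints["index_tip"])
    return np.linalg.norm(thumb - index) < PINCH_THRESHOLD_M

# Example frame:
# frame = {"thumb_tip": (0.01, 0.02, 0.40), "index_tip": (0.02, 0.03, 0.41)}
# print(is_pinching(frame))
```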
Hand Tracking Demo
https://www.youtube.com/watch?v=QTz1zQAnMcU
Example: Eye Tracking Input
• Smaller/cheaper eye-tracking systems
• More HMDs with integrated eye-tracking
• Research questions
• How can eye gaze be used for interaction?
• What interaction metaphors are natural?
• What technology can be used for eye-tracking?
• Etc.
Eye Gaze Interaction Methods
• Gaze for interaction
• Implicit vs. explicit input
• Exploring different gaze interaction methods
• Duo reticles – use eye saccade input
• Radial pursuit – use smooth pursuit motion
• Nod and roll – use the vestibulo-ocular reflex
• Hardware
• HTC Vive + Pupil Labs integrated eye-tracking
• User study comparing the methods for 3D user interfaces
Piumsomboon, T., Lee, G., Lindeman, R. W., & Billinghurst, M. (2017, March).
Exploring natural eye-gaze-based interaction for immersive virtual reality. In 3D User
Interfaces (3DUI), 2017 IEEE Symposium on (pp. 36-39). IEEE.
Duo-Reticles (DR)
• Two reticles: the Real-time Reticle (RR, originally the "Eye-gaze Reticle") follows the eye gaze, and the Inertial Reticle (IR) trails behind it
• While RR and IR are aligned, an alignment timer counts down; when it completes, the selection is made
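The alignment countdown can be sketched as a per-frame state update: while the two reticles stay within an angular threshold the timer runs down to selection, and breaking alignment resets it. The threshold and dwell values below are assumptions, not the paper's parameters:

```python
# Duo-Reticles selection sketch: count down while the Real-time Reticle
# (RR) and Inertial Reticle (IR) stay angularly aligned (values assumed).
import math

ALIGN_THRESHOLD_DEG = 2.0
DWELL_SECONDS = 0.8

class DuoReticleSelector:
    def __init__(self):
        self.remaining = DWELL_SECONDS

    def update(self, rr_dir, ir_dir, dt):
        """rr_dir, ir_dir: unit direction vectors; dt: seconds since last frame."""
        dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(rr_dir, ir_dir))))
        if math.degrees(math.acos(dot)) < ALIGN_THRESHOLD_DEG:
            self.remaining -= dt            # reticles aligned: keep counting down
            if self.remaining <= 0:
                self.remaining = DWELL_SECONDS
                return True                 # selection completed
        else:
            self.remaining = DWELL_SECONDS  # alignment broken: reset the timer
        return False
```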
Radial Pursuit (RP)
• Select an object by smoothly pursuing it with the eyes (tracked by the Real-time Reticle, RR); the object whose motion trajectory best matches the gaze trajectory is selected:

$d_{\min} = \min(d_1, d_2, \ldots, d_n), \quad d_i = \sum_{t} \lvert p(i)_t - p'_t \rvert$

where $p(i)_t$ is the on-screen position of candidate object $i$ at time $t$ and $p'_t$ is the gaze point.
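In code, radial pursuit is a nearest-trajectory test: accumulate each candidate object's distance to the gaze trajectory over the pursuit window, then select the argmin. A NumPy sketch under that reading of the formula (windowing details are assumptions):

```python
# Radial Pursuit selection sketch: pick the object whose on-screen
# trajectory best matches the recorded gaze trajectory.
import numpy as np

def radial_pursuit_select(object_trajectories, gaze_trajectory):
    """object_trajectories: list of (T, 2) arrays, one per candidate object;
    gaze_trajectory: (T, 2) array of gaze points over the same window."""
    gaze = np.asarray(gaze_trajectory, dtype=float)
    d = [np.sum(np.linalg.norm(np.asarray(traj, dtype=float) - gaze, axis=1))
         for traj in object_trajectories]
    return int(np.argmin(d))  # index i minimising d_i
```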
Nod and Roll (NR)
• Uses the vestibulo-ocular reflex: the eyes hold the target (Real-time Reticle, RR) while a head nod or roll moves the Head-gaze Reticle (HR) relative to it
Demo: Eye gaze interaction methods
https://www.youtube.com/watch?v=EpCGqxkmBKE
Multimodal Input (5+ years)
• Combine gesture and speech input
• Gesture good for qualitative input
• Speech good for quantitative input
• Support combined commands
• “Put that there” + pointing
• E.g., HIT Lab NZ multimodal input (see the sketch below)
• 3D hand tracking, speech
• Multimodal fusion module
• Users complete tasks faster with MMI and make fewer errors
Billinghurst, M., Piumsomboon, T., & Bai, H. (2014). Hands in Space: Gesture Interaction with
Augmented-Reality Interfaces. IEEE computer graphics and applications, (1), 77-80.
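A minimal fusion module binds deictic words ("that", "there") to the pointing target closest in time to each word. The sketch below shows that time-window pairing; the event formats and window length are assumptions, not the HIT Lab NZ implementation:

```python
# "Put that there" fusion sketch: bind speech demonstratives to the
# pointing target closest in time (formats and window are assumed).
FUSION_WINDOW_S = 0.5

def fuse(speech_events, pointing_events):
    """speech_events: [(time, word)]; pointing_events: [(time, target_id)].
    Returns the command as a list of (word, target_id) bindings."""
    bindings = []
    for s_time, word in speech_events:
        if word not in ("that", "there"):
            continue
        candidates = [(abs(p_time - s_time), target)
                      for p_time, target in pointing_events
                      if abs(p_time - s_time) <= FUSION_WINDOW_S]
        if candidates:
            bindings.append((word, min(candidates)[1]))
    return bindings

# fuse([(0.1, "put"), (0.4, "that"), (1.2, "there")],
#      [(0.45, "cube1"), (1.15, "slot3")])
# -> [("that", "cube1"), ("there", "slot3")]
```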
HIT Lab NZ Multimodal Input
https://www.youtube.com/watch?v=DSsrzMxGwcA
Intelligent Interfaces (10+ years)
• Move from explicit to implicit input
• Recognize user behaviour
• Provide adaptive feedback
• Support scaffolded learning
• Move beyond check-lists of actions
• E.g., AR + intelligent tutoring (toy sketch after the reference below)
• Constraint-based ITS + AR
• PC assembly (Westerfield, 2015)
• 30% faster, 25% better retention
Westerfield, G., Mitrovic, A., & Billinghurst, M. (2015). Intelligent Augmented Reality Training for
Motherboard Assembly. International Journal of Artificial Intelligence in Education, 25(1), 157-172.
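Constraint-based tutors encode the domain as relevance/satisfaction condition pairs rather than checklists of actions: whenever a relevant constraint is violated, its feedback fires. A toy sketch for the PC-assembly domain (the constraints themselves are invented for illustration, not the actual Westerfield et al. knowledge base):

```python
# Constraint-based tutoring sketch: each constraint has a relevance
# test, a satisfaction test, and feedback (toy constraints).
CONSTRAINTS = [
    {
        "relevant":  lambda s: s.get("power_connected"),
        "satisfied": lambda s: s.get("ram_seated"),
        "feedback":  "Seat the RAM before connecting the power supply.",
    },
    {
        "relevant":  lambda s: s.get("heatsink_mounted"),
        "satisfied": lambda s: s.get("thermal_paste_applied"),
        "feedback":  "Apply thermal paste before mounting the heatsink.",
    },
]

def evaluate(state: dict):
    """Return feedback for every relevant-but-violated constraint."""
    return [c["feedback"] for c in CONSTRAINTS
            if c["relevant"](state) and not c["satisfied"](state)]

# evaluate({"power_connected": True, "ram_seated": False})
# -> ["Seat the RAM before connecting the power supply."]
```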
COLLABORATION
Collaborative VR Systems
• Directions for research
• Scalability – towards millions of users
• Graphics – support for multiple different devices
• User representation – realistic face/body input
• Support for communication cues – messaging, recording, etc.
• Goal: collaboration in VR as good as, or better than, face-to-face (FtF)
AltspaceVR, Facebook Spaces
Demo: High Fidelity
https://www.youtube.com/watch?v=-ivL1DDwUK4
ENHANCED EXPERIENCES
Crossing Boundaries
Jun Rekimoto, Sony CSL
Invisible Interfaces
Jun Rekimoto, Sony CSL
Milgram's Reality-Virtuality (RV) Continuum
• Mixed Reality covers everything between the two extremes of the continuum:
Real Environment → Augmented Reality (AR) → Augmented Virtuality (AV) → Virtual Environment
The MagicBook
• A single interface spanning the continuum: Reality → Augmented Reality (AR) → Augmented Virtuality (AV) → Virtuality
The MagicBook
• Using AR to transition along Milgram’s continuum
• Moving seamlessly from Reality to AR to VR
• Support for Collaboration
• Face to Face, Shared AR/VR, Multi-scale
• Natural interaction
• Handheld AR and VR viewer
Billinghurst, M., Kato, H., & Poupyrev, I. (2001). The MagicBook: a transitional
AR interface. Computers & Graphics, 25(5), 745-753.
Demo: MagicBook
https://www.youtube.com/watch?v=tNMljw0F-aw
Example: Visualizing Sensor Networks
• Rauhala et al., 2006 (Linköping University)
• Network of humidity sensors
• ZigBee wireless communication
• Use mobile AR to visualize humidity
Rauhala, M., Gunnarsson, A. S., & Henrysson, A. (2006). A novel interface to sensor
networks using handheld augmented reality. In Proceedings of the 8th conference on
Human-computer interaction with mobile devices and services (pp. 145-148). ACM.
• Humidity information overlaid on real world shown in mobile AR
UbiVR – CAMAR (GIST, Korea)
• CAMAR Companion, CAMAR Viewer, CAMAR Controller
ubiHome @ GIST
• A ubiquitous smart-home testbed: sensors (ubiKey, couch sensor, door sensor, Tag-it, PDA, ubiTrack) feed who/what/when/where/how context to services (media services, light service, MR window)
SOCIAL ACCEPTANCE
Example: Social Acceptance
• People don't want to look silly
• Only 12% of 4,600 adults surveyed would be willing to wear AR glasses
• 20% of mobile AR browser users experience social issues
• Acceptance is more a social than a technical issue
• Needs further study (ethnographic, field tests, longitudinal)
TAT Augmented ID
https://www.youtube.com/watch?v=tb0pMeg1UN0
CONCLUSIONS
Research Needed in Many Areas
• Collaborative Experiences
• VR/AR teleconferencing
• Social Acceptance
• Overcome social problems with AR/VR
• Cloud Services
• Cloud based storage/processing
• Authoring Tools
• Easy content creation for non-experts for AR/VR
• Etc.
www.empathiccomputing.org
@marknb00
mark.billinghurst@unisa.edu.au