EMPATHIC COMPUTING
Mark Billinghurst
mark.billinghurst@auckland.ac.nz
November 21st 2023
DELIVERING THE POTENTIAL OF THE METAVERSE
Empathic Computing: Delivering the Potential of the Metaverse
Innovation occurs at the boundary
Google Search Volume on the Metaverse
• 5-year search volume
• Blue – Metaverse, Red – Virtual Reality
Facebook -> Meta
Academic Publications Mentioning the Metaverse (Nov 2023)
Meta’s Metaverse
The Metaverse: A Definition?
AWE 2022 John Riccitiello
ex-CEO, Unity
Real-time, 3D, Interactive, Social, Persistent
• The Metaverse Roadmap (2007)
• The Metaverse is the convergence of:
• 1) virtually enhanced physical reality
• 2) physically persistent virtual space
A Better Definition
Four Key Components
• Virtual Worlds
• Augmented Reality
• Mirror Worlds
• Lifelogging
Metaverse Taxonomy
• Simulations of external space/content
• Capturing and sharing surroundings
• Photorealistic content
• Digital twins
Matterport · Google Street View · Soul Machines
Mirror Worlds
• Measuring user’s internal state
• Capturing physiological cues
• Recording everyday life
• Augmenting humans
Apple · Shimmer · OpenBCI
LifeLogging
Sensing · Immersing · Augmenting · Capturing
Research Opportunities: Crossing Boundaries
Augmented Reality Virtual Reality
Real World Virtual World
Mixed Reality
"...anywhere between the extrema of the virtuality continuum."
P. Milgram and A. F. Kishino, (1994) A Taxonomy of Mixed Reality Visual Displays
Internet of Things
Milgram’s Mixed Reality Continuum
Example: MagicBook (2000)
Reality → Augmented Reality → Virtual Reality
• Seamlessly transition between Reality and Virtual Reality
• Support ego-centric and exo-centric collaboration
Example: MagicBook (2000)
Apple Vision Pro
• Survey of Scopus papers
• ~1900 papers found with Metaverse in abstract/keywords
• Further analysis
• Publications in AR, VR, MirrorWorlds (MW), LifeLogging (LL)
• Look for research across boundaries
• Application analysis
• Most popular application domains
What is the Metaverse Research Landscape Like?
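The boundary analysis above can be sketched as a small co-occurrence count over topic tags. A minimal sketch with toy data, not the actual Scopus corpus:

```python
from itertools import combinations
from collections import Counter

# Toy stand-in for the Scopus analysis: each paper is tagged with the
# Metaverse quadrants (AR, VR, MW, LL) found in its abstract/keywords.
papers = [
    {"AR", "VR"}, {"VR"}, {"VR", "MW"}, {"AR"}, {"AR", "VR"},
    {"VR"}, {"LL"}, {"AR", "VR", "MW", "LL"},
]

# Share of papers that mention only one quadrant.
single = sum(1 for p in papers if len(p) == 1)
print(f"single-topic papers: {single / len(papers):.0%}")  # 50%

# Count every boundary-crossing pair each paper spans.
pairs = Counter()
for p in papers:
    pairs.update(combinations(sorted(p), 2))
print(pairs.most_common(2))  # AR/VR dominates, as in the real survey
```

The same tally over the real corpus yields the single-topic and pairwise percentages reported on the surrounding slides.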
• Of the papers mentioning AR, VR, MW, LL
• 66% only mentioned one topic
• VR the most popular research topic (72%)
Most Research in One Quadrant
• Key Findings
• Most research combines AR and VR (AR/VR – 67%)
• Little research involving LL (7% of two-quadrant papers)
• < 2% research combined AR, VR, MW, LL
Research Crossing Boundaries
(Chart: shares of boundary-crossing research – 67%, 23%, 5%, 3%, 2%, 0%)
Application Areas
• Education – 23%
• Medicine/Healthcare – 13%
• Marketing/Retail – 9%
• Industry – 5%
• Tourism – 5%
• Gaming – 3%
• Collaboration – 2%
Key Lessons Learned
• Research Strengths
• Most Metaverse research is VR related
• Strong connections between AR/VR
• Strong connections between MW/VR
• Research Opportunities
• Opportunities across boundaries – only 2% of papers in AR/LL, 0% in MW/LL
• Opportunities for research combining all elements
• Broadening application space – industry, finance, etc.
Capturing the Whole Metaverse
Augmenting · Sensing · Capturing · Immersing
Capturing the Whole Metaverse
Empathic Computing
“Empathy is Seeing with the Eyes of another, Listening with the Ears of another, and Feeling with the Heart of another.”
Alfred Adler
Understanding
Emotion Recognition - Life Logging
Experiencing
Content/Environment capture - VR, Mirror Worlds
Sharing
Enhanced communication cues - AR
Key Elements of Empathic Systems
1. Understanding: Affective Computing
• Rosalind Picard – MIT Media Lab
• Systems that recognize emotion
2. Experiencing: Virtual Reality
"Virtual reality offers a whole different medium to tell stories that really connect people and create an empathic connection."
Nonny de la Peña
http://www.emblematicgroup.com/
3: Sharing: Empathic Computing
Can we develop systems that allow
us to share what we are seeing,
hearing and feeling with others?
Example: Sharing Communication Cues
• Measuring non-verbal cues
• Gaze, facial expression, heart rate
• Sharing in Augmented Reality
• Collaborative AR experiences
Empathy Glasses
• Combines eye-tracking, a display, and facial expression sensing
• Implicit cues – eye gaze, facial expression
Pupil Labs + Epson BT-200 + AffectiveWear
Masai, K., Sugimoto, M., Kunze, K., & Billinghurst, M. (2016, May). Empathy Glasses. In Proceedings of the
34th Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. ACM.
Remote Collaboration
• Eye gaze pointer and remote pointing
• Facial expression display
• Implicit cues for remote collaboration
Shared Sphere – 360 Video Sharing
Shared Live 360 Video: Host User ↔ Guest User
Lee, G. A., Teo, T., Kim, S., & Billinghurst, M. (2017). Mixed reality collaboration through sharing a
live panorama. In SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications (pp. 1-4).
3D Live Scene Capture
• Uses a cluster of RGBD sensors
• Fuses their output into a single 3D point cloud
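A minimal sketch of that fusion step, assuming known camera intrinsics and calibrated camera-to-world extrinsics (illustrative, not the system's actual pipeline):

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image (metres) into camera-space 3D points."""
    h, w = depth.shape
    v, u = np.mgrid[0:h, 0:w]           # pixel row/column grids
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    pts = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]           # drop invalid (zero-depth) pixels

def fuse(clouds, extrinsics):
    """Merge per-camera clouds into one world-frame point cloud.

    extrinsics[i] is a 4x4 camera-to-world matrix from calibration.
    """
    world = []
    for pts, T in zip(clouds, extrinsics):
        homo = np.hstack([pts, np.ones((len(pts), 1))])
        world.append((homo @ T.T)[:, :3])
    return np.vstack(world)

# Two toy 2x2 depth maps viewed from cameras 1 m apart on the x-axis.
d = np.full((2, 2), 1.0)
T0 = np.eye(4)
T1 = np.eye(4); T1[0, 3] = 1.0
cloud = fuse([depth_to_points(d, 1, 1, 0.5, 0.5)] * 2, [T0, T1])
print(cloud.shape)  # (8, 3)
```

A real pipeline would follow this with filtering and surface reconstruction, but the core idea is just back-projection plus a rigid transform per sensor.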
Live 3D Scene Capture
Scene Capture and Sharing
Scene Reconstruction · Remote Expert · Local Worker
AR View · Remote Expert View
View Sharing Evolution
• Increased immersion
• Improved scene understanding
• Better collaboration
2D → 360 → 3D
Remote Communication with Avatars
Example: AR/VR/MW Collaboration
• Augmented Reality
• Bringing remote people into your real space
• Virtual Reality
• Bringing elements of the real world into VR
• AR/VR for sharing communication cues
• Sharing non-verbal communication cues
• Using AR/VR to share communication cues
• Gaze, gesture, head pose, body position
• Sharing same environment
• Virtual copy of real world
• Collaboration between AR/VR
• VR user appears in AR user’s space
Piumsomboon, T., Dey, A., Ens, B., Lee, G., & Billinghurst, M. (2019). The effects of sharing awareness cues
in collaborative mixed reality. Frontiers in Robotics and AI, 6, 5.
Sharing Virtual Communication Cues (2019)
Sharing Virtual Communication Cues
• Gesture input
• AR/VR displays
• Room scale tracking
• Sharing viewpoint cues
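One way to share a viewpoint cue is to turn the partner's tracked head pose into a world-space gaze ray for rendering as a line or frustum. A minimal sketch, assuming an (x, y, z, w) quaternion with -Z forward; these conventions are assumptions, not the paper's implementation:

```python
import numpy as np

def gaze_ray(head_pos, head_quat, length=2.0):
    """World-space gaze ray from a tracked head pose.

    head_quat is (x, y, z, w); forward is -Z in the head frame,
    as in OpenGL-style coordinates.
    """
    x, y, z, w = head_quat
    # Rotate the local forward vector (0, 0, -1) by the quaternion:
    # this is the negated third column of the rotation matrix.
    fwd = np.array([
        -2 * (x * z + w * y),
        -2 * (y * z - w * x),
        -(1 - 2 * (x * x + y * y)),
    ])
    start = np.asarray(head_pos, dtype=float)
    return start, start + length * fwd

# Identity orientation: the cue points 2 m straight ahead (-Z).
s, e = gaze_ray([0, 0, 0], [0, 0, 0, 1])
print(e)  # [ 0.  0. -2.]
```

Streaming just these two endpoints per frame is enough for the partner's client to draw the head-gaze cue in their own coordinate frame.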
Results
• Predictions
• Eye/Head pointing better than no cues
• Eye/head pointing could reduce need for pointing
• Results
• No difference in task completion time
• Head-gaze/eye-gaze gave a greater mutual gaze rate
• Head-gaze rated greater ease of use than baseline
• All cues provide higher co-presence than baseline
• Pointing gestures reduced in cue conditions
• But
• No difference between head-gaze and eye-gaze
Sharing Gaze Cues
How sharing gaze behavioural cues can improve remote collaboration in a Mixed Reality environment.
➔ Developed eyemR-Vis, a 360 panoramic Mixed Reality remote collaboration system
➔ Showed gaze behavioural cues as bi-directional spatial virtual visualisations shared
between a local host (AR) and a remote collaborator (VR).
Jing, A., May, K., Matthews, B., Lee, G., & Billinghurst, M. (2022). The Impact of Sharing Gaze Behaviours in Collaborative Mixed Reality.
Proceedings of the ACM on Human-Computer Interaction, 6(CSCW2), 1-27.
System Design
➔ 360 Panoramic Camera + Mixed Reality View
➔ Combination of HoloLens 2 + Vive Pro Eye
➔ 4 gaze behavioural visualisations:
browse, focus, mutual, fixated circle
Gaze Visualisation
Browse · Focus · Mutual · Fixed circle-map
• Bi-directional gaze with behaviours (BWB) significantly improved performance
• Behaviour visualisations stimulate frequent joint attention
• Sharing gaze cues made communication easier and more effective
Results
Joint gaze frequency
Sharing Gesture Cues
• What type of gesture cues should be shared in AR/VR collaboration?
Kim, S., Lee, G., Huang, W., Kim, H., Woo, W., & Billinghurst, M. (2019). Evaluating the combination of visual communication cues for HMD-
based mixed reality remote collaboration. In Proceedings of the 2019 CHI conference on human factors in computing systems (pp. 1-13).
Augmented Reality
Virtual Reality
Communication Cues
• Four different cues used
• (1) Hands Only (HO), (2) Hands + Pointer (HP)
• (3) Hands + Sketch (HS), (4) Hands + Pointer + Sketch (HPS)
• Three experimental tasks
• Lego assembly, Tangram puzzle, Origami folding
Key Results
• Task completion time
• Sketch cues enabled users to complete tasks significantly faster (task dep.)
• Adding pointing didn’t improve task completion time
• Co-Presence
• Adding pointing and sketch cues didn’t improve feeling of co-presence
• User Preference
• Users overwhelmingly preferred the Hands + Pointer + Sketch condition; Hands Only ranked last
Task completion time
“sketching is pretty useful for describing actions that was difficult with words and could express more details”.
Multi-Scale Collaboration
• Changing the user’s virtual body scale
On the Shoulder of the Giant
Piumsomboon, T., Lee, G. A., Irlitti, A., Ens, B., Thomas, B. H., & Billinghurst, M. (2019, May). On the shoulder of the giant: A
multi-scale mixed reality collaboration with 360 video sharing and tangible interaction. In Proceedings of the 2019 CHI (pp. 1-17).
Sharing: Separating Cues from Body
• What happens when you can’t see your colleague/agent?
Piumsomboon, T., Lee, G. A., Hart, J. D., Ens, B., Lindeman, R. W., Thomas, B. H., & Billinghurst, M. (2018, April).
Mini-me: An adaptive avatar for mixed reality remote collaboration. In Proceedings of the 2018 CHI (pp. 1-13).
Collaborating · Collaborator out of View
Mini-Me Communication Cues in MR
• When the user loses sight of their collaborator, a Mini-Me avatar appears
• Miniature avatar in the real world
• Mini-Me points to shared objects, shows communication cues
• Redirected gaze, gestures
Results from User Evaluation
• Collaboration between user in AR, expert in VR
• HoloLens, HTC Vive
• Two tasks
• Asymmetric, symmetric collaboration
• Significant performance improvement
• 20% faster with Mini-Me
• Social Presence
• Higher sense of Presence
• User preference
• People felt the task was easier to complete
• 60-75% preference
“I feel like I am talking to my partner”
• Advanced displays
• Wide FOV, high resolution
• Real time space capture
• 3D scanning, stitching, segmentation
• Natural gesture interaction
• Hand tracking, pose recognition
• Robust eye-tracking
• Gaze points, focus depth
• Emotion sensing/sharing
• Physiological sensing, emotion mapping
Technology Trends
• Advanced displays
• Real time space capture
• Natural gesture interaction
• Robust eye-tracking
• Emotion sensing/sharing
Empathic Tele-Existence
Sensor Enhanced HMDs
Project Galea
EEG, EMG, EDA, PPG, EOG, eye gaze, etc.
NeuralDrum
• Using brain synchronicity to increase connection
• Collaborative VR drumming experience
• Measure brain activity using 3 EEG electrodes
• Use PLV (phase locking value) to calculate synchronization
• More synchronization increases graphics effects/immersion
Pai, Y. S., Hajika, R., Gupta, K., Sasikumar, P., & Billinghurst, M. (2020). NeuralDrum: Perceiving Brain
Synchronicity in XR Drumming. In SIGGRAPH Asia 2020 Technical Communications (pp. 1-4).
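PLV here is the phase locking value between the two players' EEG signals: the magnitude of the mean phase-difference vector, 1 for perfect phase locking and 0 for none. A minimal sketch of the computation (illustrative, not the NeuralDrum code):

```python
import numpy as np
from scipy.signal import hilbert

def plv(x, y):
    """Phase Locking Value between two equal-length signals.

    PLV = |mean(exp(i * (phase_x - phase_y)))|, where instantaneous
    phase comes from the analytic (Hilbert) signal.
    """
    phase_diff = np.angle(hilbert(x)) - np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * phase_diff)))

# Two electrode traces sharing a rhythm at a fixed lag -> high PLV.
t = np.linspace(0, 1, 512, endpoint=False)
a = np.sin(2 * np.pi * 10 * t)              # 10 Hz "alpha" rhythm
b = np.sin(2 * np.pi * 10 * t + 0.3)        # same rhythm, constant lag
c = np.sin(2 * np.pi * 13.7 * t + 1.1)      # unrelated rhythm, drifting phase
print(round(plv(a, b), 2))   # 1.0
print(round(plv(a, c), 2))   # low: phase difference keeps drifting
```

In the experience, a running PLV estimate over short windows drives the shared visual effects: higher values mean more synchronized players and richer graphics.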
• HTC Vive HMD
• OpenBCI
• 3 EEG electrodes
Set Up
NeuralDrum – Brain Synchronisation in XR
Poor Player · Good Player
"It’s quite interesting, I actually felt like my body was exchanged with my partner."
• HTC Vive HMD
• Empathic glove
• Empatica E4
Dey, A., Piumsomboon, T., Lee, Y., & Billinghurst, M. (2017). Effects of
sharing physiological states of players in a collaborative virtual reality
gameplay. In Proceedings of CHI 2017 (pp. 4045-4056).
Being in the Body of Another
VR Environments
• Butterfly World: calm scene, collect butterflies
• Zombie Attack: scary scene, fighting zombies
Results:
• Game experience had significant impact
• In zombie game, sharing heart rate
• Enabled players to understand emotional state of partner
• Helped players to be significantly more attentive
• Subjects had a preference for sharing heart rate
• But no significant difference on feeling connected
Understanding emotional state
Empathic Tele-Existence
• Based on Empathic Computing
• Creating shared understanding
• Covering the entire Metaverse
• AR, VR, Lifelogging, Mirror Worlds
• Transforming collaboration
• Observer to participant
• Feeling of doing things together
• Supporting implicit collaboration
Conclusions
• We need a broader Metaverse definition and taxonomy
• Including AR, VR, Mirror Worlds, Life Logging
• Current research is narrowly focused
• But many research opportunities exist in crossing boundaries
• Empathic Computing encompasses the entire Metaverse
• Transforming face-to-face and remote collaboration
Mini-Me · Virtual Cues · Enhanced Emotion · Brain Synchronization · Emotion Recognition · Scene Capture · AI
Delivering on the Metaverse potential
www.empathiccomputing.org
@marknb00
mark.billinghurst@unisa.edu.au