COSC 426: Augmented Reality

            Mark Billinghurst
     mark.billinghurst@hitlabnz.org

              Sept 5th 2012

    Lecture 7: Designing AR Interfaces
AR Interfaces
  Browsing Interfaces
    simple (conceptually!), unobtrusive
  3D AR Interfaces
    expressive, creative, require attention
  Tangible Interfaces
    Embedded into conventional environments
  Tangible AR
    Combines TUI input + AR display
AR Interfaces as Data Browsers
  2D/3D virtual objects are
   registered in 3D
    “VR in Real World”
  Interaction
    2D/3D virtual viewpoint
     control
  Applications
    Visualization, training
3D AR Interfaces
  Virtual objects displayed in 3D
   physical space and manipulated
     HMDs and 6DOF head-tracking
     6DOF hand trackers for input
  Interaction
     Viewpoint control
     Traditional 3D user interface           Kiyokawa, et al. 2000
      interaction: manipulation, selection,
      etc.
Augmented Surfaces and
Tangible Interfaces
  Basic principles
     Virtual objects are
      projected on a surface
     Physical objects are used
      as controls for virtual
      objects
     Support for collaboration
Tangible Interfaces - Ambient
  Dangling String
     Jeremijenko 1995
     Ambient ethernet monitor
     Relies on peripheral cues
  Ambient Fixtures
     Dahley, Wisneski, Ishii 1998
     Use natural material qualities
       for information display
Back to the Real World

  AR overcomes limitations of TUIs
    enhance display possibilities
    merge task/display space
    provide public and private views


  TUI + AR = Tangible AR
    Apply TUI methods to AR interface design
  Space-multiplexed
      Many devices each with one function
        -  Quicker to use and more intuitive, but clutters the workspace
        -  Real Toolbox

  Time-multiplexed
      One device with many functions
        -  Space efficient
        -  mouse
Tangible AR: Tiles (Space Multiplexed)
  Tiles semantics
     data tiles
     operation tiles
  Operation on tiles
     proximity
     spatial arrangements
     space-multiplexed
Tangible AR: Time-multiplexed Interaction
  Use of natural physical object manipulations to
   control virtual objects
  VOMAR Demo
     Catalog book:
      -  Turn over the page
     Paddle operation:
      -  Push, shake, incline, hit, scoop
Building Compelling AR Experiences

          experiences
          applications    -  Interaction
          tools           -  Authoring
          components      -  Tracking, Display

                                        Sony CSL © 2004
Interface Design Path
1/ Prototype Demonstration
2/ Adoption of interaction techniques from other
   interface metaphors                  (Augmented Reality is here)
3/ Development of new interface metaphors
   appropriate to the medium            (Virtual Reality is here)
4/ Development of formal theoretical models for
   predicting and modeling user actions (Desktop WIMP is here)
Interface metaphors
  Designed to be similar to a physical entity but also has own
   properties
      e.g. desktop metaphor, search engine
  Exploit user’s familiar knowledge, helping them to understand
   ‘the unfamiliar’
  Conjures up the essence of the unfamiliar activity, enabling
   users to leverage this familiarity to understand more aspects
   of the unfamiliar functionality
  People find it easier to learn and talk about what they are
   doing at the computer interface in terms familiar to them
Example: The spreadsheet
  Analogous to ledger
   sheet
  Interactive and
   computational
  Easy to understand
  Greatly extending
   what accountants
   and others could do



www.bricklin.com/history/refcards.htm
Why was it so good?
  It was simple, clear, and obvious to the users how to
   use the application and what it could do
  “it is just a tool to allow others to work out their
   ideas and reduce the tedium of repeating the same
   calculations.”
  Capitalized on users’ familiarity with ledger sheets
  Got the computer to perform a range of different
   calculations in response to user input
Another classic
  8010 Star office system targeted at workers not
   interested in computing per se
  Spent several person-years at beginning working out
   the conceptual model
  Simplified the electronic world, making it seem more
   familiar, less alien, and easier to learn




  Johnson et al (1989)
The Star interface
Benefits of interface metaphors
  Makes learning new systems easier
  Helps users understand the underlying
   conceptual model
  Can be innovative and enable the realm of
   computers and their applications to be made
   more accessible to a greater diversity of users
Problems with interface metaphors
              (Nielsen, 1990)
  Break conventional and cultural rules
      e.g., recycle bin placed on desktop
  Can constrain designers in the way they conceptualize a problem
  Conflict with design principles
  Forces users to only understand the system in terms of the
   metaphor
  Designers can inadvertently use bad existing designs and transfer
   the bad parts over
  Limits designers’ imagination with new conceptual models
Microsoft Bob
  PSDoom – killing processes
AR Design Principles
  Interface Components
    Physical components
    Display elements
    -  Visual/audio
    Interaction metaphors
              Physical Elements  ->  Interaction Metaphor  ->  Display Elements
                  (Input)                                          (Output)
AR Design Space

    Reality  <----------  Augmented Reality  ---------->  Virtual Reality

Physical Design                                          Virtual Design
Tangible AR Design Principles
  Tangible AR Interfaces use TUI principles
    Physical controllers for moving virtual content
    Support for spatial 3D interaction techniques
    Time and space multiplexed interaction
    Support for multi-handed interaction
    Match object affordances to task requirements
    Support parallel activity with multiple objects
    Allow collaboration between multiple users
  Space-multiplexed
     Many devices each with one function
       -  Quicker to use and more intuitive, but clutters the workspace
       -  Tiles Interface, toolbox


  Time-multiplexed
     One device with many functions
       -  Space efficient
       -  VOMAR Interface, mouse
Design of Objects
  Objects
     Purposely built – affordances
     “Found” – repurposed
     Existing – already in use in the marketplace
  Make affordances obvious (Norman)
       Object affordances visible
       Give feedback
       Provide constraints
       Use natural mapping
       Use good cognitive model
Object Design
Affordances: to give a clue
  Refers to an attribute of an object that allows people to
   know how to use it
     e.g. a mouse button invites pushing, a door handle affords
      pulling

  Norman (1988) used the term to discuss the design of
   everyday objects
  Has since been much popularised in interaction design
   to discuss how to design interface objects
     e.g. scrollbars to afford moving up and down, icons to afford
      clicking on
"...the term affordance refers to the perceived and
    actual properties of the thing, primarily those
    fundamental properties that determine just how the
    thing could possibly be used. [...] Affordances
    provide strong clues to the operations of things.
    Plates are for pushing. Knobs are for turning. Slots
    are for inserting things into. Balls are for throwing or
    bouncing. When affordances are taken advantage of,
    the user knows what to do just by looking: no
    picture, label, or instruction needed."
    (Norman, The Psychology of Everyday Things 1988, p.9)
Physical Affordances
  Physical affordances:
   What do the following physical objects afford?
   Are they obvious?
‘Affordance’ and Interface Design?
  Interfaces are virtual and do not have affordances
   like physical objects
  Norman argues it does not make sense to talk
   about interfaces in terms of ‘real’ affordances
  Instead interfaces are better conceptualized as
   ‘perceived’ affordances
     Learned conventions of arbitrary mappings between
      action and effect at the interface
     Some mappings are better than others
Virtual Affordances
  Virtual affordances
   What do the following screen objects afford?
   What if you were a novice user?
   Would you know what to do with them?
  AR is mixture of physical affordance and
   virtual affordance
  Physical
     Tangible controllers and objects
  Virtual
     Virtual graphics and audio
Case Study 1: 3D AR Lens
Goal: Develop a lens based AR interface
  MagicLenses
     Developed at Xerox PARC in 1993
     View a region of the workspace differently to the rest
     Overlap MagicLenses to create composite effects
3D MagicLenses
MagicLenses extended to 3D (Viega et al. 1996)
  Volumetric and flat lenses
AR Lens Design Principles
  Physical Components
    Lens handle
     -  Virtual lens attached to real object
  Display Elements
    Lens view
     -  Reveal layers in dataset
  Interaction Metaphor
    Physically holding lens
3D AR Lenses: Model Viewer
    Displays models made up of multiple parts
    Each part can be shown or hidden through the lens
    Allows the user to peer inside the model
    Maintains focus + context
AR Lens Demo
AR Lens Implementation



(Figure: the stencil buffer mask, the view outside the lens, the view
inside the lens, and the combined virtual magnifying glass)
AR FlexiLens




Real handles/controllers with flexible AR lens
Techniques based on AR Lenses
  Object Selection
     Select objects by targeting them with the lens
  Information Filtering
     Show different representations through the lens
     Hide certain content to reduce clutter, look inside things
  Move between AR and VR
     Transition along the reality-virtuality continuum
     Change our viewpoint to suit our needs
Case Study 2: LevelHead

  Block-based game
Case Study 2: LevelHead
  Physical Components
    Real blocks
  Display Elements
    Virtual person and rooms
  Interaction Metaphor
    Blocks are rooms
Case Study 3: AR Chemistry (Fjeld 2002)
  Tangible AR chemistry education
Goal: An AR application to test molecular
 structure in chemistry
  Physical Components
    Real book, rotation cube, scoop, tracking markers
  Display Elements
    AR atoms and molecules
  Interaction Metaphor
    Build your own molecule
AR Chemistry Input Devices
Case Study 4: Transitional Interfaces
Goal: An AR interface supporting transitions
 from reality to virtual reality
  Physical Components
    Real book
  Display Elements
    AR and VR content
  Interaction Metaphor
    Book pages hold virtual scenes
Milgram’s Continuum (1994)
                          Mixed Reality (MR)

  Reality (Tangible Interfaces) -> Augmented Reality (AR)
      -> Augmented Virtuality (AV) -> Virtuality (Virtual Reality)

  Central Hypothesis
       The next generation of interfaces will support transitions
        along the Reality-Virtuality continuum
Transitions
  Interfaces of the future will need to support
   transitions along the RV continuum
  Augmented Reality is preferred for:
    co-located collaboration
  Immersive Virtual Reality is preferred for:
    experiencing world immersively (egocentric)
    sharing views
    remote collaboration
The MagicBook
  Design Goals:
    Allows user to move smoothly between reality
     and virtual reality
    Support collaboration
MagicBook Metaphor
Features
  Seamless transition between Reality and Virtuality
     Reliance on real decreases as virtual increases
  Supports egocentric and exocentric views
     User can pick appropriate view
  Computer becomes invisible
     Consistent interface metaphors
     Virtual content seems real
  Supports collaboration
Collaboration
  Collaboration on multiple levels:
    Physical Object
    AR Object
    Immersive Virtual Space
  Egocentric + exocentric collaboration
    multiple multi-scale users
  Independent Views
    Privacy, role division, scalability
Technology
  Reality
     No technology
  Augmented Reality
     Camera – tracking
     Switch – fly in
  Virtual Reality
     Compass – tracking
     Press pad – move
     Switch – fly out
Scientific Visualization
Education
Summary
  When designing AR interfaces, think of:
    Physical Components
     -  Physical affordances
    Virtual Components
     -  Virtual affordances
    Interface Metaphors
OSGART:
From Registration to Interaction
Keyboard and Mouse Interaction
    Traditional input techniques
    OSG provides a framework for handling keyboard
     and mouse input events (osgGA)
      1.  Subclass osgGA::GUIEventHandler
      2.  Handle events:
         •    Mouse up / down / move / drag / scroll-wheel
         •    Key up / down
      3.  Add instance of new handler to the viewer
Keyboard and Mouse Interaction
       Create your own event handler class
class KeyboardMouseEventHandler : public osgGA::GUIEventHandler {

public:
   KeyboardMouseEventHandler() : osgGA::GUIEventHandler() { }

     virtual bool handle(const osgGA::GUIEventAdapter& ea,osgGA::GUIActionAdapter& aa,
        osg::Object* obj, osg::NodeVisitor* nv) {

         switch (ea.getEventType()) {
            // Possible events we can handle
            case osgGA::GUIEventAdapter::PUSH: break;
            case osgGA::GUIEventAdapter::RELEASE: break;
            case osgGA::GUIEventAdapter::MOVE: break;
            case osgGA::GUIEventAdapter::DRAG: break;
            case osgGA::GUIEventAdapter::SCROLL: break;
            case osgGA::GUIEventAdapter::KEYUP: break;
            case osgGA::GUIEventAdapter::KEYDOWN: break;
         }

         return false;
     }
};


       Add it to the viewer to receive events
viewer.addEventHandler(new KeyboardMouseEventHandler());
Keyboard Interaction
    Handle W,A,S,D keys to move an object
case osgGA::GUIEventAdapter::KEYDOWN: {

   switch (ea.getKey()) {
      case 'w': // Move forward 5mm
         localTransform->preMult(osg::Matrix::translate(0, -5, 0));
         return true;
      case 's': // Move back 5mm
         localTransform->preMult(osg::Matrix::translate(0, 5, 0));
         return true;
      case 'a': // Rotate 10 degrees left
         localTransform->preMult(osg::Matrix::rotate(osg::DegreesToRadians(10.0f), osg::Z_AXIS));
         return true;
      case 'd': // Rotate 10 degrees right
         localTransform->preMult(osg::Matrix::rotate(osg::DegreesToRadians(-10.0f), osg::Z_AXIS));
         return true;
      case ' ': // Reset the transformation
         localTransform->setMatrix(osg::Matrix::identity());
         return true;
   }

break;


// Setup (done at init time): localTransform hangs under the marker transform
localTransform = new osg::MatrixTransform();
localTransform->addChild(osgDB::readNodeFile("media/car.ive"));
arTransform->addChild(localTransform.get());
Keyboard Interaction Demo
Mouse Interaction
  Mouse is pointing device…
  Use mouse to select objects in an AR scene
  OSG provides methods for ray-casting and
   intersection testing
    Return an osg::NodePath (the path from the hit
     node all the way back to the root)


(Figure: a ray cast from the viewpoint through the projection
plane (screen) into the scene)
Mouse Interaction
  Compute the list of nodes under the clicked position
  Invoke an action on nodes that are hit, e.g. select, delete
case osgGA::GUIEventAdapter::PUSH:

   osgViewer::View* view = dynamic_cast<osgViewer::View*>(&aa);
   osgUtil::LineSegmentIntersector::Intersections intersections;

   // Clear previous selections
   for (unsigned int i = 0; i < targets.size(); i++) {
      targets[i]->setSelected(false);
   }

   // Find new selection based on click position
   if (view && view->computeIntersections(ea.getX(), ea.getY(), intersections)) {
      for (osgUtil::LineSegmentIntersector::Intersections::iterator iter = intersections.begin();
         iter != intersections.end(); iter++) {

            if (Target* target = dynamic_cast<Target*>(iter->nodePath.back())) {
               std::cout << "HIT!" << std::endl;
               target->setSelected(true);
               return true;
            }
       }
   }

   break;
Mouse Interaction Demo
Proximity Techniques
  Interaction based on
    the distance between a marker and the camera
    the distance between multiple markers
Single Marker Techniques: Proximity
  Use distance from camera to marker as
   input parameter
     e.g. Lean in close to examine
  Can use the osg::LOD class to show different content
   at different depth ranges (Image: OpenSG Consortium)
Single Marker Techniques: Proximity
// Load some models
osg::ref_ptr<osg::Node> farNode = osgDB::readNodeFile("media/far.osg");
osg::ref_ptr<osg::Node> closerNode = osgDB::readNodeFile("media/closer.osg");
osg::ref_ptr<osg::Node> nearNode = osgDB::readNodeFile("media/near.osg");

// Use a Level-Of-Detail node to show each model at different distance ranges.
osg::ref_ptr<osg::LOD> lod = new osg::LOD();
lod->addChild(farNode.get(), 500.0f, 10000.0f);      // Show the "far" node from 50cm to 10m away
lod->addChild(closerNode.get(), 200.0f, 500.0f);     // Show the "closer" node from 20cm to 50cm away
lod->addChild(nearNode.get(), 0.0f, 200.0f);         // Show the "near" node from 0cm to 20cm away

arTransform->addChild(lod.get());




  Define depth ranges for each node
  Add as many as you want
  Ranges can overlap
Single Marker Proximity Demo
Multiple Marker Concepts
  Interaction based on the relationship between
   markers
    e.g. When the distance between two markers
     decreases below threshold invoke an action
    Tangible User Interface
  Applications:
    Memory card games
    File operations
Multiple Marker Proximity

    Virtual Camera
      +- Transform A -> Switch A -> [ Model A1 | Model A2 ]
      +- Transform B -> Switch B -> [ Model B1 | Model B2 ]

    Distance > Threshold: each switch shows its first child (Model A1, Model B1)
Multiple Marker Proximity

    Virtual Camera
      +- Transform A -> Switch A -> [ Model A1 | Model A2 ]
      +- Transform B -> Switch B -> [ Model B1 | Model B2 ]

    Distance <= Threshold: each switch flips to its second child (Model A2, Model B2)
Multiple Marker Proximity
  Use a node callback to test for proximity and update the relevant nodes

virtual void operator()(osg::Node* node, osg::NodeVisitor* nv) {

    if (mMarkerA != NULL && mMarkerB != NULL && mSwitchA != NULL && mSwitchB != NULL) {
        if (mMarkerA->valid() && mMarkerB->valid()) {

            osg::Vec3 posA = mMarkerA->getTransform().getTrans();
            osg::Vec3 posB = mMarkerB->getTransform().getTrans();
            osg::Vec3 offset = posA - posB;
            float distance = offset.length();

            if (distance <= mThreshold) {
                if (mSwitchA->getNumChildren() > 1) mSwitchA->setSingleChildOn(1);
                if (mSwitchB->getNumChildren() > 1) mSwitchB->setSingleChildOn(1);
            } else {
                if (mSwitchA->getNumChildren() > 0) mSwitchA->setSingleChildOn(0);
                if (mSwitchB->getNumChildren() > 0) mSwitchB->setSingleChildOn(0);
            }
        }
    }

    traverse(node, nv);
}
Multiple Marker Proximity
Paddle Interaction
  Use one marker as a tool for selecting and
   manipulating objects (tangible user interface)
  Another marker provides a frame of reference
      A grid of markers can alleviate problems with occlusion




 MagicCup (Kato et al)    VOMAR (Kato et al)
Paddle Interaction
  Often useful to adopt a local coordinate system
     Allows the camera to move without disrupting Tlocal
     Places the paddle in the same coordinate system as the
      content on the grid
       -  Simplifies interaction
  osgART computes Tlocal using the osgART::LocalTransformationCallback
Tilt and Shake Interaction

  Detect types of paddle movement:
    Tilt
      -  gradual change in orientation
    Shake
      -  short, sudden changes in translation
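One way to realise the shake test (a sketch only, not the actual osgART or command_sub implementation; all names here are hypothetical): keep a short window of recent paddle positions and count rapid direction reversals of the translation deltas.

```cpp
#include <deque>
#include <cstddef>

// Hypothetical shake detector: counts direction reversals of the paddle's
// x-velocity over a short window of recent tracked positions.
class ShakeDetector {
public:
    ShakeDetector(std::size_t window = 8, int minReversals = 3)
        : mWindow(window), mMinReversals(minReversals) {}

    // Feed one tracked paddle x-position per frame; returns true on a shake.
    bool update(float x) {
        mSamples.push_back(x);
        if (mSamples.size() > mWindow) mSamples.pop_front();
        if (mSamples.size() < 3) return false;

        int reversals = 0;
        float prevDelta = mSamples[1] - mSamples[0];
        for (std::size_t i = 2; i < mSamples.size(); ++i) {
            float delta = mSamples[i] - mSamples[i - 1];
            if (delta * prevDelta < 0.0f) ++reversals;  // sign flip = reversal
            if (delta != 0.0f) prevDelta = delta;
        }
        return reversals >= mMinReversals;
    }

private:
    std::deque<float> mSamples;
    std::size_t mWindow;
    int mMinReversals;
};
```

A tilt detector would follow the same pattern but watch for a gradual, monotonic change in orientation rather than rapid sign changes in translation.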
Building Tangible AR Interfaces
        with ARToolKit
Required Code
  Calculating Camera Position
     Range to marker
  Loading Multiple Patterns/Models
  Interaction between objects
     Proximity
     Relative position/orientation
  Occlusion
     Stencil buffering
     Multi-marker tracking
Tangible AR Coordinate Frames
Local vs. Global Interactions
  Local
     Actions determined from single camera to marker
      transform
      -  shaking, appearance, relative position, range
  Global
     Actions determined from two relationships
      -  marker to camera, world to camera coords.
      -  Marker transform determined in world coordinates
           •  object tilt, absolute position, absolute rotation, hitting
Range-based Interaction
  Sample File: RangeTest.c

/* get the camera transformation */
arGetTransMat(&marker_info[k], marker_center,
  marker_width, marker_trans);

/* find the range */
Xpos = marker_trans[0][3];
Ypos = marker_trans[1][3];
Zpos = marker_trans[2][3];
range = sqrt(Xpos*Xpos+Ypos*Ypos+Zpos*Zpos);
Loading Multiple Patterns
  Sample File: LoadMulti.c
     Uses object.c to load
  Object Structure
   typedef struct {
     char       name[256];
     int        id;
     int        visible;
     double     marker_coord[4][2];
     double     trans[3][4];
     double     marker_width;
     double     marker_center[2];
   } ObjectData_T;
Finding Multiple Transforms
  Create object list
ObjectData_T        *object;

  Read in objects - in init( )
read_ObjData( char *name, int *objectnum );

  Find Transform – in mainLoop( )
for( i = 0; i < objectnum; i++ ) {
    /* check patterns */
    /* find transforms for each marker */
}
Drawing Multiple Objects
  Send the object list to the draw function
draw( object, objectnum );
  Draw each object individually
for( i = 0; i < objectnum; i++ ) {
   if( object[i].visible == 0 ) continue;
   argConvGlpara(object[i].trans, gl_para);
   draw_object( object[i].id, gl_para);
}
Proximity Based Interaction

  Sample File – CollideTest.c
  Detect distance between markers
  checkCollisions(object[0],object[1], DIST)
  If distance < collide distance
  Then change the model/perform interaction
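The distance test boils down to comparing the Euclidean distance between the two markers' translation columns against a threshold. A minimal standalone sketch (the function name and signature here are illustrative, not the actual CollideTest.c code; it follows the trans[3][4] convention used above, where the translation sits in column 3):

```cpp
#include <cmath>

// Hypothetical proximity test on two ARToolKit-style 3x4 marker transforms.
bool checkCollisionSketch(const double transA[3][4], const double transB[3][4],
                          double collideDist) {
    double dx = transA[0][3] - transB[0][3];
    double dy = transA[1][3] - transB[1][3];
    double dz = transA[2][3] - transB[2][3];
    return std::sqrt(dx * dx + dy * dy + dz * dz) < collideDist;
}
```

With marker transforms in millimetres, a collideDist of a few tens of millimetres gives the "markers touching" behaviour used in the memory-card style applications.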
Multi-marker Tracking
  Sample File – multiTest.c
  Multiple markers to establish a
   single coordinate frame
    Reading in a configuration file
    Tracking from sets of markers
    Careful camera calibration
MultiMarker Configuration File
  Sample File - Data/multi/marker.dat
  Contains list of all the patterns and their exact
   positions
    #the number of patterns to be recognized
    6

    #marker 1
    Data/multi/patt.a
    40.0
    0.0 0.0
    1.0000 0.0000 0.0000 -100.0000
    0.0000 1.0000 0.0000 50.0000
    0.0000 0.0000 1.0000 0.0000
    …
  Each marker entry lists the pattern file, the pattern width, the
   coordinate origin, and a 3x4 pattern transform relative to the
   global origin
Camera Transform Calculation
  Include <AR/arMulti.h>
  Link to libARMulti.lib
  In mainLoop()
     Detect markers as usual
      arDetectMarkerLite(dataPtr, thresh,
              &marker_info, &marker_num)
     Use MultiMarker Function
       if( (err = arMultiGetTransMat(marker_info,
                                     marker_num, config)) < 0 ) {
           argSwapBuffers();
           return;
       }
Paddle-based Interaction




Tracking single marker relative to multi-marker set
  - paddle contains single marker
Paddle Interaction Code
  Sample File – PaddleDemo.c
  Get paddle marker location + draw paddle before drawing
   background model
   paddleGetTrans(paddleInfo, marker_info,
       marker_flag, marker_num, &cparam);

  /* draw the paddle */
  if( paddleInfo->active ){
      draw_paddle( paddleInfo);
  }
draw_paddle uses a Stencil Buffer to increase realism
Paddle Interaction Code II
  Sample File – paddleDrawDemo.c
  Finds the paddle position relative to global coordinate frame:
   setBlobTrans(Num,paddle_trans[3][4],base_trans[3][4])
  Sample File – paddleTouch.c
  Finds the paddle position:
   findPaddlePos(&curPadPos,paddleInfo->trans,config->trans);
  Checks for collisions:
   checkCollision(&curPaddlePos,myTarget[i].pos,20.0)
General Tangible AR Library
  command_sub.c, command_sub.h
  Contains functions for recognizing a range of
   different paddle motions:
   int   check_shake( );
   int   check_punch( );
   int   check_incline( );
   int   check_pickup( );
   int   check_push( );
  E.g. to check the angle between the paddle and the base:
   check_incline(paddle->trans, base->trans, &ang)
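The incline check can be understood as measuring the angle between the two transforms' z-axes (the paddle's and the base's normals). A hypothetical standalone version of that computation (check_incline's real signature lives in command_sub.h; this only illustrates the underlying math):

```cpp
#include <cmath>

// Hypothetical incline computation: angle (radians) between the z-axes
// (third columns) of two 3x4 rotation+translation matrices.
double inclineAngle(const double paddle[3][4], const double base[3][4]) {
    double dot = 0.0, lenP = 0.0, lenB = 0.0;
    for (int i = 0; i < 3; ++i) {
        dot  += paddle[i][2] * base[i][2];   // z-axis dot product
        lenP += paddle[i][2] * paddle[i][2];
        lenB += base[i][2] * base[i][2];
    }
    return std::acos(dot / (std::sqrt(lenP) * std::sqrt(lenB)));
}
```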

ACSFv1EN-58255 AWS Academy Cloud Security Foundations.pptx
The AUB Centre for AI in Media Proposal.docx
How UI/UX Design Impacts User Retention in Mobile Apps.pdf
Optimiser vos workloads AI/ML sur Amazon EC2 et AWS Graviton
Electronic commerce courselecture one. Pdf
Encapsulation theory and applications.pdf
Chapter 3 Spatial Domain Image Processing.pdf
Detection-First SIEM: Rule Types, Dashboards, and Threat-Informed Strategy
Digital-Transformation-Roadmap-for-Companies.pptx
NewMind AI Weekly Chronicles - August'25 Week I

426 lecture 7: Designing AR Interfaces

  • 1. COSC 426: Augmented Reality Mark Billinghurst mark.billinghurst@hitlabnz.org Sept 5th 2012 Lecture 7: Designing AR Interfaces
  • 2. AR Interfaces   Browsing Interfaces   simple (conceptually!), unobtrusive   3D AR Interfaces   expressive, creative, require attention   Tangible Interfaces   Embedded into conventional environments   Tangible AR   Combines TUI input + AR display
  • 3. AR Interfaces as Data Browsers   2D/3D virtual objects are registered in 3D   “VR in Real World”   Interaction   2D/3D virtual viewpoint control   Applications   Visualization, training
  • 4. 3D AR Interfaces   Virtual objects displayed in 3D physical space and manipulated   HMDs and 6DOF head-tracking   6DOF hand trackers for input   Interaction   Viewpoint control   Traditional 3D user interface interaction: manipulation, selection, etc. (Kiyokawa, et al. 2000)
  • 5. Augmented Surfaces and Tangible Interfaces   Basic principles   Virtual objects are projected on a surface   Physical objects are used as controls for virtual objects   Support for collaboration
  • 6. Tangible Interfaces - Ambient   Dangling String   Jeremijenko 1995   Ambient ethernet monitor   Relies on peripheral cues   Ambient Fixtures   Dahley, Wisneski, Ishii 1998   Use natural material qualities for information display
  • 7. Back to the Real World   AR overcomes limitations of TUIs   enhance display possibilities   merge task/display space   provide public and private views   TUI + AR = Tangible AR   Apply TUI methods to AR interface design
  • 8.   Space-multiplexed   Many devices each with one function -  Quicker to use, more intuitive, clutter -  Real Toolbox   Time-multiplexed   One device with many functions -  Space efficient -  mouse
  • 9. Tangible AR: Tiles (Space Multiplexed)   Tiles semantics   data tiles   operation tiles   Operation on tiles   proximity   spatial arrangements   space-multiplexed
  • 10. Tangible AR: Time-multiplexed Interaction   Use of natural physical object manipulations to control virtual objects   VOMAR Demo   Catalog book: -  Turn over the page   Paddle operation: -  Push, shake, incline, hit, scoop
  • 11. Building Compelling AR Experiences experiences applications Interaction tools Authoring components Tracking, Display Sony CSL © 2004
  • 12. Interface Design Path   1/ Prototype Demonstration   2/ Adoption of interaction techniques from other interface metaphors (Augmented Reality)   3/ Development of new interface metaphors appropriate to the medium (Virtual Reality)   4/ Development of formal theoretical models for predicting and modeling user actions (Desktop WIMP)
  • 13. Interface metaphors   Designed to be similar to a physical entity but also has its own properties   e.g. desktop metaphor, search engine   Exploit user’s familiar knowledge, helping them to understand ‘the unfamiliar’   Conjures up the essence of the unfamiliar activity, enabling users to leverage this to understand more aspects of the unfamiliar functionality   People find it easier to learn and talk about what they are doing at the computer interface in terms familiar to them
  • 14. Example: The spreadsheet   Analogous to ledger sheet   Interactive and computational   Easy to understand   Greatly extending what accountants and others could do www.bricklin.com/history/refcards.htm
  • 15. Why was it so good?   It was simple, clear, and obvious to the users how to use the application and what it could do   “it is just a tool to allow others to work out their ideas and reduce the tedium of repeating the same calculations.”   capitalized on user’s familiarity with ledger sheets   Got the computer to perform a range of different calculations in response to user input
  • 16. Another classic   8010 Star office system targeted at workers not interested in computing per se   Spent several person-years at beginning working out the conceptual model   Simplified the electronic world, making it seem more familiar, less alien, and easier to learn Johnson et al (1989)
  • 18. Benefits of interface metaphors   Makes learning new systems easier   Helps users understand the underlying conceptual model   Can be innovative and enable the realm of computers and their applications to be made more accessible to a greater diversity of users
  • 19. Problems with interface metaphors (Nielsen, 1990)   Break conventional and cultural rules   e.g., recycle bin placed on desktop   Can constrain designers in the way they conceptualize a problem   Conflict with design principles   Forces users to only understand the system in terms of the metaphor   Designers can inadvertently use bad existing designs and transfer the bad parts over   Limits designers’ imagination with new conceptual models
  • 23.   PSDoom – killing processes
  • 24. AR Design Principles   Interface Components   Physical components   Display elements -  Visual/audio   Interaction metaphors Physical Display Elements Interaction Elements Metaphor Input Output
  • 25. Back to the Real World   AR overcomes limitations of TUIs   enhance display possibilities   merge task/display space   provide public and private views   TUI + AR = Tangible AR   Apply TUI methods to AR interface design
  • 26. AR Design Space Reality Virtual Reality Augmented Reality Physical Design Virtual Design
  • 27. Tangible AR Design Principles   Tangible AR Interfaces use TUI principles   Physical controllers for moving virtual content   Support for spatial 3D interaction techniques   Time and space multiplexed interaction   Support for multi-handed interaction   Match object affordances to task requirements   Support parallel activity with multiple objects   Allow collaboration between multiple users
  • 28.   Space-multiplexed   Many devices each with one function -  Quicker to use, more intuitive, clutter -  Tiles Interface, toolbox   Time-multiplexed   One device with many functions -  Space efficient -  VOMAR Interface, mouse
  • 29. Design of Objects   Objects   Purposely built – affordances   “Found” – repurposed   Existing – already at use in marketplace   Make affordances obvious (Norman)   Object affordances visible   Give feedback   Provide constraints   Use natural mapping   Use good cognitive model
  • 31. Affordances: to give a clue   Refers to an attribute of an object that allows people to know how to use it   e.g. a mouse button invites pushing, a door handle affords pulling   Norman (1988) used the term to discuss the design of everyday objects   It has since been much popularised in interaction design to discuss how to design interface objects   e.g. scrollbars to afford moving up and down, icons to afford clicking on
  • 32. "...the term affordance refers to the perceived and actual properties of the thing, primarily those fundamental properties that determine just how the thing could possibly be used. [...] Affordances provide strong clues to the operations of things. Plates are for pushing. Knobs are for turning. Slots are for inserting things into. Balls are for throwing or bouncing. When affordances are taken advantage of, the user knows what to do just by looking: no picture, label, or instruction needed." (Norman, The Psychology of Everyday Things 1988, p.9)
  • 33. Physical Affordances   Physical affordances: How do the following physical objects afford? Are they obvious?
  • 34. ‘Affordance’ and Interface Design?   Interfaces are virtual and do not have affordances like physical objects   Norman argues it does not make sense to talk about interfaces in terms of ‘real’ affordances   Instead interfaces are better conceptualized as ‘perceived’ affordances   Learned conventions of arbitrary mappings between action and effect at the interface   Some mappings are better than others
  • 35. Virtual Affordances   Virtual affordances How do the following screen objects afford? What if you were a novice user? Would you know what to do with them?
  • 36.   AR is mixture of physical affordance and virtual affordance   Physical   Tangible controllers and objects   Virtual   Virtual graphics and audio
  • 37. Case Study 1: 3D AR Lens Goal: Develop a lens based AR interface   MagicLenses   Developed at Xerox PARC in 1993   View a region of the workspace differently to the rest   Overlap MagicLenses to create composite effects
  • 38. 3D MagicLenses   MagicLenses extended to 3D (Viega et al. 1996)   Volumetric and flat lenses
  • 39. AR Lens Design Principles   Physical Components   Lens handle -  Virtual lens attached to real object   Display Elements   Lens view -  Reveal layers in dataset   Interaction Metaphor   Physically holding lens
  • 40. 3D AR Lenses: Model Viewer   Displays models made up of multiple parts   Each part can be shown or hidden through the lens   Allows the user to peer inside the model   Maintains focus + context
  • 42. AR Lens Implementation   [Figure: stencil-buffer rendering of the view outside the lens, inside the lens, and the virtual magnifying glass]
  • 43. AR FlexiLens Real handles/controllers with flexible AR lens
  • 44. Techniques based on AR Lenses   Object Selection   Select objects by targeting them with the lens   Information Filtering   Show different representations through the lens   Hide certain content to reduce clutter, look inside things   Move between AR and VR   Transition along the reality-virtuality continuum   Change our viewpoint to suit our needs
  • 45. Case Study 2 : LevelHead   Block based game
  • 46. Case Study 2: LevelHead   Physical Components   Real blocks   Display Elements   Virtual person and rooms   Interaction Metaphor   Blocks are rooms
  • 48. Case Study 3: AR Chemistry (Fjeld 2002)   Tangible AR chemistry education
  • 49. Goal: An AR application to test molecular structure in chemistry   Physical Components   Real book, rotation cube, scoop, tracking markers   Display Elements   AR atoms and molecules   Interaction Metaphor   Build your own molecule
  • 52. Case Study 4: Transitional Interfaces Goal: An AR interface supporting transitions from reality to virtual reality   Physical Components   Real book   Display Elements   AR and VR content   Interaction Metaphor   Book pages hold virtual scenes
  • 53. Milgram’s Continuum (1994)   Mixed Reality (MR) spans from Reality (Tangible Interfaces) through Augmented Reality (AR) and Augmented Virtuality (AV) to Virtual Reality (Virtual Interfaces)   Central Hypothesis   The next generation of interfaces will support transitions along the Reality-Virtuality continuum
  • 54. Transitions   Interfaces of the future will need to support transitions along the RV continuum   Augmented Reality is preferred for:   co-located collaboration   Immersive Virtual Reality is preferred for:   experiencing world immersively (egocentric)   sharing views   remote collaboration
  • 55. The MagicBook   Design Goals:   Allows user to move smoothly between reality and virtual reality   Support collaboration
  • 57. Features   Seamless transition between Reality and Virtuality   Reliance on real decreases as virtual increases   Supports egocentric and exocentric views   User can pick appropriate view   Computer becomes invisible   Consistent interface metaphors   Virtual content seems real   Supports collaboration
  • 58. Collaboration   Collaboration on multiple levels:   Physical Object   AR Object   Immersive Virtual Space   Egocentric + exocentric collaboration   multiple multi-scale users   Independent Views   Privacy, role division, scalability
  • 59. Technology   Reality   No technology   Augmented Reality   Camera – tracking   Switch – fly in   Virtual Reality   Compass – tracking   Press pad – move   Switch – fly out
  • 62. Summary   When designing AR interfaces, think of:   Physical Components -  Physical affordances   Virtual Components -  Virtual affordances   Interface Metaphors
  • 64. Keyboard and Mouse Interaction   Traditional input techniques   OSG provides a framework for handling keyboard and mouse input events (osgGA)   1. Subclass osgGA::GUIEventHandler   2. Handle events: mouse up / down / move / drag / scroll-wheel, key up / down   3. Add an instance of the new handler to the viewer
  • 65. Keyboard and Mouse Interaction   Create your own event handler class

    class KeyboardMouseEventHandler : public osgGA::GUIEventHandler {
    public:
        KeyboardMouseEventHandler() : osgGA::GUIEventHandler() { }

        virtual bool handle(const osgGA::GUIEventAdapter& ea, osgGA::GUIActionAdapter& aa,
                            osg::Object* obj, osg::NodeVisitor* nv) {
            switch (ea.getEventType()) {
                // Possible events we can handle
                case osgGA::GUIEventAdapter::PUSH: break;
                case osgGA::GUIEventAdapter::RELEASE: break;
                case osgGA::GUIEventAdapter::MOVE: break;
                case osgGA::GUIEventAdapter::DRAG: break;
                case osgGA::GUIEventAdapter::SCROLL: break;
                case osgGA::GUIEventAdapter::KEYUP: break;
                case osgGA::GUIEventAdapter::KEYDOWN: break;
            }
            return false;
        }
    };

  Add it to the viewer to receive events

    viewer.addEventHandler(new KeyboardMouseEventHandler());
  • 66. Keyboard Interaction   Handle W,A,S,D keys to move an object

    case osgGA::GUIEventAdapter::KEYDOWN: {
        switch (ea.getKey()) {
            case 'w': // Move forward 5mm
                localTransform->preMult(osg::Matrix::translate(0, -5, 0));
                return true;
            case 's': // Move back 5mm
                localTransform->preMult(osg::Matrix::translate(0, 5, 0));
                return true;
            case 'a': // Rotate 10 degrees left
                localTransform->preMult(osg::Matrix::rotate(osg::DegreesToRadians(10.0f), osg::Z_AXIS));
                return true;
            case 'd': // Rotate 10 degrees right
                localTransform->preMult(osg::Matrix::rotate(osg::DegreesToRadians(-10.0f), osg::Z_AXIS));
                return true;
            case ' ': // Reset the transformation
                localTransform->setMatrix(osg::Matrix::identity());
                return true;
        }
    } break;

  Setting up the transform that the handler manipulates:

    localTransform = new osg::MatrixTransform();
    localTransform->addChild(osgDB::readNodeFile("media/car.ive"));
    arTransform->addChild(localTransform.get());
  • 68. Mouse Interaction   The mouse is a pointing device   Use the mouse to select objects in an AR scene   OSG provides methods for ray-casting and intersection testing, returning an osg::NodePath (the path from the hit node all the way back to the root)   [Figure: a ray cast from the projection plane (screen) into the scene]
  • 69. Mouse Interaction   Compute the list of nodes under the clicked position   Invoke an action on nodes that are hit, e.g. select, delete

    case osgGA::GUIEventAdapter::PUSH:
        osgViewer::View* view = dynamic_cast<osgViewer::View*>(&aa);
        osgUtil::LineSegmentIntersector::Intersections intersections;

        // Clear previous selections
        for (unsigned int i = 0; i < targets.size(); i++) {
            targets[i]->setSelected(false);
        }

        // Find new selection based on click position
        if (view && view->computeIntersections(ea.getX(), ea.getY(), intersections)) {
            for (osgUtil::LineSegmentIntersector::Intersections::iterator iter = intersections.begin();
                 iter != intersections.end(); iter++) {
                if (Target* target = dynamic_cast<Target*>(iter->nodePath.back())) {
                    std::cout << "HIT!" << std::endl;
                    target->setSelected(true);
                    return true;
                }
            }
        }
        break;
  • 71. Proximity Techniques   Interaction based on   the distance between a marker and the camera   the distance between multiple markers
  • 72. Single Marker Techniques: Proximity   Use distance from camera to marker as input parameter   e.g. Lean in close to examine   Can use the osg::LOD class to show different content at different depth ranges Image: OpenSG Consortium
  • 73. Single Marker Techniques: Proximity

    // Load some models
    osg::ref_ptr<osg::Node> farNode = osgDB::readNodeFile("media/far.osg");
    osg::ref_ptr<osg::Node> closerNode = osgDB::readNodeFile("media/closer.osg");
    osg::ref_ptr<osg::Node> nearNode = osgDB::readNodeFile("media/near.osg");

    // Use a Level-Of-Detail node to show each model at different distance ranges.
    osg::ref_ptr<osg::LOD> lod = new osg::LOD();
    lod->addChild(farNode.get(), 500.0f, 10000.0f);  // Show the "far" node from 50cm to 10m away
    lod->addChild(closerNode.get(), 200.0f, 500.0f); // Show the "closer" node from 20cm to 50cm away
    lod->addChild(nearNode.get(), 0.0f, 200.0f);     // Show the "near" node from 0cm to 20cm away

    arTransform->addChild(lod.get());

  Define depth ranges for each node   Add as many as you want   Ranges can overlap
  • 75. Multiple Marker Concepts   Interaction based on the relationship between markers   e.g. When the distance between two markers decreases below threshold invoke an action   Tangible User Interface   Applications:   Memory card games   File operations
  • 76. Multiple Marker Proximity   Scene graph when Distance > Threshold: the Virtual Camera parents Transform A and Transform B; Switch A displays Model A1 (hiding Model A2) and Switch B displays Model B1 (hiding Model B2)
  • 77. Multiple Marker Proximity   When Distance <= Threshold the switches flip: Switch A displays Model A2 and Switch B displays Model B2
  • 78. Multiple Marker Proximity   Use a node callback to test for proximity and update the relevant nodes

    virtual void operator()(osg::Node* node, osg::NodeVisitor* nv) {
        if (mMarkerA != NULL && mMarkerB != NULL && mSwitchA != NULL && mSwitchB != NULL) {
            if (mMarkerA->valid() && mMarkerB->valid()) {
                osg::Vec3 posA = mMarkerA->getTransform().getTrans();
                osg::Vec3 posB = mMarkerB->getTransform().getTrans();
                osg::Vec3 offset = posA - posB;
                float distance = offset.length();

                if (distance <= mThreshold) {
                    if (mSwitchA->getNumChildren() > 1) mSwitchA->setSingleChildOn(1);
                    if (mSwitchB->getNumChildren() > 1) mSwitchB->setSingleChildOn(1);
                } else {
                    if (mSwitchA->getNumChildren() > 0) mSwitchA->setSingleChildOn(0);
                    if (mSwitchB->getNumChildren() > 0) mSwitchB->setSingleChildOn(0);
                }
            }
        }
        traverse(node, nv);
    }
  • 80. Paddle Interaction   Use one marker as a tool for selecting and manipulating objects (tangible user interface)   Another marker provides a frame of reference   A grid of markers can alleviate problems with occlusion   Examples: MagicCup (Kato et al.), VOMAR (Kato et al.)
  • 81. Paddle Interaction   Often useful to adopt a local coordinate system   Allows the camera to move without disrupting Tlocal   Places the paddle in the same coordinate system as the content on the grid   Simplifies interaction   osgART computes Tlocal using the osgART::LocalTransformationCallback
  • 82. Tilt and Shake Interaction   Detect types of paddle movement:   Tilt -  gradual change in orientation   Shake -  short, sudden changes in translation
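  The tilt/shake distinction above can be sketched with plain pose math, independent of osgART: a gradual change in orientation reads as a tilt, while rapid back-and-forth translation reads as a shake. The Pose struct, the classifyMotion function, and all thresholds below are illustrative assumptions, not the library's actual API:

    #include <cmath>
    #include <vector>

    struct Pose {
        double pos[3];   // paddle position in camera coordinates (mm)
        double zdir[3];  // paddle normal, assumed to be a unit vector
    };

    // Angle in degrees between two unit vectors.
    static double angleDeg(const double a[3], const double b[3]) {
        double dot = a[0]*b[0] + a[1]*b[1] + a[2]*b[2];
        if (dot > 1.0) dot = 1.0;
        if (dot < -1.0) dot = -1.0;
        return std::acos(dot) * (180.0 / std::acos(-1.0));
    }

    // Classify the most recent motion from a short pose history.
    // Thresholds (degrees of total tilt, mm per frame, reversal count) are illustrative.
    const char* classifyMotion(const std::vector<Pose>& history) {
        if (history.size() < 3) return "none";
        double totalTilt = 0.0;
        int signFlips = 0;
        double lastDx = 0.0;
        for (size_t i = 1; i < history.size(); i++) {
            totalTilt += angleDeg(history[i-1].zdir, history[i].zdir);
            double dx = history[i].pos[0] - history[i-1].pos[0];
            if (i > 1 && dx * lastDx < 0.0 && std::fabs(dx) > 5.0)
                signFlips++;  // direction reversal larger than 5 mm
            lastDx = dx;
        }
        if (signFlips >= 2) return "shake";   // short, sudden changes in translation
        if (totalTilt > 20.0) return "tilt";  // gradual change in orientation
        return "none";
    }

  In a real tracker loop the history would be the last few marker transforms, sampled each frame.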
  • 83. Building Tangible AR Interfaces with ARToolKit
  • 84. Required Code   Calculating Camera Position   Range to marker   Loading Multiple Patterns/Models   Interaction between objects   Proximity   Relative position/orientation   Occlusion   Stencil buffering   Multi-marker tracking
  • 86. Local vs. Global Interactions   Local   Actions determined from single camera to marker transform -  shaking, appearance, relative position, range   Global   Actions determined from two relationships -  marker to camera, world to camera coords. -  Marker transform determined in world coordinates •  object tilt, absolute position, absolute rotation, hitting
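  The "global" case above boils down to one matrix product: if both the base (world) marker and the paddle marker are tracked relative to the camera, the paddle's pose in world coordinates is inverse(T_base) * T_paddle. A minimal sketch with rigid 4x4 transforms in plain arrays (assuming both inputs map marker coordinates into camera coordinates; no ARToolKit types):

    // Rigid-body 4x4 transforms stored row-major; the bottom row is (0,0,0,1).

    // out = a * b
    void matMul(const double a[4][4], const double b[4][4], double out[4][4]) {
        for (int r = 0; r < 4; r++)
            for (int c = 0; c < 4; c++) {
                out[r][c] = 0.0;
                for (int k = 0; k < 4; k++) out[r][c] += a[r][k] * b[k][c];
            }
    }

    // Invert a rigid transform: the rotation transposes, the translation becomes -R^T t.
    void invertRigid(const double m[4][4], double out[4][4]) {
        for (int r = 0; r < 3; r++)
            for (int c = 0; c < 3; c++) out[r][c] = m[c][r];
        for (int r = 0; r < 3; r++)
            out[r][3] = -(out[r][0]*m[0][3] + out[r][1]*m[1][3] + out[r][2]*m[2][3]);
        out[3][0] = out[3][1] = out[3][2] = 0.0;
        out[3][3] = 1.0;
    }

    // Paddle pose in the base (world) frame from two camera-relative poses.
    void paddleInWorld(const double camToBase[4][4], const double camToPaddle[4][4],
                       double out[4][4]) {
        double baseInv[4][4];
        invertRigid(camToBase, baseInv);
        matMul(baseInv, camToPaddle, out);
    }

  This is the same computation osgART's LocalTransformationCallback performs: once the paddle is expressed in base coordinates, tilt, absolute position, and hitting can be evaluated without caring where the camera is.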
  • 87. Range-based Interaction   Sample File: RangeTest.c

    /* get the camera transformation */
    arGetTransMat(&marker_info[k], marker_center, marker_width, marker_trans);

    /* find the range */
    Xpos = marker_trans[0][3];
    Ypos = marker_trans[1][3];
    Zpos = marker_trans[2][3];
    range = sqrt(Xpos*Xpos + Ypos*Ypos + Zpos*Zpos);
  • 88. Loading Multiple Patterns   Sample File: LoadMulti.c   Uses object.c to load   Object Structure

    typedef struct {
        char   name[256];
        int    id;
        int    visible;
        double marker_coord[4][2];
        double trans[3][4];
        double marker_width;
        double marker_center[2];
    } ObjectData_T;
  • 89. Finding Multiple Transforms   Create object list

    ObjectData_T *object;

  Read in objects - in init()

    read_ObjData( char *name, int *objectnum );

  Find Transform - in mainLoop()

    for( i = 0; i < objectnum; i++ ) {
        /* check patterns */
        /* find transforms for each marker */
    }
  • 90. Drawing Multiple Objects   Send the object list to the draw function

    draw( object, objectnum );

  Draw each object individually

    for( i = 0; i < objectnum; i++ ) {
        if( object[i].visible == 0 ) continue;
        argConvGlpara(object[i].trans, gl_para);
        draw_object( object[i].id, gl_para);
    }
  • 91. Proximity Based Interaction   Sample File - CollideTest.c   Detect distance between markers

    checkCollisions(object[0], object[1], DIST);

  If the distance is less than the collide distance, then change the model / perform the interaction
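  The distance test behind a checkCollisions-style call can be sketched directly from the marker transforms: the last column of each 3x4 camera transform is the marker's position in camera coordinates, so collision is just a Euclidean-distance threshold. The MarkerObject struct and names here are illustrative stand-ins, not the sample's exact code:

    #include <cmath>

    // Minimal stand-in for ObjectData_T: only the fields the distance test needs.
    struct MarkerObject {
        int visible;
        double trans[3][4];  // camera transformation for this marker
    };

    // Return 1 when both markers are visible and closer together than maxDist (mm).
    int checkCollisions(const MarkerObject& a, const MarkerObject& b, double maxDist) {
        if (!a.visible || !b.visible) return 0;
        double dx = a.trans[0][3] - b.trans[0][3];
        double dy = a.trans[1][3] - b.trans[1][3];
        double dz = a.trans[2][3] - b.trans[2][3];
        return std::sqrt(dx*dx + dy*dy + dz*dz) < maxDist;
    }

  Skipping hidden markers first matters in practice: a marker that drops out of tracking keeps a stale transform, and testing against it would fire spurious collisions.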
  • 92. Multi-marker Tracking   Sample File – multiTest.c   Multiple markers to establish a single coordinate frame   Reading in a configuration file   Tracking from sets of markers   Careful camera calibration
  • 93. MultiMarker Configuration File   Sample File - Data/multi/marker.dat   Contains a list of all the patterns and their exact positions: the number of patterns, then for each marker its pattern file, pattern width, coordinate origin, and pattern transform relative to the global origin

    # the number of patterns to be recognized
    6

    # marker 1
    Data/multi/patt.a                  <- pattern file
    40.0                               <- pattern width
    0.0 0.0                            <- coordinate origin
    1.0000 0.0000 0.0000 -100.0000     <- pattern transform
    0.0000 1.0000 0.0000   50.0000        relative to global origin
    0.0000 0.0000 1.0000    0.0000
    …
  • 94. Camera Transform Calculation   Include <AR/arMulti.h>   Link to libARMulti.lib   In mainLoop()   Detect markers as usual

    arDetectMarkerLite(dataPtr, thresh, &marker_info, &marker_num);

  Use the MultiMarker function

    if( (err = arMultiGetTransMat(marker_info, marker_num, config)) < 0 ) {
        argSwapBuffers();
        return;
    }
  • 95. Paddle-based Interaction Tracking single marker relative to multi-marker set - paddle contains single marker
  • 96. Paddle Interaction Code   Sample File - PaddleDemo.c   Get the paddle marker location + draw the paddle before drawing the background model

    paddleGetTrans(paddleInfo, marker_info, marker_flag, marker_num, &cparam);

    /* draw the paddle */
    if( paddleInfo->active ) {
        draw_paddle( paddleInfo );
    }

  draw_paddle uses a stencil buffer to increase realism
  • 97. Paddle Interaction Code II   Sample File - paddleDrawDemo.c   Finds the paddle position relative to the global coordinate frame:

    setBlobTrans(Num, paddle_trans[3][4], base_trans[3][4]);

  Sample File - paddleTouch.c   Finds the paddle position:

    findPaddlePos(&curPadPos, paddleInfo->trans, config->trans);

  Checks for collisions:

    checkCollision(&curPaddlePos, myTarget[i].pos, 20.0);
  • 98. General Tangible AR Library   command_sub.c, command_sub.h   Contains functions for recognizing a range of different paddle motions:

    int check_shake( );
    int check_punch( );
    int check_incline( );
    int check_pickup( );
    int check_push( );

  E.g. to check the angle between the paddle and the base:

    check_incline(paddle->trans, base->trans, &ang);