Bhakti Patil
Assistant Professor,
AISSMS College of Engineering
Introduction
 HCI is used to empower people to interact with
different devices.
 To perform tasks and support activities efficiently.
 Systems are designed to be useful (helpful for the
task) and usable (easy to use).
 Human: the users of the system, single or many, with
diverse abilities.
 Computer: not only PCs but also many other devices.
 Interaction: communication between the user and the
computer, e.g., commands used to manipulate virtual
(on-screen) objects.
Why is it needed?
 A poor UI can lead to higher training costs and higher
usage costs.
 Ultimately it will lead to lower sales of the product.
 A poor UI may also lead to higher error rates (not
acceptable in critical systems).
 Success is dependent on the user experience.
Explicit HCI
 User is always at the center of interaction.
 System control is generated by, and responds to, the
human.
 The system is not driven internally but by the user.
 It is complex to coordinate many inputs from different
devices to perform concurrent activities.
HCI motivation
 To support more effective use.
 useful: accomplish a user task that the user requires to
be done
 usable: do the task easily, naturally, safely (without
danger of error)
 used: enrich the user experience by making it attractive,
engaging, fun, etc.
 The success of a product depends largely on both the
user’s experience with how usable a product is and
how useful it is in terms of the value it brings to them.
Heckel’s law and inverse law
 Heckel’s law states that the quality of the user interface
of an appliance is relatively unimportant in
determining its adoption by users if the perceived
value of the appliance is high.
 Heckel’s inverse law states that the importance of the
user interface design in the adoption of an appliance is
inversely proportional to the perceived value of the
appliance.
 Although the usability of the UI is important, the
overriding concern is the usefulness of the device
itself.
Implicit HCI motivation
 Explicit HCI (eHCI) design supports direct human
intervention.
 Pure explicit interaction is context free.
 Users must repeat and reconfigure the same
application access every session, even when the
session is essentially the same.
 It is also more about H2C (Human to Computer)
Interaction.
 Focus is on the human having a model of the system (a
mental model) rather than the system having a model
of the individual user.
Implicit HCI (iHCI)
 E.g., a person entering a dark room and the lights
switching on automatically (see the sketch below).
 It is an action performed by the user that is not
primarily aimed at interacting with a computerised
system, but which such a system understands as input.
 Context aware
 C2H (Computer to Human) Interaction
 The computer has a certain understanding of users’
behaviour in a given situation (additional input).
 More complex to design than eHCI.
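A minimal sketch of the dark-room example, assuming a hypothetical Lamp class and sensor readings: the system infers intent from context (presence plus darkness) rather than from any explicit command.

```python
# Implicit HCI sketch: entering a dark room is treated as input.
# The Lamp class, sensor values and threshold are illustrative assumptions.

class Lamp:
    def __init__(self):
        self.on = False

    def switch(self, on: bool) -> None:
        self.on = on
        print("lamp on" if on else "lamp off")


def on_sensor_event(lamp: Lamp, motion_detected: bool, ambient_lux: float,
                    lux_threshold: float = 50.0) -> None:
    """Infer intent from context (presence + darkness): the implicit input."""
    if motion_detected and ambient_lux < lux_threshold:
        lamp.switch(True)
    elif not motion_detected:
        lamp.switch(False)


lamp = Lamp()
on_sensor_event(lamp, motion_detected=True, ambient_lux=5.0)   # person enters dark room -> lamp on
on_sensor_event(lamp, motion_detected=False, ambient_lux=5.0)  # room empty -> lamp off
```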
Complexity and challenges of iHCI
 To accurately and reliably determine the user context
 Systems may also require time in order to learn and
build an accurate model of a user.
 Determining the user context may be intrusive and may
distract users’ attention.
Ubiquitous audio video content access
Continued…
 Individual voice, video and audio services are often not
aware of each other and are sometimes not user
configurable.
 E.g., when a voice call arrives, the TV and radio are
automatically paused or muted.
 Voice calls can be recorded on answerphone devices,
but they cannot easily be exported or reformatted.
 Supporting such dynamic service composition requires
a pervasive network infrastructure, standard
multimedia data exchange formats and certain
metadata (see the event-bus sketch below).
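A small sketch of such service composition, assuming a simple illustrative event bus rather than any real home-automation API: when an incoming-call event is published, every subscribed media service pauses itself.

```python
# Dynamic service composition sketch. EventBus and MediaService are
# illustrative; no real device or protocol API is implied.

class EventBus:
    def __init__(self):
        self.handlers = []

    def subscribe(self, handler):
        self.handlers.append(handler)

    def publish(self, event):
        for handler in self.handlers:
            handler(event)


class MediaService:
    def __init__(self, name, bus):
        self.name = name
        bus.subscribe(self.on_event)   # services become aware of each other's events

    def on_event(self, event):
        if event == "incoming_call":
            print(f"{self.name}: paused")


bus = EventBus()
MediaService("TV", bus)
MediaService("Radio", bus)
bus.publish("incoming_call")           # -> TV: paused / Radio: paused
```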
Ubiquitous information access and
E-books
 A personal digital calendar can be accessed through
different devices.
 Pull-type interaction allows users to initiate the
information exchange, e.g., searching the web.
 Push-type notification services are used to notify
customers of events, e.g., news (see the sketch after
this list).
 The PC remained the dominant interactive information
access device, but not in all settings, e.g., the kitchen,
which motivated the use of mobile devices.
 Electronic information access has advantages
compared to paper.
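A small sketch contrasting pull-type and push-type access; the NewsService class and its methods are illustrative, not a real service API.

```python
# Pull vs. push information access sketch.

class NewsService:
    def __init__(self):
        self.items = []
        self.subscribers = []          # callbacks registered for push delivery

    # Pull: the user initiates the exchange, e.g. searching or fetching.
    def search(self, query):
        return [item for item in self.items if query.lower() in item.lower()]

    # Push: the user registers interest once; the service notifies on events.
    def subscribe(self, callback):
        self.subscribers.append(callback)

    def publish(self, item):
        self.items.append(item)
        for notify in self.subscribers:
            notify(item)               # push the event to every subscriber


service = NewsService()
service.subscribe(lambda item: print("PUSH:", item))
service.publish("Local weather warning issued")
print("PULL:", service.search("weather"))
```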
Universal local control of ICT system
User-Awareness and Personal Spaces
 Personalization allows the system to be customized to
individual users.
 The configuration of services can also be personalized.
 E.g., coordinating and configuring different home
appliances.
 A complex issue is managing shared social spaces (see
the sketch below).
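A small sketch of personalization and the shared-space problem, with hypothetical user profiles and a crude averaging rule for resolving conflicting preferences.

```python
# Personalization sketch: profile contents and the conflict-resolution rule
# (simple averaging) are illustrative assumptions.

PROFILES = {
    "asha": {"thermostat_c": 22, "tv_volume": 8},
    "ravi": {"thermostat_c": 19, "tv_volume": 15},
}

def settings_for(present_users):
    """Return appliance settings for the users currently sharing the room."""
    prefs = [PROFILES[u] for u in present_users]
    return {key: round(sum(p[key] for p in prefs) / len(prefs))
            for key in prefs[0]}

print(settings_for(["asha"]))           # one user: their own preferences
print(settings_for(["asha", "ravi"]))   # shared space: a compromise
```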
Diversity of ICT Device Interaction
 Usually a PC is a device with a programmable chip,
haptic I/O and a visual UI.
 Embedded systems in devices that perform specialised
tasks have different I/O interfaces.
 Devices can be characterised based on:
 Size
 Haptic input
 Interaction modalities
 Single user versus shared interaction
 Posture for human operator
 Distance of output display to input control
 Position during operation
 Connectivity
 Tasking
 Multimedia content access
 Integration
UI and Interaction for Four Widely Used
Devices
 Personal computers,
 Handheld mobile devices used for communication,
 Games consoles, and
 Remote-controlled AV displays, players and recorders.
PC Interface
 Early interfaces were command based.
 The WIMP interface was introduced later and had
become the dominant desktop interface by the
mid-1990s.
 WIMP: not only commands but also interactive screen
objects can be controlled.
 WIMP (windows, icons, menus, pointer devices).
 The most dominant PC interface; it supports direct
manipulation.
WIMP interface
 The WIMP interface is associated with a desktop
metaphor.
 Documents are presented as windowed areas of the
screen.
 Windows can be arranged in stacks, created,
discarded, moved, organized and modified on the
display screen using the pointer device (direct
manipulation).
 Advantages of the WIMP UI over the command UI:
 The order of multiple commands can be ad hoc.
 Users do not need to remember command names.
Dialogues in WIMP
 Dialogues: users are informed about pertinent
information that they must acknowledge receipt of, or
they are asked for input to constrain a query.
 Typically displayed as a pop-up window called a dialog
box.
 E.g., form-filling dialog interfaces are used by many
applications for alphanumeric data input (see the
sketch below).
 These enable applications to receive data input in a
structured way, reducing the processing needed by the
computer.
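A small sketch of why form filling reduces processing: each field is validated against a simple constraint, so the application receives data in a known structure. The field names and rules here are illustrative assumptions.

```python
# Form-filling dialog sketch: structured, validated input.

FORM_FIELDS = {
    "name":  lambda v: v.strip() != "",
    "age":   lambda v: v.isdigit() and 0 < int(v) < 120,
    "email": lambda v: "@" in v and "." in v.split("@")[-1],
}

def fill_form(raw_input: dict) -> dict:
    """Return validated form data, or raise ValueError naming the bad field."""
    result = {}
    for field, is_valid in FORM_FIELDS.items():
        value = raw_input.get(field, "")
        if not is_valid(value):
            raise ValueError(f"Invalid or missing value for '{field}'")
        result[field] = value
    return result

print(fill_form({"name": "Asha", "age": "21", "email": "asha@example.com"}))
```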
Drawbacks of WIMP
 not necessarily an improvement for visually impaired
users
 consumes screen space which is more critical in lower
resolution displays;
 the meaning of visual representations may be unclear
or misleading to specific users;
 mouse pointer control and input require good hand-eye
coordination and can be slow.
Mobile handheld device interface
 PC-style WIMP interfaces are not effective on mobile
devices:
 The display area is smaller.
 It is impractical to have several windows open at a time.
 It can be difficult to locate windows and icons if they
are deeply stacked one on another.
 It is difficult to resize windows.
 Screen navigation using fingers on a touch pad, or via
an external device, may be too big and unwieldy for
small devices.
 In addition, the keyboard is smaller for user input and
there is a greater variety of input devices.
 Instead of using the inbuilt device interface, the device
can be attached to different kinds of external input
interfaces.
Handling limited key input
 Different input modes compensate for the limited
number of keys and the minimum practical key size.
 The same key interaction can lead to different actions
depending on the mode.
 Multi-tap, T9, Fastap, soft keys and soft keyboards (see
the multi-tap sketch below).
1. Multi-tap: 12 keys, each mapped to several characters
selected by repeated presses (explicit).
2. T9: predictive text entry that enhances the multi-tap
experience (implicit).
3. Fastap: two keypads, one with smaller keys raised at
the corners above the other keypad keys.
 The upper one is used for alphabetic input, the lower one for
number input.
 If several keys are hit at once, a technique called passive
chording allows the system to work out what the user
intended to enter.
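A small sketch of explicit multi-tap decoding on a 12-key keypad; real phones also use time-outs to separate consecutive letters on the same key, which is omitted here.

```python
# Multi-tap sketch: pressing a key repeatedly cycles through its letters.

KEYPAD = {
    "2": "abc", "3": "def", "4": "ghi", "5": "jkl",
    "6": "mno", "7": "pqrs", "8": "tuv", "9": "wxyz",
}

def decode_multitap(key_presses):
    """key_presses: list of (key, press_count) tuples, e.g. [('4', 2), ('4', 3)]."""
    text = []
    for key, count in key_presses:
        letters = KEYPAD[key]
        text.append(letters[(count - 1) % len(letters)])  # wrap around the key
    return "".join(text)

# 'h' = key 4 pressed twice, 'i' = key 4 pressed three times
print(decode_multitap([("4", 2), ("4", 3)]))   # -> "hi"
```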
Continued..
4. Soft keys: two keys (left and right) at the top of the
keypad whose functions are determined by information
on the screen (see the soft-key sketch below).
 This allows the same keys to be reused to support application
and task specific choices.
 Instead of having two soft keys, a whole mini keyboard, a soft
keyboard, could also be displayed if there is sufficient screen
space.
 Internal pointer devices: track pads, roller pads, mini
joysticks and keyboard arrow keys can be used to move
the pointer on the screen.
 Touch screens, whose areas can be activated using a
physical stick-like pointer, pen or finger.
 Auditory interfaces: voice commands.
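A small sketch of soft keys: the same two physical keys are reused, and their meaning is looked up from whatever screen is currently displayed. The screen names and actions below are illustrative assumptions.

```python
# Soft-key sketch: key meaning depends on the current screen.

SOFT_KEY_MAP = {
    "home":      {"left": "Menu",    "right": "Contacts"},
    "messaging": {"left": "Options", "right": "Back"},
    "call":      {"left": "Mute",    "right": "End call"},
}

def soft_key_pressed(current_screen: str, key: str) -> str:
    """Return the action bound to a physical soft key on the current screen."""
    return SOFT_KEY_MAP[current_screen][key]

print(soft_key_pressed("messaging", "left"))   # -> "Options"
```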
Handling Limited Output
 If the output is too large, it can be cropped, the content
resolution can be reduced, or a zooming interface can
be used: zooming (in and out) coupled with scrolling
(up and down) and panning (side to side) controls (see
the viewport sketch below).
 Marking which part of the whole view is currently
zoomed in is useful for orientation.
 Peephole displays: sensors determine the position of
the device in relation to the user.
 Use of projectors or organic (foldable) displays.
 Audible outputs (when the visual output is already engaged).
 Haptic outputs (the urgency of a call can be conveyed
by the vibrations).
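A small sketch of a zooming interface for a small screen, assuming an illustrative Viewport class (not any real toolkit API): zoom and pan adjust which part of a larger document is visible, and the visible rectangle can be used to mark the zoomed-in region on an overview.

```python
# Zoom/pan viewport sketch for a small display.

class Viewport:
    def __init__(self, doc_w, doc_h, view_w, view_h):
        self.doc_w, self.doc_h = doc_w, doc_h
        self.view_w, self.view_h = view_w, view_h
        self.x = self.y = 0.0       # top-left corner of the visible region
        self.zoom = 1.0             # 1.0 shows content at natural size

    def _clamp(self):
        # Keep the visible region inside the document at the current zoom.
        vis_w = self.view_w / self.zoom
        vis_h = self.view_h / self.zoom
        self.x = max(0.0, min(self.x, self.doc_w - vis_w))
        self.y = max(0.0, min(self.y, self.doc_h - vis_h))

    def zoom_in(self, factor=1.25):
        self.zoom *= factor
        self._clamp()

    def zoom_out(self, factor=1.25):
        self.zoom /= factor
        self._clamp()

    def pan(self, dx, dy):          # side to side / up and down
        self.x += dx
        self.y += dy
        self._clamp()

    def visible_rect(self):
        # (x, y, width, height) in document coordinates; useful for marking,
        # on an overview, which part of the whole view is currently shown.
        return (self.x, self.y, self.view_w / self.zoom, self.view_h / self.zoom)


vp = Viewport(doc_w=2000, doc_h=3000, view_w=320, view_h=480)
vp.zoom_in()
vp.pan(500, 800)
print(vp.visible_rect())   # -> (500.0, 800.0, 256.0, 384.0)
```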
Games console interface
 There have been seven generations of games consoles,
based upon the technologies they use.
 Seventh-generation games consoles include the
Nintendo Wii.
 The Wii uses microsensors, in the form of
accelerometers located inside the controller, and an
infrared detector to sense its position in 3D space.
 The scoring system is tuned to the interface (as the
game progresses, it becomes more difficult to score
points).
 The Wii wand is a natural interface: it is easy for the
user to interact with the system and become immersed
in the virtual game environment.
Localised remote control
 Aims to reduce the degree of manual interaction.
 Design issue: features overlap, so devices need to be
orchestrated with respect to a common feature.
 E.g., increasing the volume of a home entertainment
system.
 Solution: a universal localised remote control.
Universal local remote control
Hidden UI Via Basic Smart Devices
 WIMP interfaces are relatively obtrusive: they need
users to attend to them continuously.
 More natural interfaces are required: gestures, other
senses, speech, etc.
 Examples: multimodal interfaces, gesture interfaces,
natural language interfaces.
Multimodal interface
 Modality: a mode of human interaction using one of the
five human senses.
 Device equivalents of the human senses include
cameras (sight), touch screens (touch), microphones
(hearing) and chemical sensors (smell or taste).
 The majority of ICT systems are single mode, but
human interaction is multimodal.
 E.g., attentive interfaces, which rely on the user’s
attention; wearable interfaces, worn by the user;
vision-based human motion analysis systems.
Gesture interface
 Gestures are meaningful and expressive body
movements.
 They can be sensed by:
 wearable devices, e.g., gloves
 magnetic trackers
 body attachments, e.g., accelerometers, gyroscopes
 computer vision techniques
 Two types of gestures:
 Contact gestures, e.g., a handshake or use of a touchscreen
 Contactless gestures, e.g., waving at someone
 E.g., Sony’s EyeToy; many current devices contain
gyroscopes (see the accelerometer sketch below).
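A small sketch of recognising a simple contactless gesture (a shake) from accelerometer readings; the threshold, peak count and sample format are assumptions, not taken from any particular device API.

```python
# Shake-gesture sketch from raw accelerometer samples.

import math

def detect_shake(samples, threshold=2.5, min_peaks=3):
    """samples: list of (ax, ay, az) accelerometer readings in g.
    Returns True if enough high-acceleration peaks occur."""
    peaks = 0
    for ax, ay, az in samples:
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        if magnitude > threshold:      # a vigorous movement
            peaks += 1
    return peaks >= min_peaks

readings = [(0.0, 0.1, 1.0), (2.8, 0.3, 1.1), (3.0, 0.2, 0.9),
            (0.1, 0.0, 1.0), (2.9, 0.4, 1.2)]
print(detect_shake(readings))          # -> True
```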
Reflective Versus Active Displays
 E-book readers are lightweight, thin, long-lasting on a
single charge, pocket-sized devices with touch screens
enabling pages to be turned by touch.
 They differ in the type of display they use: reflective.
 No energy is required to hold an image, they are
readable in sunlight, and they can be read from any
direction.
 Based on electrophoretic displays (EPDs).
 EPDs use the electrophoretic movement of charged
particles suspended in a solvent.
 Displayed text and images can be electrically written
or erased repeatedly.
Combining I/O interfaces
 Resistive vs. capacitive touchscreens.
 TUIs (tangible user interfaces) augment the real
physical world by connecting digital information to
everyday physical objects and environments.
 Examples of TUIs: Ambient Wood, DataTiles.
 Organic interfaces resemble natural human-physical
and human-human interaction.
 E.g., Organic Light Emitting Diode (OLED) displays.
Advantages of OLED
 Lower cost
 Lightweight and flexible
 Good resolution
 Wider viewing angles and improved brightness
 Power efficiency
 E.g., Samsung Galaxy Note Edge, LG G Flex.
Auditory interface
 Provides a communicative connection between the
machine and the user.
 A replacement for keyboard text entry.
 Useful for visually impaired users.
 Challenges: noise removal, ambiguity of commands.
Hidden UI Via Wearable and
Implanted Devices
 Devices can be accompanied, wearable or implanted.
 Accompanied: external to the body and not attached,
e.g., mobile devices, smart cards.
 Wearable: external but attached to the body, e.g.,
hearing aids, earpieces.
 Implants: internal to the body, typically for medical
purposes.
 E.g., EyeTap, head-up displays, clothes as computers,
computer implants.
Human centered design life cycle
Activities of HCD
1. Define the context of use in terms of scenarios, use
cases, task models, and the ICT, physical and social
environment context of use.
2. The stakeholder and organizational requirements
must be specified.
3. Multiple alternative UI designs need to be built.
4. Designs need to be validated against user
requirements.