FR100: You Call That Intuitive? Principles for Integrating User Experience Into Training
Chris King, Project Director
Jeff Barnes, User Experience Designer
You Call That Intuitive?
What is User Experience?
User Experience in Training Design
User Interface Design Examples
Conclusion
What is User Experience?
(Word cloud of related terms: Information Architecture, Section 508 Compliance, Interface Design, UI Development, User Centered Design, Interface Development, Interaction Design, Content Management Systems, Usability, Human Computer Interaction, Wireframe, Web Development, User Experience, Content Strategy, Personas, Dashboard Design, Human Factors, Websites, Accessibility, Search Engine Optimization, Mobile Usability, Web Analytics, Help Desk.)
User Experience in Training Design
Efficiency: the time it takes to complete a task or learn a system.
Effectiveness: the number of errors in task completion.
Satisfaction: the user’s perception of how the system makes them feel, based on their expectations.
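These three measures are concrete enough to compute from simple user-testing logs. Below is a minimal sketch (not from the presentation) that assumes you have recorded each participant's time on task, error count, and a 1-5 satisfaction rating; the field names and data are hypothetical.

```python
from statistics import mean

# Hypothetical user-testing results: one record per participant.
# time_s = seconds to complete the task, errors = mistakes made,
# satisfaction = post-task rating on a 1-5 scale.
sessions = [
    {"time_s": 142, "errors": 0, "satisfaction": 5},
    {"time_s": 305, "errors": 3, "satisfaction": 2},
    {"time_s": 198, "errors": 1, "satisfaction": 4},
]

efficiency = mean(s["time_s"] for s in sessions)      # average time on task
effectiveness = mean(s["errors"] for s in sessions)   # average errors per task
satisfaction = mean(s["satisfaction"] for s in sessions)

print(f"Efficiency:    {efficiency:.0f} s average time on task")
print(f"Effectiveness: {effectiveness:.1f} errors per participant")
print(f"Satisfaction:  {satisfaction:.1f} / 5 average rating")
```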
User Experience in Training Design
Apply UX principles to:
Increase user satisfaction
Achieve a higher conversion rate (see the sketch below)
Reduce helpdesk tickets
Lighten cognitive load
Let the technology fade into the background
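The “conversion rate” outcome above is the share of users who complete a key action; the speaker notes give clicking a “Learn more” button as the example. A minimal sketch of the arithmetic, with hypothetical counts:

```python
# Hypothetical analytics counts for one key performance indicator:
# of all users who saw the page, how many clicked "Learn more"?
users_who_saw_page = 4_200
users_who_clicked = 630

conversion_rate = users_who_clicked / users_who_saw_page
print(f"Conversion rate: {conversion_rate:.1%}")  # Conversion rate: 15.0%
```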
User Experience in Training Design
(Diagram: the iterative ADDIE instructional design cycle of Analyze, Design, Develop, Implement, and Evaluate.)
User Experience in Training Design
(Diagram: UX tools overlaid on the ADDIE phases of Analyze, Design, Develop, Implement, and Evaluate: User Research, Personas, Heuristics, Prototypes, IA, Agile/Iteration, Interface, 508/Accessibility, Instructions, Business Process, Communication, User Surveys, User Testing, Web Analytics, and Help Desk.)
Audience Analysis
• User Research: interviews, contextual observations, focus groups, web analytics, and helpdesk or user support data.
• Personas: a profile of a user that details their goals, frustrations, and demographics (captured as structured data in the sketch below).
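Personas are easier to keep grounded in research data when captured as structured records rather than prose. A minimal sketch (not from the presentation); the fields and the example persona are hypothetical, echoing the “Mathew” persona mentioned in the speaker notes.

```python
from dataclasses import dataclass, field

@dataclass
class Persona:
    """A lightweight user profile used to guide design decisions."""
    name: str
    role: str
    age: int
    tech_comfort: str  # e.g. "low", "medium", "high"
    goals: list[str] = field(default_factory=list)
    frustrations: list[str] = field(default_factory=list)

# Hypothetical example persona.
mathew = Persona(
    name="Mathew",
    role="Field claims processor",
    age=52,
    tech_comfort="low",
    goals=["Finish mandatory training quickly", "Get the completion certificate"],
    frustrations=["Unclear navigation", "Videos that fail to load"],
)
print(f"Would this design make sense to {mathew.name}?")
```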
Design Principles (Heuristics)
Visibility of system status
Match between system and the real world
User control and freedom
Consistency and standards
Error prevention
Recognition rather than recall
Flexibility and efficiency of use
Aesthetic and minimalist design
Help users recognize, diagnose, and recover from errors
Help and documentation
Copyright © 2005 by Jakob Nielsen. ISSN 1548-5552
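A heuristic review is often recorded as a simple checklist: each finding is tied to one of the heuristics above and given a severity rating. A minimal sketch of one way to structure it (the 0-4 severity scale and the findings shown are hypothetical, though both issues echo examples later in this deck):

```python
# Hypothetical heuristic-review findings: (heuristic, issue, severity 0-4,
# where 0 = not a problem and 4 = usability catastrophe).
findings = [
    ("Consistency and standards", "Nav button disappears on quiz slides", 3),
    ("Visibility of system status", "No progress indicator in the player", 2),
    ("Error prevention", "Exit button has no confirmation step", 3),
]

# List the worst problems first so fixes can be prioritized.
for heuristic, issue, severity in sorted(findings, key=lambda f: -f[2]):
    print(f"[severity {severity}] {heuristic}: {issue}")
```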
Iterative Design
Traditional ADDIE puts the bulk of testing after deployment. The result is rework and redeployment four or five times.
Iterative Design
An iterative process (Agile, LLAMA, SAM, etc.) has evaluation, identification of issues, and redesign all taking place before the training has been deployed.
Iterative Design
By conducting heuristic reviews and quick user tests, and by getting feedback earlier in the design lifecycle, you can save time and reduce waste in both the development and deployment phases.
Supplemental Documentation
Instructions and Communication
Help and Training Aids
Completion Certificates
Supplemental documentation such as email communication, instructions, and training aids is integral to the user experience and deserves the same amount of consideration as your course itself.
Evaluating User Experience
User Surveys. Sample questions:
Did you have any difficulty accessing this training?
Did you have any difficulty getting your certificate after you completed the training?
Was the navigator easy to use?
What do you find most frustrating about this training?
Ask users directly by adding or modifying one or two questions on your Level 1 evaluations. A bad user experience will reduce Level 1 results regardless of the instructional integrity of the content; a worked example follows.
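Here is a minimal sketch (not from the presentation) of splitting Level 1 satisfaction scores by one added UX question; the responses are hypothetical.

```python
from statistics import mean

# Hypothetical Level 1 responses: overall satisfaction (1-5) plus the added
# question "Did you have any difficulty accessing this training?"
responses = [
    {"satisfaction": 5, "access_difficulty": False},
    {"satisfaction": 4, "access_difficulty": False},
    {"satisfaction": 2, "access_difficulty": True},
    {"satisfaction": 1, "access_difficulty": True},
    {"satisfaction": 4, "access_difficulty": False},
]

ok = [r["satisfaction"] for r in responses if not r["access_difficulty"]]
blocked = [r["satisfaction"] for r in responses if r["access_difficulty"]]

# A large gap suggests the UX, not the content, is dragging L1 scores down.
print(f"No access issues:  mean L1 = {mean(ok):.1f}")
print(f"Had access issues: mean L1 = {mean(blocked):.1f}")
```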
Evaluating User Experience
User Testing: paper prototypes, user observations, formal user testing.
Collect information directly from the source. The more information you can collect from real users before or after the launch, the better off you will be in the end.
Consistency 1
Is it clear where the user should click on this screen to proceed? The nav button has disappeared!
Consistency 1
Set your user’s expectations: tell them where to click if you break the navigation convention.
Consistency 2
All navigation has disappeared! The user must now click on a button in the content area, again breaking the navigation convention.
Consistency 3
Sticky note, meant to be the “more information” resource.
Consistency 3
Why is it over here now?
Consistency 3
And a third location in the same course. Inconsistency looks disorganized and can make the interface more noticeable to the user.
Error Prevention 1
Good example: the process is very clear; pick a response option, then click Submit.
Error Prevention 2
The user is already looking at the portion of the screen where the answer appears. The navigation fades into the background, and the user knows to select “Continue.”
Match the Real World
Good example: here the interface presents the alphabet in a continuous format, as the user would expect. Actionable letters are highlighted; bonus points for the alternate navigation presented on the left side.
Navigator Controls
Here there is a list of menu options on the left, exit in the middle, and a navigation control on the right, yet no visual distinction is made between these different types of buttons. Icons should be used consistently, not mixed with text.
Navigator Controls
An example of good controls. The buttons on the bottom have clear icons, and there is a solid navigation strategy in the bottom right, which includes a progress indicator. The top right has a list of menu options that consistently open pop-up windows.
Visibility of System Status
Good example: the progress bar shows the portion of the course the user has completed, and the bottom bar shows progress through the current video. This helps the user quickly see where they are and what the system is doing.
Visual Layout
The training automatically loads the player at a small, almost unusable size, opened on top of a cluttered page. This can be confusing, and a poor user interface reduces the learner’s trust in our competence, and therefore their faith in the content.
Visual Layout
This screen is presented by the LMS; the developer has zero control. What to do with this mess? A job aid that explains what the user is seeing and how to interact with the screen.
Summary
• User experience is about efficiency, effectiveness, and satisfaction.
• You can use these techniques to improve all three of these criteria.
(Diagram repeated from earlier: UX tools overlaid on the ADDIE phases of Analyze, Design, Develop, Implement, and Evaluate.)
Thank you!
Chris King chris@crklearning.com
Jeff Barnes Jeff_Barnes@sra.com
Editor's Notes
  • #3: CHRIS: Welcome to “You Call That Intuitive?” (or should it really say “You call that Intuitive?”). Today Jeff and I will be talking about User Experience, or “UX,” and how it can contribute to training design. We are going to walk you through the key principles and practices for successfully implementing UX to improve your training courses. In the second half of our session we will walk you through several examples of both good and bad design. Disclaimer: the examples are from my work; Jeff was kind enough to include them here, and names have been changed to protect the innocent! Introductions are in order! I’m Chris King… JEFF: And I’m Jeff Barnes. I’ve been applying User Experience principles to training, websites, and online tools for the Federal Government for six years, including the Department of Veterans Affairs “MyCareer@VA” career development website. I have a graduate degree in Human Factors Psychology from George Mason University.
  • #4: JEFF: User experience (UX) is a broad term used to describe all aspects of a person’s interaction with a system, application, or tool. This includes the user interface, information architecture, graphics, interaction design, content strategy, communication, and even the guides and helpdesk support. The goal of UX design is to create products that give users a positive experience through the measurement and improvement of the product’s efficiency, effectiveness, and satisfaction.
  • #5: JEFF: So what do we mean here? Efficiency helps us to understand how quickly a user can accomplish a task or how long it takes them to learn an interaction. When we think of good design we often use words like “usable” or “intuitive”; these words describe how easy it is for a user to adapt to a new system or to become proficient in its use. To demonstrate this point: several recent studies have indicated the magical “three click” rule in web design is not necessarily hard-and-fast for user satisfaction. Frequently, a user’s satisfaction can be more closely attributed to the difficulty of the decisions they make along the way rather than to the number of clicks it takes. Effectiveness is a measure of how many errors the user makes when interacting with a system. This can be anything from clicking on the wrong link to not being able to locate the desired information. Effectiveness is interesting because the impact of errors can vary so greatly. If a user accidentally presses “Exit” in your training course, the impact of this error depends on whether you designed a verification step into your course: if you did, the impact is minimized and the user selects “Cancel”; if you did not, the user exits the course and loses their progress. Which leads me to user satisfaction. User satisfaction is how we measure the user’s perspective on their interaction; this can be done through observation or surveys, and it gives us insight into the user’s attitude toward the product. It is heavily dependent on the user’s perception of the system and their expectations for the interaction. Sometimes, especially with web-based training courses, a measure of success can be as simple as hearing a user say, “That was less painful than I thought it would be!”
  • #6: CHRIS: So, how do these principles apply to training design? What is there to be gained from integrating user experience practices into a training design lifecycle? These are some of the outcomes you can expect from focusing more on the user’s experience in training design. Since most folks probably aren't familiar with the term “conversions,” Jeff, can you explain that bullet? JEFF: Sure, conversion rate is one way we measure the success of our design. We identify key performance indicators like how many users click on the “learn more” button, then measure the number or percentage of “conversions” based on the entire user population. This tells us how successful that button was in the wild. I would also like to call special attention to the last two points on the left side. The goal of training is to provide learning outcomes for the users. The best way to do that is to minimize the impact of the technology and the delivery method so the users can just focus on the learning content. Users don’t typically notice good UX. When the experience is fluid and intuitive, the user is so focused on the content that they don’t pay attention to the design.
  • #7: CHRIS: Our good friend ADDIE: it’s iterative, very similar to the UX process, and that makes it easier to apply the principles to the different phases of ADDIE. In the ADDIE instructional design methodology, an iterative process of analysis through evaluation allows the designers to understand their audience, design the training to their needs, and evaluate whether those needs are met. This process is very similar to user experience design, and the principles and tools of UX can easily be integrated into this model. The next slide shows an overlay of the UX tools on ADDIE and how you can apply them to learning design.
  • #8: JEFF: As you can see, at each step in the instructional design methodology, user experience can be incorporated to improve the user outcomes. These tools require varying levels of time and resources to apply, and can serve as a toolbox for instructional designers to choose from. I will go into detail about a few of these methods, and how they can be applied effectively in the training design lifecycle.
  • #9: JEFF: Knowing your users is critical to designing a product that will provide a great experience. We use research methods such as interviewing, contextual observations, focus groups, web analytics, and helpdesk or user support data to gain a better understanding of the user groups before moving forward with development. Some key distinctions in our data collection process that affect user experience include the user’s age, level of comfort with technology, frequency of system interaction, and expectations or goals. Frequently, ISDs don’t look at these factors when evaluating the audience of learners. CHRIS: We can add these data points to our existing audience analysis to give the UX team something to work with. JEFF: We can use this data to create personas, which are profiles of representative users from each user group. They include demographic information, goals, frustrations, and other relevant information. These personas are based on data we’ve already collected about system users and help us to make tough design decisions. We refer to them throughout the design lifecycle by asking ourselves “Would this make sense to Mathew?” and “How could this information best be conveyed to him?” Reap the benefits of the audience analysis you’ve performed by applying personas to the design process for your course. CHRIS: How many people in the audience have used personas in a training design? What was your experience like?
  • #10: JEFF: Now we move on to the design phase… When most people think of user experience or usability, what they are envisioning is a heuristic review, where a UX designer or a UX team reviews a product and recommends changes to the layout, color, and organization of information on the interface. UX professionals use heuristics as best practices for ensuring the usability of a product. Heuristic design principles are guidelines based on scientific research into how users perceive, process, and remember information. By observing your training’s conformance to these heuristics, you can predict areas where a user may have difficulty, or how the training will perform in user testing and in the wild. These guidelines are foundational to the field of usability, and there are many heuristic review templates and checklists available to help guide you through your product review. CHRIS: In training we call this the User Interface Review… How many people have integrated these into their design process? Frequency? Budget? Hours?
  • #11: JEFF: After you have collected all the necessary information about your user groups and designed a product that you think will work, now it’s time to iterate! Test out your ideas and find out if the design you have so heavily invested in really works for the users you intended it for. Iterative design is critical to good user experience and will save you lots of time, frustration, and dollars when you catch issues before your product moves into development. -slide- I would like to give two examples of iterative design to demonstrate this point. Frequently, in the training world, the bulk of testing comes after deployment. The designer creates the course based on their understanding of the learners, then asks subject matter experts to review the training course for any issues prior to distributing the training to the target audience. CHRIS: Does this sound familiar? JEFF: While this method works and often creates a high-quality product, incorporating UX methods even earlier in the design lifecycle can lead to better results. CHRIS: This is the process my team would use: we would design and deploy… and we would end up deploying 3 or 4 times. -slide- In this second example, you can see that evaluation, identification of issues, and re-design are all taking place before the training has been deployed, and in some cases before certain portions are even developed. By conducting heuristic reviews, quick user tests, and getting feedback earlier in the design lifecycle, there is a potential to save time and reduce waste in both the development and deployment phases. CHRIS: So let’s take 90 seconds and talk with your neighbor about which of these methods you use and how you could improve that process in your next training. How many people have used SAM? Megan Torrance’s LLAMA, an alternative to ADDIE… JEFF: Now that we are ready to deploy our well-designed product, let’s talk about implementation!
  • #14: JEFF: In the implementation phase, UX can still provide value for training outcomes. UX does not stop after deployment. This is when the rubber meets the road. Thinking about how to support the users in the implementation phase is just as important as it is in the design phase. Supplemental documentation like email communication, instructions, and training aids is integral to the user experience and deserves the same amount of consideration as your course itself. Communication about your course is key; even more important is heading off any known issues with the technology. So if your player is bad and you can’t fix it, put the instructions in the email. Head off design issues, and set your users’ expectations for the experience. We will get back to completion certificates in a bit, but I want to make a point about training aids. Make sure that they provide value to the user, and solicit feedback to find out what they should include! CHRIS: How do I help users get through the interface? I lead them so they don’t have to make decisions unless it’s part of the training. Something about Gagne (Conditions of Learning)
  • #15: JEFF: When it comes to evaluating user experience, the best method is to ask users directly! In the case of online trainings, many folks already do some sort of survey feedback, whether it is to assess the quality of the training course itself, the learning outcomes, or both. What you are probably not currently assessing is the usability of your training course. When you follow up with a user by asking the right questions, you might be surprised what you can learn! CHRIS: Hey Jeff, how many questions would you add to a survey? JEFF: You can get a lot of valuable UX information by adding just one or two questions to a Kirkpatrick Level 1 survey, or in some cases by simply rephrasing questions to incorporate issues related to user experience. In one example, I worked with a training team who was issuing web-based compliance training to a large group of learners. They were consistently getting low scores on the question “How satisfied were you with this training course?” After a few quick follow-up interviews with users, we decided to ask the question “What did you find most frustrating about this training?” and, as it turned out, it was not a learning content issue that was leading to the low satisfaction. The majority of users who rated the training low identified issues in three categories: 1. accessing the training course, 2. loading the videos during the training, and 3. downloading the training certificate. This is just one example of how user experience issues related to the technology solution interfered with the learners’ overall experience with the training course. CHRIS: How many of you use the Kirkpatrick Level 1 survey as your main metric for evaluating training outcomes? Imagine the drag on those outcomes that could come from bad UX! JEFF: Asking the right questions, questions that expand feedback beyond just learning outcomes, can identify a larger pool of issues that may be affecting the quality of the final product. CHRIS: As you can see on this slide, we have some examples of survey questions that can give you UX survey data and fit easily with your existing evaluations!
  • #16: JEFF: User testing is another method of evaluating user experience by collecting information directly from the source. These tests can be as simple as a paper prototype test, where the designer prints out some screenshots and gets quick feedback about the layout, navigation, content, or structure of the course. User testing can vary in complexity, including structured observations and even eye tracking or implicit feedback like emotional reactions! That level is infrequently necessary (or cost-effective) in the training development lifecycle, but the more information you can collect from real users before or after the launch, the better off you will be in the end. CHRIS: Here is where I take a minute to ask: how many of you use a peer review process? How many people have used SMEs to test their courses? How many have used actual users!?! So what's the difference between these reviews and user tests? SUMMARY CHRIS: These are the 5 phases of ADDIE and the tools that we borrow from the UX community. Jeff is going to take us through some examples of actual eLearning courses to show UX in action.
  • #17: JEFF: In these first few examples I want to talk about my biggest pet peeve in training UX: consistency! On the screen you can see that the user is prompted with a choice of two response options, and the navigation button in the bottom left has disappeared. Is it clear to you (or the user) that you have to click on the training slide to proceed?
  • #19: JEFF: Next, we see that in a later slide, the navigator window has disappeared entirely and the user must now click on a button in the content area, not a text box. These issues are admittedly small individually, but each little change can cause the user difficulty and these challenges can add up to create a negative user experience!
  • #20: JEFF: Ok, game time! No looking ahead! On this slide we see the navigator, content, and a sticky note. These notes are meant to be the “more information” resource. Please note the location on the current slide!
  • #21: JEFF: Now, where is the sticky note going to appear on this slide? I’ve blocked it out so you can guess. Anyone? -slide- It actually appears in the bottom right. Why did it move? Was this necessary or just an oversight?
  • #22: JEFF: Ok, how about now? -slide- Bottom right! Ok, enough picking on Chris… This is just an example to show how inconsistency can give the appearance of disorganization and make the interface more noticeable to the user. You should provide the user an opportunity to learn how your system will perform and follow those rules as closely as possible so they are never left guessing.
  • #23: JEFF: Next, I want to give you an example of good design and briefly talk about error prevention. Here you can see that the process is very clear: pick a response option and hit submit.
  • #24: JEFF: An immediate change to the screen leaves the question and response options up, and now the user is already looking at the portion of the screen where the answer appeared. The background fades out and the user can now select “Continue.” CHRIS: Too bad this is in the same training as the previous bad examples. JEFF: That’s true, but here you created a very clear path for the user and prevented them from making errors related to the interface.
  • #25: JEFF: Another important heuristic is creating systems that match our understanding of how the real world works. Think of this as “don’t fix it if it isn’t broken.” Here the interface presents the alphabet in a continuous format from left to right, just as the user would expect. The actionable letters are highlighted, and an alternate method of navigation is even presented on the left side. Just because this is an online interaction, we cannot separate the user’s knowledge of the real world from their understanding of how the system will work. Don’t just show the letters that you want; help orient the user with something they already know.
  • #26: JEFF: Many trainings have some form of navigator window with controls. -slide- Here we can see a list of menu options on the left, exit in the middle, and a navigation control on the right. The problem is that there is no visual distinction made between these different types of buttons. The ones on the left are drop down menus, exit will take the user out of the course entirely, and the previous/next buttons on the right navigate through the course. These should be distinguished. -slide- In the bottom right are icons… these icons need to be used consistently so a simple change to the “Hide CC” control would make the appearance consistent across the three buttons.
  • #27: JEFF: This is an example of good controls. The buttons on the bottom have clear icons, and there is a solid navigation strategy in the bottom right which includes a progress indicator. On the top right you can see a list of menu options that all lead to pop-up windows.
  • #28: JEFF: Earlier I mentioned visibility of system status; here we have a good example. You can see the progress bar shows the portion of the course the user has completed, and the bottom bar shows progress through the current video. This helps the user see quickly where they are and what the system is doing. This heuristic can also apply to things like loading messages, which help keep the user informed of actions the system may be taking that are otherwise not readily apparent.
  • #29: JEFF: Visual layout can present a unique challenge with eLearning courses: sometimes a course or window will not appear the way we intended, or an error in an uncontrolled element of the system will cause it to behave erratically. In this case, the system presenting the training automatically loads the player in a reduced size and leaves the other windows open behind it. This can be distracting to the user. In addition, the player is far too small for the user to read the slides. Errors like this can impact our users’ perceptions of the value of a training course. How can we expect the user to trust our course content when the technology used to deliver that content seems so obviously faulty? Does this look like a modern web interaction?
  • #30: JEFF: For this final example, please raise your hand if you think the screen has loaded correctly in the browser window. It did! This player simply does not use what we would consider a modern layout for the buttons used to navigate the exam. So, what can an instructional designer do when faced with this kind of challenge? The ISD can create additional instructions, use an overlay, or place reminders in the training course as an alternative solution when a technology modification just isn’t feasible. Remember, the perception of the course will be affected by the user’s expectations. If you show them a screenshot of how the exam is laid out, they won’t be surprised by the unusual button placement later. --------- CHRIS: Let’s conclude today with a thought experiment… “One of the most common ways to keep users engaged in an eLearning course is by creating interactions where the user must click into the training window to progress dialogue, review additional content, or respond to a prompt. Do these interactions truly add value to the user’s experience with the training, or do they simply keep them awake? What are some other methods that we could use to increase user interaction? I will give you a minute to discuss among yourselves… Any thoughts?”
  • #31: JEFF: So we leave you with this. The principles and tools of user experience can be fluidly integrated into your existing instructional design processes. Doing so will help you to better know and reach your users and ultimately contribute to the learning outcomes of your online training courses. CHRIS: I’m out! (drop the mic)