SlideShare a Scribd company logo
Enhancing the interaction space
of a tabletop computing system
to design paper prototypes
for mobile applications
Master Thesis
Francesco Bonadiman
2016
Introduction
1
Blended Prototyping
● tabletop system based on hand-drawn paper sketches
2
Blended Prototyping
● tabletop system based on hand-drawn paper sketches
○ converted into digital versions
→ projected on the table
→ expanded further
2
● tabletop system based on hand-drawn paper sketches
○ converted into digital versions
→ projected on the table
→ expanded further
○ transformed into testable applications
→ on a mobile device
Blended Prototyping
2
● tabletop system based on hand-drawn paper sketches
○ converted into digital versions
→ projected on the table
→ expanded further
○ transformed into testable applications
→ on a mobile device
● enhances development of mobile applications
○ accelerates early design phases
Blended Prototyping
2
Hardware
● video projector
○ projects mobile frames & prototypes
3
Hardware
● video projector
○ projects mobile frames & prototypes
● webcam
○ recognizes barcodes of screens
3
Hardware
● video projector
○ projects mobile frames & prototypes
● webcam
○ recognizes barcodes of screens
● DSLR camera
○ shoots HQ pics of the sketches
3
Hardware
● video projector
○ projects mobile frames & prototypes
● webcam
○ recognizes barcodes of screens
● DSLR camera
○ shoots HQ pics of the sketches
● tablet
○ allows to perform actions on the prototypes
3
Software
● Java application
○ controls behavior of projector & cameras
4
Software
● Java application
○ controls behavior of projector & cameras
○ identifies barcode markers on paper sheets
4
Software
● Java application
○ controls behavior of projector & cameras
○ identifies barcode markers on paper sheets
○ digitize sketches & perform actions on them
4
Software
● Java application
○ controls behavior of projector & cameras
○ identifies barcode markers on paper sheets
○ digitize sketches & perform actions on them
● Barcode (similar to QR-code)
○ wider → optimized for webcam
○ gives sheet’s position & rotation on table
4
Context
5
Advantages of Paper Prototyping
● potential of Paper-Prototyping & mobile devices = combined
6
Advantages of Paper Prototyping
● potential of Paper-Prototyping & mobile devices = combined
● first impression of not-yet-developed product
6
Advantages of Paper Prototyping
● potential of Paper-Prototyping & mobile devices = combined
● first impression of not-yet-developed product
● paper → cheap, fast & intuitive (Snyder)
○ no unimportant details → iterative refinement (Nielsen)
6
Advantages of Paper Prototyping
● potential of Paper-Prototyping & mobile devices = combined
● first impression of not-yet-developed product
● paper → cheap, fast & intuitive (Snyder)
○ no unimportant details → iterative refinement (Nielsen)
○ quickly create multiple design alternatives (Landay)
6
Advantages of Paper Prototyping
● potential of Paper-Prototyping & mobile devices = combined
● first impression of not-yet-developed product
● paper → cheap, fast & intuitive (Snyder)
○ no unimportant details → iterative refinement (Nielsen)
○ quickly create multiple design alternatives (Landay)
● same usability issues as Hi-Fi discovered
○ benefits of early usability data = 10+ times bigger (Snyder)
6
● encourages collaboration & critiques →
interdisciplinary & creative (no “gaps”)
Advantages of the System
7
● encourages collaboration & critiques →
interdisciplinary & creative (no “gaps”)
● testing within real-life scenarios (de Sá / Carriço)
→ in thousand of different usage conditions
Advantages of the System
7
● encourages collaboration & critiques →
interdisciplinary & creative (no “gaps”)
● testing within real-life scenarios (de Sá / Carriço)
→ in thousand of different usage conditions
● add code & functionalities
○ define dynamic interface behavior
○ smooth transition to development
Advantages of the System
7
Core features
Via TABLET → Determine widgets & semantics
8
Core features
Via TABLET → Determine widgets & semantics
● manually define “hotspots” on prototype
● turn these into different design patterns
8
Core features
Via TABLET → Determine widgets & semantics
● manually define “hotspots” on prototype
● turn these into different design patterns
● create links between the prototypes
● remove components & connections
8
Core features
Via TABLET → Determine widgets & semantics
● manually define “hotspots” on prototype
● turn these into different design patterns
● create links between the prototypes
● remove components & connections
● convert into working code
8
Problem
● several devices + media → different degrees of fidelity
○ paper & office supplies → computers & smart objects
9
Problem
● several devices + media → different degrees of fidelity
○ paper & office supplies → computers & smart objects
BUT
9
Problem
● several devices + media → different degrees of fidelity
○ paper & office supplies → computers & smart objects
BUT
Changing fidelity of tool & modality of interaction
→ disrupts creative design process
9
Problem
● several devices + media → different degrees of fidelity
○ paper & office supplies → computers & smart objects
BUT
Changing fidelity of tool & modality of interaction
→ disrupts creative design process
→ shifting continuously confuses users
9
Fidelity Clash
● combination of low- & high-tech solutions = puzzling
10
Fidelity Clash
● combination of low- & high-tech solutions = puzzling
○ one single user stops ideation process
→ to perform any action
10
Fidelity Clash
● combination of low- & high-tech solutions = puzzling
○ one single user stops ideation process
→ to perform any action
○ breaking of collaborative & creative moment (Snyder)
10
Fidelity Clash
● combination of low- & high-tech solutions = puzzling
○ one single user stops ideation process
→ to perform any action
○ breaking of collaborative & creative moment (Snyder)
● users distracted → do not interact anymore
○ get unfocused → lose the “flow”
10
Fidelity Clash
● combination of low- & high-tech solutions = puzzling
○ one single user stops ideation process
→ to perform any action
○ breaking of collaborative & creative moment (Snyder)
● users distracted → do not interact anymore
○ get unfocused → lose the “flow”
→ perceived as too technical, isolating & distracting
10
Objectives
● replace high-tech interactions with low-tech approaches
11
Objectives
● replace high-tech interactions with low-tech approaches
● new interaction techniques → not to shift fidelity of media
11
Objectives
● replace high-tech interactions with low-tech approaches
● new interaction techniques → not to shift fidelity of media
THEN
11
Objectives
● replace high-tech interactions with low-tech approaches
● new interaction techniques → not to shift fidelity of media
THEN
● find & implement alternative solutions to tablet
11
Objectives
● replace high-tech interactions with low-tech approaches
● new interaction techniques → not to shift fidelity of media
THEN
● find & implement alternative solutions to tablet
● low-tech approaches to define “hotspots” & perform actions
○ keep interaction techniques learnable & usable
11
Enhanced System
12
Tasks
1. digitize a screen by taking a picture
13
Tasks
1. digitize a screen by taking a picture
2. duplicate a whole screen
13
Tasks
1. digitize a screen by taking a picture
2. duplicate a whole screen
3. detect a component (button, image or textbox)
13
Tasks
1. digitize a screen by taking a picture
2. duplicate a whole screen
3. detect a component (button, image or textbox)
4. connect two screens (link a button to next screen)
13
Tasks
1. digitize a screen by taking a picture
2. duplicate a whole screen
3. detect a component (button, image or textbox)
4. connect two screens (link a button to next screen)
5. remove a component from a screen
13
Tasks
1. digitize a screen by taking a picture
2. duplicate a whole screen
3. detect a component (button, image or textbox)
4. connect two screens (link a button to next screen)
5. remove a component from a screen
6. remove a connection between two screens
13
Tasks
1. digitize a screen by taking a picture
2. duplicate a whole screen
3. detect a component (button, image or textbox)
4. connect two screens (link a button to next screen)
5. remove a component from a screen
6. remove a connection between two screens
7. remove a whole screen
13
Alternatives considered
● Voice recognition
● Object recognition
● Gesture recognition
● Special pen
● Special button
● Colored objects
● Barcode recognition
● Transparent layers
14
Alternatives considered
● Voice recognition
● Object recognition
● Gesture recognition
● Special pen
● Special button
● Colored objects
● Barcode recognition
● Transparent layers
● most approaches = unfeasible
○ noisy / confusing environment
○ need of not-yet-existing hardware
○ time constraints
14
Color
Detection
A
Barcode
Recognition
B
15
A. Color Detection
● no specific hardware
● dynamic & “playful” interaction
16
A. Color Detection
● no specific hardware
● dynamic & “playful” interaction
Concept: different color = different component
16
A. Color Detection
● no specific hardware
● dynamic & “playful” interaction
Concept: different color = different component
→ colored markers + semi-transparent BBPapier
→ users can see through without “ruining” sketches
16
A. Color Detection
● no specific hardware
● dynamic & “playful” interaction
Concept: different color = different component
→ colored markers + semi-transparent BBPapier
→ users can see through without “ruining” sketches
→ JavaCV & OpenCV
16
→ Procedure
● control card = “toolbox” → paint with markers
● same color → fill up the chosen component
17
→ Procedure
● control card = “toolbox” → paint with markers
● same color → fill up the chosen component
● digitize the sketch → creating threshold
○ average color in toolbox → range (+30/-30)
○ exterior pixels discarded → central values
→ avoid mistakes & colors overlapping
17
→ Procedure
● control card = “toolbox” → paint with markers
● same color → fill up the chosen component
● digitize the sketch → creating threshold
○ average color in toolbox → range (+30/-30)
○ exterior pixels discarded → central values
→ avoid mistakes & colors overlapping
● threshold = calculate component’s RGB value
● any pixel within → associated to specific UI component
17
→ Approximation
Once pixels & colors detected
● algorithm recognizes contours of shape
● approximates it to a rectangle
● component created & projected
18
→ Approximation
Once pixels & colors detected
● algorithm recognizes contours of shape
● approximates it to a rectangle
● component created & projected
Need for filling component
● algorithm not efficient only for outline
● rough drawings typical of sketching
18
B. Barcode Recognition
Barcode marker = tool for specific operations
19
B. Barcode Recognition
Barcode marker = tool for specific operations
● take pictures of device
● connect screens
● recognize & delete components
● copy screens
19
B. Barcode Recognition
Barcode marker = tool for specific operations
● take pictures of device
● connect screens
● recognize & delete components
● copy screens
Algorithm continuously running
→ checks if new barcodes are detected
19
→ Tools
20
Further improvements
Sidebar → alternative approach for some functionalities
21
Further improvements
Sidebar → alternative approach for some functionalities
● rectangular transparent area + at top of table
● if barcode placed inside → operation performed
21
Further improvements
Sidebar → alternative approach for some functionalities
● rectangular transparent area + at top of table
● if barcode placed inside → operation performed
Clock (progress indicator) → added onto center-top area
21
Further improvements
Sidebar → alternative approach for some functionalities
● rectangular transparent area + at top of table
● if barcode placed inside → operation performed
Clock (progress indicator) → added onto center-top area
● immediate visual feedback to users
● if appears → users realize they are performing an action
21
Further improvements
Sidebar → alternative approach for some functionalities
● rectangular transparent area + at top of table
● if barcode placed inside → operation performed
Clock (progress indicator) → added onto center-top area
● immediate visual feedback to users
● if appears → users realize they are performing an action
● 3 seconds = time-frame users might change their mind
→ effective to avoid accidental mistakes
21
Evaluation
22
Modality A: Sidebar
● digitize a screen → taking a picture
○ by placing single barcode inside
23
Modality A: Sidebar
● digitize a screen → taking a picture
○ by placing single barcode inside
● duplicate a screen
○ by placing two barcodes inside
○ one is already digitized (source)
○ one is empty (destination)
23
Modality A: Colors
● toolbox projected
→ boxes for button, image, textbox…
24
Modality A: Colors
● toolbox projected
→ boxes for button, image, textbox…
● BBPapier over toolbox → paint with color
● same color → BBPapier over component
24
Modality A: Colors
● toolbox projected
→ boxes for button, image, textbox…
● BBPapier over toolbox → paint with color
● same color → BBPapier over component
● digitize screen with toolbox visible on table
→ colored square displayed to represent component
24
Modality A: Colors
To connect two screens
● same + use FROM → TO box
To delete component → digitize again
● physically remove colored BBPapier
● or cover it with white paper
To remove connection
● same with button-connector
25
Modality B
● camera tool to digitize
● copy tool to duplicate
● handles to detect component
● arrow tool to connect screens
● rubber tool to delete
○ component
○ connection
○ screen (only way)
26
Dependent Variables
● measure overall success of new interaction techniques
○ if quick & easy-to-use without distracting user
○ which of the two modalities = most effective
27
Dependent Variables
● measure overall success of new interaction techniques
○ if quick & easy-to-use without distracting user
○ which of the two modalities = most effective
● prove if efficient & offers satisfactory user-experience
27
Dependent Variables
● measure overall success of new interaction techniques
○ if quick & easy-to-use without distracting user
○ which of the two modalities = most effective
● prove if efficient & offers satisfactory user-experience
4 parameters:
→ Quickness → Ease-of-use → Distraction → User-Experience
27
Quickness & Ease-of-Use
Quickness
● time for task completion
● evaluation is videotaped
28
Quickness & Ease-of-Use
Quickness
● time for task completion
● evaluation is videotaped
● Effective Time
○ duration of task
→ without system errors
→ avoids malfunctionings
28
Quickness & Ease-of-Use
Quickness
● time for task completion
● evaluation is videotaped
● Effective Time
○ duration of task
→ without system errors
→ avoids malfunctionings
Ease-of-use
● % successfully completed tasks
● ratio calculated by checking
interactions executed perfectly
28
Quickness & Ease-of-Use
Quickness
● time for task completion
● evaluation is videotaped
● Effective Time
○ duration of task
→ without system errors
→ avoids malfunctionings
Ease-of-use
● % successfully completed tasks
● ratio calculated by checking
interactions executed perfectly
○ users make no mistakes
○ no significant problems
○ not need any help / hint
28
Distraction
Two factors: quickness & workload index (RTLX)
● unweighted (Raw) version of NASA Task Load Index (TLX)
● 6 subscales:
○ Mental, Physical, Temporal Demand; Performance; Effort; Frustration
29
Distraction
Two factors: quickness & workload index (RTLX)
● unweighted (Raw) version of NASA Task Load Index (TLX)
● 6 subscales:
○ Mental, Physical, Temporal Demand; Performance; Effort; Frustration
● own survey → filled in by users after every task
● 7-points linear scales → then averaged = RTLX
● the lower → the less demanding & distracting task is
29
User-Experience
● dimensions of AttrakDiff Survey → studies by Hassenzahl
○ “how users rate the usability and design of your interactive product”
30
User-Experience
● dimensions of AttrakDiff Survey → studies by Hassenzahl
○ “how users rate the usability and design of your interactive product”
● own custom version → filled in after finishing whole study
● 28 7-points semantic differential scales
○ opposite adjectives at both poles ("good - bad" / "human - technical")
○ implicitly divided into 4 dimensions (Pragmatic Quality, Hedonic
Quality - Identity, Hedonic Quality - Stimulation, Attractiveness)
30
Independent Variables
● Task: 1 to 7
● Modality: A | B
● Group: AB | BA
● Area of expertise: HCI | IT | Other
● Sketching familiarity: 1 to 5
● Mobile familiarity: 1 to 5
● Tabletop use: Yes | No
● Lighting conditions: Day | Night
31
Experiment Design
● two different design solutions → A/B Testing
● “within-subject” design → every user tests both versions
○ no interactions in common → no influence
32
Experiment Design
● two different design solutions → A/B Testing
● “within-subject” design → every user tests both versions
○ no interactions in common → no influence
● to avoid possible bias = AB & BA groups
32
User Demographics
● one user at a time → focus on interactions
○ easier to observe & more difficult to get biased
33
User Demographics
● one user at a time → focus on interactions
○ easier to observe & more difficult to get biased
● 24 participants → 11 females & 13 males
● half aged 18-24, 10 aged 25-34, two older
● 7 background in HCI, 8 IT, rest other expertise
● via Facebook groups & personal connections
● reward = tasty gift or house utensils
33
Analysis of the Results
34
Results
● reviewed videos from GoPro
● investigate how long took users to complete tasks
● detect when they had troubles & needed help
35
Results
● reviewed videos from GoPro
● investigate how long took users to complete tasks
● detect when they had troubles & needed help
● SPSS → ONE-WAY ANOVA test
○ ideal for type of Dependent Variables (continuous)
○ valid for any group of users of study
○ robust to violations in underlying assumptions
35
Quickness
● Modality B faster than Modality A
○ time for A = 40.1s, B only 23.25s
○ statistically significant (p < .05)
36
Quickness
● Modality B faster than Modality A
○ time for A = 40.1s, B only 23.25s
○ statistically significant (p < .05)
● Effective Time (without system errors)
○ time for A = 29.24s, B only 18.38s
○ statistically significant (p < .05)
36
Quickness
● Modality B faster than Modality A
○ time for A = 40.1s, B only 23.25s
○ statistically significant (p < .05)
● Effective Time (without system errors)
○ time for A = 29.24s, B only 18.38s
○ statistically significant (p < .05)
→ color detection = long process
36
Ease-of-use
● Modality A easier to use than B
37
Ease-of-use
● Modality A easier to use than B
● perfectly accomplished tasks:
○ ratio: A = 84%, B = 60%
○ statistically significant (p < .05)
37
Distraction
● A almost as demanding as B
○ A RTLX index = 1.90
○ B RTLX index = 1.84
38
Distraction
● A almost as demanding as B
○ A RTLX index = 1.90
○ B RTLX index = 1.84
● On a 7-points scale →
not statistically significant
(F1,286 = 0.289, p = 0.591)
38
User-Experience
● users feel “assisted by the product”
○ usability can be improved
39
User-Experience
● users feel “assisted by the product”
○ usability can be improved
● users are “stimulated by the product”
○ users can identify with it
○ room for hedonic improvement
39
User-Experience
● users feel “assisted by the product”
○ usability can be improved
● users are “stimulated by the product”
○ users can identify with it
○ room for hedonic improvement
● system considered as “rather desired”
○ attractiveness above average
39
Single Tasks Analysis
40
Task 1 Analysis
● sidebar (A) better
○ faster
○ easier to use
○ less demanding
41
Task 1 Analysis
● sidebar (A) better
○ faster
○ easier to use
○ less demanding
● camera tool slower
○ no clue on how /
where to place it
41
Task 2 Analysis
● sidebar (A)
○ easier to use
42
Task 2 Analysis
● sidebar (A)
○ easier to use
○ faster (only E.T.)
○ causes several
system errors →
see later
42
Task 2 Analysis
● sidebar (A)
○ easier to use
○ faster (only E.T.)
○ causes several
system errors →
see later
● RTLX == copy tool
42
Task 3 Analysis
● handles (B)
○ faster
○ less demanding
○ easier to use
43
Task 3 Analysis
● handles (B)
○ faster
○ less demanding
○ easier to use
● color detection
○ 3 tasks in one
○ digitize + create
43
Task 4 Analysis
● arrow (B)
○ faster
○ less demanding
44
Task 4 Analysis
● arrow (B)
○ faster
○ less demanding
○ where?
44
Task 4 Analysis
● arrow (B)
○ faster
○ less demanding
○ where?
● color detection
○ easier to use
○ create + connect
44
Task 5 Analysis
● rubber (B)
○ faster (only E.T.)
45
Task 5 Analysis
● rubber (B)
○ faster (only E.T.)
○ bug found!
45
Task 5 Analysis
● rubber (B)
○ faster (only E.T.)
○ bug found!
● color detection (A)
○ easier to use
○ less demanding
○ more accurate
45
Task 6 Analysis
● rubber (B)
○ faster
○ less demanding
46
Task 6 Analysis
● rubber (B)
○ faster
○ less demanding
● color detection (A)
○ easier to use
○ experience?
46
Interesting facts
● people with background in HCI
→ performed more easily & faster
● good sketching abilities
→ same + found system more attractive
47
Interesting facts
● people with background in HCI
→ performed more easily & faster
● good sketching abilities
→ same + found system more attractive
● people familiar with mobile devices
→ faster + system less distracting
47
Interesting facts
● people with background in HCI
→ performed more easily & faster
● good sketching abilities
→ same + found system more attractive
● people familiar with mobile devices
→ faster + system less distracting
● during day-time → better performance & lower RTLX
47
Considerations
● accuracy = ratio of task’s precision
○ global accuracy = 86%
○ color detection = 67%
→ more delicate + error-prone
48
Considerations
● accuracy = ratio of task’s precision
○ global accuracy = 86%
○ color detection = 67%
→ more delicate + error-prone
● barcodes + hands into digitized image → distraction + inexperience
48
Considerations
● accuracy = ratio of task’s precision
○ global accuracy = 86%
○ color detection = 67%
→ more delicate + error-prone
● barcodes + hands into digitized image → distraction + inexperience
● if BBPapier wavy → grey area projected
○ projector’s brightness → shadow (“ghost shape”)
48
Projector’s beam
● when toolbox at sides of projection → outer areas appear darker
○ either no color is detected
○ “fake” grey color recognized → multiple components created
49
Projector’s beam
● when toolbox at sides of projection → outer areas appear darker
○ either no color is detected
○ “fake” grey color recognized → multiple components created
● most system errors
○ excessively intense light causing reflections on table
49
Projector’s beam
● when toolbox at sides of projection → outer areas appear darker
○ either no color is detected
○ “fake” grey color recognized → multiple components created
● most system errors
○ excessively intense light causing reflections on table
→ closest barcodes could not be detected
→ devices continuously refreshing
49
Projector’s beam
● when toolbox at sides of projection → outer areas appear darker
○ either no color is detected
○ “fake” grey color recognized → multiple components created
● most system errors
○ excessively intense light causing reflections on table
→ closest barcodes could not be detected
→ devices continuously refreshing
○ “effective time” → task completed if interaction is correct
49
Conclusions
50
Which better?
51
Which better?
● color detection (A)
○ easier to use
○ takes longer time
51
Which better?
● color detection (A)
○ easier to use
○ takes longer time
● barcode recognition (B)
○ definitely faster
○ harder to understand
51
Which better?
● color detection (A)
○ easier to use
○ takes longer time
● barcode recognition (B)
○ definitely faster
○ harder to understand
● both have similar workload
○ around 1.9 out of 7
○ positively low →
not too distracting
51
Ideal implementation
1-2. sidebar → digitize & duplicate screens
52
Ideal implementation
1-2. sidebar → digitize & duplicate screens
2. copy tool too → duplicate without moving papers
52
Ideal implementation
1-2. sidebar → digitize & duplicate screens
2. copy tool too → duplicate without moving papers
3-4. both approaches to detect components & create connections
a. color detection = from scratch + multiple tasks at same time
52
Ideal implementation
1-2. sidebar → digitize & duplicate screens
2. copy tool too → duplicate without moving papers
3-4. both approaches to detect components & create connections
a. color detection = from scratch + multiple tasks at same time
b. handles & arrow = fast & easy-to-use
52
Ideal implementation
1-2. sidebar → digitize & duplicate screens
2. copy tool too → duplicate without moving papers
3-4. both approaches to detect components & create connections
a. color detection = from scratch + multiple tasks at same time
b. handles & arrow = fast & easy-to-use
5-6-7. rubber tool to remove components, connections & screens
→ implement separate ones to avoid possible mistakes
52
Future Work
● visual feedback when projections disappear from table
53
Future Work
● visual feedback when projections disappear from table
● icon near clock → tells which action is being performed
53
Future Work
● visual feedback when projections disappear from table
● icon near clock → tells which action is being performed
● exploring color detection further
○ better algorithms to calculate the threshold
○ better ways to detect shapes through contours
53
Future Work
● visual feedback when projections disappear from table
● icon near clock → tells which action is being performed
● exploring color detection further
○ better algorithms to calculate the threshold
○ better ways to detect shapes through contours
● barcodes = built with wood / opaque plastic → reduce reflections
53
Future Work
● visual feedback when projections disappear from table
● icon near clock → tells which action is being performed
● exploring color detection further
○ better algorithms to calculate the threshold
○ better ways to detect shapes through contours
● barcodes = built with wood / opaque plastic → reduce reflections
● compare interaction techniques with original system
53
Future Work
● visual feedback when projections disappear from table
● icon near clock → tells which action is being performed
● exploring color detection further
○ better algorithms to calculate the threshold
○ better ways to detect shapes through contours
● barcodes = built with wood / opaque plastic → reduce reflections
● compare interaction techniques with original system
● test system in real-life scenario → inside company or startup
53
Thanks!
Any questions?
images from pixabay.com & papers by Bähr

More Related Content

PDF
NTU Workshop: 03 What Is The Distributed Manufacturing Scenario
PDF
Privacy by Design workshop for Developers - School for Computer Science (HBO-...
PDF
Speck&Tech 3 - The Right to be Forgotten
PDF
DwesaBot - a Telegram bot for Africa
PDF
EIT Alumni Startup Days 2016 - Lisbon
PDF
HutteBot - Smart & Sustainable tourism
PDF
MonMaps: Let your friends see the city through your eyes
PDF
BarWin - Social and interactive gaming
NTU Workshop: 03 What Is The Distributed Manufacturing Scenario
Privacy by Design workshop for Developers - School for Computer Science (HBO-...
Speck&Tech 3 - The Right to be Forgotten
DwesaBot - a Telegram bot for Africa
EIT Alumni Startup Days 2016 - Lisbon
HutteBot - Smart & Sustainable tourism
MonMaps: Let your friends see the city through your eyes
BarWin - Social and interactive gaming

Viewers also liked (20)

PDF
EyeCity - a Data-Driven Decision Maker
PDF
FrancigenR - Plan your trip, meet like-minded travellers
PDF
Enhancing the interaction space of a tabletop computing system to design pape...
PDF
A tabletop system to paper-prototype for mobile applications
PPTX
Pietismus und Aufklärung
PDF
Heart Activity and ECG
PDF
Applications of Emotions Recognition
PDF
Lo sportello online del comune di trento - Trento Smart City Week
PDF
SpeakerRank - Find the right speaker
PDF
Smart & sustainable transportation for everyone by chatbots
PDF
Smart trento ver_finale_23marzo2013
PDF
GaZIR: Gaze-based Zooming Interface for Image Retrieval
PDF
Thetha - Open Transparency
PDF
Game consoles market
PDF
Il relativismo in Pirandello e LOST
PDF
Avenue Verte - Green Way
PDF
Car shark - A new paradigm for car sharing
PDF
The EIT Digital Alumni Community
PDF
Speck & Tech vs. EIT Digital
PDF
PageLinker: Integrating Contextual Bookmarks within a Browser
EyeCity - a Data-Driven Decision Maker
FrancigenR - Plan your trip, meet like-minded travellers
Enhancing the interaction space of a tabletop computing system to design pape...
A tabletop system to paper-prototype for mobile applications
Pietismus und Aufklärung
Heart Activity and ECG
Applications of Emotions Recognition
Lo sportello online del comune di trento - Trento Smart City Week
SpeakerRank - Find the right speaker
Smart & sustainable transportation for everyone by chatbots
Smart trento ver_finale_23marzo2013
GaZIR: Gaze-based Zooming Interface for Image Retrieval
Thetha - Open Transparency
Game consoles market
Il relativismo in Pirandello e LOST
Avenue Verte - Green Way
Car shark - A new paradigm for car sharing
The EIT Digital Alumni Community
Speck & Tech vs. EIT Digital
PageLinker: Integrating Contextual Bookmarks within a Browser
Ad

Similar to Enhancing the interaction space of a tabletop computing system to design paper prototypes for mobile applications (20)

PPTX
Tech4goodPGH – Rapid Prototyping Workshop
PDF
Decoding design(ers) tinkerform
PPTX
UI and UX for Mobile Developers
PPT
UCD and low-fidelity prototyping
PDF
The Glass Class at AWE 2015
PDF
How the i pad has changed us
PDF
User Experience 101 - A Practical Guide
PDF
SVC UX class - Prototyping + Testing
PDF
Trending Time on Google Glass - see what everyone's buzzing about
PPTX
EIA2019Italy - Digital Prototyping/Usability Testing - Mari-Liis Kärsten & Ka...
PDF
APIA2018 - Zahra Tashakorinia - Design Hacks & Paper Prototyping
PDF
Accessibility & Universal Design
PDF
App Development with Swift, by Apple
PDF
Open creative toolkit_apps_3
PDF
Open /// Creative Toolkit
PPTX
體驗劇場_1050503_W11_原型設計_楊政達
PPTX
Teaching Your Client Android Design, or, Don't Be An iPhoney
PPTX
VR Overview with L2D
PDF
Designing Mobile Interfaces - Goodpatch Workshop SF
PPTX
EIA Porto Paper Prototyping 2023 Rick R.
Tech4goodPGH – Rapid Prototyping Workshop
Decoding design(ers) tinkerform
UI and UX for Mobile Developers
UCD and low-fidelity prototyping
The Glass Class at AWE 2015
How the i pad has changed us
User Experience 101 - A Practical Guide
SVC UX class - Prototyping + Testing
Trending Time on Google Glass - see what everyone's buzzing about
EIA2019Italy - Digital Prototyping/Usability Testing - Mari-Liis Kärsten & Ka...
APIA2018 - Zahra Tashakorinia - Design Hacks & Paper Prototyping
Accessibility & Universal Design
App Development with Swift, by Apple
Open creative toolkit_apps_3
Open /// Creative Toolkit
體驗劇場_1050503_W11_原型設計_楊政達
Teaching Your Client Android Design, or, Don't Be An iPhoney
VR Overview with L2D
Designing Mobile Interfaces - Goodpatch Workshop SF
EIA Porto Paper Prototyping 2023 Rick R.
Ad

More from Francesco Bonadiman (13)

PDF
5 choices that changed my life
PDF
SpazioDati at the Speck&Tech Retreat
PDF
The history of Speck&Tech
PDF
The first 1000 days of Speck&Tech
PDF
Digital Single Market - Trento
PDF
BenVeneto - the adVeneture
PDF
Design Thinking Workshop for Smart Cities
PDF
Cattolica Go - InsurTech
PDF
Sell me anything *
PDF
Artusi Learning - Blended course
PDF
The Right to be Forgotten
PDF
Privacy and ethical issues in Biometric Systems
PDF
Frequency-Place-Transformation
5 choices that changed my life
SpazioDati at the Speck&Tech Retreat
The history of Speck&Tech
The first 1000 days of Speck&Tech
Digital Single Market - Trento
BenVeneto - the adVeneture
Design Thinking Workshop for Smart Cities
Cattolica Go - InsurTech
Sell me anything *
Artusi Learning - Blended course
The Right to be Forgotten
Privacy and ethical issues in Biometric Systems
Frequency-Place-Transformation

Recently uploaded (20)

PDF
KodekX | Application Modernization Development
PPTX
Detection-First SIEM: Rule Types, Dashboards, and Threat-Informed Strategy
PPTX
Programs and apps: productivity, graphics, security and other tools
PPTX
Digital-Transformation-Roadmap-for-Companies.pptx
PDF
Agricultural_Statistics_at_a_Glance_2022_0.pdf
PDF
Unlocking AI with Model Context Protocol (MCP)
PDF
Advanced methodologies resolving dimensionality complications for autism neur...
PDF
Encapsulation_ Review paper, used for researhc scholars
PDF
Reach Out and Touch Someone: Haptics and Empathic Computing
PDF
Architecting across the Boundaries of two Complex Domains - Healthcare & Tech...
PDF
The Rise and Fall of 3GPP – Time for a Sabbatical?
PDF
Empathic Computing: Creating Shared Understanding
PPTX
20250228 LYD VKU AI Blended-Learning.pptx
PPTX
KOM of Painting work and Equipment Insulation REV00 update 25-dec.pptx
PDF
Profit Center Accounting in SAP S/4HANA, S4F28 Col11
PDF
NewMind AI Weekly Chronicles - August'25 Week I
PDF
How UI/UX Design Impacts User Retention in Mobile Apps.pdf
PDF
Optimiser vos workloads AI/ML sur Amazon EC2 et AWS Graviton
PDF
Build a system with the filesystem maintained by OSTree @ COSCUP 2025
PDF
Approach and Philosophy of On baking technology
KodekX | Application Modernization Development
Detection-First SIEM: Rule Types, Dashboards, and Threat-Informed Strategy
Programs and apps: productivity, graphics, security and other tools
Digital-Transformation-Roadmap-for-Companies.pptx
Agricultural_Statistics_at_a_Glance_2022_0.pdf
Unlocking AI with Model Context Protocol (MCP)
Advanced methodologies resolving dimensionality complications for autism neur...
Encapsulation_ Review paper, used for researhc scholars
Reach Out and Touch Someone: Haptics and Empathic Computing
Architecting across the Boundaries of two Complex Domains - Healthcare & Tech...
The Rise and Fall of 3GPP – Time for a Sabbatical?
Empathic Computing: Creating Shared Understanding
20250228 LYD VKU AI Blended-Learning.pptx
KOM of Painting work and Equipment Insulation REV00 update 25-dec.pptx
Profit Center Accounting in SAP S/4HANA, S4F28 Col11
NewMind AI Weekly Chronicles - August'25 Week I
How UI/UX Design Impacts User Retention in Mobile Apps.pdf
Optimiser vos workloads AI/ML sur Amazon EC2 et AWS Graviton
Build a system with the filesystem maintained by OSTree @ COSCUP 2025
Approach and Philosophy of On baking technology

Enhancing the interaction space of a tabletop computing system to design paper prototypes for mobile applications

  • 1. Enhancing the interaction space of a tabletop computing system to design paper prototypes for mobile applications Master Thesis Francesco Bonadiman 2016
  • 3. Blended Prototyping ● tabletop system based on hand-drawn paper sketches 2
  • 4. Blended Prototyping ● tabletop system based on hand-drawn paper sketches ○ converted into digital versions → projected on the table → expanded further 2
  • 5. ● tabletop system based on hand-drawn paper sketches ○ converted into digital versions → projected on the table → expanded further ○ transformed into testable applications → on a mobile device Blended Prototyping 2
  • 6. ● tabletop system based on hand-drawn paper sketches ○ converted into digital versions → projected on the table → expanded further ○ transformed into testable applications → on a mobile device ● enhances development of mobile applications ○ accelerates early design phases Blended Prototyping 2
  • 7. Hardware ● video projector ○ projects mobile frames & prototypes 3
  • 8. Hardware ● video projector ○ projects mobile frames & prototypes ● webcam ○ recognizes barcodes of screens 3
  • 9. Hardware ● video projector ○ projects mobile frames & prototypes ● webcam ○ recognizes barcodes of screens ● DSLR camera ○ shoots HQ pics of the sketches 3
  • 10. Hardware ● video projector ○ projects mobile frames & prototypes ● webcam ○ recognizes barcodes of screens ● DSLR camera ○ shoots HQ pics of the sketches ● tablet ○ allows to perform actions on the prototypes 3
  • 11. Software ● Java application ○ controls behavior of projector & cameras 4
  • 12. Software ● Java application ○ controls behavior of projector & cameras ○ identifies barcode markers on paper sheets 4
  • 13. Software ● Java application ○ controls behavior of projector & cameras ○ identifies barcode markers on paper sheets ○ digitize sketches & perform actions on them 4
  • 14. Software ● Java application ○ controls behavior of projector & cameras ○ identifies barcode markers on paper sheets ○ digitize sketches & perform actions on them ● Barcode (similar to QR-code) ○ wider → optimized for webcam ○ gives sheet’s position & rotation on table 4
  • 16. Advantages of Paper Prototyping ● potential of Paper-Prototyping & mobile devices = combined 6
  • 17. Advantages of Paper Prototyping ● potential of Paper-Prototyping & mobile devices = combined ● first impression of not-yet-developed product 6
  • 18. Advantages of Paper Prototyping ● potential of Paper-Prototyping & mobile devices = combined ● first impression of not-yet-developed product ● paper → cheap, fast & intuitive (Snyder) ○ no unimportant details → iterative refinement (Nielsen) 6
  • 19. Advantages of Paper Prototyping ● potential of Paper-Prototyping & mobile devices = combined ● first impression of not-yet-developed product ● paper → cheap, fast & intuitive (Snyder) ○ no unimportant details → iterative refinement (Nielsen) ○ quickly create multiple design alternatives (Landay) 6
  • 20. Advantages of Paper Prototyping ● potential of Paper-Prototyping & mobile devices = combined ● first impression of not-yet-developed product ● paper → cheap, fast & intuitive (Snyder) ○ no unimportant details → iterative refinement (Nielsen) ○ quickly create multiple design alternatives (Landay) ● same usability issues as Hi-Fi discovered ○ benefits of early usability data = 10+ times bigger (Snyder) 6
  • 21. ● encourages collaboration & critiques → interdisciplinary & creative (no “gaps”) Advantages of the System 7
  • 22. ● encourages collaboration & critiques → interdisciplinary & creative (no “gaps”) ● testing within real-life scenarios (de Sá / Carriço) → in thousand of different usage conditions Advantages of the System 7
  • 23. ● encourages collaboration & critiques → interdisciplinary & creative (no “gaps”) ● testing within real-life scenarios (de Sá / Carriço) → in thousand of different usage conditions ● add code & functionalities ○ define dynamic interface behavior ○ smooth transition to development Advantages of the System 7
  • 24. Core features Via TABLET → Determine widgets & semantics 8
  • 25. Core features Via TABLET → Determine widgets & semantics ● manually define “hotspots” on prototype ● turn these into different design patterns 8
  • 26. Core features Via TABLET → Determine widgets & semantics ● manually define “hotspots” on prototype ● turn these into different design patterns ● create links between the prototypes ● remove components & connections 8
  • 27. Core features Via TABLET → Determine widgets & semantics ● manually define “hotspots” on prototype ● turn these into different design patterns ● create links between the prototypes ● remove components & connections ● convert into working code 8
  • 28. Problem ● several devices + media → different degrees of fidelity ○ paper & office supplies → computers & smart objects 9
  • 29. Problem ● several devices + media → different degrees of fidelity ○ paper & office supplies → computers & smart objects BUT 9
  • 30. Problem ● several devices + media → different degrees of fidelity ○ paper & office supplies → computers & smart objects BUT Changing fidelity of tool & modality of interaction → disrupts creative design process 9
  • 31. Problem ● several devices + media → different degrees of fidelity ○ paper & office supplies → computers & smart objects BUT Changing fidelity of tool & modality of interaction → disrupts creative design process → shifting continuously confuses users 9
  • 32. Fidelity Clash ● combination of low- & high-tech solutions = puzzling 10
  • 33. Fidelity Clash ● combination of low- & high-tech solutions = puzzling ○ one single user stops ideation process → to perform any action 10
  • 34. Fidelity Clash ● combination of low- & high-tech solutions = puzzling ○ one single user stops ideation process → to perform any action ○ breaking of collaborative & creative moment (Snyder) 10
  • 35. Fidelity Clash ● combination of low- & high-tech solutions = puzzling ○ one single user stops ideation process → to perform any action ○ breaking of collaborative & creative moment (Snyder) ● users distracted → do not interact anymore ○ get unfocused → lose the “flow” 10
  • 36. Fidelity Clash ● combination of low- & high-tech solutions = puzzling ○ one single user stops ideation process → to perform any action ○ breaking of collaborative & creative moment (Snyder) ● users distracted → do not interact anymore ○ get unfocused → lose the “flow” → perceived as too technical, isolating & distracting 10
  • 37. Objectives ● replace high-tech interactions with low-tech approaches 11
  • 38. Objectives ● replace high-tech interactions with low-tech approaches ● new interaction techniques → not to shift fidelity of media 11
  • 39. Objectives ● replace high-tech interactions with low-tech approaches ● new interaction techniques → not to shift fidelity of media THEN 11
  • 40. Objectives ● replace high-tech interactions with low-tech approaches ● new interaction techniques → not to shift fidelity of media THEN ● find & implement alternative solutions to tablet 11
  • 41. Objectives ● replace high-tech interactions with low-tech approaches ● new interaction techniques → not to shift fidelity of media THEN ● find & implement alternative solutions to tablet ● low-tech approaches to define “hotspots” & perform actions ○ keep interaction techniques learnable & usable 11
  • 43. Tasks 1. digitize a screen by taking a picture 13
  • 44. Tasks 1. digitize a screen by taking a picture 2. duplicate a whole screen 13
  • 45. Tasks 1. digitize a screen by taking a picture 2. duplicate a whole screen 3. detect a component (button, image or textbox) 13
  • 46. Tasks 1. digitize a screen by taking a picture 2. duplicate a whole screen 3. detect a component (button, image or textbox) 4. connect two screens (link a button to next screen) 13
  • 47. Tasks 1. digitize a screen by taking a picture 2. duplicate a whole screen 3. detect a component (button, image or textbox) 4. connect two screens (link a button to next screen) 5. remove a component from a screen 13
  • 48. Tasks 1. digitize a screen by taking a picture 2. duplicate a whole screen 3. detect a component (button, image or textbox) 4. connect two screens (link a button to next screen) 5. remove a component from a screen 6. remove a connection between two screens 13
  • 49. Tasks 1. digitize a screen by taking a picture 2. duplicate a whole screen 3. detect a component (button, image or textbox) 4. connect two screens (link a button to next screen) 5. remove a component from a screen 6. remove a connection between two screens 7. remove a whole screen 13
  • 50. Alternatives considered ● Voice recognition ● Object recognition ● Gesture recognition ● Special pen ● Special button ● Colored objects ● Barcode recognition ● Transparent layers 14
  • 51. Alternatives considered ● Voice recognition ● Object recognition ● Gesture recognition ● Special pen ● Special button ● Colored objects ● Barcode recognition ● Transparent layers ● most approaches = unfeasible ○ noisy / confusing environment ○ need of not-yet-existing hardware ○ time constraints 14
  • 53. A. Color Detection ● no specific hardware ● dynamic & “playful” interaction 16
  • 54. A. Color Detection ● no specific hardware ● dynamic & “playful” interaction Concept: different color = different component 16
  • 55. A. Color Detection ● no specific hardware ● dynamic & “playful” interaction Concept: different color = different component → colored markers + semi-transparent BBPapier → users can see through without “ruining” sketches 16
  • 56. A. Color Detection ● no specific hardware ● dynamic & “playful” interaction Concept: different color = different component → colored markers + semi-transparent BBPapier → users can see through without “ruining” sketches → JavaCV & OpenCV 16
  • 57. → Procedure ● control card = “toolbox” → paint with markers ● same color → fill up the chosen component 17
  • 58. → Procedure ● control card = “toolbox” → paint with markers ● same color → fill up the chosen component ● digitize the sketch → creating threshold ○ average color in toolbox → range (+30/-30) ○ exterior pixels discarded → central values → avoid mistakes & colors overlapping 17
  • 59. → Procedure ● control card = “toolbox” → paint with markers ● same color → fill up the chosen component ● digitize the sketch → creating threshold ○ average color in toolbox → range (+30/-30) ○ exterior pixels discarded → central values → avoid mistakes & colors overlapping ● threshold = calculate component’s RGB value ● any pixel within → associated to specific UI component 17
  • 60. → Approximation Once pixels & colors detected ● algorithm recognizes contours of shape ● approximates it to a rectangle ● component created & projected 18
  • 61. → Approximation Once pixels & colors detected ● algorithm recognizes contours of shape ● approximates it to a rectangle ● component created & projected Need for filling component ● algorithm not efficient only for outline ● rough drawings typical of sketching 18
  • 62. B. Barcode Recognition Barcode marker = tool for specific operations 19
  • 63. B. Barcode Recognition Barcode marker = tool for specific operations ● take pictures of device ● connect screens ● recognize & delete components ● copy screens 19
  • 64. B. Barcode Recognition Barcode marker = tool for specific operations ● take pictures of device ● connect screens ● recognize & delete components ● copy screens Algorithm continuously running → checks if new barcodes are detected 19
  • 66. Further improvements Sidebar → alternative approach for some functionalities 21
  • 67. Further improvements Sidebar → alternative approach for some functionalities ● rectangular transparent area + at top of table ● if barcode placed inside → operation performed 21
  • 68. Further improvements Sidebar → alternative approach for some functionalities ● rectangular transparent area + at top of table ● if barcode placed inside → operation performed Clock (progress indicator) → added onto center-top area 21
  • 69. Further improvements Sidebar → alternative approach for some functionalities ● rectangular transparent area + at top of table ● if barcode placed inside → operation performed Clock (progress indicator) → added onto center-top area ● immediate visual feedback to users ● if appears → users realize they are performing an action 21
  • 70. Further improvements Sidebar → alternative approach for some functionalities ● rectangular transparent area + at top of table ● if barcode placed inside → operation performed Clock (progress indicator) → added onto center-top area ● immediate visual feedback to users ● if appears → users realize they are performing an action ● 3 seconds = time-frame users might change their mind → effective to avoid accidental mistakes 21
  • 72. Modality A: Sidebar ● digitize a screen → taking a picture ○ by placing single barcode inside 23
  • 73. Modality A: Sidebar ● digitize a screen → taking a picture ○ by placing single barcode inside ● duplicate a screen ○ by placing two barcodes inside ○ one is already digitized (source) ○ one is empty (destination) 23
  • 74. Modality A: Colors ● toolbox projected → boxes for button, image, textbox… 24
  • 75. Modality A: Colors ● toolbox projected → boxes for button, image, textbox… ● BBPapier over toolbox → paint with color ● same color → BBPapier over component 24
  • 76. Modality A: Colors ● toolbox projected → boxes for button, image, textbox… ● BBPapier over toolbox → paint with color ● same color → BBPapier over component ● digitize screen with toolbox visible on table → colored square displayed to represent component 24
  • 77. Modality A: Colors To connect two screens ● same + use FROM → TO box To delete component → digitize again ● physically remove colored BBPapier ● or cover it with white paper To remove connection ● same with button-connector 25
  • 78. Modality B ● camera tool to digitize ● copy tool to duplicate ● handles to detect component ● arrow tool to connect screens ● rubber tool to delete ○ component ○ connection ○ screen (only way) 26
  • 79. Dependent Variables ● measure overall success of new interaction techniques ○ if quick & easy-to-use without distracting user ○ which of the two modalities = most effective 27
  • 80. Dependent Variables ● measure overall success of new interaction techniques ○ if quick & easy-to-use without distracting user ○ which of the two modalities = most effective ● prove if efficient & offers satisfactory user-experience 27
  • 81. Dependent Variables ● measure overall success of new interaction techniques ○ if quick & easy-to-use without distracting user ○ which of the two modalities = most effective ● prove if efficient & offers satisfactory user-experience 4 parameters: → Quickness → Ease-of-use → Distraction → User-Experience 27
  • 82. Quickness & Ease-of-Use Quickness ● time for task completion ● evaluation is videotaped 28
  • 83. Quickness & Ease-of-Use Quickness ● time for task completion ● evaluation is videotaped ● Effective Time ○ duration of task → without system errors → avoids malfunctionings 28
  • 84. Quickness & Ease-of-Use Quickness ● time for task completion ● evaluation is videotaped ● Effective Time ○ duration of task → without system errors → avoids malfunctionings Ease-of-use ● % successfully completed tasks ● ratio calculated by checking interactions executed perfectly 28
  • 85. Quickness & Ease-of-Use Quickness ● time for task completion ● evaluation is videotaped ● Effective Time ○ duration of task → without system errors → avoids malfunctionings Ease-of-use ● % successfully completed tasks ● ratio calculated by checking interactions executed perfectly ○ users make no mistakes ○ no significant problems ○ not need any help / hint 28
  • 86. Distraction Two factors: quickness & workload index (RTLX) ● unweighted (Raw) version of NASA Task Load Index (TLX) ● 6 subscales: ○ Mental, Physical, Temporal Demand; Performance; Effort; Frustration 29
  • 87. Distraction Two factors: quickness & workload index (RTLX) ● unweighted (Raw) version of NASA Task Load Index (TLX) ● 6 subscales: ○ Mental, Physical, Temporal Demand; Performance; Effort; Frustration ● own survey → filled in by users after every task ● 7-points linear scales → then averaged = RTLX ● the lower → the less demanding & distracting task is 29
  • 88. User-Experience ● dimensions of AttrakDiff Survey → studies by Hassenzahl ○ “how users rate the usability and design of your interactive product” 30
  • 89. User-Experience ● dimensions of AttrakDiff Survey → studies by Hassenzahl ○ “how users rate the usability and design of your interactive product” ● own custom version → filled in after finishing whole study ● 28 7-points semantic differential scales ○ opposite adjectives at both poles ("good - bad" / "human - technical") ○ implicitly divided into 4 dimensions (Pragmatic Quality, Hedonic Quality - Identity, Hedonic Quality - Stimulation, Attractiveness) 30
  • 90. Independent Variables ● Task: 1 to 7 ● Modality: A | B ● Group: AB | BA ● Area of expertise: HCI | IT | Other ● Sketching familiarity: 1 to 5 ● Mobile familiarity: 1 to 5 ● Tabletop use: Yes | No ● Lighting conditions: Day | Night 31
  • 91. Experiment Design ● two different design solutions → A/B Testing ● “within-subject” design → every user tests both versions ○ no interactions in common → no influence 32
  • 92. Experiment Design ● two different design solutions → A/B Testing ● “within-subject” design → every user tests both versions ○ no interactions in common → no influence ● to avoid possible bias = AB & BA groups 32
  • 93. User Demographics ● one user at a time → focus on interactions ○ easier to observe & more difficult to get biased 33
  • 94. User Demographics ● one user at a time → focus on interactions ○ easier to observe & more difficult to get biased ● 24 participants → 11 females & 13 males ● half aged 18-24, 10 aged 25-34, two older ● 7 background in HCI, 8 IT, rest other expertise ● via Facebook groups & personal connections ● reward = tasty gift or house utensils 33
  • 95. Analysis of the Results 34
  • 96. Results ● reviewed videos from GoPro ● investigate how long took users to complete tasks ● detect when they had troubles & needed help 35
  • 97. Results ● reviewed videos from GoPro ● investigate how long took users to complete tasks ● detect when they had troubles & needed help ● SPSS → ONE-WAY ANOVA test ○ ideal for type of Dependent Variables (continuous) ○ valid for any group of users of study ○ robust to violations in underlying assumptions 35
  • 98. Quickness ● Modality B faster than Modality A ○ time for A = 40.1s, B only 23.25s ○ statistically significant (p < .05) 36
  • 99. Quickness ● Modality B faster than Modality A ○ time for A = 40.1s, B only 23.25s ○ statistically significant (p < .05) ● Effective Time (without system errors) ○ time for A = 29.24s, B only 18.38s ○ statistically significant (p < .05) 36
  • 100. Quickness ● Modality B faster than Modality A ○ time for A = 40.1s, B only 23.25s ○ statistically significant (p < .05) ● Effective Time (without system errors) ○ time for A = 29.24s, B only 18.38s ○ statistically significant (p < .05) → color detection = long process 36
  • 101. Ease-of-use ● Modality A easier to use than B 37
  • 102. Ease-of-use ● Modality A easier to use than B ● perfectly accomplished tasks: ○ ratio: A = 84%, B = 60% ○ statistically significant (p < .05) 37
  • 103. Distraction ● A almost as demanding as B ○ A RTLX index = 1.90 ○ B RTLX index = 1.84 38
  • 104. Distraction ● A almost as demanding as B ○ A RTLX index = 1.90 ○ B RTLX index = 1.84 ● On a 7-points scale → not statistically significant (F1,286 = 0.289, p = 0.591) 38
  • 105. User-Experience ● users feel “assisted by the product” ○ usability can be improved 39
  • 106. User-Experience ● users feel “assisted by the product” ○ usability can be improved ● users are “stimulated by the product” ○ users can identify with it ○ room for hedonic improvement 39
  • 107. User-Experience ● users feel “assisted by the product” ○ usability can be improved ● users are “stimulated by the product” ○ users can identify with it ○ room for hedonic improvement ● system considered as “rather desired” ○ attractiveness above average 39
  • 109. Task 1 Analysis ● sidebar (A) better ○ faster ○ easier to use ○ less demanding 41
  • 110. Task 1 Analysis ● sidebar (A) better ○ faster ○ easier to use ○ less demanding ● camera tool slower ○ no clue on how / where to place it 41
  • 111. Task 2 Analysis ● sidebar (A) ○ easier to use 42
  • 112. Task 2 Analysis ● sidebar (A) ○ easier to use ○ faster (only E.T.) ○ causes several system errors → see later 42
  • 113. Task 2 Analysis ● sidebar (A) ○ easier to use ○ faster (only E.T.) ○ causes several system errors → see later ● RTLX == copy tool 42
  • 114. Task 3 Analysis ● handles (B) ○ faster ○ less demanding ○ easier to use 43
  • 115. Task 3 Analysis ● handles (B) ○ faster ○ less demanding ○ easier to use ● color detection ○ 3 tasks in one ○ digitize + create 43
  • 116. Task 4 Analysis ● arrow (B) ○ faster ○ less demanding 44
  • 117. Task 4 Analysis ● arrow (B) ○ faster ○ less demanding ○ where? 44
  • 118. Task 4 Analysis ● arrow (B) ○ faster ○ less demanding ○ where? ● color detection ○ easier to use ○ create + connect 44
  • 119. Task 5 Analysis ● rubber (B) ○ faster (only E.T.) 45
  • 120. Task 5 Analysis ● rubber (B) ○ faster (only E.T.) ○ bug found! 45
  • 121. Task 5 Analysis ● rubber (B) ○ faster (only E.T.) ○ bug found! ● color detection (A) ○ easier to use ○ less demanding ○ more accurate 45
  • 122. Task 6 Analysis ● rubber (B) ○ faster ○ less demanding 46
  • 123. Task 6 Analysis ● rubber (B) ○ faster ○ less demanding ● color detection (A) ○ easier to use ○ experience? 46
  • 124. Interesting facts ● people with background in HCI → performed more easily & faster ● good sketching abilities → same + found system more attractive 47
  • 125. Interesting facts ● people with background in HCI → performed more easily & faster ● good sketching abilities → same + found system more attractive ● people familiar with mobile devices → faster + system less distracting 47
  • 126. Interesting facts ● people with background in HCI → performed more easily & faster ● good sketching abilities → same + found system more attractive ● people familiar with mobile devices → faster + system less distracting ● during day-time → better performance & lower RTLX 47
  • 127. Considerations ● accuracy = ratio of task’s precision ○ global accuracy = 86% ○ color detection = 67% → more delicate + error-prone 48
  • 128. Considerations ● accuracy = ratio of task’s precision ○ global accuracy = 86% ○ color detection = 67% → more delicate + error-prone ● barcodes + hands into digitized image → distraction + inexperience 48
  • 129. Considerations ● accuracy = ratio of task’s precision ○ global accuracy = 86% ○ color detection = 67% → more delicate + error-prone ● barcodes + hands into digitized image → distraction + inexperience ● if BBPapier wavy → grey area projected ○ projector’s brightness → shadow (“ghost shape”) 48
  • 130. Projector’s beam ● when toolbox at sides of projection → outer areas appear darker ○ either no color is detected ○ “fake” grey color recognized → multiple components created 49
  • 131. Projector’s beam ● when toolbox at sides of projection → outer areas appear darker ○ either no color is detected ○ “fake” grey color recognized → multiple components created ● most system errors ○ excessively intense light causing reflections on table 49
  • 132. Projector’s beam ● when toolbox at sides of projection → outer areas appear darker ○ either no color is detected ○ “fake” grey color recognized → multiple components created ● most system errors ○ excessively intense light causing reflections on table → closest barcodes could not be detected → devices continuously refreshing 49
  • 133. Projector’s beam ● when toolbox at sides of projection → outer areas appear darker ○ either no color is detected ○ “fake” grey color recognized → multiple components created ● most system errors ○ excessively intense light causing reflections on table → closest barcodes could not be detected → devices continuously refreshing ○ “effective time” → task completed if interaction is correct 49
  • 136. Which better? ● color detection (A) ○ easier to use ○ takes longer time 51
  • 137. Which better? ● color detection (A) ○ easier to use ○ takes longer time ● barcode recognition (B) ○ definitely faster ○ harder to understand 51
  • 138. Which better? ● color detection (A) ○ easier to use ○ takes longer time ● barcode recognition (B) ○ definitely faster ○ harder to understand ● both have similar workload ○ around 1.9 out of 7 ○ positively low → not too distracting 51
  • 139. Ideal implementation 1-2. sidebar → digitize & duplicate screens 52
  • 140. Ideal implementation 1-2. sidebar → digitize & duplicate screens 2. copy tool too → duplicate without moving papers 52
  • 141. Ideal implementation 1-2. sidebar → digitize & duplicate screens 2. copy tool too → duplicate without moving papers 3-4. both approaches to detect components & create connections a. color detection = from scratch + multiple tasks at same time 52
  • 142. Ideal implementation 1-2. sidebar → digitize & duplicate screens 2. copy tool too → duplicate without moving papers 3-4. both approaches to detect components & create connections a. color detection = from scratch + multiple tasks at same time b. handles & arrow = fast & easy-to-use 52
  • 143. Ideal implementation 1-2. sidebar → digitize & duplicate screens 2. copy tool too → duplicate without moving papers 3-4. both approaches to detect components & create connections a. color detection = from scratch + multiple tasks at same time b. handles & arrow = fast & easy-to-use 5-6-7. rubber tool to remove components, connections & screens → implement separate ones to avoid possible mistakes 52
  • 144. Future Work ● visual feedback when projections disappear from table 53
  • 145. Future Work ● visual feedback when projections disappear from table ● icon near clock → tells which action is being performed 53
  • 146. Future Work ● visual feedback when projections disappear from table ● icon near clock → tells which action is being performed ● exploring color detection further ○ better algorithms to calculate the threshold ○ better ways to detect shapes through contours 53
  • 147. Future Work ● visual feedback when projections disappear from table ● icon near clock → tells which action is being performed ● exploring color detection further ○ better algorithms to calculate the threshold ○ better ways to detect shapes through contours ● barcodes = built with wood / opaque plastic → reduce reflections 53
  • 148. Future Work ● visual feedback when projections disappear from table ● icon near clock → tells which action is being performed ● exploring color detection further ○ better algorithms to calculate the threshold ○ better ways to detect shapes through contours ● barcodes = built with wood / opaque plastic → reduce reflections ● compare interaction techniques with original system 53
  • 149. Future Work ● visual feedback when projections disappear from table ● icon near clock → tells which action is being performed ● exploring color detection further ○ better algorithms to calculate the threshold ○ better ways to detect shapes through contours ● barcodes = built with wood / opaque plastic → reduce reflections ● compare interaction techniques with original system ● test system in real-life scenario → inside company or startup 53
  • 150. Thanks! Any questions? images from pixabay.com & papers by Bähr