Crowdsourcing Software Evaluation
EASE'14 | London, UK | 13 May 2014
Authors:
Nada Sherief, Nan Jiang, Mahmood Hosseini, Keith Phalp, Raian Ali
{nsherief, njiang, mhosseini, kphalp, rali}@bournemouth.ac.uk
Presented by:
Nada Sherief
Agenda
• Introduction
• Crowdsourcing Evaluation
• Related Work
• Method
• Results
• Research Challenges
• Conclusion
• Acknowledgements
Introduction (1/2)
• Crowdsourcing is a new form of problem solving; it is typically online and relies on a large number of people.
• Crowdsourcing is a flexible business model that requires a less strict recruitment and contracting process. This does not mean that the process should be open without limits.
• Crowdsourcing is typically used for non-critical tasks and for tasks that naturally require input from the general public, where the right answer depends on people’s acceptance and dynamics.
Introduction (2/2)
• The stakeholders of software evaluation include users, who use the software to meet their needs and expectations, i.e. their requirements.
• Users’ acceptance and efficient use of the software are a main subject of evaluation.
• In a dynamic world, users’ acceptance needs to be captured throughout the lifetime of the software, so that their perception of the role and quality of the software at runtime stays up to date.
• The ultimate goal of this proposal is to maximize the efficiency of software evaluation and its scalability, to cater for complex and highly variable systems where the power of the crowd, when systematically harnessed, could become a great aid.
Crowdsourcing Evaluation (1/2)
• Traditional evaluation methods, which are mainly design-time activities, have a range of limitations, including the following:
  • Limitations in predicting and simulating the actual context of use.
  • The unstructured and varied ways in which users provide their feedback, typically in natural language.
  • Capturing the opinions of only an elite group of users.
Crowdsourcing Evaluation (2/2)
• Crowdsourced evaluation has various potential benefits in comparison to centralized approaches:
  • The ability to evaluate software while users are using it in practice, i.e. at runtime (see the sketch after this list).
  • Access to a large crowd enables fast and scalable evaluation.
  • The ability to keep the evaluation knowledge up to date.
  • Access to a wider and more diverse set of users and contexts of use that designers cannot predict at design time.
  • Enabling users to introduce new quality attributes and requirements.
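As a purely illustrative aside (not part of the original slides), the minimal Python sketch below shows how an application might capture crowd feedback at runtime together with the context of use in which it was given; all class, field, and function names here are hypothetical assumptions, not an interface from the paper.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class ContextOfUse:
    """Runtime context attached to a piece of feedback (hypothetical fields)."""
    device: str
    locale: str
    software_version: str
    active_feature: str


@dataclass
class RuntimeFeedback:
    """A single crowd evaluation item captured while the software is in use."""
    user_id: str
    text: str
    rating: int  # e.g. a 1..5 acceptance score
    context: ContextOfUse
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


def capture_feedback(user_id: str, text: str, rating: int,
                     context: ContextOfUse) -> RuntimeFeedback:
    """Record feedback at the moment of use, so the evaluation knowledge stays up to date."""
    return RuntimeFeedback(user_id=user_id, text=text, rating=rating, context=context)


# Example: a user evaluates the "export" feature while actually using it.
feedback = capture_feedback(
    user_id="u42",
    text="Export to PDF is slow on large documents",
    rating=2,
    context=ContextOfUse(device="Android tablet", locale="en_GB",
                         software_version="2.1.0", active_feature="export"),
)
print(feedback)
```

Recording the context alongside the opinion is what distinguishes runtime, crowd-based evaluation from a design-time questionnaire.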
Related Work (1/3)
• There are several established approaches in which the role of users is central, such as user-centered design, user experience, agile methodologies, and usability testing.
• Recently, more work has been directed towards inventing systematic methods for representing and obtaining users’ feedback:
  • Processes for continuous and context-aware user input have been proposed.
  • An empirical study on users’ involvement for the purpose of software evaluation and evolution has been conducted.
  • Crowd feedback has also been advocated for shaping software adaptation.
Related Work (2/3)
• In general, when designing an empirical study in software engineering, engaging the right type and number of participants is always a challenge.
• Researchers often have to make trade-offs to be able to conduct the study.
• The crowdsourcing paradigm and the mTurk platform have been used to evaluate the usability of a school’s website.
• The claimed advantages are greater participant involvement, low cost, high speed, and diverse user backgrounds.
• The disadvantages include lower-quality feedback, fewer interactions, more spammers, and less focused user groups.
Related Work (3/3)
• Most existing studies of crowdsourcing in general, and those exploiting the paradigm for software evaluation in particular, advocate the use of the paradigm and rely on commercial platforms such as mTurk (https://www.mturk.com/).
• The literature is still limited in providing engineering approaches and foundations for developing crowdsourcing platforms for software evaluation.
• Currently, the design and conduct of feedback acquisition rely heavily on developers’ creativity.
• We still need to investigate and devise systematic approaches for designing feedback requests, and to aid developers with proper tools.
Method (1/3)
• We took an empirical approach by conducting a multi-session focus group study, a popular qualitative research technique in software engineering.
• Both junior and senior software developers were invited to the first session, whose emphasis was to understand:
  • how software developers normally gather user feedback
  • how they think good feedback should be structured
  • how they collaborate and communicate with users during development, as this could inform the way we design feedback requests
Method (2/3)
• The second session was conducted with regular software users who are used to providing feedback.
• The emphasis of this session was to explore:
  • how users would like feedback requests to look
  • what drives them to provide feedback
  • their concerns about not being involved enough, and also about being involved more than they expect
• This session was also used to investigate their motivations for taking part in projects and to learn from their experience of that participation.
Method (3/3)
• A total of 15 volunteers, 8 males and 7 females aged between 18 and 40, were invited to participate in the focus group study. There were 8 participants in the first session and 7 in the second.
• The participants came mainly from Egypt and the UK, with backgrounds spanning management, study, research, and IT, and had different levels of experience in using software and providing feedback.
• Most participants were already familiar with the notion of crowdsourcing and had used it in the past for simple tasks, such as collecting lecture notes or using programming forums to get solutions to coding and debugging problems.
Results (1/3)
Results (2/3)

Thematic Area | Theme           | Example
Subject       | Method          | “Snapshots, Text, or Audio”
              | Clarity         | “reach a clear problem specification”
              | Specificity     | “specific to the software’s features”
Structure     | Timing          | “real-time feedback”
              | Level of Detail | “giving detailed feedbacks”
              | Measurement     | “a group of predefined keywords”; “structured in a specific way”
              | Specificity     | “give feedback to specific problems”
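As an illustration of how the structure-related themes above (method, timing, level of detail, measurement via predefined keywords, and specificity) could be turned into a structured feedback request, here is a minimal Python sketch; the class and field names are our own assumptions, not an interface defined in the paper.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Optional


class Method(Enum):
    """How the feedback is expressed (the Subject/Method theme)."""
    SNAPSHOT = "snapshot"
    TEXT = "text"
    AUDIO = "audio"


@dataclass
class FeedbackRequest:
    """A feedback request shaped by the structure themes: timing, level of detail,
    measurement via predefined keywords, and specificity to a feature."""
    feature: str                    # Specificity: tie feedback to a concrete feature
    allowed_methods: list[Method]   # Method: snapshots, text, or audio
    predefined_keywords: list[str]  # Measurement: a group of predefined keywords
    detail_level: str = "detailed"  # Level of detail: brief vs. detailed
    ask_at_runtime: bool = True     # Timing: ask for real-time feedback while in use


@dataclass
class FeedbackAnswer:
    """A user's answer to a structured feedback request."""
    request: FeedbackRequest
    method: Method
    chosen_keywords: list[str] = field(default_factory=list)
    free_text: Optional[str] = None


# Example: a request specific to the "search" feature.
req = FeedbackRequest(
    feature="search",
    allowed_methods=[Method.TEXT, Method.SNAPSHOT],
    predefined_keywords=["slow", "irrelevant results", "crashed", "confusing"],
)
ans = FeedbackAnswer(request=req, method=Method.TEXT,
                     chosen_keywords=["slow"],
                     free_text="Search takes around 10 seconds on my phone.")
print(ans)
```

Combining a few predefined keywords with optional free text is one way to keep the request structured while still letting users elaborate.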
Results (3/3)
Thematic Area | Theme        | Example
Engagement    | Recognition  | “a friendly confirmation for participation”; “more personalized options for feedback”
              | Value        | “it encourages users to give feedback if they can meet with analysts to discuss problems in some ways”
              | Channel      | “the interactions should be very simple”
              | Transparency | “it would increase the users’ trust and willingness to give feedback if they know the cycle of how their feedback will be used”
Involvement   | Privacy      | “would like to stay anonymous”; “it is important if the user can control who is able to see his feedback”
              | Response     | “the software’s speed of response to my feedback affects my willingness to give feedback”
              | Support      | “there can be videos to explain to the users what they can do (in order to provide feedback)”
              | Rewards      | “the system should do some favor to the user who gave useful feedback that helped enhance the system”; “trying new features or versions for free if the user gave good feedback”; “users who gave useful feedback can be accredited to the use to increase their reputations”
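To illustrate how the engagement and involvement themes above (recognition, transparency about the handling cycle, privacy, response, and rewards) might be reflected in a feedback platform, here is a small hypothetical Python sketch; none of these names or mechanisms come from the paper.

```python
from dataclasses import dataclass
from enum import Enum


class Visibility(Enum):
    """Privacy theme: the user controls who can see their feedback."""
    ANONYMOUS = "anonymous"
    DEVELOPERS_ONLY = "developers_only"
    PUBLIC = "public"


class Status(Enum):
    """Response/Transparency themes: where the feedback is in its handling cycle."""
    RECEIVED = "received"
    UNDER_REVIEW = "under_review"
    IMPLEMENTED = "implemented"
    DECLINED = "declined"


@dataclass
class FeedbackRecord:
    feedback_id: str
    user_id: str
    visibility: Visibility
    status: Status = Status.RECEIVED
    reward_points: int = 0


def acknowledge(record: FeedbackRecord) -> str:
    """Recognition theme: a friendly confirmation for participation."""
    return (f"Thanks! Your feedback {record.feedback_id} was received "
            f"and is now '{record.status.value}'.")


def reward_if_useful(record: FeedbackRecord, was_useful: bool) -> None:
    """Rewards theme: credit users whose feedback helped enhance the system."""
    if was_useful:
        record.reward_points += 10


rec = FeedbackRecord(feedback_id="fb-001", user_id="u42",
                     visibility=Visibility.ANONYMOUS)
print(acknowledge(rec))
reward_if_useful(rec, was_useful=True)
print(rec.reward_points)  # 10
```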
Research Challenges (1/2)
• Translating evaluation metrics and users’ judgments into terms and language that users can understand.
• Metrics to decide the right crowd in terms of engagement and expertise in the requirements and software functionalities to be evaluated.
• Ensuring privacy when the evaluation is established as a collaborative activity.
• Aggregating highly diverse feedback coming from a large-scale crowd with different interests and expertise (a toy illustration follows this list).
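For the aggregation challenge in the last bullet, the toy Python sketch below shows only the simplest possible case, counting predefined keywords and averaging ratings per feature over hypothetical data; aggregating genuinely diverse, free-form feedback from a large crowd is far harder and remains the open problem the slide refers to.

```python
from collections import Counter, defaultdict
from statistics import mean

# Each item: (feature, rating 1..5, chosen keywords). Purely illustrative data.
feedback = [
    ("search", 2, ["slow", "irrelevant results"]),
    ("search", 3, ["slow"]),
    ("export", 5, []),
    ("search", 1, ["crashed", "slow"]),
]

ratings = defaultdict(list)
keywords = defaultdict(Counter)
for feature, rating, kws in feedback:
    ratings[feature].append(rating)
    keywords[feature].update(kws)

for feature in ratings:
    top = keywords[feature].most_common(2)
    print(f"{feature}: mean rating {mean(ratings[feature]):.1f}, top issues {top}")
# e.g. search: mean rating 2.0, top issues [('slow', 3), ('irrelevant results', 1)]
```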
Research Challenges (2/2)
• Balancing the user-friendliness of interaction methods against the precision of the collected evaluation feedback.
• Balancing reward mechanisms, which should incentivize the crowd, against the quality of the feedback provided.
• Capturing the context of use in which the software was used and evaluated, to increase the expressiveness of feedback.
Conclusion
• This paper proposed the systematic development of a crowdsourcing-based solution to software evaluation.
• While the concept of crowdsourcing is promising, given the increasing complexity and diversity of contexts of current systems, there is still a lack of foundations on how to engineer it, ensure its correctness, and maximize its quality.
• This paper focused on the activity of interacting with users and obtaining their feedback on software quality as one important step towards a holistic approach to crowdsourcing software evaluation.
Acknowledgements
• The research was supported by an FP7 Marie Curie CIG grant (the SOCIAD project), by Bournemouth University’s Fusion Investment Fund (the BBB, BUUU and VolaComp projects), and by the Graduate School Santander Grant for PGR Development.