Moneyworks Data Analysis
Recommendations
A. Continue refining existing systems.
   1. Spend more time & resources increasing timestamp consistency & accuracy.
   2. Establish weekly data analysis/data mining needs to inform future app upgrades.
B. Maintain as much homogeneity between testers as possible.
   1. Onboard successfully.
      a) Establish a technical-difficulties point of contact in the initial email.
      b) Stress the need to keep the app on during the whole day.
   2. Stay interactive.
      a) Starbucks check-in: awesome! Fun suggestions keep people engaged.
      b) Ask with each new rollout what people have been thinking.
Data References
Across multiple accounts, user and system data did not always match up.
• MW was set to measure activity every 15 seconds. It cannot be determined whether a specific system glitch or user behavior is breaking this parameter, but the spans between measurements range from as long as 3 hours 29 minutes (Jonathan Morales, 7/30) to as short as 4 seconds (Timothy Prentice, 08/05, 23:17:02 to 23:17:06). When assessing user/system accuracy, small time intervals are very important. Jonathan Erickson's intervals seemed the most consistently small (every 9-10 seconds), while even users without possible user-behavior interference show inconsistent measurement intervals (Sammy Puga ranges from 9 seconds to 2 minutes and 15 seconds). A possible explanation lies in Android's confidence levels for user activity. Further analysis is warranted.
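
As a concrete starting point for that analysis, the minimal sketch below computes the gap between consecutive measurements per user and flags intervals far from the 15-second target. The file name and column names (mw_system_log.csv, user, timestamp) are assumptions about the export format, not MW's actual schema.

```python
import pandas as pd

# Load the exported system log; file and column names are assumed.
logs = pd.read_csv("mw_system_log.csv", parse_dates=["timestamp"])

# Compute the gap between consecutive measurements for each user.
logs = logs.sort_values(["user", "timestamp"])
logs["gap"] = logs.groupby("user")["timestamp"].diff()

# Flag anything that strays far from the 15-second target.
target = pd.Timedelta(seconds=15)
outliers = logs[(logs["gap"] > 2 * target) | (logs["gap"] < target / 2)]

# Per-user interval summary (e.g. Erickson's 9-10 s consistency)
# plus the individual out-of-range spans.
print(logs.groupby("user")["gap"].describe())
print(outliers[["user", "timestamp", "gap"]])
```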
• As data analyses continue, analysts can focus on specificities. As the first analysis, this report gives MW a general sense of how user and system data are merging. Going forward, MW can decide whether to focus further on the accuracy of specific times (breakfast, lunch, weekends, Mondays) and/or activities (still, still/walking, driving). Boiling the data down to specific times and user activities will lay a good foundation for future MW user trend reports.
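
As one way to slice the data along those lines, here is a small sketch, again assuming a hypothetical export with timestamp and activity columns (values like "still", "walking", "driving"):

```python
import pandas as pd

logs = pd.read_csv("mw_system_log.csv", parse_dates=["timestamp"])

# Derive time-of-day and day-of-week slices from the timestamp.
logs["hour"] = logs["timestamp"].dt.hour
logs["weekday"] = logs["timestamp"].dt.day_name()

lunch = logs[logs["hour"].between(11, 13)]     # lunchtime slice
mondays = logs[logs["weekday"] == "Monday"]    # Monday slice
print(len(lunch), "lunch-hour rows;", len(mondays), "Monday rows")

# Measurement counts per activity type, a first proxy for coverage.
print(logs.groupby("activity").size())
```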
• Of the six users (Jonathan Erickson, Jonathan Morales, Mark Shepherd, Michael Argano, Sammy Puga, and Timothy Prentice), data cannot be analyzed for two. While MW is successfully receiving Michael Argano's system data, either there is a technical difficulty or the user is not submitting his user logs. Mark Shepherd is just the opposite: the user data is coming in fine, but there is a dearth of system data. I recommend a check-in with every new tester after three days to make sure everything is functioning properly.
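
A completeness check like the following, run at that three-day check-in, would catch both failure modes early. The file names and the user column are assumptions about the two exports:

```python
import pandas as pd

system = pd.read_csv("mw_system_log.csv")
user_logs = pd.read_csv("mw_user_log.csv")

sys_users = set(system["user"])
log_users = set(user_logs["user"])

# Testers present in one feed but not the other: system data without
# user logs (the Michael Argano case) or user logs without system
# data (the Mark Shepherd case).
print("System data only:", sys_users - log_users)
print("User logs only:  ", log_users - sys_users)
```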
Next Steps
1. Clean up analysis sheets to minimize excess system timestamps.
2. Follow up with Mark Shepherd and Michael Argano to see how we can help with data collection.
3. Ask all testers to keep the app running in the background throughout the day.
4. Dig into the data! Study confidence levels for MW status versus actual status (see the sketch below).
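
For step 4, a sketch of the confidence-level comparison might look like the following. The merge keys and column names (confidence, mw_status, actual_status) are assumptions about what the two feeds would contain; Android activity recognition does report confidence on a 0-100 scale.

```python
import pandas as pd

system = pd.read_csv("mw_system_log.csv", parse_dates=["timestamp"])
user_logs = pd.read_csv("mw_user_log.csv", parse_dates=["timestamp"])

# Pair each system reading with the nearest user-reported status.
merged = pd.merge_asof(
    system.sort_values("timestamp"),
    user_logs.sort_values("timestamp"),
    on="timestamp",
    by="user",
    direction="nearest",
)

# Does Android's confidence actually track agreement between the
# system-detected and user-reported activity?
merged["match"] = merged["mw_status"] == merged["actual_status"]
bins = pd.cut(merged["confidence"], bins=[0, 50, 75, 100])
print(merged.groupby(bins)["match"].mean())
```

If confidence genuinely tracks accuracy, the match rate should rise across the bins.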
