1. The document discusses using crowdsourcing platforms such as Amazon Mechanical Turk to conduct user studies and collect data for human-computer interaction (HCI) research.
2. It describes experiments in which crowdsourced workers rated Wikipedia articles, and these ratings correlated reasonably well with expert ratings; early problems with workers gaming the task were addressed through changes to the task design.
3. It offers tips for using crowdsourcing effectively in HCI research, such as including verifiable questions to ensure response quality, balancing objective and subjective tasks, and considering different incentive mechanisms.
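
The quality-control tip above (using verifiable questions) is often implemented by mixing "gold" questions with known answers into each task and discarding responses from workers who miss too many of them. The sketch below is an illustration of that idea, not code from the document; the function name, data shapes, and the 75% threshold are all assumptions.

```python
def filter_responses(responses, gold_answers, min_correct=0.75):
    """Keep only responses whose worker answered enough of the
    verifiable (gold) questions correctly.

    responses: list of dicts like
        {"worker": str, "gold": {question_id: answer}, "rating": int}
    gold_answers: mapping of question_id -> correct answer
    """
    kept = []
    for r in responses:
        answered = r["gold"]
        # Count how many gold questions this worker got right.
        correct = sum(
            1 for qid, ans in answered.items()
            if gold_answers.get(qid) == ans
        )
        if answered and correct / len(answered) >= min_correct:
            kept.append(r)
    return kept


# Hypothetical example: w2 misses one gold question, w3 misses both.
responses = [
    {"worker": "w1", "gold": {"q1": "a", "q2": "b"}, "rating": 4},
    {"worker": "w2", "gold": {"q1": "c", "q2": "b"}, "rating": 7},
    {"worker": "w3", "gold": {"q1": "c", "q2": "c"}, "rating": 1},
]
gold = {"q1": "a", "q2": "b"}

kept = filter_responses(responses, gold)
# Only w1 passes the 75% gold-accuracy threshold.
```

A design note: making the gold questions look identical to the real ones matters, since workers who learn which questions are checked can game the filter, which is exactly the failure mode the task-design changes in the article were meant to prevent.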