This document describes a methodology for crowdsourcing the assessment of Linked Data quality, drawing on both Linked Data experts and Amazon Mechanical Turk workers. It presents research on detecting three types of quality issues in DBpedia data: incorrect or incomplete object values, incorrect data types, and incorrect outlinks. The methodology follows a two-phase approach: experts identify candidate issues in a "Find" phase, and crowd workers then verify those issues in a "Verify" phase through paid microtasks. The results indicate that crowdsourcing is an effective means of detecting quality issues, and that experts and workers are suited to different tasks depending on the skills each requires.
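
As a rough illustration of how the two-phase workflow could be wired together, the sketch below models the "Find" phase as an expert tagging a triple with one of the three issue types, and the "Verify" phase as rendering that flagged triple into a yes/no microtask a worker can answer without Linked Data expertise. All names here (`Triple`, `IssueType`, `build_microtask`, the example triple) are hypothetical illustrations, not the tooling or data model used in the study.

```python
from dataclasses import dataclass
from enum import Enum


class IssueType(Enum):
    """The three quality issue types targeted in the study."""
    OBJECT_VALUE = "incorrect or incomplete object value"
    DATA_TYPE = "incorrect data type"
    OUTLINK = "incorrect outlink"


@dataclass
class Triple:
    subject: str
    predicate: str
    obj: str


@dataclass
class Finding:
    """Output of the 'Find' phase: an expert flags a triple with an issue type."""
    triple: Triple
    issue: IssueType
    expert_id: str


def build_microtask(finding: Finding) -> dict:
    """'Verify' phase: turn a flagged triple into a yes/no microtask."""
    t = finding.triple
    question = (
        f"A reviewer flagged this statement as having an {finding.issue.value}:\n"
        f"  {t.subject}  {t.predicate}  {t.obj}\n"
        f"Do you agree that this problem is present?"
    )
    return {"question": question, "options": ["yes", "no", "unsure"]}


if __name__ == "__main__":
    # Hypothetical DBpedia triple with a suspicious object value.
    flagged = Finding(
        triple=Triple("dbr:Berlin", "dbo:populationTotal", '"0"^^xsd:integer'),
        issue=IssueType.OBJECT_VALUE,
        expert_id="expert-42",
    )
    print(build_microtask(flagged)["question"])
```

Separating the two phases this way mirrors the paper's division of labor: the expensive expert judgment is spent only on spotting candidate problems, while the cheaper, highly parallel crowd effort is spent on confirming or rejecting them.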