Crowdsourcing for Usability Testing
Di Liu, Matthew Lease, Rebecca Kuipers, and Randolph Bias
School of Information, University of Texas at Austin
{ericliu, ml, rkuipers, rbias}@ischool.utexas.edu
Document Date: 2012-03-08 20:01:06


File Size: 314.39 KB
