A Comprehensive Quality Evaluation of Security and Privacy Advice on the Web
Elissa M. Redmiles, Noel Warford, Amritha Jayanti, and Aravind Koneru, University of Maryland; Sean Kross, University of California, San Diego; Miraida Morales, Rutgers University; Rock Stevens and Michelle L. Mazurek, University of Maryland
Distinguished Paper Award Winner
End users learn defensive security behaviors from a variety of channels, including a plethora of security advice given in online articles. A great deal of effort is devoted to getting users to follow this advice. Surprisingly, then, little is known about the quality of this advice: Is it comprehensible? Is it actionable? Is it effective? To answer these questions, we first conduct a large-scale, user-driven measurement study to identify 374 unique recommended behaviors contained within 1,264 documents of online security and privacy advice. Second, we develop and validate measurement approaches for evaluating the quality -- comprehensibility, perceived actionability, and perceived efficacy -- of security advice. Third, we deploy these measurement approaches to evaluate the 374 unique pieces of security advice in a user study with 1,586 users and 41 professional security experts. Our results suggest a crisis of advice prioritization. Most users perceive the majority of advice to be at least somewhat actionable and somewhat comprehensible. Yet, both users and experts struggle to prioritize this advice. For example, experts perceive 89% of the hundreds of studied behaviors as effective and identify 118 of them as among the "top 5" things users should do, leaving end users on their own to prioritize and take action to protect themselves.