The human element is often considered the weakest link in security. Although many kinds of humans interact with systems that are designed to be secure, one type is especially important: the security and privacy information workers who develop, use, and manipulate privacy- and security-related information and data as a significant part of their jobs. Security/privacy information workers include:
This workshop aims to develop and stimulate discussion about security information workers. We will consider topics including but not limited to:
Successful submissions to this workshop will be explicitly informed by an understanding of how security/privacy information workers do their jobs, and their results will explicitly address how we understand these workers.
Time | Session
---|---
09:00 – 09:15 | Welcome and workshop agenda |
09:15 – 09:35 | How a Security Adoption Process Model Might Differ for Information Workers
Cori Faklaris (University of North Carolina at Charlotte)
Abstract: Given the behavioral complexities of cybersecurity, information security workers would benefit from a model of security practice adoption that is tailored to the organizational context. In this workshop paper, I describe my research to date to synthesize a preliminary model for U.S. internet users aged 18 or above. I then pose a series of open questions as to how this preliminary model may need to be modified for the security information work context. The discussion will inform follow-up work to examine the process of security information behavior adoption at the level of system users’ policy compliance and workarounds, at the level of senior administrators’ configuration of security affordances for critical infrastructure, and at the level of information workers’ interaction with others in the social role of a tech helper. |
09:35 – 09:55 | Measuring the Effectiveness of U.S. Government Security Awareness Programs: A Mixed-Methods Study
Jody Jacobs, Julie Haney, Susanne Furman (National Institute of Standards and Technology)
Abstract: The goal of organizational security awareness programs is to positively influence employee security behaviors. However, organizations may struggle to determine program effectiveness, often relying on training policy compliance metrics (training completion rates) rather than measuring actual impact. Few studies have begun to discover approaches and challenges to measuring security awareness program effectiveness, particularly within compliance-focused sectors such as the U.S. government. To address this gap, we conducted a mixed-methods research study that leveraged both focus group and survey methodologies focused on U.S. government organizations. We discovered that organizations do indeed place emphasis on compliance metrics and are challenged in determining other ways to gauge success. Our results can inform guidance and other initiatives to aid organizations in measuring the effectiveness of their security awareness programs. While the research focused on the U.S. government, our findings may also have implications for other sectors and countries. |
09:55 – 10:05 | Break |
10:05 – 11:05 | Keynote: Christian Dameff, Medical Director of Cybersecurity, University of California San Diego |
11:05 – 11:15 | Break |
11:15 – 11:35 | WiP: Where’s Eve? Evaluating Student Threat Modeling Performance and Perceptions
Carson Powers, Nickolas Gravel, Maxwell Mitchell, Daniel Votipka (Tufts University)
Abstract: The process of identifying threats and developing mitigation strategies, referred to as Threat Modeling (TM), is an important step in the early phases of secure system development. Despite being highly recommended and sometimes required by federal regulation, there has been limited work investigating developers’ ability to perform this task. In particular, we focus on students to understand how well prepared they are upon entering the workforce and guide future work to improve education in this domain. To answer this question, we conducted preliminary semi-structured interviews asking students to complete a TM exercise while describing their thought process aloud. Our initial results indicate students struggle to identify technically detailed threats and that the concept of repudiation is particularly confusing to students. We conclude with recommendations to guide future work. |
11:35 – 11:55 | Industry Perspectives on Offensive Security Tooling
Kaitlyn DeValk, Matthew Gwilliam, Thomas Hanson, Michael Harrity, Michelle Mazurek (University of Maryland, College Park)
Abstract: The proliferation and public release of offensive security tooling is a hotly contested topic within the information security industry. Perspectives online vary significantly, and often it seems that the most extreme voices are the ones which garner the highest visibility. We believe it is important to study the general perspectives of professionals within the industry on this topic, not just those with preexisting public platforms. With this aim, we conducted a pilot interview study of eight security professionals to gain novel insight into thoughts and opinions from across the industry. We performed qualitative analysis to distill our results into themes. We used these themes, and our process, to make recommendations for future work surrounding this discourse. |
11:55 – 12:25 | Birds of a Feather |
12:25 – 12:30 | Closing remarks |
We solicit papers describing new research contributions in this area as well as case studies, work in progress, preliminary results, novel ideas, and position papers. Papers should be at most six pages (excluding references) using the SOUPS template format (MS Word or LaTeX).
Submissions should be fully anonymized. Submissions may be made at https://wsiw2022.usenix.hotcrp.com/.
A word about paper length: papers should be succinct, but thorough in presenting the work. Typical papers will be 5–6 pages long (plus references), but papers can be shorter (e.g., 2–3 pages) if, for example, they present a novel idea with limited preliminary results or a position likely to drive a lively discussion. Shorter, more focused papers are encouraged and will be reviewed like any other paper. If you only need 2 or 4 pages (plus references) to clearly explain your work or idea, please submit a paper of that length. Reviewers will be instructed to assess the value of the talk to the workshop audience irrespective of paper length; however, we stress again that the presentation should be sufficiently thorough for reviewers to make this evaluation.
Workshop papers will be made available to attendees prior to the workshop; however, they will not appear in the official SOUPS proceedings. Paper presentations will be approximately 10–12 minutes in length, followed by 5 minutes of questions and answers. Presentations are expected to be made in person, but remote presentations may be accommodated.
The workshop will feature a keynote talk and paper presentations, as well as breakout sessions to provide an opportunity for smaller group interactive discussion about related topics of interest, which may include methods, challenges, and future directions in security information workers research.
The deadline for submissions is Thursday, June 2, 2022 (extended from May 26), 23:59 AoE (Anywhere on Earth).
You can find out more at http://security-information-workers.org/ or by emailing soups22-wsiw@usenix.org.
WSIW'22 will be held in person. Registration costs $75 and can be completed through the same mechanism as registering for SOUPS.
Workshop paper submission deadline | Thursday, June 2, 2022 (extended) |
Workshop paper acceptance notification to authors | Thursday, June 9, 2022 |
Workshop camera-ready papers due | Thursday, June 23, 2022 |
Workshop | Sunday, August 7, 2022, 9:00 am – 1:00 pm |