The human element is often considered the weakest link in security. Although many kinds of people interact with systems designed to be secure, one group is especially important: security and privacy information workers, who develop, use, and manipulate privacy- and security-related information and data as a significant part of their jobs. Security/privacy information workers include:
This workshop aims to develop and stimulate discussion about security information workers. We will consider topics including but not limited to:
May 3, 2021, 11:00 – 12:30 Eastern Time
Zoom link: TBD
11:00 – 11:10 | Welcoming Remarks |
11:10 – 11:35 | An overview of cyber security skills in the UK labour market Steven Furnell, University of Nottingham |
Abstract | This talk will present a summary of a recent report addressing cyber security skills in the UK labour market, based on research conducted by Ipsos MORI and published by the Department for Digital, Culture, Media and Sport. Key findings from the study will be outlined, including the areas in which cyber skills gaps are currently identified and the types of roles that are affected. Attention will also be given to the approaches used to address these skills gaps, focusing on recruitment, training, and outsourcing as potential solutions. Additionally, related challenges, such as dealing with staff turnover and ensuring diversity, will be highlighted. |
11:35 – 12:00 | Demystifying Cyber Deception: Measuring Effectiveness of Game-Theoretic Cyber Deception Strategies Faris Kokulu and Sukwha Kyung, Arizona State University |
Abstract | Cyber deception tools and techniques empowered by game theory have continuously evolved, enabling network defenders to analyze adversarial behavior and increase attack cost. However, prior work on game-theoretic deception models lacks empirical validation and verification in terms of security and usability. Even though various game-theoretic cyber deception strategies have been proven effective through mathematical and simulation-based analysis, doubts remain about their effectiveness in networks with actual human attackers, mainly because they often rest on case-specific or even impractical assumptions. The goal of our work is to empirically evaluate these game-theoretic cyber deception strategies to assess their practicality and effectiveness. To that end, we design and perform a set of experiments involving professional cybersecurity experts in various computer network environments, covering different categories of game-theoretic deception strategies. In this talk, we present our categorization of game-theoretic models and our experimental design. We also demonstrate how we analyze the collected data, both quantitatively and qualitatively, to measure the effectiveness of different deceptive strategies. |
12:00 – 12:25 | Deciding on Personalised Ads: Nudging Developers About User Privacy Alisa Frik, International Computer Science Institute and University of California Berkeley |
Abstract | Mobile advertising networks present personalised advertisements to developers as a way to increase revenue. These ads use data about users to select potentially more relevant content, but the framing of the choice also shapes developers' decisions, which in turn affect their users' privacy. Currently, ad networks provide choices in developer-facing dashboards that control both the types of information collected by the ad network and how users will be asked for consent. Framing and nudging have been shown to affect users' privacy choices, and we anticipate that they have a similar impact on choices made by developers. We conducted a survey-based online experiment with 400 participants with experience in mobile app development. Across six conditions, we varied the framing of options around ad personalisation. Participants in the condition where the privacy consequences of ad personalisation were highlighted in the options were significantly (11.06 times) more likely to choose non-personalised ads than participants in the Control condition, which included no information about privacy. Participants' choices of ad type were driven by the impact on revenue, user privacy, and relevance to users. Our findings suggest that developers are influenced by these interfaces and need transparent options. |
12:25 – 12:30 | Closing Remarks |