Seamful security (and privacy)
Giving room to reflect on one's privacy needs

There are no universal "problems" in privacy

On the topic of bringing broader notions of "design" to "privacy by design" Wong2019… There's an inclination, however implicit, among usable security and privacy researchers toward seamlessness. In their 2007 NSPW paper, Security Automation Considered Harmful?, Edwards et al. critique usable security's desire to "remove… security decisions from the hands of users," asking, "in what cases does automation actually create more problems than it promises to solve?" Edwards2007 1.

Their frame of "problems" that have "solutions," however, remains universalizing. Not everyone has the same security or privacy needs Pierce2018, yet usable systems "seamlessly" make provisions to protect users from those problems the designers imagine users facing. By nature of the seamlessness, users may be unaware of what security provisions are being made, why, or what consequences they carry.

Unfortunately, users must sometimes take agency over their own security and privacy: designers cannot foresee all users' vulnerabilities across varying contexts of use and re-use. An ongoing challenge for design is to strategically reconfigure user agency, allowing users opportunities to reflect on, and make decisions about, their personal security and privacy needs.

One avenue here is to uncover the "seams," the unexpected revelations of underlying complexity, in secure software. Rather than analyzing these seams as "usability problems" to be "solved," we can view them as design assets—"strategic revelations" that prompt users to think about security, and about the complementary agencies of designer and user in caring for security through the design process.

Some research ideas

The general idea is to analyze some existing seams, talking to people who encounter them (and perform the "artful work" of bridging them). This empirical work should help spur some design guidelines, which should in turn motivate an intentional design intervention.

  1. Analyze intentional seams in Qubes OS, a "reasonably secure" (if highly unfriendly) operating system. For example, you can't copy and paste between security domains—on purpose! You can't fullscreen apps (what if an app pretends to be the OS?). What do these seams achieve, and how do they help users reflect on, or adapt to, their specific vulnerabilities?
  2. Analyze seams in the supposedly "usable" Signal messenger, or Tor browser. For example, in Signal you can't back up or restore messages when you change devices. How do users experience losing messages, and how do they reflect on it? You have to load messages individually when the app starts, which can take a while. How do users reflect on needing to wait for their messages to decrypt?
  3. Find existing, user-introduced seams. For example: using DuckDuckGo to search for certain things; intentionally signing out of Facebook after each use; opening a new browser window after each use. (I, for one, block domains in my hosts file to keep myself from checking email neurotically.) What folk theories of security come up in these intentional seams?
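the hosts-file trick in (3) takes only a couple of lines of configuration. a minimal sketch, assuming the usual Linux/macOS file location and a made-up domain:

```
# /etc/hosts -- map a distracting domain to localhost so it never resolves
# (example entry; substitute whatever you want to "seam off")
127.0.0.1   mail.example.com
```

the seam here is deliberate: every failed page load is a small, visible prompt to notice the habit being interrupted.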

To me, (1) and (2) seem the most promising for generating design guidelines. In (3), we're probably going to get a lot of folk theories about security, which could be an interesting follow-up paper.

How I developed the idea

Seamful design (as a concept)

the term seamful design was coined by none other than Mark Weiser, the progenitor of "seamless" ubicomp systems. from Chalmers2003:

Weiser describes seamlessness as a misleading or misguided concept. In his invited talks to UIST94 and USENIX95 he suggested that making things seamless amounts to making everything the same, reducing components, tools and systems to their "lowest common denominator". 2

as its supposed opposite, we have 'seamfulness.' Inman and Ribes (2019) present the common view on this binary Inman2019:

Roughly, seamless design emphasizes clarity, simplicity, ease of use, and consistency to facilitate technological interaction. In contrast, seamful design emphasizes configurability, user appropriation, and revelation of complexity, ambiguity, or inconsistency.

they're able to trouble this dichotomy. here are the most important points:

1. Design always combines seamful and seamless elements

To be seamless does not mean to be invisible, but to be compatible, mundane, interoperable […] central to seamful design is revelation […] but there is no system that is wholly revealing, wholly configurable for appropriation […] design… must offer configurability to local circumstances while simultaneously presenting some form of parsimony

this sums up many issues in usable privacy and security!

2. People adapt to seams. for example, driving a car or riding a bike becomes tacit knowledge. usable security researchers are familiar with the case of people ignoring security warnings.

3. Seams can be spatial, technological—or temporal. Inman and Ribes connect seamfulness to slow technology, and to Sengers' work on time.

4. Seams allow agency only differentially

user "agency" has always been a large part of the debate around seams. however, Inman and Ribes point out that seams can increase or decrease agency.

…a highly configurable technology, adaptable to unforeseen circumstances and novel uses, revealing of its complex internal computations, transformations, limits and boundaries, too is enabling at times and for some… Contrary to what some authors have asserted, increased or decreased agency is not the bailiwick of either seamful nor seamless design.

indeed! who gets to take advantage of seams in, for example, UNIX, when piping text between bash commands?
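for readers who haven't lived in a shell, a pipeline makes the seams concrete. a small sketch (any text works; the word-counting task is just an illustration):

```shell
# each '|' below is a seam: you can stop at any stage, inspect the
# intermediate output, and reconfigure the rest of the pipeline.
# example: find the three most frequent words in a snippet of text
printf 'the cat sat on the mat\n' \
  | tr -cs '[:alpha:]' '\n' \
  | tr 'A-Z' 'a-z' \
  | sort | uniq -c | sort -rn | head -3
```

of course, taking advantage of these seams presumes fluency with the shell in the first place—exactly the differential agency Inman and Ribes describe.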

Seamful security

remember, according to Inman and Ribes, seams are never permanent (they become tacit), nor complete (designed systems always provide both seams and seamlessness). introducing seams can, at least temporarily, produce room for reflection or "revelation."

can we 'exploit' (sorry) the fact that seams are bounded in space and time to present users with an opportunity to reflect on their security and privacy needs? from these seams, perhaps we can prompt users to make decisions that will impact their security and privacy going forward.

one existing example of seamful security is the simulated phishing campaign! these intentional seams give users (and their managers) a chance to reflect on their susceptibility to social engineering.

Introducing security seams as a design intervention

Footnotes:

1

usable security folks would almost certainly argue that we encounter seams when we, to name one example, ignore or apply software updates. but do these seams allow differentially vulnerable users to reflect upon, perhaps for the first time, their personal security and privacy needs?

2

we can probably argue about what Weiser meant by "reducing [things] to their 'lowest common denominator.'" i'm inclined to look at this activity as potentially fruitful.

Bibliography

  • [Wong2019] Wong & Mulligan, Bringing Design to the Privacy Table, CHI '19, 1-17 (2019). doi.
  • [Edwards2007] Edwards, Poole & Stoll, Security automation considered harmful?, Proceedings New Security Paradigms Workshop, 33-42 (2007). doi.
  • [Pierce2018] Pierce, Fox, Merrill & Wong, Differential Vulnerabilities and a Diversity of Tactics: What Toolkits Teach Us About Cybersecurity, Proceedings of the ACM Conference on Computer Supported Cooperative Work (CSCW '18) (2018)
  • [Chalmers2003] Chalmers, Seamful design and ubicomp infrastructure, Proc. Ubicomp 2003 Workshop At The Crossroads: The Interaction of HCI and Systems Issues in Ubicomp, 4 (2003). link.
  • [Inman2019] Inman & Ribes, "Beautiful Seams": Strategic Revelations and Concealments, 1-14 (2019). doi.

Author: ffff

Created: 2020-03-16 Mon 10:26
