
Spotlight Interview: Dr. Jennifer King on Dark Pattern Design

Peter Tanham

Data Policy Manager, Meta & TTC Labs

Dr. Jennifer King has been working at the sharp end of data privacy since the days of dial-up modems. In many ways she’s a trailblazer: one of the first to combine a commercial background with academia to champion digital privacy.

Right now, much of her work centres on dark patterns, and on how policy makers and UX designers can combat them.

“In many ways I was ahead of my time,” she says. “I had decided to undertake a PhD after seven years working in industry, and fully expected to go back to the business world. It was my supervisor who suggested I stay in the public sector. I’ve always been interested in free speech, governance and privacy, and here I’m able to focus on how that aligns to UX and policy.”

Dr Jen King

Some Things Never Change

‘Here’ is Stanford University, where Jennifer is the Privacy and Data Policy Fellow at the Stanford Institute for Human-Centered Artificial Intelligence.

“I began working at start-ups and companies like Yahoo in the days of Web 1.0,” she says. “So I saw some of the earliest privacy issues close up. What’s striking in this age of AI is that many of those challenges are the same.” 

One of those same things is, of course, dark patterns.

Dark Patterns Are Older Than You Think

Dark patterns lead users into making decisions they might not have made on their own. And while they could be seen as a purely digital issue, they are actually the result of three trends stretching back 30 years: deceptive retail practices, the behavioural economics idea of ‘nudging’, and growth hacking.

When it comes to modern dark patterns, you can find them anywhere: sales countdown clocks, hidden opt-out buttons, or being forced to link a phone number to a social media app. Sometimes they are accidental, the result of a developer rushing to build what works rather than what’s right; other times they are purposefully deceptive.

“Some of the worst are those where an entire business model depends on a dark pattern for its revenue,” says Jennifer. “I’ve worked with the Federal Trade Commission in the US on several of these cases; one had a pre-selected checkbox that was hidden at the bottom of a very long page. It was almost impossible to find, but once the company was told to remove it, the business went bust.”

Designers Can’t Make a Stand on Their Own

But dark patterns are not only seen in dubious page design. Perhaps unsurprisingly, they’re increasingly found in algorithms. 

“The original work that Harry Brignull did in identifying dark patterns was important, but that was 15 years ago now,” says Jennifer. “In the last five years we’ve done more work, especially around algorithmic manipulation and UIs that are driven by AI.”

The big question, of course, is what can be done when there is no deceptive intent, when a dark pattern is introduced accidentally by a designer.

“Education is part of the issue,” she continues. “But designers are not the source of power in a company. It’s difficult for them to take an ethical stand, as they would probably be out of a job. They’re professional, but not professionalised; there’s no licence you need to be a designer, or consistent education. I think there needs to be a professional group that comes up with a code of ethics, which would then allow designers to point to a decision and say ‘what you are asking me to do is not cool’.”

Regulation Could Be Around the Corner

So, if designers are hamstrung, what about the people who lead companies and make the final decisions? 

"Unfortunately I think it is more about sticks than carrots,” says Jennifer. “Designers usually answer to the engineering, product and marketing teams, and they’re all looking for things like high conversions. So at the minimum you need guidance, while at the maximum you need regulation. In-house lawyers need to be able to step in and say ‘hey, this is risky’, but global regulators have not laid down what’s permissible and what’s not.”

Will it happen? Jennifer thinks yes. 

“I’m not sure how though,” she adds. “Perhaps as part of a bigger piece of legislation. There are two acts in California, for instance, that contain dark pattern provisions. And so does the EU’s proposed Digital Services Act.”  

The Fightback Heats Up

To wrap up: while dark patterns remain an ongoing issue, what’s also obvious from talking to Jennifer is the breadth of interest in dealing with them.

“At Stanford we’ve just taken over the Dark Patterns Tip Line, which allows the public to report any examples they find,” she says. “We also run a Policy Lab for our undergrad and graduate students, who will work to develop clear guidance for businesses and regulators. It’s the first of its kind in the world and we’re seeing great interest from across the university’s student population in being involved.” 

For more information about the Dark Patterns Tip Line, visit: https://darkpatternstipline.org

Peter Tanham

Data Policy Manager, Meta & TTC Labs

Peter is a Data Policy Manager at Meta and TTC Labs, based in Dublin. Before joining Meta he ran an analytics company and worked on transparency advocacy and political campaigning.

TTC Labs is a cross-industry effort to create innovative design solutions that put people in control of their privacy.

Initiated and supported by Meta, and built on collaboration, the movement has grown to include hundreds of organisations, including major global businesses, startups, civic organisations and academic institutions.