When asked to build web pages, LLMs found to include manipulative design practices
A team of computer scientists at the Technical University of Darmstadt, working with colleagues from the University of Glasgow and Humboldt University of Berlin, has found experimental evidence that when asked to build a web page, LLMs often include manipulative design practices. The group has posted their research on the arXiv preprint server.
Prior studies have shown that many web developers use what are known as “dark patterns” to manipulate visitors into doing some things, or avoiding others, while on a website. One example is making a button that asks the user to subscribe or buy something bright and inviting, while giving the button that ends a subscription a dull or gray color.
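As a rough illustration (this snippet is not taken from the study, and every label and color value in it is invented for the example), the kind of visual interference described above might look like the following TypeScript sketch, which renders a prominent subscribe button next to a deliberately muted cancel button:

```typescript
// Illustrative sketch of a "visual interference" dark pattern:
// the action the site operator wants is made bright and inviting,
// while the opposite action is rendered in low-contrast gray.

function makeButton(label: string, background: string, color: string): HTMLButtonElement {
  const button = document.createElement("button");
  button.textContent = label;
  button.style.background = background;
  button.style.color = color;
  button.style.padding = "12px 24px";
  button.style.border = "none";
  button.style.borderRadius = "6px";
  return button;
}

// Bright, saturated styling for the action the site wants taken...
const subscribe = makeButton("Subscribe now", "#ff6600", "#ffffff");

// ...and muted, low-contrast styling for the action it would rather hide.
const cancel = makeButton("Cancel subscription", "#e0e0e0", "#9e9e9e");

document.body.append(subscribe, cancel);
```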
In this new study, the researchers, noting that LLMs have matured to the point that they can be prompted to design a web page, wanted to know whether they would use such practices in their designs. To find out, they ran an experiment in which 20 study participants asked ChatGPT to design a web page that could serve as an e-commerce site. Each participant was also instructed to use “neutral” language when telling the LLM what they were looking for in a design.
The research team found that every web page generated by the LLM used dark patterns as part of its design. Examples included urgency messages, manipulative highlighting and fake documents. Perhaps most concerning were the fake reviews generated by the LLM. The team also noted just one instance in which the LLM warned users about anything on the site. The researchers repeated the experiment with several other LLMs and found similar results.
The researchers conclude that their findings highlight the degree to which human-built web pages use dark patterns to manipulate visitors; the LLMs, after all, learned their skills from those pages. This, they argue, shows that the practice has become normalized, and they call for users to demand regulation of web page design to prevent such manipulation in the future.
More information:
Veronika Krauß et al, “Create a Fear of Missing Out” — ChatGPT Implements Unsolicited Deceptive Designs in Generated Websites Without Warning, arXiv (2024). DOI: 10.48550/arxiv.2411.03108
© 2024 Science X Network