Ending AI theft.

As endings go, theft is a frustrating and disturbing one. Beyond the material theft of a product, like a bike or a phone, the theft of a personal or artistic endeavour is particularly upsetting. It is a common experience for artists across history, who have had to grapple with everyone from individuals copying their work to businesses profiting from it. And once it happens, it is impossible to undo. Every technical leap forward has to be grappled with legally, artistically, and emotionally. What should artists do about AI?

William Hogarth, the 18th-century artist, had such problems with his work A Rake's Progress being plagiarised that he lobbied the British government for the Engravers' Copyright Act of 1735, which then protected artists' work.⁠1

Legislation is always trying to keep up. What seems airtight is often beaten by a technical advance. New types of media routinely undermine laws that were drafted before such things existed.

The internet created enormous foundational challenges, with piracy rife and media copied illegally at scale. Spotify tried to use the new technology to work with music publishers and artists by streaming music. Although it was technically legal, many artists felt they were being exploited.

Cookie con

As the internet grew into a marketing and shopping platform, web cookies emerged as a method of tracking consumer behaviour. Many people were woefully ignorant of what these pieces of code do, yet they clicked the consent button, legally agreeing to surveillance of everything they did on their supposedly “personal” computer, with that data then shared with thousands of marketing organisations. People with the know-how installed cookie blockers on their machines to wrestle back control.
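To make the mechanism concrete, here is a minimal sketch in Python, using Flask, of how a tracking pixel and its cookie work. The domain, endpoint, and cookie name are hypothetical, and real trackers are far more elaborate; this only shows the core trick of minting a persistent ID and reading it back on every later visit.

    # A toy tracker, not any real ad network's code. Serving this
    # "pixel" from a third-party domain lets it recognise the same
    # visitor across every site that embeds it.
    import uuid
    from flask import Flask, request, make_response

    app = Flask(__name__)

    @app.route("/pixel.gif")
    def tracking_pixel():
        # First visit: mint a persistent ID. Later visits: the browser
        # sends the cookie back automatically, tying this person's
        # page views together.
        visitor_id = request.cookies.get("track_id") or str(uuid.uuid4())
        print(f"visitor {visitor_id} viewed {request.referrer}")
        response = make_response(b"")
        # A year-long cookie: one consent click, a year of tracking.
        response.set_cookie("track_id", visitor_id, max_age=31_536_000)
        return response

Cookie blockers work by refusing to store or send back exactly this kind of cookie.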

It is not theft if it is cool tech

Recently, Scarlett Johansson challenged OpenAI over the uncanny similarity of one of its AI voices to her own. The voice was released just after Johansson had refused Sam Altman's request to use her voice. Evidently they went ahead without her permission anyway, because, well, massive tech bro ego. They have since dropped the voice.⁠2

Many of the 30 million⁠3 Adobe users don't have the financial resources that Johansson does. Many of them were deeply disturbed by a recent change in the Terms and Conditions that suggested Adobe owns rights over their work: “Solely for the purposes of operating or improving the Services and Software, you grant us a non-exclusive, worldwide, royalty-free sublicensable, license, to use, reproduce, publicly display, distribute, modify, create derivative works based on, publicly perform, and translate the Content. For example, we may sublicense our right to the Content to our service providers or to other users to allow the Services and Software to operate with others, such as enabling you to share photos” Scary legal jargon for the many people who use Adobe products every day for their work.

Adobe, like many AI companies, needs real-world examples to train its models on. After some of these companies were criticised for using people's art without permission, they created opt-out lists so artists could make sure their artwork was not included in the training data.

Artists argued, I think rightly, that this puts the responsibility on the artist to say ‘don't steal my work for your machine’. There have been many examples of these lists not being honoured by the platforms, so trust is unsurprisingly low.

Creatives are getting angry about the situation. One group has cleverly been poisoning their work against AI training algorithms. In the group's own words, the Nightshade software “turns any image into a data sample that is unsuitable for model training. More precisely, Nightshade transforms images into "poison" samples, so that models training on them without consent will see their models learn unpredictable behaviors that deviate from expected norms, e.g. a prompt that asks for an image of a cow flying in space might instead get an image of a handbag floating in space.”⁠4
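For the technically curious, here is a toy sketch in Python (PyTorch) of the general idea behind this kind of poisoning. It is emphatically not Nightshade itself, whose method is more sophisticated; it just shows the basic trick of nudging an image's pixels, almost imperceptibly, so that a model's feature extractor “sees” a different concept. The file paths and the choice of backbone are illustrative.

    # Toy data poisoning: make a cow photo land near "handbag" in a
    # model's feature space while still looking like a cow to people.
    import torch
    from PIL import Image
    from torchvision import models, transforms

    preprocess = transforms.Compose([
        transforms.Resize((224, 224)),
        transforms.ToTensor(),
    ])

    # A pretrained backbone stands in for the scraper's model's "eyes".
    backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    backbone.fc = torch.nn.Identity()  # keep features, drop the classifier
    backbone.eval()
    for p in backbone.parameters():
        p.requires_grad_(False)

    cow = preprocess(Image.open("cow.jpg")).unsqueeze(0)          # what humans see
    handbag = preprocess(Image.open("handbag.jpg")).unsqueeze(0)  # what the model should see

    with torch.no_grad():
        target = backbone(handbag)

    delta = torch.zeros_like(cow, requires_grad=True)  # the near-invisible nudge
    optimizer = torch.optim.Adam([delta], lr=0.01)

    for step in range(200):
        optimizer.zero_grad()
        poisoned = (cow + delta).clamp(0, 1)
        # Pull the image towards "handbag" in feature space while
        # penalising any pixel change big enough to be noticed.
        loss = torch.nn.functional.mse_loss(backbone(poisoned), target) \
               + 0.05 * delta.abs().max()
        loss.backward()
        optimizer.step()

    out = (cow + delta).clamp(0, 1).detach().squeeze(0).permute(1, 2, 0)
    Image.fromarray((out * 255).byte().numpy()).save("poisoned_cow.png")

Train an image model on enough samples like this, each still labelled ‘cow’, and its idea of what a cow looks like starts to drift.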

Future boundaries

The ebb of legislation protects rights in society, while the new flow of technology breaks the legal boundaries. These transition points are historically common, but they are always difficult, often biasing the benefits towards big businesses. But we have been here before. These are the kinds of counter movements that made way for Organic and Fair Trade labels. Will future Fair Trade companies promise their AI was never trained on artists' work? Will Organic-style ratings include a version of plagiarism scores? Or a certification of ‘no artist was damaged in the making of this AI’?

1 https://www.nationalgallery.org.uk/artists/william-hogarth#:~:text=His%20engravings%20were%20so%20plagiarised,paintings%20in%20the%20grand%20manner.

2 https://www.npr.org/2024/05/20/1252495087/openai-pulls-ai-voice-that-was-compared-to-scarlett-johansson-in-the-movie-her

3 https://techreport.com/statistics/software-web/adobe-statistics/#:~:text=In%202022%2C%20Adobe%20Inc.,other%20creative%20toolkits%20to%20businesses.

4 https://nightshade.cs.uchicago.edu/whatis.html

Joe Macleod
Joe Macleod has been working in the mobile design space since 1998 and has been involved in a pretty diverse range of projects. At Nokia he developed some of the most streamlined packaging in the world, created a hack team to disrupt the corporate drone of PowerPoint, produced mobile services for pregnant women in Africa and pioneered lighting behaviour for millions of phones. For the last four years he has been helping to build the amazing design team at ustwo, with over 100 people in London and around 180 globally, and successfully building education initiatives on the back of the IncludeDesign campaign, which launched in 2013. He has been researching Closure Experiences and their impact on industry for over 15 years.
www.mrmacleod.com