

Wednesday, 19 April 2023

Joint statement on generative AI


Authors and Performers Call for Safeguards Around Generative AI in the European AI Act

WE ARE authors, performers and copyright holders from industries such as photography, journalism, music, film and TV, books and texts, illustration and the fine arts. We are represented by over 40 associations and trade unions that have joined the Authors’ Rights Initiative. There are several million creatives like us in Europe; we all make our living from our creative work in culture, the creative industries and the media. Our work is the starting point for material and immaterial value creation, which we consider an indispensable pillar of Europe’s economy and community of values.

WE WARN, together with our partners in the book, music, film, broadcasting and press industries for which we work: pillars of our society are under threat! AI-generated products intervene directly in social life; the inherent disinformation and manipulation potential of generative AI systems poses profound challenges to every individual and to society as a whole. We share the concern, increasingly voiced even by AI experts, about a loss of control over such systems, and we share the calls for legal restraints. It therefore comes as a great surprise to us when some in politics nevertheless claim there is "no need for action".

The envisaged European AI Act, which is entering the trilogue these days, not only disregards our (copy)rights; it is on its way to permitting generative AI systems under minimum requirements that fail to do justice to the misuse of such systems and to their social and economic implications, which can already be observed today.

The output of AI systems depends on the input they are trained on; this includes texts, images, videos and other material from authors, performers and other copyright holders: our entire digital repertoire serves training purposes, often without consent, without remuneration and not always for legitimate uses. The unauthorised use of protected training material, its non-transparent processing, and the foreseeable substitution of the sources by the output of generative AI raise fundamental questions of accountability, liability and remuneration, which need to be addressed before irreversible harm occurs.

Our position paper analyses these developments and proposes practical and constructive solutions. As the past weeks have shown, any meaningful AI regulation requires specific guardrails for generative AI. Citizens and society must be at the centre of any policy decision.

WE DEMAND:

Regulation of generative Artificial Intelligence –

to limit foreseeable damage for European society, economy and culture.

Regulation NOW –

because the window of opportunity for effective regulation of market entry will soon close.

Kind regards

Your Creators and Performers

A. Executive Summary

We, the undersigned 43 guilds and unions, representing thousands of authors, performers and creative copyright holders from various industries, are calling for effective regulatory measures to address the serious harm that can be caused if generative AI is placed on European markets at scale. While we welcome the latest proposals to include specific provisions on general-purpose AI, we do not think they are anywhere near enough to protect the digital ecosystem and society at large.

In Chapter B we submit a joint statement outlining the high risks that generative AI poses to society and why such risks call for particular legal safeguards in the AI Act. We find these observations confirmed by the first-hand experience of our members in their respective industries, as outlined in Chapter C.

Based on our analysis, we call upon all parties to the trilogue (Parliament, Commission and Council) to consider adding the following amendments to the AI Act:

· Generative AI must be regulated across the entire product cycle, with particular focus on providers of foundation models (large language models and other large foundation models).

· The placing of such foundation models on European markets should be conditioned on providers demonstrating that they fulfil the following minimum requirements:

o full transparency about the training material used;

o sufficient resilience of the training material in terms of veracity, accuracy, objectivity and diversity, in particular documentation that an adequate share of the training material

- originates from European sources;

- originates from professional sources, as compared to user-generated or illegal content;

o evidence of a legal basis for the collection and use of the training material, both for personal data (under the GDPR) and for non-personal data (under European copyright law), including the adoption of, implementation of, and adherence to an effective and workable system for granular, machine-readable communication of usage rights;

o liability for all content generated and disseminated by the AI, in particular for infringement of personal rights and copyrights, misinformation or discrimination;

o no algorithmic or other promotion of AI-generated content over human-generated content or defamation of the latter, and reasonable measures to prevent users’ overreliance on AI content;

o structural separation of the generation and the dissemination of AI output: providers of foundation models shall not simultaneously operate core platform services for the distribution of digital content as defined in the Digital Markets Act, in particular search engines or social media;

o a minimum of continental compute infrastructure: a minimum share of the inference of generative AI systems should run on computing infrastructure located in Europe, with the share of domestic data processing increasing over time.

Adjusting the AI Act to today’s realities is the first indispensable step. As a second step, we call for a re-balancing of interests in copyright law. In particular, it should be clarified that the text and data mining exceptions laid down in Articles 3 and 4 of the DSM Directive ((EU) 2019/790) never allowed generative AI systems to substitute their sources without any compensation.

Download the full joint statement on generative AI below (English and German versions).

Joint Statement: Authors and Performers Call for Safeguards Around Generative AI (PDF, 670.56 KB)

IU Stellungnahme: Urheber und Künstler fordern Schutz vor generativer KI (PDF, 668.21 KB)

Press contact: info@urheber.info