OpenAI Extends a Gesture of Goodwill to Artists Nervous About Fueling AI Algorithms

OpenAI is currently facing legal challenges from artists, writers, and publishers who claim their work was used without permission to train AI systems like ChatGPT. To address these concerns, the company has announced the upcoming launch of Media Manager in 2025. This tool will allow content creators to opt out their work from OpenAI’s AI development process. The goal is to give creators more control over how their work is used in machine learning research and training.

OpenAI stated in a blog post that the Media Manager tool will enable creators to specify which works should be included or excluded from AI development. The company is collaborating with creators, content owners, and regulators to develop the tool, aiming to establish an industry standard.

However, questions remain about how the Media Manager tool will function. It is unclear whether content owners will be able to make a single request covering all of their works, and whether requests can apply to models that have already been trained. Researchers are also still exploring machine “unlearning,” a technique that could allow AI systems to retroactively remove the influence of specific training data.

Industry experts, like Ed Newton-Rex from Fairly Trained, welcome OpenAI’s efforts to address these issues but emphasize the importance of implementation details. The effectiveness of the Media Manager tool in supporting artists will depend on how it is executed. Newton-Rex raised concerns about whether the tool is merely an opt-out mechanism or if it signifies a broader shift in OpenAI’s practices.

OpenAI has not disclosed specific details on how the Media Manager tool will operate or whether other companies will be able to use it. Jordan Meyer of Spawning, a startup that offers a Do Not Train registry for creators, expressed willingness to collaborate with OpenAI on similar initiatives to streamline opt-out processes across AI projects.

Overall, OpenAI’s move to introduce the Media Manager tool reflects a growing trend in the tech industry to provide creators with more control over the use of their work and personal data in AI development. Other companies, like Adobe and Tumblr, offer similar opt-out tools, highlighting the increasing importance of addressing ethical concerns in AI projects.

Kate Knibbs
Kate Knibbs is a senior writer covering culture. She was previously a writer at The Ringer and Gizmodo.
