The Writers Guild of America has sent an open letter to the CEOs of the top studios in Hollywood calling on them to take “immediate legal action” against artificial intelligence companies that have used archives of decades’ worth of films and TV shows to train their software.
The open letter comes in response to a Nov. 18 story published by The Atlantic revealing the existence of a data set called OpenSubtitles, which pulled subtitles from every Best Picture Oscar-winning film from 1950 to 2016, as well as thousands of episodes of acclaimed TV shows such as “The Sopranos,” “The Simpsons,” “Breaking Bad” and “Seinfeld,” among others.
The data set has been used by top tech companies such as Apple, Salesforce and Nvidia to help train their AI systems. The article created a firestorm among writers on social media that quickly came to the guild’s attention, according to insiders.
In the letter, the WGA criticized studios for doing “nothing to stop this theft. They have allowed tech companies to plunder entire libraries without permission or compensation. The studios’ inaction has harmed WGA members.”
“Having amassed billions in capital on this foundation of wholesale theft, these tech companies now seek to sell back to the studios highly-priced services that plagiarize stolen works created by WGA members and Hollywood labor,” the guild writes.
The collective bargaining agreement signed by the guild and studios in September 2023 to end a nearly five-month strike does not include specific protections for writers against their work being used in AI training data sets. But there is broader language in the contract that, as the WGA asserts, “expressly requires the studios to defend their copyrights on behalf of writers.”
“MBA Article 50 provides that the studios hold ‘in trust’ rights reserved to certain writers of original works. Writers who have separated rights in those works under Article 16.B retain all other rights in the material, including the right to use the works to train AI systems,” the letter reads. “As holders of those rights in trust, the studios have a fiduciary obligation to protect against the unauthorized use of the works for AI training purposes.”
The full open letter from the WGA West Board of Directors and WGA East Council can be read below.
The November 18 Atlantic article “There’s No Longer Any Doubt That Hollywood Writing is Powering AI” confirms what was already clear to so many: tech companies have looted the studios’ intellectual property—a vast reserve of works created by generations of union labor—to train their artificial intelligence systems. Having amassed billions in capital on this foundation of wholesale theft, these tech companies now seek to sell back to the studios highly-priced services that plagiarize stolen works created by WGA members and Hollywood labor.
The studios, as copyright holders of works written by WGA members, have done nothing to stop this theft. They have allowed tech companies to plunder entire libraries without permission or compensation. The studios’ inaction has harmed WGA members.
The Guild’s collective bargaining agreement—the MBA—expressly requires the studios to defend their copyrights on behalf of writers. MBA Article 50 provides that the studios hold “in trust” rights reserved to certain writers of original works. Writers who have separated rights in those works under Article 16.B retain all other rights in the material, including the right to use the works to train AI systems. As holders of those rights in trust, the studios have a fiduciary obligation to protect against the unauthorized use of the works for AI training purposes.
It’s time for the studios to come off the sidelines. After this industry has spent decades fighting piracy, it cannot stand idly by while tech companies steal full libraries of content for their own financial gain. The studios should take immediate legal action against any company that has used our members’ works to train AI systems.
More to come…