Description: In early March 2024, a network known as CopyCop began publishing AI-modified news stories, altering content from legitimate sources to inject partisan bias and disinformation. The articles were reportedly rewritten using large language models, allegedly including OpenAI's ChatGPT, to disseminate Russian propaganda. Over 19,000 articles were published, targeting divisive political issues and promoting false narratives.
Entities
Alleged: OpenAI and ChatGPT developed an AI system deployed by CopyCop and a Russia-linked network, which harmed the general public, journalism, and democracy.
Incident Stats
Incident ID
680
Report Count
2
Incident Date
2024-03-01
Editors
Incident Reports
Reports Timeline
recordedfuture.com · 2024
In early March 2024, Insikt Group identified a malign influence network, CopyCop, skillfully leveraging inauthentic media outlets in the US, UK, and France. This network is suspected to be operated from Russia and is likely aligned with th…
economist.com · 2024
In the 1980s the KGB had a well-worn method for pumping disinformation around the world. "We preferred to work on genuine documents," recalled Oleg Kalugin, a former KGB general, "with some additions and changes." That method has not change…
Variants
A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.
Similar Incidents
Did our AI mess up? Flag the unrelated incidents
Gender Biases in Google Translate
· 10 reports