
OpenAI Deleted An Iranian Operation Interfering In The US Election

OpenAI on Friday removed a network of Iranian accounts that had used its ChatGPT chatbot to generate long-form articles and social media comments aimed at influencing the US presidential election from abroad.

Takeaway Points:

  • OpenAI removed a network of Iranian accounts that used ChatGPT to generate long-form articles and social media comments intended to influence the US presidential election from abroad.
  • OpenAI said it had identified a dozen accounts on X and one on Instagram involved in the effort.
  • According to the report, the Iranian operation marks the latest possible indication that foreign operatives are still working on how to capitalise on AI tools that can quickly spit out convincing writing and images for little to no cost.

OpenAI deletes a network of Iranian accounts

The accounts produced content that appeared to come from people with both liberal and conservative leanings. Some of the posts suggested that former US President Donald Trump was being censored on social media and was preparing to declare himself king.

Another post referred to Vice President Kamala Harris’s choice of Tim Walz as her running mate as a “planned choice.”

The influence campaign, which also included posts about the Israel-Gaza war, the Olympic Games in Paris, and fashion and beauty subjects, does not appear to have received significant audience engagement, said Ben Nimmo, principal investigator on OpenAI’s Intelligence and Investigations team, in a press briefing.

The Iranian operation is the latest suspicious social media effort to use AI yet fail to gain much traction, a possible sign that foreign operatives are still working out how to capitalise on AI tools that can quickly produce convincing writing and images at little to no cost.

Groups allegedly connected to Iran’s government have stepped up cyber-influence efforts ahead of the US presidential election this month, according to Microsoft.

“They’ve laid the groundwork for influence campaigns on trending election-related topics and begun to activate these campaigns in an apparent effort to stir up controversy or sway voters, especially in swing states,” a blog post from Microsoft’s general manager for threat analysis, Clint Watts, said.

AI tools misuse

Earlier this year, Meta said it had removed hundreds of Facebook accounts associated with influence operations from Iran, China, and Russia, some of which relied on AI tools to spread disinformation.

Google said in a research report this week that it had observed a government-supported Iranian hacking group targeting “high-profile users in Israel and the US, including current and former government officials, political campaigns, diplomats, individuals who work at think tanks, as well as NGOs and academic institutions that contribute to foreign policy conversations.”

OpenAI said it had identified a dozen accounts on X and one on Instagram involved in the effort.

The disclosure comes after Mr. Trump’s campaign accused Iran of hacking it, following a Politico report that the outlet had begun receiving emails in July from an anonymous account containing internal campaign documents.

The FBI has opened an investigation into the alleged hack. Iran has denied involvement.

The US intelligence community has consistently warned about foreign governments trying to shape Americans’ opinions, with the Office of the Director of National Intelligence saying in July that Iran, Russia, and China were recruiting people in the US to spread their propaganda.

OpenAI said in May that networks from Russia, China, Iran, and Israel had tried using the company’s AI products to enhance their propaganda efforts.

It said the networks it disrupted had used AI to generate text and images in greater volume than human creators could have managed alone, helping the content appear more authentic.

However, these campaigns also failed to generate significantly more engagement, according to the start-up.






