NEW YORK (AP) – Nearly three years after rioters stormed the U.S. Capitol, the false election conspiracy theories that drove the violent attack remain prevalent on social media and cable news: suitcases filled with ballots, late-night ballot dumps, dead people voting.
Experts warn it will likely be worse in the coming presidential election year. The safeguards that attempted to counter the bogus claims the last time are eroding, while the tools and systems that create and spread them are only getting stronger.
Many Americans, egged on by former President Donald Trump, have continued to push the unsupported idea that elections throughout the U.S. can't be trusted. A majority of Republicans (57%) believe Democrat Joe Biden was not legitimately elected president.
Meanwhile, generative artificial intelligence tools have made it far cheaper and easier to spread the kind of misinformation that can mislead voters and potentially influence elections. And social media companies that once invested heavily in correcting the record have scaled back those efforts.
"I expect a tsunami of misinformation," said Oren Etzioni, an artificial intelligence expert and professor emeritus at the University of Washington. "I hope to be proven wrong. But the ingredients are there, and I am completely terrified."
AI DEEPFAKES GO MAINSTREAM
Manipulated images and videos surrounding elections are nothing new, but 2024 will be the first U.S. presidential election in which sophisticated AI tools that can produce convincing fakes in seconds are just a few clicks away.
The AI-generated images, videos and audio clips known as deepfakes have started making their way into experimental presidential campaign ads. More sinister versions could easily spread without labels and fool people days before an election, Etzioni said.
"You could see a political candidate like President Biden being rushed to a hospital," he said. "You could see a candidate saying things that he or she never actually said."
Faced with content that is made to look and sound real, "everything that we've been wired to do through evolution is going to come into play to have us believe in the fabrication rather than the actual reality," said misinformation scholar Kathleen Hall Jamieson, director of the Annenberg Public Policy Center at the University of Pennsylvania.
The Federal Election Commission and Republicans and Democrats in Congress are exploring steps to regulate the technology, but they have yet to finalize any rules or legislation.
A handful of states have passed laws requiring deepfakes to be labeled or banning those that misrepresent candidates. Some social media companies, including YouTube and Meta, which owns Facebook and Instagram, have introduced AI labeling policies. It remains to be seen whether they will be able to consistently catch violators.
SOCIAL MEDIA GUARDRAILS FADE
It was just over a year ago that Elon Musk bought Twitter and began firing its executives, dismantling some of its core features and reshaping the social media platform into what's now known as X.
Since then, he has upended the platform's verification system, leaving public officials vulnerable to impersonators. He has gutted the teams that once fought misinformation on the platform, leaving the community of users to moderate itself. And he has restored the accounts of conspiracy theorists and extremists who were previously banned.
The changes have been applauded by many conservatives who say Twitter's previous moderation attempts amounted to censorship of their views. But pro-democracy advocates argue the takeover has shifted what once was a flawed but useful resource for news and election information into a largely unregulated echo chamber that amplifies hate speech and misinformation.
In the run-up to 2024, X, Meta and YouTube have together removed 17 policies that protected against hate and misinformation, according to a report from Free Press, a nonprofit that advocates for civil rights in tech and media.
In June, YouTube announced that while it would still regulate content that misleads about current or upcoming elections, it would stop removing content that falsely claims the 2020 election or other previous U.S. elections were marred by "widespread fraud, errors or glitches." The platform said the policy was an attempt to protect the ability to "openly debate political ideas, even those that are controversial or based on disproven assumptions."
X, Meta and YouTube also have laid off thousands of employees and contractors since 2020, including some who worked as content moderators.
The shrinking of such teams "sets the stage for things to be worse in 2024 than in 2020," said Kate Starbird, a misinformation expert at the University of Washington.
Meta says on its website that it has some 40,000 people devoted to safety and security. It also frequently takes down networks of fake social media accounts that aim to sow discord.
Ivy Choi, a YouTube spokesperson, said the platform uses its recommendation system and information panels to provide users with reliable election news.
The rise of TikTok and other, less regulated platforms such as Telegram, Truth Social and Gab also has created more information silos online where baseless claims can spread. Some apps, such as WhatsApp and WeChat, rely on private chats, making it hard for outside groups to see the misinformation that may spread there.
"I'm worried that in 2024, we're going to see similar recycled, ingrained false narratives but more sophisticated tactics," said Roberta Braga, founder and executive director of the Digital Democracy Institute of the Americas. "But on the positive side, I am hopeful there is more social resilience to those things."
THE TRUMP FACTOR
Trump's front-runner status in the Republican presidential primary is top of mind for misinformation researchers who worry that it will exacerbate election misinformation and potentially lead to election vigilantism or violence.
The former president still falsely claims to have won the 2020 election.
Without evidence, Trump has already primed his supporters to expect fraud in the 2024 election, urging them to intervene to "guard the vote" to prevent vote rigging in diverse Democratic cities. Trump has a long history of suggesting elections are rigged if he doesn't win and did so before the voting in 2016 and 2020.
That continued wearing away of voter trust in democracy can lead to violence, said Bret Schafer, a senior fellow at the nonpartisan Alliance for Securing Democracy, which tracks misinformation.
"If people don't ultimately trust information related to an election, democracy just stops working," he said.
ELECTION OFFICIALS RESPOND
Election officials have spent the years since 2020 preparing for the expected resurgence of election denial narratives.
In Colorado, Secretary of State Jena Griswold said informative paid social media and TV campaigns that humanize election workers have helped inoculate voters against misinformation.
"This is an uphill battle, but we have to be proactive," she said. "Misinformation is one of the biggest threats to American democracy we see today."
Minnesota Secretary of State Steve Simon's office is spearheading #TrustedInfo2024, a new online public education effort by the National Association of Secretaries of State to promote election officials as a trusted source of election information in 2024.
His office also is planning meetings with county and city election officials and will update a "Fact and Fiction" information page on its website as false claims emerge. A new law in Minnesota will protect election workers from threats and harassment, bar people from knowingly distributing misinformation ahead of elections and criminalize people who non-consensually share deepfake images to hurt a political candidate or influence an election.
In a county north of Green Bay, Oconto County Clerk Kim Pytleski has traveled the region giving talks and presentations to small groups about voting and elections to boost voters鈥 trust.
"Being able to talk directly with your elections officials makes all the difference," she said. "Being able to see that there are real people behind these processes who are committed to their jobs and want to do good work helps people understand we are here to serve them."
___
Fernando reported from Chicago. Associated Press writer Christina A. Cassidy in Atlanta contributed to this report.
___
The Associated Press receives support from several private foundations to enhance its explanatory coverage of elections and democracy. See more about AP's democracy initiative. The AP is solely responsible for all content.