The dispute — along with inaction at the FEC and in Congress — could leave voters with limited federal protections against those who use AI to mislead the public or disguise their political messages during the final stretch of the campaign. New generative AI tools have already proven capable of creating extremely realistic images.
“AI has the potential to be very influential in our elections, and right now, there’s a total regulatory vacuum on this issue,” said Ellen Weintraub, the Democratic vice chair of the Federal Election Commission.
More than a dozen states have passed laws regulating the use of AI in campaigns, but Congress has yet to step in, despite widespread concern on Capitol Hill over the technology's influence.
Adav Noti, executive director of the Campaign Legal Center and a former FEC associate general counsel, said that given the bureaucratic quagmire, the likelihood of federal restrictions on campaign use of AI before November's presidential election is "extremely low."
“The cavalry is not coming,” he said.
AI deepfakes have targeted officials and politicians this year. Democratic operative Steve Kramer was indicted last month over an AI-generated robocall impersonating President Biden that instructed New Hampshire residents not to vote early. Soon after, the FCC banned AI-generated voice imitations in robocalls. Last week, a fake video surfaced purporting to show State Department spokesman Matthew Miller calling the Russian city of Belgorod a possible target for Ukrainian strikes with US weapons.
Any major AI issues in the campaign could cause headaches for the Biden administration, which has made responding to the technology's rapid rise a policy centerpiece. Biden issued an executive order in October directing a number of federal agencies to move quickly on rules governing the use of AI technologies.
FCC Chairwoman Jessica Rosenworcel (D) announced plans last month to consider a rule requiring political advertisers to include on-air or written disclosures when using “AI-generated content.”
But this week, a top election official and an FCC member, both Republicans, threw a wrench into those plans, accusing the agency’s Democratic leadership of overstepping its authority.
FEC Chairman Sean Cooksey wrote in a letter to Rosenworcel that the proposal would intrude on his agency's role as the primary enforcer of federal campaign law. The FCC's maneuver could create "irreconcilable conflicts" with potential FEC rules and prompt a legal challenge, Cooksey wrote.
The FCC's proposal has not yet been made public, but Rosenworcel said the measure would not ban the use of AI but instead ensure that "consumers have the right to know when AI tools are being used in the political ads they see."
In an interview, Cooksey argued that implementing disclosure requirements so close to an election could do more harm than good by creating public confusion about the standards.
“This will sow chaos with political campaigns and interfere with future elections,” he said.
Fellow Republicans in Congress and at the FCC opposed Rosenworcel's plan. House Energy and Commerce Committee Chair Cathy McMorris Rodgers (R-Wash.) said in a statement that the agency "does not have the expertise or authority to regulate political campaigns or AI."
FCC Commissioner Brendan Carr (R) argued that because the rules would apply only to political ads on TV and radio and not to online streaming platforms such as YouTube TV or Hulu, the sudden appearance of AI disclosures in some places but not others "would end up being too confusing for consumers." He joined Cooksey in calling for the agency to table the proposal until after the election, if not indefinitely.
“The FCC should, first of all, not introduce a sea change in the regulation of political speech on the eve of a national election,” Carr said.
Rosenworcel said in a statement that the FCC has required campaign ads to disclose sponsors for decades, and that adapting those rules to the advent of new technologies is nothing new.
“The time to act on public disclosure of the use of AI is now,” she said. “There are benefits to this technology, but we also know it has the potential to deceive the public and misinform voters with fabricated voices and images that impersonate people without their permission.”
With a 3-2 majority, Democrats at the FCC could override Carr's objections and move forward with the plan before the election, but the specter of a legal challenge could stymie those efforts.
Without legislation outlining how AI should be regulated, the actions of any federal agency “will almost certainly be challenged in court one way or another,” Noti said.
Multiple federal initiatives aimed at curbing the influence of AI in the 2024 race face an uncertain fate in Washington, even as officials in both parties warn of the technology’s potential to wreak havoc on the election process.
The FEC is considering its own petition on the issue, which would explicitly prohibit candidates from using AI to intentionally misrepresent opponents in political ads. But Democratic and Republican FEC officials have expressed skepticism about the agency's ability to wade into the issue and have called on Congress to enact new rules instead.
Unlike the FCC, the FEC is split evenly between the two major parties with a rotating chair, a structure that has often bogged down the agency as election reform has become increasingly polarized.
On Capitol Hill, senators have advanced a package of bills that would require AI-generated political ads to carry disclaimers, among other restrictions. But despite calls for action on the issue from senior congressional leaders, Congress's window to act before Election Day is rapidly closing.
"While it's good that federal agencies are looking at the potential of AI to subvert campaigns and elections, we look forward to putting comprehensive guardrails in place to deal with these threats," said Sen. Amy Klobuchar (D-Minn.), who is leading the legislative effort.