FCC Chair Rosenworcel Proposes FCC Require Disclosure of AI Content in Political Ads
The proposed disclosure requirements would ensure consumers are told about AI technology being used in political ads
WASHINGTON, D.C.—As concerns grow about the potential impact of fake AI-generated photos and videos being used to influence elections, Federal Communications Commission Chairwoman Jessica Rosenworcel is proposing that the agency require that the public be informed when AI-generated content is aired in political ads on radio and TV.
The agency stressed that the proposal does not call for any prohibition of such content, only the disclosure of any AI-generated content within political ads.
The FCC said that Rosenworcel has shared a Notice of Proposed Rulemaking with her colleagues at the agency that would initiate a proceeding that recognizes consumers’ right to know when AI tools are being used in the political ads they view.
If adopted, the FCC said that this proposal would increase transparency by:
- Seeking comment on whether to require an on-air disclosure and written disclosure in broadcasters’ political files when there is AI-generated content in political ads,
- Proposing to apply the disclosure rules to both candidate and issue advertisements,
- Requesting comment on a specific definition of AI-generated content, and
- Proposing to apply the disclosure requirements to broadcasters and entities that engage in origination programming, including cable operators, satellite TV and radio providers and section 325(c) permittees.
"As artificial intelligence tools become more accessible, the Commission wants to make sure consumers are fully informed when the technology is used,” said chairwoman Rosenworcel. “Today, I’ve shared with my colleagues a proposal that makes clear consumers have a right to know when AI tools are being used in the political ads they see, and I hope they swiftly act on this issue.”
In announcing the proposal, the FCC said that AI is expected to play a substantial role in the creation of political ads in 2024 and beyond, but that AI-generated content in political ads also creates the potential to deceive voters, in particular through “deep fakes”: altered images, videos, or audio recordings that depict people doing or saying things they did not actually do or say, or events that did not actually occur.
The FCC said that the Bipartisan Campaign Reform Act provides the Commission with authority regarding political advertising. There is also a clear public interest obligation for Commission licensees, regulatees, and permittees to protect the public from false, misleading, or deceptive programming and to promote an informed public – and the proposed rules seek to achieve that goal, the agency said.
If adopted, this proposal would launch a proceeding during which the Commission would take public comment on the proposed rules; the rules would not take effect until that comment and review process is completed and the Commission adopts final requirements.
Ishan Mehta, Common Cause Media and Democracy program director, applauded the idea of requiring disclosure of AI-generated content in political ads.
"Americans expect and deserve to know whether the content they see on our public airwaves is real or AI-generated content – especially as the technology is increasingly being used to mislead voters," Mehta said. "This rulemaking is welcome news as the use of deceptive AI and deepfakes threaten our democracy and is already being used to erode trust in our institutions and our elections. We have seen the impact of AI in politics in the form of primary ads using AI voices and images, and in robocalls during the primary in New Hampshire."
Given those pressures, Mehta added that “we commend the FCC and Chair Rosenworcel for this work to require disclosures for AI-generated content in political ads. It is imperative that regulations around political advertising keep pace with the onward march of new and evolving technologies. We urge Congress and other agencies like the FEC (Federal Election Commission) to follow the FCC’s lead and take proactive steps to protect our democracy from the very serious threat posed by AI. That is why we have previously filed comments with the FEC urging the agency to amend its regulation on ‘fraudulent misrepresentation’ to include ‘deliberately false Artificial Intelligence-generated content in campaign ads or other communications.’”
George Winslow is the senior content producer for TV Tech. He has written about the television, media and technology industries for nearly 30 years for such publications as Broadcasting & Cable, Multichannel News and TV Tech. Over the years, he has edited a number of magazines, including Multichannel News International and World Screen, and moderated panels at such major industry events as NAB and MIP TV. He has published two books and dozens of encyclopedia articles on such subjects as the media, New York City history and economics.