Victor Miller, 42, recently filed paperwork to run for mayor of Cheyenne, Wyoming, but it wasn’t exactly for him: He listed the name of a customized AI chatbot as the candidate to appear on the ballot.
He argued that the chatbot, named VIC – for “virtual integrated citizen” – would use technology from artificial intelligence company OpenAI to make all political decisions and help govern the city. (VIC is not affiliated with a political party.)
AI has “helped me in my life personally … such as helping me with my resume,” Miller told CNN. “I think it could add a layer to help a town. I really want to see that happen.”
On Tuesday, however, OpenAI told CNN it shut down Miller’s access to the tool that was being used to interact with and persuade voters. Using ChatGPT in this way, according to the company, is against its policies.
AI is making politics even more complicated: Government regulators, companies and consumers are still figuring out how to use AI tools responsibly, and the technology is advancing faster than social, legal and regulatory guardrails can be put in place.
“We’ve taken action against these uses of our technology for violating our policies against political campaigning,” an OpenAI spokesperson told CNN.
On its website, OpenAI states it prohibits “engaging in political campaigning or lobbying, including generating campaign materials personalized to or targeted at specific demographics.”
Miller said he was motivated to create VIC after he was denied access to city records about policies and procedures because he made his request anonymously.
“If I was able to ask AI and interact with this new intelligence, it would have known the law and I would have gotten the records,” he said.
The city did not respond to a request for comment on the denial of records, but Wyoming Secretary of State Chuck Gray told CNN in a statement that he is “closely” monitoring Miller’s bid for mayor.
“Wyoming law is clear that, to run for office, one must be a ‘qualified elector,’ which necessitates being a real person,” Gray said in the statement. “Therefore, an AI bot is not a qualified elector.”
He also suggested the chatbot is essentially a fig leaf for a Miller candidacy. Gray said he wrote a letter to the Cheyenne municipal clerk raising his concerns about the bid.
Although the public-facing version of VIC has been removed by OpenAI, Miller said it still works on his own ChatGPT account. He plans to bring it, along with a microphone, to a local Cheyenne library and give voters the opportunity to directly ask it questions via its voice-to-text feature.
OpenAI told CNN it also took action against another candidate in the UK who was using its AI models to help campaign for Parliament. Steve Endacott, chairman of an AI company called Neural Voice, answers questions from voters via AI Steve, a chatbot, on his site. He is running as an independent. Endacott did not respond to a request for comment.
His website offered a ChatGPT-powered chatbot where voters could leave opinions and help create policies. If a voter asked the tool a question about AI Steve’s policies and it didn’t have an answer, it would conduct a search and create a policy suggestion. (While his website continues to operate, the tool is no longer powered by ChatGPT.)
‘Gimmick is the right word’
Although AI chatbots are getting smarter, some experts told CNN the technology should never substitute for human judgment in running any part of government.
“When it comes to AI now and what it will be like in the future, it should never be used to make automated decisions,” said Jen Golbeck, a professor at the College of Information Sciences at the University of Maryland.
“AI has always been designed for decision support – it gives some data to help a human make decisions but is not set up to make decisions by itself.”
The emergence of AI political candidates also comes amid growing concerns about how the spread of misinformation could impact elections. Earlier this year, for example, a fake recording of a candidate in Slovakia saying he rigged the election went viral.
Golbeck said, however, that there may be a place for AI in politics when it comes to helping with various tasks, such as handling forms from constituents or directing them on how to get problems solved.
“You may be able to train a chatbot with all of the knowledge found in an office,” she said. “But the decision making should always be left to humans.”
David Karpf, an associate professor of media and public affairs at George Washington University, agreed, noting that the people behind an AI candidacy are leaning into “a cultural moment” and that such bids shouldn’t be taken seriously.
“Gimmick is the right word for it,” Karpf told CNN. “ChatGPT is not qualified to run your government.”
Karpf said he doesn’t believe lawmakers need to make formal legislation around AI chatbots running for office because “no one is going to vote for an AI chatbot to run a city.”
Karpf believes that the timing of these cases is noteworthy, too. “We have a very serious election coming up, and I don’t mind levity in it,” he said. “And that’s what this is: We should laugh for a minute and get back to work.”
But Miller said he hopes the attention around his efforts inspires more AI candidates in the months ahead.
“I think this can expand beyond the mayor and Parliament, and [reach the whole] world,” he said.