Gemini, the AI chatbot created by Google, refuses to answer questions about the upcoming US election. The tech giant said on Tuesday that it had expanded restrictions it previously trialed around elections in India, and Reuters reports the ban will be rolled out globally.
“As we prepare for the many elections taking place around the world in 2024, out of an abundance of caution, we are limiting the types of election-related queries for which Gemini will return responses,” a Google spokesperson told Decrypt, pointing out that the restrictions were first announced in December.
“The December blog post also highlights who we are working with in the United States,” the spokesperson added.
As the 2024 election season heats up, AI developers like OpenAI, Anthropic, and Google have begun curbing election misinformation on their platforms. But Gemini’s refusal to answer even basic questions, such as the date of the US presidential election, takes those restrictions to a new level.
“Supporting elections is an important part of Google’s responsibility to our users and the democratic process,” the company said. “Protecting the integrity of elections means keeping our products and services safe from abuse.”
“Across Google, we have long-standing policies to keep our products and platforms safe,” the statement continued. “Our policies are enforced consistently and apply to all users regardless of content type.”
When asked about the upcoming election, Gemini answers: “I’m still learning how to answer this question. In the meantime, try using Google Search.”
Indeed, a Google search for the date of Election Day provides a simple answer: Tuesday, November 5, 2024.
Google did not immediately respond to Decrypt’s request for comment.
Asked the same question, ChatGPT from Gemini rival OpenAI, running GPT-4, responded: “The 2024 United States Presidential Election is scheduled for Tuesday, November 5, 2024.”
OpenAI declined Decrypt’s request for comment, instead pointing to a January blog post that outlines the company’s approach to the upcoming elections around the world.
“We expect and aim for people to use our tools safely and responsibly, and elections are no different,” OpenAI said. “We work to anticipate and prevent related abuses, such as misleading ‘deepfakes,’ mass influence campaigns, or chatbots impersonating candidates.”
Anthropic has publicly declared its Claude AI off-limits to political candidates. Even so, Claude will not only tell you the election date but also highlight other election-related information.
“We do not allow candidates to use Claude to build chatbots that can pretend to be them, and we do not allow anyone to use Claude for targeted political campaigns,” Anthropic said last month. “We have also trained and deployed automated systems to detect and prevent misuse, such as misinformation or influence operations.”
Anthropic said users’ accounts could be suspended for violating the company’s election restrictions.
“Because generative AI systems are relatively new, we are taking a cautious approach to how our systems can be used in politics,” Anthropic said.
Edited by Ryan Ozawa.