Google Prevents AI Gemini Chatbot From Discussing Election Speak

Tyler Cross

Published on: March 13, 2024

Google has barred its Gemini AI model from answering election-related questions during the 2024 election cycle. Multiple tech companies, including Google, have made massive breakthroughs in AI technology over the past year. However, AI models have become powerful enough to enable large-scale election interference and fuel political propaganda campaigns if they fall into the wrong hands.

With AI becoming more widely available, US voters are understandably concerned that companies like Google could use their AI models to change the course of elections by spreading misinformation. Many users have already reported that models like Gemini and ChatGPT show clear political biases in their writing.

Alphabet (Google’s parent company) is addressing these concerns by reinforcing its principles: it has completely barred Gemini from discussing elections during the 2024 election cycle.

If you ask Gemini election-related questions about Donald Trump, Joe Biden, or other candidates, the chatbot simply replies, “I’m still learning how to answer this question. In the meantime, try Google Search.”

These restrictions were first announced back in December. Google explained that it wanted to put these defenses in place early, so bad actors (or even the company itself) wouldn’t get a chance to use the chatbot to spread misinformation.

“In preparation for the many elections happening around the world in 2024 and out of an abundance of caution on such an important topic, soon we will restrict the types of election-related queries for which Gemini will return responses,” said a Google spokesperson.

Gemini’s tightened election restrictions stand in stark contrast to OpenAI’s recent decisions. The company behind ChatGPT has earned some bad press after walking back two important safety clauses. ChatGPT’s terms of service initially didn’t allow its use in any political campaigns or by the military.

Over time, OpenAI altered its terms of service to prohibit only “risky” uses of ChatGPT in political campaigns and to let the military use it for anything but building weapons.
