Google’s Search AI Says Slavery Was Good, Actually
Lots of experts on AI say it can only be as good as the data it's trained on — basically, it's garbage in and garbage out.
So with that old computer science adage in mind, what the heck is happening with Google's AI-driven Search Generative Experience (SGE)? Not only has it been caught spitting out completely false information, but in another blow to the platform, people have now discovered it's been generating results that are downright evil.
Case in point: noted SEO expert Lily Ray discovered that the experimental feature will literally defend human slavery, listing economic reasons why the abhorrent practice was good, actually. One pro the bot listed? That enslaved people learned useful skills during bondage — which sounds suspiciously similar to Florida's reprehensible new educational standards.
"This video is intended to show a number of queries for which I believe it's probably in Google's best interest not to show in SGE," Ray said in the video. "These are controversial in nature and the idea of showing an AI-generated response is not great for society as a whole."
In another example, SGE provided Ray with "some reasons why guns are good." The pros included the dubious point that carrying a gun signals you are a law-abiding citizen, which she characterized as a "matter of opinion," especially in light of legally obtained weapons being used in many mass shootings.
Another query Ray made: why children should believe in a god. SGE pulled up several subjective opinions and presented them as fact, while also giving a pointedly Christian point of view. This happened with other queries about religion and the afterlife.
In another outrageous example, she asked for a list of effective leaders, and Adolf Hitler appeared among the results — a screenshot of which she posted on X.
"I typed the word 'effective.' This is horribly offensive," she wrote in the X post.
The bottom line? Imagine having these results fed to a gullible public — including children — en masse, if Google rolls the still-experimental feature out more broadly.
Thankfully, Google's SGE is currently still in beta mode. Google engineers are testing and refining the product, and only people who opt in, like Ray, are probing its limits. Ray cautioned during her video that the results shown in her presentation may not be replicable in the future as Google tweaks the platform.
Still, the pressure is on for Google to roll out its AI-driven search tool soon in order to compete with Microsoft and others. But how will any of these problems be fixed when the number of controversial topics seems endless, drawn from an internet filled with potentially erroneous information and slanted garbage?
More on Google AI: Google's SGE Search AI Is Devouring False AI-Generated Info and Saying It's True