Alphabet Inc. (NASDAQ: GOOG) (NASDAQ: GOOGL) subsidiary Google underscored the importance of empowering users with fact-checking tools amid the rising use of AI at the third edition of the Global AI Summit (GAIN) in Riyadh, Saudi Arabia.
What Happened: Google's managing director for the Middle East and North Africa, Anthony Naccache, spoke about how users can fact-check what they see on the internet themselves, instead of relying solely on big tech companies and media organizations to do so.
Naccache said Google already offers one such tool that has been available for a long time: reverse image search. It lets users upload an image, or point Google Search to a specific one, and look up where else that image appears on the internet.
However, with the advent of AI, it has become increasingly difficult to tell whether an image or piece of text is genuine, or whether it was generated or modified using AI.
Naccache said Google is working with governments and regulatory agencies to address the problem, but he added that educating users is an equally important part of the effort.
Why It Matters: In August 2023, Google DeepMind unveiled SynthID, a technology that embeds an imperceptible watermark into AI-generated images that is designed to be extremely difficult to remove. The tool aims to help users identify AI-generated images with high accuracy.
Moreover, a study conducted by Google DeepMind in June found that the most common misuse of AI is the creation of deepfakes featuring politicians and celebrities. This misuse is almost twice as common as AI-assisted cyberattacks, highlighting the urgent need for effective fact-checking tools.
The urgency of these tools is further underscored by the ease with which AI can be used to craft convincing fake news. In September 2023, a developer showcased a "disinformation machine" using OpenAI's ChatGPT, demonstrating how cheap and easy it has become to spread propaganda on a massive scale.
In May, Meta Platforms Inc. (NASDAQ: META) took down numerous Facebook accounts linked to covert influence campaigns from countries like China, Iran, Russia, and Israel, further emphasizing the need for robust fact-checking mechanisms.