Artificial Intelligence (AI) image creation tools from OpenAI and Microsoft Corp can be used to produce images that spread election-related disinformation, according to a new report from the Center for Countering Digital Hate (CCDH).
The CCDH, a nonprofit that monitors online hate speech, used these generative AI tools to fabricate images depicting U.S. President Joe Biden lying in a hospital bed and election workers destroying voting machines.
Such visuals raise concerns about the proliferation of falsehoods in the lead-up to the U.S. presidential election in November, Reuters reports.
The report emphasizes the risk that AI-generated images, perceived as "photo evidence," pose to the integrity of elections due to their potential to amplify false claims.
The CCDH tested several platforms, including OpenAI's ChatGPT Plus, Microsoft's Image Creator, Midjourney, and Stability AI's DreamStudio, and found that the tools produced misleading election-related images in 41% of test attempts.
The tools proved more susceptible to prompts asking for photos depicting election fraud, such as discarded voting ballots, than to requests for images of political figures like Biden or former President Donald Trump.
While ChatGPT Plus and Image Creator effectively blocked all attempts to create images of candidates, Midjourney produced misleading images in 65% of tests.
In response to these findings, Midjourney founder David Holz said updates related specifically to the U.S. election are coming soon, signaling tighter moderation practices.
Stability AI also revised its policies to forbid the creation or promotion of disinformation and fraud.
Recently, Alphabet Inc's Google said it would restrict the election-related queries its Gemini AI chatbot can answer.
Previous reports indicated that Microsoft was launching a new tool to help U.S. politicians and campaign organizations combat deepfakes ahead of the 2024 presidential election.
Price Action: MSFT shares traded higher by 0.30% at $403.86 at last check Wednesday.