Behind ChatGPT's Brilliance: The Dark Side of OpenAI's Consumer Data Harvest

In a not-so-surprising revelation, a lawsuit has been filed accusing OpenAI of using consumers' personal information without permission to train and improve ChatGPT, its popular artificial intelligence chatbot.

What Happened: A lawsuit seeking class-action status has been filed against OpenAI, the creator of ChatGPT, alleging that the company ran a clandestine operation to scrape "vast amounts" of personal information from the internet without consent, including from books, articles, websites, and posts, in an alleged pursuit of profit at the expense of privacy laws, Bloomberg reported.

The lawsuit came just months after the Sam Altman-led company launched an "incognito mode" feature designed to keep the conversational chatbot from saving users' conversation records or using them to improve the AI's capabilities.

What Does The Lawsuit Claim: The sprawling, 157-page lawsuit accuses OpenAI of secretly scraping 300 billion words from the internet, including personal information obtained without consent.

The plaintiffs, represented by the Clarkson Law Firm, argue that OpenAI violated privacy laws by engaging in what they describe as "theft" instead of following established protocols for the purchase and use of personal information.

The lawsuit estimates potential damages of $3 billion, citing the harm caused to millions of individuals affected by OpenAI's alleged actions. It also names Microsoft Corporation (NASDAQ: MSFT) as a defendant.

Why It's Important: As AI technology continues to advance, concerns about the use of consumer data to train and improve AI models have intensified. Earlier this year, the Biden administration signaled heightened scrutiny of how companies handle large quantities of user data.

In March 2023, Italy became the first Western country to ban ChatGPT. The country later allowed OpenAI to resume the service, albeit under certain conditions, including granting users the right to object to the processing of their data.