FTC Rule To Protect Consumers From Fake Reviews, AI-Generated Testimonials Comes Into Effect: What You Need To Know

A new Federal Trade Commission rule prohibiting the sale or purchase of fake reviews has taken effect, in a move to protect the integrity of online reviews and testimonials. The rule, effective as of Monday, enables the FTC to seek civil penalties against those who knowingly violate it.

What Happened: The FTC's new rule, announced in August, aims to curb the proliferation of fake reviews and testimonials that mislead consumers and unfairly disadvantage honest competitors.

The rule specifically targets reviews attributed to non-existent individuals, reviews generated by artificial intelligence, and reviews that misrepresent the reviewer's experience.

Businesses are likewise barred from creating or selling fake reviews. Violators, including businesses that knowingly purchase fake reviews or use intimidation tactics to suppress negative ones, will face penalties.

"As of today, @FTC's final rule banning fake online reviews and testimonials has come into effect," wrote FTC Chair Lina Khan in a post on X.

Why It Matters: The new rule is part of a series of initiatives led by Khan to promote fair and transparent business practices. In September, Khan raised concerns about airlines potentially misusing AI to charge vulnerable travelers higher prices.

In October, the FTC also finalized a rule called "click-to-cancel," aimed at simplifying subscription cancellations.

The FTC, under Khan's leadership, is also facing pressure to block Novo Nordisk A/S's (NYSE: NVO) proposed $16.5 billion acquisition of Catalent Inc over antitrust concerns.