The National Highway Traffic Safety Administration (NHTSA), the U.S. auto safety regulator, said on Tuesday that it is opening an investigation into about 2.6 million Tesla Inc. (TSLA) vehicles equipped with the company's Full Self-Driving driver assistance technology, following a complaint alleging that use of the "Actually Smart Summon" feature led to a crash.

What Happened: The regulator said that it received a complaint about the Actually Smart Summon feature leading to a crash and that it subsequently reviewed at least three media reports of similar crashes. In all four incidents, the Tesla vehicle failed to detect posts or parked vehicles, it said.

Tesla, however, has not reported any crashes involving the feature, even though rules mandate the reporting of crashes involving automated driving systems on publicly accessible roads, the regulator said.

Actually Smart Summon is a feature that allows a user, through the company's phone app, to remotely move the vehicle toward themselves or to another nearby location, such as a parking lot.

"ODI is aware of multiple crash allegations, involving both Smart Summon and Actually Smart Summon, where the user had too little reaction time to avoid a crash, either with the available line of sight or releasing the phone app button, which stops the vehicle's movement," the regulator said.

The probe, NHTSA said, will examine Actually Smart Summon's capabilities, including the top speed a vehicle can attain while the feature is engaged. The agency will also evaluate the restrictions on using the feature on public roads and its line-of-sight requirements, it said.

Why It Matters: Tesla started rolling out Actually Smart Summon to vehicles in the U.S. in early September. The company also launched the feature on its vehicles in China in December.

The newly announced investigation is the regulator's second probe into Tesla in less than six months.

In October, NHTSA opened an investigation into 2.4 million Tesla vehicles following reports of four crashes in which the company's Full Self-Driving (FSD) partial driving automation system was engaged.

The regulator said at the time that it had identified four reports in which a Tesla crashed after entering an area of "reduced roadway visibility" with FSD engaged. The reduced visibility, it said, was caused by sun glare, fog, or airborne dust.

In one of the crashes, the vehicle struck and killed a pedestrian, the regulator added.