
(Pocket-lint) - Tesla's Autopilot mode is under investigation by US regulators following a fatal crash in the States on 7 May.

A Tesla Model S driver hit the trailer of an 18-wheeler truck in Florida, which resulted in his death. It is claimed that he might not have been in full control of his vehicle, relying on the beta version of the Autopilot software instead.

Tesla states that the Autopilot beta is disabled in all supported cars by default, and it requires acknowledgement by drivers that it is not only early software but "an assist feature that requires you to keep your hands on the steering wheel at all times". It also states the software tells you to "maintain control and responsibility for your vehicle" while using it.

The electric car manufacturer also revealed that this fatality is the first in over 130 million miles of Autopilot use in its vehicles. However, the investigation is looking at the specifics of the crash, and Tesla itself has confirmed that the system was confused by the circumstances.

Autopilot did not detect the white side of the tractor trailer against a brightly lit sky, and Tesla says the driver appears not to have seen it either. As a result, neither the system nor the driver applied the brakes.

In addition, because of the height of the trailer and the position of the truck on the road, the Model S passed underneath it and the trailer impacted the windshield. Had the car impacted the front or rear of the trailer, said Tesla, the advanced crash safety system would have prevented serious injury.

Nonetheless, the executive director of the Center for Auto Safety in the US, Clarence Ditlow, believes there should be a recall of the Tesla vehicles with Autopilot: "When you put autopilot in a vehicle, you're telling people to trust the system even if there is lawyerly warning to keep your hands on the wheel," he said.

Writing by Rik Henderson.