Tesla cars are super-advanced, but the method the company uses to verify that drivers are paying attention in Autopilot mode is super-primitive, relying on the driver occasionally applying counter-torque to the steering wheel.

Teslas have always come with driver-facing internal cameras, and critics have been demanding Tesla use these to monitor driver attention for ages, but the company has always resisted.

Now reliable Tesla hacker Green has discovered that the company is finally relenting and is developing a driver monitoring system based on the internal camera.

In examining Tesla’s car software, he found a neural net that tries to classify scenes seen by the internal camera into the following categories:

  • BLINDED
  • DARK
  • EYES_CLOSED
  • EYES_DOWN
  • EYES_NOMINAL
  • EYES_UP
  • HEAD_DOWN
  • HEAD_TRUNC
  • LOOKING_LEFT
  • LOOKING_RIGHT
  • PHONE_USE
  • SUNGLASSES_EYES_LIKELY_NOMINAL
  • SUNGLASSES_LIKELY_EYES_DOWN
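A label set like this suggests a standard multi-class classifier head, where the network scores each category and the highest-scoring one wins. As a rough illustration only, here is a minimal Python sketch of how such per-frame labels might be consumed downstream; the label names come from Green's findings, while the scoring scheme, the split into "inattentive" states, and all function names are assumptions for the sake of the example, not anything extracted from Tesla's software.

```python
# Label names as reported from the firmware; everything else below
# (argmax scoring, the INATTENTIVE grouping) is hypothetical.
DRIVER_STATE_LABELS = [
    "BLINDED", "DARK", "EYES_CLOSED", "EYES_DOWN", "EYES_NOMINAL",
    "EYES_UP", "HEAD_DOWN", "HEAD_TRUNC", "LOOKING_LEFT",
    "LOOKING_RIGHT", "PHONE_USE", "SUNGLASSES_EYES_LIKELY_NOMINAL",
    "SUNGLASSES_LIKELY_EYES_DOWN",
]

# Hypothetical grouping: states that would plausibly count as
# "not watching the road" for a driver monitoring system.
INATTENTIVE = {
    "EYES_CLOSED", "EYES_DOWN", "HEAD_DOWN", "PHONE_USE",
    "SUNGLASSES_LIKELY_EYES_DOWN",
}

def classify(scores):
    """Return the label with the highest per-class score (argmax)."""
    best = max(range(len(DRIVER_STATE_LABELS)), key=lambda i: scores[i])
    return DRIVER_STATE_LABELS[best]

def is_attention_warning(label):
    """True if the classified state should trigger a driver alert."""
    return label in INATTENTIVE
```

For example, a score vector whose maximum falls on the `PHONE_USE` slot would classify as phone use and flag the driver as inattentive. A real system would of course smooth these per-frame decisions over time rather than alerting on a single frame.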

Tesla only started using the internal camera for data gathering a year ago, telling drivers it was collecting anonymised footage from before accidents to improve the safety of its vehicles.

It is not known if or when this feature will reach production. We are reminded, however, that Tesla owners are still unhappy with the AI-controlled windscreen wipers, and may not take too well to a new AI-controlled driver monitoring system.

via Electrek.
