Tesla is one of the pioneers of artificial-intelligence-guided (AI-guided) autonomous driving. The company describes the Autopilot system underlying its Full Self-Driving Beta 10.8 (FSD-Beta) on its website:

“Autopilot is an advanced driver assistance system that enhances safety and convenience behind the wheel. When used properly, Autopilot reduces your overall workload as a driver. Each new Tesla vehicle is equipped with 8 external cameras, 12 ultrasonic sensors and a powerful onboard computer (sic) provide an additional layer of safety to guide you on your journey. Model 3 and Model Y built for the North American market have transitioned to camera-based Tesla Vision, which are not equipped with radar and instead rely on Tesla’s advanced suite of cameras and neural net processing to deliver Autopilot and related features. Model S and Model X continue to be equipped with radar.”[1]

That short paragraph on their website communicates a lot but leaves out much of the engineering that has gone into FSD-Beta. The system uses eight cameras and twelve ultrasonic sensors, but the key to it all working is the “powerful onboard computer.” Another way of describing the computer is that it is an AI neural net that seeks to function much like a human brain. The neural net takes information from the cameras and microphones and predicts what the various images and sounds represent. In an interview with Lex Fridman on December 28, 2021,[2] Elon Musk speaks of how the cameras are designed to feed raw photon data to the neural net. There is no need to “pretty up” the images; doing so only wastes time and increases latency and jitter in the system, two things that are the bane of autopilot systems. By going back to first principles of engineering, Tesla is seeking to create an autopilot that replicates and exceeds the thinking capabilities of a human driver.
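To make that idea concrete, here is a minimal, hypothetical sketch in Python using the PyTorch library. The network shape, class names, and 12-bit sensor range are my own illustrative assumptions, not Tesla’s design; the point is simply that raw sensor counts flow straight into a neural net with no demosaicing or tone mapping in between, which is the latency-saving idea Musk describes.

```python
# A toy sketch (not Tesla's actual architecture) of feeding raw sensor
# counts straight to a neural network, skipping the image signal processor
# that would normally "pretty up" the frame for human eyes.
import torch
import torch.nn as nn

class RawPhotonNet(nn.Module):
    """Toy perception net that consumes raw 12-bit sensor counts directly."""
    def __init__(self, num_classes: int = 4):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, stride=2),  # raw mosaic, one channel
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
        )
        # Hypothetical output classes, e.g. {car, pedestrian, cyclist, clear}.
        self.head = nn.Linear(32, num_classes)

    def forward(self, raw_counts: torch.Tensor) -> torch.Tensor:
        # Normalise raw 12-bit photon counts; no demosaicing, tone mapping,
        # or sharpening -- the network learns from the sensor data as-is.
        x = raw_counts.float() / 4095.0
        return self.head(self.backbone(x))

# One simulated frame of raw sensor data (batch, channel, height, width).
frame = torch.randint(0, 4096, (1, 1, 120, 160))
logits = RawPhotonNet()(frame)
print(logits.shape)  # torch.Size([1, 4])
```

Every step that is removed between the sensor and the network is a step that no longer adds delay, which is why skipping the “prettying up” matters for a system that must react in real time.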

The next time you are out for a drive, think about all the things your own neural net (your brain) is doing as you navigate the roadways. Your brain is analysing data from your eyes: looking through the windshield as it subtracts the dirt and pock-marks on the glass; filling in what might be behind an occlusion you cannot see through (the bus stopped at the side of the road may be hiding pedestrians who could walk around it and appear in front of your car); noting the silhouette of a person in a parked car, which signals that a door might open or the car might pull out into traffic; glancing at the dash to see which radio station is tuned in and at what volume; watching the signs for your next turn as you think about how to get from point A to point B; tracking that cyclist in your right peripheral view while noticing the other car overtaking you in your left; and processing many more pieces of data carried by the photons flowing into the retina.
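One of those tasks, reasoning about what an occlusion might hide, can be sketched in a few lines of Python. This is purely my own toy illustration (the probabilities and speeds are invented, and no real planner is this simple), but it captures the logic a driver applies instinctively: the higher the estimated chance that the stopped bus hides a pedestrian, the more the car should slow down.

```python
# A toy, hypothetical illustration of occlusion reasoning. The numbers are
# invented for illustration; a real planner would learn these from data.
def safe_speed_near_occlusion(base_speed_kmh: float,
                              p_hidden_pedestrian: float,
                              min_speed_kmh: float = 15.0) -> float:
    """Scale speed down as the estimated chance of a hidden pedestrian rises."""
    factor = 1.0 - p_hidden_pedestrian
    return max(min_speed_kmh, base_speed_kmh * factor)

# A stopped bus raises the chance that a pedestrian is hidden from view.
print(safe_speed_near_occlusion(50.0, p_hidden_pedestrian=0.0))  # open road: 50.0
print(safe_speed_near_occlusion(50.0, p_hidden_pedestrian=0.6))  # stopped bus: 20.0
```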

Now, while your neural net is analysing the data coming in from your eyes, it is also dealing with the information provided by your ears: simultaneously listening to the radio, traffic noise, horns honking, emergency sirens, the rattles and bumps of the car, that airplane flying overhead, the sound of children squealing in a nearby playground, and several other sounds. Some of these are rightly ignored while others are an important part of the driving process. And those are only two of the five senses that might rightly or deceptively be involved in our trip to the local store. “Do I need to pay attention to that skunk smell? Is a small mammal likely to run out in front of my car?” What about memories from other trips? Do I need to pay attention to those? I remember travelling at dusk through the mountains, when deer gathered on the side of snowy roads to lick the salt sprinkled there by snowplows. “Do I need to take that into account as I drive here today?” What neural processes in my brain contribute to lane identification on a snowy day in Calgary?

One can readily see that Tesla has their work cut out for them. As they build this massive neural net, with its many lines of code, to serve the FSD system, what are the implications of such an AI brain? Could this brain be used in humanoid robots? What about using such systems to predict traffic flows or, on a broader scale, climate change? What might be possible?

Tesla’s approach has been both praised and criticised: praised for thinking in ways that disrupt the normal engineering processes, and criticised for relying on cameras alone rather than pairing them with radar. Other institutions, such as the Massachusetts Institute of Technology (MIT), have followed Tesla’s progress and run their own independent tests of the technology to see how well it performs. The world will be watching to see whether Tesla can indeed make autonomous driving safer than human-mediated driving in the coming years.


[1] Tesla Support, https://www.tesla.com/support/autopilot, accessed 2022-01-01.

[2] “Elon Musk: SpaceX, Mars, Tesla Autopilot, Self-Driving, Robotics, and AI,” Lex Fridman Podcast #252, https://www.youtube.com/watch?v=DxREm3s1scA, accessed 2022-01-01.
