This past summer, I was in the Midwest teaching a class on emerging technologies, including advanced driver assistance systems (ADAS), when a technician announced to everyone that he had recently completed a successful recalibration of a LiDAR sensor on a Honda.
A few minutes later, he added that his shop did not have the room inside the bay area for the reset, so he completed the task in the repair facility's parking lot.
Knowing this was a perfect "educational moment," I asked: "Was the stand level?"
The tech acknowledged that he knew the stand had to be level, per his roughly four hours of equipment instruction.
"And the vehicle was aligned before the recalibration?" I asked. "The parking lot and vehicle were level, as well?"
Crickets.
The technician acknowledged the alignment but qualified the second question by responding that the car must have been "level" because no trouble light was illuminated on the instrument panel to indicate an issue. And the vehicle drove straight.
It's not his fault. He hadn't been properly trained to know the consequences of not following ADAS procedures to the letter. Had he known the background of the road to autonomy, I am sure he would have followed the setup directions exactly.
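To make the stakes of that exchange concrete, here is a minimal sketch, in Python, of the preconditions a static recalibration depends on. The function name and tolerance values are hypothetical, not any OEM's published specification; the point is that a dark instrument panel says nothing about any of these checks:

```python
# A minimal sketch of static-recalibration preconditions. The tolerance
# value and names here are hypothetical, not an OEM's published spec.

def ready_for_static_recalibration(stand_level_deg, floor_level_deg,
                                   alignment_done, tolerance_deg=0.1):
    """Return a list of setup problems; an empty list means proceed."""
    problems = []
    if not alignment_done:
        problems.append("perform a four-wheel alignment first")
    if abs(stand_level_deg) > tolerance_deg:
        problems.append("target stand is not level")
    if abs(floor_level_deg) > tolerance_deg:
        problems.append("surface under the vehicle is not level")
    return problems

# The parking-lot scenario: aligned, stand leveled, but a sloped lot.
print(ready_for_static_recalibration(0.05, 1.5, True))
# ['surface under the vehicle is not level']
```

In that parking-lot scenario, the stand and the alignment pass, but the sloped surface alone should have stopped the job.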
ADAS isn't exclusive to land-based transportation. The science of autonomy covers all modes of travel: air, land and sea.
But we are getting ahead of ourselves … back to land-based ADAS. So, where is our industry regarding cars and trucks moving along roads under computer control? Well, that depends upon where your "office" is located.
From the sensor-application data scientist to the local shop technician performing a calibration or recalibration, and everyone in between, each is on a different page, performing a different task. And it appears most believe the initial factory calibration of the sensor-fusion suite is once-and-done, and that everything is going to function because the other support systems are infallible.
In the ADAS world, groups are on different chapters, unaware which groups are on which page, but everyone can agree how this book is going to end: autonomy.
You've got to know your history to have a successful future.
In 1969, Walt Disney studios produced the movie "The Love Bug," and the star of the show was a car, a Volkswagen Beetle. The co-stars were Dean Jones and Buddy Hackett. There is a moment, as the vehicle comes to life, when Hackett (paraphrasing) tells Jones: "We stuff these machines with so much information that they begin to think for themselves, … that they are 'somebody.'"
When the movie premiered, the public didn't realize that Hackett wasn't too far off about automotive autonomy. Decades before, several auto manufacturers; well-known electronics companies (e.g., RCA, General Electric); and engineering colleges across the nation already were navigating driverless vehicle prototypes.
These experiments mostly were confined to the proving grounds, but a few were tested on surface streets. The first autonomous vehicles hit the road (U.S. Route 73 and U.S. Route 75) in Nebraska in late 1954.
General Motors Co. (GM) had the cars outfitted with audio and visual sensors, radio receivers, and steering and brake control devices to navigate the transportation experiment on public highways back then. GM pushed forward with its autonomy studies and featured its "Firebird" concept cars.
These experimental builds were promoted as capable of semi-autonomous highway driving "while the driver relaxes and can read the newspaper."
The prototypes were built in the late 1950s and featured in the 1960s. And, yes, they were using RADAR, as well.
The Warren, Mich.-based tech center wasn't the only one on the trek toward driverless. Ford Motor Co. was in the mix and already using RADAR in its FX-Atmos build; GM added brake assist for its 1959 Cadillac Cyclone project.
Just about everyone was on board the autonomy wagon. Bigger prototypes were built with more modules controlling driver functions, and this continued into the 1970s. In the early 1980s, Mercedes-Benz introduced a "city-driving" autonomous van. The streets of Munich never would be the same. Nor would any other highway across the world.
Before we can have anything from autonomous cars to Class 8 trucks and beyond, we need sufficient roadways. Today, if you listen to any highway engineer discuss the autonomous car or truck, they will say it is going to take over roadways within the next 10 years. No more street signs, no more stop signs. The vehicle will "know when to cross" the intersection while keeping its occupants safe.
And what about those ancient underground cables that were developed almost 80 years ago? I can hear you: "We don't need them for today's autonomous vehicles, right? We are more advanced than that."
Maybe not.
For example, an Israeli company has developed an in-ground, below-pavement charging system that is in active testing on the west side of Detroit and in East Lansing. These pods act like the smartphone wireless charging pads we have in cars and on countertops. The buried units use the same concept, providing juice to the high-voltage battery of HEVs, PHEVs and BEVs as they drive over them.
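To put rough numbers on the concept, here is a back-of-the-envelope sketch in Python. The pad power, pad length and vehicle speed are illustrative assumptions, not the specifications of the Detroit or East Lansing pilots:

```python
# A rough sketch of in-road inductive charging. All figures below are
# illustrative assumptions, not the specs of any real pilot program.

def energy_per_pad_wh(pad_power_kw, pad_length_m, speed_ms):
    """Energy picked up while the receiver is over a single pad."""
    dwell_s = pad_length_m / speed_ms            # time spent over the pad
    return pad_power_kw * 1000 * dwell_s / 3600  # watt-seconds -> Wh

# Example: a 25 kW pad, 1 m long, crossed at 50 km/h (~13.9 m/s)
print(round(energy_per_pad_wh(25, 1.0, 13.9), 2), "Wh")  # ~0.5 Wh per pad
```

At road speeds, the dwell time over any single pad is a fraction of a second, which is why such systems are laid out as long runs of buried pads rather than a single charger.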
Even though nationally the Department of Transportation (DOT) realizes that a fully automated traffic stream may just be a pipe dream for now, the agency is encouraging aftermarket companies to retrofit older vehicles so they are compatible with the traffic-flow structure. And there are plenty of private entities gathering machine-learning data voluntarily.
Universal solutions exist, ranging from ThinkView's camera bundle to the Tampa Hillsborough Expressway Authority's Connected Vehicle Pilot (THEA-CV) program.
With this program, local municipalities partnered with a private innovations company (Honda, Hyundai and Toyota came along for the ride) to gather machine-learning data and technical collaboration to be used in future ADAS standards.
The road to autonomy is going to take a lot of machine-learning before we get to Level 5, full driving automation, where the artificial intelligence handles it all.
Any automotive data scientist will say the road to a fully autonomous vehicle is paved with quadrillions of bits of machine-learning data, including in-lab learning and testing-ground/on-road verifications. Many vehicles are uploading consumer driving information to OEMs, so the technology can get on the road sooner.
It's not as simple as telling a vehicle when to stop and not stop. Many scenarios need to be addressed, tested, confirmed and reconfirmed before moving forward to the next level of autonomy.
For example, consider leaves falling off a maple tree that land on the windshield or bumper, or fall in the road ahead of the vehicle. If the coded directions say, "Stop for every object/obstacle in front of the vehicle," the vehicle would never move, given how many objects (bugs, rain, snowflakes) hit a vehicle daily.
And if an object is not programmed into the system, the vehicle either stays stationary or commits to automatic emergency braking (AEB) while navigating a roadway at highway speeds. Programming is the primary key to making this magic happen.
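To make the leaf example concrete, here is a minimal sketch of why a blanket stop rule fails and why classification matters. All labels, thresholds and function names are hypothetical, not any OEM's actual planner logic:

```python
# A minimal sketch (hypothetical names and thresholds) of why object
# classification matters more than a blanket "stop for everything" rule.

from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # classifier output, e.g. "leaf", "pedestrian"
    confidence: float  # 0.0 to 1.0
    distance_m: float  # range to the object, in meters

# Objects the planner may safely drive through or over.
IGNORABLE = {"leaf", "rain", "snowflake", "insect", "plastic_bag"}

def naive_policy(detections):
    """Blanket rule: brake for anything ahead. The vehicle never moves."""
    return any(d.distance_m < 50 for d in detections)

def classified_policy(detections, min_confidence=0.6):
    """Brake only for objects that are both credible and hazardous."""
    for d in detections:
        if d.label in IGNORABLE:
            continue                       # drive through harmless debris
        if d.confidence >= min_confidence and d.distance_m < 50:
            return True                    # genuine hazard: request AEB
    return False

scene = [Detection("leaf", 0.9, 3.0), Detection("pedestrian", 0.8, 30.0)]
print(naive_policy(scene))       # True, but it would also brake for leaves alone
print(classified_policy(scene))  # True, and only because of the pedestrian
```

The naive rule brakes for the leaf; the classified rule brakes only for the pedestrian. Multiply that distinction across every bug, raindrop and snowflake, and the scale of the programming problem becomes clear.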
Critical-thinking research and testing groups and calibration companies are working with manufacturers to make automotive autonomy a success. Data scientists and engineers are collaborating to get the best-case sensor scenario, but sometimes their vision can get a little muddied.
Vehicle builds and guidelines often get in the way of the sensor scientist's drawing board. To be successful, sensors need to meet vehicle standards and work flawlessly.
When it comes to interpretation, actual application can be somewhat cloudy. That's where the Association for Standardization of Automation and Measuring Systems (ASAM) comes into play with standards for:
- Measurement and calibration
- Diagnostics
- ECU networks
- Software development
- Test automation
- Data management
- Simulation
There's a place for simulation, but the group knows nothing is better than physical testing to confirm success.
Therefore, collaboration among data scientists, calibration experts, manufacturers' engineering teams and aftermarket development companies (Tier 1 and Tier 2) is imperative for success.
Quadrillions of data points need to be collected, analyzed, tested, prototyped and retested before we can trust the data-driven systems in your grandmother's sedan.
Another group that deserves mention is SAE International, formerly the Society of Automotive Engineers. From the composite sensor build to operation on the roadway, the group oversees just about every aspect of mobility (everything that moves: air, land and sea) on the road to the holy grail of autonomous builds. It concentrates on automotive safety applications of the programming and sensor designs coming from data scientists and engineers.
In most cases, ADAS is a very opaque concept in auto dealerships and for aftermarket technicians. It is gaining traction, but there are not enough technicians in the bay who have been given the opportunity to learn the systems through proper training or white-paper exposure. There are more modules on today's vehicles than on the space shuttle.
On the dealership level, the technician who becomes the ADAS calibration/recalibration designate is getting OE training on the product line. Others in the shop get their training second-hand from the tech who had a day or two of OE school.
And to show technician proficiency? Let's not forget, ASE has a year-old, advanced-level exam just for ADAS: L4. Fewer than 850 automotive technicians (out of more than 800,000) have the shingle. Break that down across the U.S., and it means there are only about 17 ASE L4-certified techs per state to properly service approximately 150,000 ADAS-equipped cars and light trucks.
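Run the numbers yourself; this quick sketch uses only the figures above:

```python
# Quick math on ASE L4 coverage, using only the figures cited above.
certified_techs = 850     # fewer than 850 L4 certificate holders nationwide
total_techs = 800_000     # more than 800,000 technicians overall
states = 50

print(certified_techs / states)              # 17.0 certified techs per state
print(f"{certified_techs / total_techs:.3%}")  # ~0.106% of the workforce

# With roughly 150,000 ADAS-equipped vehicles per state, each L4 tech
# would be responsible for:
print(round(150_000 / (certified_techs / states)))  # ~8,824 vehicles per tech
```

Nearly 9,000 vehicles per certified technician, and barely a tenth of one percent of the workforce holding the credential: that is the gap the industry is staring at.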
The rest of the 799,000-plus technicians are just beginning to learn about ADAS. Sure, several webinars are out there from equipment manufacturers that show the glamor of the tooling. And aftermarket shops and mobile technicians are buying into it at $35,000-plus for a full set-up. Maybe the purchaser will get lucky and receive a four- or eight-hour, hands-on set-up class. Others get webinars and an 800-number to call for guidance.
For those shops/technicians who do not have ADAS equipment, driver-assist training is considered secondary to drivability. Thousands of technicians are unaware they need to check the OE/shop management system for probable sensor recalibration after an alignment-rack procedure.
Some ignore the procedure completely because the facility where they work does not have ADAS equipment to calibrate/recalibrate; they are not trained to know the importance of the procedure.
Techs who know about ADAS via one- to three-hour webinars know the "why" just enough to perform a dynamic calibration, but they ignore the static calibration because no equipment is available.
Other technicians who know the importance of ADAS still are in the "set-the-toe and let-it-go" mindset, because the vehicle's suspension parts do not have a slot or incline to adjust camber and caster settings. Some believe ADAS compensates for those angles being out of range and into the red.
The reason for this? Lack of proper training. Like any other repair procedure, proper education is the base for all things successful in the bay. The technician needs to perform tasks in a particular manner.
Printed step-by-step procedures make overall sense for successful calibrations/recalibrations.
When all is said and done, no matter the medium, the road from ADAS to autonomy will always be there. And the industry needs to get everyone on the same page in this book on autonomy to make the vision an all-around success.