Autonomous driving has a problem.

Sit back, check your emails, watch TV shows or just sleep – all while traveling at full speed. Once autonomous driving takes hold, this is what the brave new world of the car is supposed to look like. Self-driving cars promise not only comfort and convenience but also fewer accidents, because they act predictably, react faster than human drivers and do not get distracted. And with shorter distances between cars that also communicate with each other, our often overcrowded roads could be used far more efficiently.

Great prospects, heralded by the auto industry for many years. But the truth is: we are still a long way from comprehensive autonomous driving. Even piloted driving as an intermediate stage has barely hit the road. Audi launched its current A8 in 2018 with the hardware ready for Level 3 automated driving. But initially the luxury sedan was not legally allowed to use it – and Audi eventually shelved the project itself because the legal footing never materialized. Only in 2022 was Mercedes permitted to integrate a comparable system, Drive Pilot, into its flagship S-Class – and only up to 60 km/h and under narrowly defined conditions.

Who is responsible?

At Level 3, the vehicle drives itself, and if an accident occurs, the vehicle manufacturer is liable. The driver does not have to follow the traffic constantly while driving. Ultimately, the Mercedes system is a fully automatic traffic-jam pilot that takes over the steering in stop-and-go traffic and at manageable speeds, allowing the driver for the first time to take their hands off the wheel until the system prompts them to take over again.

For the Chinese automaker Nio, this kind of autonomous driving is just an intermediate step. It wants to go straight to 120 km/h or 130 km/h. Mercedes is also working on a speed boost for the Drive Pilot. However, "automated driving at Level 3 up to a maximum of 120 kilometers per hour is a much greater challenge than at 60 km/h," says Jan Becker, CEO of software developer Apex.AI, who has spent 24 years working on driver assistance systems and software for autonomous driving.

High speed makes it harder

A simple example everyone knows from driving school shows just how big the difference is: double your speed and you quadruple the braking distance. The sensors of an autonomous car must have this on their radar, so to speak, and maintain a correspondingly greater distance as the speed increases. "For that, at Level 3, you need a combination of cameras, radar and lidar sensors," Becker explains.
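The driving-school rule of thumb follows from the physics: braking distance grows with the square of speed, d = v² / (2a). A minimal sketch (the deceleration value of 8 m/s² is an assumed typical figure for dry asphalt, and reaction time is ignored):

```python
def braking_distance_m(speed_kmh: float, decel_mps2: float = 8.0) -> float:
    """Idealized braking distance d = v^2 / (2a), without reaction time."""
    v = speed_kmh / 3.6  # convert km/h to m/s
    return v * v / (2 * decel_mps2)

# Doubling the speed quadruples the braking distance:
print(round(braking_distance_m(60), 1))   # ≈ 17.4 m
print(round(braking_distance_m(120), 1))  # ≈ 69.4 m
```

The ratio between the two results is exactly four, regardless of the assumed deceleration – which is why higher Level 3 speeds demand so much more from the sensors.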

Lidar sensors scan the environment with laser beams and provide much sharper object recognition, because the resolution of radar alone is not enough for a precise image. According to Becker, the combination of radar and video also makes too many mistakes in sensing the environment: "We need redundancy in the sensor setup." Even Tesla has had to acknowledge how necessary complementary, mutually safeguarding sensors are: instead of relying solely on cameras, the electric pioneer is now installing radar in its cars again.

Which sensor is correct?

Radar sensors emit microwaves that are reflected primarily by metal – i.e. cars, manhole covers, highway bridges or roadside barriers. To drive autonomously, the system must distinguish: manhole cover or obstacle? Up to 60 km/h this is still manageable, because the sensors can keep the braking distance under control. But if a car is traveling at 120 km/h on the highway and another car is parked under a gantry or next to a traffic sign, detection becomes much more difficult. Detect the object, classify it, and then react – at this speed, that is a challenge even for robotic drivers.

What if one sensor reports an obstacle while another does not? "You need computational models that show what a sensor can and cannot do. From this you can conclude whether the message from the sensor is plausible," Becker explains. For example, a lidar sensor is often mounted quite low and cannot detect whether vehicles further ahead are braking, because the car directly in front obscures its field of view. A camera, on the other hand, can detect flashing brake lights. The algorithm therefore has to evaluate these conditions and factor them into the decision. Only when the sensors cooperate can the algorithm make decisions about controlling the car.
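The plausibility check Becker describes can be illustrated with a toy sketch. This is not Apex.AI's actual algorithm – all names, the two-sensor agreement rule and the `view_blocked` flag are illustrative assumptions – but it shows the idea of discounting a sensor whose known limitations apply:

```python
from dataclasses import dataclass

@dataclass
class SensorReport:
    sensor: str            # "camera", "lidar" or "radar"
    sees_obstacle: bool
    view_blocked: bool = False  # e.g. a low-mounted lidar behind a lead vehicle

def obstacle_plausible(reports: list[SensorReport]) -> bool:
    """Treat an obstacle as plausible only if at least two sensors
    whose view is not blocked agree (redundancy in the sensor setup)."""
    valid = [r for r in reports if not r.view_blocked]
    hits = sum(r.sees_obstacle for r in valid)
    return hits >= 2

# Lidar is blocked by the car ahead, but camera sees brake lights
# and radar confirms a slowing object:
reports = [
    SensorReport("lidar", sees_obstacle=False, view_blocked=True),
    SensorReport("camera", sees_obstacle=True),
    SensorReport("radar", sees_obstacle=True),
]
print(obstacle_plausible(reports))  # True
```

The point of the model is that the lidar's "no obstacle" report is not treated as a contradiction, because the system knows its field of view is obscured in this situation.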

Highest levels only at low speeds

What about Levels 4 and 5? The latter is the highest – and still utopian – level of autonomous driving. In the US, Level 4 has at least been reached, but even there higher speeds remain an issue. Cruise, a subsidiary of US automaker General Motors, has received approval for autonomous Level 4 taxis in large parts of San Francisco, California – without a human driver as a fallback. Initially only within the city, however. Cities such as Phoenix and Houston are to follow.

Rival Waymo, part of Google parent company Alphabet, has been working on autonomous driving since 2009, putting it far ahead of Tesla, and is also allowed to operate its own autonomous fleet in San Francisco, Phoenix and, more recently, Los Angeles.

Wolfgang Gomoll
Source: Blick
