“Autonomous vehicles and autonomous driving [brought with them a huge amount] of hype. Everyone thought that by 2020 or 2021 we would see significant numbers of autonomous vehicles, autonomous services, and autonomous robots. That didn’t happen. I think there is consensus that the reason for this is the lack of mature sensor technology.”
At first glance, this is a strange thing for Ram Machness to say. Machness is Vice President of Product at a company called Arbe Robotics, and Arbe, after all, makes exactly the kind of sensors autonomous vehicles rely on. It’s a bit like Tim Cook, Apple’s CEO, saying that the smartphone market declined over the past year because nobody makes good smartphones.
Machness has a point, however, and he hopes the company’s new technology, unveiled at this year’s virtual CES, will help make the change. With its new sensor technology, he argues, it won’t be long before more autonomous driving technology hits the road. Really, this time.
Machness points out that many of the people building self-driving cars worked under a misconception: that the algorithms they built for autonomous driving would have access to complete information about the world those cars drove through. That didn’t happen. Instead of having perfect information about the world they were moving in, they were hampered by perception problems that needed to be solved before they could develop algorithms to support autonomous technology for various applications.
It’s like trying to teach someone how to do their job in an office that has just had a power outage. One problem must be solved before the other can be attempted. And until now, that wasn’t possible.
Could next-generation radar help change that?
Radar is making a comeback
Radar hasn’t been taken too seriously as a way of helping autonomous vehicles perceive the world, except as a means of sensing the speed of objects already identified by other sensors. Most of the conversation has involved either computer vision with standard cameras or lidar, which uses reflected lasers to measure distances. Both approaches have their positives and negatives.
Radar, which bounces electromagnetic waves off objects, has been around much longer than lidar, but it comes with some pretty big challenges of its own.
As an example, Machness shows an image of a black screen with a handful of bright orange dots scattered across it. It looks like someone splashed a small amount of colored paint on a dark wall, or maybe city lights reflected in water at night. It is practically impossible to tell what you are looking at. This is traditional radar, a technology many cars are equipped with today for things like parking sensors, but one that practically no one takes seriously for imaging. What we are “seeing” is a street scene with other cars and a number of additional obstacles.
Machness jumps to another video, and now we’re watching a psychedelic dashcam view of a car winding its way through tree-lined streets. Aside from the fact that it looks like it was captured with Predator-style thermal imaging, it’s perfectly readable – by humans, let alone machines.
The big upgrade is the number of transmit and receive channels on the radar. Machness compares this to the number of pixels in a camera image. “If I count the number of channels in radars today, they have 12 channels,” he said. “The more advanced have 48 channels. We see some competitors working toward 192 channels. [We’ve developed radar with] 2,000 channels. This is the breakthrough. We can process them all at the same time.”
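As a rough illustration of Machness’s pixel analogy (a minimal sketch using assumed antenna counts, not Arbe’s published design): in a MIMO imaging radar, the effective channel count is roughly the product of transmit and receive antennas, so modest increases in physical antennas multiply into much larger virtual arrays – and finer angular resolution.

```python
# Sketch of why channel count matters in MIMO imaging radar: the effective
# "virtual" channel count is approximately Tx antennas x Rx antennas, the
# radar equivalent of adding pixels to a camera sensor.

def virtual_channels(num_tx: int, num_rx: int) -> int:
    """Effective virtual array size for a MIMO radar: Tx x Rx."""
    return num_tx * num_rx

# Illustrative (assumed) antenna splits, not vendor specifications.
configs = {
    "basic automotive radar (~12 channels)": (3, 4),
    "advanced radar (~48 channels)": (4, 12),
    "competitor target (~192 channels)": (12, 16),
    "imaging radar (2,000+ channels)": (48, 48),
}

for name, (tx, rx) in configs.items():
    print(f"{name}: {tx} Tx x {rx} Rx -> {virtual_channels(tx, rx)} virtual channels")
```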
As announced at CES on January 11, Arbe’s new radar technology promises 4D imaging radar for autonomous vehicles, with the ability to separate, identify, and track objects in high resolution thanks to a next-generation radar that is 100 times more detailed than any other radar on the market. This “ultra-high-resolution 2K radar technology” promises to be road-ready by the beginning of 2022, one year from now.
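For context on what “4D” means here, the sketch below models a generic 4D radar detection – range, azimuth, elevation, and Doppler velocity per point – rather than Arbe’s actual data format, which isn’t described in the announcement.

```python
# A minimal sketch of a generic 4D imaging radar detection (illustrative, not
# any vendor's API): each point carries range, azimuth, elevation, and radial
# (Doppler) velocity, which is what lets the system separate and track objects.
from dataclasses import dataclass
import math

@dataclass
class RadarDetection:
    range_m: float        # distance to the reflector, meters
    azimuth_deg: float    # horizontal angle, degrees
    elevation_deg: float  # vertical angle, degrees
    velocity_mps: float   # radial (Doppler) velocity, meters/second

    def to_cartesian(self) -> tuple[float, float, float]:
        """Convert the detection to x, y, z coordinates in the radar's frame."""
        az = math.radians(self.azimuth_deg)
        el = math.radians(self.elevation_deg)
        x = self.range_m * math.cos(el) * math.cos(az)  # forward
        y = self.range_m * math.cos(el) * math.sin(az)  # left/right
        z = self.range_m * math.sin(el)                 # up/down
        return x, y, z

# Illustrative detection: a car 40 m ahead, slightly to the right, closing at 5 m/s.
det = RadarDetection(range_m=40.0, azimuth_deg=5.0, elevation_deg=0.5, velocity_mps=-5.0)
print(det.to_cartesian())
```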
The company is working with a number of large, as-yet-unannounced partners to integrate this technology into future vehicle platforms. “The problem Arbe is trying to solve is to provide [autonomous vehicles] an imaging radar with near-zero false positives and very high resolution,” said Machness.
One of radar’s great advantages is that it can be used in poor weather conditions. “Things that cameras and lidar are very sensitive to – such as fog, rain, or dust – radar technology is much less sensitive to,” said Machness.
Live up to the hype?
Not to say that’s the case here, but CES demos can, at best, be massaged to make technology look better than it is. Any live demo can. (Steve Jobs demonstrated the original iPhone in 2007 on a prototype that would fail if he didn’t follow an exact sequence of steps while showing how it worked.) Demos in the virtual era – like the virtual, livestreamed show that is CES 2021 – open up even more opportunities for misrepresentation.
When it comes to autonomous vehicles and imaging, there are many question marks. Until the problem of autonomous driving is perfected (and what exactly would that mean?), there will be disagreements about how best to build it. Lidar, for example, has its staunch supporters, while Tesla CEO Elon Musk has declared it “unnecessary” and “a fool’s errand.”
However, new capabilities like these are not just competing approaches; they also represent breakthroughs that could become part of smarter hybrid systems that use the best of all worlds. In that role, Arbe is not the only company announcing autonomous-sensing breakthroughs at CES. Also at this year’s show, Seoul Robotics – a South Korean company – is introducing its first mass-market product, a next-generation, cross-industry plug-and-play lidar solution. Another startup, Cognata, is introducing Real 2 Sim, a new product that takes data from real-world drives and automatically converts it into simulations and datasets.
It’s not just self-driving cars that could benefit from this technology. For its part, Arbe is heavily focused on improving autonomous delivery robots so that they can better navigate the real world. “For the first generation, [creators went] overkill with the number of sensors they used,” said Machness. “But to try to cut costs, [they are now trying to] reduce the number of sensors, while also increasing the safety of these robots and their ability to move anywhere.”
The same technology could also be used to power the autonomous trucks, buses, drones, and more that will hit the streets in increasing numbers over the next few years.
Autonomous vehicles have been a headline-making part of CES since at least 2013. Hopefully this year, with the coronavirus stripping away the flash of the live event, the focus will shift more toward substance – and toward solving some of the problems that have kept autonomous vehicles partly in the realm of science fiction for so long.
After all, who wouldn’t want to see the next generation of vehicles with self-driving technology?