Robotaxis

Elon Musk claims Tesla will have 1 million robotaxis on roads next year. Currently, Tesla offers Autopilot – an advanced driver assistance system – as a standard feature in its cars. According to the company’s website, Autopilot can automatically hold a car in its lane and accelerate or brake on its own, for example, in response to pedestrians or other cars in its way. Tesla can also improve Autopilot with new features (or bug fixes) over time via over-the-air updates. In addition, Tesla sells a “Full Self-Driving,” or FSD, package for its vehicles.

Some would say that Tesla is years ahead of the competition when it comes to powertrain efficiency and software. Since October 2016, all Teslas sold have been “capable of full autonomy” pending software and an updated processor released this year. Each Tesla has 360-degree vision from 8 cameras, 12 ultrasonic sensors, and a forward-looking radar.

Despite the equipment that comes with a Tesla, there are people who believe Tesla’s vision-based autonomy strategy will not work with existing computer vision and AI. Sean Chandler from Seeking Alpha doesn’t think it will be reliable enough for regulatory approval.

Sean thinks it is clear that, in the long run, a vision-based autonomous car is the most cost-effective way to go. Humans drive with just two eyes; Tesla has 8 eyes, 12 ultrasonic sensors, and a forward radar. However, humans have a deep understanding of the roads we drive and the things we see. Having more eyes and sensors means that the car doesn’t have to be as smart as a human, but it still needs to be very smart. He would argue that Tesla has yet to prove that it can handle autonomy across a variety of scenarios.

Earlier this year during a podcast with ARK Invest, Elon Musk said that the cars would soon be feature complete, or level 3/4 on the scale below.

SAE levels of driving automation. Source: SAE.org

Here’s what Elon Musk said during the podcast:

I think we will be feature complete full self driving this year. Meaning the car will be able to find you in a parking lot, pick you up, take you all the way to your destination without an intervention. This year. I would say that I am certain of that; that is not a question mark.

Surprisingly, on Tesla’s recent earnings call, Musk confirmed that it’s still possible that something will be out this year:

  • “While it’s going to be tight, it still does appear that there will be at least an early access release of a feature-complete full self-driving feature this year. So it’s not for sure, but it appears to be on track for at least an early access release of a fully functional Full Self-Driving by the end of this year.”

With the exception of a demo in 2016 and a demo for investors earlier this year at Autonomy Day, customers and investors have no idea what to expect. (Videos of the demos: 2016, 2019.) One positive we have seen, though, is substantial improvement to Autopilot and Navigate on Autopilot, arguably the best Level 2 system on the market.

However, unlike companies like Waymo (NASDAQ:GOOG) or Zoox, which constantly release footage of what their vehicles are capable of, Tesla doesn’t release anything.

Here are videos of a Waymo car navigating a dust storm and a police-controlled intersection. Zoox has videos of its cars navigating San Francisco and Las Vegas, passing double-parked cars and yielding to dozens of pedestrians.

Recent updates to Autopilot have shown that it can detect stoplights, stop signs, and cones. Tesla’s Autopilot driver-assist features appear to be more advanced than those of any other production car available to consumers.

Tesla justified the latest price hike to its Full Self-Driving option with the release of Smart Summon days before Q3 ended, which allowed Tesla to recognize $30M worth of Full Self-Driving revenue on its quarterly report.

Sean thinks Smart Summon is great and appears to work most of the time, but it’s far from perfect. Parking lots are chaotic and happen to be an entirely new driving category for Tesla.

Last month, Scott Kubo released a video of his Model 3 heading straight towards a movable barrier on the Coronado Bridge in San Diego. It happened to be an extremely rare situation.

The concrete barrier is moved twice per day to regulate traffic flow between San Diego and Coronado Island. The driver was heading towards Coronado Island at approximately 5:40 PM during sunset, likely before or while the barrier was being moved back (the move usually starts around 5:30).

As Scott said in his video, the sun was setting, and the barrier on the right was casting a long shadow across the road and part of the movable barrier on the left. Only a tiny part of the barrier was in direct sunlight.

Dash Cam Footage. Source: Scott Kubo

The reflection is almost exactly the size of a lane marking, and the point at which the barrier meets the road is actually quite subtle.

The Autopilot system, which is constantly searching for lane markings, likely identified the reflection of the sun as a lane marking and misidentified the barrier as part of the road.

With computer vision, a big problem with roads is that they can vary in shades of gray. Some sections can be concrete (light gray), others asphalt (dark gray), and some sections may be newer (darker).

When Smart Summon was delayed, Elon Musk said that the team was obsessed with curbs and that the hardest part about concrete is the excessive shades of gray. Throw in shadows, which can fall at varying angles and intensities throughout the day and year, and this becomes even more difficult for a vision-only system.
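
To make the failure mode concrete, here is a minimal, hypothetical sketch of brightness-based lane-marking detection (plain NumPy; this illustrates the general problem, not Tesla’s actual pipeline, and all pixel values are made up). A sunlit sliver of concrete can clear the same brightness test as genuine lane paint, so raw pixel values alone can’t separate the two:

```python
import numpy as np

# Toy illustration only: real lane detection uses learned models and
# temporal context, not a single brightness threshold. All values invented.

def detect_lane_candidates(gray_row, threshold=200):
    """Flag pixels bright enough (0-255 grayscale) to be lane paint."""
    return gray_row > threshold

# One row of road pixels: dark asphalt, a white painted lane line,
# a barrier in shadow, then a thin sunlit sliver of the same barrier.
asphalt = np.full(40, 90)            # dark gray asphalt
paint = np.full(5, 230)              # genuine white lane marking
shadowed_barrier = np.full(30, 80)   # shadowed concrete: reads like road
sunlit_barrier = np.full(5, 225)     # sunlit concrete: reads like paint

row = np.concatenate([asphalt, paint, shadowed_barrier, sunlit_barrier])
mask = detect_lane_candidates(row)

print("Candidate lane-marking pixels:", np.where(mask)[0])
# Both the real paint (indices 40-44) and the sunlit barrier (75-79) are
# flagged: brightness alone can't tell a lane line from sunlit concrete.
```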

The situation Scott came across is so rare that just replicating the conditions would be difficult. During the summer, when the sun sets much later (long after the barrier is moved), these conditions aren’t possible. And due to the thickness of the shadow, this event can only happen for a few minutes each day.

This is one of those scenarios where engineers are chasing the 9s to make the system 99.9999-something percent accurate.
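
To put “chasing the 9s” in perspective, here is a back-of-the-envelope sketch. The decision rate below is an illustrative assumption (roughly one perception decision per camera frame at 30 fps), not a published Tesla figure; the point is that at this volume of decisions, rare errors still surface regularly:

```python
# Back-of-the-envelope: how often would a perception error surface at a
# given accuracy? The decision rate is an illustrative assumption
# (~30 camera frames per second), not a published figure.

decisions_per_second = 30
seconds_per_hour = 3600

for nines in range(4, 8):
    error_rate = 10 ** -nines             # errors per decision
    accuracy = 1 - error_rate             # e.g. 0.9999, 0.99999, ...
    errors_per_hour = error_rate * decisions_per_second * seconds_per_hour
    print(f"{accuracy:.7f} accurate -> about one error "
          f"every {1 / errors_per_hour:,.1f} hours of driving")
```

Under these assumptions, even 99.99999 percent per-decision accuracy still implies an error roughly every 93 hours of driving, which is why each additional 9 matters.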

On the plus side, Tesla’s large fleet gets to encounter and discover unique situations like this that most likely wouldn’t occur in a simulation.

On the flip side, though, it raises questions about autonomy under Tesla’s strategy, which relies primarily on vision. The system also uses ultrasonics and a front radar, but I’m not sure what role they played in this case.

While there are many ways that this situation could have been prevented (such as reading and understanding the flashing “Merge Right” sign, or the traditional merge sign), it goes to show that the system needs a lot of work.

In this case, a redundancy such as LiDAR would have been an easy way to identify the barrier.

When it comes to LiDAR, Elon Musk calls it “lame,” or more specifically an expensive “crutch,” for self-driving cars. He’s not wrong: LiDAR is expensive, and once computer vision is mastered it may not be necessary. However, the question remains how long it will take for computer vision to be mastered. Zoox CTO Jesse Levinson said it could be decades. That’s quite far off, but even if it took 3-5 years, that could be devastating, at least if the competition figures it out first.

In suburban Phoenix, Waymo is operating a fleet of self-driving cars, and some of those cars no longer use safety drivers. If Waymo can successfully deploy a large fleet of fully autonomous vehicles in this test area, its efforts will likely accelerate across the globe. Mastering a single location would likely reduce complications with regulators; in fact, if Waymo is successful in one city, regulators elsewhere may start asking Waymo to make their city the next one on the list.

A startup’s road to a self-driving future

This article was written by Azra Habibovic, Senior Researcher in automated vehicle systems at RISE Research Institutes of Sweden. It was originally published in Swedish by RISE in its newsletter and blog on automated vehicles: https://omad.tech.

I recently attended an event where Chris Urmson and Sterling Anderson from the startup Aurora discussed automated vehicles. The event was organized by the MIT Club of Northern California and took place at the research center PARC, a Xerox company where many major innovations have been created. There were around 300 participants.

Chris started and led Google’s self-driving project (now Waymo) before leaving it in 2016 to start Aurora together with Sterling, who until then had led Tesla’s development of active safety and Autopilot (as you may remember, he had a dispute with Tesla that was settled without going to court).

The discussion was very lively, thanks in large part to the moderator, Mark Platshon, who has started and financed several companies (including Tesla) and held many positions with various vehicle manufacturers.

Eventually, there will be a video recording of the discussion, but until then you will have to settle for my notes:

  • Why leave your dream job at Google or Tesla to start a new business? Both Chris and Sterling had very diplomatic answers to this question, but if you read between the lines, it is about being able to do things your own way. They had both learned what works and what does not, and saw their chance to do things right from the start. That includes the technology development itself, but also how to interact with others, including the authorities.
  • What makes their new company Aurora unique among many similar companies is that Aurora knows where it wants to go (“we know where we are going” was repeated at least ten times during the evening!). The company’s vision can be summarized in three words: safely, quickly, broadly.
  • What also makes the company unique, and what Chris and Sterling consider the guarantee of its success, is its employees and mix of experience. Many of Aurora’s employees are world-leading experts, and this is what attracts investors and partners.
  • Aurora wants to develop a “driver” that can be integrated into different vehicles. The company cooperates with several vehicle manufacturers and service providers to ensure that the driver it develops can be integrated into various vehicles and services. The former is, however, the priority; without a functioning self-driving system in vehicles, it becomes difficult to offer services based on self-driving vehicles. In both cases, the focus is on the development of communication platforms and interfaces. Aurora doesn’t want to be seen as a Tier 1 supplier.
  • According to Chris and Sterling, they want to let the vehicle manufacturers do what they are good at: building cars. They see no problem in vehicle manufacturers developing self-driving systems themselves; it is only positive, because then they realize how difficult it is and are impressed by Aurora’s system. I myself would not be so sure of that – we know that vehicle manufacturers are undergoing a transformation and actually have a lot of know-how.
  • There was a question about how, given that AI is developing quickly, they can be sure that the foundation they are laying now will “hold” in a couple of years. There is a big difference between integrating new functions into an existing software architecture and doing so on a purpose-built architecture. Aurora has chosen to develop its own architecture with the help of world-leading experts who set the norm in the area. As far as the hardware and related networks are concerned, that area is still under development, and right now it is unclear what works best.
  • Remote control of self-driving vehicles is not excluded, but only for very unusual cases. Fundamentally, a self-driving system must be able to cope with traffic on its own. The same applies to wireless communication (V2X): it is good to have, but it should definitely not be a prerequisite for self-driving vehicles.
  • High-resolution maps are needed. The concern that it is hard to keep them updated is a bit exaggerated, because the maps are created using the same sensors used for automated driving. This means that they can be continuously updated at no extra cost.
  • Currently, a combination of different sensors is needed. Once the algorithms have improved, it is not unlikely that cameras alone will be enough, but that will take a while. Lidar components are not inherently expensive, so there is no reason lidars should remain expensive once they are ordered in large quantities.
  • Safety was definitely the most discussed topic. A combination of field tests and simulations for specific operational domains is the way forward, combined with a well-thought-out and well-documented development process.
  • Having a constant dialogue with the authorities is crucial, and Aurora maintains one. It is about explaining to them how the system was developed, which standards and principles were followed, and how it was tested, so that they understand why the manufacturer believes in the system. A kind of mutual understanding. (I must say I was surprised at how often they used the words “explain” and “believe” in this context!) In their view, no third party is required to validate safety; the manufacturer has the best knowledge of the system. One can wonder how objective that would be.
  • There will be no “driving license” for automated vehicles. The systems are too complex and cannot be generic. Instead, it is important to explain the system to authorities and the public in their own language.
  • Flying cars could become a reality in the distant future. But right now there are many obstacles, not least regulatory ones, that make it less likely for such solutions to break through. In addition, an incredible number of flights would be required for such a solution to be cost-effective.
  • In the beginning, many, especially vehicle manufacturers, saw Google’s work on self-driving vehicles as madness. According to Chris, this changed when Uber entered the game; then the vehicle manufacturers began to realize the seriousness of the whole thing. He also points out that the automotive industry is not homogeneous; even among the most conservative companies, there are those who are futuristic.

In the end, I want to share a new acronym I learned (!): ACES (Autonomous, Connected, Electric, Shared).


A quiet ride

Bose is developing a new noise-canceling system for your car. The audio system will use microphones and algorithms to cancel out road noise.

Conventional methods of reducing noise from uneven pavement and rough roads include extra insulation and specialized tires. Bose takes another approach, using accelerometers mounted on the vehicle’s body and microphones inside the cabin to measure the vibrations that create noise. Software then processes the signals and uses the vehicle’s built-in audio system to counter the unwanted sound electronically, sending an acoustic cancellation signal through the vehicle’s speakers.
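
As a rough illustration of the underlying principle, here is a minimal sketch with synthetic signals. This is not Bose’s actual algorithm; production systems use adaptive filters and model the acoustic path from speaker to listener. Active noise cancellation plays a phase-inverted copy of the measured noise so the two waves cancel:

```python
import numpy as np

# Minimal sketch of active noise cancellation with synthetic signals.
# Real systems adapt continuously and account for speaker-to-ear
# acoustics and latency; this only shows the phase-inversion idea.

sample_rate = 48_000                      # audio samples per second
t = np.arange(sample_rate) / sample_rate  # one second of timestamps

# Stand-in for low-frequency road noise measured via body accelerometers.
road_noise = (0.5 * np.sin(2 * np.pi * 60 * t)
              + 0.2 * np.sin(2 * np.pi * 120 * t))

# The cancellation signal is the noise inverted in phase,
# played back through the car's existing speakers.
anti_noise = -road_noise

# At the listener's ear the two waveforms superimpose and cancel.
residual = road_noise + anti_noise

print("Peak level before cancellation:", np.max(np.abs(road_noise)))
print("Peak level after cancellation: ", np.max(np.abs(residual)))
```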

Production is planned to start by the end of 2021, and Bose will collaborate with automakers during the vehicle development process so its tech can be installed on the assembly line.


Self-driving grocery stores

Robomart is an autonomous mini grocery store on wheels. The San Francisco-based company recently signed a partnership with the grocery store chain Stop & Shop, which will begin testing the driverless grocery vehicles in Boston this spring. The vehicles will carry Stop & Shop grocery items and meal kits to customers’ doorsteps.


Mobile carrier

Gita is a mobile carrier that can follow you.

This cool high-tech carrier can transport up to 45 pounds and can operate on irregular surfaces, indoors and outdoors, on sidewalks and streets, all while observing pedestrian etiquette.

Founded in 2015 by the Piaggio Group, Piaggio Fast Forward creates lightweight, intelligent mobility solutions for people and goods.
