Elon Musk claims Tesla will have 1 million robotaxis on roads next year. Currently, Tesla offers Autopilot – an advanced driver assistance system – as a standard feature in its cars. According to the company’s website, Autopilot can automatically hold a car in its lane and accelerate or brake on its own, for example in response to pedestrians or other cars in its way. Tesla can also improve Autopilot with new features (or bug fixes) over time via over-the-air updates. In addition, Tesla sells a “Full Self-Driving,” or FSD, package for its vehicles.

Some would say that Tesla is years ahead of the competition when it comes to powertrain efficiency and software. Since October 2016, all Teslas sold are “capable of full autonomy” pending software updates and the upgraded processor released this year. Each Tesla has 360-degree vision from 8 cameras, 12 ultrasonic sensors, and a forward-looking radar.

Despite the equipment that comes with a Tesla, there are people who believe Tesla’s vision-based autonomy strategy will not work with existing computer vision and AI. Sean Chandler from Seeking Alpha doesn’t think it will be reliable enough for regulatory approval.

Sean thinks it is clear that, in the long run, a vision-based autonomous car is the most cost-effective way to go. Humans drive with just two eyes; Tesla has 8 eyes, 12 ultrasonic sensors, and a forward radar. However, humans have a deep understanding of the roads we drive and the things we see. Having more eyes and sensors means the car doesn’t have to be as smart as a human, but it still needs to be very smart. He would argue that Tesla has yet to prove it can handle autonomy across a variety of scenarios.

Earlier this year, during a podcast with ARK Invest, Elon Musk said that the cars would soon be feature complete, or roughly Level 3/4 autonomy.

Here’s what Elon Musk said during the podcast:

I think we will be feature complete full self driving this year. Meaning the car will be able to find you in a parking lot, pick you up, take you all the way to your destination without an intervention. This year. I would say that I am certain of that; that is not a question mark.

Surprisingly, on Tesla’s recent earnings call he confirmed that it’s still possible that something will be out this year:

  • “While it’s going to be tight, it still does appear that there will be at least an early access release of a feature-complete full self-driving feature this year. So it’s not for sure, but it appears to be on track for at least an early access release of a fully functional Full Self-Driving by the end of this year.”

With the exception of a demo in 2016 and a demo for investors earlier this year at Autonomy Day, customers and investors have no idea what to expect. Some positives we have seen, though, are substantial improvements to Autopilot and Navigate on Autopilot, which is the best Level 2 system on the market.

However, unlike companies like Waymo (NASDAQ:GOOG) or Zoox that constantly release footage of what their vehicles are capable of, Tesla doesn’t release anything.

Here are videos of a Waymo car navigating a dust storm and a police-controlled intersection. Zoox has videos of its cars navigating San Francisco and Las Vegas, passing double-parked cars and yielding to dozens of pedestrians.

Recent updates to Autopilot have shown that it can detect stoplights, stop signs, and cones. Tesla’s Autopilot driver-assist features appear to be more advanced than those of any other production car available to consumers.

Tesla justified the latest price hike to its Full Self-Driving option with the release of Smart Summon, which was released days before Q3 ended, allowing Tesla to recognize $30M worth of Full Self-Driving revenue on its quarterly report.

Sean thinks Smart Summon is great and appears to work most of the time, but it’s far from perfect. Parking lots are chaotic and happen to be an entirely new driving category for Tesla.

Last month, Scott Kubo released a video of his Model 3 heading straight towards a movable barrier on Coronado Bridge in San Diego. This happened to be an extremely rare situation.

The concrete barrier is moved twice per day to regulate traffic flow between San Diego and Coronado Island. The driver was heading towards Coronado Island at approximately 5:40PM during sunset, likely before or while the barrier was being moved back (usually starts around 5:30).

Like Scott said in his video, the sun was setting and the barrier on the right was casting a long shadow across the road and part of the movable barrier on the left. Only a tiny part of the barrier was in direct sunlight.

Dash Cam Footage. Source: Scott Kubo

The reflection is just the right size and shape to resemble a lane marking, and the point at which the barrier meets the road is actually quite subtle.

The Autopilot system, which is constantly searching for lane markings, likely identified the reflection of the sun as a lane marking and misidentified the barrier as part of the road.

With computer vision, a big problem with roads is that they vary in shades of gray. Some sections are concrete (light gray), some are asphalt (dark gray), and newer sections may be darker still.

When Smart Summon was delayed, Elon Musk said the team was obsessed with curbs and that the hardest part about concrete is the excessive shades of gray. Throw in shadows, which vary in angle and intensity throughout the day and year, and this becomes even more difficult for a vision-only system.
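To make the shades-of-gray problem concrete, here is a minimal, entirely hypothetical sketch (the pixel values and threshold are illustrative assumptions, not Tesla’s actual pipeline). A naive brightness-based lane detector cannot tell painted lane markings from a narrow band of reflected sunlight, which is essentially what the Coronado Bridge footage shows:

```python
import numpy as np

# Toy grayscale road scanline (0 = black, 255 = white), hypothetical values:
# dark asphalt ~60, lighter concrete ~140, a painted lane marking ~220,
# and a narrow strip of reflected sunlight on a barrier also ~220.
row = np.array([60, 60, 220, 60, 60, 140, 140, 220, 140], dtype=np.uint8)

# A naive lane detector: any pixel much brighter than the road surface
# is treated as paint. A fixed brightness threshold cannot separate
# paint from glare.
THRESHOLD = 200
lane_candidates = row > THRESHOLD

print(lane_candidates.tolist())
# Both the real marking (index 2) and the sunlit barrier edge (index 7)
# clear the threshold -- by brightness alone they are indistinguishable.
```

Real systems use far richer features than raw brightness, but the example shows why a well-placed reflection can masquerade as road paint.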

The situation Scott came across is so rare that just replicating the conditions would be difficult. During the summer, when the sun sets much later (long after the barrier is moved), these conditions aren’t possible. And due to the thickness of the shadow, this event can only happen for just a few minutes each day.

This is one of those scenarios where engineers are chasing the 9s to make the system 99.9999-something percent accurate.
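A back-of-the-envelope calculation shows why each extra “9” matters at fleet scale. The mileage figure below is an illustrative assumption, not a Tesla number:

```python
# A rough illustration of "chasing the 9s" (all numbers are illustrative
# assumptions). At fleet scale, even a very reliable system still produces
# many failures per year, so every additional "9" of reliability counts.
fleet_miles_per_year = 1_000_000_000  # hypothetical fleet mileage

for nines in (4, 5, 6, 7):
    failure_rate_per_mile = 10 ** -nines
    expected_failures = fleet_miles_per_year * failure_rate_per_mile
    print(f"99.{'9' * (nines - 2)}% per-mile reliability -> "
          f"{expected_failures:,.0f} failures/year")
# Each additional nine cuts the expected failure count tenfold.
```

Even at one failure per million miles, a billion-mile fleet would hit roughly a thousand such edge cases a year.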

On the plus side, Tesla’s large fleet gets to encounter and discover unique situations like this that most likely wouldn’t occur in a simulation.

On the flip side, though, it raises questions about Tesla’s autonomy strategy, which relies primarily on vision. The system also uses ultrasonics and a front radar, but it’s unclear what role they played in this case.

While there are many ways that this situation could have been prevented (such as reading and understanding the flashing “Merge Right” sign, or the traditional merge sign), it goes to show that the system needs a lot of work.

In this case, a redundancy such as LiDAR would have been an easy way to identify the barrier.

When it comes to LiDAR, Elon Musk calls it “lame”, or more specifically an expensive “crutch” for self-driving cars. He’s not wrong: LiDAR is expensive, and once computer vision is mastered it may not be necessary. However, the question remains how long it will take for computer vision to be mastered. Zoox CTO Jesse Levinson said it could be decades. That’s quite far off, but even if it took 3-5 years, that could be devastating, at least if the competition figures it out first.

In suburban Phoenix, Waymo is operating a fleet of self-driving cars and some of those cars are no longer using safety drivers. If Waymo can successfully deploy a large fleet of fully autonomous vehicles in this test area, their efforts will likely be accelerated across the globe. Mastering a single location would likely reduce complications with regulators; in fact, if Waymo is successful in one city, regulators in other cities may be asking Waymo to be in the next city on the list.

A startup’s road…

A startup’s road to a self-driving future.

This is an article written by Azra Habibovic, Senior Researcher in automated vehicle systems at RISE Research Institutes of Sweden. The article was originally published in Swedish by RISE within its newsletter and blog on automated vehicles:   

I recently attended an event where Chris Urmson and Sterling Anderson from the startup Aurora discussed automated vehicles. The event was organized by the MIT Club of Northern California and took place at PARC, a Xerox research center where many major innovations have been created. There were around 300 participants at the event.

Chris started and led Google’s self-driving project (now Waymo) before leaving in 2016 to found Aurora together with Sterling, who until then had led Tesla’s development of active safety and Autopilot (as you may remember, he had a dispute with Tesla that was settled without going to court).

The discussion was very lively, thanks largely to the moderator Mark Platshon, who has started and financed several companies (including Tesla) and worked in many positions with various vehicle manufacturers.

Eventually, there will be a video recording of the discussion, but until then you will have to settle for my notes:

  • Why leave your dream job at Google/Tesla to start a new business? Both Chris and Sterling had very diplomatic answers to this question, but if you read between the lines, it is about being able to do things your own way. They had both learned what works and what does not, and saw their chance to do things right from the start. This includes the technology development itself, but also how to interact with others, including the authorities.
  • What makes their new company Aurora unique among many similar companies is that Aurora knows where it wants to go (“we know where we are going” was repeated at least ten times during the evening!). The company’s vision can be summarized in three words: safely, quickly, broadly.
  • What also makes the company unique, and what Chris and Sterling consider the guarantee of success, is its employees and mix of experience. Many of Aurora’s employees are world-leading experts, and this is what attracts investors and partners.
  • Aurora wants to develop a “driver” that can be integrated into different vehicles. The company cooperates with several vehicle manufacturers and service providers to ensure that the driver can be integrated into various vehicles and services. The former is, however, the priority; without a functioning self-driving system in vehicles, it is difficult to build services on self-driving vehicles. In both cases, the focus is on the development of communication platforms and interfaces. Aurora doesn’t want to be seen as a Tier 1 supplier.
  • According to Chris and Sterling, they want to let the vehicle manufacturers do what they are good at: building cars. They see no problem in vehicle manufacturers developing self-driving systems themselves; it is only positive, because then they realize how difficult it is and are impressed by Aurora’s system. I myself would not be so sure of that – we know that vehicle manufacturers are undergoing a transformation and actually have a lot of know-how.
  • There was a question about their view on the fact that AI is developing quickly and how they can be sure that the foundation they are putting now will “hold” in a couple of years. There is a big difference between integrating new functions into an existing software architecture, and doing so on a specially developed architecture. Aurora has chosen to develop its own architecture with the help of world-leading experts who set the norm in the area. As far as the hardware and related networks are concerned, it is an area that is under development, and right now it is unclear what works best.
  • Remote control of self-driving vehicles is not excluded, but only for very unusual cases. Basically, a self-driving system must be able to cope with traffic on its own. The same applies to wireless communication (V2X): it is good to have, but it should definitely not be a prerequisite for self-driving vehicles.
  • High-resolution maps are needed. The concern that it is hard to keep them updated is a bit exaggerated, because the maps are created using the same sensors used for automated driving. This means that maps can be continuously updated at no extra cost.
  • Currently, a combination of different sensors is needed. Once the algorithms have improved, it is not unlikely that cameras alone will suffice, but that will take a while. The components in a lidar are not inherently expensive, so there is no reason lidars should remain expensive once they are ordered in large quantities.
  • Safety was definitely the most discussed topic. A combination of field tests and simulations for specific operative domains is the way forward, combined with a well-thought-out and well-documented development process.
  • Having a constant dialogue with the authorities is crucial. It is about explaining to them how the system was developed, which standards and principles were followed, and how it was tested, so that they understand why the manufacturer believes in the system – a kind of mutual understanding. (I must say I was surprised at how often they used the words “explain” and “believe” in this context!) Aurora has a constant dialogue with the authorities. No third party is required to validate safety; it is the manufacturer who has the best knowledge of the system. One can then wonder how objective this would be.
  • There will be no “driving license” for automated vehicles. The systems are too complex and cannot be generic. Instead, it is important to explain the system to authorities and the public in their own language.
  • Flying cars can become a reality in the distant future. But right now there are many obstacles, not least regulatory ones, that make it less likely for such solutions to break through. In addition, an incredible number of flights would be required for such a solution to be cost-effective.
  • In the beginning, many, especially vehicle manufacturers, saw Google’s work on self-driving vehicles as madness. According to Chris, this changed when Uber entered the game; then the vehicle manufacturers began to realize the seriousness of the whole thing. He also points out that the automotive industry is not homogeneous; even among the most conservative companies, there are those who are futuristic.

In the end, I want to share with you that I learned a new acronym (!) – ACES (Autonomous, Connected, Electric, Shared).

#future #mobility #blog #future mobility blog

A quiet ride

Bose is developing a new noise-canceling system for your car. The audio system will use microphones and algorithms to cancel out sound.

Conventional methods of reducing noise include extra insulation and specialized tires to dampen the sound of uneven pavement and rough roads. Bose is taking another approach, with accelerometers mounted on the vehicle’s body and microphones inside the cabin to measure the vibrations that create noise. Advanced software then processes the signals and uses the vehicle’s built-in audio system to counter unwanted sound by sending an acoustic cancellation signal through the vehicle’s speakers.
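The core idea behind active noise cancellation can be sketched in a few lines. This is an idealized toy model, not Bose’s system: measure the unwanted sound and play a phase-inverted copy, so the two waves sum to silence at the listener’s ear.

```python
import numpy as np

# Idealized active noise cancellation (illustrative only -- the real
# system must estimate the noise and adapt in real time): the anti-noise
# signal is the phase-inverted copy of the measured noise, so the two
# waves cancel when they meet.
t = np.linspace(0, 1, 1000, endpoint=False)   # one second of samples
road_noise = 0.5 * np.sin(2 * np.pi * 40 * t) # a 40 Hz low-frequency rumble
anti_noise = -road_noise                      # inverted cancellation signal

residual = road_noise + anti_noise
print(np.max(np.abs(residual)))  # -> 0.0 in this perfectly-matched case
```

In practice the cancellation is never perfect, since the anti-noise must be predicted slightly ahead of time and matched in amplitude and phase, which is where the accelerometers and software come in.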

Production is planned to start by the end of 2021, and Bose will collaborate with automakers during the vehicle development process to install its tech on the production line.


Self-driving grocery stores

Robomart is an autonomous mini grocery store on wheels. The San Francisco-based company recently signed a partnership with the grocery store chain Stop & Shop, which will begin testing driverless grocery vehicles in Boston this spring. The vehicles will cart Stop & Shop grocery items and meal kits to customers’ doorsteps.


5 trends reshaping cars

Are you tired of driving your own car? Want to know when self-driving vehicles will actually be available?

A few trends are reshaping the way we will travel. We have listed 5 trends that will have an impact on how we use our cars in the future.

  1. Electrical cars
  2. Connectivity and IoT
  3. Self-driving cars
  4. Services instead of owning – Pay-per-use
  5. New HMI

Connected and self-driving cars could eliminate the more than 90 percent of accidents caused by human error. They could also help eliminate hours wasted in traffic and reduce congestion. Automakers are all racing to put these cars on the road, but the industry faces regulations that limit the adoption of this new technology. The industry is therefore divided on when we will see the first commercial self-driving car: some say 2020, while others expect it to take more than 10 years.

One important technology for self-driving cars is LiDAR (light detection and ranging). LiDAR sensors can be found in vehicles that offer ADAS (advanced driver assistance system) features such as pedestrian detection with automatic emergency braking. LiDAR complements radar and cameras.

Self-driving cars need to combine all three sensor types in the right mix to drive safely. Increased production of self-driving cars will help drive down LiDAR prices, but that could take a few years. Partnerships among automakers, suppliers, and sensor makers will play a role in LiDAR proliferation and price declines, too.

For any type of driverless vehicle, the first thing you need is a perception module. This enables the vehicle to react to shapes and textures around it, including people, vehicles, obstructions, lane markings, and traffic light colors. While cameras can detect both shape and color, radar and LiDAR recognize only shapes.

Cameras are a good primary source of information for perception, with LiDAR and radar used as redundancy for shape. With all the sensors combined, it is possible to build a very accurate model of the environment around the vehicle.
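The redundancy idea above can be sketched as a simple cross-check (the detections, coordinates, and tolerance below are hypothetical, not any automaker’s real fusion stack): the camera proposes labeled objects, and lidar/radar independently confirm whether a solid shape exists at the same spot.

```python
# Hypothetical sensor fusion sketch: camera detections are labeled objects
# with (x, y) positions in meters; lidar returns are unlabeled solid shapes.
camera = {"pedestrian": (12.0, 3.1), "lane_marking": (5.0, 0.0)}
lidar_shapes = [(12.1, 3.0)]  # lidar sees one solid object near (12, 3)

def confirmed_by_shape(pos, shapes, tolerance=0.5):
    """True if any lidar/radar return lies within `tolerance` meters of pos."""
    return any(abs(pos[0] - x) <= tolerance and abs(pos[1] - y) <= tolerance
               for x, y in shapes)

for label, pos in camera.items():
    print(label, confirmed_by_shape(pos, lidar_shapes))
# pedestrian True   (a physical obstacle, cross-checked by lidar)
# lane_marking False (flat paint has no shape -- vision-only evidence)
```

The point of the sketch: physical obstacles get a second, independent vote from shape sensors, while flat features like paint (or a sun reflection) remain vision-only decisions.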

As automakers prepare for a world of self-driving cars, they’re experimenting with many different types of human-machine interface (HMI) technologies, including interior-facing cameras, gesture and voice controls, and touch-sensitive surfaces for interacting with the driver. This is all supported by new and smarter computing platforms.

In the race for the next-generation battery, lithium-ion technology has improved a lot over the years. But, as in the oil industry, the power packs use raw materials mined in unstable countries, and they’re dangerous if they break. Many companies are spending money on new technologies to store more energy, so the power lasts longer at a lower cost.

We still don’t know how electric and autonomous vehicles will be sold or marketed in the future. Automakers have hinted that ride-sharing and shuttle services may become their primary market, and it will be interesting to see how pay-per-use services develop.


Flying car with…

…vertical take-off and landing.

Soon you will be able to fly your car with vertical take-off and landing. And if you are wondering where to park, it will be able to fit in a single car garage.

Powered by two 600-horsepower plug-in hybrid electric motors and a 300-horsepower fuel engine, the TF-X is planned to have a flight range of 500 miles (805 km) with a cruising speed of 200 mph (322 km/h) without needing to refuel or recharge. Road speed is currently unknown.

The Terrafugia TF-X is the first fully autonomous flying car under development by Boston-based Terrafugia. Its expected release date is listed as eight to twelve years away.

Big data and mobility

Predictive analytics will save lives

Big data used to refer simply to a large volume of data. Over the years, the term has expanded to include the ability to capture, store, and manage data, and to use analytics to make predictions about future events.

The integration of digital technologies and use of data in our daily lives occurs at an exponential pace. Companies in the automotive industry are finding new and efficient ways to analyze data to increase traffic safety by predicting behavior.

Car crashes killed more than 35,000 people in the U.S. in 2018, and human error was attributed to more than 90% of those crashes. Autonomous technology represents a great opportunity to improve traffic safety.

In 2019, we will see new efforts to combine technologies with products, using IoT to stream data and apply Machine Learning in real time.

Instead of using stored data from a traditional, controlled environment, it will become more common to process live streamed data with the help of new frameworks. Daily batches of data are often not sufficient; predictions are expected immediately.

With the help of multiple sensors, including lidar, radar, and cameras, automotive companies can process and analyze new data in real time. Stream processing helps users query continuous data streams and detect conditions immediately after receiving the data. Speed is essential, and new data has to be processed as soon as it is known to the system.

Faster processing will help predict traffic behavior in real time and enable automotive companies to emphasize proactive safety rather than reactive safety. This paradigm shift could prevent a majority of car accidents, especially those attributed to human error.
