Elon Musk claims Tesla will have 1 million robotaxis on roads next year. Currently, Tesla offers Autopilot, an advanced driver assistance system, as a standard feature in its cars. According to the company's website, Autopilot can automatically hold a car in its lane and accelerate or brake, for example, in response to pedestrians or other cars in its path. Tesla can also improve Autopilot with new features (or bug fixes) over time via over-the-air updates. In addition, Tesla sells a "Full Self-Driving," or FSD, package for its vehicles.

Some would say that Tesla is years ahead of the competition when it comes to powertrain efficiency and software. Since October 2016, every Tesla sold has been "capable of full autonomy," pending software and an updated processor released this year. Each Tesla has 360-degree vision from 8 cameras, 12 ultrasonic sensors, and a forward-looking radar.

Despite the equipment that comes with a Tesla, there are people who believe Tesla's vision-based autonomy strategy will not work with existing computer vision and AI. Sean Chandler of Seeking Alpha doesn't think it will be reliable enough for regulatory approval.

Sean thinks it is clear that, in the long run, a vision-based autonomous car is the most cost-effective way to go. Humans drive with just two eyes; a Tesla has 8 eyes, 12 ultrasonic sensors, and a forward radar. However, humans have a deep understanding of the roads we drive and the things we see. Having more eyes and sensors means the car doesn't have to be as smart as a human, but it still needs to be very smart. He would argue that Tesla has yet to prove it can handle autonomy across a variety of scenarios.

Earlier this year during a podcast with ARK Invest, Elon Musk said that the cars would soon be feature complete, or Level 3/4 autonomy.

Here’s what Elon Musk said during the podcast:

I think we will be feature complete full self driving this year. Meaning the car will be able to find you in a parking lot, pick you up, take you all the way to your destination without an intervention. This year. I would say that I am certain of that; that is not a question mark.

Surprisingly, on Tesla’s recent earnings call he confirmed that it’s still possible that something will be out this year:

  • “While it’s going to be tight, it still does appear that there will be at least an early access release of a feature-complete full self-driving feature this year. So it’s not for sure, but it appears to be on track for at least an early access release of a fully functional Full Self-Driving by the end of this year.”

With the exception of a demo in 2016 and a demo for investors earlier this year at Autonomy Day, customers and investors have no idea what to expect (videos of both demos are available). One positive we have seen, though, is substantial improvement to Autopilot and Navigate on Autopilot, which is the best Level 2 system on the market.

However, unlike companies like Waymo (NASDAQ:GOOG) or Zoox, which are constantly releasing footage of what their vehicles are capable of, Tesla doesn't release anything.

Here are videos of a Waymo car navigating a dust storm and a police-controlled intersection. Zoox has videos of its cars navigating San Francisco and Las Vegas, such as passing double-parked cars and yielding to dozens of pedestrians.

Recent updates to Autopilot have shown that it can detect stoplights, stop signs, and cones. Tesla's Autopilot driver-assist features appear to be more advanced than those of any other production car available to consumers.

Tesla justified the latest price hike to its Full Self-Driving option with the release of Smart Summon, which was released days before Q3 ended, allowing Tesla to recognize $30M worth of Full Self-Driving revenue on its quarterly report.

Sean thinks Smart Summon is great and appears to work most of the time, but it's far from perfect. Parking lots are chaotic and happen to be an entirely new driving category for Tesla.

Last month, Scott Kubo released a video of his Model 3 heading straight towards a movable barrier on Coronado Bridge in San Diego. This happened to be an extremely rare situation.

The concrete barrier is moved twice per day to regulate traffic flow between San Diego and Coronado Island. The driver was heading towards Coronado Island at approximately 5:40PM during sunset, likely before or while the barrier was being moved back (usually starts around 5:30).

Like Scott said in his video, the sun was setting and the barrier on the right was casting a long shadow across the road and part of the movable barrier on the left. Only a tiny part of the barrier was in direct sunlight.

Dash Cam Footage. Source: Scott Kubo

The reflection happens to be just the right size to look like a lane marking, and the point where the barrier meets the road is actually quite subtle.

The Autopilot system, which is constantly searching for lane markings, likely identified the reflection of the sun as a lane marking and misidentified the barrier as part of the road.

With computer vision, a big problem with roads is that they can vary in shades of gray. Some sections can be concrete (light gray), some asphalt (dark gray), and some road sections may be newer (darker).

When Smart Summon was delayed, Elon Musk said the team was obsessed with curbs and that the hardest part about concrete is the excessive shades of gray. Throw in shadows, which fall at varying angles and intensities throughout the day and year, and this becomes even more difficult for a vision-only system.
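To illustrate the point, here is a toy sketch (with made-up brightness values, not Tesla's actual perception pipeline) of why brightness alone is a poor lane-marking cue: a naive detector that flags any bright, narrow band in a grayscale profile cannot distinguish painted lines from a sunlit sliver of concrete barrier.

```python
import numpy as np

# Hypothetical 1-D brightness profile across a road at sunset
# (0 = black, 255 = white). All values are invented for illustration.
asphalt        = np.full(20, 60)   # dark gray road surface
lane_marking   = np.full(3, 230)   # white painted line
shadowed_road  = np.full(20, 45)   # road under the barrier's long shadow
sunlit_barrier = np.full(3, 235)   # thin strip of barrier catching direct sun

profile = np.concatenate([asphalt, lane_marking, shadowed_road, sunlit_barrier])

# Naive detector: anything brighter than a fixed threshold is "paint".
is_bright = profile > 180
print(np.flatnonzero(is_bright))
# Both the painted line and the sunlit barrier strip exceed the threshold,
# so a brightness-only cue labels the barrier edge as a lane marking.
```

Real systems use far richer cues (gradients, geometry, temporal consistency), but the underlying ambiguity between "bright paint" and "bright concrete" is the same one described above.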

The situation Scott came across is so rare that just replicating the conditions would be difficult. During the summer, when the sun sets much later (long after the barrier is moved), these conditions aren’t possible. And due to the thickness of the shadow, this event can only happen for just a few minutes each day.

This is one of those scenarios where engineers are chasing the 9s to make the system 99.9999-something percent accurate.

On the plus side, Tesla’s large fleet gets to encounter and discover unique situations like this that most likely wouldn’t occur in a simulation.

On the flip side though, it raises questions about autonomy under Tesla’s strategy that relies primarily on vision. The system also uses ultrasonics and a front radar, but I’m not sure what happened in this case.

While there are many ways that this situation could have been prevented (such as reading and understanding the flashing “Merge Right” sign, or the traditional merge sign), it goes to show that the system needs a lot of work.

In this case, a redundancy such as LiDAR would have been an easy way to identify the barrier.

When it comes to LiDAR, Elon Musk calls it "lame", or more specifically an expensive "crutch" when it comes to self-driving cars. He's not wrong. LiDAR is expensive, and once computer vision is mastered it may not be necessary. However, the question remains as to how long it will take for computer vision to be mastered. Zoox CTO Jesse Levinson said it could be decades. That's quite far off, but even if it took 3-5 years that could be devastating, at least if the competition figures it out first.

In suburban Phoenix, Waymo is operating a fleet of self-driving cars, and some of those cars are no longer using safety drivers. If Waymo can successfully deploy a large fleet of fully autonomous vehicles in this test area, its efforts will likely be accelerated across the globe. Mastering a single location would likely reduce complications with regulators; in fact, if Waymo is successful in one city, regulators elsewhere may ask Waymo to come to their city next.

Who will win the race?

General Motors is putting down enormous, simultaneous stakes on electric cars and self-driving technology, a strategic bet based on its confidence that future automated vehicles will run only on electricity.

It's a risky bet, especially if EVs and AVs are slow to be adopted by consumers. Other carmakers, like Ford, see near-term limitations to battery-electric AVs and favor a more measured approach.

General Motors believes both technologies are approaching a tipping point and hopes they will push it to the forefront of an enormous industry shift toward shared, self-driving electric cars.

“The only way to lead in either one is to go heavy and hard in both of them,” said Doug Parks, GM VP for autonomous and electric vehicle programs.

Automakers are split on the path to electrification, the Wall Street Journal reports.

  • Toyota and Ford are rolling out more hybrid gas-electric models as part of a gradual shift toward fully electric cars. They are both heavily invested in hybrid tech.
  • GM and Volkswagen see no need for hybrids as a technology bridge and are concentrating instead on all-electric models. They also need to defend their strong positions in China, where they face government mandates to sell more zero-emissions vehicles.

The companies’ views on self-driving technology also influence their plans toward electrification.

GM says all-electric autonomous cars have an advantage over hybrid- and gasoline-powered ones.

  • They’re ideal for dense cities that need solutions for congestion and pollution.
  • AVs require a lot of electrical power to run their advanced sensing and computer hardware.
  • An all-electric AV is more responsive so it can make decisions and complete maneuvers like navigating around a double-parked car more quickly.

Ford sees real-world issues that favor hybrid powertrains for AVs for the time being.

  • Building the necessary charging infrastructure will add to the already capital-intensive challenge of developing AV technology and operations.
  • Until battery technology improves, an AV’s power-hungry computers (along with air conditioning and entertainment systems) will deplete more than half the range of a battery-electric vehicle.
  • If cars are sitting on chargers, they aren’t making money. 
  • Fast-charging is needed to run an AV fleet, but Ford says repeated use will degrade the battery’s lifespan. (GM says this isn’t a concern.)

“We all want to transition to BEVs eventually, but we also need to find the right balance that will help develop a profitable, viable business model,” said Sherif Marakby, president and CEO of Ford Autonomous Vehicles LLC.

Uber and drone deliveries

The ride-sharing provider Uber is creeping further into daily life, with ambitions to help manage everything from how people get to work to what meals they order when they get home.

At a conference this week Uber was showcasing concepts and partnerships as it tries to seed an ecosystem to support the world’s first urban air taxi network.

Speaking Tuesday to an audience at the Economic Club of Washington, D.C., Uber CEO Dara Khosrowshahi said…

  • “We really want to move from being a ride-hailing app to essentially being your transportation partner.”
  • His vision is to have a multi-modal Uber transportation network that, with a push of a button, helps people plan how to get from A to B, balancing tradeoffs like time, convenience and price.
  • Uber doesn’t just want to move people — this morning, the company announced it will start testing drone food delivery in urban areas.

Serious obstacles stand in the way of flying cars, including regulations, infrastructure and air traffic management issues, not to mention consumer acceptance and safety, Deloitte’s Robin Lineberger tells Axios.

  • “They will be a legitimate part of a multi-modal transportation system, in 20 to 30 years,” he says.

At its third annual Uber Elevate conference, the company showcased vertical takeoff and landing models from 5 potential suppliers, including a full-size model from Bell that looks like a cross between a plane and a helicopter.

  • Attendees could climb into a mockup of a cabin interior built by the French aerospace company Safran or strap on virtual reality headsets to experience what a flying taxi ride would be like.
  • Uber said Melbourne, Australia, would be its first international pilot site for Uber Air, after Los Angeles and Dallas-Fort Worth in the United States, starting in 2020.
  • And it showcased 16 potential designs for urban skyports, where flying taxis would take off and land.

Uber wants to “get the industry moving and designing these vehicles so that they can be available for urban transportation,” Khosrowshahi said. “We want the pricing of this service to be ultimately available for the masses versus just the elites.”

So far, it looks out of reach for most. A new 8-minute Uber helicopter shuttle between Manhattan and John F. Kennedy Airport is expected to cost $200 when it launches next month.

  • In the wake of a helicopter crash this week on the roof of a building in Manhattan, some lawmakers want to see a ban on such flights.
  • And some argue it overshadows existing transit reform efforts.

The bottom line: The market for air taxis is expected to grow from $3.4 billion in 2025 to $17.7 billion by 2040, according to Deloitte, and for newly public Uber, which lost $1 billion in the first quarter of 2019, that opportunity is hard to pass up.

  • “We think it’s time to lean forward,” Khosrowshahi says. “The business is well positioned to profit. But the next 2, 3, 4 years are going to be about growth.”


#future of mobility #future mobility

Real world testing to L4

This is an article written by Michael DeKort: "Today I see VW switched from Aurora to Argo. That switch will not matter in the least. As a matter of fact, you could literally combine every autonomous vehicle maker together and you would still never get remotely close to a legitimate L4. The use of public shadow and safety driving, as well as gaming engines for simulation, is that debilitating.

Public Shadow and Safety Driving — There is no way to mitigate the amount of time/work/miles (500B per RAND; one trillion per Toyota) required, by driving and re-driving scenarios you would have to stumble and re-stumble on; to spend the associated amount of money (over $300B); or to survive the casualties that will be created when handover is involved in time-critical scenarios, or when thousands of accident scenarios are run thousands of times each to train the ML.

Gaming Engine based Simulation — With regard to gaming engines. The real-time and model fidelity issues those systems cause especially in complex scenarios will cause differences between what the Planning system believes will occur and what will actually occur in analogous real-world scenarios. That will provide significant false confidence and lead to tragedies.

The Solution — The remedy is to use aerospace/DoD/FAA simulation technology and systems/safety engineering, including augmenting that bottom-up Agile approach with a parallel top-down approach. We also have a built-in RL Engine automatic training capability built into the simulation. I will be glad to provide a demonstration that this is the right approach. (Note: when I mention this approach, folks usually think I am referring to air travel, which is not nearly as complex as what is needed here. What is as complex, and even more so, is simulated DoD urban war games: same locations, same complexity, and the same public exposure, except the ground vehicles can drive off the roads and shoot at each other.)

Update 6–12–2019 — Far worse than I stated

Over 1,400 self-driving vehicles are now in testing by 80+ companies across the US

Let's look at the math on this, comparing RAND's estimate that it would take 500 billion miles to prove a system is 10x better than a human with Toyota's estimate of one trillion miles. At 40 MPH, driven 24/7 by all of those vehicles, how long would it take to get to L4?

RAND's 500 billion miles = 1,019 years

Toyota's one trillion miles = 2,038 years

1/10 of RAND's estimated miles = 102 years
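A quick back-of-the-envelope script reproduces these figures, assuming the roughly 1,400 test vehicles above drive 40 MPH around the clock (fleet size and speed are taken from the text; the rest is simple arithmetic):

```python
# Fleet-wide miles accumulated per year at a constant speed, 24/7.
FLEET = 1_400          # vehicles in testing across the US
MPH = 40               # assumed average speed
HOURS_PER_YEAR = 24 * 365

miles_per_year = FLEET * MPH * HOURS_PER_YEAR  # about 490 million miles/year

for label, miles in [("RAND (500B miles)", 500e9),
                     ("Toyota (1T miles)", 1e12),
                     ("1/10 of RAND (50B miles)", 50e9)]:
    print(f"{label}: {miles / miles_per_year:,.0f} years")
```

Running this yields 1,019, 2,038, and 102 years respectively, matching the figures above.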

Please find more information in my articles below.

Using the Real World is better than Proper Simulation for Autonomous Vehicle Development — NONSENSE

SAE Autonomous Vehicle Engineering Magazine-End Public Shadow Driving

Common Misconceptions about Aerospace/DoD/FAA Simulation for Autonomous Vehicles

The Hype of Geofencing for Autonomous Vehicles

The Autonomous Vehicle Podcast — Featured Guest

The article is written by Michael DeKort, a former systems engineer, engineering manager, and program manager for Lockheed Martin. Michael worked in aircraft simulation, was the software engineering manager for all of NORAD, and worked on the Aegis Weapon System and on C4ISR for DHS.

Michael is a member of the SAE On-Road Autonomous Driving Validation & Verification Task Force and was recently asked by SAE to lead an effort to establish a new Modeling and Simulation group.

He is a stakeholder for UL4600 — Creating AV Safety Guidelines.

Michael has also been presented the IEEE Barus Ethics Award and is on the IEEE Artificial Intelligence & Autonomous Systems Policy Committee (AI&ASPC).

#future of mobility #future mobility

We’re driving computers.

Cars of the future need to be able to heal themselves. We’re not driving cars anymore — we’re driving computers.

Cars are complicated, with 100 or more computer-controlled subsystems that are needed for functions like steering, braking, and adjusting the seats. And all that hardware is embedded with massive amounts of software.

  • By 2020, 98% of new cars will be connected to the internet, making it easier to add new features or capabilities via over-the-air software updates.
  • Samsung’s Harman subsidiary handles OTA updates for 24 automakers, but most carmakers still require customers to make a trip to the dealership for software updates.
  • Tesla is the exception. The company says it has pushed out hundreds of OTA updates to its vehicles since 2012.

Being connected to the internet can make vehicles vulnerable and software updates can sometimes introduce new problems.

The risks will only grow in the push toward automation, so having a software safety net will be important, tech analysts say.

  • “It’s going to be a necessity — especially when you’re downloading safety critical systems — that you don’t break the car,” says Michael Ramsey, mobility analyst at Gartner.

What to watch: New self-healing software from Israeli startup Aurora Labs — now undergoing tests by a half-dozen automakers — monitors a car’s systems to help ensure that doesn’t happen.

  • It uses artificial intelligence to self-diagnose coding problems and fix errors on-the-go.
  • The technology sifts line-by-line through the estimated 100 million lines of code in today’s cars to detect faults and predict problems before they occur.
  • If it finds a glitch, the software seamlessly rolls back to an earlier, safer version so related functions aren’t disabled while the problem is addressed. (Harman says it, too, has rollback capability.)
  • Self-healing software could be a viable way to protect against both malicious and accidental corruption of vehicle software, Navigant Research analyst Sam Abuelsamid says.

Yes, but: The approach requires additional onboard storage to preserve all of the previous changes, which could bog down performance. New electronic systems that automakers have in the works should address that, Abuelsamid notes.
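The rollback pattern described above can be sketched roughly as follows. This is a hypothetical illustration of the general idea (keep the last known-good image per subsystem and revert when a health check fails), not Aurora Labs' or Harman's actual implementation; all names here are invented.

```python
class EcuUpdater:
    """Toy model of OTA updates with automatic rollback for one ECU."""

    def __init__(self, current_version: str):
        self.current = current_version
        # Keeping the previous image around is the extra onboard
        # storage cost mentioned above.
        self.known_good = current_version

    def apply_update(self, new_version: str, health_check) -> str:
        """Install new_version; roll back to the known-good image if it fails."""
        self.current = new_version
        if health_check(new_version):
            self.known_good = new_version   # promote once it proves stable
        else:
            self.current = self.known_good  # self-heal: revert, keep function alive
        return self.current

updater = EcuUpdater("v1.0")
updater.apply_update("v1.1", lambda v: False)  # faulty update: reverts to v1.0
updater.apply_update("v1.2", lambda v: True)   # good update: v1.2 becomes known-good
```

The key design choice is that the rollback happens locally and immediately, so a bad safety-critical update degrades to the previous behavior rather than disabling the function while a fix is shipped.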

The bottom line: If machines are going to replace human drivers, they will need to be resilient like humans.


First AV EV Truck….

May 15 was a historic occasion. The first cab-less, electric truck, Einride's T-pod, drove on a public road. The world premiere and inaugural run took place at DB Schenker's facility in Jönköping, Sweden. The T-pod will transport goods between a warehouse and terminal at the facility, as part of a commercial flow.

Robert Falck, founder and CEO of Einride, Jochen Thewes, CEO of DB Schenker, and Mats Grundius, CEO of DB Schenker Cluster Sweden, Denmark, Iceland hosted the inauguration ceremony.

“This day represents a major milestone in Einride’s history, and for our movement to create a safe, efficient and sustainable transport solution based on autonomous, electric vehicles, which has the potential to reduce CO2 emissions from road freight transport by up to 90 percent. I can’t begin to describe how proud I am of our team that made this happen in collaboration with our great partner and customer DB Schenker,” says Robert Falck, CEO and founder of Einride.

“Autonomous trucks will become increasingly important for the logistics sector. Together with Einride, we have now introduced autonomous, fully electric trucks to a continuous flow on a public road – a milestone in the transition to the transport system of tomorrow,” said Jochen Thewes, CEO of DB Schenker.

In November 2018, the Swedish startup Einride and leading logistics firm DB Schenker initiated the first installation of an autonomous, all-electric truck or “T-pod” at a DB Schenker facility in Jönköping, Sweden. It was the first commercial installation of its kind in the world.

On March 7, the Swedish Transport Agency concluded that the T-pod can operate in accordance with Swedish traffic regulations. On March 11, the agency approved Einride’s application to expand the pilot to a public road. The permit applies to a public road within an industrial area, between a warehouse and a terminal, and is valid until December 31, 2020.

Einride and DB Schenker entered into a commercial agreement in April 2018 that includes the pilot in Jönköping and an option for additional pilots internationally.


#future of mobility #future mobility #glocalness

A startup’s road…

A startup’s road to self-driving future.

This is an article written by Azra Habibovic, Senior Researcher in automated vehicle systems at RISE Research Institutes of Sweden. The article was originally published in Swedish by RISE within its newsletter and blog on automated vehicles.

I recently attended an event where Chris Urmson and Sterling Anderson from the startup Aurora discussed automated vehicles. The event was organized by the MIT Club of Northern California and took place at the research center PARC, a Xerox company where many major innovations have been created. There were around 300 participants at the event.

Chris started and led Google’s self-driving project (now Waymo) before leaving it in 2016 to found Aurora together with Sterling, who until then had led Tesla’s development of active safety and Autopilot (as you may remember, he had a dispute with Tesla that was settled without going to court).

The discussion was very lively, much thanks to the moderator Mark Platshon, who started and financed several companies (including Tesla) and worked in many positions with various vehicle manufacturers.

Eventually, there will be a video recording of the discussion, but until then you will have to settle for my notes:

  • Why do you leave your dream job on Google / Tesla to start a new business? Both Chris and Sterling had very diplomatic answers to this question, but if you read between the lines it is about being able to do things in your own way. They had both learned what works and what does not, and saw their chance to do things right from the start. It includes the technology development itself, but also how to interact with others, including the authorities.
  • What makes their new company Aurora unique among many similar companies is that Aurora knows where they want to go (“we know where we are going” was repeated at least ten times in the evening!). The company’s vision can be summarized in three words: safely, quickly, broadly.
  • What also makes the company unique and that Chris and Sterling consider to be the guarantee of success, is its employees and mix of experiences. Many of Aurora’s employees are world-leading experts, and this is what attracts investors and partners.
  • Aurora wants to develop a “driver” that can be integrated into different vehicles. The company cooperates with several vehicle manufacturers and service providers to ensure that the developed driver can be integrated into various vehicles and services. The former is, however, the priority; without a functioning self-driving system in vehicles, it becomes difficult to have services based on self-driving vehicles. In both cases, the focus is on the development of communication platforms and interfaces. Aurora doesn’t want to be seen as a Tier 1 supplier.
  • According to Chris and Sterling, they want to let the vehicle manufacturers do what they are good at: building cars. They see no problem with vehicle manufacturers developing self-driving systems themselves; it is only positive, because then they realize how difficult it is and are impressed by Aurora’s system. I myself would not be so sure of it – we know that vehicle manufacturers are undergoing a transformation and actually have a lot of know-how.
  • There was a question about their view on the fact that AI is developing quickly and how they can be sure that the foundation they are putting now will “hold” in a couple of years. There is a big difference between integrating new functions into an existing software architecture, and doing so on a specially developed architecture. Aurora has chosen to develop its own architecture with the help of world-leading experts who set the norm in the area. As far as the hardware and related networks are concerned, it is an area that is under development, and right now it is unclear what works best.
  • Remote control of self-driving vehicles is not excluded, but then for very unique cases. Basically, a self-driving system must be able to cope with the traffic on its own. The same applies to wireless communication (V2X): it is good to have, but it should definitely not be a prerequisite for self-driving vehicles.
  • High resolution maps are needed. The theory that it’s hard to keep them updated is a bit exaggerated, because they are created using sensors used for automated driving. This means that maps can be continuously updated without extra cost.
  • Currently, a combination of different sensors is needed. Once the algorithms have improved, it is not unlikely that cameras alone will be enough, but that will take a while. Lidar components are not inherently expensive, so there is no reason lidar will remain costly when ordered in large quantities.
  • Safety was definitely the most discussed topic. A combination of field tests and simulations for specific operative domains is the way forward, combined with a well-thought-out and well-documented development process.
  • Having a constant dialogue with the authorities is crucial. It is about explaining to them how the system was developed, which standards and principles were followed, and how it was tested, so that they understand why the manufacturer believes in the system: a kind of mutual understanding. (I must say I was surprised at how often they used the words “explain” and “believe” in this context!) Aurora has a constant dialogue with the authorities. No third party is required to validate the safety; it is the manufacturer who has the best knowledge of the system. One can then wonder how objective this would be.
  • There will be no “driving license” for automated vehicles. The systems are too complex and cannot be generic. Instead, it is important to explain the system to authorities and the public in their own language.
  • Flying cars can become a reality in a distant future. But right now, there are many obstacles, not least regulatory ones, that make it less likely for such solutions to break through. In addition, an incredible number of flights are required for such a solution to be cost-effective.
  • In the beginning, many, especially vehicle manufacturers, saw Google’s work on self-driving vehicles as madness. According to Chris, this changed when Uber entered the game; then the vehicle manufacturers began to realize the seriousness of the whole thing. He also points out that the automotive industry is not homogeneous; even among the most conservative companies, there are those who are futuristic.

In the end, I want to share with you that I learned a new acronym (!) – ACES (Autonomous, Connected, Electric, Shared).

#future #mobility #blog #future mobility blog