Ch 5. No success without failure: Story of automated driving systems - AMORE STORIES - ENGLISH

Column written by a member of Amorepacific Group

Columnist: Cathy Zhang
APC Purchasing Team

1. Prologue: Fatal crash – automated driving system involved

 The company Tesla Motors is well-known these days for developing automated driving systems for its cars, which are typically regarded as the next step in automotive safety. In the summer of 2016, however, a fatal accident was reported involving a Tesla Model S and a truck. The driver of the car was killed as the vehicle passed under the trailer. The accident drew widespread attention as the first known death involving a Tesla vehicle with Autopilot enabled.
  • Tesla Model S involved in the fatal crash in the summer of 2016

 It was reported by the press at the time that weather conditions were favorable, implying that the accident was not caused by road environment factors such as slippery roads. In response, Tesla opened its own investigation and found that the Autopilot system had indeed played a contributing role. According to the company, both the driver and the Autopilot system failed to recognize the trailer crossing in front of the car. The Model S could not avoid the truck, and its windshield collided with the bottom of the trailer.

 I would like to share some information regarding the autopilot system involved in the accident, as well as automated driving systems in general and how they will change the future, but before looking at these points, let's first explore some of the background information.

 Despite being associated these days with autonomous cars, the term autopilot is actually derived from the automated system used in aircraft – hence, auto "pilot." Wikipedia defines an autopilot as a system used to control the trajectory of an aircraft without constant 'hands-on' control by a human operator being required. This differs somewhat from what the automated driving system does in an autonomous car, which is a vehicle capable of sensing its surrounding environment and navigating without any human input. In other words, there is no need for a human operator in an autonomous car, whereas an aircraft autopilot still requires human input.
  • A driver-assistance system allows the hand-off of control

 The Autopilot used in Tesla's vehicles is, for the time being, more of a driver-assistance system. It allows the car to automatically steer, change lanes and merge onto highways, letting drivers remove their hands from the steering wheel and relax a little while driving.

2. Pushover or stepping on thin ice: Reality of automated driving systems

 Even though the vehicles produced by Tesla are not fully automated, the company is still ahead in the race to realize this dream. What is not known by many, though, is that much of the power behind Tesla's success has been Mobileye, an Israeli technology company that develops hardware and software for advanced driver-assistance systems (ADAS). Its EyeQ, a complete System-on-a-Chip (SoC) for vision processing, and its ADAS have already been adopted by many leading automakers.
  • Mobileye's customers

 Mobileye is quite remarkable in what it offers automated vehicle makers: advanced computer vision and machine learning technology for vehicle and lane detection, collision avoidance and even traffic sign recognition. Mobileye was the main supplier for Tesla's first-generation Autopilot, but over time Tesla reduced its reliance on Mobileye and instead developed its own Autopilot system powered by Nvidia hardware.

 In fact, it was the accident mentioned earlier that led to the two companies parting ways. Tesla reported that the likely cause of the accident was that the Autopilot system mistook the white side of the truck for an overhead highway sign and, as such, failed to engage the emergency brakes. This highlights a few technical flaws in the Mobileye camera:

 1) The Mobileye camera relies on light to determine the proximity of surrounding objects. Unfortunately, this means the camera performs less effectively in the dark.

 2) The Mobileye camera only collects 2D data, so extra time is needed to reconstruct 3D information on top of the existing delay between image detection and the camera sending its data.

 3) The Mobileye camera still requires the driver's attention, as it is limited in what it is actually capable of doing.

 These fundamental flaws led Tesla to develop its own driver-assistance Autopilot system. With this background information in mind, let's now look at how the company created its own system, keeping in mind the levels of driving automation defined by the U.S. Department of Transportation National Highway Traffic Safety Administration (NHTSA) and the Society of Automotive Engineers (SAE), as seen in the following table.
 At present, Tesla's Autopilot comes in at level 2, partial automation, but it is also expected to achieve some of the functions specified in level 3 (conditional automation), and potentially even level 4 (high automation), in the not too distant future. For now, these levels are achieved through a combination of sensors: cameras, which are affordable but suffer from the light issue mentioned earlier; laser radar, valued for its penetrating power and 3D imaging but unfortunately expensive, not to mention large; and microwave radar, which removes the reliance on light but is still lacking in terms of accurate 3D imaging.
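 As a rough sketch, the SAE levels mentioned above form an ordered scale, which makes comparisons such as "is this system at least conditionally automated?" straightforward. The level names below follow the SAE definitions; the code itself is only an illustration, not part of any vendor's software:

```python
from enum import IntEnum

# SAE levels of driving automation, encoded as an ordered enumeration.
class SAELevel(IntEnum):
    NO_AUTOMATION = 0           # the human driver does everything
    DRIVER_ASSISTANCE = 1       # e.g. adaptive cruise control
    PARTIAL_AUTOMATION = 2      # steering + speed, driver must monitor
    CONDITIONAL_AUTOMATION = 3  # system monitors, driver on standby
    HIGH_AUTOMATION = 4         # no driver needed in defined conditions
    FULL_AUTOMATION = 5         # no driver needed anywhere

# Tesla's Autopilot, as described in the text, sits at level 2.
autopilot = SAELevel.PARTIAL_AUTOMATION
print(autopilot.name, int(autopilot))
print(autopilot >= SAELevel.CONDITIONAL_AUTOMATION)
```

 Because `IntEnum` values compare as integers, "level 2 is below conditional automation" falls directly out of the ordering.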
  • Complicated automated driving system

 As can be seen, an automated driving system is no simple affair. It contains a wide range of technologies that need to work in harmony for the vehicle to be able to drive itself, and its owner, home. To ensure that the implementation of these technologies goes smoothly, Tesla has adopted a staggered release approach using software updates. This allows all new features to be thoroughly tested both in-house and on the road before being applied to the system, while also integrating feedback from actual users of its vehicles.

 This gradual roll-out approach has given Tesla a clear head start in the race to create a completely autonomous vehicle. To put its strategy of working up through the levels of automation gradually into perspective, consider the approaches taken by competitors Google and Apple, which have elected instead to aim for a fully autonomous vehicle right from the start. Other differences can also be noted in the data collection process; in particular, there are huge differences in the quantity of data derived from actual road driving. It was reported in May 2016 that Tesla had collected 780 million miles of driving data and was adding another million every ten hours. In comparison, Google captured only 2 million miles of road data over the roughly four years after beginning its fully self-driven car project. Tesla clearly has the advantage in terms of raw data to be used as it develops and rolls out updates to its automated driving system.
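 To make the gap concrete, the reported figures above work out to a striking back-of-the-envelope comparison (the four-year span for Google is an approximation taken from the text):

```python
# Fleet-data collection rates from the figures reported in the text.
tesla_total_miles = 780_000_000        # reported as of May 2016
tesla_miles_per_hour = 1_000_000 / 10  # one million miles every ten hours
google_total_miles = 2_000_000         # over roughly four years
google_miles_per_hour = google_total_miles / (4 * 365 * 24)

tesla_daily = tesla_miles_per_hour * 24
google_daily = google_miles_per_hour * 24
print(f"Tesla:  ~{tesla_daily:,.0f} miles/day")
print(f"Google: ~{google_daily:,.0f} miles/day")
print(f"Ratio:  ~{tesla_daily / google_daily:,.0f}x")
```

 At these rates, Tesla's fleet gathers in a single day more than Google's project gathered in a typical year.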

3. Advancement of core components led by new innovative companies

 One important question that is probably in the minds of all readers by this point is just how safe the technology actually is. As stated at the beginning, there has already been one fatal accident, and other minor ones have also been reported. To ensure the safety and full autonomy of a self-driving car, the vehicle must have an adequate operating system, able to function as the brain of the car, and hardware for real-time localization and obstacle detection. The problem, though, is that the current state of hardware technology does not meet the expectations of the market.

 As stated earlier, there are issues with the current technology. Cameras can collect and process certain types of data but struggle in poor light, while millimeter-wave radar, though generally affordable, is not very precise. The noticeable issue here is that the autopilot system has to rely on the data gathered by these technologies.

 For some time, it seemed that the restrictions imposed by the technology would never be overcome. However, more and more promising startups are coming up with innovations to break through this barrier. One such technology is LiDAR, key components of which are in production by startups Quanergy and Velodyne. In fact, this year Quanergy received the CES (Consumer Electronics Show) 2017 'Best of Innovation' award in the Vehicle Intelligence category.

 LiDAR is a laser radar boasting a high level of penetration. Like other radars, it determines the distance to objects and their position; the main difference is the wavelength at which LiDAR operates. By sending pulses at short wavelengths, in the hundreds to thousands of nanometers, LiDAR achieves much greater precision. One major hindrance to the wider implementation of this technology is its price: the cost of the laser itself can come close to that of an actual car, which is quite restrictive in terms of widespread application. It also tends to be less durable. There is optimism regarding this technology, but more time and effort are still needed before it can be commercially applied. To be effectively put into mass production, the cost of the laser needs to be reduced to roughly USD 100. No easy task! However, innovations have already been made in a new technology referred to as solid-state LiDAR.
  • Understanding how laser radar works

  • Challenges facing the commercialization of laser radar technology
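 The basic ranging principle behind laser radar is time of flight: a pulse travels out, reflects off a target, and returns, so the distance is half the round-trip time multiplied by the speed of light. A minimal sketch (the 0.4-microsecond round trip is just an illustrative number):

```python
C = 299_792_458.0  # speed of light in m/s

def tof_distance_m(round_trip_s: float) -> float:
    """Distance to a target from a round-trip (out-and-back) pulse time."""
    return C * round_trip_s / 2.0

# A pulse returning after 0.4 microseconds corresponds to a target
# roughly 60 metres away.
print(round(tof_distance_m(0.4e-6), 1))  # → 60.0
```

 The sub-microsecond timescales involved are part of why precise, automotive-grade LiDAR hardware is expensive to build.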

 Quanergy, as mentioned earlier, released its first solid-state LiDAR in 2016, which at the time cost only about USD 250. This technology uses an optical phased array as a transmitter, shifting the phase of the laser pulse as it is projected. In other words: fewer moving parts and, therefore, lower cost.
  • Quanergy's second-generation solid-state LiDAR and its principle
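 The phased-array idea can be sketched with the standard formula for an idealized uniform linear array: applying a constant phase step between adjacent emitters tilts the emitted wavefront, steering the beam with no moving parts. The numbers below are illustrative only, not Quanergy's actual design:

```python
import math

# Idealized beam steering for a uniform linear optical phased array:
# sin(theta) = wavelength * phase_step / (2 * pi * emitter_pitch)
def steering_angle_deg(wavelength_m: float, pitch_m: float,
                       phase_step_rad: float) -> float:
    return math.degrees(
        math.asin(wavelength_m * phase_step_rad / (2 * math.pi * pitch_m)))

# Illustrative numbers: a 905 nm laser, 2 micrometre emitter pitch,
# quarter-cycle phase step between adjacent emitters.
print(round(steering_angle_deg(905e-9, 2e-6, math.pi / 2), 1))
```

 Sweeping the phase step electronically sweeps the beam across the scene, which is what replaces the spinning assembly of a mechanical LiDAR.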

 On top of these exciting developments in laser radar technology, industry analysts are saying that there will be a turning point in laser radar production and cost reduction within the next two years as reliability and mass production improve. Companies including TriLumina and Princeton Lightwave in the US, Germany-based Ibeo and Osram, Israel's Innoviz, Dutch company Innoluce and Canada's LeddarTech are already presenting advanced LiDAR products priced at about USD 100. Let's also not forget China: besides Baidu actively pushing into the market, companies like LeiShen Intelligent, Slamtec, Great Star and Han's Laser come hot on their heels. Considering how low labor costs are in China, it is not impossible that Chinese companies may someday outrun the front runners in the industry.

4. Conclusion

 Researchers acknowledge that autonomous vehicles have the potential to be much safer than their human counterparts: with approximately 1.2 million traffic fatalities every year, even a 1% improvement in driver safety would mean 12,000 lives saved. Self-driving car technology continues to move forward despite some problems in what are still early days. As the technology improves, we can all look forward to safer driving in the not-so-distant future.
