Sensors replace human vision in autonomous cars, and the
tech is rapidly evolving as data informs R&D teams the world over. But what
are the standards?
As vehicles become more autonomous, the amount of data
needed to ensure passenger safety has steadily increased. While early debates
focused on the number and type of sensors required, attention has now shifted
towards how data is processed, stored and leveraged to achieve higher levels of
autonomy.
“Autonomous driving is fundamentally a data-driven
development process,” says Oussama Ben Moussa, Global Automotive Industry
Architect at an international IT and consulting group. “Mastery of data —
both physical and synthetic — will determine the pace of innovation and
competitiveness in the industry.”
Sensors reach maturity for AVs
A new autonomous taxi van from a major German automotive manufacturer integrates 27 sensing devices into its advanced driver-assistance systems (ADAS). It has been tested to Level 4, which means the vehicle is capable of operating without human intervention within designated areas.
The ADAS requires precise information about what's happening
inside and outside the vehicle. While an array of technology combines to sense
the natural environment and detect objects around a vehicle, applications
inside the car monitor driver behaviour and machine diagnostics.
“Sensors have reached the required maturity to be able to
support most automated driving scenarios, and they are also two to three orders
of magnitude better than a human driver,” says Nir Goren, Chief Innovation
Officer at an Israel-based developer of light detection and ranging
(LiDAR) technologies and perception software. “We have the sensor technology,
the range, the resolution and the multi-modalities. It’s not only that sensors
are scanning and updating all sides of the vehicle all of the time – which a
human driver cannot do – but they also have superhuman vision way beyond what
we can see with our eyes.”
The optimum combination of sensors
The market for autonomous driving passenger cars is estimated to generate USD 400 billion within a decade, according to a 2023 report by McKinsey. The market for autonomous driving sensors is expected to skyrocket accordingly, from USD 11.8 billion in 2023 to over USD 40 billion by 2030, with some predictions estimating that 95% of all cars on the road will be connected.
The exact mix of sensors varies by car maker. One manufacturer, for example, has concentrated development on “vision-only” information culled from an array of eight cameras spanning the car’s entire field of view, augmented by artificial intelligence (AI).
“Sensors are a strategic choice for original equipment
manufacturers (OEMs), impacting both features and safety,” says Ben Moussa. “One well-known
autonomous vehicle (AV) manufacturer relies on cameras only, while others
insist on active LiDAR sensors – which work by targeting an object or a surface
with a laser and measuring the time for the reflected light to return to the
receiver – to handle cases such as foggy nights or poorly marked roads.”
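As a back-of-envelope illustration of the time-of-flight principle described here, the short Python sketch below converts a measured round-trip time into a range estimate; the numbers are illustrative rather than the specification of any particular sensor.

```python
# Minimal time-of-flight range calculation for an active sensor such as LiDAR.
# The pulse travels to the target and back, so the one-way distance is half
# the round-trip time multiplied by the speed of light.

SPEED_OF_LIGHT_M_S = 299_792_458  # metres per second

def tof_distance_m(round_trip_time_s: float) -> float:
    """Estimated target distance for a measured round-trip time."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2

# A pulse returning after ~1.33 microseconds corresponds to a target roughly
# 200 m ahead -- the debris-detection range discussed below.
print(f"{tof_distance_m(1.334e-6):.1f} m")  # ≈ 200.0 m
```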
A key test case is being able to identify debris, such as a
tyre, on the road ahead. “Even during daylight, this is hard to spot from 200
metres away in order to take action (brake or change lanes),” says Goren. “On a
dark road, it is beyond the capabilities of human vision and computer vision,
but accurate information is clearly necessary for safe driving. This is why
many experts are of the view that AVs require LiDAR sensors as well as
cameras.”
Other types include ultrasonic sensors, which emit high-frequency sound waves that bounce off an object and return to the sensor, allowing it to calculate the distance between sensor and object. Since ultrasonic sensors work best at close range, they tend to be complemented by sensors that are better at detecting objects at a distance, such as LiDAR, and at measuring their velocity, which is what radars do best.
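The same echo principle applies to ultrasonic ranging, only with sound instead of light. A rough calculation, assuming sound travels at about 343 m/s in air, shows why these sensors are reserved for close-range tasks such as parking: an echo from a distant object simply takes too long to return and attenuates heavily on the way.

```python
# Ultrasonic ranging sketch: same round-trip calculation as LiDAR, but using
# the speed of sound. Figures are approximate (air at about 20 °C).

SPEED_OF_SOUND_M_S = 343.0

def ultrasonic_distance_m(round_trip_time_s: float) -> float:
    return SPEED_OF_SOUND_M_S * round_trip_time_s / 2

def echo_time_ms(distance_m: float) -> float:
    return 2 * distance_m / SPEED_OF_SOUND_M_S * 1000

print(f"{ultrasonic_distance_m(0.012):.2f} m")  # a 12 ms echo ≈ 2.06 m away
print(f"{echo_time_ms(50):.0f} ms")             # a 50 m target would take ~292 ms
```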
In addition, inertial measurement units, which combine gyroscopes and accelerometers, support the overall navigation system. Infrared cameras inside the car record images of the driver’s eyes, and this is blended with real-time data about road conditions to detect whether the driver is paying attention at potentially hazardous moments.
“In one semi-autonomous architecture I’ve worked on, there
are 12 cameras (front, corners, rear, mirrors, cockpit for driver monitoring
and sometimes thermal cameras), plus more than four radars, one LiDAR and at
least eight ultrasonic sensors. Altogether, the minimum number of sensing
devices is around 24,” says Ben Moussa.
The five levels of autonomy
Autonomous driving levels are defined by the Society of Automotive Engineers (SAE). Level 1 covers assistive driving systems such as adaptive cruise control. Level 2 is where ADAS kicks in: the vehicle can control steering and acceleration/deceleration, for example automatically keeping itself in lane, but the driver remains in charge.
“There’s a huge gap between Level 2 and Level 3,” says
Goren. “Level 3 is ‘hands off, eyes off’, which means that you can push a
button and the car drives, leaving you free to read the newspaper. If anything
goes wrong, then it's the responsibility of the car.”
Level 4 applies to passenger vehicles but today is
commercialized only in robotaxis and robo-trucks, where
the car is capable of full automation, and some vehicles no longer
have a steering wheel. Level 4 restricts operation to designated geofenced
zones, whereas Level 5 vehicles will theoretically be able to travel anywhere
with no human driver required.
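As a compact summary of the levels described in this section, the lookup table below paraphrases the article’s descriptions rather than the formal SAE J3016 wording.

```python
# Driving-automation levels as characterized in this article (paraphrased,
# not the formal SAE J3016 definitions).
SAE_LEVELS = {
    1: "Assistive driving systems such as adaptive cruise control",
    2: "ADAS: steering plus acceleration/braking support; driver remains in charge",
    3: "'Hands off, eyes off' within system limits; responsibility shifts to the car",
    4: "Full automation within designated geofenced zones (robotaxis, robo-trucks)",
    5: "Full automation anywhere, with no human driver required",
}

for level, description in SAE_LEVELS.items():
    print(f"Level {level}: {description}")
```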
Data generation and management
AVs generate vast amounts of data based on the number of
sensors and the level of autonomy. Goren calculates that a single
high-definition camera generates hundreds of megabytes of data per second,
while a single LiDAR sensor generates one gigabyte (GB) of data per second.
In day-to-day operations, however, vehicles can store only a
fraction of this potential data. For every five hours of driving, only around 30
seconds can be stored because of the cost of storage and the delay in routing
data from the car to the cloud and back again. Vast
amounts of data are, however, collected during the engineering and development
phase.
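A back-of-envelope calculation using the per-sensor rates Goren cites makes the storage problem concrete. The sensor counts below are illustrative assumptions, not the specification of any particular vehicle.

```python
# Rough data volumes for a production vehicle, using the figures quoted above:
# "hundreds of megabytes per second" per HD camera and ~1 GB/s per LiDAR.
# Sensor counts are assumed for illustration only.

CAMERA_RATE_GB_S = 0.3
LIDAR_RATE_GB_S = 1.0
NUM_CAMERAS, NUM_LIDARS = 8, 1

total_rate_gb_s = NUM_CAMERAS * CAMERA_RATE_GB_S + NUM_LIDARS * LIDAR_RATE_GB_S

generated_tb = total_rate_gb_s * 5 * 3600 / 1000   # five hours of driving
stored_gb = total_rate_gb_s * 30                    # only ~30 seconds retained

print(f"Generated in 5 h: ~{generated_tb:.0f} TB")  # ~61 TB at these rates
print(f"Actually stored:  ~{stored_gb:.0f} GB")     # ~102 GB
```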
Ben Moussa explains, “During R&D, OEMs run fleets across
many countries with different geographies and conditions to collect diverse
data. This data, estimated at up to 22 terabytes (TB) per vehicle per day, is used to build universal software that will operate across the fleet
when vehicles are in service. In the engineering phase, we are storing most of
the data because we need to capture all of the specificities about road,
weather conditions and so on.”
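To give a sense of scale for the engineering phase, the sketch below multiplies the 22 TB per vehicle per day figure by a hypothetical fleet size and campaign length (both assumed, not quoted in the article).

```python
# Scale of R&D data collection based on ~22 TB per vehicle per day (quoted
# above). Fleet size and campaign length are hypothetical assumptions.

TB_PER_VEHICLE_PER_DAY = 22
FLEET_SIZE = 200        # assumed number of test vehicles
CAMPAIGN_DAYS = 365     # assumed one-year collection campaign

total_pb = TB_PER_VEHICLE_PER_DAY * FLEET_SIZE * CAMPAIGN_DAYS / 1000
print(f"~{total_pb:,.0f} PB collected over the campaign")  # ≈ 1,606 PB
```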
For some projects, OEMs operate hundreds of cars driving in more than 50 countries, covering millions of kilometres, to collect data for use
in autonomous driving development. In daily operations, powerful chipsets
running AI algorithms enable data to be processed onboard the vehicles (at the
network edge) with response times in milliseconds. This includes the
aggregation and analysis of raw data from multiple sensors (a process known
as sensor fusion) to obtain a detailed and probabilistic understanding of
the surrounding environment and automate response in real time.
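The sensor fusion step mentioned above can be illustrated with a deliberately simplified sketch: two independent, noisy range estimates of the same object (say, from a camera and a radar) are combined by weighting each with the inverse of its variance. Production perception stacks use far more sophisticated probabilistic filters over many sensors; this only shows the principle.

```python
# Minimal probabilistic sensor fusion: inverse-variance weighting of two
# independent range measurements. Real AV stacks use full Bayesian filters
# (e.g. Kalman filters); this sketch only illustrates the idea.

def fuse(estimate_a: float, var_a: float, estimate_b: float, var_b: float):
    """Return the fused estimate and its variance."""
    w_a, w_b = 1 / var_a, 1 / var_b
    fused = (w_a * estimate_a + w_b * estimate_b) / (w_a + w_b)
    return fused, 1 / (w_a + w_b)

# A camera places an object at 48 m with high uncertainty; a radar says 50 m
# with low uncertainty. The fused estimate sits much closer to the radar.
distance, variance = fuse(48.0, 4.0, 50.0, 0.25)
print(f"fused: {distance:.2f} m (variance {variance:.3f})")  # 49.88 m
```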
Selected data is uploaded to the OEM’s cloud during EV charging or over a Wi-Fi connection. These uploads tend to be triggered by anomalies (e.g. an animal crossing the road) and are used to train, refine and update the OEM’s universal platform.
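In very simplified form, this selective-upload behaviour amounts to a policy that queues anomaly clips and flushes them only when the vehicle is charging or on Wi-Fi. The class and event names below are hypothetical placeholders, not any OEM’s API.

```python
# Highly simplified sketch of anomaly-triggered data upload. Names and the
# trigger logic are hypothetical placeholders.

from dataclasses import dataclass, field

@dataclass
class UploadPolicy:
    queued_clips: list = field(default_factory=list)

    def on_drive_event(self, clip_id: str, is_anomaly: bool) -> None:
        # Only anomalous clips (e.g. an animal crossing the road) are kept.
        if is_anomaly:
            self.queued_clips.append(clip_id)

    def on_connectivity(self, charging: bool, wifi: bool) -> list:
        # Uploads happen only during EV charging or over a Wi-Fi connection.
        if charging or wifi:
            uploaded, self.queued_clips = self.queued_clips, []
            return uploaded
        return []

policy = UploadPolicy()
policy.on_drive_event("clip-001", is_anomaly=False)
policy.on_drive_event("clip-002", is_anomaly=True)
print(policy.on_connectivity(charging=True, wifi=False))  # ['clip-002']
```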
In order for autonomous driving to scale, a key challenge is to decrease the dependency on physical, real-world data. Development is focusing on distributed or hybrid data sets that incorporate virtual information.
“Hybrid means a mix between physical data gathered from
sensors in the real environment plus virtual or synthetic data from digital
twins,” explains Ben Moussa. “For example, we are building digital twins of
cities based on a simulation platform in which we drive virtual cars
and collect synthetic data from sensors as if we were driving in the real
world. This will accelerate autonomous driving development.”
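The hybrid-data idea can be made concrete with a toy example: a virtual vehicle is driven through a simulated scene while a synthetic “sensor” logs noisy ranges to the objects ahead of it. The scene, path and noise model below are illustrative only; real digital twins model entire cities, weather and sensor physics.

```python
# Toy generation of synthetic sensor data from a simulated scene. Everything
# here (scene layout, vehicle path, noise model) is illustrative.

import math
import random

SCENE_OBJECTS = [(30.0, 2.0), (80.0, -3.5), (150.0, 1.0)]  # (x, y) in metres

def synthetic_scan(vehicle_x: float, noise_std: float = 0.05) -> list:
    """Noisy ranges from the vehicle to every object still ahead of it."""
    ranges = []
    for obj_x, obj_y in SCENE_OBJECTS:
        if obj_x > vehicle_x:
            true_range = math.hypot(obj_x - vehicle_x, obj_y)
            ranges.append(round(true_range + random.gauss(0.0, noise_std), 2))
    return ranges

# Drive the virtual car forward in 10 m steps and collect a synthetic dataset.
dataset = {x: synthetic_scan(float(x)) for x in range(0, 100, 10)}
print(dataset[0])  # e.g. [30.08, 80.05, 150.02]
```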
The value of standards
Automated vehicles require the highest levels of safety and
failsafe testing, and these objectives lie at the core of the international standards developed and published by the technical committees of the IEC. IEC TC 47 is the committee developing international
standards for semiconductor devices. Among dozens of its publications, it is
working on the first edition of IEC 63551-6, which addresses chip-scale
testing of semiconductor devices used in AVs.
When it comes to the safety of cameras for AVs, IEC TC 100 publishes several documents which can prove useful. One of its publications is IEC 63033-1, which specifies a model for generating the surrounding visual image of the drive monitoring system: a composite 360° image created from external cameras. This enables the correct positioning of a vehicle in relation to its surroundings, using input from a rear-view monitor for parking assistance as well as blind-corner and bird’s-eye monitors.
The recently published IEC 60730-2-23 outlines the
particular requirements for electrical sensors and electronic sensing elements.
As the IEC points out, this is intended to help manufacturers ensure that sensors perform safely, reliably and accurately under normal and abnormal conditions and that any embedded electronics deliver a dependable output signal. Conditioning circuits that are inseparable from the control, and on which the sensing element relies to perform its function, are evaluated under the requirements of the relevant Part 2 standard and/or IEC 60730-1.
These standards are published by IEC TC 72, the IEC
technical committee responsible for automatic electrical controls. Its work
supports global harmonization and enhances the safety and performance of
devices used in everyday life.
The joint IEC and ISO committee on the Internet of Things
(IoT) and digital twin, ISO/IEC JTC 1/SC 41, sets standards ensuring the
safety, reliability and compatibility of connected devices across various
applications. Another subcommittee of JTC 1, SC 38, prepares standards for
cloud computing, including distributed cloud systems or edge computing.
Conformity assessment (CA) is also key for industry
stakeholders to be able to trust that the parts used to make AVs follow the
appropriate standards. The IEC Quality Assessment System, IECQ, offers an approved components certification scheme, applicable to various electronic components, including sensors that adhere to technical standards or client specifications accepted within the IECQ System.
As the industry continues to grow, standards and CA are
increasingly indispensable for it to mature safely and efficiently.