Le Lézard
Classified in: Science and technology
Subjects: Photo/Multimedia, Product/Service, Trade Show

AEye Announces World's First Commercially Available Perception Software Designed to Run Inside the Sensors of Autonomous Vehicles


AutoMobility LA - Today, artificial perception pioneer AEye announced the world's first commercially available 2D/3D perception system designed to run in the sensors of autonomous vehicles. For the first time, basic perception can be distributed to the edge of the sensor network. This allows autonomous system designers to use sensors not only to search for and detect objects, but also to acquire, and ultimately to classify and track, those objects. The ability to collect this information in real time both enables and enhances existing centralized perception software platforms by reducing latency, lowering costs and supporting functional safety.

This in-sensor perception system is intended to accelerate the availability of autonomous features in vehicles across all SAE levels of human engagement, allowing automakers to enable the right amount of autonomy for any desired use case - including the most challenging edge cases - in essence, providing autonomy "on demand" for ADAS, mobility and adjacent markets.

AEye's achievement is the result of its flexible iDAR™ platform, which enables intelligent and adaptive sensing. The iDAR platform is based on biomimicry (see white paper), and replicates the elegant perception design of human vision through a combination of agile LiDAR, fused camera and artificial intelligence. It is the first system to take a fused approach to perception - leveraging iDAR's unique Dynamic Vixels, which combine 2D camera data (pixels) with 3D LiDAR data (voxels) inside the sensor. This unique software-definable perception platform allows disparate sensor modalities to complement each other, enabling the camera and LiDAR to work together to make each sensor more powerful, while providing "informed redundancy" that ensures a functionally safe system.
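AEye has not published the internal layout of a Dynamic Vixel, but the underlying idea - pairing each 3D LiDAR return (voxel) with the 2D camera pixel it projects onto - can be sketched as a simple fused record. All names and the toy projection below are illustrative assumptions, not AEye's actual data format or API:

```python
from dataclasses import dataclass

@dataclass
class Vixel:
    """Illustrative fused record: one LiDAR return plus the camera pixel it projects to."""
    x: float          # 3D position from LiDAR (the voxel part)
    y: float
    z: float
    intensity: float  # LiDAR return intensity
    r: int            # colour from the co-registered camera pixel (the pixel part)
    g: int
    b: int

def fuse(lidar_point, camera_image, project):
    """Pair a LiDAR return with its camera pixel via a projection function.

    `project` maps a 3D point to image (u, v) coordinates; in a real system
    it would come from camera/LiDAR extrinsic and intrinsic calibration.
    """
    x, y, z, intensity = lidar_point
    u, v = project(x, y, z)
    r, g, b = camera_image[v][u]
    return Vixel(x, y, z, intensity, r, g, b)

# Toy example: a 2x2 "image" and a fixed stand-in projection
image = [[(255, 0, 0), (0, 255, 0)],
         [(0, 0, 255), (255, 255, 255)]]
vix = fuse((1.0, 0.5, 10.0, 0.8), image, lambda x, y, z: (1, 0))
print(vix.r, vix.g, vix.b)  # -> 0 255 0
```

The point of the fused record is that downstream logic sees position, intensity and colour as one observation, rather than correlating separate camera and LiDAR streams after the fact.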

AEye's approach solves one of the most difficult challenges for the autonomous industry as it seeks to deliver perception at speed and at range: improving the reliability of detection and classification, while extending the range at which objects can be detected, classified and tracked. The sooner an object can be classified and its trajectory accurately forecasted, the more time the vehicle has to brake, steer or accelerate in order to avoid collisions. See AEye's white paper on Range, Resolution and Rate.
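The value of extended classification range can be made concrete with simple arithmetic (the figures below are illustrative, not AEye's published specifications): at a closing speed v, an object classified at range d leaves roughly d / v seconds to brake or steer.

```python
def time_to_react(range_m: float, closing_speed_mps: float) -> float:
    """Seconds available before reaching an object first classified at range_m."""
    return range_m / closing_speed_mps

# At roughly highway speed (~33 m/s, i.e. ~120 km/h):
print(round(time_to_react(100, 33.0), 2))  # -> 3.03
print(round(time_to_react(200, 33.0), 2))  # -> 6.06
```

Doubling the classification range doubles the reaction budget, which is why the press release treats range, resolution and rate as a single coupled problem.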

Enabling Autonomy On-demand

First generation robotic vision systems tried to solve the challenges of fully autonomous driving by capturing as much data as possible. This required both time and power to process. Second generation systems are designed to intelligently collect, manage and transform data into actionable information.

The unique intelligent capabilities of the iDAR platform allow for applications ranging from ADAS safety augmentation, such as collision avoidance, to selective autonomy (highway lane change), to fully autonomous use cases in closed-loop geo-fenced or open-loop scenarios.

Engineers can now experiment using software-definable sensors without waiting years for the next generation of hardware. They can adapt shot patterns in less than a second and simulate the impact to find optimal performance. They can also customize features or power usage through modular design - for instance, using a smaller laser and no camera to create a specialized ADAS system for under $1,000, or mixing and matching short- and long-range LiDAR with camera and radar for more advanced 360-degree systems for under $15,000. Unlike with the industry's previous generations of sensors, OEMs and Tier 1s can now also move algorithms into the sensors when it is appropriate.
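AEye's shot-pattern interface is not public, but what "software-definable" scanning means can be sketched in a few lines: instead of a fixed raster, the scan is generated from a configuration, here densifying shots inside a region of interest. The function and its parameters are assumptions for illustration only:

```python
def shot_pattern(width, height, base_step, roi=None, roi_step=1):
    """Generate (row, col) shot coordinates: sparse everywhere, dense inside the ROI.

    roi is (row0, col0, row1, col1), half-open. All names here are
    illustrative; this is not AEye's actual interface.
    """
    shots = set()
    # Coarse background scan over the whole field of view
    for r in range(0, height, base_step):
        for c in range(0, width, base_step):
            shots.add((r, c))
    # Dense revisit inside the region of interest
    if roi:
        r0, c0, r1, c1 = roi
        for r in range(r0, r1, roi_step):
            for c in range(c0, c1, roi_step):
                shots.add((r, c))
    return sorted(shots)

sparse = shot_pattern(8, 8, base_step=4)
dense = shot_pattern(8, 8, base_step=4, roi=(0, 0, 4, 4))
print(len(sparse), len(dense))  # -> 4 19
```

Because the pattern is just data, it can be regenerated between frames - for example, tightening the ROI around a newly detected object - which is the kind of sub-second adaptation the paragraph above describes.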

"We believe the power and intelligence of the iDAR platform transforms how companies can create and evolve business models around autonomy without having to wait for the creation of full Level 5 robotaxis," said Blair LaCorte, president of AEye. "Automakers are now seeing autonomy as a continuum, and have identified the opportunity to leverage technology across this continuum. As the assets get smarter, OEMs can decide when to upgrade and leverage this intelligence. Technology companies that provide software-definable and modular hardware platforms can now support this automotive industry trend."

iDAR's 2D/3D Perception System

AEye's system more quickly and accurately searches, detects and segments objects and, as it acquires specific objects, validates that classification with velocity and orientation information. This enables the system to forecast the object's behavior, including inferring intent. By providing the smarts to capture better information faster, the system enables more accurate, timely, reliable perception, using far less power than traditional perception solutions.
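The staged flow described above - search, detect, acquire, then classify and track, validated with velocity information - can be sketched as a per-object state machine. The stages are taken from the text; the promotion rule and the finite-difference velocity estimate are simplifying assumptions (a real system would use confidence thresholds and filtering):

```python
from enum import Enum, auto

class Stage(Enum):
    SEARCH = auto()
    DETECT = auto()
    ACQUIRE = auto()
    CLASSIFY = auto()
    TRACK = auto()

class TrackedObject:
    """Illustrative per-object state machine for a staged perception pipeline."""
    def __init__(self):
        self.stage = Stage.SEARCH
        self.positions = []  # (t, x, y) observations

    def observe(self, t, x, y):
        self.positions.append((t, x, y))
        # Promote one stage per confirmed observation (real systems gate on confidence)
        order = list(Stage)
        if self.stage != Stage.TRACK:
            self.stage = order[order.index(self.stage) + 1]

    def velocity(self):
        """Finite-difference velocity from the last two observations."""
        if len(self.positions) < 2:
            return None
        (t0, x0, y0), (t1, x1, y1) = self.positions[-2:]
        dt = t1 - t0
        return ((x1 - x0) / dt, (y1 - y0) / dt)

obj = TrackedObject()
for t, x, y in [(0.0, 0.0, 0.0), (1.0, 10.0, 0.0), (2.0, 20.0, 0.0), (3.0, 30.0, 0.0)]:
    obj.observe(t, x, y)
print(obj.stage.name, obj.velocity())  # -> TRACK (10.0, 0.0)
```

Once an object carries a velocity and heading, its near-term trajectory can be extrapolated - which is the "inferring intent" step the paragraph refers to.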

This 2D/3D perception system is based on AEye's iDAR platform, whose perception advancements the company will make broadly available via a software reference library. That library includes features that will be resident in AEye's AE110 (Mobility) and AE200 (ADAS) sensors.

"Combining sensor modes that can measure speed, distance, orientation and semantic understanding in real-time enables a more robust virtual driver system," said Sam Abuelsamid, principal analyst at Navigant Research. "Creating a perception system at the sensor level can potentially deliver more depth, nuance and critical information for improved prediction to feed into path planning systems than is possible with a 2D image-based system, which will be a boon to ADAS and autonomous vehicle initiatives."

AEye's iDAR software reference library will be available in Q1 2020, and will be demonstrated this January at CES. To schedule a demo at CES, contact ces@aeye.ai.

About AEye

AEye is an artificial perception pioneer and creator of iDAR™, a perception system that acts as the eyes and visual cortex of autonomous vehicles. Since demonstrating its solid-state LiDAR scanner in 2013, AEye has pioneered breakthroughs in intelligent sensing. The company is based in the San Francisco Bay Area, and backed by world-renowned investors including Kleiner Perkins Caufield & Byers, Taiwania Capital, Hella Ventures, LG Electronics, Subaru-SBI, Aisin, Intel Capital, Airbus Ventures, and others. For more information, please visit www.aeye.ai.




News published on 19 November 2019 at 19:05 and distributed by: