Tag: vision systems
Singapore improves the AI it uses to detect smokers
IMF Says AI Will Upend Jobs and Boost Inequality. MIT CSAIL Says Not So Fast.
NIST warns of ‘snake oil’ security claims by AI makers
This Ant-Inspired AI Brain Helps Farm Robots Better Navigate Crops
Generative AI: Use Case Scenarios – MassTLC
Future of AI Careers: Top Trends and Job Prospects According to Experts | BitPinas
Can AI Minimize the Fusarium Mycotoxin Risks in Cereals
Best VR Headset 2023: Quest 2, PSVR 2 Or Pico 4?
Two Hitachi Group Companies to Merge to Expand Robotic SI Business in Japan and ASEAN Countries
Hitachi Astemo Develops Prototype 360-Degree Stereo Vision with Multi-Camera 3D Sensing
Most current automated driving systems are limited to highway driving. To be usable on general roads, they need to accurately recognize the entire road environment around the vehicle, including a complex mix of objects such as pedestrians and bicycles. Radar and LiDAR systems offer high ranging accuracy but still face cost barriers to widespread adoption, while all-surrounding camera systems, although superior in cost, are mainly based on monocular cameras that are still evolving to improve accuracy and resolve other issues.
Hitachi Astemo, together with the Research and Development Division of Hitachi, Ltd., leveraged their strengths in stereo camera technology to develop a prototype 360-degree stereo vision system using multi-camera 3D sensing, which enables distance measurement with stereo camera technology. Instead of the conventional module of two nearly parallel cameras with the same angle of view, the camera layout has been made more flexible: a combination of approximately ten cameras with different angles of view, including non-parallel pairs, provides stereoscopic 3D vision. By integrating multi-camera 3D sensing into a single in-vehicle camera system, the system achieves 360-degree stereo vision with a cost advantage, high accuracy, and high resolution.
By generating highly accurate distance information in stereo all around the vehicle, the system can detect, for example, the distance to a vehicle traveling in the adjacent lane, or a two-wheeled vehicle slipping through a line of cars from behind in a traffic jam. It can estimate relative speed and direction of movement, and apply this information to vehicle control to avoid collisions and entanglement at intersections. In addition to basic recognition of objects such as cars, motorcycles, pedestrians, and traffic lanes, the recognition function also covers turn signals, red lights, and brake lights to predict the behavior of other vehicles, as well as traffic signals, road signs, road edges, and free space available for driving, all of which must be identified when driving on ordinary roads.
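The stereo principle underlying such distance and relative-speed estimates can be sketched as follows. This is a minimal illustration only; the focal length, baseline, disparity values, and sampling interval below are hypothetical and are not figures from the Hitachi Astemo system.

```python
# Minimal sketch of stereo depth and relative-speed estimation.
# All numeric parameters are hypothetical illustration values,
# not specifications of the Hitachi Astemo prototype.

FOCAL_LENGTH_PX = 1200.0  # camera focal length in pixels (assumed)
BASELINE_M = 0.30         # separation between a camera pair (assumed)
SAMPLE_INTERVAL_S = 0.1   # time between two depth measurements (assumed)

def depth_from_disparity(disparity_px: float) -> float:
    """Classic stereo relation: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return FOCAL_LENGTH_PX * BASELINE_M / disparity_px

def relative_speed(disparity_prev: float, disparity_now: float) -> float:
    """Approach speed in m/s; positive means the object is closing in."""
    z_prev = depth_from_disparity(disparity_prev)
    z_now = depth_from_disparity(disparity_now)
    return (z_prev - z_now) / SAMPLE_INTERVAL_S

# A motorcycle whose disparity grows from 30 px to 32 px is getting
# closer: depth drops from 12.0 m to 11.25 m over the sample interval.
print(depth_from_disparity(30.0))      # 12.0 (metres)
print(relative_speed(30.0, 32.0))      # ~7.5 (m/s, closing)
```

In a production multi-camera system the disparity would come from dense stereo matching between rectified (or, for non-parallel pairs, epipolar-resampled) camera views; the arithmetic above only shows how depth and closing speed follow from it.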
Furthermore, Hitachi Astemo has improved the reliability and environmental resistance of camera sensing, with features to counter water droplets and dirt adhering to the lens surface, or snow shielding the entire lens. An AI trained on the malfunction patterns caused by these factors can identify which factor is affecting each camera, thereby preventing malfunctions.
Going forward, Hitachi Astemo will continue to strengthen its 360-degree stereo vision systems that combine cost advantages with high accuracy and resolution, as well as improved reliability and environmental resistance, with the aim to expand the scope of automated driving systems for use on general roads.
Hitachi Astemo is committed to strengthening its business and delivering technological innovation through a strategic business portfolio, which consists of the Powertrain &
Safety Systems business, Chassis business, Motorcycle business, Software business and Aftermarket business. Aiming for a better environment globally and growth around the pillars of "green," "digital," and "innovation," we will deliver highly efficient internal combustion engine systems; electric systems that reduce emissions; autonomous driving for improved safety and comfort; advanced driver assistance systems; and advanced chassis systems. Through such advanced mobility solutions, we will contribute to realizing a sustainable society and provide enhanced corporate value for our customers.
For more information, please visit the Hitachi Astemo website: www.hitachiastemo.com/en/.
Copyright 2022 JCN Newswire. All rights reserved. www.jcnnewswire.com
Computer vision using synthetic datasets with Amazon Rekognition Custom Labels and Dassault Systèmes 3DEXCITE
Videos beamed straight onto human retina via compact laser projector
The Fukui team created the device by integrating a laser module capable of outputting red, green, and blue lasers with a microelectromechanical systems (MEMS) mirror.
The direction in which the MEMS mirror reflects light from the laser module can be controlled electronically, making it possible to project high-quality 2D images through laser scanning over the projected area.
It can currently project colour video at a resolution of 1280×720, which, together with its small size, makes it a promising device for wearable displays.
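The pixel-to-mirror mapping in such a laser scanner can be sketched roughly as follows. This is a hypothetical model: the ±10-degree scan range and the simple linear mapping are illustrative assumptions, not figures from the Fukui device.

```python
# Hypothetical sketch: mapping a pixel in a 1280x720 frame to MEMS
# mirror deflection angles for laser raster scanning. The +/-10 degree
# mechanical half-angle is an illustrative assumption.

WIDTH, HEIGHT = 1280, 720
MAX_ANGLE_DEG = 10.0  # assumed mechanical half-angle of the mirror

def pixel_to_angles(x: int, y: int) -> tuple[float, float]:
    """Linearly map pixel (x, y) to horizontal/vertical mirror angles."""
    theta_x = (2 * x / (WIDTH - 1) - 1) * MAX_ANGLE_DEG
    theta_y = (2 * y / (HEIGHT - 1) - 1) * MAX_ANGLE_DEG
    return theta_x, theta_y

# Corner pixels map to the extremes of the scan range; the image
# centre maps to (approximately) zero deflection.
print(pixel_to_angles(0, 0))        # (-10.0, -10.0)
print(pixel_to_angles(1279, 719))   # (10.0, 10.0)
```

A real scanner sweeps these angles continuously (typically a fast resonant axis and a slow axis) and modulates the RGB laser intensities in sync with the mirror position, rather than addressing pixels one at a time.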
The researchers said that further tuning will be required before images can be safely projected directly onto the retina of the human eye.
One of the biggest challenges to getting the device working was combining the light beams from three independent laser sources to obtain an RGB output.
To achieve this, a device called a waveguide-type combiner was used, where each of the three waveguides receives light from each of the primary laser colours.
Akira Nakao, assistant professor and lead author of the study, said: “The outputs from the individual RGB lasers end up perfectly aligned thanks to the nature of the waveguide-type combiner.”
The researchers believe the applications for their device go beyond virtual and augmented reality for entertainment, and could enable better conferencing, surveillance, and even remote-assisted surgery.
“For now, our unit can be used in laser microscopes, sensors, projectors and HUD displays, particularly those for novel automobile systems with intelligent driving technology, which are all set to reshape our future,” Nakao said.
In October, Stanford University researchers developed a new architecture for OLED displays that could support resolutions up to 10,000 pixels per inch, which would be ideal for improved VR headsets.