Sony introduces 360° “Safety Cocoon” concept for automotive image sensors at CES

In addition to showcasing its new lineup of consumer products such as 4K OLED televisions, smartphones and wireless noise-cancelling stereo headsets, Sony used CES to introduce its “safety cocoon” concept: an area of enhanced safety around a vehicle, within which the vehicle can monitor and detect its 360-degree surroundings and prepare for danger avoidance from an early stage in a variety of driving situations.

By evolving cars’ “eyes” through image sensor-based viewing and sensing, Sony aims to contribute to enabling a higher level of safety and to advance the realization of autonomous driving.

Kaz Hirai, Sony Corporation Chief Executive Officer & President, discusses the role of Sony’s automotive image sensors at CES 2018.

Sony announced its intention to commercialize image sensors for automotive use in 2014. In October of that year, the company announced the commercialization of the IMX224MQV CMOS image sensor for automotive cameras. That sensor is capable of capturing high-resolution color images even in light conditions as low as 0.005 lux—darker than a star-filled night sky. Mass production and shipment commenced in May 2016. This sensor allows the quick and effective detection of obstacles and people in parking lots, on dark nights, and in other situations that would challenge the naked eye.

In 2015, Sony acquired Belgium-based Softkinetic Systems S.A. (now known as Sony Depthsensing Solutions Holding SA/NV), which possesses time-of-flight (ToF) image sensor technology for distance detection.

With ToF technology, the distance to an object is measured by the time it takes for light from a light source to reach the object and reflect back to the sensor. ToF image sensors detect distance information for every pixel, resulting in highly accurate depth maps. To further improve accuracy from near to far distances, reflected light must be efficiently received and the signal processing must be executed at a high frame rate.
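The distance calculation described above can be sketched in a few lines. This is a minimal illustration of the general ToF principle (distance equals round-trip light travel time times the speed of light, divided by two), not Sony’s implementation; the function name and sample timing value are illustrative only.

```python
# Minimal sketch of the time-of-flight (ToF) distance principle.
# Light travels to the object and back, so the one-way distance is
# half the round-trip travel time multiplied by the speed of light.

C = 299_792_458.0  # speed of light in a vacuum, m/s


def tof_distance(round_trip_time_s: float) -> float:
    """Return the one-way distance (meters) for a given round-trip time (seconds)."""
    return round_trip_time_s * C / 2.0


# A reflection arriving after roughly 66.7 nanoseconds corresponds to about 10 m,
# the far-range figure discussed for the new sensor.
print(f"{tof_distance(66.7e-9):.2f} m")
```

A per-pixel depth map is simply this calculation repeated for every pixel’s measured return time, which is why timing precision and efficient light collection directly determine depth accuracy.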

Combining the technological expertise of both companies, Sony is developing future solutions for areas such as smartphones, robotics, and automotive applications. ToF image sensors are already being applied in the automotive field to enable gesture control, and going forward they will be used for an ever-increasing range of applications crucial to the realization of advanced autonomous driving. These include the development of human-machine interfaces (HMI) that can track the status of drivers and passengers.

In April 2017, Sony introduced the IMX390CQV high-sensitivity CMOS image sensor. The sensor is equipped with an LED flicker mitigation function that reduces flickering when shooting LED signs and traffic signals, as well as an HDR function capable of 120dB wide dynamic range shooting. The sensor can do both simultaneously.
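The 120 dB figure can be read as a linear luminance contrast ratio, using the standard convention of 20 dB per decade for dynamic range. The short conversion below is a general illustration, not part of any Sony specification.

```python
# Convert a dynamic-range figure in decibels to a linear contrast ratio.
# Dynamic range in imaging is conventionally quoted at 20 dB per decade.

def db_to_ratio(db: float) -> float:
    """Return the linear ratio corresponding to a dynamic range in dB."""
    return 10 ** (db / 20.0)


# 120 dB corresponds to a 1,000,000:1 brightest-to-darkest ratio,
# e.g. direct sunlight at a tunnel exit versus the dark tunnel interior.
print(f"{db_to_ratio(120):,.0f}:1")
```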

It can accurately recognize LED traffic signs and the LED lamps of adjacent vehicles while capturing high-quality, wide-dynamic-range images, even in high-contrast situations such as entering or exiting a tunnel in daylight. The product is ready for use in forward-sensing cameras and in cameras for Camera Monitoring Systems (CMS), an application expected to grow as a substitute for conventional rearview mirrors by displaying persons and obstacles clearly on an in-car monitor. Sample shipment began in May 2017.

In October 2017, Sony released a 7.42-effective-megapixel stacked CMOS image sensor for automotive cameras. The IMX324 is intended for forward-sensing cameras in advanced driver-assistance systems (ADAS) and is capable of high-definition image capture of distant road signs up to approximately 160 meters ahead. Sample shipment began in November 2017.

Furthermore, this image sensor is expected to be compatible with the “EyeQ4” and “EyeQ5” image processors currently being developed by Mobileye, an Intel Company, for use in ADAS and autonomous vehicle technology.

In December 2017, Sony released a back-illuminated, 1/2-type, VGA-resolution time-of-flight (ToF) image sensor that delivers improved depth-sensing performance. Sony will start shipping sample units in April 2018.

The new sensor is part of Sony’s DepthSense lineup, a group of depth-sensing image sensor products with range finding capability, and it is the first product in the lineup to adopt a back-illuminated ToF configuration.

Compact sensors that provide accurate depth maps are used in applications such as gesture recognition, object recognition, and obstacle detection; in robotics and drones that require autonomous operation; and in virtual-, augmented-, and mixed-reality systems, for which market expansion is anticipated. The new product’s small size was achieved through the development of a 10 µm square pixel that delivers highly accurate distance measurement from close to far range, making it suitable for a wide range of applications in these fields.

The new sensor adopts a back-illuminated CMOS image sensor architecture, whose improved sensitivity allows more accurate detection of the reflected light. While conventional ToF sensors have difficulty measuring distances of approximately 10 meters, the new product includes a sensitivity-raising mode that enables distance measurement with a high detection rate at that range. It can also capture high-precision, VGA-resolution depth maps at close distances of approximately 30 centimeters to 1 meter.

Additionally, because this sensor captures depth maps for each frame, it enables image capture at a higher frame rate than when using a laser to scan the object for distance measurement. This reduces distortion of moving subjects in depth maps.
