Edge-sensor and information viewer: A CLASS car sample
Within the CLASS project, we equipped 3 cars with sensors and gave them a “brain”. The board we use, the Drive AGX Xavier, is an automotive-grade computing platform developed by NVIDIA. The car is equipped with an Xsens GPS, 4 Sekonix automotive cameras and a LiDAR sensor produced by Ouster.
In the video above you can see the fusion of the 4 cameras and the LiDAR. The cameras return frames, which can be seen on the left, while the LiDAR is a laser sensor that determines the distance to surrounding points by emitting laser pulses and measuring their reflections. The sensor returns a point cloud of the surrounding environment, but with spatial information only. Therefore, if we want both the spatial information and image features, such as colours, we need to fuse the outputs of the two sensors.
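To make the difference between the two data streams concrete, here is a minimal illustration of the structures involved; the array shapes are hypothetical examples, not the actual sensor specifications.

```python
# Illustrative only: the two sensor outputs have very different structure,
# which is why fusion is needed. Shapes are hypothetical examples.
import numpy as np

# A camera frame: a dense grid of pixels with colour but no depth.
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)       # height x width x RGB

# A LiDAR sweep: an unordered set of 3D points with position but no colour.
point_cloud = np.zeros((120_000, 3), dtype=np.float32)  # N x (x, y, z) in metres
```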
To do so, a calibration and alignment of the cameras and the LiDAR is required. Once this step is performed, the sensors share the same reference system, so it is possible to project the pixels of a frame onto the point cloud and obtain a meaningful coloured map.
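As a rough sketch of this projection step, assuming calibration has already produced a camera intrinsic matrix K and a LiDAR-to-camera transform T (the function name and shapes are illustrative, not the project's actual code), colouring the point cloud could look like this:

```python
# Minimal sketch of LiDAR-to-camera projection, assuming a calibrated setup.
# K and T are assumed to come from the calibration step described above.
import numpy as np

def colour_point_cloud(points_xyz, image, K, T):
    """Project LiDAR points into the image and attach a pixel colour to each point.

    points_xyz : (N, 3) array of LiDAR points
    image      : (H, W, 3) camera frame
    K          : (3, 3) camera intrinsic matrix
    T          : (4, 4) LiDAR-to-camera extrinsic transform
    Returns an (M, 6) array of [x, y, z, r, g, b] for points inside the image.
    """
    # Transform points into the camera reference frame (homogeneous coordinates).
    ones = np.ones((points_xyz.shape[0], 1))
    pts_cam = (T @ np.hstack([points_xyz, ones]).T).T[:, :3]

    # Keep only points in front of the camera.
    in_front = pts_cam[:, 2] > 0
    pts_cam = pts_cam[in_front]
    kept_xyz = points_xyz[in_front]

    # Pinhole projection: pixel = K @ (X/Z, Y/Z, 1).
    uv = (K @ (pts_cam / pts_cam[:, 2:3]).T).T[:, :2]
    u, v = uv[:, 0].astype(int), uv[:, 1].astype(int)

    # Discard projections that land outside the frame.
    h, w = image.shape[:2]
    inside = (u >= 0) & (u < w) & (v >= 0) & (v < h)

    colours = image[v[inside], u[inside]]          # (M, 3) RGB values
    return np.hstack([kept_xyz[inside], colours])  # coloured point cloud
```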
Moreover, 3D boxes can now be obtained easily. We perform object detection and classification on the camera frames using YOLO, a neural network enhanced by our team. The detector outputs bounding boxes and a corresponding category. For the point cloud, we first pre-process the points, for example by removing those belonging to the ground, and then perform very fast clustering, grouping all the points of the same object into a single cluster. Finally, we apply the fusion, matching the 2D bounding boxes and their categories with the clusters in the point cloud, in order to obtain 3D object detection and classification.
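The sketch below illustrates the point-cloud side of the pipeline and the 2D/3D matching step. The ground-removal threshold, the DBSCAN clustering parameters and the project_to_image helper (the projection from the previous sketch) are assumptions for illustration; the actual CLASS pipeline uses its own, much faster implementations.

```python
# Rough sketch of ground removal, clustering and 2D/3D fusion (illustrative only).
import numpy as np
from sklearn.cluster import DBSCAN

def detect_3d_objects(points_xyz, boxes_2d, project_to_image):
    """boxes_2d: list of (category, (u_min, v_min, u_max, v_max)) from the 2D detector."""
    # 1. Ground removal: drop points close to the ground plane (naive height threshold).
    above_ground = points_xyz[points_xyz[:, 2] > 0.3]

    # 2. Clustering: group nearby points into object candidates.
    labels = DBSCAN(eps=0.7, min_samples=10).fit_predict(above_ground)

    detections = []
    for label in set(labels) - {-1}:                   # -1 marks noise points
        cluster = above_ground[labels == label]
        u, v = project_to_image(cluster.mean(axis=0))  # centroid in image coordinates

        # 3. Fusion: assign the cluster to the 2D box that contains its centroid.
        for category, (u0, v0, u1, v1) in boxes_2d:
            if u0 <= u <= u1 and v0 <= v <= v1:
                detections.append({
                    "category": category,
                    "box_3d_min": cluster.min(axis=0).tolist(),  # axis-aligned 3D box
                    "box_3d_max": cluster.max(axis=0).tolist(),
                })
                break
    return detections
```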
The boxes shown in the video are the outputs of this fusion, and the colour of each box indicates its class: for example, green boxes are cars, while orange ones are pedestrians. We then send the fused information to the city via the private 4G network of the MASA area.
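As an illustration of how the fused detections could be packaged for transmission, the sketch below serialises them as JSON and posts them to a gateway. The message format, endpoint and transport are hypothetical, since the article only states that the data travels over the MASA area's private 4G network.

```python
# Hypothetical packaging of the fused detections for transmission to the city;
# the endpoint, identifier and message layout are assumptions for illustration.
import json
import time
import urllib.request

def send_to_city(detections, endpoint="http://city-gateway.example/detections"):
    payload = json.dumps({
        "timestamp": time.time(),
        "vehicle_id": "class-car-01",   # hypothetical identifier
        "detections": detections,       # output of the fusion step above
    }).encode("utf-8")

    request = urllib.request.Request(
        endpoint, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(request, timeout=1.0) as response:
        return response.status
```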
The pyramids seen on top of some objects represent all the information that is sent back from the city. As can be seen, the data from the car is augmented, and this can be very useful for many different ADAS applications.