Data fusion between laser analysis and imaging systems

The naval experiment was conducted in the Baltic Sea, with many other active and passive electro-optical sensors operating alongside the analysis system. This article discusses data fusion between laser analysis and imaging systems. In addition, drone experiments were conducted in a laboratory on the roof of FOI. Light Detection and Ranging (lidar) technology can quickly capture high-resolution 3D surface data with centimeter-level accuracy for a wide range of applications. Because lidar pulses penetrate foliage, these geospatial data sets can detect the ground surface beneath tree cover and produce high-fidelity bare-earth elevation models.

Precise characterization of the ground allows topographic and non-topographic points within the point cloud to be identified, and facilitates further discrimination between natural and man-made objects based solely on structural aspects and relative proximity parameters. The authors present a framework for automatically extracting natural and artificial features without relying on overlapping adjacent images or per-point RGB attributes. The TEXAS (Terrain Extraction and Segmentation) algorithm first generates a bare-earth surface from the lidar measurements, which is then used to label each point as terrain or non-terrain.
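The TEXAS algorithm itself is not reproduced here, but the general idea of a terrain/non-terrain split can be sketched: take the lowest return in each grid cell as a crude bare-earth estimate, then label points by their height above that surface. The function name, grid resolution, and height threshold below are illustrative assumptions, not the published method.

```python
import numpy as np

def label_terrain(points, cell=1.0, height_thresh=0.5):
    """Crude terrain/non-terrain labeling (illustrative, not TEXAS).

    points        : (N, 3) array of x, y, z lidar returns (meters)
    cell          : grid cell size for the bare-earth estimate
    height_thresh : returns closer than this to the bare-earth
                    surface are labeled terrain
    Returns a boolean mask, True where a point is terrain.
    """
    xy, z = points[:, :2], points[:, 2]
    # Assign each return to a grid cell.
    ij = np.floor((xy - xy.min(axis=0)) / cell).astype(int)
    keys = ij[:, 0] * (ij[:, 1].max() + 1) + ij[:, 1]
    # The lowest return per cell serves as the bare-earth height.
    ground = np.full(keys.max() + 1, np.inf)
    np.minimum.at(ground, keys, z)
    # Height above the local bare-earth estimate decides the label.
    return (z - ground[keys]) < height_thresh

# Example: 1,000 synthetic returns over a 50 m x 50 m tile.
pts = np.random.rand(1000, 3) * [50.0, 50.0, 15.0]
is_terrain = label_terrain(pts)
```

A production ground filter would refine this iteratively (e.g., progressive morphological or cloth-simulation filtering) rather than trust a single lowest-return pass.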

A further classification step uses local spatial information to assign a class to each point. Points with similar classifications are then gathered into regions to identify individual features. A description of each region's spatial attributes is generated to identify the locations of individual trees, forest sections, building footprints, and three-dimensional building shapes, among other feature types. The results of the fully automated feature extraction algorithm are then compared with ground truth to evaluate the completeness and accuracy of the method.
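As a hedged illustration of the grouping step, the sketch below clusters non-terrain points with DBSCAN (a stand-in for whatever grouping the paper actually uses) and summarizes each region's spatial attributes; the parameter values and attribute set are assumptions.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def segment_features(points, eps=2.0, min_samples=10):
    """Group non-terrain points into candidate features (illustrative).

    points : (N, 3) array of non-terrain returns
    Returns per-point cluster labels (-1 = noise) and a list of
    per-region spatial attribute summaries.
    """
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(points)
    summaries = []
    for k in sorted(set(labels) - {-1}):
        region = points[labels == k]
        extent = region.max(axis=0) - region.min(axis=0)
        summaries.append({
            "label": k,
            "points": len(region),
            "footprint_m": (extent[0], extent[1]),  # horizontal extent
            "height_m": extent[2],                  # vertical extent
            "centroid": region.mean(axis=0),
        })
    return labels, summaries
```

Attributes as simple as footprint and height already separate, say, a single tree (small footprint) from a building (large footprint); the paper's descriptors are presumably richer.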

Many modern lidar platforms include an integrated RGB camera for capturing contextual imagery. However, these RGB cameras do not collect a near-infrared (NIR) channel and therefore miss information that is valuable for analysis. This raises a question: in such cases, can lidar data collected at near-infrared wavelengths substitute for actual near-infrared imagery?
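One concrete example of what is lost without a NIR channel: the normalized difference vegetation index (NDVI), a standard vegetation measure that requires NIR and red reflectance. (This index is offered only as an illustration; the paper does not specify which analyses motivated the question.)

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index.

    Healthy vegetation reflects strongly in NIR, so NDVI is high
    over foliage and low over bare ground, asphalt, or water.
    """
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + eps)
```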

When another source of near-infrared imagery, such as a satellite image, is not available, generating a lidar-based near-infrared image is potentially useful. Lidar is an active sensing system that operates very differently from a passive one, so additional processing and corrections are required to approximate the output of a passive instrument. The authors study a method for deriving an approximate passive near-infrared image from lidar data, and evaluate how it differs from a real near-infrared image of the same scene.
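The paper's correction chain is not spelled out here, but a common first-order step when turning lidar return intensity into a pseudo-passive image is compensating for the 1/R² range falloff of an active system before gridding. The sketch below assumes exactly that and nothing more; the reference range, cell size, and the neglect of incidence-angle and atmospheric effects are all simplifying assumptions.

```python
import numpy as np

def pseudo_nir_image(points, intensity, ranges, cell=0.5, ref_range=100.0):
    """Grid range-corrected lidar intensity into a pseudo-NIR image.

    points    : (N, 2+) array; the first two columns are x, y
    intensity : (N,) raw return intensities
    ranges    : (N,) sensor-to-target distance for each return
    First-order correction only: scale each intensity by
    (R / R_ref)^2 to undo 1/R^2 falloff, then average per cell.
    """
    corrected = intensity * (ranges / ref_range) ** 2
    xy = points[:, :2]
    ij = np.floor((xy - xy.min(axis=0)) / cell).astype(int)
    shape = (ij[:, 0].max() + 1, ij[:, 1].max() + 1)
    img_sum = np.zeros(shape)
    img_cnt = np.zeros(shape)
    np.add.at(img_sum, (ij[:, 0], ij[:, 1]), corrected)
    np.add.at(img_cnt, (ij[:, 0], ij[:, 1]), 1)
    with np.errstate(invalid="ignore"):
        return img_sum / img_cnt  # NaN where no returns fell
```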

The development of lidar for autonomous navigation and collision avoidance of various vehicles is currently attracting widespread attention. In these applications, minimizing size, weight, and power (SWaP) is critical, especially for airborne advanced imaging systems that require positioning, calibration, and docking. In this article, the authors describe a miniaturized high-energy lidar system that uses the liquid-crystal birefringence of an electro-optic (EO) beam-steering device to achieve a 20° x 5° field of view (FOV) without moving parts. Future versions are expected to significantly increase this FOV.

In addition to scanning, the device can operate in a “point and hold” mode, locking onto a single moving object. The non-mechanical design yields particularly favorable size and weight: 1 L and <1 kg, respectively. Moreover, these EO scanners operate without mechanical resonance or inertial effects. A green laser with a repetition rate of 50 kHz, a pulse energy of 1 mJ, and a beam diameter of 2 mm was used for imaging. A frame rate of 2 fps was demonstrated at a range of 100 m, limited by the laser pulse repetition frequency. The fine control provided by the EO steering device yields an angular accuracy of 6×10⁻⁴ degrees. The field of view can be further increased with a thin, non-mechanical polarization-grating beam redirector. The authors present the design concept, preliminary results, and plans for next-generation improvements.
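The reported figures are mutually consistent, as a quick back-of-the-envelope check shows: at 50 kHz and 2 fps there are 25,000 pulses available per frame, and spreading them uniformly over the 20° x 5° FOV (an assumed sampling pattern; the source does not state one) gives a sample spacing far coarser than the quoted pointing accuracy.

```python
# Back-of-the-envelope scan budget for the reported numbers.
# Assumes one range sample per pulse and uniform square angular
# sampling over the FOV; neither detail is given in the source.
prf_hz = 50_000                     # pulse repetition frequency (50 kHz)
frame_rate_hz = 2                   # reported frame rate at 100 m
fov_az_deg, fov_el_deg = 20.0, 5.0  # field of view

pulses_per_frame = prf_hz // frame_rate_hz   # 25,000 samples per frame
step_deg = (fov_az_deg * fov_el_deg / pulses_per_frame) ** 0.5
print(f"{pulses_per_frame} samples/frame, ~{step_deg:.3f} deg spacing")
# -> ~0.063 deg spacing, about two orders of magnitude coarser than
#    the 6e-4 deg pointing accuracy of the EO steering device.
```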