
A Multi-Source Convolutional Neural Network for Lidar Bathymetry Data Classification

Pages 232-250 | Received 08 Aug 2021, Accepted 17 Jan 2022, Published online: 04 Feb 2022
 

Abstract

Airborne Lidar bathymetry (ALB) has been widely applied in coastal hydrological research due to its outstanding advantages in integrated sea-land mapping. This study investigates the capability of convolutional neural networks (CNN) to classify land echoes, shallow-water echoes and deep-water echoes in multichannel ALB systems. First, the raw data and the response function obtained after deconvolution were fed into the network through separate channels. The proposed multi-source CNN (MS-CNN) was designed with a one-dimensional (1D) squeeze-and-excitation module (SEM) and a calibrated reference module (CRM), and the classification results were output by the SoftMax layer. Finally, the accuracy of MS-CNN was validated on test sets of land, shallow-water and deep-water echoes. More than 99.5% of the echoes were correctly classified, and the proposed MS-CNN proved more robust than other advanced classification algorithms. These results indicate that CNNs are a promising candidate for the classification of Lidar bathymetry data.
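The abstract does not detail the internals of the 1D squeeze-and-excitation module, but a generic SE block applied to a 1D feature map can be sketched as follows. This is a minimal numpy illustration under assumed shapes (8 channels, length 16, reduction ratio 4) and randomly initialized weights `w1`/`w2`; it is not the authors' implementation.

```python
import numpy as np

def squeeze_excitation_1d(x, w1, w2):
    """Channel recalibration for a 1D feature map x of shape (C, L).

    Squeeze: global average pooling over the length axis.
    Excitation: two small dense layers (ReLU, then sigmoid) produce
    per-channel weights in (0, 1) that rescale the feature map.
    """
    z = x.mean(axis=1)                    # squeeze: (C,)
    h = np.maximum(w1 @ z, 0.0)           # reduction layer + ReLU: (C//r,)
    s = 1.0 / (1.0 + np.exp(-(w2 @ h)))   # expansion layer + sigmoid: (C,)
    return x * s[:, None]                 # scale each channel of x

# Toy example: 8 channels, waveform-feature length 16, reduction ratio r = 4.
rng = np.random.default_rng(0)
x = rng.standard_normal((8, 16))
w1 = rng.standard_normal((2, 8)) * 0.1    # reduction weights (C//r, C)
w2 = rng.standard_normal((8, 2)) * 0.1    # expansion weights (C, C//r)
y = squeeze_excitation_1d(x, w1, w2)
```

Because the sigmoid gates lie in (0, 1), each output channel is a damped copy of its input channel; informative channels (as judged by the learned weights) are suppressed less than uninformative ones.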

Disclosure statement

No potential conflict of interest was reported by the authors.

Data availability statement

The data that support the findings of this study are available on request from the corresponding author. The data are not publicly available due to privacy or ethical restrictions.

Additional information

Funding

This work was supported by the Guangxi Innovative Development Grand Grant (No. 2018AA13005) and the Science and Technology Project of Tianjin (No. 18ZXZNGX00230).
