Research Article

Pothole detection in the woods: a deep learning approach for forest road surface monitoring with dashcams

Pages 303-312 | Received 22 Mar 2023, Accepted 22 Nov 2023, Published online: 18 Dec 2023

ABSTRACT

Sustainable forest management systems require operational measures to preserve the functional design of forest roads. Frequent road data collection and analysis are essential to support target-oriented and efficient maintenance planning and operations. This study demonstrates an automated solution for monitoring forest road surface deterioration using consumer-grade optical sensors. A YOLOv5 model with StrongSORT tracking was adapted and trained to detect and track potholes in videos captured by vehicle-mounted cameras. The model was trained on datasets recorded in diverse geographical regions under different weather conditions. It achieves a detection and tracking precision of up to 0.79 and a recall of 0.58, with a mean average precision of 0.70 at an intersection over union (IoU) of at least 0.5. We applied the trained model to a forest road in southern Norway, recorded with a Global Navigation Satellite System (GNSS)-fitted dashcam. GNSS coordinates delivered at a rate of 10 Hz were used to geolocate the detected potholes. Over this example 1 km road stretch, the geolocation exhibited a root mean square deviation of about 9.7 m compared to OpenStreetMap. Finally, an example road deterioration map was compiled, which can be used for scheduling road maintenance operations.
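
The abstract describes the pipeline only at a high level. The sketch below illustrates how such a workflow could be wired together in Python; it is not the authors' implementation. The weights file pothole_best.pt, the CSV layout of the GNSS log, the confidence threshold, and the frame-to-fix matching by timestamp are illustrative assumptions, and the StrongSORT step that tracks each pothole across consecutive frames is omitted for brevity.

"""Minimal sketch of a dashcam pothole detection and geolocation pipeline.
Assumptions: custom YOLOv5 weights, a GNSS log as CSV (time_s,lat,lon at 10 Hz),
and a video whose frame timestamps can be derived from its frame rate."""

import csv
import math

import cv2    # pip install opencv-python
import torch  # pip install torch

# 1. Load a YOLOv5 model via torch.hub (hypothetical pothole weights,
#    with the stock small model as a generic fallback).
try:
    model = torch.hub.load("ultralytics/yolov5", "custom", path="pothole_best.pt")
except Exception:
    model = torch.hub.load("ultralytics/yolov5", "yolov5s")


def load_gnss(path):
    """Read GNSS fixes from an assumed CSV with columns time_s, lat, lon."""
    with open(path, newline="") as f:
        return [(float(r["time_s"]), float(r["lat"]), float(r["lon"]))
                for r in csv.DictReader(f)]


def nearest_fix(fixes, t):
    """Return (lat, lon) of the GNSS fix closest in time to t (seconds)."""
    _, lat, lon = min(fixes, key=lambda fix: abs(fix[0] - t))
    return lat, lon


def detect_and_geotag(video_path, gnss_path, conf_thres=0.5):
    """Run detection frame by frame and attach a coordinate to each detection."""
    fixes = load_gnss(gnss_path)
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
    geotagged = []  # (frame_idx, lat, lon, confidence)
    frame_idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)   # YOLOv5 expects RGB
        results = model(rgb)
        for *_xyxy, conf, _cls in results.xyxy[0].tolist():
            if conf < conf_thres:
                continue
            lat, lon = nearest_fix(fixes, frame_idx / fps)
            geotagged.append((frame_idx, lat, lon, conf))
        frame_idx += 1
    cap.release()
    return geotagged


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def rmsd(pairs):
    """Root mean square deviation over matched ((lat, lon), (lat_ref, lon_ref)) pairs."""
    d2 = [haversine_m(a[0], a[1], b[0], b[1]) ** 2 for a, b in pairs]
    return math.sqrt(sum(d2) / len(d2))

The rmsd helper mirrors the kind of comparison reported against OpenStreetMap in the abstract: detected pothole positions are matched to reference positions and the great-circle residuals are aggregated into a single deviation in metres.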

This article is part of the following collections:
Digitalization of Forest Operations

Acknowledgements

This work is part of the Center for Research-based Innovation SmartForest: Bringing Industry 4.0 to the Norwegian forest sector (NFR SFI project no. 309671, smartforest.no).

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

This work was supported by the Norges Forskningsråd [309671].