Research Article

A lightweight and style-robust neural network for autonomous driving in end side devices

Article: 2155613 | Received 31 May 2022, Accepted 20 Nov 2022, Published online: 28 Dec 2022

Abstract

The autonomous driving algorithm studied in this paper enables a ground vehicle to sense its environment through visual images and move safely with little or no human input. Because of the limited computing power of end side devices, an autonomous driving algorithm must adopt a lightweight model while maintaining high performance. Conditional imitation learning has proved to be an efficient and promising policy for autonomous driving and other applications on end side devices owing to its high performance and offline characteristics. In driving scenarios, images captured under different weather conditions have different styles, shaped by interference factors such as illumination and raindrops. These interference factors challenge the perception ability of deep models and thus affect the decision-making process in autonomous driving. The first contribution of this paper is an investigation of the performance gap of driving models under different weather conditions. Following this investigation, we utilise StarGAN-V2 to translate images from source domains into the target clear-sunset domain. Based on the images translated by StarGAN-V2, we propose Star-CILRS, a conditional imitation learning model with a ResNet backbone. The proposed method can convert an image into multiple styles with a single model, making it easier to deploy on end side devices. Visualisation results show that Star-CILRS eliminates some environmental interference factors. Our method outperforms the comparison methods, achieving success rates of 98%, 74%, and 22% on the three evaluation tasks, respectively.
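To make the pipeline in the abstract concrete, the following is a minimal PyTorch sketch of the described architecture: a StarGAN-V2-style translator first maps the camera image into the clear-sunset style, then a CILRS-style policy with a ResNet backbone predicts controls conditioned on a high-level navigation command. All module names, layer sizes, and the number of command branches are illustrative assumptions, not the authors' released code; the style translator is stubbed with an identity module where the pretrained StarGAN-V2 generator would sit.

import torch
import torch.nn as nn
from torchvision.models import resnet34

class StarCILRS(nn.Module):
    def __init__(self, num_commands: int = 4):
        super().__init__()
        # Placeholder for the pretrained StarGAN-V2 generator; in the
        # real system this module would render the input frame in the
        # target clear-sunset style.
        self.style_translator = nn.Identity()
        # ResNet backbone shared by all command branches (CILRS-style).
        backbone = resnet34(weights=None)
        backbone.fc = nn.Identity()  # expose 512-d features
        self.backbone = backbone
        self.speed_enc = nn.Sequential(nn.Linear(1, 128), nn.ReLU())
        # One control branch per navigation command (follow, left, ...).
        self.branches = nn.ModuleList(
            nn.Sequential(nn.Linear(512 + 128, 256), nn.ReLU(),
                          nn.Linear(256, 3))  # steer, throttle, brake
            for _ in range(num_commands)
        )

    def forward(self, image, speed, command):
        # 1) Normalise style: map any weather to the clear-sunset domain.
        image = self.style_translator(image)
        # 2) Perceive: extract visual features with the ResNet backbone.
        feat = self.backbone(image)
        # 3) Act: the command selects which branch emits the controls.
        joint = torch.cat([feat, self.speed_enc(speed)], dim=1)
        out = torch.stack([b(joint) for b in self.branches], dim=1)
        idx = command.view(-1, 1, 1).expand(-1, 1, out.size(-1))
        return out.gather(1, idx).squeeze(1)

model = StarCILRS()
controls = model(torch.randn(2, 3, 224, 224),   # RGB frames
                 torch.rand(2, 1),              # current speed
                 torch.tensor([0, 2]))          # navigation commands
print(controls.shape)  # torch.Size([2, 3])

Keeping the translator and the driving policy as separate modules matches the deployment story in the abstract: one translation model serves every weather style, so only the lightweight policy runs per command branch on the end side device.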

Disclosure statement

No potential conflict of interest was reported by the authors.

Additional information

Funding

This work is supported by the Fundamental Research Funds for the Central Universities [grant number 2022JBMC012] and the National Natural Science Foundation of China [grant number 62206013].