ABSTRACT
Remaining useful life (RUL) prediction is a key enabler for predictive maintenance. Data-driven approaches, typically based on deep neural networks (DNNs), have shown success in RUL prediction; however, DNNs are usually handcrafted through a labor-intensive design process. To overcome this issue, we propose a neural architecture search (NAS) technique based on a surrogate-assisted genetic algorithm that automatically discovers optimal Transformer architectures. To our knowledge, this is the first work to optimize the architecture of Transformers for RUL prediction using evolutionary computation. We evaluate the performance of the proposed method, in terms of RMSE and the score function, on the well-known CMAPSS benchmark dataset. Compared with the state of the art, the Transformers obtained by our NAS method outperform other recent handcrafted DNNs in terms of RMSE and are comparable regarding the score. Our results demonstrate that the proposed method provides better prediction accuracy with less human effort than other data-driven approaches.
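For readers unfamiliar with the two evaluation metrics named above, the following minimal sketch computes RMSE and the asymmetric scoring function conventionally used with the CMAPSS benchmark (which penalizes late RUL predictions more heavily than early ones). This is an illustration of the standard metrics, not the authors' own evaluation code.

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean squared error between true and predicted RUL values."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.sqrt(np.mean((y_pred - y_true) ** 2)))

def cmapss_score(y_true, y_pred):
    """Standard asymmetric CMAPSS scoring function.

    For each unit, d = predicted RUL - true RUL. Early predictions
    (d < 0) contribute exp(-d/13) - 1; late predictions (d >= 0)
    contribute the steeper exp(d/10) - 1. Lower total is better.
    """
    d = np.asarray(y_pred, dtype=float) - np.asarray(y_true, dtype=float)
    penalties = np.where(d < 0, np.exp(-d / 13.0) - 1.0, np.exp(d / 10.0) - 1.0)
    return float(np.sum(penalties))
```

The asymmetry reflects maintenance practice: predicting failure too late is costlier than scheduling maintenance too early.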
Disclosure statement
No potential conflict of interest was reported by the authors.