Abstract
Driven by escalating requirements for efficiency and operational safety in power-generation units, data-driven forecasting of generator output has emerged as a critical research frontier. Generator performance is governed by a complex interplay between mechanical operating conditions and stochastic human factors. In this study, we performed online monitoring of turbine lubricating oil to assemble a comprehensive time-series dataset integrating oil wear signatures and unit power metrics. We propose a synergistic deep-learning architecture, the Transformer–LSTM, which leverages an LSTM-based auxiliary extractor to capture fine-grained local temporal patterns. Concurrently, positional encodings, timestamp embeddings, and value embeddings are employed to characterize global contextual features. By fusing these local and global representations, the model jointly perceives short-term dynamics and long-range dependencies. Empirical results demonstrate that the proposed hybrid model significantly outperforms standalone LSTM and Transformer baselines in both predictive accuracy and robustness across multiple evaluation metrics. This framework provides robust technical decision support for condition-based maintenance and optimized power-generation scheduling.
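The abstract mentions combining positional encodings, timestamp embeddings, and value embeddings to form the Transformer branch's input representation. Since the paper's implementation is not shown here, the following is only a minimal illustrative sketch of that embedding-sum idea in NumPy; the function names, embedding dimensions, and randomly initialized weight matrices (`W_val`, `E_time`) are all hypothetical, not taken from the article.

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    # Standard sinusoidal positional encoding (Vaswani et al., 2017).
    pos = np.arange(seq_len)[:, None]
    i = np.arange(d_model)[None, :]
    angles = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    return np.where(i % 2 == 0, np.sin(angles), np.cos(angles))

def embed_inputs(values, timestamps, d_model, rng):
    # values: (seq_len,) raw sensor readings (e.g. oil wear or power metrics)
    # timestamps: (seq_len,) integer time features (e.g. hour-of-day, 0-23)
    seq_len = len(values)
    W_val = rng.normal(scale=0.1, size=(1, d_model))    # value-embedding projection (hypothetical)
    E_time = rng.normal(scale=0.1, size=(24, d_model))  # timestamp-embedding table (hypothetical)
    value_emb = values[:, None] @ W_val                 # (seq_len, d_model)
    time_emb = E_time[timestamps]                       # (seq_len, d_model)
    # The three components are summed element-wise, as in the original Transformer.
    return value_emb + time_emb + positional_encoding(seq_len, d_model)

rng = np.random.default_rng(0)
x = embed_inputs(np.linspace(0.0, 1.0, 8), np.arange(8) % 24, d_model=16, rng=rng)
print(x.shape)  # (8, 16): one d_model-dimensional token per time step
```

The resulting sequence of embedded tokens would then feed the Transformer's self-attention layers for global context, while the raw series would feed the LSTM branch for local patterns, with the two representations fused downstream.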
