Abstract
Centrifugal compressors are widely applied in both aero-engines and internal combustion engine turbochargers. Similarity-based scaling design has effectively reduced the long design cycles and high design costs associated with centrifugal compressors. However, rapidly obtaining performance maps for similarity-scaled compressor designs through numerical simulation remains a challenging problem. To address this, the similarity- and scaling-laws-informed neural network (SINN) proposed in this study integrates 520 sets of experimental data (the full dataset) from a prototype (m1 = 1.0) and three scaled compressor designs (m1 = 0.9/1.2/1.3) to construct a physics-informed hybrid loss function. This study also investigates the influence of training sample size (520, 416, and 112 sets) on the prediction results. The results indicate that SINN, when trained on the full dataset, achieves a mean absolute error (MAE) of 0.379 for mass flow rate prediction, a 19.2% reduction compared to the traditional BPNN model (0.469), while the pressure ratio prediction deviation consistently remains within ±3%. Even with only 20% of the training samples, SINN maintains an MAE of 0.370, a 51.3% reduction in error compared to BPNN. For extrapolative prediction to the untested scaling ratios (m1 = 0.9/1.3), SINN's mass flow rate prediction error is 23.7%–50.4% lower than BPNN's. The prediction performance of SINN shows significant advantages over BPNN, whether trained on the full dataset or on sparse, small-sample datasets. By explicitly embedding similarity principles to constrain the solution space, the SINN model demonstrates the advantages of physics-informed neural networks in scaled compressor design, namely high accuracy and strong generalization capability.
The results demonstrate that the prediction stability and accuracy of the proposed SINN are superior to those of traditional backpropagation neural networks (BPNN), making it suitable for extrapolative prediction as well.
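The abstract does not give the exact form of the hybrid loss, but the idea of embedding similarity scaling laws as a physics penalty can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: it assumes the standard geometric similarity relations for centrifugal compressors at matched tip Mach number (mass flow scales with the square of the linear scale factor `scale`, pressure ratio is invariant), and the weight `lam` and all function names are hypothetical.

```python
import numpy as np

def similarity_residual(m_dot_pred, pr_pred, m_dot_proto, pr_proto, scale):
    """Residuals against assumed similarity laws: m_dot ~ scale^2, pressure ratio invariant."""
    m_dot_target = (scale ** 2) * m_dot_proto  # assumed scaling of mass flow with impeller size
    res_flow = m_dot_pred - m_dot_target
    res_pr = pr_pred - pr_proto                # pressure ratio unchanged under similarity
    return res_flow, res_pr

def hybrid_loss(pred, data, proto, scale, lam=0.5):
    """Data-fit MSE plus a physics penalty from the similarity residuals (lam is a hypothetical weight)."""
    m_dot_pred, pr_pred = pred
    m_dot_true, pr_true = data
    m_dot_proto, pr_proto = proto
    data_loss = np.mean((m_dot_pred - m_dot_true) ** 2) + np.mean((pr_pred - pr_true) ** 2)
    res_flow, res_pr = similarity_residual(m_dot_pred, pr_pred, m_dot_proto, pr_proto, scale)
    phys_loss = np.mean(res_flow ** 2) + np.mean(res_pr ** 2)
    return data_loss + lam * phys_loss
```

Under this formulation, the physics term vanishes only when the network's predictions for a scaled design are consistent with the prototype data under the similarity laws, which is one way such a constraint can shrink the solution space and improve extrapolation from small samples.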
