Abstract
The growing demand for electric vehicles (EVs) has brought fast charging into focus as a practical way to reduce downtime. However, little is understood about how such rapid charging methods affect battery performance under different temperature conditions, especially in real-world environments. This study examines how ambient temperature, together with the type of charger used (Level 1, Level 2, or DC fast charging), affects key battery performance indicators such as state of charge (SOC) gain, energy usage, and long-term capacity behavior. Using data collected from 1,320 real-world EV charging sessions, the analysis revealed a significant drop in SOC gain: as much as 27% in sub-zero conditions and 19% when temperatures exceed 40°C. Similarly, energy consumption increased by 20%–25% at these extremes, pointing to the additional power drawn by thermal management systems and reduced charging efficiency. Among the charging types, DC fast chargers were the most sensitive to temperature fluctuations, particularly at high ambient temperatures, where internal resistance effects were more pronounced. A key contribution of this work is the incorporation of electrochemical resistance parameters, namely charge transfer resistance (Rct) and electrolyte resistance (Re), into a suite of machine learning models for performance prediction. Of these, the XGBoost model offered the most accurate results, with an R² of 0.95 and an RMSE of 1.63. Interpretability analysis using SHAP confirmed that Rct and Re were stronger predictors of battery performance than temperature alone. By leveraging real-world data and explainable machine learning, this research offers a more realistic understanding of how EV batteries behave under varying conditions and proposes a path toward more intelligent, temperature-aware battery management systems.
