Grid Import Optimization with Adaptive Deep Reinforcement Learning for PV-Battery Systems

Authors

  • Romi Naufal Karim Universitas Indonesia
  • Budi Sudiarto Universitas Indonesia

DOI:

https://doi.org/10.62146/ijecbe.v3i1.98

Keywords:

Deep Reinforcement Learning, Energy Management, Photovoltaic and Battery Systems

Abstract

This article explores the application of Deep Reinforcement Learning (Deep RL) to optimize energy management in photovoltaic (PV) and battery systems. The framework presented here introduces three key innovations: Rule-Based Action Smoothing for consistent system performance, PPO Multi-House Training to generalize across a wide range of energy usage patterns, and Post-Controller Integration to handle real-time operational constraints. While the dataset originates from Ireland, the model is adapted to Indonesia's dual-tariff system and local energy regulations. Simulation results demonstrate substantial cost savings, with reductions of up to 85.28% in stable scenarios and 18.26% in high-variability environments. These results highlight the flexibility and resilience of the methodology in using renewable energy to reduce costs and increase system efficiency. The model is therefore scalable for intelligent residential energy systems, supporting Indonesia's renewable energy goals across a broad range of scenarios.
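To make the optimization setting concrete, the following is an illustrative sketch (not the authors' code) of a minimal PV-battery dispatch step with a dual-tariff grid price and the kind of rule-based action smoothing the abstract describes. All numerical values (tariffs, peak window, battery rating and capacity) are assumptions for illustration only.

```python
PEAK_TARIFF = 1700.0      # price per kWh during evening peak (assumed)
OFF_PEAK_TARIFF = 1000.0  # price per kWh off-peak (assumed)

def tariff(hour):
    """Dual-tariff price: higher during an assumed 17:00-22:00 peak window."""
    return PEAK_TARIFF if 17 <= hour < 22 else OFF_PEAK_TARIFF

def smooth_action(raw_action, prev_action, max_delta=0.2):
    """Rule-based action smoothing: clip the step-to-step change in the
    battery power command so dispatch stays consistent over time."""
    delta = max(-max_delta, min(max_delta, raw_action - prev_action))
    return prev_action + delta

def step(soc, action, pv_kw, load_kw, hour, capacity_kwh=10.0, dt_h=1.0):
    """One environment step. action in [-1, 1] is the charge (+) or
    discharge (-) command as a fraction of a 3 kW rated battery (assumed).
    Returns (next state of charge, reward = negative grid-import cost)."""
    batt_kw = action * 3.0
    # Respect the battery's energy limits: cannot go below empty or above full.
    energy = max(0.0, min(capacity_kwh, soc * capacity_kwh + batt_kw * dt_h))
    batt_kw = (energy - soc * capacity_kwh) / dt_h
    # Grid import covers load plus charging, net of PV (no export assumed).
    grid_kw = max(0.0, load_kw + batt_kw - pv_kw)
    cost = grid_kw * dt_h * tariff(hour)
    return energy / capacity_kwh, -cost
```

In a full implementation, a PPO policy trained across multiple household profiles would output `raw_action` each step, with `smooth_action` applied as a post-controller before dispatch.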

Author Biographies

Romi Naufal Karim, Universitas Indonesia

Department of Electrical Engineering, Faculty of Engineering, Universitas Indonesia, Depok, Indonesia

Budi Sudiarto, Universitas Indonesia

Department of Electrical Engineering, Faculty of Engineering, Universitas Indonesia, Depok, Indonesia

References

Intergovernmental Panel on Climate Change (IPCC), “Global Warming of 1.5°C: IPCC Special Report on Impacts of Global Warming of 1.5°C above Pre-industrial Levels in Context of Strengthening Response to Climate Change, Sustainable Development, and Efforts to Eradicate Poverty,” Cambridge University Press, Cambridge, 2022. doi: 10.1017/9781009157940.

Government of Indonesia, “Indonesia Long-Term Strategy for Low Carbon and Climate Resilience 2050 (Indonesia LTS-LCCR 2050).”

F. Echevarría Camarero, A. Ogando-Martínez, P. Durán Gómez, and P. Carrasco Ortega, “Profitability of Batteries in Photovoltaic Systems for Small Industrial Consumers in Spain under Current Regulatory Framework and Energy Prices,” Energies (Basel), vol. 16, no. 1, Jan. 2023, doi: 10.3390/en16010361.

J. Quer and E. Ribera Borrell, “Connecting stochastic optimal control and reinforcement learning,” J Math Phys, vol. 65, no. 8, Aug. 2024, doi: 10.1063/5.0140665.

A. C. Real, G. P. Luz, J. M. C. Sousa, M. C. Brito, and S. M. Vieira, “Optimization of a photovoltaic-battery system using deep reinforcement learning and load forecasting,” Energy and AI, vol. 16, May 2024, doi: 10.1016/j.egyai.2024.100347.

H. Kang, S. Jung, H. Kim, J. Jeoung, and T. Hong, “Reinforcement learning-based optimal scheduling model of battery energy storage system at the building level,” Renewable and Sustainable Energy Reviews, vol. 190, Feb. 2024, doi: 10.1016/j.rser.2023.114054.

S. Xia, P. Wei, Y. Liu, A. Sonta, and X. Jiang, “A multi-task deep reinforcement learning-based recommender system for co-optimizing energy, comfort, and air quality in commercial buildings with humans-in-the-loop,” Data-Centric Engineering, vol. 5, Nov. 2024, doi: 10.1017/dce.2024.27.

N. Ali, A. Wahid, R. Shaw, and K. Mason, “A reinforcement learning approach to dairy farm battery management using Q learning,” J Energy Storage, vol. 93, Jul. 2024, doi: 10.1016/j.est.2024.112031.

R. S. Sutton and A. G. Barto, Reinforcement Learning: An Introduction. The MIT Press, 2020.

A. A. Elshazly, M. M. Badr, M. Mahmoud, W. Eberle, M. Alsabaan, and M. I. Ibrahem, “Reinforcement Learning for Fair and Efficient Charging Coordination for Smart Grid,” Energies (Basel), vol. 17, no. 18, Sep. 2024, doi: 10.3390/en17184557.

V. Mnih et al., “Human-level control through deep reinforcement learning,” Nature, vol. 518, no. 7540, pp. 529–533, 2015, doi: 10.1038/nature14236.

J. Schulman, F. Wolski, P. Dhariwal, A. Radford, and O. Klimov, “Proximal Policy Optimization Algorithms,” Jul. 2017, [Online]. Available: http://arxiv.org/abs/1707.06347

T. P. Lillicrap et al., “Continuous control with deep reinforcement learning,” Sep. 2015, [Online]. Available: http://arxiv.org/abs/1509.02971

R. Trivedi, M. Bahloul, A. Saif, S. Patra, and S. Khadem, “Comprehensive Dataset on Electrical Load Profiles for Energy Community in Ireland,” Sci Data, vol. 11, no. 1, Dec. 2024, doi: 10.1038/s41597-024-03454-2.

D. I. Jurj, L. Czumbil, B. Bârgăuan, A. Ceclan, A. Polycarpou, and D. D. Micu, “Custom outlier detection for electrical energy consumption data applied in case of demand response in block of buildings,” Sensors, vol. 21, no. 9, May 2021, doi: 10.3390/s21092946.

Kementerian Energi dan Sumber Daya Mineral, “Tarif Tenaga Listrik Triwulan I Tahun 2021,” 2021. [Online]. Available: https://www.esdm.go.id/assets/media/content/content-tarif-tenaga-listrik-tw-i-2021.pdf

G. Brockman et al., “OpenAI Gym,” Jun. 2016, [Online]. Available: http://arxiv.org/abs/1606.01540

G. Bao and R. Xu, “A Data-Driven Energy Management Strategy Based on Deep Reinforcement Learning for Microgrid Systems,” Cognit Comput, vol. 15, no. 2, pp. 739–750, Mar. 2023, doi: 10.1007/s12559-022-10106-3.

Kementerian Energi dan Sumber Daya Mineral Republik Indonesia, “Peraturan Menteri Energi dan Sumber Daya Mineral Republik Indonesia Nomor 2 Tahun 2024 tentang Pembangkit Listrik Tenaga Surya Atap yang Terhubung pada Jaringan Tenaga Listrik Pemegang Izin Usaha Penyediaan Tenaga Listrik untuk Kepentingan Umum,” Jan. 2024. [Online]. Available: https://jdih.esdm.go.id/storage/document/Permen%20ESDM%20Nomor%202%20Tahun%202024.pdf

Published

2025-05-21

How to Cite

Karim, R. N., & Sudiarto, B. (2025). Grid Import Optimization with Adaptive Deep Reinforcement Learning for PV-Battery Systems. International Journal of Electrical, Computer, and Biomedical Engineering, 3(1), 43–68. https://doi.org/10.62146/ijecbe.v3i1.98

Issue

Section

Electrical and Electronics Engineering
