Volume 13 | Issue 4
Edge computing has emerged as a viable paradigm for delivering cloud-like services to mobile users. By exploiting the geographical and social proximity among users, edge computing can deliver on-demand services with minimal latency. However, the unprecedented volume of activity, together with the diverse applications expected in future smart cities (healthcare, energy management, transportation, cybersecurity, and so on), has become a performance bottleneck for edge computing. To keep pace with this evolving computing environment and continue to deliver dependable smart services to mobile users, edge devices must therefore be equipped with adaptive machine learning algorithms. Moreover, a cooperative learning system, in which information flows both from the cloud to the edge and from the edge to the cloud, can reduce the load on edge servers while improving the efficiency of the learning process. In this work, we develop a smart energy management system for IoT devices using deep reinforcement learning and edge computing. We first provide a high-level overview of IoT-based smart-city energy management. We then present a framework and software model for an Internet of Things-based system that employs edge computing. Next, we describe a deep reinforcement learning-based energy scheduling technique that makes optimal use of the proposed architecture. Finally, we illustrate the effectiveness of the proposed method.
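To make the idea of reinforcement learning-based energy scheduling concrete, the sketch below shows a deliberately simplified, tabular Q-learning agent for a toy edge-device scheduling problem. All names, state variables, costs, and parameters here are illustrative assumptions, not the paper's actual model; in the proposed system, a deep network would replace the Q-table to handle much larger state spaces.

```python
import random

# Toy energy-scheduling MDP (an assumed model for illustration only):
# at each step an edge device chooses whether to serve its current demand
# from the grid (costly) or from its local battery (free, if charged).
# States: (battery_level, demand_level), each discretized to 0..LEVELS-1.
# Actions: 0 = draw from grid, 1 = draw from battery.

GRID_COST = 1.0
LEVELS = 5

def step(state, action):
    """Apply an action, return (next_state, reward). Rewards are negative costs."""
    battery, demand = state
    if action == 1 and battery >= demand:
        battery -= demand                # serve demand from the battery
        reward = 0.0
    else:
        reward = -GRID_COST * demand     # fall back to the grid and pay for it
    battery = min(LEVELS - 1, battery + random.randint(0, 1))  # assumed recharge (e.g. solar)
    demand = random.randint(0, LEVELS - 1)                     # next demand arrives
    return (battery, demand), reward

def train(episodes=2000, alpha=0.1, gamma=0.9, eps=0.1):
    """Tabular Q-learning with epsilon-greedy exploration."""
    q = {}  # maps (state, action) -> estimated discounted return
    for _ in range(episodes):
        state = (random.randint(0, LEVELS - 1), random.randint(0, LEVELS - 1))
        for _ in range(20):
            if random.random() < eps:
                action = random.randint(0, 1)            # explore
            else:                                        # exploit current estimates
                action = max((0, 1), key=lambda a: q.get((state, a), 0.0))
            nxt, reward = step(state, action)
            best_next = max(q.get((nxt, a), 0.0) for a in (0, 1))
            old = q.get((state, action), 0.0)
            # Standard Q-learning temporal-difference update
            q[(state, action)] = old + alpha * (reward + gamma * best_next - old)
            state = nxt
    return q
```

After training, the learned Q-values should prefer draining the battery whenever it can cover the demand, which is the kind of scheduling policy the full deep RL formulation generalizes to richer state and action spaces.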