AI-Driven Predictive Caching for Edge Computing


Ashlesha Gupta

Abstract

The exponential growth of digital content consumption is straining traditional centralized networks, resulting in high latency and server overloads. To meet user expectations, Edge Computing has emerged as a vital solution, placing processing and storage resources near end-users. Within this decentralized architecture, caching is critical for efficient content delivery. However, conventional caching methods such as LRU and LFU are static and fail to adapt to dynamic user behavior and evolving content popularity. This paper proposes an AI-Driven Predictive Caching framework to overcome these limitations. The approach uses the Random Forest algorithm to analyze historical access patterns and proactively forecast future content demand. The predictive strategy is demonstrated using Redis to simulate a realistic, in-memory edge environment, with Streamlit providing real-time visualization of system behavior and predictions. Our findings aim to demonstrate that integrating machine learning with edge computing significantly enhances traditional caching mechanisms, yielding higher cache hit rates and reduced latency, and ultimately producing the more adaptive, responsive edge networks that critical content delivery systems require.
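The abstract's pipeline (learn from historical access patterns with a Random Forest, then proactively cache the content predicted to be in demand) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the access log is synthetic, the feature design (past-window request counts) and the top-k pre-warming policy are assumptions, and an in-memory dict stands in for the Redis cache the paper uses.

```python
# Hypothetical sketch of predictive cache pre-warming with a Random Forest.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n_items, n_windows = 50, 8

# Synthetic access log: one row per content item, one column per
# time window of request counts (stands in for real edge access logs).
base = rng.integers(1, 100, size=n_items)
history = np.column_stack([rng.poisson(base) for _ in range(n_windows)])

# Features: counts over the first n_windows-1 windows; target: the last
# window, i.e. "given recent history, how many requests come next?"
X, y = history[:, :-1], history[:, -1]
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X, y)

# Forecast demand for the next window and pre-warm the top-5 items.
predicted = model.predict(X)
top_k = np.argsort(predicted)[::-1][:5]

# A plain dict stands in for the Redis edge cache
# (in the paper's setup this would be e.g. redis.Redis().set(key, value)).
edge_cache = {f"content:{i}": f"payload-{i}" for i in top_k}
print(sorted(edge_cache))
```

In a live deployment the same loop would run periodically: retrain or update the model on the latest access window, re-rank content, and refresh the Redis keys (with TTLs) before demand arrives, which is what lifts the hit rate relative to reactive LRU/LFU eviction.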

Article Details

How to Cite
Ashlesha Gupta. (2023). AI-Driven Predictive Caching for Edge Computing. International Journal on Recent and Innovation Trends in Computing and Communication, 11(5), 580–591. Retrieved from https://ijritcc.org/index.php/ijritcc/article/view/11769