Performance Analysis of Edge Computing Architectures for Low-Latency Networking

Authors

  • Ramvriksh Benipuri, Presidency University, Bangalore, Karnataka, India

DOI:

https://doi.org/10.15662/IJARCST.2025.0804001

Keywords:

Edge computing, low-latency networking, performance analysis, distributed offloading, serverless edge, fog computing, performance inversion, edge vs cloud, response time, edge benchmarking

Abstract

Edge computing has emerged as a cornerstone of low-latency networking, promising to minimize response times by processing data close to its source. This paper critically analyzes edge computing architectures, evaluating their performance in scenarios that demand real-time responsiveness. It integrates findings from architecture proposals, analytic studies, experimental evaluations, and comparative benchmarks, highlighting the contexts in which edge outperforms centralized cloud solutions—and those in which it does not.

We propose a taxonomy of evaluation metrics—latency, tail latency, queuing delay, throughput, resource utilization, and scalability—and outline criteria for architectural comparison. Core contributions include an in-depth review of: (1) distributed offloading frameworks optimizing stateless task execution at edge nodes, which demonstrate delays similar to or better than centralized systems with lower network usage [1]; (2) analytic and experimental studies revealing "performance inversion," where edge systems under high utilization yield worse end-to-end latency than cloud alternatives [2] (see the worked sketch below); (3) performance analysis of serverless platforms deployed on resource-constrained edge hardware (e.g., Raspberry Pi clusters), where OpenFaaS delivered lower response times than cloud offerings, albeit with reliability trade-offs [3]; (4) benchmarking across edge, fog, and cloud layers in object detection workloads, where the fog layer outperformed both and the edge lagged significantly due to limited compute resources [4].
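To make the inversion mechanism in (2) concrete, consider a minimal single-queue (M/M/1) sketch; the parameter values below are illustrative assumptions, not figures from the cited studies. Modeling each tier as one server, the mean end-to-end response time at offered load λ is

    T_edge(λ) = d_e + 1/(μ_e − λ),    T_cloud(λ) = d_c + 1/(μ_c − λ),

where d is the network round-trip delay to the tier and μ its service rate. With assumed values d_e = 5 ms, d_c = 40 ms, μ_e = 100 req/s, and μ_c = 1,000 req/s, the edge wins at λ = 10 req/s (T_edge ≈ 16 ms versus T_cloud ≈ 41 ms), but at λ = 95 req/s queuing at the slower edge processor dominates (T_edge ≈ 205 ms versus T_cloud ≈ 41 ms), reproducing the inversion.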

The paper presents an evaluation methodology combining realistic workloads, diverse hardware profiles, and layered comparisons. Key findings underscore that while edge deployments bring latency advantages under light load, their constrained compute can limit performance under stress. Energy-aware and hybrid edge-cloud models offer balanced solutions for industrial applications [5]. Future work should address dynamic resource provisioning, quantify the queuing thresholds at which performance inversion occurs, and develop adaptive offloading strategies.
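As a sketch of how such layered comparisons can be instrumented, the Python fragment below measures median and 99th-percentile response times for an edge and a cloud endpoint. The URLs, request count, and endpoint names are placeholders, not systems from the paper, and a production harness would additionally sweep offered load, warm up connections, and issue requests concurrently.

    import statistics
    import time
    import urllib.request

    def measure(url: str, n: int = 200) -> tuple[float, float]:
        """Issue n sequential GET requests; return (median, p99) latency in ms."""
        samples = []
        for _ in range(n):
            start = time.perf_counter()
            urllib.request.urlopen(url, timeout=5).read()
            samples.append((time.perf_counter() - start) * 1000.0)
        samples.sort()
        p99 = samples[min(len(samples) - 1, int(0.99 * len(samples)))]
        return statistics.median(samples), p99

    # Placeholder endpoints: substitute the edge and cloud deployments under test.
    for tier, url in [("edge", "http://edge.local/detect"),
                      ("cloud", "https://cloud.example.com/detect")]:
        median_ms, p99_ms = measure(url)
        print(f"{tier}: median={median_ms:.1f} ms, p99={p99_ms:.1f} ms")

Sweeping the request rate in such a harness is what exposes the queuing threshold at which the inversion discussed above sets in.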

References

1. Claudio Cicconetti, Marco Conti, Andrea Passarella. "Architecture and Performance Evaluation of Distributed Computation Offloading in Edge Computing." arXiv, 2021.

2. Ahmed Ali-Eldin, Bin Wang, Prashant Shenoy. "The Hidden Cost of the Edge: A Performance Comparison of Edge and Cloud Latencies." SC21; arXiv, 2021.

3. Hamza Javed, Adel N. Toosi, Mohammad S. Aslanpour. "Serverless Platforms on the Edge: A Performance Analysis." arXiv, 2021.

4. Applied study comparing edge, fog, and cloud layers using YOLO models for object detection. MDPI.

5. Hybrid edge-cloud architecture for industrial condition monitoring. MDPI.

6. Performance evaluation metrics taxonomy for cloud, fog, and edge. IoT Journal (ScienceDirect), 2020.

7. Edge computing benefits over cloud: latency, scalability, and energy usage. Quantum Zeitgeist article referencing related studies.

Published

2025-07-01

How to Cite

Performance Analysis of Edge Computing Architectures for Low-Latency Networking. (2025). International Journal of Advanced Research in Computer Science & Technology (IJARCST), 8(4), 12449-12453. https://doi.org/10.15662/IJARCST.2025.0804001