Predicting and Optimizing Throughput in Hybrid 6G Communication Networks

6G communication networks are complex, integrating multiple subsystems, operators, services and applications. While network planning tools are useful for dimensioning the system for envisaged deployment scenarios, in-situ network management tools are valuable for exercising near real-time control mechanisms. Many of the tools recently investigated for these purposes incorporate some form of mathematical programming or machine learning.

One key metric for optimizing system performance is throughput, or data rate. Whether network operations are served by terrestrial infrastructure, non-terrestrial infrastructure, or both, end-to-end throughput is affected by individual link conditions, serving capacity per base station, application-layer parameters and the configuration of the protocol stack. This leads to complex interdependencies that factor into the objective functions and constraint sets. A recent work¹ investigates such an optimization formulation in next-generation networks capable of supporting multiple numerologies.
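To make the flavor of such a formulation concrete, the following is a minimal sketch, not the formulation from the cited work¹: a toy resource-allocation problem that maximizes total throughput by distributing a bandwidth budget across links with differing spectral efficiencies, subject to per-link serving-capacity caps. With a single budget constraint, filling links in decreasing order of efficiency is optimal. All numbers are illustrative assumptions.

```python
def allocate_bandwidth(efficiencies, caps, budget):
    """Toy throughput maximization: assign bandwidth shares alloc[i] to links
    to maximize sum(efficiencies[i] * alloc[i]) subject to
    sum(alloc) <= budget and 0 <= alloc[i] <= caps[i].
    For this single-constraint linear program, a greedy fill in decreasing
    order of spectral efficiency yields the optimum."""
    order = sorted(range(len(efficiencies)), key=lambda i: -efficiencies[i])
    alloc = [0.0] * len(efficiencies)
    remaining = budget
    for i in order:
        share = min(caps[i], remaining)
        alloc[i] = share
        remaining -= share
    throughput = sum(e * x for e, x in zip(efficiencies, alloc))
    return alloc, throughput

# Illustrative inputs: spectral efficiencies in bit/s/Hz, caps and budget in MHz.
alloc, tput = allocate_bandwidth([4.0, 2.5, 1.2], [60.0, 50.0, 40.0], 100.0)
# The most efficient link is filled to its cap first; the least efficient
# link receives nothing once the budget is exhausted.
```

A real formulation would add per-numerology subcarrier spacings, link-budget constraints and protocol-stack overheads, which is what makes the optimization in the cited work substantially harder than this sketch.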

Nonetheless, optimization techniques that operate on a snapshot basis become more effective when coupled with prediction of system parameters. To predict a performance metric such as throughput, it is first necessary to characterize its correlation properties. A first-order analysis of the relevance of machine learning algorithms to this problem has been presented in a recent publication². Applying predictive algorithms in control loops enables optimal configuration of network parameters, improving overall efficiency and meeting end users' service-level agreements.
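The correlation-characterization step can be sketched as follows. This is an illustrative example, not the analysis from the cited publication²: a synthetic throughput trace with AR(1)-style temporal correlation stands in for measured data, and the sample lag-1 autocorrelation is estimated, which is the kind of statistic that indicates whether past throughput samples carry predictive value.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic throughput trace (Mbit/s) with AR(1)-like correlation around a
# 50 Mbit/s mean; a stand-in for real measurements, with phi as the assumed
# correlation coefficient.
n, phi, mean = 500, 0.9, 50.0
trace = np.empty(n)
trace[0] = mean
for t in range(1, n):
    trace[t] = mean + phi * (trace[t - 1] - mean) + rng.normal(0.0, 1.0)

def autocorr(x, lag):
    """Sample autocorrelation of a 1-D series at the given positive lag."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    return float(np.dot(x[:-lag], x[lag:]) / np.dot(x, x))

r1 = autocorr(trace, 1)  # close to phi for a long AR(1) trace
```

A strong lag-1 autocorrelation like this one justifies simple autoregressive predictors as baselines before moving to heavier machine learning models.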

Representative datasets are a key requirement for training robust machine learning algorithms. For this purpose, system simulation environments and testbeds are critical for generating high-quality datasets. One such tool, used to characterize uplink throughput, is highlighted in the publication³. Typically, such data generation relies on appropriate, comprehensive scenario definitions with the capability to import information from real-world data; portions of the inputs are also taken from 3GPP technical reports and standards documents.
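Scenario-driven data generation can be sketched as below. This is a minimal, hypothetical example, not the simulator described in the publication³: the scenario fields and the Shannon-capacity channel model are illustrative assumptions that a real tool would replace with detailed link-level and protocol models.

```python
import math
import random

# Hypothetical scenario definition; field names are illustrative, not taken
# from any particular simulator. Real scenarios would add orbit/terrain data,
# traffic models and parameters drawn from 3GPP technical reports.
scenario = {
    "carrier_bandwidth_hz": 20e6,
    "snr_db_range": (-5.0, 25.0),
    "num_samples": 1000,
}

def shannon_throughput(snr_db, bw_hz):
    """Shannon-capacity upper bound (bit/s) used as a stand-in channel model."""
    return bw_hz * math.log2(1.0 + 10.0 ** (snr_db / 10.0))

random.seed(42)
rows = []  # (snr_db, throughput_bps) training samples
for _ in range(scenario["num_samples"]):
    snr = random.uniform(*scenario["snr_db_range"])
    rows.append((snr, shannon_throughput(snr, scenario["carrier_bandwidth_hz"])))
```

Each row pairs a sampled channel condition with the resulting throughput, i.e. exactly the feature-label structure a supervised throughput predictor is trained on.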

Author: Sahana Raghunandan

¹ S. Raghunandan, K. Koulagi, and C. Rohde, “Throughput optimization in multi-numerology 5G NR non-terrestrial networks,” IET Conference Proceedings, pp. 10–16, January 2024. [Online]. Available: https://digital-library.theiet.org/content/conferences/10.1049/icp.2024.08134.0813 (awarded certificate of commendation)

² S. Raghunandan and S. Ebrahimi, “Machine Learning based Throughput Prediction in 5G NR Non-Terrestrial Networks using System-Level Simulator,” International Communication Satellite Systems Conference, ICSSC 2024.

³ S. Raghunandan, M. Bauer, S. Roy, K. Koulagi, and C. Rohde, “Throughput characterization of 5G-NR broadband satellite networks using OMNET++ based system level simulator,” IET Conference Proceedings, pp. 115–121, January 2023. [Online]. Available: https://digital-library.theiet.org/content/conferences/10.1049/icp.2023.1371
