The International Telecommunication Union Announces a New Architecture That Incorporates ML Mechanisms
Recently, members of the Wireless Networking and AI&ML research groups presented a study into the architecture, describing it as a “new standard” for telecommunications networks.
Machine learning (ML) has been widely touted as a key enabler of future wireless networks. By taking advantage of huge data volumes, ML is expected to solve current networking problems, which grow more complex by the day.
However, current networks are not yet prepared to support ML-based applications that have specific requirements for data collection, processing, and output.
A New Architecture for Machine Learning
Now, a new architecture for machine learning in future networks (5G and beyond) has been proposed for adoption by members of the Wireless Networking and AI&ML research groups. The architecture was approved for telecommunications networks last July following recommendations from the International Telecommunication Union (ITU). The study behind the proposal was funded by StandICT.eu, an agency that promotes academic participation in and contribution to single digital market standards, such as 5G and cloud computing.
A figure showing a series of mechanisms that can facilitate the adoption of ML-based architecture. Image used courtesy of Pompeu Fabra University (UPF)
Machine Learning Applications
The study was led by Boris Bellalta and Anders Jonsson, professors at Pompeu Fabra University (UPF) in Barcelona, Spain.
With the grant, the researchers studied how ML applications can enable use cases that require transmission capacities of 10 to 20 Gbps, support for a large volume of devices, and latencies below 5 ms.
In their research, the team focused on wireless local area networks (WLANs) because they appear in a wide range of applications and deployments, from cloud-based to edge computing.
The study’s findings were published in IEEE Communications on March 18 and describe a use case involving neural networks, which the study claims can learn a series of complex patterns that current mechanisms cannot handle.