Scalability analysis of machine learning QoT estimators for a cloud-native SDN controller on a WDM over SDM network

C. Manso*, R. Vilalta, R. Munoz, N. Yoshikane, R. Casellas, R. Martinez, C. Wang, F. Balasis, T. Tsuritani, I. Morita

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Maintaining good quality of transmission (QoT) in optical transport networks is key to meeting the service level agreement between the user and the service provider. QoT prediction techniques have been used to assure the quality of new lightpaths as well as of previously provisioned ones. Traditionally, two different approaches have been used: analytical methods, which account for most physical impairments and are accurate but computationally complex, and high-margin formulas, which require far fewer computational resources at the cost of high margins. With the recent progress of machine learning (ML) together with software-defined networking (SDN), ML has been considered as a third option that can be accurate without consuming as many resources as analytical methods. SDN architectures are difficult to scale because they are usually centralized; this problem is exacerbated when QoT predictors use ML. In this paper, a solution to this issue is presented using a cloud-native architecture, and its scalability is evaluated with three different ML QoT predictors and experimentally validated on a real wavelength-division multiplexing (WDM) over spatial-division multiplexing (SDM) testbed.
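The abstract does not name the three ML QoT predictors the paper evaluates, but the general idea of ML-based QoT estimation can be sketched as a supervised model that maps lightpath features to a binary "QoT acceptable" verdict. The following is a purely illustrative toy example, not the paper's method: a k-nearest-neighbour classifier over hypothetical features (path length, number of spans, launch power), with made-up training data.

```python
# Illustrative sketch only: a toy k-nearest-neighbour QoT classifier.
# Feature choices, thresholds and training data are hypothetical,
# not taken from the paper.
import math

def knn_predict(train, query, k=3):
    """Predict a QoT label (1 = acceptable, 0 = not) for a lightpath.

    train: list of ((length_km, n_spans, launch_dbm), label) pairs
    query: (length_km, n_spans, launch_dbm) feature tuple
    """
    # Sort training samples by Euclidean distance to the query.
    neighbours = sorted((math.dist(x, query), y) for x, y in train)
    # Majority vote among the k nearest neighbours.
    votes = [y for _, y in neighbours[:k]]
    return 1 if sum(votes) * 2 >= len(votes) else 0

# Hypothetical training set: short lightpaths tend to have good QoT,
# long ones accumulate impairments and do not.
train = [
    ((100, 2, 0.0), 1), ((200, 3, 0.0), 1), ((300, 4, 1.0), 1),
    ((1500, 20, 0.0), 0), ((2000, 25, 1.0), 0), ((1800, 22, 0.0), 0),
]

print(knn_predict(train, (250, 3, 0.0)))    # -> 1 (near short paths)
print(knn_predict(train, (1900, 24, 0.5)))  # -> 0 (near long paths)
```

In a real deployment such a predictor would be trained on monitored lightpath data and queried by the SDN controller before provisioning; the paper's contribution concerns scaling such inference in a cloud-native controller, which this sketch does not attempt to show.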

Original language: English
Pages (from-to): 257-266
Number of pages: 10
Journal: Journal of Optical Communications and Networking
Volume: 14
Issue number: 4
DOIs
Publication status: Published - 2022 Apr 1
Externally published: Yes

ASJC Scopus subject areas

  • Computer Networks and Communications
