Learning EPON delay models from data: A machine learning approach

Publisher: Optical Society of America
There have been many studies focused on characterizing the upstream delay in time-division multiplexing passive optical networks (TDM-PONs). However, most of them focus on finding equations for the average delay and ignore other useful metrics, such as delay percentiles, which are of paramount interest when dimensioning PONs with delay guarantees. This work shows how to learn delay models from data using supervised machine learning (ML) techniques. Essentially, a nonlinear regression ML algorithm is trained with PON simulation data, showing that it can provide accurate equations for such metrics of interest. In particular, we obtain an R² score above 80% under Poisson traffic and above 65% under self-similar traffic, and we provide a general equation for any delay percentile in the upstream channel of a PON employing interleaved polling with adaptive cycle time. We further show its applicability in dimensioning Tactile Internet and 5G transport support scenarios.
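The approach described in the abstract, fitting a nonlinear regression delay model to simulation data with stochastic gradient descent and scoring it with R², can be sketched as follows. This is a minimal illustration on synthetic data, not the paper's PON simulation traces: the quadratic-in-load delay form, the noise level, and all hyperparameters are assumptions for demonstration only.

```python
import random

# Hypothetical ground-truth model: mean upstream delay (ms) grows
# nonlinearly with offered load rho. This is NOT the paper's model.
def true_delay(rho):
    return 1.0 + 2.0 * rho + 5.0 * rho ** 2

# Synthetic "simulation" dataset: (load, noisy delay) pairs.
random.seed(42)
data = [(rho, true_delay(rho) + random.gauss(0, 0.2))
        for rho in (random.uniform(0.05, 0.9) for _ in range(500))]

# Model: d(rho) = w0 + w1*rho + w2*rho^2, trained by SGD on squared error.
w = [0.0, 0.0, 0.0]
lr = 0.05
for epoch in range(200):
    random.shuffle(data)
    for rho, d in data:
        feats = [1.0, rho, rho ** 2]
        pred = sum(wi * f for wi, f in zip(w, feats))
        err = pred - d
        for i in range(3):
            w[i] -= lr * err * feats[i]  # gradient step on squared error

# R^2 score (coefficient of determination) of the fitted model.
preds = [sum(wi * f for wi, f in zip(w, [1.0, rho, rho ** 2]))
         for rho, _ in data]
mean_d = sum(d for _, d in data) / len(data)
ss_res = sum((d - p) ** 2 for (_, d), p in zip(data, preds))
ss_tot = sum((d - mean_d) ** 2 for _, d in data)
r2 = 1.0 - ss_res / ss_tot
print(f"learned weights: {[round(x, 2) for x in w]}, R^2 = {r2:.3f}")
```

In the paper the regression targets are delay percentiles rather than the mean, but the training loop and the R² evaluation follow the same pattern.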
Keywords: Machine learning, Multiple input multiple output, Multiplexing, Passive optical networks, Stochastic gradient descent, Wavelength assignment
Bibliographic citation
Hernández, J. A., Ebrahimzadeh, A., Maier, M., & Larrabeiti, D. (2021). Learning EPON delay models from data: a machine learning approach. Journal of Optical Communications and Networking, 13(12), 322-330.