Symbol (formal)
Minification
Computer science
Word error rate
Artificial intelligence
Algorithm
Programming language
Authors
Muhammad Iqbal,Salman Ghafoor,Arsalan Ahmad,Abdulah Jeza Aljohani,Jawad Mirza,Imran Aziz,L. Potì
Identifier
DOI: 10.3389/fphy.2024.1387284
Abstract
Short-reach optical communication networks have various applications in areas where high-speed connectivity is needed, for example, inter- and intra-data-center links, optical access networks, and indoor and in-building communication systems. Machine learning (ML) approaches provide a key solution for numerous challenging situations due to their robust decision-making, problem-solving, and pattern-recognition abilities. In this work, our focus is on utilizing deep learning models to minimize symbol error rates in short-reach optical communication setups. Various channel impairments, such as nonlinearity, chromatic dispersion (CD), and attenuation, are accurately modeled. We first address the challenge of modeling a nonlinear channel and then harness a deep learning model, the autoencoder (AE), to facilitate communication over nonlinear channels. Furthermore, we investigate how the inclusion of a nonlinear channel within an autoencoder influences the received constellation as the optical fiber length increases. Another facet of our work involves the deployment of a deep neural network-based receiver over a channel impaired by chromatic dispersion. By gradually extending the fiber length, we explore its impact on the received constellation and, consequently, the symbol error rate. Finally, we incorporate the split-step Fourier method (SSFM) to emulate the combined effects of nonlinearities, chromatic dispersion, and attenuation in the optical channel, again using a neural network-based receiver. The outcome of this work is an evaluation and reduction of the symbol error rate as the length of the optical fiber is varied.
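The autoencoder-based transceiver the abstract describes can be sketched in a few lines: an encoder maps a one-hot symbol to an I/Q point, a channel perturbs it, and a decoder classifies the received point, with the whole chain trained end to end on cross-entropy so that the symbol error rate falls out as the misclassification rate. The sketch below is a minimal illustration under stated assumptions, not the paper's implementation: the constellation size `M`, the network widths, the learning rate, and the toy memoryless nonlinear-phase-plus-AWGN channel are all placeholders (the paper propagates signals through a physically modeled fiber instead).

```python
import torch
import torch.nn as nn

M = 16  # constellation size (hypothetical choice; not given in the abstract)

class AE(nn.Module):
    """End-to-end autoencoder: one-hot symbol -> 2D (I/Q) point -> channel -> logits."""
    def __init__(self, m=M, hidden=64):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(m, hidden), nn.ReLU(), nn.Linear(hidden, 2))
        self.decoder = nn.Sequential(nn.Linear(2, hidden), nn.ReLU(), nn.Linear(hidden, m))

    def channel(self, x, snr_db=12.0, k_nl=0.1):
        # Toy stand-in for the fiber: power-dependent phase rotation + AWGN.
        phi = k_nl * x.pow(2).sum(dim=1)                      # nonlinear phase per sample
        x = torch.stack([x[:, 0] * torch.cos(phi) - x[:, 1] * torch.sin(phi),
                         x[:, 0] * torch.sin(phi) + x[:, 1] * torch.cos(phi)], dim=1)
        sigma = (10 ** (-snr_db / 20)) / (2 ** 0.5)           # noise std per I/Q dimension
        return x + sigma * torch.randn_like(x)

    def forward(self, onehot):
        x = self.encoder(onehot)
        x = x / x.pow(2).sum(dim=1, keepdim=True).sqrt().mean()  # ~unit average amplitude
        return self.decoder(self.channel(x))

# Train on cross-entropy; the symbol error rate is the misclassification rate.
model = AE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
for step in range(2000):
    labels = torch.randint(0, M, (256,))
    loss = loss_fn(model(torch.eye(M)[labels]), labels)
    opt.zero_grad(); loss.backward(); opt.step()
with torch.no_grad():
    labels = torch.randint(0, M, (10000,))
    ser = (model(torch.eye(M)[labels]).argmax(1) != labels).float().mean()
    print(f"symbol error rate ~ {ser:.4f}")
```

Because the channel sits between the encoder and decoder inside the computation graph, gradients flow through it during training, which is what lets the learned constellation adapt its shape to the channel's nonlinearity.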
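The split-step Fourier method named in the abstract integrates the nonlinear Schrödinger equation by alternating a linear step (dispersion and attenuation, applied in the frequency domain) with a nonlinear step (Kerr phase rotation, applied in the time domain). The following is a minimal sketch, assuming a symmetric step layout and typical standard single-mode fiber parameters in the usage example; the function name and values are illustrative, not the authors' code.

```python
import numpy as np

def ssfm(field, dz, n_steps, beta2, gamma, alpha, dt):
    """Propagate a complex baseband envelope through fiber with the
    symmetric split-step Fourier method.

    field   : complex samples of the optical envelope [sqrt(W)]
    dz      : step size [m]; total length = dz * n_steps
    beta2   : group-velocity dispersion [s^2/m]
    gamma   : Kerr nonlinearity coefficient [1/(W*m)]
    alpha   : power attenuation [1/m] (field decays as alpha/2)
    dt      : sample spacing [s]
    """
    omega = 2 * np.pi * np.fft.fftfreq(field.size, d=dt)       # angular frequency grid
    half_lin = np.exp((1j * beta2 / 2 * omega**2 - alpha / 2) * dz / 2)
    for _ in range(n_steps):
        field = np.fft.ifft(np.fft.fft(field) * half_lin)            # linear half step
        field = field * np.exp(1j * gamma * np.abs(field)**2 * dz)   # nonlinear full step
        field = np.fft.ifft(np.fft.fft(field) * half_lin)            # linear half step
    return field

# Example: a 50 ps Gaussian pulse over 10 km of fiber with typical SMF values
# (beta2 = -21.7 ps^2/km, gamma = 1.3 /(W*km), alpha = 0.2 dB/km ~ 4.6e-5 /m).
t = (np.arange(2**12) - 2**11) * 1e-12                         # 1 ps sampling grid
pulse = np.sqrt(1e-3) * np.exp(-t**2 / (2 * (50e-12)**2))      # ~1 mW peak power
out = ssfm(pulse.astype(complex), dz=100.0, n_steps=100,
           beta2=-21.7e-27, gamma=1.3e-3, alpha=4.6e-5, dt=1e-12)
```

Sweeping `n_steps` (i.e., the fiber length) while keeping `dz` fixed reproduces the kind of length-dependent constellation distortion the paper evaluates at the neural network receiver.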