Delay-based reservoir computing is a neuromorphic computing technique employing a single nonlinear node, in this case a semiconductor laser subjected to delayed optical self-feedback. This implementation of reservoir computing, while versatile and highly performant, is limited in processing speed by the delay length. In an effort to reduce the required delay-line length while maintaining the same strong performance, we investigate whether it is possible to distribute the processing power over more than one longitudinal mode. We therefore numerically analyse a rate-equation model of a semiconductor laser with two longitudinal modes, delayed optical feedback and optical injection. Our modelling takes into account the carrier inversion moments that represent the carrier gratings formed in the cavity during operation. This allows us to study the importance of modal interaction via carrier gratings and how it affects the computational power. We show that the setup with a dual-mode laser and an optimized reservoir parameter space can reduce the delay line fourfold in comparison with a single-mode laser setup.
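For illustration, the sketch below integrates a normalized two-mode rate-equation model with delayed self-feedback and constant optical injection into one mode, in the spirit of the setup described above. All parameter values, the Euler integration scheme, and the phenomenological cross-saturation coefficient beta (a stand-in for the carrier-grating / inversion-moment coupling) are assumptions for demonstration; this is not the paper's actual model.

import numpy as np

# Minimal sketch (assumed parameters throughout): two longitudinal modes
# E[0], E[1] share one carrier inversion n, each receives its own field
# back after a delay tau, and mode 0 additionally receives constant
# optical injection. beta is a phenomenological grating-induced
# cross-saturation, not the paper's inversion-moment formulation.

alpha = 3.0            # linewidth enhancement factor (assumed)
eta   = 0.05           # feedback rate (assumed)
k_inj = 0.02           # injection strength into mode 0 (assumed)
tau   = 40.0           # feedback delay, in photon lifetimes (assumed)
T     = 200.0          # carrier-to-photon lifetime ratio (assumed)
P     = 0.5            # pump parameter above threshold (assumed)
g     = np.array([1.0, 0.97])   # slightly asymmetric modal gains (assumed)
beta  = 0.3            # cross-saturation via carrier grating (assumed)
dt    = 0.01

nd = int(tau / dt)                        # delay expressed in time steps
steps = 12 * nd
E = np.full((steps + nd, 2), 1e-3 + 0j)   # modal fields, including history
n = 0.0                                   # shared carrier inversion

for k in range(nd, steps + nd - 1):
    I = np.abs(E[k]) ** 2                 # instantaneous modal intensities
    # field equations: gain, delayed self-feedback per mode,
    # constant injection into mode 0
    dE = (1 + 1j * alpha) * g * n * E[k] + eta * E[k - nd]
    dE[0] += k_inj
    # carrier equation: both modes deplete the shared inversion;
    # beta adds an extra grating-induced cross-saturation term
    sat = I.sum() + 2 * beta * np.sqrt(I[0] * I[1])
    dn = (P - n - (1 + 2 * n) * sat) / T
    E[k + 1] = E[k] + dt * dE
    n += dt * dn

print("steady modal intensities:", np.abs(E[-1]) ** 2)

In a full delay-based reservoir scheme, the input would additionally be time-multiplexed onto the injection (or pump) through a mask, with virtual nodes read out from the field at sub-delay intervals; that machinery is omitted here for brevity.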
Original language: English
Pages (from-to): 1-9
Number of pages: 9
Journal: IEEE Journal of Selected Topics in Quantum Electronics
Volume: 25
Issue number: 6
Publication status: Published - 11 Nov 2019
