Access conditions: restricted access by publisher policy.
Ultrafast optical communication is the backbone of high-speed global networking infrastructure. Optical time division multiplexing (OTDM) is a widely used technique for embedding data from many simultaneous users on a single optical channel. This paper studies the optimal clock signal used in optical time gating to extract the data of the desired user in an OTDM network. We show that the pulse width of the clock signal can be optimized to achieve a minimum bit error rate (BER) in these networks. We assume that the optical clock signal used for time gating has jitter, so there is a delay variation between the clock and data signals, and we model this delay as a zero-mean Gaussian random variable. Using this model, an analytical BER expression is derived for systems with Gaussian pulses. In the numerical results, we find the optimal clock pulse width by evaluating the BER as a function of the pulse width for different variances of the delay. Simulation results are also presented to evaluate the accuracy of the analytical expression.
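The abstract's setup can be illustrated with a simplified Monte Carlo sketch: the clock-to-data delay is drawn as a zero-mean Gaussian random variable, both the data pulse and the clock gate are modeled as Gaussian pulses, and the BER is estimated as a function of the clock pulse width. This is not the paper's analytical derivation or actual receiver model; all parameters (pulse widths, slot spacing, SNR, decision threshold) are illustrative assumptions.

```python
import numpy as np

def gated_energy(delta, t_clock, t_data=1.0):
    """Closed-form overlap integral of two Gaussian pulses offset by delta:
    integral of exp(-t^2 / (2 t_data^2)) * exp(-(t - delta)^2 / (2 t_clock^2)) dt.
    Used here as a stand-in for the energy passed by the time gate."""
    s2 = t_data**2 + t_clock**2
    return np.sqrt(2.0 * np.pi) * t_data * t_clock / np.sqrt(s2) \
        * np.exp(-delta**2 / (2.0 * s2))

def ber_estimate(t_clock, sigma_jitter, slot=8.0, snr=20.0,
                 n=200_000, seed=0):
    """Monte Carlo BER estimate under a zero-mean Gaussian delay jitter model.
    slot, snr, and the 0.5-peak threshold are hypothetical choices."""
    rng = np.random.default_rng(seed)
    delta = rng.normal(0.0, sigma_jitter, n)        # clock-data delay jitter
    sig = gated_energy(delta, t_clock)              # desired-user gated energy
    xtalk = gated_energy(delta - slot, t_clock)     # adjacent-slot leakage
    noise = rng.normal(0.0, gated_energy(0.0, t_clock) / snr, n)
    bits = rng.integers(0, 2, n)
    y = bits * sig + xtalk + noise                  # simplistic decision statistic
    thr = 0.5 * gated_energy(0.0, t_clock)          # threshold at half the peak
    return float(np.mean((y > thr) != (bits == 1)))

# Sweeping t_clock for a fixed jitter variance exposes the trade-off the
# paper optimizes: a narrow gate is sensitive to jitter, a wide gate
# admits more crosstalk and noise.
widths = np.linspace(0.3, 3.0, 10)
bers = [ber_estimate(w, sigma_jitter=0.8) for w in widths]
best_width = widths[int(np.argmin(bers))]
```

Evaluating `bers` over the sweep and taking the argmin mimics the paper's numerical procedure of locating the clock pulse width that minimizes the BER for a given delay variance.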
Citation: Yazdani, A.; Rincon, D.; Sallent, S. "Efficient time gating in ultrafast optical TDM networks." Photonic Network Communications, December 2014, vol. 28, no. 3, pp. 218-224.