The Basics: Optical Receiver Stress Test

Blog post by stmichel, May 9, 2017

The fundamental test for these network elements is the bit error ratio (BER) test, which demonstrates reliable operation in digital data transmission systems and networks. The basic principle is simple: known transmitted bits are compared with the received bits over a transmission link that includes the device under test. The bit errors are counted and divided by the total number of transmitted bits to give the bit error ratio.

The applied test signal can be degraded with defined stress parameters, such as transmission line loss or horizontal and vertical distortion, to emulate worst-case operating scenarios under which the device under test must still demonstrate error-free data transmission. This test is of fundamental importance for receiving network elements because of the manifold impairments that occur on optical transmission lines. Many optical transmission standards therefore define a stressed receiver sensitivity test on the basis of a BER measurement. The basic test methods and setups are usually very similar; however, the test conditions, stress parameters, and methods of stress generation vary from standard to standard, depending on the application area, transmission medium, data rate, and data protocol.
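The BER principle described above can be sketched in a few lines of Python. The pattern length and the injected error positions below are purely illustrative, not taken from any standard:

```python
# Minimal sketch of the BER principle: compare a known transmitted
# bit pattern with the received pattern, count mismatches, and divide
# by the total number of bits.

def bit_error_ratio(transmitted, received):
    """Return (error_count, BER) for two equal-length bit sequences."""
    if len(transmitted) != len(received):
        raise ValueError("sequences must have equal length")
    errors = sum(t != r for t, r in zip(transmitted, received))
    return errors, errors / len(transmitted)

tx = [1, 0, 1, 1, 0, 0, 1, 0] * 1000   # known test pattern (8000 bits)
rx = tx.copy()
rx[42] ^= 1                            # inject two bit errors to emulate
rx[4711] ^= 1                          # a degraded transmission link
errors, ber = bit_error_ratio(tx, rx)
print(errors, ber)                     # 2 errors out of 8000 bits -> BER = 2.5e-4
```

In a real stressed receiver sensitivity test the pattern would be a standardized test sequence and the "injected errors" would arise from the defined stress conditioning, but the counting and ratio calculation work exactly as shown.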


If you would like to know about the basic setup and the different stress conditioning scenarios for 40GBASE-LR4, 100GBASE-LR4, ER4 and others in detail, continue reading here:


Lightwave Catalog 2018, page 16



Definition of optical parameters

OMA: Optical Modulation Amplitude — the difference between the average optical power levels of a logic "1" and a logic "0", measured in [μW]

A0: Vertical eye opening ("innermost eye opening at center of eye") [dBm or μW]

VECP: Vertical Eye Closure Penalty — the ratio of OMA to the inner eye opening A0, expressed in dB