The terms “bit error rate” and “bit error ratio” are used interchangeably in many websites and publications. However, their definitions are quite different, and understanding the difference will help you analyze your system’s performance effectively.

# What is a BERT?

To find the bit error rate or bit error ratio of your system, you need a Bit Error Rate Tester or Bit Error Ratio Tester (BERT). BERT refers to a class of test equipment; depending on the manufacturer or distributor, BERT stands for Bit Error Rate Tester or Bit Error Ratio Tester. A BERT tests the complete transmitter/receiver system for data loss. It transmits data into the system and then measures how well the system transmits and receives that data. To do this, a BERT requires a pattern generator and an error detector.

# What is the difference?

There is a very clear difference between the error ratio and rate. Understanding that difference is important to assess your system performance.

**Bit error ratio** (BER) is the number of bit errors divided by the total number of bits transferred during a specific time interval.

**Bit error rate** (also BER) is the number of bit errors per unit time.

Essentially, the bit error rate refers to errors with respect to time, and the bit error ratio refers to errors with respect to the quantity of transferred bits.

The bit error ratio is a unitless performance measure and is often expressed as a percentage. It is an estimate of the bit error probability, which is the expected value of the bit error ratio. This estimate is more accurate over a longer time interval and when capturing a high number of bit errors.

# Why is it important to differentiate?

It’s important to differentiate between bit error rates and bit error ratios. If your BERT pattern generator sends 100 bits to your device under test and your BERT error detector sees 10 errors, the bit error ratio is 10 percent.

The bit error rate is the bit error ratio multiplied by the bit rate. For example, if your BERT pattern generator sends bits to your unit under test at a rate of 100 bits/second and your BERT error detector sees 10 errors every 100 bits, the bit error rate is 10 errors/second.
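The two formulas above can be sketched in a few lines of Python. The helper names here are illustrative, not from any BERT vendor API; the numbers reproduce the article’s example of 10 errors in 100 bits at 100 bits/second.

```python
def bit_error_ratio(errors: int, total_bits: int) -> float:
    """Bit error ratio: errors divided by total bits transferred (unitless)."""
    return errors / total_bits

def bit_error_rate(ratio: float, bits_per_second: float) -> float:
    """Bit error rate: errors per second, i.e. the ratio times the bit rate."""
    return ratio * bits_per_second

# The article's example: 10 errors observed in 100 bits at 100 bits/second.
ratio = bit_error_ratio(10, 100)   # 0.1, i.e. 10 percent
rate = bit_error_rate(ratio, 100)  # 10 errors/second
```

Keeping the two quantities in separately named functions makes it hard to confuse the unitless ratio with the time-based rate.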

The bit error rate is used more often because it tells you how long it will take, on average, to encounter an error. By contrast, the 0.1 bit error ratio calculated above tells you only the ratio between errors received and the number of data bits sent.

But what does knowing the bit error ratio really tell you about your system’s performance? Not much: you also need to know your data rate. If your system’s data rate were 1 bit per week, the calculated bit error rate would be only one error every 10 weeks. If instead your data rate were 100 Gbits/second, the bit error rate would be 10,000,000,000 errors every second!
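The contrast above is easy to verify numerically. This is a small sketch, assuming the same 0.1 bit error ratio; it converts that ratio into an error rate (and an average time between errors) at the two data rates the article mentions.

```python
ratio = 0.1  # bit error ratio from the earlier example

# Slow link: 1 bit per week -> 0.1 errors/week -> one error every 10 weeks.
slow_bits_per_week = 1
weeks_per_error = 1 / (ratio * slow_bits_per_week)

# Fast link: 100 Gbits/second -> 10,000,000,000 errors every second.
fast_bits_per_second = 100e9
errors_per_second = ratio * fast_bits_per_second
```

The same ratio yields wildly different error rates, which is why the ratio alone says little about real-world performance.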

# Summary

The bit error ratio is the number of bit errors divided by the total number of bits transferred during a specific time interval. The bit error rate is the number of bit errors per unit time. In short, the ratio relates errors to the quantity of bits transferred, while the rate relates errors to time. Visit Keysight.com to learn more about Keysight’s bit error ratio test options.