Path: EDN Asia >> Design Centre >> Test & Measurement >> Revisiting the history of jitter: The early years

Revisiting the history of jitter: The early years

19 May 2015  | David Maliniak


Jitter is a signal-integrity gremlin that has been around for a long time. In fact, it's been with us since before anyone really needed to care about it. But as time has worn on, our perception of jitter has certainly changed, and with it our approaches to diagnosing it, measuring it, and ultimately dispatching it.

Here, we'll begin a traversal of the "jitter story," surveying where we've been, where we are, and where we may be going in our dealings with the phenomenon. There's no simple, straight path through the history of jitter. Rather, it's a story of numerous instruments, inventors, and twists and turns. We know, however, that it is born of the ascent of serial data rates from a 45-baud telegraph receiver to the venerable 9-pin serial port to optical fibre carrying signals out to 160 Gbaud and up (figure 1).

Along the way, we've seen real-time oscilloscopes, sampling oscilloscopes, time-interval analysers, phase-noise analysers, and bit-error-rate (BER) testers thrown at the problem in our efforts to understand and tame it.


Figure 1: The story of jitter spans 45-baud telegraph machines to 160 Gbaud optical fibre.


To take a step back for a moment, why do we care about jitter? The short version: It causes bit errors. Fundamentally, jitter is a horizontal (or time-based) phenomenon in which the edges of waveform transitions arrive early or late with respect to the clock that is latching the signal. If, for instance, the data edge arrives after its companion clock edge, then a bit that was supposed to be latched as high will be latched as low (figure 2). Wrong edge timing begets incorrect latching which begets bit errors.


Figure 2: Jitter happens when data edges and their associated clock signals aren't marching in step.
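The latching failure described above can be sketched in a few lines of Python. This is purely an illustrative model (the function and its parameters are our own, not from any standard): a flip-flop captures whatever value is on the data line at the moment of the clock edge, so a data transition delayed past that edge means the stale bit gets latched.

```python
def latch_at_clock_edge(clock_edge_time, data_edge_time, old_bit, new_bit):
    """Return the bit an idealised flip-flop captures at clock_edge_time,
    given that the data line transitions old_bit -> new_bit at data_edge_time."""
    # If the data edge has already occurred by the clock edge, the new bit
    # is captured; otherwise the old (stale) bit is latched instead.
    return new_bit if data_edge_time <= clock_edge_time else old_bit

# On-time data edge: the intended high bit is latched correctly.
print(latch_at_clock_edge(clock_edge_time=1.0, data_edge_time=0.9,
                          old_bit=0, new_bit=1))  # -> 1

# Jittered data edge arriving late: the stale low bit is latched -- a bit error.
print(latch_at_clock_edge(clock_edge_time=1.0, data_edge_time=1.1,
                          old_bit=0, new_bit=1))  # -> 0
```

The model ignores metastability and finite aperture windows; it only captures the core point that edge timing, not voltage, is what goes wrong with jitter.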


In the early days of digital logic—the 1960s—the issue surrounding timing measurements and proper latching concerned setup and hold times. Investigation of setup and hold performance was relatively straightforward, even with the analogue oscilloscopes of the day. One would trigger on the clock and measure the time from one edge to the next using cursors. In other words, you'd try to duplicate the timing diagrams on the datasheet to see if you fell within the requisite timing margins.
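The cursor measurement just described boils down to a simple check: the data edge must land outside the forbidden window defined by the setup and hold times around the clock edge. A minimal sketch, with hypothetical datasheet numbers standing in for a real part's specs:

```python
def meets_timing(clock_edge, data_edge, t_setup, t_hold):
    """True if the data transition is at least t_setup before the clock edge,
    or at least t_hold after it -- i.e., outside the setup/hold window."""
    settles_early_enough = (clock_edge - data_edge) >= t_setup
    changes_late_enough = (data_edge - clock_edge) >= t_hold
    return settles_early_enough or changes_late_enough

# All times in nanoseconds; t_setup/t_hold are made-up datasheet values.
print(meets_timing(clock_edge=10.0, data_edge=7.0, t_setup=2.0, t_hold=1.0))  # True
print(meets_timing(clock_edge=10.0, data_edge=9.5, t_setup=2.0, t_hold=1.0))  # False
```

This is exactly what the analogue-scope user was doing by eye: triggering on the clock, placing cursors on the edges, and comparing the measured interval against the datasheet margins.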




