Since you mentioned Nortel, I'm guessing you're Canadian and I think "bit error ratio" must be a Canadian thing. In 25 years of comms engineering in the U.S., I have always heard BER defined as "bit error rate."
True KB3001, FPGAs continue to eat into the ASIC market, but the reason mostly has to do with the cost of designing an ASIC going through the roof, not with high-speed link technology. Ten years ago I worked on ASICs with 2.5-10 Gb/s I/Os while FPGAs at the time could deliver 1 Gb/s, so there was a gap. But that gap still exists: FPGAs can do 10 Gb/s while highly specialized ASICs can do 100 Gb/s (the numbers above are per differential pair; you can always increase bandwidth by going more parallel). One can of course argue that 10 Gb/s per two pins is sufficient, so that gap is less relevant and FPGAs are on par. ASIC development cost, however, used to be in the single millions of dollars and is now in the several millions, so the TAM required to justify the cost exists in only a very small number of system-level sockets. Hence everyone is using FPGAs unless it is a cell phone, PC, or Ethernet switch...Kris
It's Bit Error RATIO, not Bit Error Rate.
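Whichever word you prefer, the quantity itself is dimensionless. A minimal sketch of the distinction (the values here are hypothetical, not from any real measurement):

```python
def bit_error_ratio(errored_bits: int, total_bits: int) -> float:
    """BER as a dimensionless ratio: errored bits / total bits compared."""
    if total_bits <= 0:
        raise ValueError("total_bits must be positive")
    return errored_bits / total_bits

# Hypothetical example: 3 errors observed over 10^12 compared bits.
ber = bit_error_ratio(errored_bits=3, total_bits=10**12)  # 3e-12, a ratio

# A true "rate" (errors per second) would need the line rate as well:
line_rate_bps = 10e9                 # assumed 10 Gb/s link
errors_per_second = ber * line_rate_bps
```

The "ratio" camp's point is that BER alone carries no time dimension; only when multiplied by the bit rate does it become errors per unit time.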
Li is waving the pom-poms again, clumsily using ex-CTO credentials to claim he is "putting a big instrument in a tiny box inside the chip", when Nortel was shipping this BER-enhancing eye-profiling technology in multi-gigabit chips in the early 1990s, based on Tremblay et al.'s US patent 4,823,360 (1988), followed by NEC, Cisco, JDS, Vitesse, and others with patents in the same area. The eye asymmetries owing to the nature of fiber-optic AFEs also seem to be a revelation for Li, but they are day-to-day life for the customers he has probably never been in the lab with. [yawn]
@zeeglen - chip inductors have been used for on-chip oscillators for over two decades. In 28nm, they take up a huge amount of chip area in terms of transistor count. You are also FOS, and the kid was right, about seeing eye-closing phase hits that happen once in tens of minutes if not hours.
Eye patterns are useful at any data rate, but it takes familiarity and experience to interpret them. I remember a meeting where a young guy just out of school claimed that one could not simply glance at an eye pattern and classify it as good, acceptable, or bad. I had to set him straight that maybe kids fresh out of school could not, but those who have looked at eye patterns for 30 years certainly can.
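The folding that produces an eye pattern is mechanically simple; the experience goes into reading it. A toy sketch of the idea, assuming an NRZ waveform with arbitrary +/-1 levels and a made-up samples-per-UI figure:

```python
import random

def fold_into_eye(samples, samples_per_ui):
    """Overlay a sampled waveform into 2-UI-wide traces (the 'eye')."""
    window = 2 * samples_per_ui
    return [samples[i:i + window]
            for i in range(0, len(samples) - window + 1, samples_per_ui)]

def eye_opening(traces, samples_per_ui):
    """Vertical opening at eye center: lowest 'one' minus highest 'zero'."""
    center = samples_per_ui  # middle of the 2-UI window
    highs = [t[center] for t in traces if t[center] > 0]
    lows = [t[center] for t in traces if t[center] <= 0]
    if not highs or not lows:
        return 0.0
    return min(highs) - max(lows)

# Assumed toy signal: random NRZ bits at +/-1, 8 samples per UI,
# light additive noise. Real waveforms would come from a scope capture.
random.seed(0)
spu = 8
bits = [random.choice([-1.0, 1.0]) for _ in range(200)]
samples = [b + random.gauss(0, 0.05) for b in bits for _ in range(spu)]
traces = fold_into_eye(samples, spu)
opening = eye_opening(traces, spu)  # close to 2.0 here: a wide-open eye
```

The glance-and-classify skill the veterans are claiming amounts to judging this vertical (and horizontal) opening, plus the shape of the crossings, without computing anything.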
Fascinating stuff to those of us who are not on the cutting edge. I remember eye patterns from 50 baud FSK systems and 9600 bps Codex modems. (God, I sound like a real old fart....) As data rates got faster they seemed to be done away with. Nice to see them still being used at 10 Gb/s. I always reckoned they were immensely valuable; you could see from the pattern exactly what was wrong with your link.
Noticed something else in the same paragraph that is of interest. Does Altera need an external copper inductor, or is the L part of the internal silicon?
The use of an LC tank in a VCO for PLL clock recovery is not new; it was a conventional method long before the rickety voltage-controlled ring oscillator was ever used. But if the L has been incorporated into the silicon, then yes, that is relatively new. If so, might you have a link to Altera publications describing this technique in more detail?
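For anyone following along: the tank's oscillation frequency follows the usual ideal-LC resonance formula, and plugging in plausible (assumed, not Altera-published) on-chip values shows why a ~1 nH spiral inductor is in the right ballpark for multi-GHz serdes clocks:

```python
import math

def lc_resonant_freq_hz(inductance_h: float, capacitance_f: float) -> float:
    """Ideal LC tank resonance: f = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(inductance_h * capacitance_f))

# Illustrative values only: a ~1 nH on-chip spiral inductor with
# ~0.2 pF of total tank capacitance resonates near 11 GHz.
f = lc_resonant_freq_hz(1e-9, 0.2e-12)
```

In practice the varactor portion of that capacitance is what the PLL tunes, and losses in the silicon substrate limit the inductor's Q, which is the usual objection to integrating the L on-chip.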
Drones are, in essence, flying autonomous vehicles. The pros and cons surrounding drones today might well foreshadow the debate over the development of self-driving cars. In the context of a strongly regulated aviation industry, "self-flying" drones pose a fresh challenge. How safe is it to fly drones in different environments? Should drones be required to maintain visual line of sight, as piloted airplanes are? Join EE Times' Junko Yoshida as she moderates a panel of drone experts.