Wednesday, May 07, 2008

Why GHz is So Last Century

GigaHertz. It's a billion cycles per second. It wasn't that long ago that we were impressed by MegaHertz. A PC that ran at 100 MHz was really something when yours was running at 20 MHz. Now that every processor has ramped up to over a GHz, anything tagged MegaHertz seems ancient. But what if GHz is just a step on the way to something even faster? When we get to THz or TeraHertz, GigaHertz is going to look like yesterday's news. But, honestly, who needs a TeraHertz anything?

Back in the days when personal computers were young, it was said that Bill Gates chose 640 KB of RAM as the upper limit for the Microsoft operating system because it was so much that nobody would ever use it all. That actually made sense when the first 8-bit personal computers had a 64K memory limit. I remember drooling at the thought of 64K when 16K was all I could afford. Since 640K is ten times the 64K maximum of early PCs, it stood to reason that it was plenty for the text, spreadsheet and word processing applications of the time.

The problem came when software technology advanced, as it has a way of doing. Now people wanted graphics and pictures in addition to plain text. Desktop publishing really helped this along. When the Internet opened up to the general public, and especially to e-commerce, the hardware fell way, way behind what the applications demanded. It's taken a couple of decades to get to the point where PC platforms have matured and every ad you see no longer boasts of more RAM, hard disk space or processor speed. The one exception seems to be the Microsoft Vista operating system, which still brings relatively new computers to their knees.

The current benchmark is Giga-everything. Your computer needs a couple of Gigabytes of RAM, a few hundred Gigabytes of hard drive and a GigaHertz or two of processor speed for GigaFlops of processing power. Internet bandwidth is playing catch-up, but heading the same way. A single Mbps or a T1 line at 1.5 Mbps just isn't enough anymore. We want 10, 100 or 1,000 Mbps. A full Gbps no longer seems gluttonous for high tech companies with heavy demand or cutting edge services.

In the meantime, the world is moving on to Tera territory. You can buy 1 TB or Terabyte hard drives now in many consumer electronics stores. That's 1,000 Gigabytes. Internet backbones are being upgraded toward Terabit per second capability to handle the deluge of packets that will soon be coursing through their fibers. Will processing and memory keep up?
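To get a feel for the scale, here's a quick back-of-envelope sketch in Python showing how long it would take to move that 1 TB over links of different speeds. The link speeds listed and the decimal definition of a Terabyte are illustrative assumptions on my part; real-world throughput with protocol overhead would be lower.

# Rough transfer times for a 1 TB (decimal, 10^12 bytes) drive
# over links of various speeds. Ignores protocol overhead.
TERABYTE_BITS = 1e12 * 8  # 1 TB expressed in bits

link_speeds_bps = {
    "T1 (1.5 Mbps)": 1.5e6,
    "Fast Ethernet (100 Mbps)": 100e6,
    "Gigabit Ethernet (1 Gbps)": 1e9,
    "Terabit backbone (1 Tbps)": 1e12,
}

for name, bps in link_speeds_bps.items():
    hours = TERABYTE_BITS / bps / 3600
    print(f"{name}: {hours:,.2f} hours to move 1 TB")

At T1 speeds that's a couple of months of continuous transfer; at a full Terabit per second it's about eight seconds.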

A recent news report described a development at the University of Utah: a waveguide/splitter engineered to work with frequencies in the 0.3 to 10 THz range. Supercomputers already have processing power in the tens and hundreds of Teraflops (a Teraflop is a thousand billion floating point operations per second). Intel and Cray are now working on supercomputers capable of operating in the Petaflop range, or quadrillions of flops.

These enormous speeds will be achieved initially by bundling or paralleling existing technologies. It's much less expensive to use thousands of off-the-shelf microprocessors to build a supercomputer than to engineer individual THz processors. Tbps transmission rates will come by bonding the bandwidth of multiple wavelengths within a fiber and multiple fiber strands within a bundle. But eventually, technology will catch up and these performance levels will seem mundane and available off the shelf.
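As a rough sketch of the bonding arithmetic (the per-wavelength rate, channel count and strand count below are hypothetical examples, not a description of any particular carrier's equipment):

# Aggregate capacity from bonding: multiple wavelengths per fiber,
# multiple fiber strands per bundle. All numbers are assumed examples.
per_wavelength_gbps = 10     # e.g., one 10 Gbps optical channel
wavelengths_per_fiber = 40   # assumed dense WDM channel count
fibers_in_bundle = 4         # assumed strands bonded together

aggregate_gbps = per_wavelength_gbps * wavelengths_per_fiber * fibers_in_bundle
print(f"Aggregate capacity: {aggregate_gbps} Gbps ({aggregate_gbps / 1000:.1f} Tbps)")

Nothing in that bundle runs anywhere near a TeraHertz; the Terabit comes from multiplying many slower channels together.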

As in the early days of PC computing, it seems a bit hard to understand why business and home users will be demanding Terabyte and TeraHertz gear. The real obstacle to seeing what the future requires is our outdated benchmarks. To put advances in speed "into perspective," researchers often boast about how fast their new development can process or deliver the entire works of the U.S. Library of Congress. One-upmanship is now measured in how many Libraries of Congress per second can be handled.

Is anyone really wowed by this type of claim anymore? The truth is that the entire collection of written works in any library, or even in the world, is only so big. Text storage, transmission or processing is as inappropriate a benchmark today as it became once computers got color graphics screens. A better measure might be how many YouTubes or Hollywood studio libraries can be handled per second. In an age when everything is moving toward high definition video, Tbps, THz and TB are no longer technical extravagances.
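Here's a hedged comparison to make the point, assuming the often-quoted rough figure of 10 TB for the Library of Congress print collection and an assumed 8 GB per high definition movie:

# Text libraries versus HD video over a 1 Tbps link.
# Collection and file sizes are rough, assumed figures.
LINK_BPS = 1e12           # 1 Tbps backbone link
LIBRARY_BYTES = 10e12     # ~10 TB, an oft-cited estimate for the LoC text collection
HD_MOVIE_BYTES = 8e9      # ~8 GB per high definition movie (assumed)

seconds_per_library = LIBRARY_BYTES * 8 / LINK_BPS
movies_per_second = LINK_BPS / (HD_MOVIE_BYTES * 8)

print(f"Entire text collection: about {seconds_per_library:.0f} seconds")
print(f"HD movies per second: about {movies_per_second:.1f}")

Under those assumptions, the whole text collection moves in well under two minutes, while video demand just keeps growing. That's why text makes such a weak yardstick.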

A few days ago, I had a chance to see a demonstration of Mitsubishi's 3D HDTV televisions. These are normal DLP sets with the addition of an infrared emitter and LCD goggles. In 3D mode, the two lenses of the goggles are toggled in sync with picture fields that are slightly offset. What you see is a full color, full resolution TV scene that reaches out from the set almost to your nose. Most of the program material now is cartoons and horror films, but it looks like game consoles will support this technology in the near future. How long before Monday Night Football is televised in 3D?

Even more impressive is a new development by Mitsubishi that eliminates the goggles. They use a linear array of 16 video cameras, 16 PCs for processing and 16 projectors to display the scene on a special lenticular screen. The prototype is a little clunky, but it's real-time 3D. Getting it packaged into something that will sell is just a matter of engineering and manufacturing. So, how much bandwidth do you suppose we'll need when everybody wants big screen 3D television, videos and games in their home? How about 3D telepresence? It's coming to a conference room near you. Sooner than you might expect.
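For a sense of the bandwidth a rig like that implies, here's a rough sketch assuming uncompressed video; the resolution, frame rate and color depth are illustrative assumptions, not Mitsubishi's published specifications.

# Rough raw bandwidth for a 16-camera 3D capture rig.
# Resolution, frame rate and color depth are assumed values.
cameras = 16
width, height = 1920, 1080   # assumed HD resolution per camera
frames_per_second = 60
bytes_per_pixel = 3          # 24-bit color, uncompressed

per_camera_bps = width * height * bytes_per_pixel * 8 * frames_per_second
total_bps = per_camera_bps * cameras

print(f"Per camera: {per_camera_bps / 1e9:.2f} Gbps uncompressed")
print(f"All {cameras} cameras: {total_bps / 1e9:.1f} Gbps uncompressed")

Compression will knock those numbers down considerably, but multi-view 3D still chews through bandwidth far faster than today's single HD stream.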

Are you ready for more bandwidth, lots more bandwidth for your business? See how little you'll pay today for even Gigabit Ethernet connections. TeraE is going to take a little longer.

Click to check pricing and features or get support from a Telarus product specialist.



