If you’ve had the feeling that Internet broadband speeds have been accelerating lately, you’d be right. Broadband service was an amazing improvement on dial-up access when it was first introduced and then quickly took over. For a lot of years a few Mbps seemed plenty fast enough. Then 10 Mbps became the benchmark to shoot for. Now businesses are moving quickly to 100 Mbps and beyond, while the FCC is codifying 25 Mbps as the threshold where broadband begins. What does this really mean and what does it portend for the future?
What Does The New Standard Represent?
Official broadband standards have never been a leading indicator. They are set after a technology is well proven and the need clearly demonstrated. The move from 4 Mbps to 25 Mbps is quite a jump, and it tells you something important about where bandwidth requirements are going. Is 25 Mbps the end of the road for a while? Not likely, and here's why.
Few individuals and fewer businesses are going to pay up for higher line speeds for simple bragging rights. Something is driving the need to upgrade and then upgrade some more. The big factors are the move from local to remote resources and the nature of content itself.
The Rapid Growth of Computing Requirements
Remember message boards? Digital communications used to mean ASCII text transmissions. No need to press up against the speed of light just to get a few KB of characters to a distant server. It was that text-based model of computing that produced the famous (and likely apocryphal) remark attributed to Bill Gates that 640K of RAM was plenty for the original PC, because who could possibly need more? Well, nobody did when the migration was starting from 64 KB 8-bit processors. In just a few years, 640 KB was a joke. How long before 640 GB becomes a limitation? Probably sooner than any of us think possible.
Processor speeds have bumped up against some practical manufacturing limits, but the way around that is multiple processors. Memory capacities, both RAM and disk (now solid state), are still growing. Bandwidth? Why is anybody surprised that WAN bandwidth requirements are going up when processing capability and memory are steadily increasing?
Computers always seem on the verge of not being able to keep up with us because of the functions we ask them to do. Text? That's kid stuff. Photos? Easy. HD and 4K video? That's more of a challenge. Will 4K be fully deployed before 8K starts to take over? What then?
You might think that software is getting less sophisticated because of the relatively small size of downloaded apps compared with some of the huge software packages that used to come in fancy cardboard boxes. That's an illusion. The real power of the apps we use is in all of the back-end processing that is going on at some remote server. Now everything is being tied to locality (with GPS) and highly personalized. A lot of this personalization is subtle and automatic. The system watches your behavior instead of you having to manually input a bunch of parameters.
Big data was a big bottleneck when it had to be handled locally. Just how much of a database can you put on a PC, and how much grief is it going to be to keep up to date and accurate? Enormous databases in equally enormous data centers can present a wealth of opportunity. The way the investment in gathering and managing all this data makes sense is to keep it simple for the end user. That doesn't mean the system is simple. It just means that you and I are only dealing with the tip of the iceberg when we access these systems.
Big Data, Big Support Required
Big Data and sophisticated business applications have driven IT to a new architecture: The Cloud. The cloud pools all the processing and memory you can possibly use in a system that allows individual tenants to scale resources up and down at will. What you don’t need right now can be used by someone else. The resources they release can be put to work on your growing applications. As long as there are sufficient resources above and beyond what everybody needs at the moment, the system appears to be infinitely expandable.
What often isn’t infinitely expandable or even seems that way is bandwidth. The WAN connection, be it a dedicated line or Internet connection, has become the new bottleneck. Think of those clouds as enormous lakes full of data and you are getting your share through an old garden hose.
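To see how quickly a modest WAN link becomes the bottleneck, a rough transfer-time calculation makes the point. This is an idealized sketch: the 10 GB dataset size is an illustrative assumption, and real links lose some capacity to protocol overhead and contention.

```python
def transfer_time_seconds(size_gb: float, link_mbps: float) -> float:
    """Idealized transfer time, ignoring protocol overhead and contention."""
    size_megabits = size_gb * 8 * 1000  # 1 GB = 8,000 megabits (decimal units)
    return size_megabits / link_mbps

# Pulling a hypothetical 10 GB dataset from the cloud over various links:
for mbps in (1.5, 25, 100, 1000):
    minutes = transfer_time_seconds(10, mbps) / 60
    print(f"{mbps:>6} Mbps: {minutes:.1f} minutes")
```

At T1 speeds that transfer takes the better part of a day; at Gigabit speeds it takes about a minute and a half, which is the difference between a garden hose and a fire hose.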
The way businesses are going to continue accelerating productivity is to let the machines do more and more of the work. Paper pushing is already a thing of the past in most clerical operations. Manually filling out forms is as obsolete as standing at a drill press or using a scythe to cut grain. Make no mistake: performance will continue to increase and probably accelerate. If you don't make it happen, competitors will. That means more sophisticated processing, more mobility, more data to present in simple, usable formats, more flexibility in manufacturing (think 3D printing), and larger data communication channels.
How Much Bandwidth Is Enough?
In the long run, we have no idea how many Gbps or Tbps or Pbps will be needed. Right now, we can make some good estimates on what’s needed immediately and what that will grow to over the next few years.
Single-digit Mbps connectivity has had its day. The only place T1 lines are still appropriate is for simple point-of-sale terminals and remote locations where there really isn't any better solution. Bonding T1 lines can take you to 10 or 12 Mbps, but that's just a stopgap. You'll be needing more in the future.
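The bonding arithmetic behind those 10 to 12 Mbps figures is simple multiplication: a T1 (DS1) line runs at 1.544 Mbps, so reaching that range takes seven or eight bonded circuits. A quick sketch, ignoring the small overhead that bonding protocols add:

```python
T1_MBPS = 1.544  # DS1 line rate in Mbps

def bonded_t1_mbps(num_lines: int) -> float:
    """Aggregate capacity of n bonded T1 lines (bonding overhead ignored)."""
    return num_lines * T1_MBPS

print(bonded_t1_mbps(7))  # 10.808 Mbps
print(bonded_t1_mbps(8))  # 12.352 Mbps
```

Each additional circuit also means an additional monthly line charge, which is why bonded T1 stops making economic sense once Ethernet services become available in your area.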
If the consumer threshold for broadband is 25 Mbps, then that seems like a reasonable amount for businesses too. Granted, most business users aren’t creating or consuming a stream of HD movies all day. But they are accessing cloud applications, doing desktop or conference room video conferencing, sharing large files among business locations and running the phone system in the cloud. Remember that one consumer or a family is using that 25 Mbps. Your business demands per person may not be as consistently high, but you have lots of them on the same line. Productivity is also more of an issue when you are paying people. You don’t really want them sitting around waiting for the computers.
That argues for at least 10 Mbps for really small operations and 25 to 50 Mbps more commonly. Fast Ethernet at 100 Mbps used to be expensive and hard to come by. Now it's reasonably priced and readily available. It doesn't seem unreasonable for a medium-size office to have 100 Mbps broadband… especially when that 100 Mbps may not cost much more than the 1.5 Mbps T1 line you budgeted for when you first got broadband many years ago.
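One way to rough out an office requirement is per-user demand scaled by a concurrency factor, since not everyone hits the line at full tilt simultaneously. The 5 Mbps per-user figure and 30% concurrency below are illustrative assumptions, not carrier recommendations; plug in your own numbers.

```python
def office_bandwidth_mbps(users: int, per_user_mbps: float = 5.0,
                          concurrency: float = 0.3) -> float:
    """Rough aggregate bandwidth estimate for an office.

    Assumes each active user needs per_user_mbps at peak, and that only
    a concurrency fraction of users peak at the same moment.
    """
    return users * per_user_mbps * concurrency

print(office_bandwidth_mbps(20))  # 30.0  -> the 25-50 Mbps tier
print(office_bandwidth_mbps(60))  # 90.0  -> Fast Ethernet territory
```

By this back-of-the-envelope math, a 20-person office lands squarely in the 25 to 50 Mbps range, and a 60-person office is already knocking on the door of 100 Mbps service.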
Larger companies or sophisticated operations creating video content or doing 3D printing on an industrial scale can easily justify Gigabit Ethernet. So can school districts and anybody else with hundreds of simultaneous users.
How about really big companies? The new threshold may well be 10 Gbps Ethernet or 10 GigE. That service is readily available in metro areas and 100 GigE is starting to deploy nationwide on some carriers. This is where the upper end will be soon. Can Tbps service be far behind? It’s in development now.
Ethernet is the Way to Go Now
Note that all of these recommended services are Carrier Ethernet based. That's where the industry is going for ease of interfacing and rapid scalability. Like cloud resources, connectivity changes will be on-demand as well.
How is your company doing for broadband? Feeling the squeeze as you try to get more packets through the old lines? Feeling put upon now that the FCC has declared your connection as below broadband standards? Not to worry. Faster fiber optic bandwidth connections are plentiful and now cost much less than you might think. This is a good day to make a broadband upgrade.