Just a few years ago the notion that businesses and government agencies needed 100 Gbps WAN bandwidth seemed absurd. Why, providers' core networks ran mostly at 10 Gbps, with some moving to 40 Gbps to handle heavy traffic from many simultaneous customers. Today, those same customers expect their service providers to offer 40 Gbps and even 100 Gbps connectivity.
What could be causing this massive increase in bandwidth requirements? Two words: big data. Big data offers big opportunities to enhance business processes and mine databases, but it consumes big processing, big storage and big bandwidth. Big storage and big processing have been the focus of cloud computing providers as well as data center expansions. Big bandwidth is the last piece of the puzzle needed to create an integrated solution.
Where is all this bandwidth going? Some of it goes to interconnecting multiple business locations. Other connections go to remote data centers for disaster backup and recovery. Some is needed to communicate with customers and suppliers. In the case of content producers, large amounts of bandwidth are needed to transport that content to the local service providers, like cable companies, or directly to the end user. The latest wrinkle is the rapid rise of cloud computing solutions. Putting those cloud resources to best use is proving to require massive increases in bandwidth.
The reason that cloud computing needs big bandwidth is that most of the traffic that used to be carried on the LAN now needs to be carried over the WAN. The enterprise data center, with all its processors and disk drives, has traditionally been located close to its users so that the organization can control the network connections. That has a dual benefit: it minimizes costs, because there are no carrier traffic fees, and it increases security, because the entire network is under local control.
The cloud changes this equation. Processing and storage are moved to a remote data center and most of the local data center facilities go dark. The local network still connects all of the PCs, printers, SIP telephones, switches, routers and other devices, and some servers may stay local. But the big data processing is done elsewhere, where there are enough compute resources to crunch all that big data into bite-size pieces. The issue is how to get all that data to and from the cloud.
What happens when you skimp on WAN bandwidth? The cloud keeps running at full speed, but its inputs and outputs slow to a crawl. Think of a WAN connection as a pipe, as it is sometimes called, and you can visualize how too small a pipe keeps the flow of data at a trickle. The local experience is that the system isn't very responsive and interactions become unpredictable. There can be long congestion delays where nothing happens on the screen. What's worse, you don't know whether you'll have a result in half a second, half a minute or who knows how long. That's a productivity killer if there ever was one.
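The pipe analogy is easy to put numbers on. As a rough illustration (best-case arithmetic only, ignoring protocol overhead and congestion, so real-world transfers are slower), here is how long moving 1 TB to or from a cloud data center takes at different link speeds:

```python
# Illustrative only: best-case time to move 1 TB over links of various speeds.
# Ignores protocol overhead, congestion and retransmissions.

def transfer_time_seconds(data_bytes, link_bps):
    """Best-case time to move data_bytes over a link running at link_bps."""
    return (data_bytes * 8) / link_bps  # bytes -> bits, then divide by rate

TERABYTE = 10**12  # 1 TB in bytes

for label, bps in [("45 Mbps (DS3)", 45e6),
                   ("1 Gbps", 1e9),
                   ("10 Gbps", 10e9),
                   ("100 Gbps", 100e9)]:
    hours = transfer_time_seconds(TERABYTE, bps) / 3600
    print(f"{label:>14}: {hours:8.2f} hours")
```

At DS3 speeds that single terabyte ties up the link for roughly two days; at 100 Gbps it moves in under a minute and a half. That gap is the difference between an interactive cloud and one where users stare at frozen screens.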
People-to-cloud communication is critical for high employee productivity, but machine-to-machine communication can be even more important. It's the machines that generate data at a rate humans can't possibly equal. Those machines might be data banks, sensors, machine tools, video cameras and displays, server farms and so on. What's important is that they are not constrained by the capacity of the communication channel.
tw telecom is one of the latest carriers to announce that it has introduced 40 Gbps and 100 Gbps business Ethernet services in the 75 metro markets that it serves. These services are in addition to the 2.5 and 10 Gbps bandwidth options that have been in place for some time. Traditional fiber optic technology offers up to 10 Gbps on a single fiber strand or one of several wavelengths that share that strand. Bonding multiple wavelengths using DWDM (Dense Wavelength Division Multiplexing) increases that carrying capacity up to 40 or 100 Gbps with a single connection to the user.
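The bonding arithmetic is straightforward: total capacity is wavelengths times the rate each one carries. A quick sketch (the wavelength counts here are illustrative, not tw telecom's actual channel plan):

```python
# Back-of-envelope DWDM bonding: several wavelengths share one fiber strand,
# and each carries its own channel. Wavelength counts are illustrative only.

def bonded_capacity_gbps(wavelengths, gbps_per_wavelength=10):
    """Aggregate capacity when each wavelength runs at gbps_per_wavelength."""
    return wavelengths * gbps_per_wavelength

print(bonded_capacity_gbps(4))   # four 10G wavelengths -> a 40 Gbps service
print(bonded_capacity_gbps(10))  # ten 10G wavelengths -> a 100 Gbps service
```

The point of DWDM is that the carrier reaches these rates without pulling new fiber: the same strand simply carries more colors of light, and the customer still sees a single connection.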
Who needs 100 Gbps bandwidth? Right now it's only the largest corporations, financial institutions, research labs and government agencies. But how long ago was it that DS3 at 45 Mbps was considered big bandwidth, and OC-3 entry-level fiber at 155 Mbps was plenty for most enterprises? Today 100 Gbps is on the cutting edge of technology. Tomorrow that level might be considered par for the course, or even entry level, for many businesses.
Would your organization's productivity benefit from a bandwidth increase? You may not need anywhere near 100 Gbps, but you should know that fiber options from 10 Mbps to 1 Gbps are readily available and more affordable than you imagine. Find out what fiber optic bandwidth options are available for your business locations.