Showing posts with label IT services.

Wednesday, February 21, 2024

Colocation Hosting Offers Lower Costs and More

By: John Shepler

Traditionally, having your own data center has been the way to go for medium and larger businesses. It may still be the best answer for your needs. However, there are advantages to moving at least some of your equipment and data to a colocation hosting facility. Let’s have a look at why this might be a great idea.

Just What is Colocation?
Colocation sounds like it means having two or more operations in one location. That’s pretty much it. Colocation facilities were originally called carrier hotels. Multiple service providers would locate equipment in the same building run by one of the carriers or a third party who provided common services such as HVAC, AC power, backup generator power, and connectivity as needed. It was a way for carriers to easily exchange traffic on neutral turf and not have to each pay for their own building.

Nowadays colocation facilities or “colos” serve businesses as well as carriers. They are still a great way to connect to the Internet with as much bandwidth as you need at the best prices and connect to various carriers who also happen to be in the facility. Compare that with trying to get 10 Gbps or 100 Gbps out in the boonies.

In addition to bandwidth, colocation centers offer rack space, cabinets, cages and the power, cooling and connections to go with them. You generally bring your own servers and storage and take responsibility for maintaining your equipment. Many colos also now offer expanded tech services to monitor and service equipment 24/7 and may even lease you servers and other equipment you would normally buy.

Why Pay Someone Else to Host My Servers?
It may seem logical to keep everything under one roof, but that doesn’t necessarily give you everything you need. For one thing, you have no redundancy in the event of a disaster. A tornado, hurricane, earthquake or flood can wipe out your data center and put you out of business for a while. If you keep all your backups in that same data center, it could take a long time and a lot of money to recreate all the data you need.

By having at least some of your servers and storage offsite, you gain the advantage of redundant facilities. It works even better if your sites are geographically dispersed.

As mentioned previously, you’ll likely find much better connectivity at better prices in a colocation center. Carriers go where many potential customers are clustered so that they can quickly and easily provide service. It’s just a cable run in the colo.

Are you able to provide tech service 24/7? Most colocation facilities have their own tech staffs available round the clock to handle their equipment and often provide a suite of services to their customers.

Think about the cost of expansion. If you are running out of space now, you have a make or buy decision to face. Making means building or leasing a larger facility to accommodate your growing needs, including more backup power, tech support, and security. Buying means avoiding the capital investment in facilities and leasing from a colo, likely at much lower cost than doing it all yourself.

Another area where colos shine is being physically close to your customers to reduce transmission latency. That’s more important if you have latency sensitive real-time applications and if you are selling your service nationwide or worldwide. Some of the larger colos have multiple sites so that you can disperse your equipment as needed.

Is your business growing and creating a need for additional data center capacity or would you simply prefer to lease rather than buy the IT facilities you need? If so, consider the advantages of colocation hosting and get competitive quotes now.

Click to check pricing and features or get support from a Telarus product specialist.



Follow Telexplainer on Twitter


Friday, November 10, 2023

When It Comes to Computer Networks, Trust No One and No Thing

By: John Shepler

Network security is a major headache for business. It almost makes one long for the days of one computer per desk and nothing connected to anything else.

Almost. Those air-gapped computers weren’t all that secure either. Sneakernets, meaning people running around with floppy disks, allowed malware to spread and sensitive files to be copied. It’s just that today’s networks with LANs, local data centers, multi-clouds, and the Internet make it really hard to know who’s sneaking in where and what they are up to.

One breach in a corporate network can run up a cost in the millions. If ransomware is involved, the bill can be a lot higher… and a lot more disruptive. What can you do? Don’t be so trustful. Make sure your system is suspicious of everybody and everything all the time. The buzzword for that is “zero trust security.”

What is Zero Trust and How is it Different?
Traditional network security is sometimes compared to a castle with a moat. The castle is your corporate network. Everybody inside the castle is considered to be friendly and trustworthy. Everybody beyond that moat is suspected to be an enemy. The drawbridge is your firewall. It works to keep the bad actors away from the castle while allowing trustworthy visitors access. It assumes that everything bad is going to come through the Internet.

There are a couple of weak links with this approach. First, some bad actors can already be inside the castle. There are spies and infiltrators and even trusted employees who have turned rogue. Of course we want to trust our colleagues, and that’s how we get in trouble. It’s even worse when we automatically trust our vendors and customers.

Then there is the famous tale of the Trojan Horse. Gee, it sure looks safe enough. Let’s open the firewall and bring it in. You can just imagine some well-meaning but naive individual in your company doing just that. Of course the gullible Trojans got the worst of that deal since once the Greeks were inside they had the run of the city.

Moral of the story: It’s too easy to have your organization destroyed by one little misstep. Trust no one and no thing. Network security is not an insult to your integrity. It’s a way to make everyone more secure and prevent little slips from becoming major disasters. That means high security processes both inside and outside the network.

What Makes Zero Trust Work?
It starts with having everybody and every thing, meaning anything attached to the network, prove that it is approved for access and what it is approved to do. You can’t really say that because someone has cleared one hurdle, say logging on, they should be able to access all the files and every peripheral on the net.

Oh, no. You must have a need to know for everything you want to access. That leads to segmenting the network into much smaller pieces that each have to be accessed separately. You may have access to one set of information so you can do your job, but no way are you getting into some of the company’s trade secrets or even financial data. Access to HR files? Fat chance… unless you are specifically authorized to see them.

Each user and each device will have a profile constructed that says what they can do and where they can do it. These profiles will be used by the network administration to grant or refuse access. You may find that your access times out and you have to log in again to keep using a particular resource. Multi-factor authentication, such as a password plus a code sent to a mobile phone or a hardware key that must be plugged in, is especially valuable for access through the Internet or to highly sensitive data.
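
To make that concrete, here is a minimal Python sketch of the per-resource, deny-by-default check described above, with multi-factor authentication and session expiry. The user names, resource names and timeout value are hypothetical, not taken from any particular product.

```python
import time

# Hypothetical access profiles: who may touch which resources, and how long
# a session stays valid before the user must authenticate again.
ACCESS_PROFILES = {
    "jsmith": {"resources": {"sales-reports", "crm"}, "session_ttl": 900},  # 15 minutes
}

sessions = {}  # user -> session expiry timestamp


def authenticate(user, password_ok, mfa_code_ok):
    # Access is granted only when both factors check out and the user is known.
    if user in ACCESS_PROFILES and password_ok and mfa_code_ok:
        sessions[user] = time.time() + ACCESS_PROFILES[user]["session_ttl"]
        return True
    return False


def may_access(user, resource):
    # Deny by default: a live session AND an explicit grant are both required.
    profile = ACCESS_PROFILES.get(user)
    session_valid = sessions.get(user, 0) > time.time()
    return bool(profile) and session_valid and resource in profile["resources"]


authenticate("jsmith", password_ok=True, mfa_code_ok=True)
print(may_access("jsmith", "crm"))       # True while the session is fresh
print(may_access("jsmith", "hr-files"))  # False: no need to know, no access
```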

Zero Trust Security does take some doing to implement and maintain, but it can also be the means that keeps hackers and scammers of all sorts from stealing your information or damaging your systems. Are you feeling vulnerable? Learn more about how to secure and safeguard your network and get a complimentary quote appropriate for your business.

Click to check pricing and features or get support from a Telarus product specialist.



Follow Telexplainer on Twitter


Tuesday, October 31, 2023

Gigabit and 10 Gigabit Metro Fiber Ethernet

By: John Shepler

Have you ever wished that you could stretch your LAN to cover other locations around the block or around an entire city and suburbs? You can. Best of all, you don’t have to do it personally. A Metro Fiber Ethernet connection will plug into your LAN at one location and plug into your LAN at another location.

The Problem Connecting Multiple LANs
Almost all private networks are now Ethernet LANs, or Local Area Networks. Within your realm, you have complete control. You string the cabling. You provide the switches and routers. You hook up the user equipment. You manage the entire network operation.

It doesn’t matter what the company next door or across town is doing. They won’t be bothering your network. They have their own to serve their employees.

This is all well and good until you get another location that is not on your campus. What are you going to do to tie them together? You could go into the business of pulling a fiber bundle across town. Just get the rights of way, bring in the trenching equipment and get to work. It keeps you in control, but it gets really expensive really fast. It also takes forever and may be blocked by city organizations that just don’t want you doing it.

The Internet Will Interconnect Your Locations… Sort of
Hey, the Internet is available. It goes everywhere. You probably have service at each of your locations already. Why not simply exchange files and route phone calls over the Internet?

Actually, this works after a fashion. You can connect anything to anything over the Internet. However, you need to be mindful about how you do this or you’ll find it is nowhere near the transparent line connection you might expect.

The Internet is so available and so cheap because of scale. It does connect everybody to everybody else, and they are all on one big party line. No way do you have any say over priority of traffic or who is accessing that traffic. It’s a big happy family and everybody potentially has their nose in everybody’s business.

There are ways to make this work better. First, get dedicated access. No, you won’t have a private connection through the Internet, but you can order a private line to the Internet. That helps greatly with keeping your service consistent.

Also, make sure you encrypt the daylights out of anything you send through a public network. If not, you are just asking for eavesdroppers to lick their chops as they read through all your sensitive documents or tap into your phone calls and video conferences.
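
As a simple illustration of that point, here is a minimal sketch using Python and the widely available cryptography package (my choice for the example, not something the article specifies) to encrypt a payload before it ever touches a public network. Key handling is deliberately simplified.

```python
# pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in real use, generate once and store/share it securely
cipher = Fernet(key)

payload = b"contents of a sensitive document"
ciphertext = cipher.encrypt(payload)     # this is what travels over the Internet
plaintext = cipher.decrypt(ciphertext)   # only a holder of the key can read it

assert plaintext == payload
```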

To really make the Internet seem like your private lane, take a look into SD-WAN, or Software Defined Wide Area Networking. This is a technique of combining multiple internet connections of different types, such as wireless, fiber, copper, and cable, using software to pick the best path for each packet despite constantly changing network conditions. It sets priorities and knows that data backups take a back seat to interactive cloud services.
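
Here is a toy Python sketch of that path-selection idea, assuming made-up latency and loss measurements and a simple priority table; real SD-WAN appliances do this continuously, per flow or per packet, with far more telemetry.

```python
# Hypothetical measurements for three underlay connections.
LINKS = {
    "fiber":    {"latency_ms": 12, "loss_pct": 0.0},
    "cable":    {"latency_ms": 25, "loss_pct": 0.5},
    "wireless": {"latency_ms": 45, "loss_pct": 1.5},
}

# Lower number = more latency/loss sensitive.
APP_PRIORITY = {"voip": 0, "cloud-app": 1, "backup": 2}


def pick_link(app):
    sensitive = APP_PRIORITY.get(app, 2) == 0

    def score(item):
        name, metrics = item
        # Real-time traffic penalizes packet loss heavily; bulk traffic barely cares.
        loss_weight = 100 if sensitive else 5
        return metrics["latency_ms"] + metrics["loss_pct"] * loss_weight

    return min(LINKS.items(), key=score)[0]


print(pick_link("voip"))  # -> "fiber" with the numbers above; re-scored as conditions change
```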

Better Yet, Go Private
Now we’re getting to Metro Fiber Ethernet. It’s a service provided by a commercial carrier but not part of the Internet. You get a LAN to LAN connection between your locations. You can set it up as point to point, like a direct line. You can also set it up as multipoint to multipoint for any number of locations in the area. They’ll all be on that one big LAN. Another flavor of this service is a direct to cloud connection that connects you to your cloud service provider through a local data center.

Sometimes the Metro designation is a bit limiting. You may need to connect to cloud services or branch offices in another city, state or even country. Many fiber optic network providers have connections that go far beyond your city and may have interconnections with other networks to extend the reach across international borders.

You can also contract with a private service provider for an MPLS, or Multi-Protocol Label Switching, network. These are wide area networks that are privately owned and not accessible by the general public. They will guarantee performance and connect your far-flung empire with low latency and high bandwidth. Security is enhanced because this type of network has its own protocol that differs from what runs on the Internet.

Do you have a need to interconnect business locations with speed, reliability and privacy? Gigabit and 10 Gigabit Metro Fiber Ethernet might be just what you need at a reasonable cost. For even higher performance, 100 Gbps bandwidth is also supported in key metro areas.

Click to check pricing and features or get support from a Telarus product specialist.



Follow Telexplainer on Twitter


Tuesday, February 21, 2023

Colocation Hosting vs Cloud Data Centers

By: John Shepler

You’ve run your own in-house data center for years, but business is growing and you’ve hit the limit on what your server room can support. Now you’ve got a choice to make. Lease new space for the additional servers, storage and other appliances you need or consider moving everything to the cloud. It’s a big decision and one that needs careful consideration before funds are committed.

Isn’t Everyone in the Cloud?
If you read the tech headlines and articles, it looks like everyone is clearing out the old server room and simply leasing cloud services. That does have a lot of attraction. With your data and applications deep within the cloud, you no longer have any capital investment, no power bills, no physical security worries, no HVAC worries, and perhaps less IT support staffing. If you need more bandwidth, server processing or storage, you simply ask the cloud to increase your allowance, perhaps even automatically.

Why Wouldn’t You Join the Stampede to the Cloud?
Perhaps you’re feeling a little uncomfortable. You’ve heard that joke: “There is no cloud. It’s just somebody else’s computer.” What it really amounts to is somebody else’s thousands of computers, all nicely divvied up to share among thousands or millions of clients. The promise of the cloud is that it looks like you have your own computing resources all to yourself.

Does that sound exciting or does it give you a bit of a twinge? After all, you’re really happy with how responsive your IT staff is and the control you have over all the equipment and software. There are no other companies sharing your facilities. Security involves keeping bad actors out of the building and on the far side of the firewall. So, is your only choice to bite the bullet and lease a new building for expansion?

Consider the Colo Option
Perhaps a third option is best. Lease space in someone else’s specialized building but keep your computing resources to yourself. This is the idea behind Colo or colocation hosting. These facilities were once called carrier hotels when their tenants were primarily telecom carriers. Now colo is popular with businesses of all sizes.

A colocation facility provides the physical building with controlled access and security personnel. It is staffed 24/7, which may even be more than you are able to provide now. Massive redundant power lines feed the facility so there is never a question of having enough amps to power new equipment. Moreover, that power is backed up by emergency generators and often batteries to keep things running no matter what.

With all that power, you are also going to need to get rid of the heat generated by the electronics. That is handled by redundant HVAC equipment to provide cooling air to the servers and other equipment. Air filters keep the facility dust-free.

What about connectivity? That’s one reason why companies move out of their own facilities and to a colocation center. With so many clients wanting so much bandwidth, major carriers have a presence in the colo. Often you have multiple carriers to choose from and they each have multiple fiber links for dedicated access and Internet service. Not every business is served with high bandwidth fiber yet, but the colocation centers are. They’ll get you as many Gbps as you need along with IP addresses.

Moving to a Colo Facility
When you move to a colo, you lease racks with power and cooling plus connections for bandwidth. Want more security? You can have those racks installed within a locked cage that keeps everybody but your staff out. Your people can come and install their own equipment, do maintenance, and make upgrades as needed.

Many colo facilities also offer additional services if you want them. You can have the colo tech staff monitor, troubleshoot and repair your equipment. You can even lease servers and storage from the colo instead of buying them yourself.

Are you outgrowing your tech facilities but want to explore options other than simply relocating to a cloud? Consider colocation data center facilities as an option that gives you more control but saves money compared to leasing your own dedicated buildings.

Click to check pricing and features or get support from a Telarus product specialist.



Follow Telexplainer on Twitter


Thursday, September 25, 2014

The Software Defined Network Comes to Austin

By: John Shepler

The Software Defined Network (SDN) has been one of those nebulous concepts that is coming someday to do something better than current IT technology. Well, that day has arrived and here’s what SDN is going to do for your company.

Why Software?
As with software-defined everything else, the software defined network is intended to replace fixed hardware functions with reprogrammable software. It’s not really a simplification process. The hardware may be more generic, like microprocessors and digital signal processors, but if you include the lines of code, the component count shoots through the roof. The beauty of software is that all those “soft” parts can be replicated instantly at little or no cost. Even more importantly, software can be changed from afar as needed.

The Idea of Virtualization
You’ve probably run into virtualization in the IT racks. Not that long ago, a server was a stand-alone computer with its own operating system and software load. Each server had a designated function. If it was overloaded, you needed to buy a more powerful computer and swap out the boxes. If the application wasn’t that demanding, the server would loaf along most of the time.

In this type of environment everything needs to be planned up-front and changes are time consuming and sometimes expensive. There’s also a poor utilization of resources. You may need a lot of lightly loaded servers all cooking in the racks in order to run your myriad of business applications.

Virtualization changes all that. The server is no longer a hardware appliance but a software function running on one or more processors. The computer hardware might not look much different, but what used to be one server may now be a dozen running on the same box. Huge applications might span several boxes to get the job done. It’s just a matter of how much in the way of resources an application needs.

Some of what virtualization has accomplished is to reduce the number of physical computers needed since each box is running at a higher capacity. Even more important, a new virtualized server can be installed in minutes since it is simply a software “instance” running on the hardware already in the racks. Don’t need a server anymore? Simply have the software release the resources back into the pool. You don’t even have to set foot in the data center to make this all happen.
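
The “instance from a shared pool” idea can be sketched in a few lines of Python. This is purely illustrative; the class and capacity numbers are made up and no real hypervisor works this simply.

```python
class ResourcePool:
    """Toy model of virtualized hardware: instances borrow cores from a shared pool."""

    def __init__(self, total_cores):
        self.free_cores = total_cores
        self.instances = {}

    def create_instance(self, name, cores):
        if cores > self.free_cores:
            raise RuntimeError("not enough capacity left in the pool")
        self.free_cores -= cores
        self.instances[name] = cores

    def release_instance(self, name):
        # Decommissioning a virtual server just hands capacity back to the pool.
        self.free_cores += self.instances.pop(name)


pool = ResourcePool(total_cores=64)
pool.create_instance("web-01", cores=4)
pool.create_instance("db-01", cores=8)
pool.release_instance("web-01")
print(pool.free_cores)  # 60 cores free again, with no trip to the data center
```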

Does this sound like “The Cloud”? Virtualization on a huge scale is the magic behind cloud data centers and cloud services.

Virtualization for the WAN
Now consider your telecommunications network connections. Like all hardware based approaches, there are many specialized functions implemented by very specific equipment cards and boxes. Some are in the central office, some in the network path and some at the customer’s premises. It takes a long time to provision a new service and get everything wired up correctly so that you get the service you pay for and don’t interfere with others or have them interfere with you. The term “nailed up” goes back to the days when physical copper wires were literally nailed up on a board while they were assigned to a particular customer.

If you’ve ever tried to upgrade service, you know what a pain it can be. You need to submit a new order that needs to be processed. The changes to the network for your extra bandwidth have to be engineered. Then a truck has to roll to your location delivering a CPE (Customer Premises Equipment) box with the proper interface for the new service. Bandwidth is typically available in major increments and you better get your order placed well in advance of running out of current capacity.

Now, what if the network could be virtualized like the servers? The hardware becomes more of a life support system for the software. That software can be changed, upgraded or supplemented at will. All of a sudden, network changes become fast and easy. That’s the software defined network.

What AT&T is Doing in Austin
AT&T is launching its software defined network in Austin, Texas under the moniker AT&T Network on Demand. That’s pretty much what it’s all about. Businesses will be able to increase or decrease their broadband speeds in near real time. In olden days (before SDN) this could take hours, maybe days, in the case of Ethernet services, or weeks or longer for legacy SONET and T-Carrier.

The Carrier Ethernet services over copper and fiber that have appeared on the scene recently were engineered with more of the software defined network idea in place. One of their bragging points is that you can usually get a bandwidth increase by simply calling your service provider and making the request over the phone. No need to keep watching out the window for the service truck to roll in. As long as you have enough port capacity, the carrier will make the changes “invisibly” while you are doing other things.

In fact AT&T’s SDN will let them provision new communication ports in days compared to weeks. That’s an extension of the software-defined philosophy that separates physical hardware from software. Once again, as long as the installed hardware has the capability of handling the demands placed on it, what it does is really a function of software parameters and apps. Look for this approach to expand rapidly throughout the industry. It will be a matter of competitiveness among the communication carriers and other service providers.

Are you limited by your current MAN or WAN network capability? The service offerings are changing fast. Chances are that you can get more capacity and flexibility without a cost increase with MAN and WAN Network Services available now.

Click to check pricing and features or get support from a Telarus product specialist.

Note: Photo of Austin, Texas at night courtesy of Daniel Mayer on Wikimedia Commons.



Follow Telexplainer on Twitter

Monday, August 04, 2014

Is Managed WiFi Right for You?

By: John Shepler

Broadband is everywhere now. In fact, it’s become an expectation. For consumers, it’s their way to stay connected when there isn’t a wire to plug into. For businesses, it’s a way to offer the benefit of connectivity to their customers and to unchain their employees from the cable tether.

What about 3G and 4G cellular?
Isn’t cellular the true way to go mobile? Over wide areas, yes. It’s hard to beat cellular broadband on your smartphone. That is, until you reach your monthly usage limit. Then it gets expensive fast. Also, many computers, tablets and other devices don’t have the radios built-in to work on cellular, even if you wanted to pay to add them to your account. The one thing most every device does have is WiFi connectivity.

Enabling WiFi
At the most basic level, you can create a WiFi “hotspot” by simply connecting a wireless access point or WiFi router to your network or broadband connection. This is how it’s done at home and in smaller businesses. As the number of users increases and the area to be covered expands, suddenly managing a WiFi network isn’t so simple anymore. You can either grin and bear the extra effort involved or you can consider moving to a managed WiFi solution.

Managed WiFi in the Cloud
Managed WiFi simply means that a service provider, rather than you, does the heavy lifting of making the larger WiFi network work. A new wrinkle is cloud managed WiFi. This allows a service provider to deploy software updates and generate reports for you behind the scenes. A comprehensive system for cloud managed wireless is the Cisco Meraki system.

What Cisco Offers
The Cisco Meraki access point features high power radios for solid coverage with enhanced receive sensitivity compared to the garden variety WiFi AP. It includes MIMO and beamforming technology to meet enterprise-class 802.11ac and 802.11n standards on the 2.4 and 5 GHz bands. The MR34 AP also has a dedicated security radio that scans and protects against security threats, adapts to interference and automatically configures the RF settings for maximum performance.

Security Features
BYOD (Bring Your Own Device) has become a user demand and a major headache for the IT department. If anyone can bring anything onto the network, security goes out the window. Who knows what’s going on?

The Cisco Meraki wireless solution feature set accommodates BYOD by identifying clients and automatically applying access policies by device or user groups. The system automatically assigns firewall and traffic shaping rules, VLAN tags and bandwidth limits to enforce policies by user class. Critical apps are prioritized and recreational apps can be limited for management control.

The cloud based analytics generate extensive metrics such as user visit time, repeat visits, and apps used. You can manage WAN, LAN, wireless LAN and mobile devices from your control panel. That includes everything from a single location to a campus wide solution. There’s even an iOS and Android mobile app for network management on the go.

Acquiring Managed WiFi
An excellent approach that works well for both large and small installations is to get your managed wireless solution from a bandwidth provider such as MegaPath. This way you have one supplier for all of your connections, including MAN, WAN and WiFi. MegaPath’s network operations center will continuously monitor, configure and troubleshoot your wireless network on your behalf. They also have the most up to date security features that meet the requirements of the PCI (Payment Card Industry) data security standards.

If you are considering a major wireless expansion or installing WiFi access for the first time, get the details on cloud managed WiFi now.

Click to check pricing and features or get support from a Telarus product specialist.



Follow Telexplainer on Twitter

Monday, June 17, 2013

100 Gbps Wavelength Service Open for Business

At a time when many businesses are discovering the benefits of Fast Ethernet at 100 Mbps, the upper limit of bandwidth available to business is, if anything, accelerating. Would you believe that you can now order 100 Gbps wavelength service up and down the West Coast?

That’s not just a small increment over typical 100 Mbps Ethernet. It’s a factor of 1,000x. How on earth can you use this much bandwidth... and where can you get it?

Oddly enough, what’s driving the demand for 100 Gbps bandwidth are the same factors that are establishing 100 Mbps as almost entry level. I say almost because many smaller businesses are just now becoming totally frustrated with their old dependable T1 lines. Today’s online activities drag along at 1.5 Mbps. You really need 10x that, or 10 to 15 Mbps, to do much of anything productively.

Once you get beyond email, general Web surfing, inventory management and customer support, that 100 Mbps Fast Ethernet service starts looking pretty attractive. What pushes the capacity of the line is video downloads and streaming, big data and cloud computing. It’s really the move en masse of businesses to the cloud that is generating the stampede to 100 Mbps, 1 Gbps, 10 GigE and now 100 Gig Wavelength.
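
To put those speeds in perspective, here is a quick back-of-the-envelope Python calculation of how long a 10 GB transfer takes at each tier, ignoring protocol overhead and assuming the link is the only bottleneck.

```python
FILE_GB = 10  # size of the transfer

TIERS = [
    ("T1", 1.5),
    ("Fast Ethernet", 100),
    ("Gigabit Ethernet", 1_000),
    ("10 GigE", 10_000),
    ("100 Gbps wavelength", 100_000),
]

for name, mbps in TIERS:
    seconds = FILE_GB * 8_000 / mbps  # 1 GB is roughly 8,000 megabits
    print(f"{name:>20}: {seconds:>9,.1f} seconds")

# A T1 takes around 15 hours for what a 100 Gbps wave moves in under a second.
```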

What’s the cloud got to do with it? Pulling up stakes and moving your IT services to the cloud completely changes your network requirements. Ethernet has grown for decades on the model that all of your computers, servers, switches, routers, printers, storage and other assets will connect on a wired, copper or fiber, local network (LAN). Your WAN connection only needs to be big enough to support communications outside the company.

This was no big deal for most companies prior to Internet-everything and outsourcing to the cloud. T1 lines and DS3 bandwidth were plenty to support even mid-size companies. It worked because the vast majority of traffic stayed within the company walls. Only communications to and from remote sites, franchises, some customers and vendors needed to traverse the outside network (WAN).

That was good because the cost of telecom services prior to competitive deregulation was astronomical. In fact, the availability of lower cost bandwidth has been a driver in the rise of the Internet. It’s also been an enabler of relocating data center facilities to colocation centers as a cost savings measure. Once you start moving your servers off-site, it’s an easy jump to the cloud.

That’s really all the cloud is. Each cloud is a very large data center with infrastructure and perhaps software supplied and maintained by a vendor. You no longer need to invest in IT. You rent everything. Payment is by the “seat,” by usage, or as a monthly fee. No more capital expenses, no more long cycles to upgrade capacity, and no more round the clock staffing to keep everything working.

The only fly in the ointment is the shift in bandwidth requirements from the LAN to the WAN. Your traffic no longer zips around in-house. Instead everything goes to and from your offices over the WAN. All the heavy computing is out there in the cloud. So is the data. If you want to send a file, get a file or perform any process you’ll be doing it over the WAN.

Companies that didn’t take this into consideration are hurting. They’re the ones frantically shopping for higher bandwidth levels at reasonable prices. A slow WAN connection means everybody waits. While your employees are killing time waiting for the system, you are flushing money down the drain.

Fortunately, the cost of high performance bandwidth has been plummeting with the skyrocketing demand and entry of new regional and national fiber optic networks. There’s a mad scramble on now to light every building with fiber and ensure enough capacity in the system to prevent traffic jams. This is where Zayo comes in.

Zayo is a premier international provider of bandwidth infrastructure services with massive fiber optic capacity. This is the company that is establishing 100 Gbps Wavelength services, not just for carriers, but for businesses too. Their initial buildout is in the high tech corridor of the West Coast, from Seattle to Los Angeles. The latest rollouts are the ability to originate and terminate add/drop 100 Gbps service in Seattle, Portland, Sacramento, San Francisco, Modesto and Los Angeles. There is a similar network on the Eastern Corridor that includes New York City, Philadelphia and Washington, DC, and a path through Chicago that connects the East and West coasts.

Do you really need 100 Gbps Wavelength service? Probably not unless you are a content distributor, Internet service provider, cloud service provider, large corporation, government agency or massive organization dealing with the biggest of big data operations. Nevertheless, you should know that higher bandwidths are in your future and that fiber is well within reach of most company budgets.

Do you feel constrained by your current modest WAN bandwidth? Now is the perfect time to get a new set of competitive quotes for high speed copper and fiber connectivity just to see what new services are available and how low the costs have plunged since you last went for quotes.

Click to check pricing and features or get support from a Telarus product specialist.

Note: Photo of high speed traffic courtesy of Wikimedia Commons.



Follow Telexplainer on Twitter


Monday, February 25, 2013

Private Cloud In a Kit

The benefits of cloud computing are being widely touted as the next generation of Information Technology. You’ve read a lot about this, but are still somewhat apprehensive about moving your valuable proprietary applications into a public multi-tenant cloud. With security somewhat unproven, you’d really like to keep everything privately under your control. Does this mean abandoning the competitive advantages offered by the cloud?

Not at all. Remember that there are actually 3 types of clouds readily available. The public cloud is the most talked about and the pioneer in popularizing cloud computing. The other clouds are the private cloud and the hybrid cloud. A hybrid cloud is a combination of public and private, with some applications running in a private cloud and other, less sensitive, applications running in the public cloud.

Even if you are cloud savvy, there is still the question of cost. Public cloud cost savings are well documented and promoted. They’re based on the premise that a large special purpose data center operator serving thousands of simultaneous clients can offer IT services at lower cost than each of those clients running their own data centers. But what about private clouds? Can they really save you anything?

You can build your own private cloud right in your data center. It’s a matter of virtualizing all those servers and disk drives so that they act as a pooled resource. This prevents the utilization problem of many low demand applications owning expensive hardware resources while high demand applications run out of capacity under peak loads.

Now you can also rent private clouds offered by major cloud service providers. These providers have acknowledged that the public cloud isn’t for everyone and that it makes sense to include private clouds within their data centers. The private cloud consists of a set of servers, disks, racks and so on for the exclusive use of a single client. Whatever capacity you aren’t using at the moment idles. In a way, this is very similar to using dedicated private lines for communication instead of shared bandwidth consumer-oriented connections.

What’s the advantage of renting private cloud services over building and running your own facilities? It comes down to capital and operating expenses. Renting instead of buying lets you avoid the capital investment needed to acquire expensive servers and peripherals. Operational costs may also be lower because the technical staff at the cloud provider can support your equipment as well as many others. Small companies might not even be able to justify 24/7 technical staffing for their data centers. The cloud service provider has multiple experts on duty at all times.

Another advantage common to all cloud services is that you pay as you go for only the resources you actually use. The size of the cloud data center guarantees that there are more resources than you can possibly use yourself. Contrast this to running out of processing power and having to wait for more equipment to be delivered or over-provisioning and paying way too much for the capacity that is in daily use.

CenturyLink, a major telecom and cloud services provider, has created an affordable and easy to use private cloud environment that is suitable for small as well as large businesses. Smaller companies may have avoided the migration to the cloud simply because they lack the understanding and expertise to make it all run effectively. CenturyLink’s savvisdirect AppGrid is the answer to this dilemma. It lets you get up and running in less than two business days and select the CPU, RAM, storage space, management & reporting tools, firewalls, load balancers, switches and databases you need. You configure all of this with an online graphical interface that makes cloud setup almost trivially easy.

AppGrid is a Platform as a Service (PaaS) offering that lets you develop your cloud applications and then put them into production without having to move to a different infrastructure. If your needs change, you can easily reconfigure your private cloud to meet those new requirements.

Are you interested in learning more about the merits of cloud services and the tradeoffs among the various types of clouds? Get free consultation, features and pricing for cloud services that meet your business needs.

Click to check pricing and features or get support from a Telarus product specialist.



Follow Telexplainer on Twitter

Monday, February 04, 2013

3 Types of Clouds You Should Know About

You’ve heard the buzz and have started thinking that you might be missing something by not moving your IT services to the cloud. To help you with that decision, we’ll take a look at the 3 basic types of clouds available and some of the applications that make sense to be cloud-based.

It seems like just about everybody is in the cloud business these days. They are basically offering their own versions of 3 different cloud architectures. These are the public cloud, private cloud and hybrid cloud. All of the different cloud services run on one or more of these cloud models.

The most popular and prevalent type of cloud is the public cloud. This is a special purpose data center owned and operated by a cloud service provider who seeks to provide services to many paying customers simultaneously. The cloud data center is the type of data center we’d all like to have, but few can afford. It is in a large secure facility with fire suppression and a couple of layers of backup power. There are multiple fiber optic lines connecting to the outside world for redundancy.

In fact, redundancy is a key to successful cloud operations. One of the key selling features is reliability. Many business oriented cloud companies offer Service Level Agreements (SLAs) that define the uptime you can expect, how fast service will be restored in the rare event it is lost, and what compensation you’ll receive for not having your cloud service available. These SLAs are similar to what telecom carriers offer and distinguish enterprise level cloud providers from consumer type services.
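
If you want a feel for what those SLA percentages mean in practice, a quick Python sketch turns an uptime commitment into allowed downtime per month (assuming a 30-day month; actual SLA terms vary by provider).

```python
def allowed_downtime_minutes(uptime_percent, days=30):
    total_minutes = days * 24 * 60
    return total_minutes * (1 - uptime_percent / 100)

for sla in (99.9, 99.99, 99.999):
    print(f"{sla}% uptime -> about {allowed_downtime_minutes(sla):.1f} minutes of downtime per month")

# 99.9%   -> ~43.2 minutes
# 99.99%  -> ~4.3 minutes
# 99.999% -> ~0.4 minutes
```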

Redundancy means multiple everything. That includes servers, batteries, diesel generators, Ethernet switches, routers, wireline connections, environmental control, and staffing. Any decent cloud company has a full time technical staff ready to address issues around the cloud.

The companion to redundancy is virtualization. This is the real magic behind the cloud. Virtualization became popular with IT departments because so many applications don’t use the full capacity of the server. They either run at a fraction of the maximum throughput or occasionally burst to more than the server can handle. By connecting multiple physical servers in a virtualized environment, each application can take the resources it needs at any given moment without hogging an expensive server that mostly idles. The load can be spread among multiple physical servers automatically to handle peak loads.

In fact, this is how you can build the second type of cloud yourself. Set up your own data center as a virtualized environment of servers, storage and WAN connections and you can provide computing for your entire organization, including remote business locations. This is the private cloud. You own it. You maintain it. You keep all the resources for yourself.

Economy of scale suggests that a public cloud with multiple tenants is going to cost you less than running your own private cloud. You may want to go with the private option anyway when you have high security requirements or heavy regulatory compliance requirements. The other reason is if you have a unique environment that requires specialized hardware, software or operating systems.

A fairly new wrinkle is the private cloud in the public data center. In this case, the cloud service provider sets up a fully dedicated infrastructure that serves only your needs. It is not interconnected with the public cloud facilities. An advantage of this approach is that you can maintain the privacy of having your own cloud without the capital investment and maintenance headaches of ownership.

The third type of cloud is called the hybrid cloud. It is a combination of private and public clouds. For instance, you may use a public cloud for many of your applications but keep a smaller private cloud in-house for your most sensitive or proprietary functions. You can decide how much cross-connection occurs so that the private and public clouds can share information. You may even want a cloud service company to create a hybrid cloud for you using their public cloud infrastructure and special facilities for your dedicated private cloud.

Are you still scratching your head trying to decide what, if any, cloud services make sense for your company? Get free consulting advice and a wide range of competitive quotes for enterprise level cloud services now.

Click to check pricing and features or get support from a Telarus product specialist.



Follow Telexplainer on Twitter

Monday, November 05, 2012

Why Cloud Data Centers Are Multiplying

You’ve probably noticed that there are a lot more offers for cloud based services than there were even a year ago. It seems like nearly everyone in the telecom and IT services space has a new cloud offering. Some offers are only new for a few days or weeks before being replaced with an even better offer. Have you wondered what’s driving this frenzy and how it’s all being implemented behind the scenes?

There are two big pieces to implementing cloud computing services. The first is a robust data center that is sized to hold all of the equipment anticipated, and supported by an ecosystem of cooling, backup power and security. The second is connectivity. If you don’t have a reliable way to deliver your cloud services, it doesn’t matter how groundbreaking they are. Enterprise customers need to know that they can count on the system being available and functioning at all times without slowdowns or dropouts.

This gives the network bandwidth providers a leg up in making their case for high performance cloud solutions. A good example is EarthLink Business. They were in the metro and long haul network business long before clouds started to appear. EarthLink operates a high speed nationwide network that spans 28,000 fiber route miles with 90 metro fiber rings. This network supports six classes of service for MPLS over Ethernet, T-1 and DSL connections. Class of Service (CoS) is honored from edge to edge over the entire network.

Why is that important? Today’s converged networks are only effective if they can handle data, voice and video seamlessly. Voice over network communications, like VoIP telephony, is especially sensitive to latency, jitter and dropped packets. You need CoS to provide a fast lane for real-time two-way services like VoIP so they aren’t destroyed by mundane file transfers and backups hogging all the bandwidth.
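
On the sending side, prioritization starts with marking packets. Here is a minimal Python sketch (Linux-oriented) that tags a UDP socket with the Expedited Forwarding DSCP value commonly used for voice; the address and port are hypothetical, and the mark only helps if the network honors Class of Service end to end.

```python
import socket

EF_DSCP = 46  # Expedited Forwarding, the usual marking for real-time voice

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# DSCP occupies the upper six bits of the legacy IP TOS byte, hence the shift.
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, EF_DSCP << 2)

sock.sendto(b"audio frame bytes", ("192.0.2.50", 5004))  # hypothetical VoIP peer
```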

This argues for a high performance privately operated network over taking your chances on the public Internet. A private network operator can carefully allocate resources to ensure that every packet stream is supported to ensure high quality operations. On the Internet, there are no such guarantees. You launch your packets and you take your chances. For one way video, email and Web browsing this is perfectly adequate most of the time. It tends to fall short for enterprise VoIP, video conferencing and cloud access.

Why is connectivity with the cloud so fussy? If all you are doing is backing up files in a background process, you may never notice any issues. However, if you have moved from your own data center to the cloud and your applications are now running as SaaS (Software as a Service), any delays in response through the network will be frustrating at least and a detriment to productivity at worst. You moved to the cloud to save money. It’s a shame if poor network performance eats up those cost savings with reduced employee productivity.

EarthLink Business and other fiber optic network owners see the competitive advantage they have by providing both the cloud resources and connectivity as a single vendor. That’s why you’re seeing them expand into the cloud space as fast as they can. EarthLink is building four more data centers to support its next generation cloud hosting platform. These will be located in San Jose, Chicago, Dallas and South Florida. An existing data center in Rochester, NY will be updated early next year.

The fiber network is also being expanded to accommodate higher traffic levels anticipated by cloud users. These include key East Coast markets such as Ashburn, Atlanta, Charlotte and Orlando plus new capacity for the Texas cities of Austin, Dallas, San Antonio and Houston. The entire network is being beefed up to support the new native benchmark of 100 Gbps. Businesses today can typically get fiber optic service up to 10 Gbps in major metro areas.

Do you see the advantages of moving your IT operations to the cloud, but worry that you’ll lose performance in the process? Take a closer look at enterprise grade cloud computing and networking services and see how they have raised the bar on cloud performance.

Click to check pricing and features or get support from a Telarus product specialist.



Follow Telexplainer on Twitter

Monday, October 01, 2012

Secure To The Core Cloud Hosting

Moving to cloud based solutions is making sense for more and more companies. The cloud offers easy scalability, near-infinite resources, high performance, no maintenance headaches and the opportunity to avoid capital investments and pay only for what you use. The one nagging issue is how secure is the cloud, really?

MegaPath, a major player in private networking and hosted IT services, has taken a big step toward assuring businesses that their data and business processes will remain private by introducing a concept it calls “secure to the core.” Just what does secure to the core mean and how can it work for your business?

Nearly every cloud service provider touts its security. This generally centers around the data center itself. Many are SAS 70 Type II and SSAE 16 compliant, with physical security that includes biometric scanning, a full time security staff, video surveillance and a walled fortress. Inside there are redundant power and cooling systems, fire suppression and multiple WAN connections to the outside world. However, this last group is really more about reliability than security.

With proper personnel screening and all the physical and technical barriers to entry, it’s not that hard to physically keep people out who don’t belong in the data center. It’s more difficult to keep them out when they come in through the Internet.

The Internet is a weak link when it comes to any data security program. The most motivated and talented of wrong-doers operate in this domain. They eagerly stalk potential targets to penetrate and make off with intellectual property, credit card numbers, personal data that can be used for identity theft and anything else of value. It takes talented network security people and an array of firewalls and security appliances to protect high value business, organizational and government assets that face the Internet.

This is where MegaPath has a leg-up on a lot of cloud service providers. They also have the latest in high security data centers that meet stringent industry compliance standards. What MegaPath has that most providers don’t is a large private network completely independent of the Internet.

When you think about it, companies with multiple locations or Intranets that include key suppliers and customers don’t really need the Internet for internal communications. In fact, it is highly desirable to keep internal communications on a private network for both security and performance. MegaPath makes this affordable for all size businesses through their nationwide MPLS (Multi-Protocol Label Switching) fiber optic network. The label switching technology of MPLS makes packet forwarding simple and efficient. It also allows customers to choose from eight levels of QoS (Quality of Service) so that time sensitive packet streams get the priority they need to maintain integrity end to end. This is ideal for enterprise VoIP telephone systems and video conferencing or telepresence.

MegaPath can offer you MPLS network connections throughout the United States plus Managed SSL VPN, Retail Access SSL and Business Continuity SSL. Their compliance services help companies meet regulatory requirements such as PCI DSS, FFIEC/NCUA, HIPAA/HITECH, GLBA and SOX.

Of course, you probably want Internet connections as well to serve the general public and commercial buyers, and for employee access to the vast information resources available worldwide. MegaPath offers a comprehensive security array called UTM or Unified Threat Management. This includes advanced firewall, intrusion prevention, anti-virus protection, Web filtering, anti-spam, Web application control and data loss protection. These UTM services can be implemented within the cloud, at the customer’s premises or in a hybrid configuration.

Are you looking for cloud services that have rigorous physical and network security protections? Get features and pricing for secure network and cloud services from MegaPath and other high quality providers.

Click to check pricing and features or get support from a Telarus product specialist.



Follow Telexplainer on Twitter

Monday, June 25, 2012

Hosted Network Monitoring Services

Metro and Wide Area Networks are becoming more and more important to businesses. The steady migration toward electronic transactions, online services and cloud storage & computing is making networks a critical part of the business infrastructure. That, in turn, is making network monitoring a necessary function in every IT department. It breaks down to doing it yourself or contracting with a provider to monitor the network.

Companies have been monitoring their own LANs since they started installing networks. It’s logical to extend that philosophy to include any telecom links between facilities. That includes point to point networks and both public and VPN connections to the Internet. There are good reasons to rethink this approach now. Let’s have a look at how the network monitoring process can be improved.

One area of weakness is networks used by smaller businesses. A lot of factories, offices and retail outlets aren’t open or staffed around the clock. What’s going on in the wee hours? Who knows? A construction crew could be working late and cut through the cable that is your last mile connection. The alarm lights on the terminal equipment illuminate, but there is no one around to take action. You come in to open up in the morning and find that you have no network connection. You report it, of course, and the carrier will start the investigation process. You may or may not be back in business by noon. That’s only if you have a good SLA (Service Level Agreement) or a very proactive service provider.

There are measures you can take. You can set up a process to monitor the alarms and ping remote servers periodically to make sure the link is still up. If something goes wrong, you can have it send you or someone on-call in your IT department a text message. Somebody still has to be awake enough to get the alert and then start working the problem. That involves figuring out whether the outage is on the telecom link or in your equipment and contacting the right party to get some corrective action.
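
A do-it-yourself version of that watchdog can be as simple as the Python sketch below, which shells out to the system ping command (Linux-style flags) and raises an alert when a remote endpoint stops answering. The host names and the five-minute interval are hypothetical, and a real setup would send SMS or email instead of printing.

```python
import subprocess
import time

REMOTE_ENDPOINTS = ["203.0.113.10", "branch-office.example.com"]  # hypothetical


def link_is_up(host):
    # One ICMP echo request with a 2 second timeout; return code 0 means a reply came back.
    result = subprocess.run(
        ["ping", "-c", "1", "-W", "2", host],
        capture_output=True,
    )
    return result.returncode == 0


def alert(host):
    # Stand-in for a text message or email to whoever is on call.
    print(f"ALERT: no response from {host}")


while True:
    for host in REMOTE_ENDPOINTS:
        if not link_is_up(host):
            alert(host)
    time.sleep(300)  # check every five minutes
```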

Large companies with 24/7 coverage of their data centers and other IT assets may consider this part of their charter. Small and medium companies, especially those with branch offices and critical e-commerce infrastructure, may have the need for faster response but not the budget to provide “just in case” staffing. These are the companies that hosted network monitoring services work wonders for. With hosted monitoring, the responsibility of what’s happening beyond your network’s edge belongs to the service provider. They do have the automated equipment and round-the-clock staffing to keep an eye on every network link to make sure that it is working properly.

In most situations, the network edge is a managed router installed by the service provider. This equipment terminates the line with the proper interface, be it T1, DS3, EoC, EoF or SONET. The other side of the router is where you connect, generally with an Ethernet interface. The router itself is considered “in the loop” as far as network monitoring and testing is concerned. If anything goes wrong, the provider has the ability to test functionality through all their equipment and lines, right on up to your Ethernet connection.

In some cases, you even have the option of letting the service provider include your internal local network within their networking monitoring service. You decide whether to let them have access to the entire network or just a portion. This is the ultimate level of support for smaller companies that have little or no on-site IT staffing.

A particularly robust network security solution is provided by MegaPath, a major facilities-based carrier. MegaPath goes far beyond just monitoring to ensure the network is “up.” They include comprehensive Unified Threat Management (UTM) to implement network security. This involves an advanced firewall, intrusion prevention, anti-virus and anti-spam filtering, web application control and data loss prevention. Network management involves deep packet inspection and uptime monitoring that watches for latency, jitter, delays and packet loss. It is sometimes difficult for even larger companies to provide this depth of network monitoring and security.

Are you interested in better management of your network connections at a reasonable cost? You should look into hosted network monitoring and security services suitable for the size of your business and criticality of your network.

Click to check pricing and features or get support from a Telarus product specialist.




Follow Telexplainer on Twitter

Wednesday, June 13, 2012

Dual Core, Quad Core, Up to 64 Core Managed Dedicated Servers

It’s no secret that the battle for higher processor speeds is over. After decades of moving up the megahertz clock speed ramp, things have leveled off somewhere between 2 and 3 GHz. So, is that it? The end of Moore’s Law? Of course not. The quest for speed has simply shifted to a different approach. Instead of faster clocks, the throughput gains are now achieved with multiple cores and processors.

The good news is that there is more than enough processing power to be had. The better news is that you can get that processing at excellent prices including top quality support with managed dedicated servers.

The low end is dual core running on a single processor. From there, you can move up to quad core and HEXA core (6 cores). Not enough horsepower? OK, then it’s time to upgrade to a dual or quad processor dedicated server. Each processor has multiple cores, of course. That extends your server’s core count to 8, 16, 32 or 64 cores.

With that much processing power at your command, you’ll need a few other things to ensure that you get the most out of all those cores. Consider that a quad-processor system using 16-core 2.1 GHz AMD Opteron 6272 chips, for a total of 64 cores, has something like 134 GHz of composite processing power, and you know that you’ll need both bandwidth and memory to keep the processors loaded.
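
That composite figure is just cores times clock speed, as this quick Python check shows. It’s a rough throughput yardstick, not a real benchmark.

```python
processors = 4
cores_per_processor = 16   # AMD Opteron 6272
clock_ghz = 2.1

total_cores = processors * cores_per_processor   # 64 cores
composite_ghz = total_cores * clock_ghz          # 134.4 GHz of aggregate clock

print(total_cores, composite_ghz)  # 64 134.4
```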

DDR3 SDRAM ranges from 8 GB up to 512 GB. Storage ranges from a single 500 GB SATA drive up to a 4x SAS RAID 10 array spinning at 15,000 RPM or an 8x SSD RAID 10 solid state array. Additional storage arrays can be added, up to 14x SAS drives.

How about bandwidth? Pick either a Fast Ethernet (100 Mbps) or a Gigabit Ethernet (1,000 Mbps) uplink port with 10,000 GB or 12,000 GB of monthly transfer, or an unmetered port with bandwidth ranging from 10 Mbps up to 100 Mbps.
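For a sense of scale, here is an admittedly idealized calculation of how those monthly transfer allowances compare with what each port could move if it ran flat out all month (decimal gigabytes and 100% utilization, which no real workload sustains):

# Idealized ceiling on monthly transfer for each uplink speed
seconds_per_month = 30 * 24 * 3600
for port_mbps in (100, 1000):
    max_gb = port_mbps / 8 * seconds_per_month / 1000   # MB per second * seconds / 1000 = GB
    print(f"{port_mbps} Mbps port: up to {max_gb:,.0f} GB per month flat out")

So a 10,000 GB allowance on a Fast Ethernet port works out to running the link at roughly a third of its capacity around the clock, which is plenty for most sites.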

Who offers these high performance dedicated server options with a 100% uptime guarantee and outstanding support? It’s Liquid Web, a managed hosting company with multiple data centers that has served over 20,000 clients in more than 120 countries over the last 15 years. Their 200+ on-site staff members offer something called “Heroic Support.” It comes in three levels: fully managed, core managed and self-managed, depending on how much help you need and want.

All Heroic Support options feature 24x7x365 phone, email and live chat support from tier/level 3 technicians, the highest grade, on site at all times, plus system-level health monitoring and graphing, monitoring alerts and notifications, and a 100% uptime SLA on the fully managed hardware and network infrastructure.

The core managed option adds full installation and support of the core software package, with system updates and patches, security enhancements, full web server support, and proactive response to and restoration from monitoring events, all with a 30 minute initial response time guarantee.

Fully managed support adds virus and spam protection, full control panel support, updates and patches, external migrations and best effort third party application support.

Do you need higher performance servers to handle your demanding applications? Are you less than impressed with the support you receive now? Are you paying far more than you need to for the resources that you require? If so, you may be a perfect candidate for Liquid Web’s traditional, VPS, cloud or managed dedicated server solutions. Check out their extensive range of services and that unique “Heroic Support.”

Click to get more information and view sample videos.





Wednesday, May 30, 2012

How Does Cloud Computing Work In Business?

We hear a lot about “the cloud” these days. Just what is the cloud and how does it help you do business better, faster and cheaper? Let’s have a look.

How the Cloud Works... for you!
Usually when we say that business is cloudy, it’s not a good thing. Cloudy implies fuzzy and unpredictable. Wouldn’t you rather have clear and predictable instead? Financially speaking, of course you would. This is why people cock an eyebrow when somebody says you need to go off to the cloud somewhere.

You know, this is really a problem of semantics more than anything. When we talk about cloud computing, cloud storage or cloud networking, we’re not talking about dealing with fuzzy logic, quantum mechanics or anything else that you can’t quite nail down. The “cloud” with respect to IT and telecommunications services is quite determinate. The metaphor is about not having to be personally concerned with the details of how something is being accomplished, not about things turning into a free-for-all the moment you turn your back.

Let’s take cloud storage for instance. This is becoming the most popular cloud service because it works for both business and non-business users. Everybody has data. Today most of that data is spinning on hard drives in personal computers. Some of it is stored in solid state memory chips inside tablets and smartphones. There are two services that the cloud can provide for your data. First, it can safely store a copy in case disaster strikes and your hard drive crashes or you leave your laptop or smartphone in a taxi. If nothing else, the thousands of personal photographs or client records amassed over the course of years won’t disappear forever in an instant.

Second, the cloud can provide a central repository that you can access from whatever device you have at the moment and from wherever you happen to be. Sure, there are remote access programs that let you reach into your PC from across town. That assumes that you leave your computer on all the time and that a specialized client is available that will work on all devices for all types of data. It’s not quite the same as going to the online cloud “warehouse” to fetch a copy of whatever. In fact, it is getting to the point where you don’t have to fetch at all. All of your devices can sync with the cloud so that they either have a copy of the file locally or it appears to the user that they do.

Cloud storage as automatic backup makes sense because we all pretty much suck at remembering to back up files ourselves. You can improve this a bit by adding an external disk drive with a program like Apple’s Time Machine to automatically make copies whether you are paying attention or not. Still, if your house or office burns down, you lose everything in the ash heap of melted hard drives.

The cloud gets away from the problem of having everything in one place where it can be destroyed all at once by keeping a copy hundreds or thousands of miles away. But the cloud also backs up its own data. Remember that the cloud is not some vaporous collection of neurons in the sky. It is realized as a secure data center with racks and racks full of hardware. That’s hardware that can fail just as surely as your desktop PC or local server. Cloud providers need RAID and other protections for disk data along with battery and generator backup power to ensure that those files will be there when you call for them.

The business model behind the cloud is that very large data centers, operated by dedicated service providers who specialize in cloud services, can give hundreds, thousands or millions of clients all the disk storage, servers and bandwidth that they could possibly use at a lower cost than having each and every client replicate that data center on a smaller scale.

The secret to a successful cloud is virtualization. This is a technique where a fixed pool of hardware resources is sliced and diced so that it can be apportioned to customers as needed. The virtualization software makes it appear to each customer that they are in control of 1 or 100 or 1,000 separate servers and Terabytes worth of disk drives. In actuality, there are fewer actual hardware servers than there are virtual servers because few applications need to hog a whole server to themselves. Sometimes, however, a virtual server will encompass more than one physical server because it really, really needs that much power. It doesn’t matter to the application, the client or the cloud whether a particular server is physical or virtual. The job gets done and the client gets billed by the capacity used.
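As a toy illustration only, the apportioning and metered billing might be sketched in Python like this. The host sizes, the 4:1 overcommit ratio, the hourly rate and the VM names are all invented, and real cloud schedulers are far more sophisticated than a first-fit loop, but the principle of slicing a fixed pool and billing by what is consumed is the same.

# Toy first-fit placement of virtual servers onto a small pool of physical hosts
OVERCOMMIT = 4                                            # virtual cores sold per physical core
hosts = [{"name": f"host{i}", "free_vcores": 32 * OVERCOMMIT} for i in range(3)]

def place(vcores):
    """Assign a virtual server to the first host with enough uncommitted capacity."""
    for host in hosts:
        if host["free_vcores"] >= vcores:
            host["free_vcores"] -= vcores
            return host["name"]
    raise RuntimeError("pool exhausted -- time to rack more hardware")

orders = {"vm-a": 2, "vm-b": 8, "vm-c": 16, "vm-d": 4}    # customer requests, in virtual cores
placement = {vm: place(vcores) for vm, vcores in orders.items()}
hourly_bill = sum(orders.values()) * 0.05                 # e.g. $0.05 per virtual core per hour
print(placement)
print(f"metered usage: ${hourly_bill:.2f} per hour")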

One specialized cloud service is hosted VoIP, also called hosted PBX or hosted voice. This is a telephone switching system that handles your internal and external phone calls exactly as the most modern in-house PBX system would. The difference is that all you have in the building are phone sets and a SIP Trunk connecting your network to the service provider. Like cloud computing and storage, you pay for what you use and never have to make a capital investment. When you need more, you simply order more and you have it in a flash.

Does the idea of the cloud make more sense as a resource and potential cost saver over what you are doing now? If so, get more information on cloud computing and communications services and compare prices with doing everything yourself. That should definitely make things a lot clearer.

Click to check pricing and features or get support from a Telarus product specialist.





Monday, March 12, 2012

How The Cloud Provides Virtualization For The SMB

Virtualization has been a hot topic within enterprise data centers for years. It’s a matter of getting more out of what you already have. Virtualization can turn a lightly loaded server into many smaller servers that you’d otherwise have to buy. It can spread the load of a heavily burdened server across many hardware platforms, avoiding the expense of one monstrous machine. That’s great for large corporations, but what about small and medium size businesses? Is there any way they can gain the efficiencies of virtualization?

Cloud services for Small and Medium size businesses...
There is now. Just look to the cloud. Virtualization is the magic behind the curtain that makes the cloud a practical reality. Clouds are nothing more than extremely large data centers set up to serve many tenants. The principle is that, if the data center is engineered correctly, every customer perceives access to effectively infinite resources and a sense of being the only user. The practical way to do that is with virtualization.

Every aspect of the cloud is virtualized. It starts with the servers used as computing resources. The racks and racks full of physical servers are virtualized into hundreds or thousands of virtual servers. What happens on one virtual server stays on one virtual server. As long as there are enough physical resources to support the demand for as many virtual servers as customers activate, users have no idea how many other customers are sharing the same physical assets or what the maximum number of virtual servers can be.
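Here is a rough sketch of the headroom arithmetic behind that illusion. Every number is hypothetical, and a real provider tracks RAM, storage and I/O the same way, but the idea is simply to keep committed virtual capacity inside what the physical pool can deliver.

# Rough headroom arithmetic for a shared virtualized pool (all numbers hypothetical)
physical_cores = 40 * 16            # 40 hosts with 16 cores each
overcommit_ratio = 4                # virtual cores sold per physical core
virtual_cores_sold = 1800
sellable = physical_cores * overcommit_ratio
print(f"{sellable - virtual_cores_sold} virtual cores of headroom remain "
      f"({virtual_cores_sold / sellable:.0%} of the pool committed)")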

The same is true for storage, also virtualized, and bandwidth, which has always worked well as a virtualized resource. Even software can be virtualized when running on the cloud. This has led to the Software as a Service or SaaS model. Customers perceive having their own installation of a software package, although in practice they are one out of many.

So, what does this have to do with the Small and Medium Business (SMB)? Once you are relieved from the burden of ownership, the barriers to entry for fairly sophisticated computing services shrink considerably. Few companies have the multi-million dollar capital resources to go out and build secure, environmentally controlled data centers with layers of redundant power, and then populate these spaces with rack after rack of the latest hardware. They also need to budget to hire the staff that makes all of these resources work together reliably and keep on top of patches, upgrades and troubleshooting.

The cloud moves all of that cost and effort from your hands to those of a dedicated service provider who is in business to manage the cloud and nothing else. They’re not trying to run a manufacturing, marketing or healthcare company. They’re in business to run a cloud and run it for excellent performance. By purchasing IT services from a cloud provider instead of replicating a data center on-premises, even a very small company can afford the service. It’s now possible for even “mom & pop” operations to use cloud services from the day the business opens and grow their IT right along with their business.

Here’s something else that SMB operations are finding is better off in the cloud than on-site: the PBX telephone system. How many companies have the expertise to run their own phone systems? Many contract with VARs (Value Added Resellers) and consultants to buy, install and maintain an in-house telephone system. Send all that to the cloud and all you need on-site are the phones and perhaps a provider-installed managed gateway. The provider even takes care of the trunk lines that connect to the public telephone system.

What’s all this cost? Cloud services are sold on a pay-as-you-go basis. Many are priced by the month for each user or “seat,” as they are called. Computing is sold by the number of virtual servers and quantity of storage per month or even per hour. Unlike normally contracted services, cloud resources are easily increased or decreased. Often you can do this through a Web-based control panel for your account. Your bill is automatically adjusted as you add and subtract resources.
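As an illustration of how such a bill adds up, here is a sketch with invented rates; actual per-seat, per-server-hour and per-GB prices vary widely by provider.

# Sketch of a pay-as-you-go monthly bill; every rate here is invented for illustration
seats, seat_price = 25, 30.00               # hosted PBX users at $30 per seat per month
server_hours, server_rate = 3 * 720, 0.08   # three virtual servers running all month
storage_gb, storage_rate = 500, 0.10        # per GB per month
total = seats * seat_price + server_hours * server_rate + storage_gb * storage_rate
print(f"estimated bill: ${total:,.2f} for the month")

Drop a server or add ten seats and next month’s total simply follows the usage.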

Are you finding it too hard to acquire the computing and communication resources you really need to be competitive? Consider getting those resources as services from the cloud and pay only for what you use now, with the option of rapidly scaling up when business conditions dictate. Compare prices and options from multiple cloud service providers and weigh them against the expense of trying to do it all yourself.

Click to check pricing and features or get support from a Telarus product specialist.





Monday, February 13, 2012

What’s Wrong With Bring Your Own Bandwidth?

When you go out to contract new IT services, you are often faced with a choice. You can start fresh with both services and connectivity from your new provider, or you can layer the new IT services on the network setup you have now. It seems that if your MAN and WAN connections are working to your satisfaction, it would be smart to leave well enough alone. But what other considerations should play into this decision?

Evaluate your bandwidth choices for any new IT service.
Not every service provider deals in bandwidth. Many VoIP, cloud computing and other service providers specialize in the specific service they offer. They may make recommendations, but getting connected from your facility to theirs is your responsibility.

Other companies have a far larger array of services to offer. Some started off as competitive telecom carriers and later added other services such as colocation, cloud infrastructure, hosted PBX, Software as a Service and so on. Talk to these companies and they’ll express a definite preference for providing both the IT or telecom service plus the connectivity. Is this simply being offered as a convenience to the customer, a desire to get as much of the client’s business as possible, or are there other reasons? If so, what would they be?

A common industry term for using the network connections you have now is “bring your own bandwidth.” You’ll hear this term used most often by VoIP, hosted PBX and cloud computing providers. There’s a reason that this expression keeps coming up in provider literature and consulting discussions. It’s far more than just industry jargon.

What’s unique about cloud services of all sorts is that the performance of the cloud service is highly dependent on the performance of the connection between your facility and your provider's. Issues with customer provided bandwidth pop up so often that providers have become wary.

A common application is hosted VoIP. Unless you have an IP PBX switching system in-house, any VoIP service you are using is probably hosted VoIP. Hosted means that the service provider owns and operates the switching system and the trunk lines to the public telephone system. You only need to have IP phones and perhaps a gateway device in-house.

Hosted VoIP can be the best thing that ever happened to your company or an unmitigated nightmare. In fact, the very same service that works beautifully for one company can be a disaster for another. Why? It’s because voice services, particularly network voice services like VoIP, are highly sensitive to the characteristics of the long haul network connections that carry them.

Voice is a touchy application because it is real-time and easily corrupted. Any network congestion that holds up packets, drops them or brings them in at a variable rate will be reflected in the call quality. Many network anomalies show up as audio distortion that garbles conversations. Other problems are delays that chop off the beginning of sentences and intermittent outages that drop calls.
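If you want to see how uneven delivery turns into a jitter number, here is a toy calculation in Python with invented timestamps. Real VoIP gear computes a smoothed running estimate (per RFC 3550) rather than a simple average, but the idea is the same: variation in delay and missing packets are what the listener hears.

# Toy example: how uneven packet delivery becomes jitter and loss figures (times in ms, invented)
sent = [i * 20 for i in range(10)]                         # RTP-style packets sent every 20 ms
received = [0, 21, 45, 61, None, 103, 121, 160, 161, 182]  # None marks a dropped packet

delays = [r - s for s, r in zip(sent, received) if r is not None]
jitter = sum(abs(a - b) for a, b in zip(delays, delays[1:])) / (len(delays) - 1)
loss_pct = received.count(None) / len(received) * 100
print(f"mean delay variation {jitter:.1f} ms, packet loss {loss_pct:.0f}%")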

This is why major business providers with reputations to protect cringe at the thought of their clients bringing their own bandwidth when it is a shared-bandwidth DSL or cable Internet connection. Those low end services work just fine for e-mail and general website access. Add a third party broadband phone service and you are really taking your chances. Why? Your network may not be set up to give voice packets priority, the bandwidth on your broadband Internet service may vary all over the place, and even when those things are working well, the Internet itself has all sorts of vagaries.

Even the Cable companies themselves are aware of this situation and don't run their bundled telephone services over the Internet. What they sell you is a VoIP phone service that uses their Cable network to connect to their own VoIP switching center. Your telephone calls never touch the Internet itself. Call quality is just too dicey and hard to predict when you are dealing with a public network where no packet has an advantage, by design.

What major VoIP service providers really prefer is that you get a dedicated SIP trunk from them to provide your voice and Internet service. This way they can control the WAN connection and keep the voice and data packets from tripping all over each other. With the right bandwidth, latency, jitter and packet loss characteristics, you can have excellent voice quality and all the advantages of hosted PBX telephone services.
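For sizing purposes, a common planning figure is that an uncompressed G.711 call consumes on the order of 87 kbps in each direction once RTP, UDP, IP and Ethernet overhead are included; the exact number depends on the packetization interval and link type, and compressed codecs need far less. A quick sketch:

# Rough SIP trunk sizing for concurrent G.711 calls (~87 kbps per call is an approximate planning figure)
concurrent_calls = 30
kbps_per_call = 87
voice_mbps = concurrent_calls * kbps_per_call / 1000
print(f"reserve roughly {voice_mbps:.1f} Mbps of the trunk for voice")  # about 2.6 Mbps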

The same discussion applies to high bandwidth cloud connections. What some companies are finding is that the WAN connection to their new cloud provider isn’t nearly as capable as the LAN connection they had to their in-house data center. The big issues are bandwidth and latency. You need sufficient bandwidth so that users don’t have to queue up when they access the cloud. Latency introduces a delay that makes cloud applications seem sluggish. In some instances this has become such a problem that Amazon has made arrangements with Level 3 Communications and AboveNet for special AWS Direct Connect service at 1 Gbps and 10 Gbps for its high performance cloud computing services.
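Part of the latency problem comes straight from how TCP behaves: a single stream can move at most roughly one receive window per round trip. A simple illustration follows; window scaling and parallel connections change the exact numbers, but not the principle that a longer round trip means a slower-feeling cloud.

# Why WAN latency makes a cloud feel slower than an in-house LAN:
# a single TCP stream moves at most about one receive window per round trip
window_bytes = 64 * 1024                  # a common default receive window
for rtt_ms in (1, 20, 80):                # LAN, regional WAN, cross-country (illustrative)
    mbps = window_bytes * 8 / (rtt_ms / 1000) / 1_000_000
    print(f"RTT {rtt_ms:3d} ms -> about {mbps:,.0f} Mbps per stream")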

Is bring your own bandwidth a good idea or a bad idea? It depends on how robust your current metro and long haul network connections are and what services you intend to add. A good approach is to discuss connectivity at the same time you are evaluating new IT services to make sure you have what you need to get optimum performance. You can get expert consultation and a wide variety of telephone, cloud and bandwidth services through our telecom broker, Telarus, Inc. A simple online inquiry will get the process started immediately.

Click to check pricing and features or get support from a Telarus product specialist.





Thursday, January 05, 2012

Office Telephone Service Via Cloud

When you think of office telephone service, the thing that comes to mind is a phone on every desk with an easy 3 or 4 digit number to call other phones in-house and access to an outside line whenever you need one. Today, that picture also includes your own direct dial phone number and personal voice mail. Transfers, call forwarding and conference calls are easy to set up. So, where is the equipment that runs this phone system?

Why stay stuck in the past? The most modern office telephone solutions are in the cloud...
For most people, the answer is “who cares?” The actual phone system could be anywhere. As long as it works all day, every day, the mechanics of how this is accomplished are beyond the interest of all but a few people. The ones who care are the business manager who pays the phone bills, the IT person or people who quietly keep everything humming, and the provider of the service.

That provider could be the local phone company. That’s how it was for at least half of the last century. For many if not most companies, at least part of the system is in-house. This could be a small wireless phone system with a couple of lines and a half-dozen handsets. It could be a key system using desk phones with a separate button for each of 4 to 6 outside lines. It might even be a PBX phone system mounted in a back room with lines coming in from each phone and going out to the telephone company.

Since very few people are all that interested in running their own in-house telephone switching system, why not ditch the whole thing? Does that mean going back to analog phones connected to the local Telco? Not at all. Today it means going forward to a cloud based solution that will give you all the functionality you have now and more.

Cloud communications is the new PBX. It’s also the new key system and the new small office system with only a few telephones. It may even provide your broadband Internet.

The cloud telephone system is more formally known as hosted PBX or hosted VoIP. You already know what hosting is. Chances are that you already buy a hosting solution from one of the many online providers. This could be a shared solution, a virtual private server (VPS) or a dedicated server of your own. The economics of setting up your own data center just to run a web server don’t make sense anymore. There is so much competition in the hosting field that simple solutions are only a few dollars a month. Dedicated servers are only a few hundred dollars per month. That includes the box, the bandwidth and the IT services to keep it all running.

Larger companies that run their own customized packages for business information may have elected to just add one more server for the web to the racks they already have in their data centers. It hasn’t made much sense to go out of house for a solution when you need a large data center and staffing to optimize your business. Well, not till recently. Now these major corporations are shutting down their in-house data centers and moving to cloud computing solutions in droves. Why? For the same reasons that make sense to outsource telephone systems. You don’t need to invest in capital and you don’t need a staff to maintain it.

The point is that cloud services have evolved to the point where both your computing and your telephone can be supplied by a cloud vendor at a lower overall cost with more functionality and lower in-house staffing than doing it yourself. There are other advantages, too. You pay for a cloud telephone system per seat per month. You only buy as many seats as you need. When you need more, you order more. There is no need to maintain extra capacity just in case business picks up suddenly. Provisioning of extra resources is fast and easy because the cloud has all the capacity you need.

Now that you have office telephone solutions in the cloud as an option, is there any reason to be stuck with a technical solution that was “modern” decades ago? At the very least, take a look at what’s available in the cloud, what it costs compared to what you are paying now, and if you can get all new phones included with your service.

Click to check pricing and features or get support from a Telarus product specialist.


Note: Original photo of telephone operators courtesy of Seattle Municipal Archives on Wikimedia Commons.


