Monthly Archives: December 2013


The Relationship between Big Data and Colocation


Few would deny that data has become the most important asset for modern businesses. A report from the Edgell Knowledge Network last year reveals that 80% of retailers say they've heard of big data, but only 47% know how to analyze and then apply what it can tell them.

The report goes on to say that of the retailers surveyed:

· 46% believe data volume is their biggest issue

· 34% think managing data variety (structured and unstructured) is their biggest headache

· 20% say effectively managing velocity poses the greatest challenge


When it comes to Big Data, the focus tends to be on analytics and decision-making. While essential, analytics provide only the final piece of the process. What about the backend systems that facilitate transactions and capture data in the first place?

Let’s use Amazon as an example. Last year, Amazon processed 1 billion data transactions every day (excluding holiday shopping). That volume translates to 10,000 data points every single second on a normal day. Velocity on Black Friday 2012 bumped up to 11,500 data points per second. That kind of volume requires robust, reliable systems to facilitate flawless transactions (the customer experience), complex algorithms to power the recommendation engine, and sophisticated analytics to enable nimble, smart business decisions. A minute of downtime in the Amazon ecosystem costs tens of thousands of dollars.
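Those throughput figures are easy to sanity-check with back-of-the-envelope arithmetic. A quick sketch (using only the per-second rates quoted above; the 1-billion daily total is a rounded figure):

```python
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

# Per-second rates quoted above
normal_rate = 10_000        # data points/sec on a normal day
black_friday_rate = 11_500  # data points/sec on Black Friday 2012

# Daily volume implied by the normal-day rate
daily_volume = normal_rate * SECONDS_PER_DAY
print(f"{daily_volume:,} data points/day")  # 864,000,000 -- consistent with "~1 billion" rounded up

# Extra load the systems absorb on Black Friday
surge = black_friday_rate - normal_rate
print(f"+{surge:,} data points/sec at peak")  # +1,500 data points/sec at peak
```

The point of the arithmetic is scale: even a 15% holiday surge means absorbing 1,500 additional events every second without dropping a transaction.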

Granted, most companies do not conduct business on Amazon’s scale. Regardless of scale, though, I would argue that uptime and system reliability are just as important to small and medium-sized businesses as they are to market leaders. Maybe more so: SMBs typically have a smaller margin for error when it comes to risking profitability.

So, if you’re considering a Big Data initiative for 2014 and still run operations on-site, perhaps it’s time to consider colocating those ops. Moving IT ops to a datacenter makes sense when it comes to Big Data because your needs will bump up quickly. Consider this: 90% of the world’s data has been generated in the past 2 years.* Colocation capacity can ramp in step with your company’s data growth, so you have the flexibility to expand quickly in a stable, redundant, secure environment without the capex of building out IT infrastructure.

Geographically speaking, you’ll get the most bandwidth bang for your buck with colo services in Ashburn, VA—aka Data Center Alley. Ashburn has an extremely high datacenter density, with 70% of the world’s Internet traffic flowing through its pipes. Talk about robust. If you’d rather your colo be on the West Coast, consider Sacramento. Geologically speaking, it’s a safe location in a state plagued by earthquakes large and small. Either way, we can help you avoid the infrastructure and reliability headaches that most executives worry about when it comes to effectively managing Big Data initiatives. Contact us for details.