Cloudy with a Chance of Explosive Growth

By Glenn Sizemore on 03/29/2015.

Last week, fellow “Datacenter Dude’er” Justin talked about how Cloud truly *is* everything when discussing the reality of problem solving today.  The infrastructure of today is almost drowning in choice.  There are countless competing platforms and ecosystems, each offering its own unique attributes, and organizations are struggling to realize the economic and competitive advantages they provide for one simple reason: they are far too busy running their business to fully investigate these opportunities.  They have fallen into the trap of simply keeping the lights on.  This is a travesty for all involved.

The administrators who find themselves in such a position are scared.  They’re scared that they are going to fall out of touch, and that they’ll wake up one day and their skills will no longer be valued.  They don’t fully understand, and don’t have time to investigate, everything that’s out there.  They find themselves in analysis paralysis, or what Barry Schwartz would call the Paradox of Choice.  They are afraid to recommend any big bets, even when they think they’ve found an advantage.  Not because of any fear of being cut, but because these ecosystems today only offer onramps.  Backing out of an ecosystem is a time-consuming and expensive proposition, assuming of course it’s possible at all.

On the other side of things, the executive leadership find themselves talking to their friends and colleagues.  They learn how HGST spun up a 729 TeraFLOPS cluster in AWS and used it to complete the analysis of one million hard drive designs in just eight hours.  Or how IDC SVP and Chief Analyst Frank Gens said in November that:

“Over the next four to five years, IDC expects the community of developers to triple and to create a tenfold increase in the number of new cloud-based solutions. Many of these solutions will become more strategic than traditional IT has ever been. At the same time, there will be unprecedented competition and consolidation among the leading cloud providers. This combination of explosive innovation and intense competition will make the next several years a pivotal period for current and aspiring IT market leaders.”

Finally, we have the actual business workers, who are walking around in the reality of the consumer connected cloud of 2015.  They are inherently familiar with the concept of cloud, but it’s more magic than math, whether it’s Siri, Google Maps, Xbox Live, Netflix, or any of the other hundreds of online services they have grown to depend upon.  They’re frustrated that they don’t have ubiquitous access to their data.  They can manage all aspects of their lives from any device, anywhere, anytime… except their professional lives.  In some cases this frustration actually leads to them taking departmental budgets and invoking shadow IT to work around what they perceive to be the idiots running things (does this sound familiar to anyone?).

So what’s the problem here?

The user base is ready, and the executive team sees the market shift. Then why don’t those lazy admins just do it already?!

And this is where the house of cards comes crashing down, because what those lazy admins know is that they have to place a bet to do any of this.  The safe bet is to answer that they already have a cloud because they’re using virtualization, but that ignores the larger point.  The core problem impacting these decisions today is data gravity.  If the IT organization makes the wrong bet and invests in the wrong cloud, the move could be disruptive and costly, or it could fail altogether.  Or worse… the project could be wildly successful and, as a result, negatively impact the operational cost of doing business, due to concern over potential “cloud lock-in” and an ever-evolving competitive landscape.
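To make data gravity concrete, here is a quick back-of-the-envelope sketch of what it takes just to move a dataset back out of a cloud.  The dataset size, link speed, and per-GB egress rate below are illustrative assumptions, not quotes from any provider:

```python
# Rough illustration of data gravity: the time and money required to pull
# a dataset back OUT of a cloud you bet on. All figures are assumptions.

dataset_tb = 100      # assumed dataset size, in terabytes
link_gbps = 1.0       # assumed sustained WAN throughput, in gigabits/second
egress_per_gb = 0.09  # assumed egress fee, in $/GB (2015-era ballpark)

bits = dataset_tb * 1e12 * 8              # dataset size in bits
days = bits / (link_gbps * 1e9) / 86400   # time on the wire, in days
egress_cost = dataset_tb * 1000 * egress_per_gb

print(f"Transfer time: {days:.1f} days at {link_gbps} Gbit/s")
print(f"Egress fees:   ${egress_cost:,.0f}")
# -> roughly 9.3 days on the wire and $9,000 in fees, before you even
#    touch the re-platforming work that comes with changing ecosystems.
```

Multiply that by every workload you have moved in, and you can see why admins hesitate to make the bet in the first place.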

This is the source of the analysis paralysis.

I’ve had the privilege to watch from the front lines as this fight went from warm to surface-of-the-sun hot.  Today, unfortunately, it’s more confusing and fractured than ever.  Amazon, AT&T, Equinix, Google, Microsoft, Oracle, Orange, Salesforce.com, Rackspace, Verizon, and VMware are but a handful of the heavyweights in the Public/Hybrid Cloud space.  They all offer a variety of similar [yet different] solutions and services.

So what is an organization to do? How does IT continue to run the existing business while enabling the larger organization to grow their market and hire new people?  The Public Cloud advance is actually slowing them down, not speeding them up… but it doesn’t have to.

Almost a year ago, NetApp EVP of Product Operations George Kurian shared his view of this growing trend, coining his vision the “Data Fabric.”  The idea is simple: if you look outside IT today and analyze older, more mature industries, you discover that they were only able to truly expand once a fabric was established.  Telegraphs to telephones, ATM to the Internet… the key metric appears to have been connections, the ability to communicate with each other.  When an industry was able to create sufficient connections (think mesh networks), it was able to make massive advances.  The modern financial industry regularly dances on a razor blade and hardly ever gets cut.  Imagine trying to run Bank of America or ING Direct in a world where one had to ship gold on a boat to move currency.  That’s the analogy here: only once transactions could be shared in near real time through a network was the banking industry able to evolve and develop the financial system that fuels our modern society.  The same system that will hopefully send men and women to Mars in 2030!

Enter the NetApp Data Fabric

The idea here is simple.  If we can provide a mesh network at the storage layer between ALL of these platforms and ecosystems, then we can effectively remove the data gravity problem entirely!  I like to think of the Data Fabric as anti-gravity for the modern cloud-connected world.  Without data gravity concerns, it is no longer complicated or expensive to move from one ecosystem to another.  Sure, chances are you’re currently invested in virtual machines, and you’re probably looking at containers and rolling your eyes: “Oh man, we just finished our migration, not another one.”  But here’s the thing: what a Data Fabric enables is the ability to fail.  Fail fast, fail often, but fail safely; iterate through the tough problems.  Just try stuff out.  Go ahead and put a copy of your most mission-critical database in SoftLayer for a week.  Did it work?  No?  Oh well, move it back or trash it.  It’s your choice, and there’s no risk!
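As a concrete (if simplified) sketch of that “try it, then trash it” round trip, the script below drives ONTAP SnapMirror replication over SSH to stand up a disposable copy of a volume on a cloud-side system and tear it down afterward.  The hostnames, SVM and volume names are all hypothetical, and the exact snapmirror syntax varies by ONTAP release, so treat this as an illustration rather than a runbook:

```python
import subprocess

# Hypothetical endpoints: an on-prem cluster and a cloud-side ONTAP instance.
ONPREM = "admin@onprem-cluster"
CLOUD = "admin@cloud-ontap"
SRC = "svm_prod:crm_db"        # assumed source SVM:volume
DST = "svm_cloud:crm_db_copy"  # assumed destination SVM:volume

def ontap(host: str, command: str) -> None:
    """Run a single ONTAP CLI command on a cluster over SSH."""
    subprocess.run(["ssh", host, command], check=True)

# 1. Create and seed the mirror from the destination (cloud) side.
ontap(CLOUD, f"snapmirror create -source-path {SRC} -destination-path {DST} -type DP")
ontap(CLOUD, f"snapmirror initialize -destination-path {DST}")

# 2. Break the mirror so the cloud copy becomes writable for the experiment.
ontap(CLOUD, f"snapmirror break -destination-path {DST}")

# ... run the experiment against the cloud copy for a week ...

# 3. Didn't pan out? Tear down the relationship and the copy. No harm done.
ontap(CLOUD, f"snapmirror delete -destination-path {DST}")
ontap(CLOUD, "volume offline -vserver svm_cloud -volume crm_db_copy")
ontap(CLOUD, "volume delete -vserver svm_cloud -volume crm_db_copy")
```

The specific commands matter far less than the shape of the workflow: when the copy is this cheap to create and destroy, the experiment carries no gravity of its own.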

As Jamie Hyneman argued at NASA’s Innovative Advanced Concepts (NIAC) symposium last year:

“I submit that an important part of the scientific method should be to introduce a certain amount of nonlinearity to the process of the regular part of your investigations.”

… he told a room full of NIAC researchers, before going on to challenge them:

“In a metaphorical sense, running over to something you’ve spotted on the side of the path you’re on, poking it with a sharp stick, might just provide something extremely valuable. How many great discoveries have been made that were not what was being sought at the time?”

In my opinion, this speaks directly to where we are with Cloud today.  We shouldn’t be paralyzed by the thought of failure.  Sometimes you really do need to just poke something with a sharp stick and see what happens.  While my kids love watching Adam and Jamie blow stuff up on Mythbusters, I find the man fascinating.  From special effects master to quasi-TV scientist… somehow he ended up addressing some of the most intelligent people on the planet.  And more importantly, he had the insight to suggest that what they needed most was more error, or as they put it on the show, “Failure is Always an Option.”

This is the beauty of the Data Fabric!

It provides a framework that enables continuous experimentation and, as a result, very real advancements.  This fabric is not a product, but rather a manifesto at NetApp (at least from my POV).  It is driving roadmaps and development efforts, and as customers realize the freedom to experiment and begin to discover new and exciting ways to solve problems, NetApp in turn learns of new and complicated challenges unique to this operational model.  These problems require hard science to solve properly, and like most things in storage, that takes time.  Fortunately for NetApp’s customers, this is not a brand-new undertaking, but the end result of literally years of effort.  How do I know?  I was on the original team that eventually evolved into our Cloud Solutions Group at NetApp.  When I started writing this post, I had planned to lay it all out today… to explain how Cloud ONTAP connects to NetApp Private Storage… to explain how integrated infrastructures like FlexPod anchor it all in… maybe share some interesting use cases with SteelStore and StorageGRID Webscale… and maybe even dive a little into how flash plays in a pure hyperscaler environment.  But alas, my ramblings have carried on too long, so maybe next time.

I will leave you with this thought though:

The lesson here is that the industry is, without a doubt, in the middle of a transition.  I personally do not believe there will be a clear winner or loser when we emerge in three to five years.  One day we will look back on this period the way we now look back on “application silos” and “virtualization,” and we will discuss what we worked on during this time… the mistakes we made and the victories we won.  At least those of us who were in the game will.  Don’t let the need to make the perfect decision lead to no decision at all.  This is a truly exciting time.  Clouds are practically designed for transient projects!  Experimentation is in their bloodstream, and when that is enhanced by a data fabric, customers can reap the benefits of this unique attribute without worrying about the future.  Just get to work!  Poke it with a stick… see what happens… at the end of the day the Fabric ensures you’re not locked in.  You can change your mind without penalty, and you can move into and out of the cloud as you see fit.

In a nutshell … the Data Fabric offers you freedom of choice… and choice without penalty.

Glenn Sizemore
Glenn is an industry veteran who specializes in cloud and automation at NetApp. He has authored countless reference architectures and best practices for using Microsoft software with NetApp storage environments, and is a seasoned speaker at industry events around the world.
