Many years ago, Bank of America faced a data crisis driven by the explosive growth of check processing. By 1952, check usage had doubled: its branches were handling more than 21,000 checks per day, and the industry as a whole was processing some 8 billion instruments per year. Internal estimates pegged that figure to hit 20 billion by 1966, and the army of bookkeepers employed to record every check (with a high degree of accuracy) simply could not keep up. The banking industry moved to electronic recordkeeping, a shift that changed the way Americans did business. It was a sea change, and today that same industry handles a staggering 300 million transactions every 24 hours.
Such is the sea change facing today’s business community: not in checks, but in data.
One of the most challenging cost centers in today’s tech climate is the cost of maintaining information. From ETL (extract, transform, load) pipelines to short- and long-term storage, the cost of the hardware, middleware, and software needed to keep up with a continuously expanding data universe is daunting for many companies.
And it should be. Decisions made with 2012 in mind may be all but useless in a few short years.
In 1981, Microsoft CEO Bill Gates was famously quoted as saying that “640K (of memory) ought to be enough for anybody.” Like many good quotes, it is apocryphal; Gates never said it. But he did discuss the decision behind that limit. “I have to say that in 1981, making those decisions, I felt like I was providing enough freedom for 10 years,” Gates said in a 1989 speech. “A move from 64k to 640k felt like something that would last a great deal of time. Well, it didn’t – it took about only 6 years before people started to see that as a real problem.”
The technology industry has grown beyond all expectations in the shadow of a phrase every C-level executive should be familiar with—“Moore’s Law”, which states that the number of transistors on integrated circuits doubles approximately every two years. Memory grows, costs decrease, and more power flows to the end user. A cellular phone in 2012 has more processing power than the Apollo space capsule did a generation ago.
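As a rough illustration of how quickly that doubling compounds, consider a sketch of Moore’s Law as a simple formula. The 1981-to-2012 endpoints below are borrowed from the Gates anecdote above for illustration only; they are not a measurement of any particular chip.

```python
# Illustrative arithmetic only: Moore's Law modeled as a doubling of
# transistor counts approximately every two years.
def moores_law_growth(years: float) -> float:
    """Return the idealized transistor-count growth factor after `years`."""
    return 2 ** (years / 2)

# From 1981 to 2012 (31 years), counts grow roughly 46,000-fold
# under this idealized model.
factor = moores_law_growth(2012 - 1981)
print(round(factor))  # ~46,341
```

Even granting that the real curve is lumpier than the formula, the compounding is the point: a plan sized for today’s hardware is dwarfed within a decade.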
Such rapid change places particular stress on the capital investment in the tools needed to store an ever-increasing volume of data. The client-server architecture that so many companies have relied on since the dawn of the technology age is straining to stay afloat, even as the cost of data storage has plummeted. In 1980, storing one gigabyte of data cost $2 million. Today, it costs around four cents. The trouble, of course, is that businesses need a LOT of gigabytes, terabytes, and yes, petabytes (1,000 terabytes, or approximately 1 quadrillion bytes) to operate, and mere servers are a poor way of housing such enormous (and increasingly valuable) stores of data. Years ago, we had a saying in IT for server decisions: “Today’s investment, tomorrow’s door-stop.”
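A quick back-of-the-envelope sketch puts those figures in perspective, using the $2 million and four-cent per-gigabyte prices quoted above and the round 1 PB = 1,000,000 GB convention:

```python
# Back-of-the-envelope storage-cost arithmetic, using the figures in the text.
COST_PER_GB_1980 = 2_000_000.0   # dollars per gigabyte, as quoted above
COST_PER_GB_TODAY = 0.04         # dollars ("around four cents")
GB_PER_PETABYTE = 1_000_000      # 1 PB = 1,000 TB = 1,000,000 GB

# The per-gigabyte price has fallen by a factor of fifty million...
decline_factor = COST_PER_GB_1980 / COST_PER_GB_TODAY

# ...yet storing a single petabyte at today's price still costs real money.
petabyte_cost_today = COST_PER_GB_TODAY * GB_PER_PETABYTE

print(f"price decline: {decline_factor:,.0f}x")       # 50,000,000x
print(f"1 PB today:    ${petabyte_cost_today:,.0f}")  # $40,000
```

The asymmetry is the argument: unit prices collapse, but demand grows faster, so the total bill (and the servers holding it all) keeps climbing.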
Cloud computing and cloud storage are, for many clients, a means of achieving cost-efficient data storage without the capital expense of investing in physical servers that will never keep pace with future demand. To other clients, the concept of the “cloud” is, well, cloudy.
Is it secure? Is it safe? What if it isn’t there anymore? Is it economical in the short and in the long run?
These are all legitimate questions, and there isn’t always one answer. But they are questions that must be asked by companies straining to meet the growing needs of a customer base that not only demands access to data but expects it. Clockwork Technology has been “in the cloud” for a number of years now, and while we don’t run the cloud, we can provide the resources necessary to introduce you to the products and services that do.
Where would the banking industry be if it manually handled each check, each ATM transaction, each online purchase? Where will your business be if it relies on yesterday’s technology to solve tomorrow’s challenges?