I expect to see continued exponential growth in the use and emulation of internet data centres, the likes of Amazon, Rackspace, Memset or Google: tightly packed racks operating at very high efficiencies, running cloud computing services. The key difference in approach will be the widespread adoption of commodity hardware to deliver enterprise-quality services by moving the intelligence into software. Examples include distributed applications running on virtual machines, many cheap nodes crunching big data, and more object storage and non-relational databases. Storage is especially exciting. I believe the days of "big iron" vendors, RAID5/6 and tape are numbered. Our enormously resilient, distributed storage system, built on commodity tin, costs us less than £20 a terabyte per month. By layering media types, such as SATA disks, SSDs and DRAM, and mobilising tools including Automated Intelligence's Datapoint, you can have your cake (cheap storage, with low latency for critical data) and eat it.
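The tiering idea above can be sketched in a few lines. This is a minimal, hypothetical illustration, not Memset's actual system: a small fast tier (standing in for DRAM or SSD) sits in front of a large cheap tier (standing in for SATA), hot objects are promoted on read, and the least recently used entry is evicted when the fast tier fills up.

```python
from collections import OrderedDict

class TieredStore:
    """Illustrative two-tier store: small fast cache over a large cheap backing tier."""

    def __init__(self, cache_capacity=2):
        self.cache = OrderedDict()           # fast tier (limited capacity, LRU order)
        self.backing = {}                    # cheap, large backing tier
        self.cache_capacity = cache_capacity

    def put(self, key, value):
        # New data lands on the cheap tier first.
        self.backing[key] = value

    def get(self, key):
        if key in self.cache:                # cache hit: refresh recency
            self.cache.move_to_end(key)
            return self.cache[key]
        value = self.backing[key]            # cache miss: read the slow tier
        self.cache[key] = value              # promote the hot object
        if len(self.cache) > self.cache_capacity:
            self.cache.popitem(last=False)   # evict the least recently used entry
        return value
```

The design choice that matters here is the same one the paragraph describes: capacity lives on the cheapest medium, while latency-sensitive (frequently read) data migrates automatically to the faster layer.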
A CIO's infrastructure decisions will focus more on leading the business than simply aligning with it. Technologies such as unified communications, virtualisation and cloud computing will be adopted further to gain competitive advantage, while security and risk concerns must be mitigated. This demands an agile and flexible IT model, which traditional infrastructure cannot deliver; the mismatch has left IT departments struggling with daily firefighting. To succeed, IT administrators will need a new breed of infrastructure that lets them focus on delivering, optimising and managing applications without worrying about the infrastructure that supports them. Consequently, standardised, pre-integrated and pre-validated converged infrastructures will gain even more traction in the industry. This represents a dramatic shift not only in IT infrastructure itself, but in the way IT is approached, managed, deployed and viewed by the application owners and the business it supports.
There can be no questioning that big data is a disruptive market force. The massive influx of data hitting organisations of all shapes and sizes means traditional IT infrastructure is becoming increasingly obsolete. Big data is the intensive analysis of large, complex, disparate or unstructured data sets to produce actionable results in real time. For many, attempting this with on-premise infrastructure will almost certainly lead to failure, as few organisations possess the necessary servers or compute clusters. Simply put, executing big data analytics requires suitable infrastructure to underpin it: a massive amount of computing power to take all of an organisation's data, wherever it is stored, and analyse it for valuable insights. For most, that leaves cloud computing as the most attractive option. Access to potentially limitless scalability through Infrastructure-as-a-Service makes big data a possibility for one and all. As such, the evolution of IT infrastructure seems likely to continue moving towards outsourcing.