I am still surprised that so many of the DCIM (proper) solutions on the market today focus on only a small portion of the DCIM promise: the eye-candy portion. Drawing floor plans, standing up pretty high-fidelity racks, and simple monitoring of IT gear. It looks great at trade shows, and in a vacuum of nothing else, it feels remarkable. It’s true that some DCIM solutions also include connectivity mapping of power, storage and networks, and a few do a really good job of including workflow for operations, provisioning, etc. All good….
All good? Hmm, methinks NOT! This ‘read-only’ view of the data center is only HALF of the value of DCIM. I dare say it is the simplest HALF. This first half of DCIM value is the toolset that enables and leverages the humans involved: it directly supports daily operations, capacity planning and the like. It’s the part of DCIM that HAS been done manually for years; the current crop of DCIM solutions simply makes it easier to do more effectively. The first half of DCIM is a noble goal to be sure, and DCIM will dramatically affect and rewrite the most common of these best practices. Want to know how much power a row is consuming, where a cable is connected, or where to place 4 new servers? A few clicks and away you go. DCIM circa 2012, from the companies that live and die by DCIM, is impressive to be sure.
BUT…. There is a whole new dynamic world at the end of the tunnel, and the second half of DCIM will address it: the part of DCIM that can NOT be done by any other means. Here in 2012, DCIM vendors are consumed with their own engineering gymnastics to meet today’s needs. Yet the forward-looking dynamic vision is a huge portion of the long-term value of DCIM: DYNAMIC resource allocation. I am quite sure that the most successful DCIM solutions of 2017 (or, if we are lucky, 2016) will derive at least HALF of their value from an orchestration layer that understands ITSM/ITAM, BMS and DCIM knowledge and uses a deep set of policy-based rules to TAKE ACTION. The DCIM2017 solution will require asset descriptions, locations and real-time monitoring as a prerequisite, but the biggest value to its customers will be the inclusion of an engine that creates and processes very granular, extensive rules which dynamically adjust online resources minute by minute. Call this the ‘resource tuning engine’. DCIM2017 vendors will have varying degrees of expertise creating the engine AND the rules, and I am sure they will continue to refine the creative ways they interpret and react to observed conditions. They will probably release periodic ‘rules pack’ updates to their software. This will become a major differentiator for DCIM2017 vendors. Remember, the major data center systems that can be studied and controlled include ALL of them: power, servers, storage, security, virtualization, network, cooling, etc.
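To make the idea concrete, here is a minimal sketch of what such a ‘resource tuning engine’ might look like: policy rules, each a condition tested against observed conditions plus an action to take when it fires. Every name here (`TuningEngine`, `Rule`, the metric keys and actions) is hypothetical illustration, not any vendor’s actual API.

```python
# Hypothetical sketch of a policy-based "resource tuning engine".
from dataclasses import dataclass, field
from typing import Callable, Dict, List

# Observed conditions, e.g. {"cpu_util": 0.31, "row3_kw": 42.0}
Conditions = Dict[str, float]

@dataclass
class Rule:
    name: str
    condition: Callable[[Conditions], bool]  # when does this rule fire?
    action: Callable[[Conditions], str]      # what to do (returns an action log)

@dataclass
class TuningEngine:
    rules: List[Rule] = field(default_factory=list)

    def evaluate(self, observed: Conditions) -> List[str]:
        """Fire every rule whose condition holds; collect the resulting actions."""
        return [r.action(observed) for r in self.rules if r.condition(observed)]

# Two illustrative rules: shed capacity when idle, add it when busy.
engine = TuningEngine(rules=[
    Rule("spin_down_idle",
         condition=lambda c: c["cpu_util"] < 0.20,
         action=lambda c: "power off 2 spare servers; trim matching cooling"),
    Rule("add_capacity",
         condition=lambda c: c["cpu_util"] > 0.80,
         action=lambda c: "power on 2 standby servers; stage extra cooling"),
])

print(engine.evaluate({"cpu_util": 0.15}))  # only spin_down_idle fires
```

A vendor’s periodic ‘rules pack’ would, in this sketch, simply be a new list of `Rule` objects shipped to the engine.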
Top down, in this energy-aware era of IT and the data center, the BIG requirement for IT and Facilities alike is to have just the right amount of processing capability (and all supporting services) online and available at every point in time. No more, no less. When demand trends up, turn on some number of servers to provide that level of processing, along with just enough cooling for the new source of heat. Conversely, when the demand goes away, make the servers and cooling go away with it.
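That demand-matching logic can be sketched in a few lines. The function and its parameters (`rps_per_server`, `cooling_kw_per_server`) are invented for illustration; real sizing would come from measured workload and thermal data, not constants.

```python
# Hypothetical sketch: size online servers to demand, and scale cooling with them.
import math

def plan_capacity(demand_rps: float,
                  rps_per_server: float = 500.0,       # assumed per-server throughput
                  cooling_kw_per_server: float = 0.4   # assumed cooling per server
                  ) -> dict:
    """Return the minimal server count for current demand, plus matched cooling."""
    servers = max(1, math.ceil(demand_rps / rps_per_server))  # keep at least one online
    return {"servers_online": servers,
            "cooling_kw": round(servers * cooling_kw_per_server, 2)}

# Demand trends up: more servers come online, cooling follows the new heat load.
print(plan_capacity(2600))   # {'servers_online': 6, 'cooling_kw': 2.4}
# Demand goes away: capacity and cooling shrink with it.
print(plan_capacity(100))    # {'servers_online': 1, 'cooling_kw': 0.4}
```

The point is the coupling: cooling is derived from the server count, so shedding servers automatically sheds the cooling that served them.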
THAT, my friends, is the big promise of DCIM2017. Today’s DCIM solutions ARE indeed one of the first steps to get ‘there’, but we all need to think farther into the future, put on our visionary hats and articulate where ‘there’ is!