
    DCIM in a Rapidly Changing Data Ecosystem

    By now it is clear that as data environments continue to push the boundaries of both scale and abstraction, a high degree of automation is necessary to keep everything running smoothly.

    And it is a testament to the IT industry’s forward thinkers that this automation is being applied to the facilities side of the house along with the actual data infrastructure; hence the rise of data center infrastructure management (DCIM) as a vibrant movement within the enterprise community.

    Of course, applying this level of automation to real-world deployments is where the consensus starts to break down. With such a wide variety of systems, architectures, hardware/software configurations and other parameters in play, it’s no wonder that DCIM solutions are all over the map, both in functionality and in the very elements they address in pursuit of data/power equilibrium.

    According to Schneider Electric’s Joe Reele in a recent Data Center Journal article, a typical data center may or may not employ any number of management components, such as a building management system (BMS), a computerized maintenance management system (CMMS), a branch circuit monitoring system (BCMS), and an IT asset management/workload placement system. The trick with DCIM is to bring all of these operations under one yoke so that both facility and data resources can be configured and deployed with a high degree of coordination. Getting to this state of holistic management, however, is easier said than done, and it most certainly will not happen through a full-scale rip-and-replace of legacy management infrastructure. Rather, he says, greater reliance on open protocols and a renewed focus on key metrics like risk avoidance, capex/opex reduction, space management and improved dev/ops should be the driving factors.
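
    To make the idea concrete, here is a minimal Python sketch of that unified view. Everything in it is a hypothetical stand-in rather than any vendor’s API: the field names and the 8 kW rack power budget are illustrative, and in practice the readings would arrive over open protocols such as SNMP, Modbus or a REST interface.

        from dataclasses import dataclass

        @dataclass
        class RackReading:
            """One rack's view, stitched together from separate systems."""
            rack_id: str
            power_kw: float       # from the branch circuit monitoring system (BCMS)
            cooling_kw: float     # from the building management system (BMS)
            u_slots_used: int     # from the IT asset management system
            u_slots_total: int

        def capacity_report(readings, rack_power_budget_kw=8.0):
            """Fold facility and IT data into a single per-rack report:
            power headroom alongside space utilization."""
            for r in readings:
                headroom = rack_power_budget_kw - r.power_kw
                space_pct = 100 * r.u_slots_used / r.u_slots_total
                print(f"{r.rack_id}: {headroom:.1f} kW power headroom, "
                      f"{space_pct:.0f}% of rack units occupied")

        if __name__ == "__main__":
            capacity_report([
                RackReading("rack-a01", power_kw=6.2, cooling_kw=2.1,
                            u_slots_used=30, u_slots_total=42),
                RackReading("rack-a02", power_kw=3.4, cooling_kw=1.3,
                            u_slots_used=12, u_slots_total=42),
            ])

    The point is not the arithmetic but the consolidation: once BMS, BCMS and asset data land in one structure, metrics like space management and capacity headroom fall out of a single query rather than three separate consoles.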

    At the same time, beware of solutions that are heavy on core vendor strengths rather than full DCIM functionality, says Raritan’s Paula Alves. A full suite should include not only asset management and capacity planning, but also change management with automated import capabilities, energy monitoring and management, and robust reporting and search tools. Keep in mind, too, the wide variety of soft considerations when selecting a system, including vendor experience, integration and service offerings, and user support.

    Some vendors, however, recognize that one solution will not be right for all enterprises and are forming coalitions to bring disparate talents together. HP, for example, recently launched a new set of consulting services, workshops and design/implementation guidelines to help enterprise executives craft DCIM solutions appropriate to their needs. The company has teamed up with a number of vendors, including Schneider Electric, Emerson Network Power, Nlyte and Siemens to compile working systems, although it stresses that the goal is to remain vendor-neutral.

    Still, the problem remains that most DCIM solutions are geared toward today’s largely static IT infrastructure, not the software-defined architectures that are right around the corner, says Power Assure CTO Clemens Pfeiffer. While you can’t shift power loads in software like you can with IT resources, it is possible to dynamically alter power consumption of individual components based on workload requirements. A key application for this approach would be in automated disaster recovery, which would benefit greatly from power and utility intelligence and the ability to manage application service levels in relation to energy costs. The next step in data/facilities management, then, is to integrate DCIM with application monitoring and other elements of the software-defined stack(s) currently under development.
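
    The following Python sketch illustrates that kind of control loop. It is hypothetical: set_power_cap() stands in for whatever capping mechanism the hardware actually exposes (IPMI-style power limiting is one real-world analogue), and the utilization and energy-price thresholds are purely illustrative.

        def choose_cap_watts(cpu_util, energy_price, base_cap=400, floor_cap=200):
            """Loosen the cap when the workload needs headroom; tighten it
            when utilization is low or energy is expensive."""
            if cpu_util > 0.8:
                return base_cap                    # protect service levels first
            cap = floor_cap + (base_cap - floor_cap) * cpu_util
            if energy_price > 0.20:                # $/kWh, illustrative threshold
                cap *= 0.9                         # shave consumption at peak rates
            return int(cap)

        def set_power_cap(server_id, watts):
            # Stand-in for a vendor power-capping call; prints instead of acting.
            print(f"{server_id}: cap set to {watts} W")

        if __name__ == "__main__":
            # One pass of the loop for two servers with different load profiles.
            for server, util in [("db-01", 0.92), ("batch-07", 0.25)]:
                set_power_cap(server, choose_cap_watts(util, energy_price=0.24))

    Tie a loop like this to application monitoring and utility pricing feeds, and the service-level-versus-energy-cost tradeoff Pfeiffer describes becomes something a scheduler can act on rather than a report a human reads after the fact.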

    DCIM is a valuable tool for the enterprise as it seeks greater control over both data operations and the resources needed to maintain them. But it is not a cure for everything that ails the data center. Organizations looking to implement the technology on a broad scale will need to draw on a wide range of specialized knowledge from vendors, independent experts and in-house talent, with an eye toward devising a solution that meets not only current needs but also those of a rapidly evolving, and still largely undefined, data ecosystem.

    Arthur Cole
    With more than 20 years of experience in technology journalism, Arthur has written on the rise of everything from the first digital video editing platforms to virtualization, advanced cloud architectures and the Internet of Things. He is a regular contributor to IT Business Edge and Enterprise Networking Planet and provides blog posts and other web content to numerous company web sites in the high-tech and data communications industries.
