Using SkySpark as Part of a Fully Integrated Data Center Backbone

First published in The Data Center Issue of SkyFoundry Insider, April, 2016

[Image: InfoMart data center]

Controlco began its first implementation at a major data center in the Pacific Northwest four years ago. Recently, the integration and building automation services company was brought on to complete a second implementation at the facility, this time for one specific data center client. At the outset of this client’s 8-megawatt project, which will go live later this year, Controlco was tasked with integrating the company’s building management system into the IP backbone and providing a comprehensive operator information interface, all while communicating with the data center’s BMS.

Data centers require tightly controlled environments, and each client’s data processing needs call for a unique set of data collection and monitoring rules. This particular client’s focus has been on transitioning from retail co-location capacity to wholesale, and on adopting up-and-coming management methodologies such as 400-volt distribution and cabinet-level heat rejection. The company is also doubling cabinet densities compared with its past data centers.

The client will use three rooms of the 240,000 sq. ft. facility. In those rooms, Controlco has networked 352 pieces of equipment comprising 12,800 project points, each of which generates data. Controlco is using both Niagara and SkySpark for data collection and analytics, along with a client-facing DGLux interface, creating a unique combination of information analysis, presentation, and alarm management capabilities that reduces the client’s need to learn, manage, and maintain multiple systems.


To determine setpoints in air handling units (AHUs), for example, engineers have identified a target zone temperature for all points on a given unit. SkySpark gathers the data values for those points and compares them to the setpoint temperature over specific threshold periods of time. A Spark is generated if the zone temperature recorded in the data is above or below the desired target setpoint and is not corrected by the AHU within the time threshold. This ability to monitor equipment function over time lets the project manager know exactly where to look when a problem is identified, rather than manually checking every AHU or starting from a whole room full of equipment.
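In production, SkySpark rules of this kind are written in its Axon language; purely as an illustration of the logic described above, the threshold check can be sketched in Python. The function name, the `deadband` parameter, and all sample values below are hypothetical, not part of Controlco's actual implementation:

```python
from datetime import timedelta

def find_sparks(samples, setpoint, deadband, threshold):
    """Flag periods where zone temperature stays outside the target band
    longer than the allowed threshold (a 'Spark' in SkySpark terms).

    samples:   time-ordered list of (timestamp, temperature) tuples
    setpoint:  target zone temperature
    deadband:  allowed deviation above/below setpoint (assumed parameter)
    threshold: timedelta the deviation must persist before flagging
    """
    sparks = []
    fault_start = None
    for ts, temp in samples:
        out_of_band = abs(temp - setpoint) > deadband
        if out_of_band and fault_start is None:
            fault_start = ts          # deviation begins
        elif not out_of_band:
            fault_start = None        # AHU corrected in time; reset
        if fault_start is not None and ts - fault_start >= threshold:
            sparks.append((fault_start, ts))
            fault_start = None        # record once, then re-arm
    return sparks
```

A deviation that the AHU corrects before the threshold elapses resets the timer and never produces a Spark, which matches the article's description of flagging only uncorrected excursions.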

Controlco engineers developed additional layers of analytics and alerting that let the check interval be customized from once per day down to every 10 minutes, giving the client further knowledge and control of its sensitive data center environment. Controlco has also associated cost factors with Sparks, which is helpful for a number of applications.
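Associating a cost factor with each Spark might look like the following sketch, where a per-hour penalty rate is configured per rule and multiplied by each Spark's duration. The rule names and dollar rates are invented for illustration; the article does not disclose the actual figures:

```python
# Hypothetical per-hour penalty rates ($/hour of fault condition) keyed
# by the analytics rule that fired -- illustrative values only.
SPARK_COST_RATES = {
    "ahu_temp_deviation": 4.50,
    "chw_valve_hunting": 2.25,
}

def spark_cost(rule_name, duration_hours):
    """Estimated cost of one Spark: fault duration times the per-hour
    penalty rate configured for the rule that generated it."""
    return SPARK_COST_RATES[rule_name] * duration_hours

# Total estimated cost of three AHU temperature-deviation Sparks
# lasting 1.5, 0.5, and 2.0 hours respectively.
total = sum(spark_cost("ahu_temp_deviation", h) for h in [1.5, 0.5, 2.0])
```

Summing these per-Spark costs over a reporting period is one way such cost factors could feed the cost-savings calculations against KPIs mentioned below.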

The processing power of this combined solution allows more data to be analyzed, making the analysis more accurate. After the project goes online, the client can begin viewing Spark patterns and historic data charts, measuring yearly averages, and calculating actual cost savings against key performance indicators (KPIs). Among the client’s goals in moving its data center to the Oregon facility was to increase energy efficiency through greater access to renewable sources. Since many inefficiencies can also be spotted through automated data analytics and addressed more quickly through this enhanced data-driven solution, Controlco, Niagara, and SkySpark offer the client another way to achieve its energy efficiency goals.
