| | Tier I | Tier II | Tier III | Tier IV |
|---|---|---|---|---|
| Capacity Components | N | N+1 | N+1 | N after any failure |
| Distribution Paths | 1 | 1 | 1 Active and 1 Alternate | 2 Simultaneously Active |
| Concurrently Maintainable | No | No | Yes | Yes |
| Fault Tolerance (Single Event) | No | No | No | Yes |
| Compartmentalization | No | No | No | Yes |
Definitions
Capacity Components: The capacity of HVAC and power components relative to the design demand of the facility. N indicates that utility systems provide only the design capacity, so the failure of any component leaves the center unable to continue operating at capacity. N+1 indicates that each system has one extra component that can be activated to restore full capacity in the event of the failure of any one piece of equipment. N after any failure indicates that redundant systems are actively running at all times, so full design capacity remains available without the activation of failover components.
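The difference between these levels can be illustrated with a minimal sketch (Python, with purely illustrative unit counts and capacities, not ATLDC's actual equipment inventory):

```python
# Illustrative comparison of N, N+1, and "N after any failure".
# The key distinction: under N+1 the spare must be activated after a
# failure; under "N after any failure" every unit is already running.

DESIGN_DEMAND_KVA = 300.0  # assumed design demand
UNIT_KVA = 100.0           # assumed capacity per component

def surviving_capacity(active_units: int) -> float:
    """Capacity still online the instant one active unit fails."""
    return (active_units - 1) * UNIT_KVA

configs = {
    "N":                   {"active": 3, "standby": 0},
    "N+1":                 {"active": 3, "standby": 1},  # spare awaits activation
    "N after any failure": {"active": 4, "standby": 0},  # all units running
}

for label, cfg in configs.items():
    online = surviving_capacity(cfg["active"])
    if online >= DESIGN_DEMAND_KVA:
        note = "full capacity with no failover action"
    elif cfg["standby"] > 0:
        note = "short until the standby unit is activated"
    else:
        note = "cannot continue operation at capacity"
    print(f"{label}: {online:.0f} kVA online after one failure -> {note}")
```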
Distribution Paths: The provision of electrical feed systems, including UPS, Bypass and panels, and their connection to servers and equipment.
Concurrently Maintainable: Equipment components can be taken ‘off line’ for regular maintenance without impacting the ability of the Data Center to operate at capacity.
Fault Tolerance: The ability of Equipment and Distribution systems to continue operating in the event of the failure of any component, such that the Data Center is able to continue operation through said failure.
Compartmentalization: The data center is divided based upon function and location, with unique access controlled by fully logged biometric security.
Analysis
Tier Rating
Capacity Components: The ATLDC center is designed to the N after any failure standard (Tier IV), covering HVAC, UPS, Power Distribution and Network.
UPS: Two active 160 kVA Powerware 4X4 UPS units provide a true A/B power feed. Both manual maintenance and automated bypass are provided separately for each unit. Two 250 kVA transformers sit downstream on the A and B distribution paths. Panel systems are unique to A and B with no shared components. Each UPS carries 8 minutes of full-capacity battery operation, and the Automatic Transfer Switch is rated to transfer from utility feed to generator in no more than five seconds.
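A quick back-of-the-envelope check, using only the two figures quoted above, shows how much margin the battery ride-through leaves over the rated transfer window:

```python
# Battery ride-through vs. ATS transfer window, per the figures above.
BATTERY_RUNTIME_S = 8 * 60  # 8 minutes of full-capacity battery operation
ATS_TRANSFER_S = 5          # rated utility-to-generator transfer time

margin = BATTERY_RUNTIME_S / ATS_TRANSFER_S
print(f"Battery runtime covers {margin:.0f}x the rated ATS transfer window.")
# -> 96x
```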
HVAC: The facility has four active CRAC units fed by two independent Glycol distribution systems, including two full-capacity circulation pumps always in operation and redundant Dry Coolers. All systems comply with ASHRAE’s “Thermal Guidelines for Data Processing Environments”. All CRACs are always in operation; should any individual unit fail or be taken offline for maintenance, the remaining units have sufficient capacity to continue to meet design parameters.
Distribution Paths: (Tier III) True diverse path feeds (A/B) are provided to all server and network locations. These paths are simultaneously in operation (Tier IV). Distribution paths for each feed share no components including breaker panels, cabling or connectors. All components are in compliance with NEMA standards and have been tested accordingly. Distribution paths to Cooling System components require manual reroute in the event of Distribution Path component failure.
Concurrently Maintainable: Equipment components can be taken ‘off line’ for regular maintenance without impacting the ability of the Data Center to operate at capacity. Equipment is maintained on a regularly scheduled basis by factory certified personnel. All maintenance activities can be conducted without affecting Data Center Operations due to the N after any failure design capacity for Electrical and HVAC components.
Fault Tolerance: Equipment and Distribution Paths are completely fault tolerant (Tier IV), allowing for continued operation in the event of component failure. The exception is the Generator System, which is subject to disruption in case of component failure (Tier III). However, the UPS system is designed to continue operation on battery until the component can be replaced or repaired.
Compartmentalization: Three distinct Data Center environments are provided with varying purpose and access. The 1740 and 1765 centers are equipped with biometric fingerprint access controls which keep a complete log of entry on an individual person basis. Logs are maintained in perpetuity. Surveillance Systems are in place including Closed Circuit, Infrared and Motion Detection. A Log of system events is maintained and surveillance video is stored on an ongoing basis.
Summation
Though most parameters of Atlanta Data Center’s design and operational capacities meet Tier IV requirements, the Uptime Institute specifies that a Data Center’s rating is that of its lowest rated component. Given this definition, Atlanta Data Center qualifies for a Tier III rating.
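The lowest-rated-component rule reduces to a simple minimum. A minimal sketch, with component ratings as characterized in the Analysis above:

```python
# Overall Tier = minimum Tier of any component (Uptime Institute rule).
component_tiers = {
    "Capacity Components": 4,   # N after any failure
    "Distribution Paths": 3,    # cooling paths require manual reroute
    "Fault Tolerance": 3,       # generator system
    "Compartmentalization": 4,
}

print(f"Overall rating: Tier {min(component_tiers.values())}")  # Tier 3
```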
Sustainability Concepts (A, B, C)
The Center is designed and laid out in accordance with TIA-942 (Telecommunications Industry Association) standards. These standards stress quality, maintainability and operational efficiency. Likewise, A-C ratings are tied to the concept of Operational Sustainability (the ability to continue operations over a long period of time without interruption).
| Rating | Description |
|---|---|
| A | Excellent level of Operational Sustainability demonstrated at the site. High levels of availability can be expected for long periods of time. |
| B | Acceptable level of Operational Sustainability demonstrated at the site. High levels of availability can be expected, but for shorter periods of time. |
| C | Minimal level of Operational Sustainability demonstrated at the site. High levels of availability may be achieved, but are not likely to be sustained over long periods of time. |
Operational Sustainability is measured in five categories:
Definitions
Site Selection: Geographical location affects multiple aspects of Data Center dependability. Considerations include:
- Possibility/Probability of natural disasters
- Diversity of Utility Feeds
- Provisions for Expansion
- Physical Security
- Qualifications of Labor Pool
- Qualifications of Local Contractors
Building Characteristics: Building and Center physical attributes also play a significant part in determining the sustainability of operations. Considerations include:
- Local Codes and Standards
- Disaster Recovery
- Utility Subsystems
- Support Spaces
- Design Intent and Verification
Fitness for Use: The Data Center itself must provide for effective operations. Considerations include:
- Flexibility for accommodating specific needs
- Robust redundancy
- Consistency of Design
- Ease of Maintenance
- Use of Proven Technology
Investment Effectiveness: The Data Center must be able to adapt its operations to changes in industry requirements. Considerations include:
- Ability to evolve without abandoning infrastructure
- Energy Efficiency
- Green (environmentally sound) considerations
Management Operations: Data Center operating personnel are a key consideration in sustaining operations. Considerations include:
- Integrated management
- Competency and experience
- Maintenance and Performance Metric Monitoring
Analysis
Sustainability Considerations
Site Selection:
Possibility/Probability of natural disasters: The Data Center is located in Atlanta, Georgia, USA. The Atlanta area is far inland, away from severe tropical weather, and is a geologically stable environment. The area is metropolitan in nature and not susceptible to flooding or extended fire outbreaks.
Diversity of Utility Feeds: The facility is on the ‘Grady’ power grid which is fed by three Southern Company Power Stations. These three transformer stations are fed by plants Yates, McDonough and Savannah River. All utilities in the area are fed underground to minimize exposure to outside environmental factors.
Provisions for Expansion: ATLDC’s Data Center comprises some 10,000 square feet, of which approximately 4,000 square feet is currently occupied. In addition, ATLDC has access to a further 12,000 square feet of undeveloped space. ATLDC holds both room rights and roof rights, which would allow for substantial expansion of HVAC capacity, and has reserved 192 kVA of additional electrical capacity.
Physical Security: The Data Center facility provides secure access in accordance with industry recommendations and PCI requirements. A security guard is on premises conducting access control and rounds at all times. Data Center access is biometric, restricting entry to specified approved personnel, and each biometric access is electronically logged and stored. Internal Data Center and external building access points are equipped with video surveillance. Data Center surveillance includes both video and motion detection, with monitoring available to the Customer via the Internet. Cabinets are 42U, HP Quick Rail compatible, ventilated on both front and back, with secure, uniquely locked doors.
Qualifications of Labor Pool: Atlanta is considered one of the most ‘connected’ cities in the United States and is a center of technical expertise. An ample supply of technical IT personnel is available in the area on both a contract and an employment basis.
Qualifications of Local Contractors: Atlanta is a substantial city with a history of construction expansion. Multiple contractors with skilled journeymen are available for expansion work or in case of emergency. ATLDC retains contractors on a 24/7 on-call basis for all critical systems.
Building Characteristics:
Local Codes and Standards: Construction in Atlanta is governed by the US Southern Standard Building Code and the Atlanta City Code. These codes are in line with construction codes across the country and provide for safe provision of facilities. Construction of the Data Center was conducted in compliance with ICC, BOCA, ICBO, and SBCCI standards adopted in 1997 and all addenda.
Disaster Recovery: Atlanta is the Southeast location for US Federal Courts and Offices and is the state capital of Georgia. As a result the city has extensive Federal, State and Local disaster recovery plans, programs and personnel.
Utility Subsystems:
POWER:
Power to Customer’s Cabinets consists of at least two true A/B Simultaneously Active Distribution Paths, each completely diverse from the data center primary switchgear through UPS, Transformer and Power Distribution Unit. Data Center power distribution provides for Remote Monitoring over the Internet and Power Management. The RPP/PDU has the capacity to feed 208 volt 3-phase power to Customer’s solution. Distribution is configured by diverse path in a tray or wireway environment, providing for short-notice addition of power feeds to Customer’s Cabinets if needed. Facility design provides for an average of 3 kW per cabinet. The facility UPS is N+1 in a 4X4 configuration with separate and independent 4X2 transformers, allowing for both Internal and Full Failure External System Bypass. Utility feeds contain an ATS to provide for automatic Generator activation.
The Data Center is fed by its own power feed, independent of building feeds, and has its own full-capacity Natural Gas generator. Circuit loads are measured and monitored independently, and power usage may be viewed by the customer remotely at any time.
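As a hedged sketch of what that remote circuit-load view could look like programmatically: the endpoint, JSON shape and field names below are illustrative assumptions, not a documented ATLDC interface; only the 3 kW-per-cabinet design average comes from the text above.

```python
# Hypothetical per-cabinet load check against the 3 kW design average.
import json
import urllib.request

DESIGN_KW_PER_CABINET = 3.0  # facility design average quoted above

def fetch_circuit_loads(url: str) -> list[dict]:
    """Fetch per-cabinet readings, e.g. [{"cabinet": "A-12", "kw": 2.4}, ...]."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

def flag_overloads(circuits: list[dict]) -> None:
    for c in circuits:
        if c["kw"] > DESIGN_KW_PER_CABINET:
            print(f"Cabinet {c['cabinet']}: {c['kw']} kW exceeds the 3 kW design average")

# Sample readings used in place of a live fetch:
flag_overloads([{"cabinet": "A-12", "kw": 2.4}, {"cabinet": "B-07", "kw": 3.6}])
```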
CLIMATE CONTROL:
The Data Center is cooled by a raised-floor air distribution system which provides complete distributed air flow regardless of any individual CRAC failure or maintenance downtime. The Center is based upon a Hot/Cold Aisle design, providing even air distribution with minimal eddies and hot spots. All Condenser and/or Chilled Water is routed below or external to the Data Center environment, preventing water condensation or leakage from affecting computer room equipment. All CRAC units are professionally maintained by factory-certified personnel and are covered by a scheduled periodic preventative maintenance program. CRAC units provide no greater than a 68°F discharge air temperature. Air flow temperatures are constantly monitored and can be viewed by the Customer externally via the Internet.
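A minimal sketch of the discharge-temperature check implied above; the unit names and hard-coded readings are illustrative assumptions, while the 68°F limit comes from the text:

```python
# Check CRAC discharge air against the 68°F maximum quoted above.
MAX_DISCHARGE_F = 68.0

# In practice these readings would come from the facility monitoring feed;
# hard-coded here for illustration.
crac_discharge_f = {"CRAC-1": 64.5, "CRAC-2": 65.1, "CRAC-3": 69.2, "CRAC-4": 63.8}

for unit, temp_f in crac_discharge_f.items():
    if temp_f > MAX_DISCHARGE_F:
        print(f"ALERT: {unit} discharge {temp_f}°F exceeds the {MAX_DISCHARGE_F}°F limit")
```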
FIRE SUPPRESSION:
The fire suppression system is Dry Chemical Halon, FM-200 or equivalent complying with NFPA and UA requirements. Fire suppression annunciation panel alarms are directly connected to First Responders requiring no human action for notification. Systems are third party inspected by NFPA certified technicians at least annually.
Support Spaces: ATLDC has some 400 square feet of storage space and 200 square feet of equipment workshop, entirely separate from the Data Center footprint. This allows all storage, supply and maintenance work to take place outside of the Data Center environment.
Fitness for Use:
Flexibility for accommodating specific needs: With a raised-floor utility design and all power and data cabling run in subfloor tray, the Data Center can easily change its layout and power distribution systems to accommodate specific customer needs or changes in operational styles or technology.
Robust redundancy: The Data Center is Multi-Homed with three independent, industry-recognized Tier 1 providers supplied through fully Diverse Paths from the Data Center to each Bandwidth Provider’s remote switches, such that damage to one feed shall not affect or damage the others. The Data Center has Multiple Edges, each independently receiving different provider connections, so that failure of any provider or any Edge Device shall not affect network performance. Edge Routers and Switches provide Simultaneous Processing within the LAN, as opposed to an inactive failover structure. Feeds are diverse and cross-connected in such a manner that the LAN has no single point of failure from the Internet Cloud to the Customer’s Switches/Firewalls.
Consistency of Design: The design and construction of the Data Center is consistent across all aspects of performance. All systems are designed with a minimum of a Tier III rating and many with Tier IV performance.
Ease of Maintenance: The Data Center has been designed and constructed with maintenance housings and disconnects per OSHA, ASHRAE and NEMA guidelines, allowing for safe and convenient access to maintained components. Maintenance can be conducted independent of operational considerations.
Design Intent and Verification: ATLDC was professionally designed by Empower Energy Technology, a premier US Data Center designer and Construction Manager headquartered in Atlanta. Empower also acted as construction manager and Quality Control enforcement agent.
Use of Proven Technology: Though state of the art in Network Design and Performance, ATLDC has made use of proven performance architectures in every respect. No experimental or untried technologies have been employed in any critical operational system.
Investment Effectiveness:
Ability to evolve without abandoning infrastructure: With the N after any failure infrastructure, configuration changes can be made without affecting operations. Additionally, data center roof rights and expansion slots provide for significant capacity increases without modifying the existing infrastructure. The facility was designed to accommodate 160 kVA of additional power distribution and an additional 40 tons of cooling capacity.
Energy Efficiency/Green (environmentally sound) considerations:
In an effort to promote environmentally sound policies in line with recent policy adopted by the Federal Government, ATLDC is able to submit to Customer a Green Initiative Performance Program for their operations. This program addresses the recommendations of the Green Electronics Council and the Standard Performance Evaluation Corp. All servers and equipment meet the requirements of the 80 Plus power guidelines. Green Initiatives address the Seven Step program, including Consolidation, Power Management, Energy Efficiency, Power Supplies, Internal Barriers, EPA Standards and Environmental Advocacy.
Management Operations:
Integrated management: With two PhDs on staff and other experienced technical personnel, ATLDC/TULIX operates through Matrix Management. This management style allows individuals’ strengths and experience to be applied wherever they serve the functioning of various team efforts.
Competency and Experience:
Because Customers house Application Servers and Storage Devices whose function is critical, ATLDC provides support and assistance with Network Gear, Network Feeds, Firewalls, Hardware, OS Software and Application Components. Because TULIX SYSTEMS is a Developer and ASP in addition to Facility Operator, our technicians provide support well above the standard qualifications of typical data center technicians.
Network Support:
The Data Center is able to manage and support Customer’s Switch, Firewall and Server configurations, including Firewall settings for PCI compliance.
Software Technician:
ATLDC has on staff programmers and administrators capable of providing installation, OS maintenance, update installation and troubleshooting for Microsoft and SUSE Linux servers. Additionally, on-staff personnel are able to troubleshoot and support PHP, MS SQL, MySQL and Oracle.
Hardware Technician:
ATLDC is able to Troubleshoot, Maintain and Repair Customer’s and Operator’s equipment onsite. A Spare Parts Inventory shall be maintained for standard server parts including processors, memory, hard drives and NIC Cards.
24/7:
ATLDC shall provide on-staff Monitoring and Support on a 24/7/365 basis. Additionally, automated support systems shall monitor servers and applications using GET requests, logins, ping and HTTPS testing, with automatic failure notices generated to Operator and Customer.
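A minimal sketch of one such automated HTTPS check, assuming a hypothetical target URL and with notification reduced to a printed notice; real checks would also cover GET content, logins and ping:

```python
# Automated HTTPS health check with a failure notice, per the text above.
import urllib.error
import urllib.request

def https_check(url: str, timeout: float = 10.0) -> bool:
    """Return True if the URL answers with an HTTP status below 400."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status < 400
    except (urllib.error.URLError, OSError):
        return False

target = "https://customer-app.example.com/health"  # hypothetical endpoint
if not https_check(target):
    print(f"FAILURE NOTICE to Operator and Customer: {target} is not responding")
```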
Systems Programming:
ATLDC has on staff personnel familiar with E-Commerce implementations who can provide assistance to the Application Provider. Operator personnel have at least five years’ experience programming, managing and supporting e-commerce applications.
Service Mix:
In order to customize services to the Customer, ATLDC is able to demonstrate experience in Hosting, Managed Hosting, Colocation, Programming and ASP delivery.
Maintenance and Performance Metric Monitoring: ATLDC has developed a planned maintenance program for all major systems. Additionally, monitoring is in place on a 24/7 basis, with alarms, notifications and pages issued whenever any performance variation is detected.
Maintenance Schedules

| System | Task | Responsible Party | Period | Approximate Month |
|---|---|---|---|---|
| HVAC, CRAC, Pumps, Dry Coolers | Full Maintenance: Filters, Belts, Seals, Compressors | Action Mechanical | Quarterly | March, June, Sept., Dec. |
| | Glycol | | Annually | Sept. |
| UPS and Transformers | Full Maintenance: Bypass, Inverters, Rectifiers, Filters | 24/7 Technology | Semi-Annually | July, January |
| | Battery Test and Bypass | (Automated) | Weekly | |
| Power Distribution | Monitoring | Tulix | 24/7 | |
| | Firmware/OS | | Monthly | |
| Generators | Load Test | Prime Power | Annually | November |
| | Tuning | | Semi-Annually | November, May |
| Fire Protection Systems | | Orr Protection Systems | Annually | August |
| Network Equipment | Firmware | Tulix | As Needed | |
| | Performance Monitoring | | 24/7 | |
| | Logs | | Daily | |
| | Settings and Filters | | Auto Update, Daily | |
| | Licensing and Compliance | | Monthly | |
| Access Systems | Battery | Tulix | Bi-Monthly | Jan., Mar., May, July, Sept., Nov. |
| | Logs | | Monthly | |
| | Access List | | As Needed | |
| Facility Cleaning | Surface | Tulix | Monthly or as needed | |
| | Subfloor | | Annually | Sept. |
Summation
Given the criticality and 24/7 nature of operational requirements, ATLDC’s Colocation and Hosting Facility design meets high industry standards in accordance with Uptime Institute Tier III design topology. The facility also meets the requirements of the PCI Data Security Standard (DSS) 1.2.1. Availability shall be no less than 99.995%. All major power and LAN components are Concurrently Maintainable from facility demarcation to Customer solution. Support services are available 24/7/365. All equipment components (CRAC, Generator and UPS) are maintained by manufacturer-certified maintenance technicians in accordance with manufacturer’s recommendations.
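For context, the arithmetic behind that availability figure (a simple calculation, not an additional commitment):

```python
# Downtime budget implied by 99.995% availability.
HOURS_PER_YEAR = 24 * 365  # 8,760 hours
availability = 0.99995

downtime_minutes = HOURS_PER_YEAR * (1 - availability) * 60
print(f"99.995% availability allows ~{downtime_minutes:.0f} minutes of downtime per year")
# -> ~26 minutes
```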
Given the considerations above and ATLDC’s past performance record, ATLDC is considered Class ‘A’ rated. ATLDC (TULIX SYSTEMS, INC.) has been operating Data Centers since 1994 without major interruption of services. In the four years of operation at its current facility, ATLDC has experienced no outages of any kind. This record supports a high probability of future performance of the same quality.
It must be noted that this is a self-certification based upon the review of ATLDC management and their understanding of the Uptime Institute’s guidelines. The Data Center Tier system is the exclusive property of The Uptime Institute, which retains all rights to it. Actual certification is a complex and thorough review of design and performance parameters performed by The Uptime Institute, its members and partners; ATLDC has not undergone this process.
References

Renaud, Seader, Turner. Operational Sustainability and Its Impact on Data Center Uptime Performance, Investment Value, Energy Efficiency, and Resiliency. Uptime Institute, 2008.
Uptime Institute's Tier Classifications Standard Technical Users Guide. Computer Site Engineering, Inc., 2008.
Turner, Seader, Renaud, Brill. Tier Classifications Define Site Infrastructure Performance. Uptime Institute, 2008.
TIA-942 Data Center Standards Overview. ADC Telecommunications, Inc., 2006.
© Copyright 2022 ATLDC, All Rights Reserved.