Colocation center

A colocation center (also spelled co-location, or shortened to colo), or "carrier hotel", is a type of data center where equipment, space, and bandwidth are available for rental to retail customers. Colocation facilities provide space, power, cooling, and physical security for the server, storage, and networking equipment of other firms, and connect them to a variety of telecommunications and network service providers with a minimum of cost and complexity.


Many colocation providers sell to a wide range of customers, from large enterprises to small companies. [1] Typically, the customer owns the equipment and the facility provides power and cooling. Customers retain control over the design and use of their equipment, while daily operation of the facility is overseen by the multi-tenant colocation provider. [2]

  • Cabinets – A cabinet is a locking unit that holds a server rack. In a multi-tenant data center, servers within cabinets share raised-floor space with other tenants, in addition to sharing power and cooling infrastructure. [3]
  • Cages – A cage is dedicated server space within a traditional raised-floor data center; it is surrounded by mesh walls and entered through a locking door. Cages share power and cooling infrastructure with other data center tenants.
  • Suites – A suite is a dedicated, private server space within a traditional raised-floor data center; it is fully enclosed by solid partitions and entered through a locking door. Suites may share power and cooling infrastructure with other tenants, or have these resources provided on a dedicated basis.
  • Modules – Data center modules are purpose-engineered modules and components that offer scalable data center capacity. Because they use standardized components, they can be easily added, integrated, or retrofitted into existing data centers, and are cheaper and easier to build. [4] In a colocation environment, a module is treated as a standalone data center, with its own steel walls and security protocol, and its own cooling and power infrastructure. A number of colocation companies take this modular approach as a pay-as-you-grow service, in which customers pay only for the capacity they consume. [5]

Building features

Buildings that house data centers are often easy to recognize by the amount of cooling equipment located outside or on the roof. [6]

Colocation facilities have many other special characteristics:

  • Fire protection systems, including passive and active elements, as well as the implementation of fire prevention programs in operations. Smoke detectors are usually installed to provide early warning of a developing fire by detecting particles generated by smoldering components before flames develop. This allows investigation, interruption of power, and manual fire suppression using hand-held fire extinguishers before the fire grows to a large size. A fire sprinkler system is often provided to control a full-scale fire if it develops. Clean agent gaseous fire suppression systems are sometimes installed to suppress a fire earlier than the sprinkler system would. Passive fire protection elements include the installation of fire walls around the space, so that a fire can be restricted to a portion of the facility for a limited time if the active fire protection systems fail or are not installed.
  • 19-inch racks for data equipment and servers, 23-inch racks for telecommunications equipment.
  • Cabinets and cages for physical access control over equipment.
  • Overhead or underfloor cable racks (trays) and fiber runs, with power cables usually on separate racks from data cabling.
  • Air conditioning is used to control the temperature and humidity in the space. ASHRAE recommends a temperature range and humidity range for optimal electronic equipment conditions. [7] The electrical power used by the electronic equipment is converted to heat, which is rejected to the ambient space; unless this heat is removed, the ambient temperature rises, eventually causing electronic equipment to malfunction. By controlling the space air temperature, the server components are kept within the temperature range specified by the manufacturer. Air conditioning systems also help keep the humidity within acceptable parameters by cooling the return air below its dew point; with too much humidity, water can begin to condense on internal components. In a dry atmosphere, ancillary humidification systems may add water vapor to the space if the humidity is too low, to avoid static electricity discharge problems.
  • Low-impedance electrical ground.
  • Few, if any, windows.
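The relationship between electrical load and cooling load noted above can be put into rough numbers: nearly all power drawn by IT equipment ends up as heat that the air-conditioning plant must remove. The sketch below uses the standard conversions of 1 W ≈ 3.412 BTU/h and 12,000 BTU/h per ton of refrigeration; the cabinet count and per-cabinet draw are hypothetical figures chosen purely for illustration.

```python
# Rough cooling-load arithmetic for a colocation cage (illustrative figures).
# Nearly all electrical power drawn by IT equipment is converted to heat
# that the air-conditioning system must reject.

WATTS_TO_BTU_PER_HOUR = 3.412   # 1 W = 3.412 BTU/h
BTU_PER_HOUR_PER_TON = 12000    # 1 ton of refrigeration = 12,000 BTU/h

def cooling_load(it_load_watts: float) -> tuple[float, float]:
    """Return (BTU/h, tons of refrigeration) for a given IT load in watts."""
    btu_h = it_load_watts * WATTS_TO_BTU_PER_HOUR
    tons = btu_h / BTU_PER_HOUR_PER_TON
    return btu_h, tons

# Example: ten cabinets drawing a hypothetical 5 kW each.
btu_h, tons = cooling_load(10 * 5000)
print(f"{btu_h:.0f} BTU/h ≈ {tons:.1f} tons of refrigeration")
```

The same arithmetic explains why power density (kW per cabinet) is the figure colocation providers quote when sizing a customer's space.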

Colocation data centers are often audited to prove that they live up to certain standards and levels of reliability; the most commonly seen are SSAE 16 SOC 1 Type I and Type II (formerly SAS 70 Type I and Type II) and the tier system of the Uptime Institute or TIA. For service organizations today, SSAE 16 calls for a description of the organization's "system", which is far more detailed and comprehensive than SAS 70's description of "controls". [8] Other data center compliance standards include HIPAA (Health Insurance Portability and Accountability Act) audits and the PCI DSS standard.

Physical security

Most colocation centers have high levels of physical security and may be guarded continuously, including with on-site security guards. They may also employ CCTV.

Some colocation facilities require that employees escort customers, especially if there are no individual locked cages or cabinets for each customer. In other facilities, a PIN code or proximity card access system may control access to the building, and individual cages or cabinets have their own locks. Biometric security measures, such as fingerprint recognition, voice recognition and "weight matching", are also becoming more commonplace in modern facilities. Man-traps are also used, where a hallway leads into the data center; visitors can be seen via CCTV and are manually authorized to enter.


Power

Colocation facilities have generators that start automatically when utility power fails, usually running on diesel fuel. These generators may have differing levels of redundancy, depending on how the facility is built. Generators do not start instantaneously, so colocation facilities usually also have battery backup systems. In many facilities, the operator provides large inverters to supply AC power from the batteries; in other cases, customers may install smaller UPSes in their own racks.
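The interplay between battery backup and generator start-up reduces to simple arithmetic: the battery plant only has to carry the critical load for the interval before the generators start and accept load. A minimal sketch, with an entirely hypothetical battery capacity, load, and derating factor:

```python
# Back-of-the-envelope UPS bridge time: the battery bank carries the load
# only for the seconds-to-minutes it takes standby generators to start and
# accept load. All figures are hypothetical.

def bridge_time_minutes(battery_wh: float, load_w: float,
                        usable_fraction: float = 0.8) -> float:
    """Minutes a battery bank can carry a load, derated for usable capacity."""
    return battery_wh * usable_fraction / load_w * 60

# A hypothetical 100 kWh battery plant carrying a 400 kW critical load.
print(round(bridge_time_minutes(100_000, 400_000), 1))  # → 12.0 (minutes)
```

Even a modest bridge time is ample, since generators are typically required to be on-line within tens of seconds.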

Some customers choose to use 48VDC (nominal) battery banks instead. This may provide better energy efficiency and reduce the size and cost of the power delivery equipment. An alternative to batteries is a motor generator connected to a flywheel and diesel engine.
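The trade-off in distribution voltage can be illustrated with Ohm's-law arithmetic: for the same delivered power, lowering the voltage raises the current (I = P / V), and resistive loss in a feed grows with the square of the current (P_loss = I²R). The load, loop resistance, and voltages below are illustrative assumptions, not figures from any particular facility:

```python
# Why distribution voltage matters for wiring: for the same delivered power,
# current scales inversely with voltage, and resistive loss in the feed
# scales with the square of the current.

def feed_loss(power_w: float, voltage_v: float, wire_resistance_ohm: float) -> float:
    """Resistive loss (W) in a feed of the given loop resistance and voltage."""
    current = power_w / voltage_v
    return current ** 2 * wire_resistance_ohm

load = 2000.0   # hypothetical 2 kW rack
r = 0.01        # assumed 10 milliohm loop resistance

print(feed_loss(load, 208.0, r))  # 208 V AC feed
print(feed_loss(load, 48.0, r))   # 48 VDC feed: (208/48)^2 ≈ 19x the loss
```

This is why low-voltage DC plants keep battery banks close to the load and use heavy busbars, while AC distribution can run longer, thinner feeds.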

Many colocation facilities can provide redundant "A" and "B" power feeds to customer equipment, which high-end servers and telecommunications equipment with dual power inputs can take advantage of.

Colocation facilities are sometimes connected to multiple sections of the utility grid for additional reliability.


Cooling

All computers produce heat as a waste product. Generally, the more computationally powerful the system, the more heat is produced. The majority of consumer electronic devices either use passive cooling, where features such as heatsinks dissipate heat to the local environment, or employ a device such as a computer fan to actively aid in cooling (so-called active cooling). For certain industrial and high-end consumer devices, water cooling systems may also be used.

As in any data center, large amounts of heat are produced. To reduce risk to personnel and prevent damage to equipment and hardware, adequate cooling must be provided. Technologies such as computer room air conditioners (CRACs), computer room air handlers (CRAHs), and chiller plants are commonplace. More progressive operators have opted to use conductive cooling, since the traditional technologies rely on chilled-water systems that consume and waste a great deal of power and water.

The operator of a colocation facility generally provides the cooling as a service to the building's tenants, and the cooling system usually includes some degree of redundancy.

To reduce the costs and constraints of current cooling systems, there has been a recent trend for data centers to be opened in cooler regions of the world. Because the ambient air temperature is lower, less energy is consumed in cooling, leading to reduced costs. Companies such as Facebook have built new data centers within the Arctic Circle. [9]

Internal connections


Colocation facility owners have differing rules regarding cross-connects between their customers, some of which may be carriers. These rules may prohibit such connections altogether, or allow customers to order them for a monthly fee. They may permit customers to order cross-connects to carriers, but not to other customers. Some colocation centers feature a "meet-me room" where the different carriers housed in the center can easily exchange data.

Most peering points sit in colocation centers, and because of the high concentration of servers inside larger colocation centers, most carriers will be interested in bringing direct connections to these buildings. In many cases, there will be an Internet exchange hosted inside a colocation center, where customers can connect for peering. [10]

See also

  • Data center
  • Internet exchange point


  1. Pashke, Jeff. "Going Open – Software vendors in transition". 451 Research. Retrieved 6 March 2016.
  2. "Colocation: Managed or unmanaged?". 7L Networks. Retrieved 6 March 2016.
  3. "Colocation Benefits And How To Get Started". Psychz Networks. Retrieved 18 February 2015.
  4. DCD Intelligence, "Assessing the Cost: Modular vs. Traditional Build", October 2013. Archived 7 October 2014 at the Wayback Machine.
  5. John Rath, "DCK Guide To Modular Data Centers: The Modular Market", Data Center Knowledge, October 2011.
  6. Examples can be seen at
  7. "Thermal Guidelines for Data Processing Environments, 3rd Ed. – ASHRAE Store".
  8. "America Colocation". SSAE 16 Compliance.
  9. "Inside Facebook's new Arctic Circle data center". Retrieved 2017-06-28.
  10. "Learn About Colocation Benefits And How To Get Started". Psychz.Net.