The Data Center Temperature Debate
Although no data center authority has ever stated it outright, the prevailing practice in these critical facilities is “the cooler, the better.” However, some leading server manufacturers and data center efficiency experts believe that data centers can run hotter than they do today without sacrificing uptime, and with significant savings in both cooling-related costs and CO2 emissions. One server manufacturer recently announced that its rack of servers can operate with inlet temperatures of 104 degrees F.
Why push the envelope? Cooling infrastructure is an energy hog. Running 24x7x365, it consumes a great deal of electricity to create what is considered the optimal computing environment, which can hover anywhere between 55 and 65 degrees F. (ASHRAE’s current “recommended” range is 18-27 C, or 64.4 to 80.6 degrees F.)
To achieve efficiency, many influential end users are running their data centers warmer and are advising their contemporaries to follow suit. But the process isn’t as simple as raising the thermostat in your home. Here are some key arguments and considerations:
Contention: Increasing server inlet temperature will result in significant energy savings.
o Sun Microsystems, both a major hardware manufacturer and data center operator, estimates a 4% savings in cooling energy costs for every one (1) degree F increase in server inlet temperature. (Miller, 2007)
o A higher temperature setting means more hours of “free cooling” are possible through air-side or water-side economizers. This is particularly compelling for areas such as San Jose, California, where outside air (dry-bulb) temperatures are at or below 70 degrees F for 82% of the year. Depending on the geography, annual savings from economization can exceed six figures.
o Cooling infrastructure has specific design setpoints. How do we know that raising server inlet temperatures won’t produce a false economy, driving additional, unnecessary consumption in other components such as server fans, pumps, or compressors?
o Free cooling, while great for new data centers, is an expensive proposition for existing ones. Re-engineering the entire cooling infrastructure to accommodate it can be cost-prohibitive and unnecessarily complex.
o Costs due to thermal-related equipment failure or downtime will offset savings from higher temperature setpoints.
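The savings claim above can be sketched numerically. Here is a minimal sketch using Sun’s roughly 4%-per-degree-F rule of thumb; the annual cooling energy, electricity rate, and setpoint change below are purely illustrative assumptions, not figures from the article:

```python
# Rough annual-savings estimate from raising the inlet temperature setpoint,
# applying ~4% cooling-energy savings per 1 degree F raised (Sun's estimate,
# per Miller 2007), compounded per degree. All inputs are hypothetical.

def cooling_savings(annual_cooling_kwh, rate_per_kwh, degrees_raised,
                    savings_per_degree=0.04):
    """Estimated annual dollar savings from a higher inlet setpoint."""
    remaining_fraction = (1 - savings_per_degree) ** degrees_raised
    saved_kwh = annual_cooling_kwh * (1 - remaining_fraction)
    return saved_kwh * rate_per_kwh

# Illustrative example: 1,000,000 kWh/yr of cooling energy at $0.10/kWh,
# raising the setpoint from 62 F to 70 F (8 degrees).
print(round(cooling_savings(1_000_000, 0.10, 8)))  # ~27,861 dollars/yr
```

Whether the savings compound or add linearly per degree is itself an assumption; either way, the order of magnitude illustrates why operators find the setpoint argument compelling.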
Contention: Increasing server inlet temperature complicates reliability, recovery, and equipment warranties.
o Server inlet air and exhaust air frequently mix in a data center. Temperatures are kept low to offset this mixing and keep server inlet temperatures within ASHRAE’s recommended range. Raising the setpoint can exacerbate pre-existing hotspots.
o Cold temperatures provide an envelope of cool air in the room, an asset in case of cooling system failure. Employees can have more time to diagnose and repair problems and, if necessary, gracefully shut down equipment.
o In the case of the 104 degree F server, what is the likelihood that every piece of equipment in the data center, from storage to networking, will operate reliably? Will all warranties remain valid at 104 degrees F?
o Raising the data center temperature is only one part of an efficiency program. It requires following airflow-management best practices: using blanking panels, sealing cable cutouts, eliminating cable obstructions under raised floors, and implementing some form of air containment. These measures effectively reduce the mixing of hot and cold air and allow safe, practical temperature increases.
o The 104 degree F server is an extreme case that encourages thoughtful discussion and critical inquiry among data center operators. After such a review, perhaps a facility that once operated at 62 degrees F will now operate at 70 degrees F. Such changes can significantly improve energy efficiency without compromising availability or equipment warranties.
Contention: Servers are not as fragile and sensitive as one might think. Studies conducted in 2008 highlight the resilience of modern hardware.
o Microsoft ran servers in tents in the humid Pacific Northwest from November 2007 to June 2008. They did not fail.
o Using an air-side economizer, Intel subjected 450 high-density servers to temperatures as high as 92 degrees F and relative humidity ranging from 4 to 90%. The server failure rate during this experiment was only marginally higher than in Intel’s enterprise facility.
o Data centers can operate with temperatures in the 80s and still comply with ASHRAE guidance: the upper limit of its recommended temperature range was raised to 80.6 degrees F (from 77 degrees F).
o High temperatures, over time, affect server performance. Server fan speed, for example, will increase in response to higher temperatures. This wear and tear can shorten the life of the device.
o Data center studies from the likes of Microsoft and Intel may not be relevant to all businesses:
o Their huge data center footprint is more resilient to occasional server failures that can be caused by overheating.
o They can leverage their purchasing power to obtain gold-plated warranties that allow higher temperature settings.
o They are probably refreshing their hardware faster than other businesses. If that server is completely spent after 3 years, no big deal. A small business may need a server to last more than 3 years.
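The fan-speed wear noted above can be quantified with the fan affinity laws, a standard HVAC engineering rule (not a claim from the article): fan power scales roughly with the cube of fan speed, so modest speed increases carry outsized power costs.

```python
# Fan affinity law sketch: power is proportional to roughly the cube of
# fan speed. This illustrates why hotter inlet air, which drives server
# fans faster, can eat into cooling savings and accelerate wear.

def fan_power_ratio(new_speed, old_speed):
    """Relative fan power when fan speed changes (P ~ speed^3)."""
    return (new_speed / old_speed) ** 3

# A 20% fan speed increase costs roughly 73% more fan power.
print(round(fan_power_ratio(1.2, 1.0), 2))  # 1.73
```

This is the mechanism behind the “false economy” concern: savings at the chiller can be partially given back at the server fans.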
Contention: Higher inlet temperatures can create uncomfortable working conditions for data center staff and visitors.
o Consider the 104 degree F rack: its hot aisle could range from 130 to 150 degrees F. Even at the upper limit of ASHRAE’s recommended range (80.6 degrees F), hot aisle temperatures will hover around 105-110 degrees F. Employees servicing these racks would have to endure very uncomfortable working conditions.
o In response to higher temperatures, server fan speeds will increase to move more air. Higher fan speeds raise the noise level in the data center, which can approach or exceed OSHA sound limits, requiring occupants to wear ear protection.
o It goes without saying that as the inlet temperature of the server increases, the temperature of the hot aisle increases. Businesses must carefully balance worker comfort and energy efficiency efforts in the data center.
o Not all data center environments are heavily staffed. Some high-performance/supercomputing applications operate in lights-out environments and contain a homogeneous collection of hardware. These applications are well suited to high temperature setpoints.
o The definition of a data center is more fluid than ever. A traditional brick and mortar facility can quickly add compute power through data center containers without a costly construction project. Containers isolated from the rest of the building can operate at higher temperatures and achieve greater efficiency (some close-coupled cooling products perform the same function).
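The hot-aisle figures above follow from the standard sensible-heat approximation for air in imperial units, q (BTU/hr) ≈ 1.08 × CFM × ΔT (deg F). A minimal sketch, with hypothetical rack power and airflow (these numbers are assumptions, not data from the article):

```python
# Approximate server exhaust (hot aisle) temperature from rack power and
# airflow, using the sensible-heat rule of thumb for air:
#   q [BTU/hr] ~= 1.08 * CFM * delta_T [deg F]

BTU_PER_HR_PER_KW = 3412  # 1 kW is roughly 3,412 BTU/hr

def exhaust_temp_f(inlet_f, rack_kw, airflow_cfm):
    """Estimated exhaust air temperature leaving a server rack."""
    delta_t = (rack_kw * BTU_PER_HR_PER_KW) / (1.08 * airflow_cfm)
    return inlet_f + delta_t

# A hypothetical 10 kW rack moving 1,250 CFM at ASHRAE's 80.6 F upper limit:
print(round(exhaust_temp_f(80.6, 10, 1250)))  # ~106 F
```

The resulting rise of roughly 25 degrees F is consistent with the 105-110 degree F hot-aisle figure cited for an 80.6 degree F inlet.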
The movement to raise data center temperatures is growing, but it will face opposition until these issues are addressed. Reliability and availability sit at the top of any IT professional’s performance objectives. For this reason, most have so far decided to err on the side of caution: keep it cool at all costs. Higher temperatures and reliability, however, are not mutually exclusive. There are ways to protect your data center investments while becoming more energy efficient.
Temperature is inseparable from airflow management; data center professionals must understand how air moves into, through, and out of their server racks. Computational fluid dynamics (CFD) can help by analyzing and charting projected airflow across the data center floor, but because cooling equipment does not always perform to spec and the input data can miss major obstructions, onsite monitoring and adjustment are critical for validating CFD data and calculations.
Data centers with excess cooling capacity are prime candidates for higher temperature setpoints. Those with hotspots or inadequate cooling can start with low-cost solutions such as blanking panels and grommets. Close-coupled cooling and containment strategies are particularly relevant, since server exhaust air, so often the cause of thermal challenges, is isolated and prevented from entering the cold aisle.
By addressing airflow, users can focus on finding their “sweet spot”: the ideal temperature setting that aligns with business requirements and improves energy efficiency. Finding it requires active measurement and analysis. But the rewards, including lower energy bills, a smaller carbon footprint, and a message of corporate responsibility, are worth the effort.