This isn't just a "Can't we all get along?" story. Andy Patrizio also interviews eBay to learn what they're doing differently than most, such as questioning the assumptions about data center cooling:
How did eBay do it? For starters, the company didn't insist on turning its server containers into meat lockers. The conventional wisdom has been that a data center must be cold enough to store meat, even though Intel rates its CPUs at max temperatures of around 150 degrees Fahrenheit.
"We baby these systems too much," says Nelson. "The perception is it has to be cold. And who does that? IT thinks it can't be hot in here. There are two variables you need to design to: the surface temperature of the chip and the outside maximum worst case temperature in the environment of where you are."
So instead of keeping the place frigid, they use outside air and unchilled water. To a CPU hitting 150 degrees Fahrenheit, 87-degree water is downright cold, even though that's about the temperature of a swimming pool in summer.
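Nelson's two design variables reduce to simple arithmetic: as long as the coolant stays well below the chip's rated surface temperature, there's headroom to move heat without chillers. A minimal sketch, using the figures from the article (the headroom calculation itself is my illustration, not eBay's actual design criterion):

```python
# Figures from the article; the headroom check is an illustrative
# assumption, not eBay's real thermal-design process.

CPU_MAX_F = 150.0    # Intel's rated max CPU temperature (deg F)
WATER_F = 87.0       # unchilled water, roughly a summer pool (deg F)
OUTSIDE_MAX_F = 115.0  # hypothetical worst-case Phoenix air temp (deg F)

def cooling_headroom(chip_max_f: float, coolant_f: float) -> float:
    """Temperature delta available to carry heat off the chip."""
    return chip_max_f - coolant_f

# Even worst-case desert air leaves a usable delta to the chip.
print(f"water headroom: {cooling_headroom(CPU_MAX_F, WATER_F):.0f} F")
print(f"air headroom:   {cooling_headroom(CPU_MAX_F, OUTSIDE_MAX_F):.0f} F")
```

With 63 degrees of delta between the chip and unchilled water, refrigeration buys you very little, which is the point Nelson is making.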
eBay also has a different way of buying servers, and it has 1,920 servers in its Phoenix data center alone. What startled me most, though, is that eBay tells the losing bidders on an RFP exactly why they lost:
The vendor put 36 engineers on the phone to get a lesson in their hardware's inefficiency, then went back and tuned their servers: low-voltage DIMMs, a different heat sink, changed firmware settings, and other tweaks. They won the next bid four weeks later.