A series of service disruptions in 1999 caused real problems for eBay's business. Over the course of three days, overloaded servers intermittently shut down, leaving users unable to check auctions, place bids or complete transactions. Buyers and sellers were furious, eBay was embarrassed, and a complete restructuring of eBay's technology architecture followed.
In 1999, eBay ran on one massive database server and a few separate systems handling the search function. In 2005, eBay runs on about 200 database servers and 20 search servers.
The architecture is a type of grid computing that allows for both fault tolerance and growth. With the exception of the search function, everything about eBay can run on approximately 50 servers: Web servers, application servers and data-storage systems. Each server has between six and 12 microprocessors. These 50 or so servers run separately, but they talk to each other, so each one knows when there's a problem somewhere else in the grid. eBay can simply add servers to the grid as the need arises.
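The "servers talk to each other" idea can be sketched as a simple heartbeat scheme: each node records when it last heard from every peer, and a peer that goes quiet for too long is flagged as unhealthy. This is an illustrative sketch only, assuming a made-up `GridNode` class and a 5-second timeout; it is not eBay's actual implementation.

```python
import time

HEARTBEAT_TIMEOUT = 5.0  # seconds of silence before a peer is considered suspect

class GridNode:
    """One server in the grid, tracking the health of its peers."""

    def __init__(self, name):
        self.name = name
        self.last_seen = {}  # peer name -> timestamp of its last heartbeat

    def add_peer(self, peer_name):
        # Growing the grid is just registering another peer.
        self.last_seen[peer_name] = time.monotonic()

    def receive_heartbeat(self, peer_name):
        self.last_seen[peer_name] = time.monotonic()

    def unhealthy_peers(self, now=None):
        # Any peer we haven't heard from within the timeout is flagged,
        # so "everybody knows if there is a problem somewhere."
        now = now if now is not None else time.monotonic()
        return [peer for peer, seen in self.last_seen.items()
                if now - seen > HEARTBEAT_TIMEOUT]

node = GridNode("web-01")
node.add_peer("app-01")
node.add_peer("db-01")
node.receive_heartbeat("app-01")
node.last_seen["db-01"] -= 10  # simulate db-01 going silent past the timeout
print(node.unhealthy_peers())  # ['db-01']
```

In a real deployment the heartbeats would travel over the network rather than as direct method calls, but the bookkeeping is the same.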
While the majority of the site can run on 50 servers, eBay maintains four times that number. The 200 servers are housed in sets of 50 at four locations, all in the United States. When you're using eBay, you may be talking to any one of those locations at any time, because they all store the same data. If one of the systems crashes, the other three pick up the slack.
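Because all four locations hold the same data, failover amounts to trying the next location when one doesn't answer. Here is a minimal sketch of that idea; the location names and the `query_site` callback are invented for illustration.

```python
# Four replicated sites, each storing the same data (names are made up).
LOCATIONS = ["site-1", "site-2", "site-3", "site-4"]

def fetch_listing(listing_id, query_site, down_sites=frozenset()):
    """Try each location in turn; any healthy one can answer."""
    for site in LOCATIONS:
        if site in down_sites:
            continue  # this site crashed -- the others pick up the slack
        return query_site(site, listing_id)
    raise RuntimeError("all locations are down")

# Every site returns the same answer because all four store the same data.
answer = lambda site, listing_id: f"listing {listing_id} (served by {site})"
print(fetch_listing(42, answer, down_sites={"site-1"}))
# -> listing 42 (served by site-2)
```

A production system would detect failures via timeouts and route around them automatically, but the logic reduces to the same loop.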
When you’re on the eBay Web site and you click on a listing for a Persian rug, your computer talks to Web servers, which talk to application servers, which pull data from storage servers so you can find out what the latest bid price is and how much time is left in the auction. eBay has local partners in many countries who deliver eBay’s static data to cut down on download time, and there are monitoring systems in 45 cities around the world that constantly scan for problems in the network.
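The tiered lookup described above, with the Web layer calling the application layer, which pulls the record from storage, can be shown as a toy example. The item data, function names and layer boundaries here are illustrative stand-ins, not eBay's real interfaces.

```python
import time

# Stand-in for the data-storage servers.
STORAGE = {
    "persian-rug": {"current_bid": 1250.00, "ends_at": 1_700_000_000},
}

def storage_server(item_id):
    return STORAGE[item_id]

def application_server(item_id, now):
    # The application tier pulls raw data from storage and computes
    # what the page needs: the latest bid and the time remaining.
    record = storage_server(item_id)
    return {
        "current_bid": record["current_bid"],
        "seconds_left": max(0, record["ends_at"] - now),
    }

def web_server(item_id, now=None):
    # The Web tier calls the application tier and renders the result.
    now = now if now is not None else time.time()
    info = application_server(item_id, now)
    return (f"Current bid: ${info['current_bid']:.2f}, "
            f"time left: {info['seconds_left']}s")

print(web_server("persian-rug", now=1_700_000_000 - 3600))
# -> Current bid: $1250.00, time left: 3600s
```

Splitting the work this way lets each tier scale independently: more Web servers for traffic spikes, more storage servers for data growth.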
This infrastructure lets millions of people search for, buy and sell items simultaneously. On the user end, it all works seamlessly. Let’s try it out.