
Man who is rewiring Google from inside out


PTI | Updated on: September 11, 2012 9:27 IST

New York, Sep 11: On May 10, 2011, at 1:35 in the afternoon, Eric Brewer told the world he was redesigning the most important operation on the internet.




Brewer, a professor of computer science at the University of California, Berkeley, was headed for Google, where he would help build a brand-new computing platform that could span dozens of data centers across the globe and instantly process requests from billions of people in a matter of milliseconds, says an article by Cade Metz on wired.com.

“I will be leading the design of the next gen of infrastructure at Google,” he wrote. “The cloud is young: much to do, many left to reach.”

Brewer now regrets the tweet. It leaves out so many other Googlers working alongside him.

“I am actually providing design leadership — and an outside perspective,” he tells Wired in an e-mail, “but it is a multi-person effort.”

And yet, that's all he'll say. Google, you see, treats its globe-spanning infrastructure as the most important of trade secrets.

Google believes much of its success stems from its ability to craft software and hardware capable of juggling more data, more quickly than practically any other operation on Earth.

The Googlenet is what so much of the computing world looks to as the modern ideal.

Occasionally, the company will reveal pieces of its top-secret infrastructure — which now spans as many as three dozen data centers — and others will follow its lead. The followers include everyone from Facebook, Yahoo and Twitter to the NSA.

That's why the tweet was so intriguing. Eric Brewer and his team are building what may be the future of the internet.

At this point, we don't know what all this will look like. But we can at least understand who Eric Brewer is — and, to a certain extent, why he was chosen for the task.

Eric Brewer isn't just an academic. In the mid-1990s, one of his Berkeley research projects spawned a web search engine called Inktomi.

Nowadays, Inktomi is remembered — if it's remembered at all — as one of the many web search engines that flourished during the dot-com boom before bowing to Google in the decade that followed. But Inktomi was a little different.

Before it was purchased by Yahoo in 2002, it pioneered a computing philosophy that served as bedrock not for the Google empire but for the web as a whole.

When Inktomi was founded in 1996 — two years before Google — web search engines and other massive online applications were served from big, beefy machines built around RISC microprocessors and other chips specifically designed for very large tasks.

AltaVista — the dominant search engine prior to the arrival of Inktomi — ran on enormous machines built around the Alpha processor, a RISC chip designed by its parent company, the Digital Equipment Corporation.

But Eric Brewer realized that, when building this sort of sprawling application, it made more sense to spread the load across a sea of servers built for much smaller tasks.

“Eric was able to demonstrate that a cluster of hundreds of cheap computers could actually significantly outperform the fastest supercomputers of the day,” says David Wagner, who studied under Brewer and is now a professor at UC Berkeley specializing in computer security.

This model makes it easier to expand an application — adding new machines as needed — and it makes it easier to accommodate hardware failures.

But it also means you're using technology that improves at a faster clip.

“By working with low-end, everyday machines, you benefit from volume. You benefit from the fact that this is what everyone else is buying,” says Wagner.

“Volume drives Moore's Law, so these commodity machines were getting faster at a faster rate than supercomputers.”

Plus, these machines use less power — and when you expand your application to “internet-scale,” power accounts for a significant amount of your overall cost.

The idea at the heart of Inktomi would redefine the internet.

Following in the footsteps of Brewer's company, Google built its search empire on commodity servers equipped with processors based on the x86 architecture Intel originally built for desktop PCs.

In 2001, Jim Mitchell and Gary Lauterbach — two bigwigs at Sun Microsystems — visited Google's server room and saw hundreds of dirt-cheap motherboards slotted into what looked like the bread racks you'd find in a bakery.

Sun was another company that built big, beefy RISC machines, and though it had close ties to Google, Mitchell and Lauterbach knew it would never sell a single machine to the fledgling search company.

“Those servers are so cheap and use so little power,” Mitchell told Lauterbach, “we have no hope of building a product to help them.”

Google would eventually take this idea to extremes, designing its own stripped-down servers in an effort to save additional cost and power. And the rest of the web followed suit.

Today, the web runs on cheap x86 servers, and some large outfits, including Facebook and Amazon, are designing their own machines in an effort to push the outside of the envelope.

You could argue this was the only way the web could evolve — and Eric Brewer knew that it would.

“Eric's big insight was that the internet would soon grow so big that there would be no computer big enough to run it — and that the only way to accommodate this was to rethink the architecture of the software so it could run on hundreds of thousands of machines,” says Armando Fox, another Berkeley distributed systems guru who studied with Brewer.

“Today, we take that for granted. But in 1995, it was new thinking. Eric rightly gets credit for having that vision before a lot of other people — and for executing on it.”

It only makes sense, then, that Google would tap Brewer to help rebuild its infrastructure for the coming decades.

Whatever direction Google takes, you can bet it will look well beyond the status quo.

In addition to calling on Brewer, the company has apparently tapped several other engineers with vast experience in this area. Brewer says his desk is within 10 feet of Jeff Dean, Sanjay Ghemawat, and Luiz André Barroso.

That would be three of the engineers who designed the Google infrastructure the first time around.

