Submission + - Ginormous Data: The Story of Facebook's Analytics Back End

waderoush writes: "Forget ‘big data’ — Facebook’s data challenges are ‘ginormous,’ to quote Jay Parikh, the company’s vice president of infrastructure engineering. Everybody knows that the social networking site is also the world’s largest photo sharing service, storing some 240 billion photos, with another 350 million uploaded every day (about 7 petabytes per month). But Facebook’s vast and detailed activity logs, which are spread across huge Hadoop clusters of 100 petabytes or more, have received far less attention. This Xconomy article takes an in-depth look at how Parikh’s team manages this back end and, more importantly, how Facebook product engineers use it to track the tens of thousands of A/B tests running on the front end on any given day. ‘Our top priority, beyond keeping the site up and running and fast, is enabling our product teams to move at lightning speed,’ Parikh says."
Link to Original Source
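
The upload figures quoted in the submission can be sanity-checked with quick back-of-the-envelope arithmetic. This sketch assumes roughly 30 days per month and treats a petabyte as 10^15 bytes; the resulting per-photo average is an inference from the quoted numbers, not a figure from the article:

```python
# Back-of-the-envelope check of the figures quoted above:
# 350 million photo uploads per day, about 7 PB of uploads per month.
uploads_per_day = 350e6        # photos/day (from the submission)
monthly_bytes = 7e15           # 7 petabytes/month, taken as 10^15 bytes each
days_per_month = 30            # assumption

uploads_per_month = uploads_per_day * days_per_month   # ~10.5 billion photos
avg_photo_bytes = monthly_bytes / uploads_per_month

print(f"~{avg_photo_bytes / 1e6:.2f} MB per uploaded photo on average")
```

That works out to roughly two-thirds of a megabyte per photo, a plausible average once Facebook's resized and recompressed copies are taken into account.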