SANTA CLARA, Calif. — Facebook called for a new class of dense but low-cost flash chips to store a rising flood of digital photos and data in a keynote at the Flash Memory Summit here. A datacenter manager also gave insights into how the company organizes its computer systems and processes its traffic.
The flash industry has focused on driving ever higher write endurance and performance. But a relatively low-endurance, poor-performance chip would better serve Facebook's need to store some 350 million new photos a day, said Jason Taylor, director of infrastructure at Facebook. Other solid-state technologies could work for the so-called "cold flash," he added.
"Make the worst flash possible -- just make it dense and cheap," Taylor said, noting the social networking giant aims to lower the $1.24 billion it spent last year building and provisioning datacenters.
Long writes, "like 10x as long as usual," do not matter, he said. Low endurance and lower I/O operations per terabyte are likewise acceptable for Facebook's uses. Many of the 240 billion photos the social networking giant stores are "data written once and read never," he quipped.
Facebook's Jason Taylor called for low-cost, dense, low-endurance flash chips.
In an email exchange with EE Times after the keynote, Taylor tried to quantify the opportunity:
An IDC report at the end of last year estimated that 2.8 zettabytes of data were stored in 2012. This amount is projected to grow to 40 zettabytes (1 zettabyte = 1 billion terabytes) by 2020. We believe that a huge percentage of that data will be written once and read rarely, if ever. Given that only a small percentage of data is stored in flash today, we see a massive opportunity for dense cold flash storage over the next few years.
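To put those IDC figures in perspective, the quoted numbers imply roughly 39 percent compound annual growth. A quick back-of-envelope sketch (the 2.8 ZB and 40 ZB figures come from the quote above; the calculation itself is purely illustrative):

```python
# Back-of-envelope check of the IDC figures quoted above.
# Input numbers are from the article; the arithmetic is illustrative only.

ZB_IN_TB = 1_000_000_000  # 1 zettabyte = 1 billion terabytes

stored_2012_zb = 2.8      # IDC estimate for data stored in 2012
projected_2020_zb = 40.0  # IDC projection for 2020

# Implied compound annual growth rate over the eight-year span
years = 2020 - 2012
cagr = (projected_2020_zb / stored_2012_zb) ** (1 / years) - 1

print(f"2012: {stored_2012_zb * ZB_IN_TB:.2e} TB")
print(f"2020: {projected_2020_zb * ZB_IN_TB:.2e} TB")
print(f"Implied annual growth: {cagr:.1%}")
```

Running this shows the projection amounts to growing stored data by nearly 40 percent every year for eight years, which is the scale driving Facebook's interest in very cheap, very dense media.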
He declined to provide any specifics on Facebook's discussions with flash vendors about the product concept. "We've talked with people in the flash chip industry, and they have been excited about the engineering challenge presented by cold flash."
On the following pages are some slides from Taylor's presentation at the show, showing a broader picture of how Facebook configures its systems and handles traffic.