By Code. Editorial Staff

Over the past four years, Facebook has been steadily working to revolutionize data center hardware. We started with new server designs, power handling, and cooling; then focused on storage and racks; and over the past year, we have completely opened up the data center network. The result is that today we have open-sourced every major physical component of our data center stack, a stack that is powerful enough to connect 1.39 billion people around the world and efficient enough to have saved us $2 billion in infrastructure costs over the last three years. But we're not finished, not even close.

Our mission is to connect the world. We can’t achieve this goal without infrastructure innovation. To go where we want to go, we need to build and deploy infrastructure that is as flexible, efficient, and sustainable as possible. To do this, we want to work with not just the best minds under one roof, but the best minds in the world — and that’s where the Open Compute Project comes in.

Four years ago, the major Web companies were working in silos to build the infrastructure necessary for their scale. Now, just four years in, the Open Compute Project has thousands of participants and nearly 200 companies working to increase the pace of innovation in the industry. Facebook is proud to have started this initiative, and we will continue to openly share our technologies and our learnings as we build the infrastructure required to connect the next 5 billion people.

So what’s next? Here’s a quick recap of the exciting announcements Facebook made today:

  • Introducing “Yosemite”: Over the last 18 months, Facebook has been working with Intel on a new SoC-based compute server that dramatically increases speed and serves Facebook traffic more efficiently. Yosemite is our first system-on-a-chip compute server; it supports four independent servers and delivers performance per watt superior to traditional data center servers for heavily parallelizable workloads. It is an ideal component for our disaggregated rack infrastructure, and Intel has been a great partner in helping us push the limits here.
  • Making “Wedge” easier to consume: We have proposed the contribution of the spec for Wedge, our top-of-rack network switch, to the OCP Foundation. And to help make it easier for people to buy and start deploying this open top-of-rack switch, Facebook is working with Accton, Broadcom, Cumulus, and Big Switch to create a Wedge product package for the OCP community. Accton will begin shipping Wedge in the first half of 2015.
  • Releasing “OpenBMC”: We released OpenBMC, open, low-level board management software that enables flexibility and speed in feature development for BMC chips. OpenBMC provides an open software framework for next-generation system management. Wedge will be the first hardware powered by OpenBMC, and 6-pack is next. (A short sketch of this software-driven approach to board management follows this list.)
  • “FBOSS Agent” now available: We have open-sourced the central library of FBOSS, the software we created for our top-of-rack network switch, Wedge. The FBOSS Agent is built on Broadcom's OpenNSL library to program the Broadcom ASIC inside Wedge; a toy sketch of that agent-over-SDK layering also follows this list.
  • New data on cost and energy savings: Thanks to OCP and related efficiency work, we announced that we’ve saved $2 billion in infrastructure costs over the course of the last three years. And in the last year alone, we’ve saved enough energy to power nearly 80,000 homes. The carbon savings associated with that energy efficiency are equivalent to taking 95,000 cars off the road.
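
To make the OpenBMC item above a little more concrete, here is a minimal sketch of polling a BMC over a REST-style HTTP interface, which is the kind of software-driven management OpenBMC enables. The host name, port, and /api/sys/sensors path below are hypothetical placeholders for illustration, not documented OpenBMC endpoints, and the response shape is assumed.

```python
"""Minimal sketch: read sensor data from a BMC over HTTP.

The BMC_URL host, port, and path are hypothetical placeholders, not a
documented OpenBMC API; the point is only that board management (sensors,
fans, power) becomes an ordinary software interface.
"""
import json
import urllib.request

# Hypothetical management endpoint exposed by the BMC.
BMC_URL = "http://wedge-bmc.example.com:8080/api/sys/sensors"


def read_sensors(url: str, timeout: float = 5.0) -> dict:
    """Fetch a JSON document of sensor readings from the BMC."""
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return json.load(resp)


if __name__ == "__main__":
    readings = read_sensors(BMC_URL)
    # The flat name -> value response shape is an assumption for this sketch.
    for name, value in sorted(readings.items()):
        print(f"{name}: {value}")
```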

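For the FBOSS item, here is a toy model of the layering it describes: an agent that owns the desired switch state and pushes only the deltas down through an SDK shim, which on Wedge would be Broadcom's OpenNSL programming the ASIC. The AsicShim and Agent classes and their method names are invented for this sketch; they are not the actual FBOSS code or the OpenNSL C API.

```python
"""Toy sketch of a switch agent applying route state through an SDK shim.

AsicShim stands in for a vendor SDK such as OpenNSL; on real hardware its
methods would call into the ASIC driver. All names here are hypothetical.
"""
from dataclasses import dataclass


@dataclass(frozen=True)
class Route:
    prefix: str    # e.g. "10.0.0.0/24"
    next_hop: str  # e.g. "10.0.1.1"


class AsicShim:
    """Stand-in for the SDK layer that programs the forwarding hardware."""

    def add_route(self, route: Route) -> None:
        print(f"[asic] add    {route.prefix} via {route.next_hop}")

    def remove_route(self, route: Route) -> None:
        print(f"[asic] remove {route.prefix} via {route.next_hop}")


class Agent:
    """Keeps the hardware in sync with the desired route table by applying
    only the difference between what is programmed and what is wanted."""

    def __init__(self, shim: AsicShim) -> None:
        self.shim = shim
        self.applied: set[Route] = set()

    def sync(self, desired: set[Route]) -> None:
        for route in desired - self.applied:
            self.shim.add_route(route)
        for route in self.applied - desired:
            self.shim.remove_route(route)
        self.applied = set(desired)


if __name__ == "__main__":
    agent = Agent(AsicShim())
    agent.sync({Route("10.0.0.0/24", "10.0.1.1")})
    agent.sync({Route("10.0.0.0/24", "10.0.1.1"),
                Route("192.168.0.0/16", "10.0.1.2")})
```
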
These technologies underpin the software services we use every day. As hardware advances, so do the speed, performance, capabilities, and reach of that software. The Open Compute Project is about working together to reimagine, reinvent, and build data centers, servers, storage devices, and network technologies to support the massive growth of data today and enable the great services of tomorrow.
