Facebook on Thursday unveiled its latest Open Rack-compatible hardware designed for AI computing at large scale, code-named “Big Sur”. It is a next-generation GPU-based system for training neural networks.
The social networking giant also announced that it plans to open-source Big Sur and will submit the design materials to the Open Compute Project (OCP).
“Facebook has a culture of support for open source software and hardware, and FAIR [Facebook Artificial Intelligence Research] has continued that commitment by open-sourcing our code and publishing our discoveries as academic papers freely available from open-access sites”, Facebook said in an online post. “We're very excited to add hardware designed for AI research and production to our list of contributions to the community… We believe that this open collaboration helps foster innovation for future designs, putting us all one step closer to building complex AI systems that bring this kind of innovation to our users and, ultimately, help us build a more open and connected world.”
Big Sur is designed to incorporate eight high-performance GPUs of up to 300 watts each, with the flexibility to configure multiple PCI-e topologies. It was built with the NVIDIA Tesla M40 in mind but is qualified to support a wide range of PCI-e cards.
“Leveraging NVIDIA's Tesla Accelerated Computing Platform, Big Sur is twice as fast as our previous generation, which means we can train twice as fast and explore networks twice as large. And distributing training across eight GPUs allows us to scale the size and speed of our networks by another factor of two”, Facebook explained.
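The distributed-training claim above rests on data parallelism: each GPU processes an equal shard of the batch, and the per-shard gradients are averaged to reproduce the full-batch gradient. A minimal NumPy sketch of that idea, with eight workers standing in for Big Sur's eight GPUs (all names and the toy linear model are illustrative, not Facebook's actual training code):

```python
import numpy as np

N_WORKERS = 8  # stand-in for Big Sur's eight GPUs

def grad(w, X, y):
    """Mean-squared-error gradient for a toy linear model y ~ X @ w."""
    return 2.0 * X.T @ (X @ w - y) / len(y)

rng = np.random.default_rng(0)
X = rng.normal(size=(64, 4))   # batch of 64 examples, 4 features
y = rng.normal(size=64)
w = rng.normal(size=4)

# Single-device gradient over the whole batch.
g_full = grad(w, X, y)

# Data-parallel: each worker computes the gradient on its equal-sized
# shard; averaging the shard gradients recovers the full-batch gradient.
shards = zip(np.split(X, N_WORKERS), np.split(y, N_WORKERS))
g_avg = np.mean([grad(w, Xs, ys) for Xs, ys in shards], axis=0)

print(np.allclose(g_full, g_avg))  # → True
```

Because the gradients match exactly (for equal shard sizes), adding workers lets the system take on proportionally larger batches or models, which is the scaling factor Facebook describes.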
Moreover, the new servers have been optimized for thermal and power efficiency, allowing them to operate in Facebook’s free-air-cooled, Open Compute-standard data centers without special cooling or other unique infrastructure.
Also, less frequently used components have been removed, while the removal and replacement of components that fail relatively often, such as hard drives and DIMMs, have been simplified.
“Even the motherboard can be removed within a minute, whereas on the original AI hardware platform it would take over an hour. In fact, Big Sur is almost entirely toolless — the CPU heat sinks are the only things you need a screwdriver for,” Facebook added.

