NVIDIA Corporation (NASDAQ: NVDA) used the CES 2026 convention in Las Vegas to reaffirm its leadership in artificial intelligence infrastructure, announcing that its next-generation Rubin data center platform is now in full production and on track for release later this year. The move highlights Nvidia’s accelerated release cycle as competition intensifies from rivals such as Advanced Micro Devices (NASDAQ: AMD) and custom silicon developed by major cloud providers.
During his keynote address, CEO Jensen Huang revealed that all six chips in the Rubin platform have successfully returned from manufacturing partners and passed initial milestone tests. This puts the new AI accelerator systems on schedule for customer deployments in the second half of 2026. By unveiling Rubin early, Nvidia is signaling confidence in its roadmap while keeping enterprises closely aligned with its hardware ecosystem.
The Rubin GPU is designed to meet the growing demands of agentic AI models, which rely on multi-step reasoning rather than simple pattern recognition. According to Nvidia, Rubin delivers 3.5 times faster AI training and up to 5 times higher inference performance than the current Blackwell architecture. The platform also introduces the new Vera CPU, which features 88 custom cores and offers double the performance of its predecessor. Nvidia says Rubin-based systems can match Blackwell’s output while using far fewer components, cutting the cost per token by as much as a factor of ten.
Positioned as a modular “AI factory” or “supercomputer in a box,” the Rubin platform integrates the BlueField-4 DPU, which manages AI-native storage and long-term context memory. The design delivers up to a fivefold improvement in power efficiency, a critical factor for hyperscale data centers. Early adopters include Microsoft (NASDAQ: MSFT), Amazon Web Services (NASDAQ: AMZN), Google Cloud (NASDAQ: GOOGL), and Oracle Cloud Infrastructure (NYSE: ORCL).
Beyond the data center, Nvidia also highlighted advances in robotics and autonomous vehicles, calling the current period a “ChatGPT moment” for physical AI. New offerings such as the Alpamayo AI models for self-driving systems and the Jetson T4000 robotics module underscore Nvidia’s bet that reasoning-based AI will drive a trillion-dollar infrastructure upgrade across industries.

