šŸœ Oracle cuts jobs for...

Welcome, Noodle Networkers.

Nebius just made Wall Street do a double take šŸ“ˆ after landing a multi‑billion dollar deal with Microsoft. The stock jumped like it saw ChatGPT say ā€œbuy.ā€ Apparently, selling GPUs in bulk is the new oil. Researchers dropped a spicy little light-powered AI chip ⚔ that’s supposedly 100 times faster than the usual silicon slabs—and way less sweaty. No word yet on whether it can run Skyrim, but we’ll take the science win. And Oracle? They’re cutting jobs to chase AI glory šŸ¢. Nothing says ā€œfuture of workā€ like pink slips in one hand and server racks in the other.

Are we watching the AI era level up, or just witnessing another round of corporate musical chairs? Let’s find out...

In today’s AI digest:

  • Nebius stock jumps 60% on $19B AI deal with Microsoft šŸ“ˆ

  • New light chip makes AI 100x faster with less power ⚔

  • Oracle cuts jobs as AI development costs rise šŸ¢

Read time: 5 minutes

WHAT’S HAPPENING TODAY

(source: CNBC)

šŸ“ˆ The Digest: Nebius just pulled off one of the most surprising wins in the AI infrastructure race. The Amsterdam-based cloud player landed a contract with Microsoft worth up to $19 billion, sending its stock soaring more than 60 percent. Think of it as the corporate version of finding a golden ticket, one that comes with a five-year supply of GPU orders.

Key Details:

šŸ¤ Mega Microsoft Deal
Nebius agreed to provide GPU infrastructure to Microsoft’s Azure cloud from a new data center in New Jersey. The contract starts at $17.4 billion and can climb to $19.4 billion depending on usage.

šŸ“ˆ Shares in Overdrive
Investors reacted instantly, pushing Nebius stock up more than 60 percent in extended trading. The rally set fresh highs and turned the company into one of Wall Street’s most unexpected stars of the week.

šŸ’» A New Kind of Cloud
Nebius calls itself a ā€œneocloud,ā€ focused exclusively on AI workloads rather than traditional computing. That niche focus is paying off as demand for GPU capacity keeps outstripping supply.

šŸ—ļø Rapid Expansion
The deal provides Nebius with guaranteed revenue and the financing needed to build out its U.S. footprint. The New Jersey data center is the first of several planned growth projects.

Why It Matters: This is more than a one-off contract. It shows how smaller, specialized infrastructure firms can compete with giants like Amazon and Google by targeting the AI boom directly. For Microsoft, it secures precious GPU power in a market where demand is insatiable. For Nebius, it is nothing short of a transformation—an overnight shift from little-known provider to major industry player. At the end of the day, Nebius didn’t just sign a deal. It changed its place in the AI economy, proving that in the race for computing power, even smaller players can score jackpot-level wins.

AI chips

⚔ The Digest: Scientists are developing chips that use light instead of electricity, and the results are remarkable. Early tests show these photonic and neuromorphic designs can make AI run up to 100 times faster while consuming far less power. Think of it as trading in a gas guzzler for a high-speed solar car.

Key Details:

šŸ”† Laser-Driven Processing
Researchers at the University of Florida created a chip that uses microscopic lenses and lasers to handle convolution tasks. These are the backbone of pattern recognition in AI. The chip delivered efficiency gains of up to 100 times compared to conventional hardware and achieved 98 percent accuracy when classifying handwritten digits.

šŸƒ MIT’s Analog Speed Boost
Engineers at MIT unveiled a chip that can process data at 100 times the speed of current digital AI chips while using a fraction of the energy. The design is small, cost-effective, and perfect for real-time applications such as autonomous vehicles or wearable health devices.

🧠 Brain-Inspired Efficiency
A German team built a neuromorphic chip modeled after the human brain. It matched standard AI performance while using ten times less energy and worked fully offline, making it ideal for secure on-device intelligence.

Why It Matters: The challenge with AI today is not just performance but also the enormous energy costs that come with it. Chips that rely on light or brain-like designs solve both issues at once. They promise faster results while easing the pressure on power grids and data centers. This could reshape everything from smartphones that run AI locally without draining the battery to data centers that no longer need industrial-scale cooling. The bigger picture is clear: the future of AI might be powered less by heat and more by light.

(source: The Information)

šŸ¢ The Digest: Oracle is laying off thousands of employees even as it pours billions into AI infrastructure. The company says the move will streamline operations and free up resources for its growing data center ambitions. It feels like a classic case of tightening one belt while loosening another.

Key Details:

šŸ‘„ Global Workforce Cuts
Reports indicate more than 3,000 jobs are being eliminated across the U.S., India, Canada, the Philippines, and parts of Europe. Teams connected to cloud infrastructure and AI projects are among those affected.

šŸ¢ U.S. Layoffs Confirmed
In California, 143 positions were cut, while Seattle saw 161 roles eliminated. These reductions centered on Oracle Cloud Infrastructure and AI/ML units. Rumors suggest thousands more globally could be impacted as the company continues restructuring.

šŸ’° Revenue Still Rising
Despite the layoffs, Oracle reported an 8 percent increase in annual revenue. Executives insist the cuts are not about shrinking the business but about reallocating resources to support faster AI-focused growth.

⚔ Funding the AI Pivot
Money saved from job reductions is being redirected to large-scale projects like the Stargate initiative, which includes building more than 4 gigawatts of data center capacity in partnership with OpenAI and other industry leaders.

Why It Matters: Oracle is betting heavily that its future lies in AI infrastructure rather than traditional software or cloud services. Cutting jobs in some divisions while expanding in others highlights how disruptive AI costs have become for legacy tech companies. The bigger picture is that even profitable firms are reshaping themselves to stay competitive in the AI era. For Oracle employees, that means painful uncertainty. For the industry, it signals just how much weight the AI gold rush carries—big enough to restructure entire companies around it.

THE NOODLE LAB

AI Hacks & How-Tos

The Digest: Grok‑Code‑Fast‑1 is a high-speed, cost-effective agentic coding model designed for day-to-day software development. It excels at tool-driven tasks such as navigating codebases, running grep searches, automating the terminal, and editing files, and it supports languages like TypeScript, Python, Java, C++, Rust, and Go. Thanks to innovations like prompt caching, it’s so fast that responses can arrive before you finish reading its initial reasoning.
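The ā€œagenticā€ part is easiest to see outside an IDE. Here is a minimal sketch of that loop, assuming xAI’s OpenAI-compatible API at https://api.x.ai/v1, an XAI_API_KEY environment variable, and a toy grep tool (none of these details come from the write-up above, so treat the endpoint, key name, and helper as illustrative): the model decides when to call the tool, and the script feeds results back until it answers in plain text.

```python
# Sketch of an agentic loop: grok-code-fast-1 is offered a "grep" tool and
# decides on its own when to call it. Endpoint, key name, and the tool
# implementation are assumptions for illustration, not from the newsletter.
import json
import os
import subprocess

from openai import OpenAI

client = OpenAI(
    base_url="https://api.x.ai/v1",        # assumed OpenAI-compatible xAI endpoint
    api_key=os.environ["XAI_API_KEY"],     # assumed env var holding your key
)

def run_grep(pattern: str) -> str:
    """Illustrative tool: search the current repo for a pattern."""
    result = subprocess.run(["grep", "-rn", pattern, "."], capture_output=True, text=True)
    return result.stdout[:4000] or "no matches"

tools = [{
    "type": "function",
    "function": {
        "name": "grep",
        "description": "Search the repository for a regex pattern.",
        "parameters": {
            "type": "object",
            "properties": {"pattern": {"type": "string"}},
            "required": ["pattern"],
        },
    },
}]

messages = [{"role": "user", "content": "Where is parse_csv_row defined and used?"}]

for _ in range(5):                         # cap the rounds so the loop always terminates
    reply = client.chat.completions.create(
        model="grok-code-fast-1", messages=messages, tools=tools
    )
    msg = reply.choices[0].message
    if not msg.tool_calls:                 # plain-text answer: we are done
        print(msg.content)
        break
    messages.append(msg)                   # keep the assistant turn with its tool calls
    for call in msg.tool_calls:
        args = json.loads(call.function.arguments)
        messages.append({
            "role": "tool",
            "tool_call_id": call.id,
            "content": run_grep(args["pattern"]),
        })
```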

How-to:

  1. Access Grok‑Code‑Fast‑1

    • Use it for free via partner platforms like GitHub Copilot, Cursor, Cline, Kilo Code, Roo Code, Opencode, or Windsurf.

  2. Select the Model in Your IDE

    • In your chosen tool (e.g., Copilot or Cline), go to model settings and choose grok-code-fast-1—no setup needed.

  3. Work in Iterative Blocks

    • Break down tasks into focused steps—like "Create a function to parse a CSV row"—for best results. Rapid iteration lets you steer the AI precisely.

  4. Use Large Contexts for Codebases

    • With a 256k token context window, you can paste in entire files or long error logs. The model excels at diagnosing issues and making cohesive changes (see the sketch after this list).

  5. Cost Efficiency & Speed

    • Super fast (around 90 tokens/sec). Pricing is designed to stay affordable at $0.20 per million input tokens and just $0.02 per million for cached input—making it ideal for rapid AI-powered dev workflows.
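If you prefer a plain script to an IDE, steps 3 and 4 translate almost directly: read a whole file, add the error log, and ask for one focused change. Below is a minimal sketch under the same assumptions as above (xAI’s OpenAI-compatible endpoint and an XAI_API_KEY variable; the file name and error text are made up).

```python
# Sketch of a focused, large-context request: whole file + error log in one prompt.
# File name and error text are illustrative; a 256k-token window leaves plenty of room.
import os
from pathlib import Path

from openai import OpenAI

client = OpenAI(base_url="https://api.x.ai/v1", api_key=os.environ["XAI_API_KEY"])

source = Path("csv_utils.py").read_text()   # hypothetical file under review
error_log = "ValueError: could not convert string to float: 'N/A'"

reply = client.chat.completions.create(
    model="grok-code-fast-1",
    messages=[{
        "role": "user",
        "content": (
            "Here is csv_utils.py:\n\n" + source + "\n\n"
            "Running it fails with: " + error_log + "\n"
            "Create a function to parse a CSV row that tolerates missing numeric values."
        ),
    }],
)

print(reply.choices[0].message.content)
```

Cached input pricing makes repeated runs over the same file cheap, which is what keeps this iterate-fast workflow affordable.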

Explore More: Grok‑Code‑Fast‑1 is reshaping coding workflows with unmatched speed and tool integration. It’s perfect for rapid prototyping, bug fixes, and lightweight development—especially when fast output matters most.

Trending AI Tools

  • Latam-GPT – LLM for Latin American languages and culture.

  • Monoya Connect – AI helping artisans sell globally.

  • Div-idy – Builds apps and sites from text prompts.

  • Predis.ai – Creates and schedules social media content.

  • Grok-Code-Fast-1 – xAI’s agent that writes code faster.