Blogs


A Desert Bean and an Oil Revolution

How a nation that once begged OPEC for oil became the world’s largest producer, with a little help from Rajasthan’s desert farmers


The Smell of Dependence

Picture this. It is October 1973. Richard Nixon is in the White House. The Vietnam War is winding down. And in the Middle East, a coalition of Arab nations has just made a decision that will bring the most powerful economy on Earth to its knees.

In retaliation for American support of Israel during the Yom Kippur War, the Arab members of OPEC, led by Saudi Arabia, announced an oil embargo against the United States. Overnight, the taps were turned off. Oil prices quadrupled from $3 to $12 a barrel. Petrol stations ran dry. Americans queued for hours, sometimes all night, just to fill their tanks. The speed limit on highways was slashed to 55 mph to conserve fuel. The lights on the Christmas tree at Rockefeller Center were dimmed. The world’s richest country was brought to a standstill by a commodity it did not control.

Read on →

Black Gold Rules the World - Everything You Never Knew About Oil

The world as we know it would not exist without petroleum. From the fuel in your car to the plastic in your phone case, oil is woven into almost every thread of modern civilization. But how did this thick, dark liquid buried miles underground become the most powerful commodity on Earth?

A Brief History: From Ancient Seeps to Industrial Lifeline

Long before the first oil well was ever drilled, humans knew about petroleum. Ancient Mesopotamians — in what is today Iraq — used naturally occurring bitumen (a semi-solid form of crude oil) to waterproof boats and bind bricks together as far back as 3,000 BCE. The Chinese were drilling primitive bamboo wells to extract oil as early as 347 CE.

But the modern petroleum era truly begins on August 27, 1859, in Titusville, Pennsylvania, USA, when Edwin Drake successfully drilled the world’s first commercial oil well to a depth of 21 metres. The oil he struck didn’t just flow — it ignited an industry, and with it, an entirely new world order.

Within decades, John D. Rockefeller’s Standard Oil had monopolized the American market, controlling over 90% of U.S. oil refining by the 1880s. The invention of the internal combustion engine, and then the automobile, sent demand soaring. By the early 20th century, oil was no longer just a source of lamp fuel — it was the lifeblood of empires, powering warships, aircraft, and tanks in both World Wars.


Read on →

The Future of Software Engineering in the Age of AI

Over the past year, one question keeps coming up in almost every tech discussion:

Will AI replace software engineers?


We constantly see headlines about AI writing code, companies slowing down hiring, and tools that can generate entire applications in minutes. It is natural for people in the IT industry to feel uncertain about what the future looks like. But when you step back and observe what is actually happening inside engineering teams, a different picture emerges. AI is not eliminating the need for developers; it is changing the nature of software development itself. After watching several discussions about the impact of AI on IT jobs, and after observing how engineering teams are evolving, I keep coming back to one conclusion:

The structure of the IT industry is shifting.

Not disappearing. Just shifting. Let’s break down what this really means.

The First Big Change: Automation of Repetitive Work

A large portion of software development has always involved repetitive work.

Typical examples include:

  • Writing boilerplate code
  • Creating simple APIs
  • Generating test cases
  • Writing documentation
  • Fixing small bugs
  • Refactoring simple logic

These tasks are necessary, but they are also predictable. This is exactly the type of work AI systems are very good at.

Read on →

DeepSeek mHC: Fixing the Hidden Chaos in Giant AIs

If you’re anything like me, you’ve probably spent the last few years glued to the whirlwind of AI advancements. From ChatGPT blowing our minds to models getting bigger and smarter by the day, it’s been a wild ride. But every now and then, something comes along that feels like a real paradigm shift – not just more parameters or fancier training data, but a fundamental rethink of how these beasts work under the hood. That’s exactly what DeepSeek’s latest innovation, mHC (short for Manifold-Constrained Hyper-Connections), feels like to me. I stumbled upon their paper right at the start of 2026, and man, it got me excited. It’s not just another incremental tweak; it’s a clever fix to a problem that’s been lurking in neural networks for over a decade.

What the Heck is DeepSeek and Why Should You Care About mHC?

First off, a quick intro to DeepSeek for those who might not be as deep in the AI weeds. DeepSeek is a Chinese AI lab that’s been punching way above its weight class. They’re the folks behind models like DeepSeek-V2 and DeepSeek-Coder, which have consistently outperformed bigger names from OpenAI or Google on certain benchmarks, often at a fraction of the cost. They’re all about efficiency and open-source vibes, which is refreshing in an industry that’s sometimes too secretive.

Now, mHC? It’s their fresh-out-of-the-oven framework, detailed in a paper released on December 31, 2025. The full name is Manifold-Constrained Hyper-Connections, and it’s basically a smarter way to handle the “connections” inside neural networks. If you’ve ever wondered why training massive models can be so unstable – like, why do gradients explode or vanish, causing the whole thing to crash? – mHC tackles that head-on. It’s built on top of something called Hyper-Connections (HC), which was a cool idea from ByteDance in 2025, but HC had some serious flaws. DeepSeek fixed them by adding mathematical “constraints” that keep things stable without sacrificing performance.
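The stability intuition behind constraining those connections can be sketched with a toy example. To be clear, this is my own illustration, not the actual mHC construction from the paper: it just shows why an unconstrained mixing matrix applied layer after layer can blow a signal up, while a constrained one (here, row-stochastic, which caps the spectral radius at 1) keeps it bounded no matter how deep you stack.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4  # number of parallel residual streams being mixed at each layer

# Unconstrained mixing matrix, rescaled to spectral radius 1.5 so that
# repeated application amplifies the signal (the "exploding" regime).
raw = rng.normal(size=(n, n))
raw *= 1.5 / np.max(np.abs(np.linalg.eigvals(raw)))
unconstrained = raw

# "Constrained" mixing matrix: non-negative entries with each row summing
# to 1 (row-stochastic). Such a matrix has spectral radius exactly 1, so
# the signal magnitude cannot blow up however many layers you stack.
constrained = np.abs(rng.normal(size=(n, n)))
constrained /= constrained.sum(axis=1, keepdims=True)

def norm_after_layers(mix: np.ndarray, depth: int = 60) -> float:
    """Norm of a unit-norm signal after `depth` applications of `mix`."""
    x = np.ones(n) / 2.0  # starting signal with norm 1
    for _ in range(depth):
        x = mix @ x
    return float(np.linalg.norm(x))

print("unconstrained:", norm_after_layers(unconstrained))  # grows without bound
print("constrained:  ", norm_after_layers(constrained))    # stays bounded
```

The same logic applies to gradients flowing backward: bounding the spectral radius of the mixing is what keeps them from exploding or vanishing with depth. The actual paper enforces its constraints differently (the "manifold" part of the name), but the flavor is the same.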

Read on →

When AI Quit Whispering and Started Running the Show


It’s the last day of 2025, and I’m hunched over my laptop in a Bengaluru apartment. The ceiling fans hum low, pushing back the winter chill. The air outside has that fake December bite—cool enough for a light sweater, but not cold like up north. I’ve been deep in this AI mess all year: jumping on calls that drag like bad dates, reading endless Slack chats from tired coders, and sorting through pitch decks that could stack to the moon. Lately, I’ve been tinkering with an agent app right here on my machine—a simple tool to help solo devs juggle tasks, mixing R1 bits with Copilot tricks while the city winds down for the holidays. Remember those wild January chats? Folks swearing AGI would fix everything from sick beds to sock drawers by summer. Nah. It was more like giving a kid the wheel—fun rides, close calls, and a lot of yelling.

But here’s the thing: 2025 didn’t bring the end-of-days robot takeover we joked about. No AIs kicking bosses out of offices. It was the year the nuts and bolts got honest. We cut the fat on power-hungry training. Agents crawled out of chat boxes and into real jobs, handling the boring stuff we used to fake. And the gear? Man, the gear. A trillion bucks thrown at it, leaving data halls wheezing and my power bill 40% higher. As I sip filter coffee—spilling drops on the keys—I feel we’ve tipped over an edge. Not some shiny paradise. Something rawer, more like us with our screw-ups. Let’s sift through the mess and squint at what’s next.

The Wake-Up Call: DeepSeek R1 Kills the “Spend Big” Lie

Think back: January starts with the same old hype. OpenAI rolls out a beefed-up version. Anthropic tweaks Claude to fix bugs and toss in lame jokes. Tech hotshots burn money like candy—$200 million for one “super model,” $500 million for “fast thinking.” Looks cool at first. Then you step back. Training bills had jumped to nuts levels: GPT-4o hit about $100 million. Claude 3.5 Opus close behind. Each one just dumping raw power into the mix. NVIDIA’s shares? They jittered like a guy on too much coffee, touching $150 by March on talk of endless growth. But growing what? More brain cells? Bigger piles of web junk? Felt like stacking cards into a tower—pretty, but one breeze away from flat.

Read on →