Thought-provoking piece. But one thing feels off: governance is being treated as something done to AI, not with it.
We talk about coordination, restrictions, treaties—but how many of the people shaping these decisions actually understand what today’s AI can and can’t do?
Policymakers should be power users. The best governance won’t come from abstract principles—it’ll come from hands-on experience. From people who’ve tested the limits, seen the failures, and know where the edge cases live. Otherwise we’re making rules for a system we barely understand.
Unpredictable outcomes are inevitable. The question is whether we’ll have skilled people—users and designers both—who can respond with clarity, not fear.
Before we restrict the tools, we need to understand them. Then governance won’t just be policy—it’ll be practice.
Chipmakers are a good place to start.
They don’t need to wait for regulation. They could pause voluntarily—not out of fear, but to build real understanding. Even chipmakers themselves don’t fully grasp what their chips are enabling.
Before we chase more scale, we need to understand what we’ve already built. The class of serious AI power users—people who can push the systems to their edge and report back meaningfully—hardly exists. Chipmakers could help create it by getting advanced hardware into the right hands.
If we’re going to slow down, let it be to think better, not just to move slower.