My current understanding is that getting good regulations is a chicken-and-egg problem. It’s difficult to pass a regulation if the technical implementation details necessary to execute it don’t yet exist. You can try to create broad FDA-like authorities that can invent the technical details later, but that requires more political will and is more at risk of turning into molochian molasses without really doing anything.
One point of the Evals Plan was to lay the groundwork to make it possible to make regulations later.
That said, I agree something feels pretty off, vibes-wise, about the Evals / AI Companies dynamic.
You might argue “predictably, the orgs require a culture that can interface with the labs, and this has spillover dynamics that make it less likely that real/useful regulation gets passed.” But, this post didn’t really connect those dots.
I don’t think the opening point here is very reasonable. (The latter points feel more reasonable to me.)