The OpenAI report also gives 9% as the no-tool pass@1:
Research-level mathematics: OpenAI o3‑mini with high reasoning performs better than its predecessor on FrontierMath. On FrontierMath, when prompted to use a Python tool, o3‑mini with high reasoning effort solves over 32% of problems on the first attempt, including more than 28% of the challenging (T3) problems. These numbers are provisional, and the chart above shows performance without tools or a calculator.
Whether this is feasible depends on how concentrated that 0.25% of the year (~22 hours) is expected to be, because that determines the size of the battery you'd need to cover the blackout period (which I think would be unacceptable for a lot of AI customers).
If the downtime is concentrated in a single stretch of a few days, this logic makes sense: buying ~22 GWh of batteries for a 1 GW datacenter is still extremely expensive (~$2.2B for a 22 h system at $100/kWh, plus installation), which I would expect is too expensive as a reliability measure for a 1 GW datacenter, assuming maybe $10B of revenue from it. If the downtime is much less concentrated in time, a smaller battery is needed (~$100M for a 1 h system at $100/kWh), and I expect AI scalers would happily pay this for the reliability of their systems if the revenue from those datacenters is large enough.
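The arithmetic above can be sketched as a quick back-of-envelope calculation. This is a minimal sketch using the figures assumed in the text (1 GW datacenter, $100/kWh battery cost excluding installation, downtime equal to 0.25% of the year); the function name is illustrative, not from any source:

```python
# Back-of-envelope battery cost for covering a datacenter blackout period.
# Assumed figures from the text: 1 GW datacenter, $100/kWh battery cost
# (installation excluded), total downtime of 0.25% of the year.

HOURS_PER_YEAR = 8760
outage_hours = 0.0025 * HOURS_PER_YEAR  # ~21.9 h if fully concentrated

def battery_cost_usd(power_gw: float, duration_h: float,
                     usd_per_kwh: float = 100.0) -> float:
    """Cost of a battery sized to supply `power_gw` for `duration_h` hours."""
    energy_kwh = power_gw * 1e6 * duration_h  # 1 GW = 1e6 kW
    return energy_kwh * usd_per_kwh

print(f"Total outage hours per year: {outage_hours:.1f}")
# Worst case: all downtime in one concentrated stretch -> ~$2.2B
print(f"Fully concentrated: ${battery_cost_usd(1, outage_hours) / 1e9:.2f}B")
# Downtime spread out: a 1 h buffer suffices -> ~$100M
print(f"1 h buffer: ${battery_cost_usd(1, 1) / 1e6:.0f}M")
```

The two orders of magnitude between the concentrated and spread-out cases are what make the temporal distribution of the downtime the deciding factor.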