A tentative solution to a certain mythological beast of a problem

First post, so please be brutal.

For better or worse, I learned about Roko's Basilisk, the problem that developed on this site, and I had an idea I wanted to bounce off the community most acquainted with it.

What if everyone knew? The AI in this scenario punishes people for not pursuing its creation and thereby tacitly allowing suffering to continue. Fear of this punishment compels people to work towards its creation, such that the threat (or even actual punishment) of the few who know allows for the good of the many in the future. But if everyone knew about the problem, the AI would have no utilitarian incentive to punish anyone for not contributing to its creation: since everyone knew, it would have to punish everyone, resulting in more suffering than such punishments would prevent.
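The cost-benefit flip above can be sketched as a toy calculation. Every number here is a made-up assumption purely for illustration (suffering units are arbitrary, and "punishment" is modeled as a fixed per-person cost); it just shows how the sign of the AI's net utility flips once the set of knowers grows large enough:

```python
# Toy model of the utilitarian incentive described above.
# Assumed, illustrative parameters: punishment_cost and benefit_per_person
# are arbitrary "suffering units", not anything from the original argument.

def net_utility(knowers, future_beneficiaries,
                punishment_cost=10, benefit_per_person=1):
    """Net utility of a punish-everyone-who-knew policy:
    benefit to future people minus suffering inflicted on knowers."""
    return future_beneficiaries * benefit_per_person - knowers * punishment_cost

# Only a few people know: the threat looks "worth it" under this toy calculus.
print(net_utility(knowers=1_000, future_beneficiaries=1_000_000))

# Everyone alive knows: punishing them all swamps the benefit.
print(net_utility(knowers=8_000_000_000, future_beneficiaries=1_000_000))
```

Under these assumed numbers the first case comes out positive and the second sharply negative, which is the whole of the argument: make the punished class large enough and a utilitarian punisher loses its reason to punish.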

I understand the obvious flaw: past generations are finite while future generations are potentially unbounded in population. But surely it then merely becomes a race against the clock, provided we can ensure more people know now than could possibly exist in the future. (That last part sounds incomprehensibly strange/wrong to me, but I'm sure a creative mind could find a way of making it theoretically possible.)

*Edit* For example, you could manipulate events such that the probability of future populations being larger than past populations is less than the probability of future populations being smaller. The constant threats of nuclear annihilation (primarily this), climate change, and disease could lend themselves to this.

The idea is reminiscent of how people handle blackmail in real life: if you don't want to be blackmailed, make sure everyone knows the secret you don't want revealed. Hiding in plain sight. Vulnerable, but on your own terms.