[Question] Would a halfway copied brain emulation be at risk of having different values/identity?

When I was thinking about the concept of human brain emulation recently, a disturbing idea occurred to me. I have never seen anyone address it, so I suspect it is probably caused by my being deeply confused about either human neurobiology or computer science. I thought I'd ask about it in the hope that someone more informed would be able to explain it and the idea could stop bothering me:

Imagine that a brain emulation is in the process of being encoded into a storage medium. I don't think it is relevant whether the copy is being made from an organic brain or an existing emulation. Presumably it takes some amount of time to finish copying all the information onto the storage medium. If the information encodes a person's values or personality and it is only halfway copied, does that mean that, for a brief moment before the copying process is complete, the partial copy has a very different personality or values from the original? Is the partially copied personality/value set a different, simpler set of personality/values?
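To make the worry concrete, here is a toy sketch of the naive picture I have in mind. The trait names and the item-by-item copy are purely illustrative assumptions on my part, not claims about how an emulation would actually be stored:

```python
# Toy illustration only: pretend a person's values are a simple list of traits
# and that copying proceeds one trait at a time.
original = {"likes_reading": True, "likes_jogging": True, "values_honesty": True}

partial_copy = {}
for i, (trait, value) in enumerate(original.items(), start=1):
    partial_copy[trait] = value
    if i == 1:
        # Midway snapshot: at this instant the copy "likes reading" but has no
        # trace of jogging or honesty. Is it briefly a simpler person?
        print("halfway:", partial_copy)

print("finished:", partial_copy)
```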

Presumably the copy is not conscious during the copying process, but I don't think that affects the question. When people are unconscious they still have a personality and values stored in their brain somewhere; they are just not active at the moment.

I find this idea disturbing because it implies that emulating any brain (and possibly copying de novo AI as well) would inevitably result in creating and destroying multiple different personality/value sets that might count as separate people in some way. No one has ever brought this up as an ethical issue about uploads as far as I know (although I have never read "Age of Em" by Robin Hanson), and my background is not in tech or neuroscience, so there is probably something I am missing.

Some of my theories about what I am missing include:

  • I am naively viewing personality/values as being stored in the brain in a simple, list-like format. For example, I may be imagining that someone who likes reading and jogging is being copied, and there is a brief period where the copy is someone who only likes reading because "jogging" has not been copied yet. In reality, personality/values are probably distributed in the brain in some complex way that would not work like that. If they are only partially copied, the emulation would simply crash, or not have values (be unmotivated); it would not have simpler or different values.

  • I am imagining brains as having one section for personality and values, and one section for instrumental things like thought and judgement. In reality this is not the case: the parts are intertwined closely enough that a partially copied brain would not be a simpler version of the original. It would just crash/go insane.

  • It might be theoretically possible to copy a brain in such a way that you could create a simpler version of the original, with a simpler personality and values, that would have a separate, distinct identity. But you'd have to do it very precisely and intentionally to get it to work. Normally, partial copies would just not work/crash, unless they are so close to being completely copied that they are more like the original with mild brain damage/amnesia than like a different person.

  • Brain emulation would not involve literal neurons; it would involve code that says where a virtual neuron is. So having some code missing during copying would not result in the neurons rerouting into new paths that could develop into a new personality, the way they might in someone with brain damage. The missing code would just result in the emulator program crashing (a minimal sketch of this failure mode follows the list).
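If the last two bullets are roughly right, a half-finished copy might look less like a different person and more like an ordinary file-integrity failure. Here is a minimal sketch of that idea, assuming (purely hypothetically) that the emulation state is saved as a length-prefixed blob with a checksum that the emulator verifies before running anything:

```python
import hashlib
import struct

def save_state(path, state_bytes):
    # Hypothetical format: 8-byte payload length, 32-byte SHA-256 digest, payload.
    with open(path, "wb") as f:
        f.write(struct.pack(">Q", len(state_bytes)))
        f.write(hashlib.sha256(state_bytes).digest())
        f.write(state_bytes)

def load_state(path):
    with open(path, "rb") as f:
        header = f.read(40)
        if len(header) < 40:
            raise ValueError("truncated header: nothing runnable here")
        (length,) = struct.unpack(">Q", header[:8])
        digest = header[8:]
        payload = f.read()
        if len(payload) != length or hashlib.sha256(payload).digest() != digest:
            # A half-copied file fails this check: the emulator refuses to start
            # rather than running a "simpler" version of the person.
            raise ValueError("incomplete or corrupted emulation state")
        return payload
```

On that picture, the only intermediate states the copying process ever produces are files that refuse to load, not minds with different values.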

I'd appreciate it if someone with more knowledge about this issue, or about programming/neuroscience, would be willing to explain where my thinking is going wrong. I am interested in explanations that are conditional on brain emulation working; obviously, if brain emulation doesn't work at all, this issue won't arise. Thank you in advance; it is an issue that I continue to find disturbing.