Why are there so few third party auditors of algorithms? For instance, you could have an auditing agency make specific assertions about what the Twitter algorithm is doing, such as whether Community Notes is ‘rigged’.
It could be that the codebase is too large, too many people can make changes, and it’s too hard to verify that the algorithm running in production is stable. This seems unlikely to me with most modern DevOps stacks.
It could be that no one will trust the third party agency. I guess this seems most likely… but really, have we even tried? Could we not have some group of monk-like auditors who would rather die than lie? (My impression is that some cybersecurity professionals already have this ethos.)
If Elon wanted to spend a couple hundred thousand dollars on insanely committed, high-integrity auditors, it’d be a great experiment.
Community Notes is open source. You have to hope that Twitter is actually using the implementation from the open source library, but this would be easy to whistleblow on.
Source needed, but I recall someone on the Community Notes team saying the two are very similar, with some small differences between prod and the open source version (it’s difficult to maintain exact compatibility). For the point of the comment and context, I agree open source does a good job of this, though given the number of people on Twitter who still allege it’s being manipulated, I think you need some additional juice (a whistleblower prize?).
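One way a third party could put teeth on the “prod matches the open source repo” claim: run the open-source scorer on the published note/rating data and diff the statuses it computes against the statuses actually shown on the site. A minimal sketch of the diffing step, with entirely made-up note ids and status strings (not the real data format):

```python
# Hypothetical sketch: compare statuses computed by a locally-run open-source
# scorer against the statuses observed in production. Ids/statuses are made up.
def status_mismatch_rate(predicted, observed):
    """predicted/observed: dicts mapping note_id -> status string.
    Returns (mismatch fraction, sorted mismatched ids) over shared note ids."""
    common = predicted.keys() & observed.keys()
    mismatched = sorted(n for n in common if predicted[n] != observed[n])
    rate = len(mismatched) / len(common) if common else 0.0
    return rate, mismatched

# Fabricated example: one of three shared notes disagrees.
pred = {"n1": "HELPFUL", "n2": "NOT_HELPFUL", "n3": "NEEDS_MORE_RATINGS"}
obs = {"n1": "HELPFUL", "n2": "HELPFUL", "n3": "NEEDS_MORE_RATINGS"}
rate, bad = status_mismatch_rate(pred, obs)
print(rate, bad)
```

A persistently nonzero mismatch rate (beyond what small prod/open-source differences explain) would be exactly the kind of evidence a whistleblower prize could surface.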
Your second option seems likely. E.g., did you know Community Notes is open source? Given that information, are you going to read the associated whitepaper or the issues page?
Even if you do, I think we can still confidently infer that very few others reading this will (I know I won’t).
I did! And I have in fact read the whitepaper (well, some of it :) ). But it still seems weird that it’s not possible to increase trust in the third party through financial means or dramatic PR stunts (the auditor promises to commit seppuku if found to have lied).
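For anyone who won’t read the whitepaper: its core idea is (roughly) a matrix factorization where a note only rates as helpful if its intercept term, i.e. agreement that can’t be explained away by user/note viewpoint factors, is high. A toy sketch of that idea, where all hyperparameters, data, and the training loop are simplified illustrations rather than the production values:

```python
# Toy sketch of the cross-viewpoint-agreement idea from the Community Notes
# whitepaper: fit rating ~ mu + b_user + b_note + f_user * f_note and treat
# the note intercept b_note as the "helpfulness" signal. Simplified to one
# latent dimension; hyperparameters and data are illustrative, not prod values.
import random

def fit_note_intercepts(ratings, n_users, n_notes,
                        lr=0.05, reg=0.03, epochs=400, seed=0):
    """ratings: list of (user_id, note_id, value in {0, 1}).
    Returns the learned note intercepts b_note."""
    rnd = random.Random(seed)
    mu = 0.0
    bu = [0.0] * n_users
    bn = [0.0] * n_notes
    uf = [rnd.gauss(0, 0.1) for _ in range(n_users)]  # user viewpoint factor
    nf = [rnd.gauss(0, 0.1) for _ in range(n_notes)]  # note viewpoint factor
    for _ in range(epochs):
        for u, n, r in ratings:
            err = r - (mu + bu[u] + bn[n] + uf[u] * nf[n])
            mu += lr * err
            bu[u] += lr * (err - reg * bu[u])
            bn[n] += lr * (err - reg * bn[n])
            uf[u], nf[n] = (uf[u] + lr * (err * nf[n] - reg * uf[u]),
                            nf[n] + lr * (err * uf[u] - reg * nf[n]))
    return bn

# Synthetic data: users 0-1 are faction A, users 2-3 are faction B.
# Note 0 is liked by everyone; notes 1 and 2 are liked by only one faction.
ratings = ([(u, 0, 1) for u in range(4)]
           + [(0, 1, 1), (1, 1, 1), (2, 1, 0), (3, 1, 0)]
           + [(0, 2, 0), (1, 2, 0), (2, 2, 1), (3, 2, 1)])
bn = fit_note_intercepts(ratings, n_users=4, n_notes=3)
print(bn)  # note 0 (cross-faction agreement) gets the highest intercept
```

The point of the intercept trick: purely partisan approval gets absorbed by the viewpoint factors, so a note can’t score as helpful just by being popular with one side.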
I think the biggest reason is that platforms (especially Twitter, but this applies to other places too) are currently lying about their algorithms, and thus intentionally avoid third party audits so the deception doesn’t become known. (Like another comment mentioned: is Community Notes’ open source repo actually the one being used?)
So maybe the general explanation is that most of the time, when the trustworthiness of an algorithm is really important, you open source it?