My first guess is that Amodei simply treats the board meeting like that of a relatively standard for-profit company: talking about revenue, growth, new features, new deals, etc.
Maybe, but that doesn’t feel like it explains my confusion!
Reid is interested in the future of AI. Presumably he’s had conversations with Dario about it?
My guess is that it’s a relatively common occurrence for Founders/CEOs to believe that their product is going to do wondrous things and take over the world, and that investors mostly see this as a positive.
Like, I don’t think VCs are especially trying to be intellectuals, and don’t mind much if people around them seem to believe inconsistent or incoherent things. I expect many founders around him believe many crazy things and he doesn’t argue with them about it.
Edit: Seems I was explaining something that wasn’t true! Points awarded to Eli’s model that was confused.
Yes! But also, points deducted from Eli’s epistemic practice: he was confused, said explicitly that none of the hypotheses to describe the observation seemed non-bizarre, and then didn’t have the thought “maybe I’m mistaken about the data.”