Are we already at the point where AI, or some app, can summarize podcasts accurately and extract key takeaways with relatively technical interviewees like Dylan, so we don’t need 5 hours (or even 2.5h at 2x)?
Haven’t used it much, but dexa.ai lets you interact with podcast episodes. Here’s this episode:
https://dexa.ai/d/e2fc9f6e-e1d5-11ef-8e88-ffec9447dc76
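FWIW, if you just want the takeaways without a dedicated app, it’s not hard to wire up yourself. A minimal sketch, assuming openai-whisper and the OpenAI Python SDK; the model names, chunk size, and file path are placeholders I made up, not anything dexa.ai uses:

    # Rough DIY podcast-summarization pipeline (sketch, not production code).
    import whisper
    from openai import OpenAI

    # 1. Transcribe the episode audio locally with Whisper.
    model = whisper.load_model("base")
    transcript = model.transcribe("episode.mp3")["text"]

    # 2. A 5-hour episode won't fit in one prompt, so chunk the transcript
    #    and ask the LLM for takeaways per chunk.
    client = OpenAI()
    chunk_size = 8000  # characters per chunk; tune to the model's context window
    chunks = [transcript[i:i + chunk_size]
              for i in range(0, len(transcript), chunk_size)]

    notes = []
    for chunk in chunks:
        resp = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[{"role": "user",
                       "content": "Extract the key technical takeaways "
                                  "as bullet points:\n\n" + chunk}],
        )
        notes.append(resp.choices[0].message.content)

    print("\n".join(notes))

Whether the result is accurate enough for a technical guest like Dylan is exactly the open question, though; naive chunked summarization tends to flatten the specifics.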
Much appreciated, thanks!
I actually ended up listening to this episode and found it quite high-signal. Lex kept his peace-and-love-kumbaya stuff to a minimum, and Dylan and Nathan went quite deep on specifics like the innovations in DeepSeek V3/R1/R1-Zero, hardware, and export controls.