
QURI

Last edit: 8 Mar 2021 4:28 UTC by Yoav Ravid

The Quantified Uncertainty Research Institute (QURI) is a nonprofit research organization that studies forecasting and epistemics. You can see its website here.

Related Pages: Prediction Markets, Forecasting & Prediction

Announcing the Forecasting Innovation Prize

15 Nov 2020 21:12 UTC
69 points
5 comments · 2 min read · LW link

Epistemic Progress

ozziegooen · 20 Nov 2020 19:58 UTC
61 points
16 comments · 5 min read · LW link

Squiggle: An Overview

ozziegooen · 24 Nov 2020 3:00 UTC
60 points
6 comments · 7 min read · LW link

Squiggle: Technical Overview

ozziegooen · 25 Nov 2020 20:51 UTC
18 points
3 comments · 9 min read · LW link

[Question] Prize: Interesting Examples of Evaluations

28 Nov 2020 21:11 UTC
43 points
25 comments · 2 min read · LW link

Introducing Metaforecast: A Forecast Aggregator and Search Tool

7 Mar 2021 19:03 UTC
83 points
6 comments · 4 min read · LW link

Questions are tools to help answerers optimize utility

ozziegooen · 24 May 2021 19:30 UTC
33 points
6 comments · 4 min read · LW link

Oracles, Informers, and Controllers

ozziegooen · 25 May 2021 14:16 UTC
15 points
2 comments · 3 min read · LW link

The Practice & Virtue of Discernment

ozziegooen · 26 May 2021 0:34 UTC
41 points
11 comments · 11 min read · LW link

Two Definitions of Generalization

ozziegooen · 29 May 2021 4:20 UTC
33 points
4 comments · 2 min read · LW link

All Metaforecast COVID predictions

NunoSempere · 16 Aug 2021 18:30 UTC
16 points
0 comments · 1 min read · LW link

Metaforecast update: Better search, capture functionality, more platforms.

NunoSempere · 16 Aug 2021 18:31 UTC
35 points
0 comments · 3 min read · LW link

AI Safety Papers: An App for the TAI Safety Database

ozziegooen · 21 Aug 2021 2:02 UTC
81 points
13 comments · 2 min read · LW link

18 possible meanings of “I Like Red”

ozziegooen · 23 Aug 2021 23:25 UTC
29 points
14 comments · 3 min read · LW link

Information Assets

ozziegooen · 24 Aug 2021 4:32 UTC
42 points
3 comments · 9 min read · LW link

Intelligence, epistemics, and sanity, in three short parts

ozziegooen · 15 Oct 2021 4:01 UTC
14 points
2 comments · 3 min read · LW link

Prioritization Research for Advancing Wisdom and Intelligence

ozziegooen · 18 Oct 2021 22:28 UTC
49 points
8 comments · 5 min read · LW link
(forum.effectivealtruism.org)

Announcing Squiggle: Early Access

ozziegooen · 3 Aug 2022 19:48 UTC
51 points
7 comments · 7 min read · LW link
(forum.effectivealtruism.org)

Guesstimate: Why and how to use it

24 Jan 2023 16:24 UTC
6 points
0 comments · 3 min read · LW link
(forum.effectivealtruism.org)

Eli Lifland on Navigating the AI Alignment Landscape

ozziegooen · 1 Feb 2023 21:17 UTC
9 points
1 comment · 31 min read · LW link
(quri.substack.com)

Relative Value Functions: A Flexible New Format for Value Estimation

ozziegooen · 18 May 2023 16:39 UTC
20 points
0 comments · 1 min read · LW link

Distinctions when Discussing Utility Functions

ozziegooen · 9 Mar 2024 20:14 UTC
24 points
7 comments · 1 min read · LW link