Prometheus has two primary goals:

  1. Increase the accessibility of science bowl to encourage scientific learning and thinking.

  2. Introduce talented high schoolers to existential risks and encourage them to make a larger impact.

Existential Risk

This year, Prometheus will include a new category on existential risk, or X-risk for short.

The first form of existential risk is an event that causes human extinction. This is obviously something we should avoid, and many see it as bad because it would cause immense suffering for every human on Earth and cut their lives short. However, the primary damage of an existential risk comes from the loss of the future, not just the present: human extinction would also mean the death (or non-existence) of every possible future human, and those future generations could contain orders of magnitude more people than are alive today. As such, it makes sense to protect this future, and to ensure that humanity can reach its fullest potential, by working to prevent existential risks.
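
To make that “orders of magnitude” claim concrete, here is a rough back-of-the-envelope sketch; every number in it (the assumed 500,000-year future, the population, the lifespan) is a purely illustrative assumption, not a forecast:

```python
# Back-of-the-envelope sketch with purely illustrative assumptions.
current_population = 8e9     # roughly 8 billion people alive today
average_lifespan = 80        # assumed years per human life
future_years = 500_000       # assumed (arbitrary) future duration of humanity

# Approximate number of future lives if population stays constant.
future_people = current_population * (future_years / average_lifespan)
print(f"{future_people:.1e} potential future people")              # ~5.0e13
print(f"about {future_people / current_population:,.0f}x the people alive today")
```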

Learn more:
What are existential risks?
Why should we care about existential risks?
Further discussion

Longtermism

Longtermism is an ethical view suggesting that future people matter just as much as people alive today. This implies that, because far more humans could exist across future generations than exist now, we should work to ensure those generations get the chance to exist by minimizing existential risks, which, if realized, would reduce that chance to zero.

Longtermism does not necessarily advocate for any specific future, and ideas on this topic vary widely. The exact future we should aim for is non-obvious and might be something entirely different from what we expect. Still, no matter what you think would be best for humanity, that optimal future necessarily includes humanity's survival, making the minimization of existential risk an instrumentally convergent goal for improving the future. Prometheus focuses on what some call “weak” longtermism: the belief that protecting future generations should be one of our priorities, though not necessarily our greatest priority. (Read about the difference here.)

Learn more:
How should we define longtermism?
Why should we care about future generations?
Links & further discussion

Epistemics

Existential risks are extremely difficult to think about. Many top researchers in these fields disagree with each other on key concepts, and many outside the field question whether these concerns are valid at all. To make this easier, we can draw on epistemics, the study of knowledge and belief, to reason more clearly. In particular, many in these fields use Bayesianism, which quantifies beliefs as probabilities and updates them on new evidence according to Bayes’ rule.
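
As an illustration of what a single Bayesian update looks like in practice, here is a minimal Python sketch; the prior and likelihoods are hypothetical numbers chosen only to show the mechanics of Bayes’ rule:

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Apply Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E),
    where P(E) = P(E|H) * P(H) + P(E|~H) * (1 - P(H))."""
    p_evidence = p_evidence_if_true * prior + p_evidence_if_false * (1 - prior)
    return p_evidence_if_true * prior / p_evidence

# Hypothetical numbers: start with a 5% prior in some hypothesis, then observe
# evidence that is 4x more likely if the hypothesis is true than if it is false.
posterior = bayes_update(prior=0.05, p_evidence_if_true=0.8, p_evidence_if_false=0.2)
print(f"posterior = {posterior:.2f}")  # ~0.17: the evidence roughly triples our credence
```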

Various other statistical tools are also useful for quantifying ideas related to existential risks. For example, estimating the likelihood of particular events is crucial to putting actual probabilities on these risks, as Toby Ord does in his book The Precipice. Statistics lets us turn subjective beliefs, e.g. “I think AI has a high chance of killing us all,” into concrete, quantified estimates that can be refined and used to weigh the relative importance of different ideas and causes.
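
For example, here is a small sketch (using a made-up annual risk figure, not Ord’s actual estimates) of how a constant per-year probability of catastrophe compounds over a longer horizon:

```python
def cumulative_risk(annual_risk, years):
    """Probability of at least one catastrophe over `years`,
    assuming an independent, constant risk each year."""
    return 1 - (1 - annual_risk) ** years

# Made-up illustrative figure: a 0.1% chance per year compounds to
# roughly a 9.5% chance of at least one catastrophe over a century.
print(f"{cumulative_risk(0.001, 100):.1%}")  # ~9.5%
```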

Learn more:
Using Bayes’ Rule & Bayes’ Theorem
Forecasting your beliefs
Further discussion

X-Risk Topic Distribution

Questions in the X-risk category will be approximately divided according to the following distribution:

25% • Artificial intelligence

25% • Biosecurity

12.5% • Nuclear weapons & technology

12.5% • Climate change

12.5% • Other risks, e.g. nanotechnology, extraterrestrial organisms, supervolcanoes

12.5% • Epistemics and statistics

Learn more:

Artificial intelligence:
Vox: Taking AI seriously
Cold Takes: Why alignment is hard
Ajeya Cotra: Why an AI takeover is likely
Eliciting Latent Knowledge*
AGI Safety Fundamentals*
AI Alignment Forum

Biosecurity:
Dept of Health & Human Services: Biorisk management
Biosecurity Fundamentals
CERI: Biosecurity resources*
Biorisk reading list

Nuclear:
Nuclear war as a global catastrophic risk
LLNL: Nuclear Weapons Technology 101
80,000 Hours: Nuclear war*
CERI: Nuclear security

Climate change:
The understatement of existential climate risk
Exploring catastrophic climate risks
StudySmarter: Climate change & feedback loops
80,000 Hours: Climate change*

Epistemics:
LessWrong* & The Sequences*
Center for Applied Rationality
Metaculus

*Highly recommended