When considering which problems are largest in scale, most important, and most neglected, it is crucial to think not just of people alive today, but also of future generations. The lives of future generations matter just as much as ours, yet they have no opportunity to influence the world into which they are born. Moreover, the people who will exist in the future will likely far outnumber everyone who has existed to date. This cause area focuses on global catastrophic risks from emerging technologies that have the potential to cause suffering on an astronomical scale to future generations.
It is extremely difficult to identify promising interventions in this cause area: it is not only highly complex and hence difficult to map, but also highly neglected compared to the previously mentioned cause areas. The best we can do now is conduct research and raise awareness to minimise the likelihood of global catastrophic risks becoming a reality.
We recommend the Machine Intelligence Research Institute and our own fund. These recommendations focus on scenarios worse than extinction. If you are primarily concerned with extinction scenarios, we instead recommend the Long-term Future Fund run by the Centre for Effective Altruism.