Longtermism is the view that the most urgent global priorities, and those to which we should devote the largest portion of our resources, are those that focus on (i) ensuring a long future for humanity, and perhaps sentient or intelligent life more generally, and (ii) improving the quality of the lives that inhabit that long future. While it is by no means the only one, the argument most commonly given for this conclusion is that these interventions have greater expected goodness (per unit of resource devoted to them) than each of the other available interventions, including those that focus on the health and well-being of the current population. In this paper, I argue that, even if we grant the consequentialist ethics upon which this argument depends, and even if we grant one of the axiologies that are typically paired with that ethics to give the argument, we are not morally required to choose an option that maximises expected utility; indeed, we might not even be permitted to do so. Instead, I will argue, if the argument's consequentialism is correct, we should choose using a decision theory that is sensitive to risk, and allows us to give greater weight to worse-case outcomes than expected utility theory does. And, I will show, such decision theories do not always recommend longtermist interventions.
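The contrast drawn in the abstract can be made concrete with a toy numerical sketch. The following is not from the paper itself: it uses one well-known risk-sensitive rule, Buchak-style risk-weighted expected utility (REU) with a convex risk function r(p) = p², and two hypothetical options invented for illustration, to show how a rule that gives greater weight to worse outcomes can reverse an expected-utility verdict.

```python
def expected_utility(lottery):
    """Standard expected utility of a lottery given as (probability, utility) pairs."""
    return sum(p * u for p, u in lottery)

def risk_weighted_eu(lottery, r=lambda p: p ** 2):
    """Buchak-style REU: with outcomes ordered worst to best,
    REU = u_1 + sum_{i>1} r(P(getting at least u_i)) * (u_i - u_{i-1}).
    A convex r discounts improvements over the worst case, so worse
    outcomes weigh more heavily than under expected utility."""
    outcomes = sorted(lottery, key=lambda pu: pu[1])
    total = outcomes[0][1]          # start from the worst outcome's utility
    prob_at_least = 1.0
    for i in range(1, len(outcomes)):
        prob_at_least -= outcomes[i - 1][0]   # P(doing at least this well)
        total += r(prob_at_least) * (outcomes[i][1] - outcomes[i - 1][1])
    return total

# Hypothetical options (illustrative numbers only):
# a "longshot" intervention with a tiny chance of an enormous payoff,
# and a "sure thing" with a guaranteed modest payoff.
longshot = [(0.999, 0.0), (0.001, 10_000.0)]
sure_thing = [(1.0, 5.0)]

# Expected utility favours the longshot; the risk-weighted rule does not.
print(expected_utility(longshot), expected_utility(sure_thing))
print(risk_weighted_eu(longshot), risk_weighted_eu(sure_thing))
```

With these made-up numbers the longshot has twice the expected utility of the sure thing, yet the risk-weighted agent prefers the sure thing, illustrating the abstract's claim that risk-sensitive decision theories need not recommend the intervention with the greatest expected goodness.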