This 80,000 Hours podcast with Alexander Berger is really thought-provoking on many levels. One highlight is that he puts into words some of my previously vague misgivings about focusing on the far future (beyond the usual quibbles with the numbers). From the "Cluelessness" section:
I think it makes you want to just say wow, this is all really complicated and I should bring a lot of uncertainty and modesty to it. ... I think the more you keep considering these deeper levels of philosophy [editor's note], these deeper levels of uncertainty about the nature of the world, the more you just feel like you’re on extremely unstable ground about everything. ... my life could totally turn out to cause great harm to others due to the complicated, chaotic nature of the universe in spite of my best intentions. ... I think it is true that we cannot in any way predict the impacts of our actions. And if you’re a utilitarian, that’s a very odd, scary, complicated thought. But I think that in some sense, basically ignoring it and living your life like you are able to preserve your everyday normal moral concerns and intuitions to me seems actually basically correct.
I think the EA community probably comes across as wildly overconfident about this stuff a lot of the time, because it’s like we’ve discovered these deep moral truths, then it’s like, “Wow, we have no idea.” I think we are all really very much — including me — naive and ignorant about what impact we will have in the future.
I’m going to rely on my everyday moral intuition that saving lives is good ... I think it’s maximizable, I think if everybody followed it, it would be good.
Maybe we should just pray harder.