Suppose you were offered the following gamble:

1. With probability p, you will live forever at your current age.

2. With probability (1-p), you die instantly and painlessly.

What is your critical value of p?  If you combine expected utility theory with the empirical observation that happiness is pretty flat over time, it seems like you should be willing to take the gamble even for a very tiny p.  But I can’t honestly say that I’d accept any p < 1/3.
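To see where the "very tiny p" intuition comes from, here's a back-of-the-envelope version (my own illustration, with u, T, and L as stand-in symbols): suppose each year of life yields a constant utility u, you expect about T more years under the status quo, and immortality means an effectively unbounded horizon of L years. Setting aside discounting, accepting the gamble requires

p · u · L > u · T,  i.e.  p > T/L,

and T/L shrinks toward zero as L grows, so almost any positive p should do.  With T = 50 and L = 5,000, for instance, the critical value is only 0.01.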

Perhaps the main reason is that all the people I care about would suffer a lot more from my instant death than they’d gain from my immortality.  But even if I were fully selfish, I wouldn’t be enthusiastic at p = .5.  Should I get my head examined?