PZ Myers has a posting where he makes a short argument against transhumanist uploading. This was relevant to my interests, because I think uploading is bonkers.
He has two arguments, really. Unfortunately one (using entropy) is just wrong: entropy doesn’t prevent complex systems; it only requires that more entropy be generated to offset them. So long as you convert only a tiny fraction of the universe into computronium, entropy won’t stand in your way.
His other argument was better, but sketchy: uploaders prefer “what is good for the individual over what is good for the population”. As he was arguing with Eliezer Yudkowsky among others, this is probably a misfire– judging from his Harry Potter fanfic, Yudkowsky does consider it an imperative that technology benefit everyone.
Still, there’s the germ of an actual good argument in there: that the uploaders think way too much about personally not dying, and way not enough about how to make what life we have worth living. Morally, it’s hard to argue that our biggest problem is that people don’t live 1,000 or 1,000,000 years. If humans keep on with the sort of behavior and morality and economics they have right now, such lifetimes would be hellish. Even if you have a wildly optimistic view of how well we’re doing, prolonging lifetimes even to a couple hundred years would be horrible for 90% of the population, and that’s assuming we can even keep our civilization going. (If you want to live forever, climate change is not your grandchildren’s problem, it’s yours.) So even if you want immortality, you’d better prioritize, well, almost everything else.
But that’s a discussion for another day. I was caught up short by this comment, by one Gregory in Seattle:
There is a growing belief among memory researchers that the brain relies on “archetypes.” You actually have only one or two physical memories of the taste of bacon: all of the apparent memories of bacon link back to them. REM sleep is when the brain recompiles, tossing out actual memories from short-term storage and integrating the day’s experiences into long-term storage with heavy object reuse (pardon the computerese.)
According to this model, children learn faster because they have fewer archetypes: they are building a “library” and links into them are pretty straightforward. As we get older, though, the ability to store and link novel information becomes more difficult and memory begins to ossify. Someone who pursues life-long learning can stave this off, but not completely. To use another computer example, the problem does not appear to be one of storage so much as the storage becoming fragmented. The ability to link begins to suffer, and memories begin to get lost in the shuffle.
Without a major redesign of how the brain stores memories, very long lifespans will probably bring us to a point where novel experiences cannot be integrated at all. We see this sort of slow down in people who are 90 and 100; I cannot imagine what it would be like for someone who is 200, much less 500 or 1000.
I’d never heard about this theory, but then I don’t know anything really about memory research. But it’s a fascinating idea, and one that makes a lot of sense as a way for a creature of limited brain to organize the reams of sensory data that swamp it daily.
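Purely as an analogy, the dedup-and-link scheme the comment describes can be put in computerese directly. Here’s a toy Python sketch– all the names, the feature-set representation, and the similarity rule are invented for illustration, not anything from actual memory research:

```python
# Toy model of "archetype" memory: each experience either links back to
# an existing archetype (heavy object reuse) or becomes a new archetype.

class ArchetypeStore:
    def __init__(self, threshold=2):
        self.archetypes = []        # canonical memories, stored once
        self.links = []             # everyday memories: indices into archetypes
        self.threshold = threshold  # how "close" counts as the same archetype

    def _distance(self, a, b):
        # Crude similarity: number of features the two experiences don't share.
        return len(set(a) ^ set(b))

    def integrate(self, experience):
        """Consolidate one experience, reusing an archetype if one is close enough."""
        for i, arch in enumerate(self.archetypes):
            if self._distance(experience, arch) <= self.threshold:
                self.links.append(i)            # reuse: just store a link
                return i
        self.archetypes.append(experience)      # genuinely novel: new archetype
        self.links.append(len(self.archetypes) - 1)
        return len(self.archetypes) - 1

store = ArchetypeStore()
store.integrate(frozenset({"salty", "crispy", "smoky"}))  # first taste of bacon
store.integrate(frozenset({"salty", "crispy", "burnt"}))  # links to the bacon archetype
store.integrate(frozenset({"sweet", "cold", "creamy"}))   # ice cream: new archetype
print(len(store.archetypes))  # 2 archetypes back three memories
print(len(store.links))       # 3
```

The “fragmentation” point falls out of the sketch too: the more archetypes accumulate, the more candidates each new experience has to be compared against, and the likelier it is to get forced into a near-miss archetype instead of being stored as something new.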
Though it’s not so much an argument against long lives as an argument that if we want to have them, we’ll have to change some basic facts about ourselves. That’s why, in the Incatena, I have people doing a kind of brain reboot every century or two: throw out a bunch of memories, loosen the connections, re-adolescentize the brain.
To put it another way, your basic personality, attitudes, ideology, politics, etc. are generally pretty well firmed up by the time you’re 30. You can adapt to new things after that, but with increasing difficulty– by the time you’re 80, you’re a curmudgeon who hates the kids’ music and clothing and votes for reactionaries. That’s acceptable when lifetimes are 90 years, but not if they’re 900. If you refuse to die, then you have to do something to regain your adaptability, for your own benefit and for that of society.