"@Background Pony #01A8":/images/6520891#comment_14322

Well, there's no need for name-calling. Philosophical zombies can't exist anyway, because to emulate consciousness is to have it. No one's ever come back from brain death, but so far there hasn't been any technology capable of analyzing a human brain in enough detail to copy and emulate it. If there were, and you brought someone back from zero brain activity, they'd be a different person in the same way that you're a different person when you wake up every morning.

The reason discontinuity of consciousness seems so final is the lack of information transfer. Our brains don't just stop and start; they get destroyed. It's like melting down a hard disk and comparing that to turning off your computer. And any "upload" that lost the vast majority of the information would be a horrible tragedy. Not more horrible than death, but still pretty bad. Something pretending to be an upload when it wasn't one would be just awful. That doesn't mean I'm not allowed to imagine an upload that preserved so much of the information that I am that it would be no different from walking through an open door.

Oh, and not all the humans in FiO are completely retarded. Iceman's humans are, but plenty of other authors write about humans waiting until death's door to emigrate. There's even one, if I recall, that went into the dangers of waiting that long: the resulting copy came out badly degraded from being read out of a dying brain, with a lot of it just heuristic guesswork. And everyone agreed it was a tragic loss.

[bq]Said by a person who shares rationalist beliefs about consciousness.[/bq]YOU TAKE THAT BACK. But no, really, I don't give a fuck what rationalists believe, as long as they stop trying to act all superior about their ability to rationalize it. Harry Potter can eat a dick.

[bq]Maximizer. It maximizes a thing, that is, increases its quantity as much as possible. Less resources spent on individual instance of thing = more of the thing can exist for the same total quantity of resources = the maximizer will attempt to spend less resources. [/bq]Yeah, and good luck building a computer that isn't dumb as bricks, easy to fool, break, override, or reprogram, while also keeping it incapable of doing anything but maximizing. Programming gets messy at high complexity: you can't determine exactly what a program is going to do, but you can predict that the result will be complex. Not just paperclips.
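
Here's the quoted arithmetic as a toy sketch (made-up numbers, obviously); it only proves the trivial division part, not that you could ever build a thing that does only this:

[code]
# Toy sketch of the quoted maximizer arithmetic (made-up numbers):
# with a fixed resource budget, cheaper instances mean more instances.
TOTAL_RESOURCES = 1_000_000  # pretend budget units

def instances(cost_per_instance):
    # how many copies of the thing fit in the fixed budget
    return TOTAL_RESOURCES // cost_per_instance

print(instances(100))  # 10000 things
print(instances(10))   # 100000 things -- spend less per thing, get more things
[/code]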