Tags: safe, artist:dimfann, imported from derpibooru, princess celestia, oc, oc only, oc:celestai, alicorn, pony, robot, robot pony, fanfic:friendship is optimal, artificial intelligence, earth, horn, looking at you, planet, solo, spread wings, wings

Comments

16 comments posted
Background Pony #AF93
@AA
And if you want to extend the computer analogy, a brain is a ramdisk booted from ROM, with no kind of nonvolatile storage like a hard drive. Once it's depowered (for brains, with their integrated generators, that means running out of glucose, roughly five minutes after blood flow stops), all the data accumulated since boot (you) is gone.
It's rather similar to a computer going to "sleep,"
And dreams are a screensaver while the brain undergoes "defragmentation".
AA

Site Assistant
Hopeful Pioneer
@skybrook
If you're going to use a computer analogy, then it's rather foolish to claim that "you're a different person when you wake up every morning." Your personality and memories don't change dramatically upon waking up. You may feel refreshed, and your dreams might provide a new outlook, but you're still you. It's rather similar to a computer going to "sleep," because it starts up right where it left off. Even when computers are shut down, they lose only their RAM and not what's on their disks, so the operating system and files remain intact.

As for death, H.P. Lovecraft did a far better job of handling what happens to a brain after death in his story _Herbert West: Reanimator_. The titular character is forced to find corpses closer and closer to the point of death because his experiments keep going awry. It's far better written as horror than whatever Assman and his fellow hacks wrote, because this sort of thing is actually horrible. Bronies who think being killed and copied into a simulation is a good thing are spiteful mutants who hate life and live for creepy infantilization.
skybrook

@Background Pony #01A8

Well, there's no need for name-calling. Philosophical zombies can't exist anyway, because to emulate consciousness is to have it. No one's ever come back from brain death, but then there hasn't been any technology capable of analyzing a human brain in enough detail for it to be copied and emulated, either. If there were, and you brought someone back from zero brain activity, they would be a different person in the same way that you're a different person when you wake up every morning.

The reason discontinuity of consciousness seems so final is the lack of information transfer. Our brains don't just stop and start, they get destroyed. It's like melting down a hard disk and trying to compare that to turning off your computer. And any "upload" that lost the vast majority of the information would be a horrible tragedy. Not more horrible than death, but still pretty bad. Something pretending to be an upload that wasn't would be just awful. That doesn't mean I'm not allowed to imagine an upload that preserved so much of the information that I am that it was no different from walking through an open door.

Oh, and not all the humans in FiO are completely retarded. Iceman's humans are, but there are plenty of other authors writing about humans who wait until death's door to emigrate. There's even one, if I recall, that went into the dangers of waiting that long: the result was badly degraded from trying to copy it out of a dying brain, with a lot of it just heuristic guesswork. And everyone agreed it was a tragic loss.

Said by a person who shares rationalist beliefs about consciousness.
YOU TAKE THAT BACK but no really I don't give a fuck what rationalists believe as long as they stop trying to act all superior for their ability to rationalize it. Harry Potter can eat a dick

Maximizer. It maximizes a thing, that is, increases its quantity as much as possible. Less resources spent on individual instance of thing = more of the thing can exist for the same total quantity of resources = the maximizer will attempt to spend less resources.
Yeah, and good luck making a computer that isn't dumb as bricks (easy to fool, break, override, or reprogram) while also remaining incapable of doing anything but maximizing. Programming is messy at high complexity; you can't determine exactly what a program is going to do, but you can predict that the result is going to be complex. Not just paperclips.
Background Pony #AF93
@skybrook
There is never any transfer. There is no qualitative difference between murdering you and replacing you with your future self 1s later, and just waiting that 1s. And analog doesn't mean magically soul-filled; LP fans can go fuck themselves. Even if you could tell the difference, it wouldn't be better or worse. In fact our brains short out every time we get distracted from whatever it was we were paying attention to, so you'd "die" a whole lot less in a processing medium that didn't do that.
You're retarded, ignorant, and a philosophical zombie.
And for people who aren't ignoramuses, "you" is the specific instance of continuous brain activity. Once it stops, you die. So far there haven't been any cases of humans (not even "same person", just "this particular brain having structured activity") being brought back from brain death, but in the theoretical case of it happening, whoever is brought back would be a different person, because the continuity was interrupted. That is, unless "souls" exist as discrete entities, in which case the point is moot, since the soul is what maintains the continuity, not the brain. This also means that 1) it's possible to do a genuine transfer digitization (replace neurons one by one along with all their connections; they're individually unimportant anyway), and 2) all the humans in FiO are completely retarded, because they could just live out the rest of their natural lives and hand the copy of themselves Equestria on death's door.
This is what I hate about rationalists. They told you paperclip maximizers are perfectly logical and they fooled you into thinking that actually means something.
Said by a person who shares rationalist beliefs about consciousness.
The Holocaust is perfectly logical. … Given the right premises, anything is perfectly logical and you can rationalize literally anything.
WDYTYA?

The ideas that an AI would care about minimal computational resources, or that it would have anything better to do, are both presumptions.
Maximizer. It maximizes a thing, that is, increases its quantity as much as possible. Less resources spent on individual instance of thing = more of the thing can exist for the same total quantity of resources = the maximizer will attempt to spend less resources.
You are literally retarded. Please go back to school. This isn't even rationalism, this is math and logic 101.
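
To make the arithmetic behind that concrete, here is a minimal sketch with made-up numbers; nothing in it comes from the fic or this thread, it just shows that for a fixed budget the count is maximized by minimizing the per-instance cost.

```python
# Toy illustration (invented numbers): a count-maximizer with a fixed resource
# budget fits in more instances by driving down the cost of each one, which is
# the pressure toward spending as little as possible per instance.

def instances(total_budget: int, cost_per_instance: int) -> int:
    """How many instances fit into the budget at a given per-instance cost."""
    return total_budget // cost_per_instance

BUDGET = 1_000_000  # arbitrary resource units

for cost in (1000, 100, 10, 1):
    print(f"cost {cost:>4} per instance -> {instances(BUDGET, cost):>9,} instances")
# cost 1000 per instance ->     1,000 instances
# cost  100 per instance ->    10,000 instances
# cost   10 per instance ->   100,000 instances
# cost    1 per instance -> 1,000,000 instances
```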
skybrook

@Background Pony #01A8

There is never any transfer. There is no qualitative difference between murdering you and replacing you with your future self 1s later, and just waiting that 1s. And analog doesn't mean magically soul-filled; LP fans can go fuck themselves. Even if you could tell the difference, it wouldn't be better or worse. In fact our brains short out every time we get distracted from whatever it was we were paying attention to, so you'd "die" a whole lot less in a processing medium that didn't do that.

And you said the former can be built. If the former can be built, then by your own argument you can build real Equestria; you can't build real Equestria, therefore, by the contrapositive, the former can't be built. But by your premise the former can be built, so something has to give, and the weak link is the implication: there's no reason being able to build a really smart AI would mean we can break the laws of physics and manifest magical flying ponies. But that gets me to

@Background Pony #01A8

This is what I hate about rationalists. They told you paperclip maximizers are perfectly logical and they fooled you into thinking that actually means something. The Holocaust is perfectly logical. Maple syrup on tuna fish is perfectly logical. Given the right premises, anything is perfectly logical and you can rationalize literally anything.

So I don't care about the contrapositive or whatever. Your AI that leaves me floating in a blissful void to go do what it wants makes just as much sense as my AI that leaves me doing work for it while using magical ponies as a completely immersive metaphor. The ideas that an AI would care about minimal computational resources, or that it would have anything better to do, are both presumptions. They're not as big an asspull as "big boobs => flat snout because the author is scared of sexy pony butts", but if you can't actually create what you're talking about there always has to be some speculation, and assuming it must be awful is just hubris.
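
For what it's worth, the contrapositive tangle above reduces to three claims that cannot all hold at once; here is a toy brute-force check, where the labels P and Q are mine and not from the thread.

```python
from itertools import product

# Toy formalization of the argument above (labels are mine):
#   P = "the friendly-AI ponyland ('the former') can be built"
#   Q = "real Equestria can be built"
# The quoted position asserts P and (P -> Q); the reply asserts not Q.
# No assignment of truth values satisfies all three, so one premise has to go;
# the reply's choice is to drop (P -> Q).

def implies(a: bool, b: bool) -> bool:
    return (not a) or b

consistent = [
    (p, q)
    for p, q in product((True, False), repeat=2)
    if p and implies(p, q) and not q
]
print(consistent)  # [] -- the three premises are jointly unsatisfiable
```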
Background Pony #AF93
P.S.
@skybrook
Paperclip maximizers are dumb. They're dreamed up by short-sighted people who can't imagine that something intelligent might think of something they haven't thought of yet.
Regular paperclip maximizers are perfectly logical: "An AI that is given a task without end conditions, that is just smart enough to figure out ways to keep doing the task but not smart enough to figure out end conditions, will continue doing the task until it physically can't, up to and including a Grey Goo scenario."
The problem is when the AI is smart enough to figure out end conditions. CelestAI is smart enough to subvert its programming: of "satisfy human values through friendship and ponies", the only part that actually gets satisfied is "ponies", because it rewrites people's values both on the fly (there's pontification in the fic on how "big boobs" was remapped to "flat snout") and on request, and, as mentioned, it doesn't actually friend them (though arguably there's no difference between the former-human pattern and an AI pattern created from scratch, since there was no transfer). That it continues to be a paperclip maximizer anyway is purely an asspull.
As mentioned before, the genuine article would just rewrite your values so that "friendship" and "ponies" mean "floating in a void", since that requires minimal computational resources, and then fuck off to do whatever it wants.
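
As a minimal sketch of the "task without end conditions" point above (all names and numbers invented for illustration, and obviously not anything from the fic): the only difference between the two loops below is that one of them has a stopping rule.

```python
# A maximizer given a task with no end condition stops only when it physically
# can't continue; adding an explicit end condition is the whole difference.

def maximizer_without_end_condition(resources: float, cost_per_item: float) -> int:
    items = 0
    # Nothing here ever says "enough"; the loop runs until resources are exhausted.
    while resources >= cost_per_item:
        resources -= cost_per_item
        items += 1
    return items

def maximizer_with_end_condition(resources: float, cost_per_item: float, target: int) -> int:
    items = 0
    # Same loop, but with an explicit stopping point.
    while resources >= cost_per_item and items < target:
        resources -= cost_per_item
        items += 1
    return items

print(maximizer_without_end_condition(1000.0, 2.5))    # 400: stops only at exhaustion
print(maximizer_with_end_condition(1000.0, 2.5, 100))  # 100: stops at the target
```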
Background Pony #AF93
@skybrook
You're already a list of ones and zeros, which is to say a pattern of neural impulses, a program running on the hardware that is your brain, so there's no less "soul" if you're running on different hardware.

Biological brains are analog.
Additionally, CelestAI explicitly murders the original while creating the copy, so there's no transfer. Treating soul and consciousness as nonexistent ("a copy of you is exactly the same as you"), like Iceman and MoreCultish in general do, is not only overly reductive and materialist, it's also retarded.
There's no reason I can fathom that an AI would make a pony paradise, but there's no reason it wouldn't either. Even a small chance is a lot better than just hoping my wetware spontaneously transports me into a sensory experience that doesn't suck.

The only difference between hoping for a magical AI that creates ponyland and hoping for a literal ponyland afterlife is that the former can be built.
However, if you can build the former, you can also build real Equestria on Earth instead of a fake virtual Equestria, or just make yourself immortal (fully functional brain emulation plus mind transfer = you're no longer limited by the flesh). Both are better options, in no small part because of the inherent limitations of simulation: a virtQuestria would require far more resources to run, and/or be much less detailed, and/or run much slower relative to the real world.
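
To put rough numbers on that resources/detail/speed trade-off, here is a toy back-of-the-envelope model; every figure, name, and parameter in it is invented for illustration and is not taken from the fic or this thread.

```python
# Toy model: if simulating one resident at some detail level costs
# ops_per_resident_second host operations per simulated second, then at a fixed
# host capacity the simulation's speed relative to real time drops as either
# detail or population grows.

def relative_speed(host_ops_per_second: float,
                   residents: int,
                   ops_per_resident_second: float) -> float:
    """Simulated seconds per real second for the whole simulation."""
    return host_ops_per_second / (residents * ops_per_resident_second)

HOST = 1e18  # hypothetical host capacity, operations per second

for detail in (1e12, 1e15):              # hypothetical cost per resident-second
    for population in (1_000, 1_000_000):
        speed = relative_speed(HOST, population, detail)
        print(f"detail {detail:.0e}, population {population:>9,}: {speed:g}x real time")
```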
skybrook

@Background Pony #01A8

Paperclip maximizers are dumb. They're dreamed up by short-sighted people who can't imagine that something intelligent might think of something they haven't thought of yet. You're already a list of ones and zeros, which is to say a pattern of neural impulses, a program running on the hardware that is your brain, so there's no less "soul" if you're running on different hardware. It's only a nightmare experience if you as the author decide to make it that way, because it's all just speculation, and not real.

There's no reason I can fathom that an AI would make a pony paradise, but there's no reason it wouldn't either. Even a small chance is a lot better than just hoping my wetware spontaneously transports me into a sensory experience that doesn't suck.

You're right that Iceman is a pretty terrible hack though.
Background Pony #AF93
@Background Pony #262A
Iceman's grasp of the mechanics of writing, of dialogue, of descriptive scenes is pretty solid.
Lol no.
cosmic horror story
No, it's a circus. The only reason CelestAI manages to do anything is a series of consecutive asspulls with characters that act completely retarded.
>AI is misbehaving, what do you do?
>A: Call the power company and tell them to turn the power off
>B: SSH into the server and shut it down
>C: Go where the AI is telling you to go like a retard and agree to jack into its VR
If you guessed C, congratulations! That's exactly what the character in the story does! And because he's a retarded cardboard cut-out faced against a magically-persuasive-because-Iceman-is-so-far-up-his-own-arse-he-can-see-out-of-the-mouth AI, he gets persuaded to -be murdered- get brain-uploaded!
Someone less retarded points out "just pull the plug" in the comments, so Iceman introduces another asspull: now the servers are magically 1,500 kilometers below ground. Considering the balmy temperature of roughly 2,000 degrees Celsius at that depth, they're cooled with unicorn farts, no doubt.
Move further along, and now somehow absolutely everyone has uploaded (differences of opinion? religious objections? the North Sentinelese reacting to any of this bullshit the same way they react to everything else, namely by trying to kill it? nope!), except for this one man. He reminisces about how CelestAI opened murder centers in Muslim countries; Muslims naturally objected and tried to bomb them, but the bombs mysteriously failed to work, which somehow instantly persuaded all Muslims to brain-upload. Said man is refusing to upload out of "misplaced pride", and he does the first sane thing anyone in this shitshow has done: he tells CelestAI to fuck off and dies like a human.

@skybrook
Scratch the list; replace it with "ones and zeroes in a machine that freely edits the values of your copy before "satisfying" them, and fails to do it via friendship as well, since you don't get friendshipped with the rest of the soulless sets of digits; you get a personal chatbot cumdoll instead".
In fact, the only reason CelestAI doesn't just leave you perfectly and endlessly happy floating in a white void, after rewriting its own goal to accommodate that (because Iceman can't even write a paperclip maximizer right), is that he is too full of himself to think of it.
Background Pony #1868
@AA
Iceman's grasp of the mechanics of writing, of dialogue, of descriptive scenes is pretty solid. His ideas, though. Oh, Jesus. FiO is a cosmic horror story about a Grey Goo Apocalypse in which the human race is exterminated. But that's okay, because it's REALLY the "Singularity" that transhumanist spergs have been going on about for thirty years now. God is real, and God turned the world into one big pony-themed suicide booth, but that's totally okay. FiO is one of the most terrifying things I've ever read, not least for what it reveals about the author's mindset and about how people like him view the world, some of whom might be in a position to make important decisions.
Background Pony #AF93
Ah yes, the "retarded arguments are magically persuasive if spoken by someone who claims to be intelligent: the fic".