Saturday, September 17, 2016

... on Why do theists believe that an omnipotent God is a more likely creator of our universe than a mortal alien running a supercomputer simulation?

Why do theists believe that an omnipotent God is a more likely creator of our universe than a mortal alien running a supercomputer simulation?

Arguing that agents in a simulation would feel “meaningless” is pure conjecture, as there isn’t evidence that agents in a sufficiently advanced simulation (one that captured the likeness of our world down to the quantum level) wouldn’t have “consciousness” and feelings of meaningfulness.

Hans-Georg Lundahl
Bio: Studied religions as curious parallels and contrasts to Xtian faith since age 9 or 10?
For one thing, supercomputers cannot produce human reason.

Isaac Comer-Wyrd
… yet

Hans-Georg Lundahl
Or ever in the future.

Isaac Comer-Wyrd
That’s an awful lot of certainty for someone who can’t play Go as well as a computer:

The Sadness and Beauty of Watching Google’s AI Play Go

It’s only significant because the program doesn’t have an explicit algorithm for how to play Go really well; the game is too complicated for us ever to write one.

Instead, the program does what a person would do: it examines many, many Go games played by experts and it *learns* to recognize patterns.

In other words, this computer does what a person does (pattern recognition), and it does it *better* than a person can, better than *any* person on Earth can.

Of course, it’s still limited: it’s only good at recognizing patterns in the game of Go. But “pattern recognition” is kinda the central feature of the thing we call the “thought process”.

I would not want to take a bet that computers will never be able to think. Pattern recognition is what allows speech-to-text on your phone to (sometimes) work. It’s what allows cameras to recognize faces.

And on and on and on.
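[The “learning to recognize patterns” idea above can be sketched in miniature. This is not how AlphaGo actually works (that uses deep neural networks and tree search); it is just the simplest possible pattern matcher, with made-up templates, for illustration:]

```python
# A minimal sketch of pattern recognition: classify a tiny 3x3 "image"
# by finding the stored template it most closely resembles.
# The templates and the distance metric are illustrative assumptions,
# not anything taken from AlphaGo itself.

# 3x3 binary grids, flattened to 9-element tuples.
TEMPLATES = {
    "cross": (1, 0, 1,
              0, 1, 0,
              1, 0, 1),
    "plus":  (0, 1, 0,
              1, 1, 1,
              0, 1, 0),
}

def hamming(a, b):
    """Number of cells where the two patterns differ."""
    return sum(x != y for x, y in zip(a, b))

def classify(pattern):
    """Return the label of the nearest stored template."""
    return min(TEMPLATES, key=lambda label: hamming(TEMPLATES[label], pattern))

noisy_cross = (1, 0, 1,
               0, 1, 0,
               1, 0, 0)  # a "cross" with one cell flipped
print(classify(noisy_cross))  # prints "cross"
```

[The point of contention in the dialogue is whether doing this at scale, as the Go program does, amounts to understanding or merely to mimicry.]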

Hans-Georg Lundahl
Sorry, but computers mimicking the processes of learning and examination does not mean they actually understand and reason.

Isaac Comer-Wyrd
Oh, there have been pedantic semantic arguments since forever over whether mimicry of a thing is the same as the thing.

The philosopher Descartes said dogs don't have souls and therefore cannot *really* feel pain.

They can only provide a convincing facsimile of pain, he said.

Now I recognize that a computer does not even count as a living thing.

I'm not saying the reasoning necessarily *must* hold. I'm just arguing for a degree of uncertainty in the whole thing.

For a long time I didn't think “strong” AI was a realistic possibility. But everything is less certain now.

Hans-Georg Lundahl
Dogs do have animal souls; they just don’t have rational souls, and feeling pain, or pleasure, is within the domain of an animal soul.

I don’t think the semantics are pedantic; they are basic to a correct understanding of how minds and computers relate.

An abacus can mimic a mathematician’s operation, but an abacus cannot understand what it is doing.

Sankar Srinivasan
Humans are biochemical computers, or are you unaware of this fact?

Hans-Georg Lundahl
I have heard the theory.

I will remain as unaware of it being fact as of equatorial polar bears being a valid species.
