From: Tim Tyler 
Subject: Re: Robot Evolution
Date: Thu, 28 Dec 2006 13:15:29 -0500 (EST)

Phil Roberts, Jr. wrote:

> Yes.  Lucas addressed this issue, in response to
> Whiteley and Benacerraf as I recall:
>    Benacerraf protests that "It is conceivable that
>    another machine [or formal system] could do that
>    as well."  Of course.  But that other machine was
>    not the machine that the mechanist was claiming
>    that I was.  It is the machine that I am alleged
>    to be that is relevant: and since I can do
>    something that it cannot, I cannot be it.  Of
>    course it is still open for the mechanist to
>    alter his claim and say, now, that I am that
>    other machine which, like me, could do what the
>    first machine could not.  Only, if he says that,
>    then I shall ask him "Which other machine?" and
>    as soon as he has specified it, proceed to find
>    something else which that machine cannot do and
>    I can.  I can take on all comers, provided only
>    they come one by one in the sense of each being
>    individually specified as being the one that it
>    is:  and therefore I can claim to have tilted at
>    and laid low all logically possible machines.
>    An idealized person, or mind, may not be able to
>    do more than all logically possible machines can,
>    between them, do:  but for each logically
>    possible machine there is something which he can
>    do and it cannot; and therefore he cannot be
>    the same as any logically possible machine.
>    (J. R. Lucas, 'The Monist', vol 52, pp 145-158)

Vacuous argument, wrong conclusion.

The passage assumes as a premise: "it is the machine
that I am alleged to be that is relevant: and since
I can do something that it cannot, I cannot be it."

That is not going to be true when comparing
with a sufficiently powerful machine.

Providing a system with its Godel sentence as
an axiom does indeed turn it into a different,
more powerful machine.
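
In standard notation, a sketch of that point (assuming
T is a consistent, recursively axiomatized theory
containing enough arithmetic):

```latex
% Diagonal lemma: G_T "says" that it is not provable in T
G_T \;\leftrightarrow\; \neg\,\mathrm{Prov}_T(\ulcorner G_T \urcorner)

% Adjoining G_T as an axiom yields a strictly stronger theory T'
T' \;=\; T + G_T, \qquad T' \vdash G_T \quad\text{while}\quad T \nvdash G_T

% But T' is a different machine, with its own Godel sentence
G_{T'} \;\leftrightarrow\; \neg\,\mathrm{Prov}_{T'}(\ulcorner G_{T'} \urcorner)
```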

However *asking* a system whether its Godel sentence
is true doesn't do that - the system is insufficiently
powerful to tell if what it's been given really is its
own Godel sentence or not.  It is not any the wiser about
the truth of the statement after being presented with it;
because the sentence is crafted in such a way that the
system doesn't properly understand it.
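
A sketch of why merely being shown the sentence changes
nothing, via the second incompleteness theorem (again
assuming T is consistent and recursively axiomatized):

```latex
% Over T itself, the Godel sentence is provably equivalent
% to T's own consistency statement
T \vdash G_T \leftrightarrow \mathrm{Con}(T)

% Second incompleteness theorem: T cannot prove its own consistency
T \nvdash \mathrm{Con}(T) \quad\Longrightarrow\quad T \nvdash G_T

% So T, handed G_T as a mere question, still cannot settle its truth
```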

The argument from Godel's theorem really is totally dead.

If you don't see why, I recommend consulting the numerous
refutations of the argument on the internet until you
understand exactly what is wrong with it.

There /may/ be other reasons for thinking machines
cannot match the computational powers of humans -
but the argument from Godel's theorem is simply defunct.

It has been dead since the moment it was proposed - and
only continues its zombie existence in the minds of those
who don't understand it :-(

Tim Tyler