Artificial Intelligence Fails At Mimicry


Legion

Another interesting article by Tim Gwinn... http://www.panmere.com/?p=94

 

In the Turing Test, including this variant, the requirement is merely successful simulation, which is just mimicry. There is no requirement that the simulacrum embody any of the causal or inferential entailments or organizational properties of the system which it is simulating. Indeed, the Test is specifically designed in a way which intentionally blocks access to investigating those properties.

 

Now, plainly, if the human mind were algorithmic, then the behavior a human exhibits would, by definition, have a corresponding Turing machine. Further, by definition, such a machine is simulable by another computer algorithm. Indeed, there will be an arbitrarily large number of such simulations. Each of these simulations amounts to merely an exercise in curve-fitting.
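The curve-fitting point can be made concrete with a small sketch (a hypothetical illustration, not from Gwinn's article): fit a polynomial to observed input/output pairs of an "unknown system" (here, the sine function) and the resulting mimic reproduces the observed behavior while embodying none of the system's internal structure.

```python
import numpy as np

# Hypothetical illustration: "simulate" an unknown system (sine) purely by
# curve-fitting observed input/output pairs, with no access to its internals.
xs = np.linspace(0.0, np.pi, 20)   # observed inputs
ys = np.sin(xs)                    # observed behavior

# Fit a degree-7 polynomial: just one of arbitrarily many possible mimics.
coeffs = np.polyfit(xs, ys, deg=7)
simulate = np.poly1d(coeffs)

# The mimic matches the observed behavior very closely...
max_err_inside = float(np.max(np.abs(simulate(xs) - ys)))

# ...yet it shares none of sine's organizational properties (periodicity,
# boundedness): extrapolate beyond the observed range and the polynomial
# diverges, while sin stays within [-1, 1].
err_outside = abs(float(simulate(4 * np.pi) - np.sin(4 * np.pi)))

print(max_err_inside < 1e-3, err_outside > 1.0)
```

The fit succeeds precisely because nothing in the exercise requires the polynomial to embody anything about how sine is actually generated; it only has to match the observed points.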

 

It could be argued that the human mind is indeed algorithmic, but that we just don’t know the proper Turing machine, and thus our simulations will be inexact. But this line of argument is vacuous insofar as it carries no evidence of algorithmicity. Since a simulation is devoid of any requirement that the simulacrum embody any of the entailment or organizational properties of the system which it is simulating, even a good simulation can, by definition, entail nothing about what goes on inside of the simulated system.


... not enough hours in a day..

I understand the feeling, Quid.

Link to comment
Share on other sites


 

A kindred "spirit". Feels good to know you are like that.


Another interesting article by Tim Gwinn... http://www.panmere.com/?p=94

 

In the Turing Test, including this variant, the requirement is merely successful simulation, which is just mimicry. There is no requirement that the simulacrum embody any of the causal or inferential entailments or organizational properties of the system which it is simulating. Indeed, the Test is specifically designed in a way which intentionally blocks access to investigating those properties.

 

Now, plainly, if the human mind were algorithmic, then the behavior a human exhibits would, by definition, have a corresponding Turing machine. Further, by definition, such a machine is simulable by another computer algorithm. Indeed, there will be an arbitrarily large number of such simulations. Each of these simulations amounts to merely an exercise in curve-fitting.

 

It could be argued that the human mind is indeed algorithmic, but that we just don’t know the proper Turing machine, and thus our simulations will be inexact. But this line of argument is vacuous insofar as it carries no evidence of algorithmicity. Since a simulation is devoid of any requirement that the simulacrum embody any of the entailment or organizational properties of the system which it is simulating, even a good simulation can, by definition, entail nothing about what goes on inside of the simulated system.

I'm surprised! hehehe Not! :HaHa:

 

The bold is indeed the problem, IMO.


This topic is now closed to further replies.