From: William Pearson (wil.pearson@gmail.com)
Date: Mon Jun 23 2008 - 02:07:26 MDT
2008/6/23 Peter de Blanc <peter@spaceandgames.com>:
> J. Andrew Rogers wrote:
>>
>> If your "counter" does not occupy memory then the complexity is constant.
>> If it does occupy memory, it is eating bits as N grows on a finite machine.
>> Either way, you have a problem.
>
> Turing machines have infinite memory. But the argument still works for
> finite machines:
>
> Let K be the complexity of machine 0. Let M be the number of machines of
> complexity at most K (if complexity is measured in bits, that's 2^(K+1) - 2).
> Then somewhere in the set 0...M, there's a machine with complexity greater
> than K. We're only counting up to M, so that's pretty finite.
>
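
To put rough numbers on the counting argument above (an illustrative
sketch only, with an assumed K of 10 bits): there are
2^1 + 2^2 + ... + 2^10 = 2^11 - 2 = 2046 bit-string descriptions of
length at most 10, so at most 2046 machines can have complexity at
most 10. Counting the machines 0 through 2046 therefore forces at
least one of them to have complexity greater than 10. In Python:

    # Illustrative sketch: count the descriptions of length 1..K and
    # check the closed form 2^(K+1) - 2 used above.
    K = 10  # assumed value for this example
    num_descriptions = sum(2**length for length in range(1, K + 1))
    assert num_descriptions == 2**(K + 1) - 2
    print(num_descriptions)  # 2046
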
Complexity is typically taken to mean Kolmogorov complexity, that is,
the complexity of a program is the length of the shortest program
that produces the same output. There can be infinitely many programs
with the same Kolmogorov complexity, since you can always append
arbitrarily large amounts of garbage that does nothing to a program.
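
As a toy illustration of that padding point (my own sketch, nothing
formal): appending dead code makes the source arbitrarily longer but
leaves the output, and hence the Kolmogorov complexity of that
output, unchanged.

    # Toy sketch: arbitrarily long programs with identical behaviour.
    # The padding is a function that is never called, so it cannot
    # affect the output.
    def padded_source(n_garbage_lines):
        garbage = "".join("    x = %d\n" % i for i in range(n_garbage_lines))
        return ("def f():\n"
                "    return 42\n"
                "def _unused():\n"
                "    pass\n" + garbage)

    for n in (0, 10, 1000):
        src = padded_source(n)
        namespace = {}
        exec(src, namespace)               # run the padded program
        print(len(src), namespace["f"]())  # source grows, output stays 42
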
Will Pearson