Wouldn't the functionalist perspective suggest that consciousness is not
an "incidental" by-product, but rather an "inevitable" one? In other
words, you automatically get consciousness when you have the kind of
complex system Damien is describing. Any system that has a sufficiently
rich representation of itself and the world is conscious - that is what
consciousness is.
One problem with this suggestion is that it implies that a relatively
simple computer program - say, CYC with its thousands of interrelated
facts about the world - would be conscious to some degree. Such a
program is far from being able to pass the Turing test, and we might
not be comfortable setting the bar for consciousness so much lower than
the bar for human-equivalent intelligence. On the other hand, conscious
awareness appears in nature to be much easier to produce than human
intelligence, so perhaps this is not so objectionable after all.
Hal