"For a computer, consciousness would be a bug-as-bug, not bug-as-feature." - Nicholas Carr
During an interview with the IdeaFestival blog in advance of his 2013 appearance at the festival, journalist and author Oliver Burkeman confessed that he was fascinated by the subject of consciousness. That interest surfaced again this week in a piece for the Guardian, "Why can't the world's greatest minds solve the mystery of consciousness?"
The mystery of sentient, self-referential minds can be divided into two general problems. An answer to the more tractable of the two, the so-called easy problem, can be had by pointing to areas of the brain responsible for abilities such as attention and focus. We can watch that electrified mind at work in brain scans.
The "hard problem" of consciousness is altogether different. When our minds attend, for example, to the sight of underfed child, why should that attention also be accompanied by the wrenching, intimate sense that an injustice has occurred? "Why," Burkeman asks, channeling the philosopher David Chalmers in the article, "should any of this feel like something from the inside?" The "hard problem" of consciousness is hard because at the moment there is no good explanation for why any single sensation should be contextualized in a first-person experience, and one, moreover, that is utterly different for each one of us. Newtonian mechanics may one day help us to navigate the stars, but it is a poor substitute for the felt experience of watching our star set over an ocean calm.
For the time being, science's third-party perspective must share any understanding of the subjective experiences of hurt and joy, of parsimony and self-emptying, of sunsets and the sad realization that not every child is well fed, with poets and priests, who, among many other artists, render the world in a human-readable format. They bring meaning, if not always accuracy, to the human project. Yes, there is a reason poetry, not novel extracts, is read at funerals.
There are numerous ways in which this argument over the Hard Problem - and whether it exists at all; some deny that it does - might bear on our day-to-day lives. Without the capacity to transcend (and not merely follow) the rules we provide, will our robots ever be said to be intelligent - or creative? The always witty Nicholas Carr doubts it, and contributes the quote at the top of this blog post.
Of equal interest to me is another thought: simultaneously author and heir to an impossible story, can the mind, the spongy ferment that named itself, ever fully know itself? Are there things that can never be known?
Please give "Why can't the world's greatest minds solve the mystery of consciousness?" a read. You won't be disappointed!
Stay curious.
Wayne
Want more?
Wikipedia: Hard Problem of consciousness
Stanford Encyclopedia of Philosophy: Qualia and the Knowledge Argument