http://www.nytimes.com/2010/09/21/science/21consciousness.html?pagewanted=all

Measuring Consciousness

Different ways of measuring levels of consciousness or sentience have been proposed over the years. The ability to measure consciousness has obvious medical applications (like determining whether a patient is comatose), ethical applications (like determining how conscious your chicken dinner was), and practical applications (like determining whether you have been messaging with a chatbot). In this section of the website, we will explore two different approaches to measuring consciousness.

We will first explore the “Turing Test”, a classic and influential example of a proposed method for determining sentience. The development of Turing-like tests over the years brings out part of the difficulty of measuring consciousness: the ethereal nature of conscious experience makes it difficult to capture. Sometimes, it seems that we can only judge the level of a system's consciousness by its behaviour or human-like capabilities. This is obviously a very limited, anthropocentric way of assessing consciousness. It makes it relatively easy to compare humans and judge their levels of consciousness, but very difficult to assess the levels of consciousness of other possibly conscious systems, like computer programs or animals.

An empirically accessible theory of consciousness might better lend itself to the measurement and comparison of consciousness. We will explore Integrated Information Theory as a serious proposal of the necessary and sufficient conditions for consciousness, and as a method for measuring it. By asserting that consciousness simply is integrated information, the theory reduces consciousness to a measurable property.
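To give a feel for what "reducing consciousness to a measurable property" could look like, here is a toy sketch. This is not IIT's actual Φ (which involves finding the partition of a system that loses the least information, and is far more involved); instead it uses a much simpler stand-in measure of our own choosing, the multi-information of a two-node system, which is zero when the parts are independent and positive when the whole carries structure beyond its parts:

```python
# Toy illustration only -- NOT IIT's Phi. We quantify the "integration" of a
# two-node binary system as H(A) + H(B) - H(A,B): how much less uncertain the
# whole is than its parts taken separately. All names here are our own.
from math import log2

def entropy(probs):
    """Shannon entropy, in bits, of a probability distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

def integration(joint):
    """Multi-information of a 2x2 joint distribution over nodes A and B.

    Zero exactly when A and B are statistically independent; positive
    when knowing one node tells you something about the other.
    """
    p_a = [sum(row) for row in joint]          # marginal distribution of A
    p_b = [sum(col) for col in zip(*joint)]    # marginal distribution of B
    p_ab = [p for row in joint for p in row]   # flattened joint distribution
    return entropy(p_a) + entropy(p_b) - entropy(p_ab)

# Two nodes that always agree: 1 bit of integration.
coupled = [[0.5, 0.0],
           [0.0, 0.5]]

# Two independent fair coins: no integration at all.
independent = [[0.25, 0.25],
               [0.25, 0.25]]

print(integration(coupled))      # -> 1.0
print(integration(independent))  # -> 0.0
```

The point of the sketch is only this: once "integration" is defined information-theoretically, it becomes a number you can compute and compare across systems, which is precisely the kind of move IIT makes (with a far more sophisticated measure).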