Thursday, September 20, 2012

Being Cautious About Consciousness


Last month a distinguished group of scientists gathered at Cambridge University to attend the Francis Crick Memorial Conference.  The larger public knows Dr. Crick for his Nobel Prize winning work on the structure of DNA. Few people outside of the scientific community know that he spent the last half of his career dedicated to finding the neural correlates of consciousness.  This conference, which included experts in the fields of human perception, animal sensory systems, evolutionary biology and psychopharmacology, was meant to honor Dr. Crick's vision of one day identifying how our brains give rise to consciousness.

At this conference, several prominent attendees signed The Cambridge Declaration of Consciousness.  In this document, the signatories outlined several scientific findings: that non-human animals experience binocular rivalry, that they possess brain regions homologous to those of humans, that many primitive emotions are linked to subcortical brain areas, that birds exhibit REM sleep, and that hallucinogens affect human and non-human brains similarly.  From these disparate observations, the conference attendees concluded, “Convergent evidence indicates that non-human animals have the neuroanatomical, neurochemical, and neurophysiological substrates of conscious states along with the capacity to exhibit intentional behaviors.”  In summary, many animals have the same neural substrates for conscious states as humans, a clear implication that animals experience consciousness.

To be blunt, this declaration is both inflammatory and grossly irresponsible for two reasons. 

First, it misrepresents the state of our scientific understanding of consciousness.  As of yet, the scientific community has not reached a consensus on an empirically testable definition of what it means to be conscious.  You see, “consciousness” is a quale, which according to Merriam-Webster is “a property experienced as distinct from any source it may have in a physical object.”  Now to be fair, in psychology and neuroscience we deal with qualia on a daily basis. Concepts such as memory, attention, and emotion are all, in themselves, immeasurable entities; however, in the laboratory we are able to observe manifestations of these concepts by constraining our hypotheses to empirically testable phenomena.  For example, when we study memory, we are referring to the changes in behavior or in neural systems that arise from experience (or the loss of such abilities from lesions to different brain regions).  We don’t have a way of seeing an actual memory itself, but we do have measures of recall, adaptation and synaptic plasticity.  The same is true for emotion, attention, and perception.

But consciousness, in its current definition, is a quale defined by other qualia. It is the state of being aware of things in the outside world and of ourselves. Thus consciousness is a construct built off of both “awareness” and “intention”.  (It should be noted that in the Cambridge Declaration, the signatories often conflate the concept of consciousness with intention.) As a result, researchers who study the neural correlates of consciousness are forced to look at these other states, but as of yet there is no unifying definition of the state of being conscious. There is no brain area, network of brain areas, or neurochemical system that, when damaged, definitively removes the ability to be conscious while leaving a person awake and alert. There are many lesions that cause problems with visual perception, memory, verbal recall, etc., but how many of those abilities can we lose and still be conscious? Science still does not have an answer to that question.

The second, and by far the biggest, problem with the Cambridge Declaration of Consciousness is that it is a dangerous document.  It gives the false impression that consciousness is a ubiquitous state of simple animal brains.  This is a damning statement to any researcher who uses animal subjects in their experiments (and many of the signatories of the declaration do use animals in their research).  Are the signers saying that we should stop all animal research or stop using animals as domesticated food sources?  If animals do in fact experience consciousness as we do, then this has far reaching implications for how we use and treat animals. 

The impact of such a statement, however, goes well beyond the future of animal research and into even more hot button social issues.  For example, this declaration implies that neuroscience should be able to tell us when consciousness starts in the fetal brain (since knowing the neural substrates means knowing when they develop), diving head first into a social issue we have no business being in at the moment.  What about clinically “brain dead” patients whose families wish to let them pass away?  If having a neocortex is not necessary for consciousness, then is someone who has lost 85% of her cortex from a stroke still conscious? By signing this declaration, these scientists have given the impression that the field of neuroscience has the answers to these questions, which could not be further from the truth.  We don't even know where to start.

Saturday, September 8, 2012

The unsung "job creators"

Unless you've been living in a cave these last few years, you've undoubtedly heard the term "job creator" thrown around a lot.  The phrase is often used in reference to CEOs and other businessmen (and women) who run companies.  Actually, more often than not, it is used to refer to anybody who somehow has made a lot of money (regardless of how many actual jobs they've personally "created" or whether they simply inherited their wealth).  The connotation is of a benevolent capitalist who invests in an industry with a forward eye towards sustaining local economies.

These executives are elevated to near angelic status by some, but truth be told, they are all actually sitting at the end of the "job creation" chain.  The true stimulus for developing industries and subsequent employment usually happens much, much earlier.  It usually begins in the laboratory.


Building an economy by asking a question

Since the dawn of the Industrial Revolution, science and technology have served as the bedrock of emerging industries.  Without theoretical physics, we would never have had atomic energy.  Mathematics gave us computers.  Molecular biology gave us biotechnology.  Geology gave us most of the energy industry.

But often the scientific discoveries that allowed an industry to bloom were not intended to shape an economy.  These research endeavors were started merely to satisfy a curiosity.  My favorite example of this is photography.  Few would argue that the advent of the photograph didn't fundamentally change our world.  It not only revolutionized journalism, but also spawned hundreds of new companies, most notably Kodak.

However, the early photograph came about because of independent, basic science discoveries in physics and chemistry.  The research on photons, electromagnetism and light-sensitive materials that led to the photograph wasn't conducted explicitly for photography.  It was the result of many independent scientists who were simply curious about the world around them.  It wasn't until Joseph Niepce and Louis Daguerre put these discoveries together that the early photographic process began to get off the ground.

Or take another example more closely tied to my work. The advent of the now almost ubiquitous medical imaging procedure MRI didn't come around on its own.  It was built off of fundamental discoveries in physics on the electromagnetic properties of molecules and atoms, as well as biological studies on the tissue content of the human body.  So basically, answering questions about how hydrogen atoms spin and how fatty the human brain really is led to perhaps the most important medical technology advancement since the vaccine.

Now these aren't isolated anecdotes.  Nearly every industry is built off of a technological advancement that can trace its roots to basic science discoveries that had no clear applications when they started.


Pulling the rug out from under new industries


Today, science is exploding (sometimes literally) with many new fundamental discoveries.  We've found the Higgs boson, we've discovered life in places we never dreamed possible, and we've even figured out how to use viruses to make neurons fire with lasers.  None of these discoveries has any applied use that we know of... yet.  But who knows what industries they may lead to in the coming years and decades?

Unfortunately, despite these amazing discoveries and advancements, appreciation for basic research is plummeting in this country.  Scientific literacy and public support for science are both dropping.  Even major scientific funding agencies like the National Institutes of Health and the Defense Advanced Research Projects Agency are pushing for more "applied research" projects at the expense of basic science.  While this may help facilitate immediate advancements in existing industries, it is only a short-term strategy that shifts focus away from the real work that leads to entirely new industries in the long term.

I say it's time to take a step back and appreciate who really are the "job creators" around here.  Is it the executive who sends paychecks to tens, hundreds or maybe even thousands of employees or is it the unsung people whose discoveries eventually build entire industries that end up employing thousands or millions of individuals?

So the next time you hear a politician or television pundit talk about thanking a "job creator," head to your local university or research lab and thank a scientist.