tag:blogger.com,1999:blog-45826496275498243542024-03-16T11:53:09.061-07:00The Cognitive AxonMental bloviations on the world as seen through the eyes of a pathological neuroscientist.Unknownnoreply@blogger.comBlogger34125tag:blogger.com,1999:blog-4582649627549824354.post-87496262673805519582013-02-11T07:42:00.002-08:002013-02-11T07:42:27.315-08:00I have moved to Psychology TodayThe Cognitive Axon blog has evolved into a more professional form and is now <a href="http://www.psychologytoday.com/blog/white-matter-matters">White Matter Matters</a> over at <a href="http://www.psychologytoday.com/">Psychology Today</a>. Same great posts... half the whining.Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-4582649627549824354.post-1244529024715522632012-10-28T07:34:00.002-07:002012-10-28T07:38:24.512-07:00How to make a zombie brainHello kiddos. It's that time of year again for zombie neuroscience to rise from the ground and rear its viciously thoughtful head. <br />
<br />
Last year's <a href="http://cognitiveaxon.blogspot.com/search/label/zombies">series of posts</a> on the zombie brain that I did in collaboration with Brad Voytek over at <a href="http://blog.ketyov.com/">Oscillatory Thoughts</a> was a <a href="http://www.forbes.com/sites/alexknapp/2011/10/23/explaining-the-neuroscience-of-the-zombie-epidemic/">huge</a> <a href="http://chronicle.com/article/Zombies-on-the-Brain/133043/">hit</a>. Since then we've managed to turn this thought experiment into a book deal with the amazingly supportive folks at Princeton University Press. Brad and I have also worked with the <a href="http://ed.ted.com/">TED Education</a> group to do a two-part series of education shorts on teaching neuroscience using zombies. You can watch <a href="http://ed.ted.com/lessons/diagnosing-a-zombie-brain-and-behavior-tim-verstynen-bradley-voytek">both</a> <a href="http://ed.ted.com/lessons/diagnosing-a-zombie-tim-verstynen-brad-voytek">videos</a> over at TED-Ed or just watch the first one here.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<iframe allowfullscreen='allowfullscreen' webkitallowfullscreen='webkitallowfullscreen' mozallowfullscreen='mozallowfullscreen' width='320' height='266' src='https://www.youtube.com/embed/dACNHRPdgqc?feature=player_embedded' frameborder='0'></iframe></div>
<br />
<br />
I have to say, the TED Ed group is absolutely amazing to work with. If you get a chance to do a project with them, I highly recommend it.<br />
<br />
Okay, back to the main post. A key part of the talks that Brad and I give about the zombie brain is our 3D model of how it should look. This is a simulated brain made to show where lesions have likely occurred and how they relate to various zombie "symptoms." <br />
<br />
Check it out. The zombie brain is the one on the right.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<iframe allowfullscreen='allowfullscreen' webkitallowfullscreen='webkitallowfullscreen' mozallowfullscreen='mozallowfullscreen' width='320' height='266' src='https://www.youtube.com/embed/5kZqliPHaoY?feature=player_embedded' frameborder='0'></iframe></div>
<br />
<br />
Now I often get asked how we actually made this model zombie brain. I mean, we can't really get a zombie and image it, right? Right???<br />
<br />
Well, until the zombie apocalypse provides us with sufficient specimens to scan, we had to do something else. We had to take a human brain and morph it into what the zombie brain should look like. <br />
<br />
So this post is for all you imaging geeks out there who want to know how to make your own model of the zombie brain.<br />
<br />
For this you'll need 7 things. All files can be downloaded from my website.<br />
<blockquote class="tr_bq">
1. A template brain of a normal human: <a href="http://www.psy.cmu.edu/~coaxlab/data/zombie_brains/ch2better.nii">HERE</a> </blockquote>
<blockquote class="tr_bq">
2. A segmented image of said human brain: <a href="http://www.psy.cmu.edu/~coaxlab/data/zombie_brains/aal.nii">HERE</a> </blockquote>
<blockquote class="tr_bq">
3. A list of voxel IDs for the segmented map: <a href="http://www.psy.cmu.edu/~coaxlab/data/zombie_brains/roi_label_list.txt">HERE</a> </blockquote>
<blockquote class="tr_bq">
4. A list of regions you wish to "lesion": <a href="http://www.psy.cmu.edu/~coaxlab/data/zombie_brains/affected_roi_list.txt">HERE</a> </blockquote>
<blockquote class="tr_bq">
5. A routine for extracting regions from the segmented map: <a href="http://www.psy.cmu.edu/~coaxlab/data/zombie_brains/roi_separator.m">HERE</a> </blockquote>
<blockquote class="tr_bq">
6. A routine for merging region files: <a href="http://www.psy.cmu.edu/~coaxlab/data/zombie_brains/roi_merge.m">HERE</a> </blockquote>
<blockquote class="tr_bq">
7. The core lesion loop script: <a href="http://www.psy.cmu.edu/~coaxlab/data/zombie_brains/make_zombie_brain_example.m">HERE</a></blockquote>
To follow my steps exactly, you'll need <a href="http://www.mathworks.com/">Matlab</a> and <a href="http://www.fil.ion.ucl.ac.uk/spm/">SPM8</a>, although the logic is simple enough that if you're familiar with other imaging analysis programs/platforms you can replicate the process there.<br />
<br />
<b>Step 1: Get a good template MRI image</b>. I like using the Colin Brain, which is an average of several dozen structural brain scans of the same subject (Colin). The Colin Brain is a standard template image in most analysis software packages or you can download the original <a href="http://imaging.mrc-cbu.cam.ac.uk/downloads/Colin/">here</a>. For this to work properly, you should use the skull stripped version (I used the ch2better.nii template that comes default with MRICron). So if it's not skull stripped already, you should use a skull stripping program to do it (I've had good luck with BET2 in the past). You can download the specific Colin Brain template that I used <a href="http://www.psy.cmu.edu/~coaxlab/data/zombie_brains/ch2better.nii">here</a>.<br />
<br />
This is what the Colin brain looks like normally. Consider this our human template.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="http://3.bp.blogspot.com/-X1Hnt1qgz9s/UI1A3fopqVI/AAAAAAAAHZI/hBJ1yX6ThBY/s1600/ch2better.bmp" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="http://3.bp.blogspot.com/-X1Hnt1qgz9s/UI1A3fopqVI/AAAAAAAAHZI/hBJ1yX6ThBY/s320/ch2better.bmp" width="313" /></a></div>
<br />
<br />
<b>Step 2: Get a template segmentation map</b>. There are dozens of template segmentations these days. I used the Automated Anatomical Labelling (AAL) template from the aal.nii file that comes standard with MRICron, but this is a pretty ubiquitous template these days. Or if you don't want to go with a standard segmentation, you can use a program like Freesurfer to automatically segment the template brain you've chosen. The important thing is that the template you use has a list of labels for each region ID number so you can choose them from a list (e.g., the aal.lut.txt file). I've posted the <a href="http://www.psy.cmu.edu/~coaxlab/data/zombie_brains/aal.nii">AAL</a> template and the <a href="http://www.psy.cmu.edu/~coaxlab/data/zombie_brains/roi_label_list.txt">region list</a> I used on my website.<br />
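My scripts handle the region list in Matlab, but the idea boils down to a few lines in any language. Here's a Python sketch that assumes a simple "ID name" line format — the actual roi_label_list.txt layout may differ, so check the file before trusting it:

```python
# Hypothetical parser for the region label list. Assumes each line looks
# like "<numeric ID> <region name>" -- the real roi_label_list.txt may be
# formatted differently, so adjust accordingly.
def parse_label_list(lines):
    labels = {}
    for line in lines:
        parts = line.split()
        if len(parts) >= 2:
            labels[int(parts[0])] = " ".join(parts[1:])
    return labels

print(parse_label_list(["2001 Precentral_L", "2002 Precentral_R"]))
```

The point is just to end up with a lookup from voxel ID numbers to human-readable region names.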
<br />
Here's a snapshot of the regions in the AAL template. Each color represents a different labeled brain area.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="http://2.bp.blogspot.com/-MLjD5rQY2rc/UI1BCCX6JeI/AAAAAAAAHZQ/3sz-wfoUN0E/s1600/aal.bmp" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="http://2.bp.blogspot.com/-MLjD5rQY2rc/UI1BCCX6JeI/AAAAAAAAHZQ/3sz-wfoUN0E/s320/aal.bmp" width="320" /></a></div>
<br />
<br />
<b>Step 3: Isolate the segmentation map</b>. This is a painful step. You need to write a routine to extract each region from the segmented template. I wrote a quick Matlab program to do it and you can download the segmentation routine <a href="http://www.psy.cmu.edu/~coaxlab/data/zombie_brains/roi_separator.m">here</a> (NOTE: no promises it will work for you, I'm not gonna support any of this code... just sharing). If you load the text file of the region list and voxel ID numbers, this will run through and make separate images for each region.<br />
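The extraction logic itself is tiny. My roi_separator.m does it in Matlab/SPM, but here's the same idea sketched in Python/NumPy on a toy labeled volume (illustration only, not a drop-in replacement for the real routine):

```python
import numpy as np

# Sketch of the region-extraction step. "atlas" stands in for a 3D volume
# of integer region IDs, like the AAL template; the toy array below is
# made up for illustration.
def extract_region(atlas, region_id):
    """Return a binary mask: 1 inside the labeled region, 0 elsewhere."""
    return (atlas == region_id).astype(np.uint8)

# Toy 3x3x1 "atlas" with two labeled regions (IDs 1 and 2)
atlas = np.array([[[1], [1], [2]],
                  [[0], [2], [2]],
                  [[0], [0], [1]]])
mask = extract_region(atlas, 2)
print(mask.sum())  # number of voxels carrying label 2
```

Run that once per ID in the region list and you have one image file per region.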
<br />
<b>Step 4: Reslice the regions to the template space </b>(NOTE: you can skip this step if you ran a segmentation routine on the template brain yourself). In my case the template (human) brain is a 3D matrix with <i>xyz</i> dimensions of 301x370x316. However, the AAL template file I used has the dimensions of 181x217x181. So we just need to reshape the matrix size of each region of interest we extracted from the AAL file to match the size of the template brain. I used spm_reslice.m for this.<br />
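Conceptually, the reslicing is just resampling one voxel grid onto another. spm_reslice.m also handles the affine headers properly; this NumPy sketch ignores orientation entirely and shows only nearest-neighbor matrix-size matching, on toy dimensions:

```python
import numpy as np

# Nearest-neighbor resampling sketch: stretch a volume from its native
# grid (e.g., AAL's 181x217x181) onto the template grid (e.g.,
# 301x370x316). Unlike spm_reslice.m, this ignores the affine headers.
def resample_nn(vol, new_shape):
    # for each output axis, pick the nearest source index by flooring
    idx = [np.floor(np.arange(n) * vol.shape[d] / n).astype(int)
           for d, n in enumerate(new_shape)]
    return vol[np.ix_(idx[0], idx[1], idx[2])]

small = np.arange(8).reshape(2, 2, 2)
big = resample_nn(small, (4, 4, 4))
print(big.shape)  # (4, 4, 4)
```

For binary region masks, nearest-neighbor is the right choice anyway, since interpolating labels would smear them into meaningless in-between values.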
<br />
<b>Step 5: Get a list of regions you want to "lesion"</b>. Now from the full list of extracted regions, you'll want to choose which parts of the brain you want to wipe out. Here's the list I used for our <a href="http://www.psy.cmu.edu/~coaxlab/data/zombie_brains/affected_roi_list.txt">zombie brain</a>.<br />
<br />
<b>Step 6: The virtual lesion loop</b>. The principles of this loop are pretty easy. Loop through the list of regions; for each one, load that region's file, find its voxel coordinates, and save them into a map. Then you'll want to make two maps. <br />
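Here's a sketch of that loop in Python/NumPy. My actual script is make_zombie_brain_example.m in Matlab; the toy volume, mask, and threshold below are made up purely for illustration:

```python
import numpy as np

# Sketch of the virtual-lesion loop. region_masks are binary volumes like
# the ones extracted in Step 3; gm_threshold is the intensity cutoff
# below which "gray matter" voxels in the lesioned regions get zeroed.
def virtual_lesion(template, region_masks, gm_threshold):
    combined = np.zeros(template.shape, dtype=bool)
    for mask in region_masks:          # loop over the regions to lesion
        combined |= mask.astype(bool)
    # map 1: the lesioned regions, with sub-threshold voxels set to zero
    lesioned = np.where(combined & (template >= gm_threshold), template, 0)
    # map 2: everything outside those regions, preserved untouched
    spared = np.where(~combined, template, 0)
    return spared + lesioned           # merge the two maps back together

# Toy 1x2x2 "brain": lesion the first row, zeroing voxels dimmer than 40
template = np.array([[[10, 50], [80, 20]]])
mask = np.array([[[1, 1], [0, 0]]])
print(virtual_lesion(template, [mask], 40))
```

The two intermediate maps are exactly the "lesioned" and "spared" volumes described next.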
<br />
The first map isolates the areas to be lesioned and removes the gray matter (i.e., voxels in these regions with an intensity less than some threshold will be set to zero). It will look like this.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="http://3.bp.blogspot.com/-Ml5-BlCRR4M/UIxPlCM_5LI/AAAAAAAAHYQ/t1O_9njmZsM/s1600/chunked_regions.bmp" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="http://3.bp.blogspot.com/-Ml5-BlCRR4M/UIxPlCM_5LI/AAAAAAAAHYQ/t1O_9njmZsM/s320/chunked_regions.bmp" width="313" /></a></div>
<br />
<br />
The second map is a map of everything else to be spared. It looks like this.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="http://4.bp.blogspot.com/-5g5addadpqo/UIxPpg2pROI/AAAAAAAAHYY/lC9YtFjaZ-s/s1600/preserved_regions.bmp" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="http://4.bp.blogspot.com/-5g5addadpqo/UIxPpg2pROI/AAAAAAAAHYY/lC9YtFjaZ-s/s320/preserved_regions.bmp" width="313" /></a></div>
<br />
<br />
Once you've got your two maps, you'll want to put them back together again. The resulting map looks like this.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="http://4.bp.blogspot.com/-24BfyEweuC4/UIxPtyJ99MI/AAAAAAAAHYg/m-oCsTB5ha8/s1600/merged-preserved_regions.bmp" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="http://4.bp.blogspot.com/-24BfyEweuC4/UIxPtyJ99MI/AAAAAAAAHYg/m-oCsTB5ha8/s320/merged-preserved_regions.bmp" width="313" /></a></div>
<br />
<br />
It's not perfect but it's close. The final step of the process is to smooth out the rough edges. I used the spm_smooth.m function for this with a smoothing kernel of 3mm FWHM. The end result is what you saw above. <br />
<div class="separator" style="clear: both; text-align: center;">
<a href="http://1.bp.blogspot.com/-SCfjJHO__rU/UIxP5dsFKlI/AAAAAAAAHYo/H1yC_TAiL7E/s1600/zombie_ch2better_capgras.bmp" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="http://1.bp.blogspot.com/-SCfjJHO__rU/UIxP5dsFKlI/AAAAAAAAHYo/H1yC_TAiL7E/s320/zombie_ch2better_capgras.bmp" width="313" /></a></div>
<br />
<br />
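For the curious, that last step boils down to a Gaussian kernel specified by its FWHM. Here's a Python/NumPy sketch of the FWHM-to-sigma conversion plus a 1D smoothing pass — spm_smooth.m does the real 3D version, and I'm assuming 1 mm voxels here:

```python
import numpy as np

# FWHM -> sigma for a Gaussian: sigma = FWHM / sqrt(8 * ln 2).
# Voxel size of 1 mm is an assumption for this sketch.
def fwhm_to_sigma(fwhm_mm, voxel_mm=1.0):
    return fwhm_mm / (voxel_mm * np.sqrt(8 * np.log(2)))

def smooth_1d(signal, sigma):
    # build a normalized Gaussian kernel and convolve it with the signal
    radius = int(3 * sigma) + 1
    x = np.arange(-radius, radius + 1)
    kernel = np.exp(-x**2 / (2 * sigma**2))
    kernel /= kernel.sum()             # preserve total intensity
    return np.convolve(signal, kernel, mode="same")

sigma = fwhm_to_sigma(3.0)             # the 3 mm FWHM used above
print(round(sigma, 3))                 # -> 1.274
```

The real routine applies the same kernel along all three axes of the merged volume, which is what softens those hard lesion edges.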
Voila. You've just turned Colin into Zombie Colin! Here's the overlay of the original Colin brain and the zombie brain (orange) because it's so cool!<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="http://1.bp.blogspot.com/-ChckoXKzhAM/UIxQImxnz9I/AAAAAAAAHYw/XMKdPZN2lf8/s1600/z2h_overlay.bmp" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="72" src="http://1.bp.blogspot.com/-ChckoXKzhAM/UIxQImxnz9I/AAAAAAAAHYw/XMKdPZN2lf8/s320/z2h_overlay.bmp" width="320" /></a></div>
<br />
<br />
Now with this routine you can pick and choose which areas you want to lesion. This is how I came up with the fast versus slow zombie brains.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="http://1.bp.blogspot.com/-6CfwXaMg9iU/UIxQpcEw8EI/AAAAAAAAHY4/cvKU-NOkxK8/s1600/ZRS_brain_stages.001.tiff" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="180" src="http://1.bp.blogspot.com/-6CfwXaMg9iU/UIxQpcEw8EI/AAAAAAAAHY4/cvKU-NOkxK8/s320/ZRS_brain_stages.001.tiff" width="320" /></a></div>
<br />
There's a lot of room to use this as you please for teaching and demonstrations. It makes for easy visualisation of lesion/atrophy for educational purposes. But remember, it's faked data, so ALWAYS use that disclaimer.<br />
<br />
So that's it. You've made Colin into Zombie Colin and now you can show him off to all your friends.<br />
<br />
Until next year, Happy Halloween everyone!<br />
<br />Unknownnoreply@blogger.com2tag:blogger.com,1999:blog-4582649627549824354.post-14122031875900776712012-10-18T19:54:00.005-07:002012-10-18T19:54:57.612-07:00SFN 2012 RecapThis past week I was at the largest annual neuroscientific gathering in the world, where 28,000 people converged in one place to talk about brains. The <a href="http://www.sfn.org/">Society for Neuroscience</a> meeting was held in New Orleans and man, was it a fun trip this year. Okay, it did help that a) New Orleans is an absolutely amazing city that I will never get tired of going back to, b) we got to enjoy a little schadenfreude as <a href="http://boingboing.net/2012/10/18/why-casual-sexism-in-science-m.html">the internet wrought revenge on a famous scientist who let his sexism show</a>, c) Voytek and I had a very successful <a href="http://blog.ketyov.com/2012/10/voyteks-and-guerrilla-science-at-sfn.html">guerrilla science campaign</a>, and d) I got to do my first press conference, which has since been <a href="http://thechart.blogs.cnn.com/2012/10/18/your-brain-on-food-obesity-fasting-and-addiction/">picked up and mis-reported by CNN's blog</a>.<br />
<br />
Oh, and <a href="http://www.kizoomlabs.com/blog/">Ned the Neuron</a> was a nice, warm and fuzzy presence at the conference and also at the late night bars.<br />
<br />
But beyond that, the science was exceptionally good this year. Here are some of the highlights that I caught.<br />
<br />
-- <a href="http://neurotheory.columbia.edu/~larry/">Larry Abbott</a>, perhaps one of the best theoretical neuroscientists alive, gave a fantastic lecture on how (counter-intuitively) unstructured networks lead to the best decoding. Basically it turns out that if you take a set of connections, in his case the olfactory system of the fly, then having a <i>completely random wiring pattern</i> actually makes it easier to decode the input stimulus. Now I haven't read the paper, so I'm not 100% sure I can say why this works, but it does provide some interesting food for thought. He also went on to present <a href="http://www.neuroscience.columbia.edu/?page=28&bio=611">Mark Churchland's</a> work on the subject which I'm saving for a later post (I have always been a big fan of Mark's work and I think he's onto something really amazing with his latest set of studies).<br />
<br />
-- Contrary to some pretty vocal (and <i>sometimes</i> warranted) criticism, the <a href="http://www.humanconnectomeproject.org/">Human Connectome Project</a> gave a preview of some of its preliminary data and I gotta say... it looks incredible! Their diffusion imaging data is hands down the best I've seen so far and they're already <a href="http://www.humanconnectomeproject.org/data/">letting people download it</a> to use in their research. I've already registered and feel like a kid in a candy store.<br />
<br />
-- Although I missed this talk at a pre-conference workshop, I hear that <a href="http://keck.ucsf.edu/~sabes/">Philip Sabes's</a> lab (my first post-doc adviser) has basically taken the first step towards building The Matrix! His graduate student, Maria Dadarlat, had some fantastic work where she continually paired patterns of stimulation in the monkey brain with real visual stimuli during a motor control task. The monkey is trained to use this complex pattern of moving dot fields on a computer display to figure out how to navigate his arm around a workspace. After training, Maria can turn off the real visual stimulus (i.e., what the monkey sees) and the monkey uses the brain stimulation signals to guide his arm with as much accuracy as if he were actually seeing the visual stimulus. Expect to hear big things when this comes out in press.<br />
<br />
-- The <a href="http://cognitiveaxon.blogspot.com/2011/07/virus-guided-laser-neurons-and-need-for.html">optogenetics</a> footprint was the largest I've seen this year. Several labs showed some really cool extensions into non-human primates (I wonder how long until it's used in humans for something) and significantly improved bandwidth. This technology still boggles my mind, although I'm starting to appreciate some of the criticisms from the traditional physiology side of the table.<br />
<br />
-- I also learned a lot about the ins and outs of NSF funding priorities for the near future thanks to some really helpful program officers and a great presentation on how to navigate the NSF grant process. I literally couldn't write my notes fast enough. All I can say is, <u>emphasize novel and unique data sharing ideas as much as you can in your next set of applications</u>...<br />
<br />
Okay, I have many, many more notes from visiting posters that I won't blather on about, but they will serve as fodder for future posts. Just wanted to put these up there for anyone interested before I'm reduced to a coma from exhaustion. <br />
<br />
<br />Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-4582649627549824354.post-26155079488231833362012-09-20T11:18:00.002-07:002012-09-20T11:18:51.353-07:00Being Cautious About Consciousness<br />
<div class="MsoNormal" style="background-color: white; color: #222222; font-family: arial, sans-serif; font-size: 13px;">
Last month a distinguished group of scientists gathered at Cambridge University to attend the <a href="http://fcmconference.org/">Francis Crick Memorial Conference</a>. The larger public knows Dr. Crick for his Nobel Prize winning work on the structure of DNA. Few people outside of the scientific community know that he spent the last half of his career dedicated to finding the neural correlates of consciousness. This conference, which included experts in the fields of human perception, animal sensory systems, evolutionary biology and psychopharmacology, was meant to honor Dr. Crick's vision of one day identifying how our brains give rise to consciousness.</div>
<div class="MsoNormal" style="background-color: white; color: #222222; font-family: arial, sans-serif; font-size: 13px;">
<br /></div>
<div class="MsoNormal" style="background-color: white; color: #222222; font-family: arial, sans-serif; font-size: 13px;">
At this conference, several prominent attendees signed <a href="http://mindhacks.com/2012/08/20/animals-conscious-say-leading-neuroscientists/">The Cambridge Declaration of Consciousness</a>. In this document, the signatories outlined several scientific findings: non-human animals experience binocular rivalry, they have brain regions homologous to those of humans, many primitive emotions are linked to subcortical brain areas, birds have REM sleep, and hallucinogens affect human and non-human brains similarly. From these disparate observations, the conference attendees concluded, “Convergent evidence indicates that non-human animals have the neuroanatomical, neurochemical, and neurophysiological substrates of conscious states along with the capacity to exhibit intentional behaviors.” In summary, many animals have the same neural substrates for conscious states as humans, a clear implication that animals experience consciousness.</div>
<div class="MsoNormal" style="background-color: white; color: #222222; font-family: arial, sans-serif; font-size: 13px;">
<br /></div>
<div class="MsoNormal" style="background-color: white; color: #222222; font-family: arial, sans-serif; font-size: 13px;">
To be blunt, this declaration is both inflammatory and grossly irresponsible for two reasons. </div>
<div class="MsoNormal" style="background-color: white; color: #222222; font-family: arial, sans-serif; font-size: 13px;">
<br /></div>
<div class="MsoNormal" style="background-color: white; color: #222222; font-family: arial, sans-serif; font-size: 13px;">
First, it misrepresents the state of our scientific understanding of consciousness. As of yet, the scientific community has not reached a consensus on an empirically testable definition of what it means to be conscious. You see, “consciousness” is a quale, which according to Merriam-Webster is defined as, “properties experienced as distinct from any source they may have in a physical object.” Now to be fair, in psychology and neuroscience we deal with qualia on a daily basis. Concepts such as memory, attention, and emotion are all, in themselves, immeasurable entities; however, in the laboratory we are able to observe manifestations of these concepts by constraining our hypotheses to empirically testable phenomena. For example, when we study memory, we are referring to the changes in behavior or in neural systems that arise from experience (or the lack of such abilities from lesions to different brain regions). We don’t have a way of seeing an actual memory itself, but we do have measures of recall, adaptation and synaptic plasticity. The same is true for emotion, attention, and perception. </div>
<div class="MsoNormal" style="background-color: white; color: #222222; font-family: arial, sans-serif; font-size: 13px;">
<br /></div>
<div class="MsoNormal" style="background-color: white; color: #222222; font-family: arial, sans-serif; font-size: 13px;">
But consciousness, in its current definition, is a quale defined by other qualia. It is the state of being aware of things in the outside world and of ourselves. Thus consciousness is a construct built off of both “awareness” and “intention”. (It should be noted that in the Cambridge Declaration, the signatories often conflate the concept of consciousness with intention.) Thus, the researchers who study the neural correlates of consciousness are forced to look at these other states, but as of yet there is no unifying definition of the state of being conscious. There is no brain area, network of brain areas, or neurochemical system that, when damaged, definitively removes the ability to be “conscious” while still being awake and alert. There are many lesions that cause problems with visual perception, memory, verbal recall, etc., but how many of those abilities can we lose and still be conscious? Science still does not have an answer to that question.</div>
<div class="MsoNormal" style="background-color: white; color: #222222; font-family: arial, sans-serif; font-size: 13px;">
<br /></div>
<div class="MsoNormal" style="background-color: white; color: #222222; font-family: arial, sans-serif; font-size: 13px;">
The second, and by far the biggest, problem with the Cambridge Declaration of Consciousness is that it is a dangerous document. It gives the false impression that consciousness is a ubiquitous state of simple animal brains. This is a<a href="http://speakingofresearch.com/2012/08/23/consciousness-and-moral-status/"> damning statement to any researcher who uses animal subjects</a> in their experiments (and many of the signatories of the declaration do use animals in their research). Are the signers saying that we should stop all animal research or stop using animals as domesticated food sources? If animals do in fact experience consciousness as we do, then this has far reaching implications for how we use and treat animals. </div>
<div class="MsoNormal" style="background-color: white; color: #222222; font-family: arial, sans-serif; font-size: 13px;">
<br /></div>
<div class="MsoNormal" style="background-color: white; color: #222222; font-family: arial, sans-serif; font-size: 13px;">
The impact of such a statement, however, goes well beyond the future of animal research and into even more hot button social issues. For example, this declaration implies that neuroscience should be able to tell us when consciousness starts in the fetal brain (since knowing the neural substrates means knowing when they develop), diving head first into a social issue we have no business being in at the moment. What about clinically “brain dead” patients whose families wish to let them pass away? If having a neocortex is not necessary for consciousness, then is someone who has lost 85% of her cortex from a stroke still conscious? By signing this declaration, these scientists have given the impression that the field of neuroscience has the answers to these questions, which could not be further from the truth. We don't even know where to start.</div>
<div class="MsoNormal" style="background-color: white; color: #222222; font-family: arial, sans-serif; font-size: 13px;">
<br /></div>
Unknownnoreply@blogger.com4tag:blogger.com,1999:blog-4582649627549824354.post-88161075514102863522012-09-08T09:30:00.000-07:002012-09-09T11:36:24.843-07:00The unsung "job creators"Unless you've been living in a cave these last few years, you've undoubtedly heard the term <a href="http://www.salon.com/2012/07/18/the_myth_of_the_job_creators_salpart/">"job creator"</a> thrown around a lot. The phrase is often used in reference to CEOs and other businessmen (and women) who run companies. Actually, more often than not, it is used to refer to anybody who somehow has made a lot of money (regardless of how many actual jobs they've personally "created" or whether they simply inherited their wealth). The connotation is of a benevolent capitalist who invests in an industry with a forward eye towards sustaining local economies.<br />
<br />
These executives are elevated to near angelic status by some, but truth be told, they are all actually sitting at the <i>end</i> of the "job creation" chain. The true stimulus for developing industries and subsequent employment usually happens much, much earlier. It usually begins in the laboratory.<br />
<h3>
<br /></h3>
<h3>
Building an economy by asking a question</h3>
Since the dawn of the Industrial Revolution, science and technology have served as the bedrock of emerging industries. Without theoretical physics, we would never have had atomic energy. Mathematics gave us computers. Molecular biology gave us biotechnology. Geology gave us most of the energy industry.<br />
<br />
But often the scientific discoveries that allowed for an industry to bloom were not intended to shape an economy. These research endeavors were started merely to satisfy a curiosity. My favorite example of this is <a href="http://en.wikipedia.org/wiki/Photograph">photography</a>. Few can argue that the advent of the photograph didn't fundamentally change our world. It not only revolutionized journalism, but also spawned hundreds of new companies, most notably companies like Kodak. <br />
<br />
However, the early photograph came about because of independent, basic science discoveries in physics and chemistry. The research on photons, electromagnetism and light-sensitive materials that led to the photograph wasn't conducted explicitly for photography. It was the work of many independent scientists who were simply curious about the world around them. It wasn't until <a href="http://en.wikipedia.org/wiki/Joseph_Nic%C3%A9phore_Ni%C3%A9pce">Joseph Niepce</a> and <a href="http://en.wikipedia.org/wiki/Louis_Daguerre">Louis Daguerre</a> put these discoveries together that the early photographic process began to get off the ground.<br />
<br />
Or take another example more closely tied to my work. The advent of the now almost ubiquitous medical imaging procedure <a href="http://en.wikipedia.org/wiki/MRI">MRI</a> didn't come around on its own. It was built off of fundamental discoveries in physics on the electromagnetic properties of molecules and atoms, as well as biological studies on the tissue content of the human body. So basically, answering questions about how hydrogen atoms spin and how fatty the human brain really is led to perhaps the most important medical technology advancement since the vaccine.<br />
<br />
Now these aren't isolated anecdotes. Nearly every industry built off of a technological advancement can trace its roots to basic science discoveries that had no clear applications when they started. <br />
<div>
<br /></div>
<br />
<h3>
Pulling the rug out from under new industries</h3>
<br />
Today, science is exploding (sometimes literally) with many new fundamental discoveries. We've <a href="http://www.nytimes.com/2012/07/05/science/cern-physicists-may-have-discovered-higgs-boson-particle.html?pagewanted=all">found the Higgs boson</a>, we've discovered <a href="http://www.arctic.noaa.gov/essay_vogt.html">life in places we never dreamed possible</a>, and we've even<a href="http://cognitiveaxon.blogspot.com/2011/07/virus-guided-laser-neurons-and-need-for.html"> figured out how to use viruses to make neurons fire by using laser</a>s. None of these discoveries have any applied uses that we know of... yet. But who knows what industries they may lead to in the coming years and decades?<br />
<br />
Unfortunately, despite these amazing discoveries and advancements, the appreciation for basic research is plummeting in this country. In general, <a href="http://www.youtube.com/watch?v=gHbYJfwFgOU">scientific literacy and public support for science are dropping</a>. Even major scientific funding agencies like the National Institutes of Health and the Defense Advanced Research Projects Agency are pushing for more "applied research" projects at the expense of basic science. While this may help facilitate immediate advancements in existing industries, it is only a short-term strategy that shifts focus away from the real work that leads to the advent of entirely new industries in the long-term.<br />
<br />
I say it's time to take a step back and appreciate who really are the "job creators" around here. Is it the executive who sends paychecks to tens, hundreds or maybe even thousands of employees or is it the unsung people whose discoveries eventually build entire industries that end up employing thousands or millions of individuals?<br />
<br />
So the next time you hear a politician or television pundit talk about thanking a "job creator," head to your local university or research lab and thank a scientist.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="http://3.bp.blogspot.com/-oFvt8CDdgbo/UEtwJpKEiEI/AAAAAAAAHX4/wICVnqzawyc/s1600/nye_the_job_creator.tiff" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="http://3.bp.blogspot.com/-oFvt8CDdgbo/UEtwJpKEiEI/AAAAAAAAHX4/wICVnqzawyc/s320/nye_the_job_creator.tiff" width="320" /></a></div>
<br />
<br />
<br />Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-4582649627549824354.post-52417112865416625742012-07-26T10:00:00.001-07:002012-09-09T11:36:30.580-07:00Of soda bans and neural strands<span style="background-color: white;">A lot of comedic energy has been recently focused on </span><a href="http://news.google.com/news/url?sr=1&sa=t&ct2=us%2F0_0_s_1_0_t&usg=AFQjCNFT6SQ5ZiDjejlx2YrijuiOZXbyYw&did=9f3ebe500372724b&sig2=mVbUPR0TofTJVwu64s6ThQ&cid=35185818405154&ei=hOcNUIjNHYucgQeiiQE&rt=STORY&vm=STANDARD&url=http%3A%2F%2Fwww.latimes.com%2Fhealth%2Fboostershots%2Fla-heb-16-ounce-soda-new-york-city-calories-20120723%2C0%2C2800445.story" style="background-color: white;">New York City's ban on extra large sodas</a>. Nowhere is this more so than at the Daily Show, <span style="background-color: white;">where Jon Stewart recently pointed out the irony that marijuana is practically legal, buyer beware if you want 32oz of your favorite beverage.</span><br />
<br />
<div style="background-color: black; width: 520px;">
<div style="padding: 4px;">
<iframe frameborder="0" height="288" src="http://media.mtvnservices.com/embed/mgid:cms:video:thedailyshow.com:414984" width="512"></iframe><br />
<div style="background-color: white; font-family: Arial, Helvetica, sans-serif; font-size: 12px; margin-bottom: 0px; margin-top: 4px; padding: 4px; text-align: left;">
<b><a href="http://www.thedailyshow.com/watch/thu-june-7-2012/jon-stewart-tries-to-figure-out-what-he-s-allowed-to-put-in-his-mouth">The Daily Show with Jon Stewart</a></b><br />
Get More: <a href="http://www.thedailyshow.com/full-episodes/">Daily Show Full Episodes</a>,<a href="http://www.indecisionforever.com/">Political Humor & Satire Blog</a>,<a href="http://www.facebook.com/thedailyshow">The Daily Show on Facebook</a></div>
</div>
</div>
<br />
Yet it is interesting that Mr. Stewart chose this particular comparison.<br />
<br />
<span style="background-color: white;">Years ago, social and political pressure mounted on science to show that </span><a href="http://news.bbc.co.uk/onthisday/hi/dates/stories/october/2/newsid_2540000/2540141.stm" style="background-color: white;">chronic marijuana consumption could damage brain cells</a><span style="background-color: white;">. The story was to be that toking up <a href="http://www.youtube.com/watch?v=3FtNm9CgA6U">meant killing neurons</a>. Well, despite decades of federally funded research, there has been no definitive link between the consumption of marijuana and brain </span><span style="background-color: white;">cell death. Sure, there is a slight possibility that the link is there and we haven't found it, but it hasn't been for lack of trying. </span><br />
<span style="background-color: white;"><br /></span><br />
Now let's flash forward to today, <a href="http://www.cdc.gov/nchs/data/databriefs/db82.htm">with over 1/3 of the US population obese</a> and, as a population, we just keep getting bigger. This is mainly due to a decrease in physical activity and an increase in high-calorie diets like the nefarious "super-sized" sodas.<br />
<br />
"So what?" you might say, "It's not like that soda is killing my brain cells."<br />
<br />
Actually, a growing body of evidence suggests that it might be doing just that. Well, to be clear, not <i>that</i> single soda per se, but the obesity that such high caloric intake can lead to. <br />
<br />
<h3>
<span style="background-color: white;">From the waistline to the brain</span></h3>
<span style="background-color: white;">Most people, even scientists that I talk to, assume that if there is a relationship between the brain and obesity, it is only in the sense that certain people's brains drive them to eat more and that's why they're obese. Maybe you've got a more addictive personality to begin with, so you're hardwired to seek reward and your drug of choice ends up being doughnuts and soda pop.</span><br />
<span style="background-color: white;"><br /></span><br />
<span style="background-color: white;">Now I won't deny that there may very well be a case for this argument and, in fact, <a href="http://www.sciencemag.org/content/322/5900/449.short">there is some data to justify this hypothesis</a>. But let's step back for a minute and consider some general facts. The brain needs a lot of energy to do its thing. In fact, you can think of the brain as the United States of the global energy supply of the body. It occupies only <a href="http://faculty.washington.edu/chudler/facts.html">about 2%</a> of the total tissue volume in our body, but it <i><b>uses</b></i></span><span style="background-color: white;"> almost <a href="http://www.acnp.org/g4/gn401000064/ch064.html">15% of the output from the heart, 20% of the body's oxygen, and 25% of the circulating glucose</a>. </span><br />
<span style="background-color: white;"><br /></span><br />
<span style="background-color: white;">Now everyone knows that obesity is linked to all sorts of metabolic problems (e.g., diabetes, high blood pressure, cardiovascular disease, etc.)</span><span style="background-color: white;">. So if we stick with our metaphor of the brain being like the United States of energy consumption, then think about what happens when energy prices skyrocket in this country. The cost of doing things incrementally goes up and the overall productivity of the country pays a price. In fact, there are many studies showing that the <a href="http://www.ncbi.nlm.nih.gov/pubmed/22016109">brains</a> and <a href="http://www.ncbi.nlm.nih.gov/pubmed/21262422">cognitive</a> <a href="http://www.ncbi.nlm.nih.gov/pubmed/20406532">processes</a> of obese individuals function differently than those of lean counterparts.</span><br />
<span style="background-color: white;"><br /></span><br />
<span style="background-color: white;">But emerging evidence suggests that obesity may be much more nefarious to the brain than simply raising the metabolic gas prices. It might actually be, that's right... attacking brain tissue itself.</span><br />
<span style="background-color: white;"><br /></span><br />
<span style="background-color: white;">Okay, I'll admit that last statement seems quite hyperbolic, but emerging research is giving us a very startling picture of the relationship between the size of your gut and your brain health. </span><br />
<span style="background-color: white;"><br /></span><br />
<span style="background-color: white;">Take, for example, a recent study my colleagues and I did that will be coming out in the journal Psychosomatic Medicine. We looked at how the underlying architecture of the brain itself was different in obese individuals compared to lean counterparts. We took a group of neurologically healthy adults who spanned a range of </span><span style="background-color: white;">body mass index (BMI) scores. Higher BMI means, generally speaking, greater obesity. </span><span style="background-color: white;">We then used MRI to measure the integrity of the physical connections in the brain. Remember that the two fundamental tissue types in the brain are gray matter (the cell bodies) and white matter (the long strands that connect cells together). The type of MRI we used, called diffusion tensor imaging, looked at this latter tissue type (by measuring something called fractional anisotropy, which is a very basic measure of white matter integrity).</span><br />
<span style="background-color: white;"><br /></span><br />
<span style="background-color: white;">We found that with every point increase on the BMI scale, there was an incremental decrease in the integrity of white matter <i>throughout the brain</i>. Now <a href="http://www.ncbi.nlm.nih.gov/pubmed/21183934">other</a> <a href="http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0018544">studies</a> also show this relationship between obesity and white matter, but our findings point to a <i>global and pervasive</i> effect throughout the brain.</span><br />
<span style="background-color: white;"><br /></span><br />
<span style="background-color: white;">As if that wasn't scary enough, in another study, <a href="http://www.ncbi.nlm.nih.gov/pubmed/22772650">recently published in the journal Cerebral Cortex</a>, my colleagues and I used the same MRI approach to look at how social factors relate to neural health. We found that lower social status (i.e., lower family income, fewer years of education and living in poorer communities) predicted a reduced integrity of the physical connections in the brain.</span><br />
<span style="background-color: white;"><br /></span><br />
<span style="background-color: white;">Let me repeat just to drive the point home: <i>Socioeconomic status could actually <b>predict</b> the microscopic architecture of the connections in the brain</i>. </span><br />
<span style="background-color: white;"><br /></span><br />
<span style="background-color: white;">How can this happen? Well, it turns out that this relationship is mediated by an increase in </span><span style="background-color: white;">unhealthy life-styles like smoking and, that's right, increased obesity. So life-style factors and access to resources that affect physical health may be directly influencing the physical structure of your brain itself.</span><span style="background-color: white;"> That means this health-to-brain relationship has vast societal implications that we are only just beginning to comprehend.</span><br />
<h3>
<span style="background-color: white;"><br /></span></h3>
<h3>
<span style="background-color: white;">A molecular link between physical health and the brain</span></h3>
<span style="background-color: white;">Okay, if you've stuck with me this far, you are probably wondering how the heck changes in the body can influence the brain. Well, in the study I just described my colleagues also </span><span style="background-color: white;">measured levels of a molecule called C-reactive protein (or CRP) in the blood. This little protein reflects <a href="http://en.wikipedia.org/wiki/Inflammation">inflammatory</a> activity, which is an immune system reaction and is the reason why fresh cuts turn red and flushed. It turns out that the link between both smoking and obesity and the brain could be mostly explained by increased CRP levels. </span><br />
<span style="background-color: white;"><br /></span><br />
Let's put it together. Lower socioeconomic status led to reduced physical health, which led to increased inflammation, which, in turn, led to reduced integrity of the white matter in the brain. <br />
<span style="background-color: white;"><br /></span><br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="http://1.bp.blogspot.com/-6j4zfvzhKG0/UA37LpWHGGI/AAAAAAAAG80/kLPkl_W1te8/s1600/SummaryFigure_2012-03-13.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="240" src="http://1.bp.blogspot.com/-6j4zfvzhKG0/UA37LpWHGGI/AAAAAAAAG80/kLPkl_W1te8/s400/SummaryFigure_2012-03-13.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Schematic of the relationship between social and lifestyle factors and the brain (adapted from <a href="http://www.ncbi.nlm.nih.gov/pubmed/22772650">Gianaros et al. Cerebral Cortex 2012</a>)</td></tr>
</tbody></table>
<br />
<span style="background-color: white;">Does this mean that cells are dying? Well not </span><span style="background-color: white;">necessarily. </span><br />
<span style="background-color: white;"><br /></span><br />
Remember, I said that white matter is made up of the long fiber strands that connect neurons together. So far all we can say is that the signal we use to measure this tissue is reduced. However, adding one more fact into the equation leads to some very scary hypotheses as to what might be happening.<br />
<br />
<span style="background-color: white;">It turns out that the fat around your gut is actually an organ that secretes inflammatory molecules. As that "organ" expands, it secretes more inflammatory </span><span style="background-color: white;">chemicals (called </span><a href="http://en.wikipedia.org/wiki/Cytokine" style="background-color: white;">cytokines</a><span style="background-color: white;">). Many of these chemicals can cross the blood-brain barrier that protects the brain from a lot of bad things. Once in the brain they can induce a local inflammation of the support cells that basically serve as the scaffolding for the axons in the brain. After a while, this scaffolding may collapse and break the underlying axons.</span><br />
<br />
How do we know this can happen? Well because this is precisely what happens in <a href="http://en.wikipedia.org/wiki/Multiple_sclerosis">multiple sclerosis (MS)</a> and we know that MS definitely damages physical tissue. <br />
<br />
Now I should be up front: the scientific evidence isn't there yet to suggest that obesity physically kills brain cells the same way MS does. The emerging evidence is nonetheless convincing that there is a troubling link between obesity and the same systems that MS attacks. This evidence keeps mounting every month as more scientific studies come out.<br />
<h3>
</h3>
<h3>
<br /></h3>
<h3>
Food for thought</h3>
<span style="background-color: white;">So while comedians and politicians may poke fun at Mayor Bloomberg's decision to ban extra-large soda drinks in an effort to curb obesity, we should take a step back and look at the science. Increased obesity not only reduces your physical health, but it's becoming readily apparent that obesity also interferes with the organ that sits at the root of all thinking. Our work, along with studies from many other labs, shows that this has dramatic implications that extend into social issues as well as medical issues. </span><br />
<span style="background-color: white;"><br /></span><br />
<span style="background-color: white;">Will banning 32oz sodas solve the problem? Absolutely not... not even close. But is it taking a problem seriously that we have, thus far, only been talking about tongue-in-cheek? You bet it is. </span><br />
<br />
<br />
<span style="float: left; padding: 5px;"><a href="http://www.researchblogging.org/"><img alt="ResearchBlogging.org" src="http://www.researchblogging.org/public/citation_icons/rb2_large_gray.png" style="border: 0;" /></a></span>
<span class="Z3988" title="ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.jtitle=J+Psychosom+Res&rft_id=info%3Adoi%2F10.1016%2Fj.jpsychores.2010.07.012&rfr_id=info%3Asid%2Fresearchblogging.org&rft.atitle=Impaired+decision+making+among+morbidly+obese+adults&rft.issn=&rft.date=2011&rft.volume=&rft.issue=&rft.spage=&rft.epage=&rft.artnum=&rft.au=Brogan+A%2C+Hevey+D%2C+O%27Callaghan+G%2C+Yoder+R%2C+O%27Shea+D.&rfe_dat=bpr3.included=1;bpr3.tags=Medicine%2CNeuroscience">Brogan A, Hevey D, O'Callaghan G, Yoder R, O'Shea D. (2011). Impaired decision making among morbidly obese adults <span style="font-style: italic;">J Psychosom Res</span> DOI: <a href="http://dx.doi.org/10.1016/j.jpsychores.2010.07.012" rev="review">10.1016/j.jpsychores.2010.07.012</a></span><br />
<br />
<br />
<br />
<span class="Z3988" title="ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.jtitle=http%3A%2F%2Fwww.ncbi.nlm.nih.gov%2Fpubmed%2F20406532%23&rft_id=info%3Adoi%2F10.1017%2FS1355617710000354&rfr_id=info%3Asid%2Fresearchblogging.org&rft.atitle=Anorexia%2C+bulimia%2C+and+obesity%3A+shared+decision+making+deficits+on+the+Iowa+Gambling+Task+%28IGT%29&rft.issn=&rft.date=2011&rft.volume=&rft.issue=&rft.spage=&rft.epage=&rft.artnum=&rft.au=Brogan+A%2C+Hevey+D%2C+Pignatti+R.&rfe_dat=bpr3.included=1;bpr3.tags=Medicine%2CNeuroscience">Brogan A, Hevey D, Pignatti R. (2011). Anorexia, bulimia, and obesity: shared decision making deficits on the Iowa Gambling Task (IGT) <span style="font-style: italic;">J Int Neuropsychol Soc</span> DOI: <a href="http://dx.doi.org/10.1017/S1355617710000354" rev="review">10.1017/S1355617710000354</a></span><br />
<br />
<span class="Z3988" title="ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.jtitle=Cerebral+Cortex&rft_id=info%3Adoi%2Fhttp%3A%2F%2Fwww.ncbi.nlm.nih.gov%2Fpubmed%2F22772650&rfr_id=info%3Asid%2Fresearchblogging.org&rft.atitle=Inflammatory+Pathways+Link+Socioeconomic+Inequalities+to+White+Matter+Architecture.&rft.issn=&rft.date=2012&rft.volume=&rft.issue=&rft.spage=&rft.epage=&rft.artnum=&rft.au=Gianaros+PJ%2C+Marsland+AL%2C+Sheu+LK%2C+Erickson+KI%2C+Verstynen+TD.&rfe_dat=bpr3.included=1;bpr3.tags=Social+Science%2CNeuroscience">Gianaros PJ, Marsland AL, Sheu LK, Erickson KI, Verstynen TD. (2012). Inflammatory Pathways Link Socioeconomic Inequalities to White Matter Architecture. <span style="font-style: italic;">Cerebral Cortex</span> PMID: <a href="http://www.ncbi.nlm.nih.gov/pubmed/22772650" rev="review">22772650</a></span>
<br />
<br />
<span class="Z3988" style="background-color: white;" title="ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.jtitle=Science&rft_id=info%3Adoi%2Fhttp%3A%2F%2Fwww.ncbi.nlm.nih.gov%2Fpubmed%2F18927395&rfr_id=info%3Asid%2Fresearchblogging.org&rft.atitle=Relation+between+obesity+and+blunted+striatal+response+to+food+is+moderated+by+TaqIA+A1+allele&rft.issn=&rft.date=2008&rft.volume=&rft.issue=&rft.spage=&rft.epage=&rft.artnum=&rft.au=Stice+E%2C+Spoor+S%2C+Bohon+C%2C+Small+DM.&rfe_dat=bpr3.included=1;bpr3.tags=Neuroscience">Stice E, Spoor S, Bohon C, Small DM. (2008). Relation between obesity and blunted striatal response to food is moderated by TaqIA A1 allele <span style="font-style: italic;">Science</span> PMID: <a href="http://www.ncbi.nlm.nih.gov/pubmed/18927395" rev="review">18927395</a></span><br />
<br />
<br />Unknownnoreply@blogger.com1tag:blogger.com,1999:blog-4582649627549824354.post-56755114642526275922012-03-12T08:53:00.000-07:002012-03-12T08:53:22.238-07:00Tales From The Science Trenches: The unasked question<br />
<i><u>Note:</u> This post reflects my final diatribe on the issue of modern scientific publishing for a little while. While I think that this is definitely a flawed process in desperate need of fixing, I also think there are other issues that deserve our collective attention much more urgently. I will get to those in upcoming posts.</i><br />
<br />
After a dreadfully long hiatus (due to being on the job market), I have finally returned to the world of blogging. <br />
<br />
I've decided to get back to the topic I left hanging a few months ago: the so-called publication "crisis" in the field of cognitive neuroscience. Regular readers of this blog already know <a href="http://cognitiveaxon.blogspot.com/2011/12/tales-from-science-trenches-open-letter.html">my opinions</a> <a href="http://cognitiveaxon.blogspot.com/2011/12/tales-from-science-trenches-case-of.html">of this </a><a href="http://cognitiveaxon.blogspot.com/2011/12/tales-from-science-trenches-problem-of.html">problem</a>, but a little bit of time has added maturity and nuance to my opinions on this matter.<br />
<br />
In my travels these past few months I've had the opportunity to discuss the flaws of the modern review process at length with many well-respected colleagues. I've heard the good, the bad and the in-between about life on the front lines of scientific publishing. Some say it's way too difficult to publish today (or dreadfully biased). Others say it's too easy. But almost everyone agrees that it's a real pain in the ass.<br />
<br />
I don't want to spend too much time on the status of the current system. Dwight Kravitz & Chris Baker have published a very complete <a href="http://www.frontiersin.org/Computational_Neuroscience/10.3389/fncom.2011.00055/abstract">background & review</a> of this topic that has garnered a lot of good attention. I highly recommend it for anyone interested in this topic, if only to get a good understanding of how we got to where we are today.<br />
<br />
So for our purposes today, let's consider some anecdotal stories relayed to me by colleagues. <br />
<br />
<b>The Bad</b><br />
<br />
When given the choice between hearing "good news" and "bad news" I always accept the bad news first. So let's start by considering an example of where the process falls apart (from a colleague working in Ireland).<br />
<blockquote class="tr_bq">
Submitted a manuscript <i>Journal X [<b>Note</b>: obviously not the real journal name]</i> in April 2010, received an email in mid June expecting this to be reviews, this was in fact an email from the editor saying they would send it out to review. Mid-Oct 2010 (1 day before it reached 6months since submission!) we received an apologetic email and a decision, invited to resubmit based on the reviewer comments. I say reviewer because we only had one reviewer, the other had broken contact with the journal. Mid August 2011(6 months since re-submission, 1 and a half years since the initial submission), it appeared that the initial reviewer whose comments we’d addressed had also broken contact with the journal and we once again only had a single new review. At this stage they just accepted the article pending revisions based again on the single reviewer.<br />
<br />
Two other slightly irksome reviews both from <i>Journal Y</i>; in one case a reviewer told us we needed to discuss the role of the basal ganglia although this was a cerebellar paper, in the next round of reviews a different reviewer said “why are you discussing the basal ganglia in a cerebellar paper”. The second more annoying one was from one of an anatomy paper which used one of the most highly cited articles in my field (Kelly and Strick, 2003) to justify my regions of interest. The reviewer said Kelly and Strick were wrong and therefore we were wrong!</blockquote>
<br />
This highlights several key problems brought up again and again in my conversations with colleagues (and discussed at length by <a href="http://www.frontiersin.org/Computational_Neuroscience/10.3389/fncom.2011.00055/abstract">Kravitz & Baker</a>). <br />
<br />
First, there's the needless lag in getting the article turned around. Fifteen years ago, when manuscripts had to be physically mailed to journals, it was understandable to have a 1-2 year review process. However, we now live in an era of online submissions. Yet the time-line of many journals is still arduously slow and needlessly long. <br />
<br />
Second, mid-tier and low-tier journals often find themselves scrambling to find good reviewers for papers. I've had many colleagues tell me about a paper getting in with only a single reviewer (for non-scientists, the norm is 2-3 reviewers). In many of these cases there is a trend for these to also be out-sourced to graduate students or post-docs. <br />
<br />
Now don't get me wrong, this is an <i>incredible</i> training opportunity (and one that I am thankful to have started early in my career). However, without good oversight of an out-sourced review by the principal investigator who was originally solicited, a rookie mistake can kill a reasonably good study's chance of acceptance at a journal. This chance is worsened when the rookie is the <i>only</i> reviewer of a manuscript.<br />
<br />
Finally, the critiques are generally random, arbitrary, and sometimes not related to the manuscript itself at all. Sadly, it is fairly common these days for a paper to get rejected because of more global disagreements in the field rather than the quality of the study itself. We work in a large field with many big theoretical disagreements. As a result, all of us at one time or another have been collateral damage in a fight amongst the larger egos in our field. Yet this only serves to stifle the communication of ideas and results, rather than evaluate the true merit of any individual study.<br />
<br />
<b>The Good</b><br />
<br />
But contrary to this (and the arguments raised by Kravitz & Baker, as well as many other scientists), there are many times when the system actually works well. Contrast the previous story with one communicated by a friend working here in Pittsburgh (paraphrased here and not her actual quote).<br />
<blockquote class="tr_bq">
I submitted a paper to <i>Journal Z </i>and after a month I got comments back that were very positive and indicated a strong willingness to accept the paper. The reviewers were very collegial, but brought up an ideal control experiment that would bridge my work with another theory as well as making a better overall argument for my hypothesis. The editor had also read the paper and provided his own comments. He coalesced the reviewers comments into a specific set of recommendations and even offered recommendations for how to modify the design of the control experiment so as to make it a tighter/cleaner study. In the end, I felt like they were all invested in pushing the project forward.</blockquote>
In my opinion this is a textbook example of how the process should work in the first place: fast turnaround, constructive reviews, and a well-managed pipeline focused almost exclusively on the details of the study at hand. So while the system has its flaws, it's not broken everywhere.<br />
<br />
<b>So What Gives?</b><br />
<br />
Now there are two big differences between the Bad Story and the Good Story. The journals in the Bad Story are all neuroscience journals, while the Good Story comes from a psychology journal. So differences between fields may play a key role. <br />
<br />
But I think there's something deeper at play here. Specifically, differences in editorial oversight.<br />
<br />
Let's face it, most of us can agree that, in many cases, editors at some journals appear to be falling asleep on the job. But let's not be too harsh in this judgment. I can understand why this happens in many cases. <br />
<br />
Big publishing houses like <a href="http://www.google.com/url?sa=t&rct=j&q=&esrc=s&source=web&cd=1&cts=1331503993537&ved=0CEAQFjAA&url=http%3A%2F%2Fwww.elsevier.com%2F&ei=biNdT6vHJMPVgQezpPmhCw&usg=AFQjCNF_znWVwDulpZ_jymzqxGIjHt2HlA&sig2=8Md5Erx1c673p5_ikuZoPQ">Elsevier</a> want to make as much money on free labor as possible. So in many cases, not only are professional scientists giving free professional labor as reviewers, we're also serving as under-compensated editors. Editors are overworked, under appreciated, and have little time to manage their own research careers, let alone supervise the careers of others through the editorial process.<br />
<br />
In a context like this, I can understand the sub-optimal feedback and oversight in the editorial process.<br />
<br />
But what about higher impact journals with full-time professional editors? Well in my experience, those are just as poorly managed. After numerous experiences submitting to journals with full-time and professional editors, I have never received the type of constructive feedback and oversight like what happened in the Good Story above. In fact, I don't really believe that many editors at these journals even read the articles at all.<br />
<br />
Of course, editorial oversight is just one (albeit significant) part of the problem. Another part stems from the demand for reviews as well. Reviewers are under increasing pressure to turn around reviews on faster timelines, often with little time to really digest any particular study. This is on top of the increased quantity of reviews required to keep pace with the growing number of journals. <br />
<br />
Scientists push to get out as many papers as they can, and this increased volume of manuscripts requires even more reviewers. Eventually the demand gets so great that review quality drops precipitously. <br />
<br />
The military has a word for this: "<i>clusterfuck."</i><br />
<br />
<b>The Unasked Question</b><br />
<br />
So this is the state of our current publishing process in the field of cognitive neuroscience. It's driven a lot of professional scientists to push for a change. To fix the broken pipeline. <br />
<br />
Many argue that we need to overhaul the <i>entire</i> process itself. I view this opinion fondly, although I might hesitate to say that I am a "proponent". Some argue for a complete open access model with unblinded reviews. Others want a centralized database for submitting unreviewed manuscripts in order to crowd-source the evaluation of merit of individual studies. <br />
<br />
If you ask five cognitive neuroscientists their opinion on the current publishing process, you'll get ten different recommendations.<br />
<br />
But there's one question I think we're all neglecting and it's the most important question of all if we are going to try and address this so-called crisis. <br />
<i><br /></i><br />
<i>What the hell do we want from the publication process?</i><br />
<br />
Do we want to go back to the more traditional view where papers are a completed treatise on a research topic? Decades ago this was the central view of what a scientific paper should be. Researchers would spend years investigating a single question, run all possible control experiments (dozens or more), and carefully construct a single paper after all those years that eliminates all other possible alternative explanations for the effect of interest. These papers would be long, carefully constructed, and a nearly definitive treatise on a research topic. <br />
<br />
This is essentially the tortoise model of scientific advancement. The time-scale of publishing is very long, but also very reliable and consistent. Here the key measure of a research program's efficacy is the long-term citation record of a given paper. You might only publish one empirical article every few years, but it would be a definitive work.<br />
<br />
The alternative is the hare model of scientific advancement. Here publications should reflect a snap-shot of a research program's status. Only one or two experiments are reported in a paper (although thoroughly analyzed) and repeated publications all tie together over time to tell a larger story. This is a very accelerated model of scientific progress. Articles are less a treatise on a core theoretical question and more of a "status-update" of a research program.<br />
<br />
Over the last couple of decades, the push in cognitive neuroscience (and other fields) has been to move away from the slower, quality-focused model to the faster, quantity-based model. There are many reasons for this, but you can primarily thank the increasingly heated "publish-or-perish" academic culture for this change. <br />
<br />
Right now we are, unfortunately, stuck in a hybrid paradigm where we have the expectations of the tortoise model with the demands of the hare model. In my opinion, this bipolar paradigm is what is driving researchers crazy these days.<br />
<br />
So I return to my original question. What do we want from scientific publishing? We can't follow both the tortoise model and the hare model; realistically, we can't have the mechanics of one and the expectations of the other.<br />
<br />
Unfortunately, there is no good answer to this. I see equally valid pros and cons to either. Also, this is a decision that has profound implications well beyond how we structure the peer-review process. Departments and universities will have to completely revise how they evaluate the progress of faculty and students. Similarly for granting agencies. <br />
<br />
However, these changes are necessary. It's just a matter of deciding what the expectations need to be.<br />
<br />
Therefore, answering this question comes down to us talking together as a field. We need to decide which model we want to use and commit to it (with all the implications that follow). Until we make this decision, we can propose as many new publishing systems as we want, but they'll end up being nothing more than intellectual thought experiments that describe the world we wish could be, rather than real ways to fix the world that we have.Unknownnoreply@blogger.com3tag:blogger.com,1999:blog-4582649627549824354.post-27507146618671738502011-12-10T08:41:00.001-08:002011-12-11T08:04:36.468-08:00Tales From The Science Trenches: The problem of journal addiction<br />
<div class="MsoNormal">
<span lang="EN-GB"><i><u>Note</u>: Over the next few months I'll be doing a series of posts critiquing modern peer-review process. As part of this I'd like to collect stories from other scientists on their experiences trying to get papers through peer-review. Good or bad, if you have a story you'd like to share, please email me (timothyv[at]gmail[dot]com) or put them in the comments section and I'll post them as I see them. </i></span></div>
<div class="MsoNormal">
<span lang="EN-GB"><br /></span></div>
<div class="MsoNormal">
<span lang="EN-GB">Today's guest post comes from <a href="http://www.icn.ucl.ac.uk/motorcontrol/">Joern Diedrichsen</a> who is one of</span><span lang="EN-GB"> the smartest, most creative scientists that I know. He studies motor control at University College London and has a provocative take on the pains of the current peer-review process and the addictive mentality that keeps bringing us back for more.</span></div>
<div class="MsoNormal">
<span lang="EN-GB"><br /></span></div>
<div class="MsoNormal">
<span lang="EN-GB"><b>The problem of journal addiction</b></span></div>
<div class="MsoNormal">
<span lang="EN-GB"><b><br /></b></span></div>
<div class="MsoNormal">
<span lang="EN-GB">"Let’s admit
it, we are all suckers for glossy journals. Nature, Science, Neuron, Nature
Neuroscience... oh, how we puff up our chest when we have a paper accepted in
a high place! How we strut around and announce in talks: “our new paper in xxx
shows…”. And how deflated, angry, and bitter are we after a rejection. How we
hate the reviewers and the editors. "Ignorant bastards just do not understand anything."<o:p></o:p></span></div>
<div class="MsoNormal">
<span lang="EN-GB"><br /></span></div>
<div class="MsoNormal">
<span lang="EN-GB">And of course,
we are correct. Our papers are misinterpreted, rejected for selfish reasons, or
because of plain ignorance. And the editors do not have the spine to stand up
and tell reviewers how petty they are.
And the next time WE get asked to review a paper for this journal – which
is really not as good as our rejected paper – we’ll show them! The review
request lands in our inbox and we metamorphose into the dreaded reviewer 2. Or
3 (depending on how many hours past lunch it is). <o:p></o:p></span></div>
<div class="MsoNormal">
<span lang="EN-GB"><br /></span></div>
<div class="MsoNormal">
<span lang="EN-GB">Of course
we could boycott the whole system. Retreat to a small island. Only send papers
(no matter how good) to PLOSone or Frontiers. But a month later, when the
wounds have healed, we find ourselves preparing a cover letter for another submission
to one of the hated journals. Talk about addiction. <o:p></o:p></span></div>
<div class="MsoNormal">
<span lang="EN-GB"><br /></span></div>
<div class="MsoNormal">
<span lang="EN-GB">Do we really
think that the title of the journal we publish in means this much? Considering the
degree to which professional editors are slaves to fashion, and how random the
review process is, we really shouldn’t. I think some of my weaker papers have
been published in “better” journals - and vice versa. There is a slight
positive correlation – but not very high. Many of the papers that in retrospect are important, get
cited, and have impact on the field are in 2<sup>nd</sup>-tier journals. But
then again, in terms of careers, candidate selection, and funding decisions, we
all like to rely on the fast heuristic of the impact factor, not on how
important we think the paper is. <o:p></o:p></span></div>
<div class="MsoNormal">
<span lang="EN-GB"><br /></span></div>
<div class="MsoNormal">
<span lang="EN-GB">So, rather
than go by the journal name, if we all would READ the actual papers, see how clear,
compelling and novel the results really are, we should be able to break the yoke
that editors and reviewers hold over us, right? So, why are we not online every
week making sure that good papers in 2<sup>nd</sup> tier journals get their
well-earned exposure by posting online evaluations on journal websites? Why do
the online debates in PLOS Comp Biology often only consist of statements such
as “The reference section is missing a crucial reference: My article, 2011”? Why
does Faculty of 1000 seem to be struggling in terms of relevance and in getting
enough submissions? I guess we are simply too busy writing angry reviews, rewriting
our own papers for the next glossy journal, or arguing with journal editors. And
trust me, life as an editor is not rosy either. Talk about a thankless job.<o:p></o:p></span></div>
<div class="MsoNormal">
<span lang="EN-GB"><br /></span></div>
<div class="MsoNormal">
<span lang="EN-GB">Currently, I
do not see a good way out. Do we really think that low-threshold mass-journals
like Frontiers and PLOSone are the solution? It seems there is just too much
stuff out there, and post-hoc online evaluations by people in the field seem not
to work very well. So maybe the traditional peer-review and tiered journal
system is – like western democracy and capitalism – the lesser of the evils….
But maybe we can start to not reject because we feel the paper is too novel,
doesn’t cite us enough, or infringes on our turf? Maybe we should stop pushing papers
by friends in high profile journals for political gain? Maybe we should stop evaluating
people based on where they publish and turn our attention to the science they
produce instead? Perhaps teach our students some integrity and honesty?<o:p></o:p></span></div>
<div class="MsoNormal">
<span lang="EN-GB"><br /></span></div>
<div class="MsoNormal">
<span lang="EN-GB">No time for
that…. I need to fine-tune that cover letter…"<o:p></o:p></span></div>Unknownnoreply@blogger.com3tag:blogger.com,1999:blog-4582649627549824354.post-76316935437007594732011-12-05T05:33:00.001-08:002011-12-05T08:09:45.523-08:00Tales From The Science Trenches: Case of the missing editor<br />
<i><u>Note</u>: Over the next few months I'll be doing a series of posts critiquing the modern peer-review process. As part of this I'd like to collect stories from other scientists on their experiences trying to get papers through peer-review. Good or bad, if you have a story you'd like to share, please email me (timothyv[at]gmail[dot]com) or put it in the comments section and I'll post the stories as I see them. </i><br />
<br />
Almost immediately after my post yesterday, a friend emailed me to share her story. I think it makes another good case for why proper editorial management is so important.<br />
<br />
If you or a colleague have a similar story or even a positive one about the submission process, please email me your tale.<br />
<br />
<b>Case of the Missing Editor</b><br />
<br />
"Hi Tim,<br />
<br />
I submitted a manuscript to the International Journal of <i>[redacted]</i> on March 7. This journal doesn’t have an online submission site, so authors are supposed to email (or paper mail) the submission to the editor. After several attempts to email the editor with the email address listed on the journal website and on his university web page, none of the emails went through. I also didn’t get a response to an email I sent to someone else on the editorial board looking for the editor’s email address. <br />
<br />
So I contacted someone in the office I work in to see if they had ideas about where to go from there. It turned out the email addresses listed for the editor were incorrect, so they gave me the right one. I sent in the manuscript, it seemed to go through, but I never received acknowledgement of receipt. I forgot about it for a while, and figured I’d hear from them with a decision in a few months.<br />
<br />
At the end of July, I sent an email to the editor asking for acknowledgement of receipt and information on when I might expect to hear a decision. He never responded. At the end of August, I emailed an associate editor at the journal who happened to be at the same university, and asked if he could help me out with getting in contact with the editor about my questions. He responded right away saying he would get in contact with the editor for me, but I never heard anything from him after that.<br />
<br />
At the beginning of September, my advisor emailed the editor asking about my manuscript and another of his that he had sent in May. The editor responded within a day, saying he received the manuscripts and sent them out for review, and he has a new email address that we should use.<br />
<br />
We heard nothing back by the end of October, so at that point I sent another email inquiring about both manuscripts that are under review. Three weeks later, he responded and apologized for the 'long delay' in his response, saying that he has had a 'high workload.' He said 'I will be able to inform you about a decision very soon now. You will hear from me before December 15.'<br />
<br />
And so we wait.<br />
<br />
I’ve considered (many times) pulling my manuscript from the journal and submitting it somewhere else, but haven’t because the topic of my paper just wouldn’t fit well at very many journals, and all of my other journal options have lengthy (2-3 year) waits between acceptance and publication. Needless to say, I don’t plan on ever submitting to this journal again, at least as long as this person remains the editor."Unknownnoreply@blogger.com1tag:blogger.com,1999:blog-4582649627549824354.post-54526116362468206782011-12-04T09:47:00.001-08:002011-12-04T12:08:02.774-08:00Tales From The Science Trenches: An open-letter to Frontiers in Human Neuroscience<div>
<i><u>Note</u>: Over the next few months I'll be doing a series of posts critiquing the modern peer-review process. As part of this I'd like to collect stories from other scientists on their experiences trying to get papers through peer-review. Good or bad, if you have a story you'd like to share, please email me (timothyv[at]gmail[dot]com) or put it in the comments section and I'll post the stories as I see them. </i></div>
<div>
<br /></div>
<div>
In an ideal sense, the peer-review process is designed to provide useful critiques of scientific manuscripts so that the end result has minimal errors and justified conclusions.</div>
<div>
<br /></div>
<div>
Unfortunately, personal biases, poor editorial oversight and general abuse of the review process can sometimes impair communication of scientific results. What is described below is one example of the flaws in the process that are sometimes experienced by those trying to publish scientific results.</div>
<div>
<br /></div>
<div>
Recently my colleagues and I tried submitting a manuscript to the journal <a href="http://www.frontiersin.org/human_neuroscience">Frontiers in Human Neuroscience</a>. <a href="http://www.frontiersin.org/">Frontiers</a> is a set of open-access journals that prides itself on rapid turn-around (i.e., fast review cycles) and a simplified peer-review process focused primarily on methodological validity. </div>
<div>
<br /></div>
<div>
The manuscript we submitted was a study on the brain activation changes that occur with increasing obesity. The paper had already been bounced around to several other journals before being sent to Frontiers. As sometimes happens, the same anonymous reviewer followed us from journal to journal providing the same general (and in our opinion incorrect) critiques of the manuscript.</div>
<div>
<br /></div>
<div>
Now for those of you outside the process, it's generally considered bad manners to agree to review a manuscript that you have already critiqued at another journal. The logic is that you can't provide an unbiased review of the manuscript because you've already judged it previously.</div>
<div>
<br /></div>
<div>
Unfortunately, this particular reviewer keeps agreeing to review the manuscript while letting his previous judgements color his critique of our work. What's worse is that he (I'm assuming the gender because I have a good idea who it is) isn't interested in making helpful criticisms that could expand or fix the paper. You can see for yourself in the linked documents below.</div>
<div>
<br /></div>
<div>
So we submitted our work to Frontiers and this reviewer followed us there. He gave his usual critique, ignoring many of the changes that we made to address these concerns. We then spent a couple of months further revising the manuscript and running control analyses. Once the revisions were submitted to the journal, this hostile referee immediately <i>withdrew</i> his review (again, rather than address our arguments) while the only other referee signed off on our changes as being sufficient for publication.</div>
<div>
<br /></div>
<div>
Then we waited 8 weeks with no word from Frontiers. Again, for the non-scientists reading this, that is a pretty long time to wait for an editorial decision after the reviewers have submitted their responses. Emails to the editor were ignored and the status remained unchanged.</div>
<div>
<br /></div>
<div>
It turns out that the original editor had retired after we resubmitted our manuscript. Instead of seeing the submission through to completion (or not agreeing to be an editor in the first place), he simply withdrew as well. It appears that it took Frontiers almost 8 weeks to figure this out. Once a new editor was assigned, this individual (Dr. HH in the open letter below) made an executive decision to reject the manuscript <i>based only on the withdrawn review</i>. He completely ignored our replies and made no mention as to why our carefully constructed retort was incorrect. </div>
<div>
<br /></div>
<div>
In the end the new editor gave us his subjective opinion, based on the interpretations of an admittedly biased reviewer (he admitted he had reviewed our work at other journals) who withdrew from the review process instead of replying to our arguments. </div>
<div>
<br /></div>
<div>
Not exactly the thorough, intellectual conversation that the peer-review process purports to be.</div>
<div>
<br /></div>
<div>
Had this been my first experience like this, I would let it go. Disappointment comes with being a scientist. However, it is sad to say that this isn't an unusual circumstance in scientific publishing these days.</div>
<div>
<br /></div>
<div>
So what can I do besides submitting to another journal? Well I've taken the step of delivering an open letter to the editors at Frontiers in Human Neuroscience to outline my frustrations. <b>You can read it </b><b><a href="https://docs.google.com/document/d/1jJySqGIvIyp8B9Hko8SJdEb3AqFCvniRGWVb6H6cSaA/edit">here</a> </b>or see it below. </div>
<div>
<br /></div>
<div>
"Why make it an open letter?" you may ask. I'm not doing it to be vindictive. I'm doing it in the spirit of open access. With increasing demand to open up access to articles, I believe it's imperative to also open up access to critiques and criticisms of the review process itself. We don't have a formal repository of abuses of the peer-review process (i.e., a Yelp of scientific publishing, where one could learn about editorial, reviewer, or journal biases before submitting manuscripts). Often young scientists only find these out through trial-and-error when trying to publish their results.</div>
<div>
<br /></div>
<div>
Shedding light on the flaws of the peer-review process with your colleagues is the only way to start addressing these problems. Until we have a formal system for bringing these experiences to light, I'd like to start serving that role. If you have an experience of your own that highlights either errors/flaws or cases where the process worked precisely how you think it should, please send it along. I'll post your stories here on my blog as they come along and respect your privacy.</div>
<div>
<br /></div>
<div>
As of the time of this post, the Frontiers editors have not replied to my concerns laid out in the letter. I'm not holding my breath.</div>
<div>
<br /></div>
<div>
<u><b><br /></b></u></div>
<div>
<u><b>Epilogue</b></u>: For those of you interested in what a hostile review looks like, you can read the entire critique from the problematic reviewer and our reply to their critique <a href="https://docs.google.com/document/d/1LA-rYoydVY4G4IxsrrB9su7zt-vAadlJ5IQMqqTjxqU/edit">here</a> since Frontiers won't likely be releasing this anytime soon.</div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<b><u>Open letter to the editors of Frontiers in Human Neuroscience:</u></b></div>
<div>
<br /></div>
<div>
<div>
Dr. HH,</div>
<div>
<br /></div>
<div>
Note: In the letter that follows I speak only for myself and not my fellow co-authors. I will be making this letter open to my scientific colleagues so they can understand what they can expect from Frontiers.</div>
<div>
<br /></div>
<div>
I am extremely displeased with the way in which Frontiers has handled the review of this manuscript. Not only has it been mishandled and delayed due to lack of foresight and administration by the editorial staff, but the reasoning behind the final decision rests on a tautological adherence to the reputation of a reviewer who withdrew from the process altogether without addressing the merits of our arguments. At no place in the decision were our responses acknowledged and shown to be incorrect.</div>
<div>
<br /></div>
<div>
My specific complaints include:</div>
<div>
<br /></div>
<div>
1) Our original editor, Dr. NC removed himself as editor in the middle of the review process instead of either seeing the submission through to completion or not accepting it if he knew he was stepping down. Dr. NC ignored several email requests from our group asking about the status of the manuscript after we replied to the initial concerns. We received no correspondence from Frontiers for more than two months and no reply from Dr. NC altogether.</div>
<div>
<br /></div>
<div>
2) It took the Frontiers office almost 8 weeks to revisit our manuscript after the original reviewers had either withdrawn (Reviewer 1) or signed off (Reviewer 2) on our changes. </div>
<div>
<br /></div>
<div>
3) The editorial board chose to ignore the rude and unprofessional tone of Reviewer 1 (who even admitted that he has been a hostile reviewer of this manuscript at previous journals and yet did not decline the invitation to review). This reviewer did not provide constructive critiques that could improve the quality of the study. Instead they made rude, accusatory and unprofessional statements that were provably false. At no point did the editorial board intervene in this review. Further, the editorial decision completely ignored the second reviewer’s positive comments about the manuscript (some of which directly conflicted with the first reviewer).</div>
<div>
<br /></div>
<div>
4) Most insulting of all, rather than find an additional reviewer to replace the hostile referee who withdrew from the review, you made an executive decision based solely on the opinions of the withdrawn review! The arguments laid out in our reply were outright ignored, as was the fact that the reviewer had been withdrawn from the review process. If the reviewer had felt merit in their argument they would have further replied to our revisions. Instead they chose to simply withdraw from the discussion without providing critical feedback.</div>
<div>
<br /></div>
<div>
5) Your executive decision was based solely on several demonstrably false statements, which we rebutted in our response and address again here. These include:</div>
<div>
<br /></div>
<div>
a) "One expert in the field of this paper had serious concerns about the design and interpretation of this study. The authors' responses did not alleviate the reviewer's concerns." </div>
<div>
<br /></div>
<div>
Reply: This comment comes from a withdrawn review so it is unclear whether we actually alleviated the concerns of the reviewer. However, if the reviewer contacted Frontiers after our reply and did not address our carefully laid out arguments, then this puts us in a grossly unfair position since there is no formal critique with which to make a response. </div>
<div>
<br /></div>
<div>
b) "In particular, the reviewer was concerned that the 3 groups (normal, overweight, and obese) had major age differences, that these age differences may explain some of the results, and thought that the analysis attempting to factor age out is insufficient. The associate editor and I agree that considering the small number of subjects indeed the age differences cannot be overlooked." </div>
<div>
<br /></div>
<div>
Reply: This argument is specious and wholly unsubstantiated on several levels. First, as we carefully lay out in our reply, it is the Overweight group that is slightly older than the Normal and Obese groups. In fact the ages of the Normal and Obese groups are statistically identical. Yet our neural effects are Normal < Overweight < Obese in the categorical analysis (specifically Normal < Obese) and strictly linear in the parametric regression analysis. If age was the driving factor our categorical effect pattern would be Normal < Obese < Overweight and have an inverted-U characteristic in the parametric analysis (Figure 7a). Therefore this argument fails the simple logic test.</div>
<div>
<br /></div>
<div>
Second, the reviewer's belief that using nuisance regressors is an invalid way of controlling for non-specific effects goes against basic statistical theory. The argument of accounted variance in nested regression models is the fundamental theory of structural equation modeling and mediation analyses. Yet the reviewer misquotes one article in a psychiatry journal to make his case. We clearly demonstrated in the reply that BMI and age are not correlated and therefore (even according to the article cited by the withdrawn reviewer) age can be treated as a valid covariate for the neural effects.</div>
<div>
<br /></div>
<div>
Finally, an N=29 is NOT a small sample size for a conventional fMRI study. There might be a case that it under-powers the categorical analysis, but for the parametric regression analyses (which support the categorical findings) it is sufficient even by the conservative sample-size recommendation of Thirion et al. 2007, which the reviewer himself cites. </div>
<div>
<br /></div>
<div>
At no stage has the withdrawn reviewer or editor pointed out why any of our arguments are incorrect. They are simply disregarded without reply.</div>
<div>
<br /></div>
<div>
c) "Moreover, the mere fact that such conspicuous age difference were found, suggests some non-random sampling, which suggests that other factors would covary with BMI. In fact, overweight and especially morbid obesity are associated with numerous factors which may be of importance (e.g. socio-economic factors, health problems, not to speak of comfort in lying in the scanner), but the paper unfortunately does not provide clinical or demographic data on the subjects."</div>
<div>
<br /></div>
<div>
Reply: We fail to see the logic in the argument of "non-random sampling." In fact, with random sampling you will get some inconsistencies between groups. Equal demographics across groups is only achieved through NON-RANDOM sampling. Nonetheless, as we carefully laid out in our reply, there is no logical way that age can explain our effects. As for the other covariates, the slippery slope argument to control for an ever increasing number of factors is a needless "torpedo" argument that can be lodged against virtually any between-group study. This is a proof of concept study of inhibitory control deficits in obesity. Once an effect is confirmed, further work can be done to elucidate mechanisms and causes. But you have to know what you're looking for first. If we followed your logic to its conclusion, no study could be published until all explanatory factors can be fully accounted for. For example, differences between younger and older adults could be due to lack of comfort in the MRI environment in older adults. Differences between borderline personality disorder and controls could be due to emotional reactions to the MRI environment. Differences between schizophrenics and controls could be due to discomfort and irritation lying in a tight enclosed space. In short, any between-subject study (including many of those published by the editors of this journal) could be said to have the same limitations as our study. This in no way undermines studies of age, schizophrenia, borderline personality disorder, autism, or any other disease. Instead, it emphasizes that despite these limitations in between-group comparisons there can be important information gained. Similarly, our study on obesity is not immune to these issues, but it does not mean that important information on brain health in obesity could not be gleaned from this work.</div>
<div>
<br /></div>
<div>
d) "Another issue is the fact that since the behavioral results did not in fact confirm the prediction of the authors (more inhibitory problems with obesity), the paper relies heavily on reverse inference – that is, on drawing conclusions from assumptions on the cognitive roles of specific activated brain regions."</div>
<div>
<br /></div>
<div>
Reply: In many brain imaging studies, behavioral effects are weaker or absent even where evoked brain dynamics are clearly visible. In fact, the overall slowing of response times across groups is behavioral evidence for our argument that in higher-BMI subjects it requires more processing power to get the same behavioral output; thus all responses would be slower. Again, this suggests that the manuscript was not read carefully, since we make this point very clear in the text.</div>
<div>
<br /></div>
<div>
Given this rude and unprofessional experience, I will request that you no longer ask me to be a reviewer on future manuscripts since I do not wish to participate in such a flawed and obviously biased system. I will also not be sending any new manuscripts to Frontiers in the foreseeable future. </div>
<div>
<br /></div>
<div>
Sincerely,</div>
<div>
Timothy Verstynen Ph.D.</div>
<div>
Research Associate</div>
<div>
Department of Psychology</div>
<div>
Learning Research and Development Center</div>
<div>
University of Pittsburgh</div>
<div>
Pittsburgh, PA 15260</div>
</div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>Unknownnoreply@blogger.com4tag:blogger.com,1999:blog-4582649627549824354.post-8203167331716577502011-11-22T15:01:00.001-08:002011-11-25T11:14:26.890-08:00Thoughts on the science of brutalityLike many Americans, I couldn't help but sit in astonishment last week as I watched the videos showing University of California police officers <a href="http://www.youtube.com/watch?v=buovLQ9qyWQ">hitting</a> and <a href="http://www.youtube.com/watch?v=6AdDLhPwpp4">pepper-spraying</a> non-violent student protestors. <br />
<br />
My initial reaction was disgust and anger. How could a reasonable person see this as proportional force to disperse the protesting students? While it's not as bad as what's happening in Egypt or Syria, this is definitely not behavior we expect to see happen in the land of the free and home of the brave. (I'd like to point out that the "it's worse elsewhere" defense is specious at best).<br />
<br />
Many of us direct our anger at the police officers themselves. I'm guilty of this too. When I saw the video of UC Davis students being pepper-sprayed, I immediately thought that Officer Pike must be sadistically enjoying the act. I mean obviously only a sadistic nutcase would do such a malicious act like this right? A "normal" person would act reasonably and find an alternative, more peaceful way right? <br />
<br />
Unfortunately, psychology says that this may <u>not</u> be the case. <br />
<br />
The science of human behavior consistently demonstrates that conformity is the norm, even for aggressive and abusive actions. Doing the "right" thing in the face of authority demanding excessive force is the exception to the rule. All freshman psychology majors know this story by heart due to the tragic success of the research by Stanley Milgram and Philip Zimbardo. <br />
<br />
<b>Conformity To Power</b><br />
<br />
<a href="http://en.wikipedia.org/wiki/Stanley_Milgram">Stanley Milgram</a> was an American psychologist who wanted to understand how it was that so many Germans could commit the atrocities that occurred under Nazi control. Are Germans just a sadistic people inclined towards violence, or could anyone be pushed to kill an innocent human being? <br />
<br />
To test this, Milgram took a random sample of people around Yale University and <a href="http://en.wikipedia.org/wiki/Milgram_experiment">put them in charge of "delivering" electric shocks to "participants" who were answering questions in another room</a>. Every time the "participant" gave an incorrect response, the test subject was told to deliver a shock and increase the intensity of the electrical stimulus on the next incorrect answer. Milgram found that most people (over half) caved to the authority of the experimenter and would deliver even a "fatal" electric shock to the unseen "participant." <br />
<br />
Of course the "participant" was just an actor and the "shocks" were never actually delivered, but the conclusion was very clear. For most people the drive to conform to authority was so strong that they'd even hurt or kill a stranger rather than resist the authoritative figure. This is even <i>without</i> a threat of violence from the authority figure.<br />
<br />
<b>Power Corrupts</b><br />
<br />
About a decade later <a href="http://en.wikipedia.org/wiki/Philip_Zimbardo">Philip Zimbardo</a> showed how context can even affect the authority figures themselves. In one of the most terrifying experiments in modern psychology, Zimbardo showed that <a href="http://en.wikipedia.org/wiki/John_Dalberg-Acton,_1st_Baron_Acton">John Dalberg-Acton</a> had a particularly acute insight into the pernicious influence of power on human behavior. <br />
<br />
Zimbardo took a random sample of Stanford University students and assigned half to play the role of "prisoners" and the rest to be fake "guards" in a mock prison set up in the basement of the Department of Psychology. <a href="http://en.wikipedia.org/wiki/Stanford_prison_experiment">Zimbardo himself took the role of "warden.</a>" Within a matter of days things quickly fell apart. Otherwise normal college students found themselves abusing their classmates by playing the role of authoritarian prison guards. The entire experiment had to be shut down as people fell too deeply into their simulated roles (including Zimbardo himself) and started hurting fellow students. This was in a setup where <i>everyone</i> knew that their roles weren't real.<br />
<br />
The important thing to realize is that in both the Milgram study and the Stanford Prison Experiment, it wasn't a subset of people pre-disposed to violence who committed abusive acts; it was an easy majority each time. When pressured by authority, people will do extraordinary things, and left unconstrained, those in authority can go too far. It's simply human nature.<br />
<br />
<b>Two Eyes For An Eye</b><br />
<br />
Even in the moment, psychology teaches us that each forceful act is perceived differently by those doing the hitting and those being hit. Neuroscientist and motor researcher <a href="http://www.ted.com/talks/daniel_wolpert_the_real_reason_for_brains.html">Dan Wolpert</a> wanted to understand why it is that the fights between his two children escalated so rapidly (well really he was trying to understand how the brain optimally predicts sensory stimuli when we move, but it was his kids that apparently gave him this idea for the experiment). <br />
<br />
Wolpert had subjects <a href="http://www.ncbi.nlm.nih.gov/pubmed/12855800">press on a lever attached to a robotic arm</a>. Another arm delivered the <u>exact</u> same force to another participant seated in the same room. This participant was then instructed to deliver the force he felt back to the other person using a similar robot setup. With each cycle the amount of force being delivered would nearly double. This is because our brains underestimate the experience of a force when we produce it, so it feels weaker than the forces we experience from someone else. It's the same reason why we can't tickle ourselves: the sensory signals are partially cancelled out when we produce them. When we expect it, our brains make things feel a bit less intense. <br />
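This escalation can be captured in a toy model (my own illustrative sketch, not Wolpert's analysis; the attenuation value of 0.5 is a hypothetical choice that makes the matched force exactly double each exchange, roughly as described above):<br />
<br />

```python
def escalation(initial_force, attenuation=0.5, turns=5):
    # Each player tries to match the force they just received, but
    # perceives their own output attenuated by `attenuation` (< 1).
    # To *feel* like a match, they must produce received / attenuation,
    # so the actual force grows geometrically with each exchange.
    forces = [initial_force]
    for _ in range(turns):
        forces.append(forces[-1] / attenuation)
    return forces

print(escalation(1.0))  # with attenuation 0.5: 1, 2, 4, 8, 16, 32
```

With any attenuation factor below one, the loop runs away geometrically, which is why neither sibling ever feels they hit harder than they were hit.<br />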
<br />
Bringing this back to the events of the last few weeks, this may mean that the officers do not fully perceive how rough they are actually being. Their brains literally don't register that they are hitting as hard as they are, so in the moment the blows may not seem as strong as they really are. (Of course, who knows if this holds for the experience of pepper-spraying.)<br />
<br />
<b>Putting The Context In Context</b><br />
<br />
Before I end, I need to point out that there are thousands of upstanding police officers in this country who put themselves in dangerous situations every day to keep the public safe and who treat civilians in a civil manner. There are many, many officers who treat protesters with dignity and without violence. Sadly, we often react with more anger to misuses of pepper spray than with sadness when an officer is <a href="http://www.sfgate.com/cgi-bin/article.cgi?f=/c/a/2011/11/17/BA691M0POT.DTL">gunned down doing the right thing in the line of duty</a>. <br />
<br />
In no way do I mean to imply that, because of the context they are put in, all police officers are going to be abusive or lose control. I'm just suggesting that we take a good look at ourselves and realize that many of us, if put in the same situation, would very likely do the same thing as Lt. Pike or the Berkeley police officers. <br />
<br />
So what do we do? Maybe instead of just admonishing excessive force when we see it, we should also focus on rewarding those officers who stand up against the pressure, and against the internal pull to use excessive force because it's the easy route. After all, reinforcement works better than punishment for changing behavior (we can also thank psychology for knowing that). But until we start accepting that these actions are not abnormal, but in fact predictable within a context, punishing officers who go too far won't change the likelihood of similar acts in the future.<br />
<br />
With the exception of psychopaths and saints, if you treat a man like a dog you will get a dog. But treat a man like a man and you'll get a human being.<br />
<br />
<br />
<span style="float: left; padding-bottom: 5px; padding-left: 5px; padding-right: 5px; padding-top: 5px;"><a href="http://www.researchblogging.org/"><img alt="ResearchBlogging.org" src="http://www.researchblogging.org/public/citation_icons/rb2_large_gray.png" style="border: 0;" /></a></span>
<span class="Z3988" title="ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.jtitle=Science&rft_id=info%3Adoi%2F10.1126%2Fscience.1085327&rfr_id=info%3Asid%2Fresearchblogging.org&rft.atitle=Two+Eyes+for+an+Eye%3A+The+Neuroscience+of+Force+Escalation&rft.issn=0036-8075&rft.date=2003&rft.volume=301&rft.issue=5630&rft.spage=187&rft.epage=187&rft.artnum=http%3A%2F%2Fwww.sciencemag.org%2Fcgi%2Fdoi%2F10.1126%2Fscience.1085327&rft.au=Shergill%2C+S.&rfe_dat=bpr3.included=1;bpr3.tags=Neuroscience">Shergill, S. (2003). Two Eyes for an Eye: The Neuroscience of Force Escalation <span style="font-style: italic;">Science, 301</span> (5630), 187-187 DOI: <a href="http://dx.doi.org/10.1126/science.1085327" rev="review">10.1126/science.1085327</a></span>Unknownnoreply@blogger.com2tag:blogger.com,1999:blog-4582649627549824354.post-64272708565274525182011-11-13T12:36:00.001-08:002011-11-13T12:38:02.724-08:00On twitter now....Thanks to Brad Voytek I have caved and <a href="http://www.twitter.com/">Twittered</a> myself. Follow me @tdverstynen for my microblogging/ranting in 140 characters or less.Unknownnoreply@blogger.com1tag:blogger.com,1999:blog-4582649627549824354.post-24388761129899049002011-10-31T04:48:00.000-07:002011-10-31T05:07:03.498-07:00Zombie Brain: Conclusions<i style="font-family: Arial, Helvetica, sans-serif;"><span id="internal-source-marker_0.7011585831642151" style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">This post is the final installment of our collaborative venture (between <a href="http://blog.ketyov.com/">Oscillatory Thoughts</a> and <a href="http://cognitiveaxon.blogspot.com/">Cognitive Axon</a>) exploring the <a href="http://cognitiveaxon.blogspot.com/2011/10/living-dead-brain-what-forensic.html">Zombie Brain</a>. We hope you’ve enjoyed this little ride. 
</span></i><i style="font-family: Arial, Helvetica, sans-serif;"><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">Sincerely, Bradley Voytek Ph.D. & Tim Verstynen Ph.D.</span></i><br />
<div style="background-color: transparent;">
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
<span style="background-color: transparent; font-family: Arial, Helvetica, sans-serif; font-weight: bold; text-align: -webkit-auto; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">Bringing it all together: The Zombie Brain</span></div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<iframe allowfullscreen='allowfullscreen' webkitallowfullscreen='webkitallowfullscreen' mozallowfullscreen='mozallowfullscreen' width='320' height='266' src='https://www.youtube.com/embed/5kZqliPHaoY?feature=player_embedded' frameborder='0'></iframe></div>
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;"><span class="Apple-style-span" style="white-space: pre-wrap;"><b><br /></b></span><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">Over the last ten days we’ve laid out our vision of the zombie brain. To recap, we’ve shown that zombies:</span><br /><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"></span><br /><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">1) <a href="http://blog.ketyov.com/2011/10/zombie-brain-impulsive-reactive.html">Have an over-active aggression circuit.</a></span></span></div>
<div style="background-color: transparent;">
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;"><span class="Apple-style-span" style="white-space: pre-wrap;"><br /></span><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">2) <a href="http://cognitiveaxon.blogspot.com/2011/10/symptom-2-lumbering-walk.html">Show cerebellar dysfunction causing them to move slowly.</a></span><br /><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><br /></span></span></div>
<div style="background-color: transparent;">
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;"><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">3) <a href="http://blog.ketyov.com/2011/10/zombie-brain-long-term-memory-loss.html">Suffer from long-term memory loss due to damage to the hippocampus.</a></span><br /><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><br /></span></span></div>
<div style="background-color: transparent;">
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;"><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">4) <a href="http://cognitiveaxon.blogspot.com/2011/10/zombie-brain-language-deficits.html">Present with global aphasia (i.e., can’t speak but can understand language).</a></span><br /><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><br /></span></span></div>
<div style="background-color: transparent;">
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;"><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">5) <a href="http://blog.ketyov.com/2011/10/zombie-brain-selfother-delusion.html">Suffer from a variant of Capgras-Delusion</a></span><br /><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><br /></span></span></div>
<div style="background-color: transparent;">
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;"><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">6) <a href="http://cognitiveaxon.blogspot.com/2011/10/zombie-brain-pain-perception.html">Have impaired pain perception</a></span><br /><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><br /></span></span></div>
<div style="background-color: transparent;">
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;"><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">7) <a href="http://blog.ketyov.com/2011/10/zombie-brain-stimulus-locked-attention.html">Cannot attend to more than one thing at a time.</a></span><br /><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><br /></span></span></div>
<div style="background-color: transparent;">
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;"><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">8) <a href="http://cognitiveaxon.blogspot.com/2011/10/zombie-brain-flesh-addiction.html">Exhibit addictive responses to eating flesh</a>.</span><br /><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><br /></span></span></div>
<div style="background-color: transparent;">
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;"><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">9) <a href="http://blog.ketyov.com/2011/10/zombie-brain-insatiable-hunger.html">Have an insatiable appetite.</a></span><br /><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"></span><br /><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">Together these symptoms and their neurological roots reveal a striking picture of the zombie brain. </span><br /><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"></span><br /><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">Based on the behavioral profile of the standard zombie, we conclude that the zombie brain would have massive atrophy of the “association areas” of the neocortex: i.e., those areas that are responsible for the higher-order cognitive functions. Given the clear cognitive and memory deficits, we would also expect significant portions of the frontal and parietal lobes, and nearly the entire temporal lobe, to exhibit massive degeneration. Also, the hippocampuses of both hemispheres would be massively atrophied (resulting in memory deficits), along with most of the cerebellum (resulting in a loss of coordinated movements).</span><br /><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"> </span><br /><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">In contrast, we would expect that large portions of the primary cortices would remain intact. 
Behavioral observations lead us to conclude that vision, most of somatosensation (i.e., touch), and hearing are likely unimpaired. We also hypothesize that gustation and olfaction would remain largely unaffected. Relatedly, we must further conclude that large sections of the thalamus and midbrain, brainstem, and spinal cord are all likely functioning normally or are in a hyper-active state. </span><br /><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"></span><br /><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">Putting these elements together, we have reconstructed a plausible model for what the zombie brain would look like. This is shown in yellow below and presented over a normal human brain for comparison.</span><br /><span style="background-color: transparent; font-size: 10pt; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"> </span></span></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="http://2.bp.blogspot.com/-i__9vOJzxnM/Tq2AZA3nxvI/AAAAAAAAGMg/MZaQcK3jg6E/s1600/z2h_overlay.bmp" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="90" src="http://2.bp.blogspot.com/-i__9vOJzxnM/Tq2AZA3nxvI/AAAAAAAAGMg/MZaQcK3jg6E/s400/z2h_overlay.bmp" width="400" /></a></div>
<div style="background-color: transparent; text-align: center;">
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;"><span class="Apple-style-span" style="font-size: x-small;"><span class="Apple-style-span" style="white-space: pre-wrap;">Overlay (yellow is zombie, gray is human)</span></span></span></div>
<div style="background-color: transparent;">
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;"><span class="Apple-style-span" style="font-size: x-small;"><span class="Apple-style-span" style="white-space: pre-wrap;"><br /></span></span></span></div>
<div style="background-color: transparent;">
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;"><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">It is interesting to point out, from a historical standpoint, that many of the regions we hypothesize to be damaged in the zombie brain are part of what is generally referred to as the <a href="http://en.wikipedia.org/wiki/Papez_circuit">Papez circuit</a>. <a href="http://en.wikipedia.org/wiki/James_Papez">James Papez</a> first identified this circuit in 1936. Much like our current "study", Papez was trying to unify a cluster of behavioral phenomena he had observed into a neuroanatomical model of the brain. He wondered why emotion and memory are so strongly linked. His hypothesis was that emotional and memory brain regions must be tightly interconnected. </span><br /><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"></span><br /><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">To test this theory, he injected the rabies virus into the brains of cats to watch how it spread and he made note of which brain regions were destroyed as a result of these injections. He observed that the hippocampus (important for memory formation) connects to the orbitofrontal cortex (social cognition and self-control), the hypothalamus (hunger regulation, among other things), the amygdala (emotional regulation), and so on. These experiments, conducted almost three-quarters of a century ago, may shed some insight into the nature of the zombie disorder today. We’re not suggesting that some super, brain-eating rabies virus is responsible for zombies. 
We’re just saying that it’s not </span><span style="background-color: transparent; font-style: italic; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">not</span><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"> possible.</span><br /><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"> </span><br /><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">The profile of damage we have outlined corroborates the behavioral observations we have made from zombie films. From a subjective standpoint, this pattern of cerebral atrophy represents a most heinous form of injury unparalleled in the scientific literature. It would lead to a pattern of violence and social apathy; patients thus affected would represent a grievous harm to society, with little chance of rehabilitation. The only recommendation is immediate quarantine and isolation of the subject. 
</span><br /><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"></span><br /><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">However, as we learned in GI Joe “knowing is half the battle.” Based on our observations, we leave you with a<a href="http://www.wired.com/underwire/2011/06/zombie-apocalypse-science/"> few strategies to maximize survival in the event of a zombie encounter</a>.</span><br /><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"> </span><br /><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">1) </span><span style="background-color: transparent; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;">Outrun them</span><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">: Climb to a high point or some other place they will have trouble reaching. Practice parkour. The slow zombie variant can’t catch up with a healthy adult human.</span><br /><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"></span><br /><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">2) </span><span style="background-color: transparent; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;">Don’t fight them</span><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">: They can’t feel pain and aren’t afraid of dying, so they’ve got the edge in close combat. 
If you can simply outrun them, why risk the bite?</span><br /><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"></span><br /><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">3) </span><span style="background-color: transparent; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;">Keep quiet and wait</span><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">: The zombie memory is so terrible that if you can hide long enough, it will only mill around until something else captures its attention.</span><br /><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"></span><br /><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">4) </span><span style="background-color: transparent; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;">Distraction, distraction, distraction</span><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">: Throw something behind the zombie to capture its attention. 
Set off a flare, use a flashbang, or do whatever you need to do to distract it and get away.</span><br /><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"></span><br /><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">5) </span><span style="background-color: transparent; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;">If you can’t beat them, join them</span><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">: If you can’t outrun them (or are up against the fast zombie variant), take advantage of their self-other delusion and act like one of them. </span><br /><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"></span><br /><span style="background-color: transparent; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">There you have it, folks... scientifically validated safety tips for surviving the zombie apocalypse. Use them wisely the next time you come face-to-face with the living dead.</span></span></div>
<div class="p2">
<span class="Apple-style-span" style="font-family: inherit;"><br /></span></div>
<div class="p1">
<b><span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;">Symptom 8: Flesh addiction</span></b></div>
<div class="p2">
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;"><br /></span></div>
<div class="p1">
<div style="background-color: transparent;">
<span id="internal-source-marker_0.621183977695182" style="background-color: transparent; font-family: Arial; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">“Braainss... BRAAAINS!” Zombies call out for them like a man calls out for water after a week in the desert. Yet no matter how much they eat, they can never be satisfied. It’s as if the craving to consume brains and/or human flesh is the sole thought running through a zombie’s “mind”. Zombies will even risk loss of “life” and limb to satisfy these urges.</span><br />
<span style="background-color: transparent; font-family: Arial; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"> </span><br />
<span style="background-color: transparent; font-family: Arial; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">These symptoms mirror those seen in dysfunction of the “reward circuits” in the brain. It’s as if the living dead are addicted to flesh and flesh consumption is a compulsion. </span></div>
<div style="background-color: transparent;">
<span style="background-color: transparent; font-family: Arial; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><br /></span></div>
<div style="background-color: transparent;">
<span style="background-color: transparent; font-family: Arial; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">The sense of reward, or the “high”, that you experience originates with dopamine cells that sit in a set of regions collectively known as the ventral striatal reward pathway, part of a larger network of areas spanning the neocortex, midbrain and brainstem.</span><br />
<span style="background-color: transparent; font-family: Arial; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><br /></span></div>
<div style="background-color: transparent;">
<div class="separator" style="clear: both; text-align: center;">
<a href="http://2.bp.blogspot.com/-4_ACyGgGMbg/TqrWX4HgJ7I/AAAAAAAAGMM/crdveW42q40/s1600/S8_reward_circuit.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="275" src="http://2.bp.blogspot.com/-4_ACyGgGMbg/TqrWX4HgJ7I/AAAAAAAAGMM/crdveW42q40/s320/S8_reward_circuit.png" width="320" /></a></div>
<span style="background-color: transparent; font-family: Arial; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><br /></span></div>
<div style="background-color: transparent; text-align: center;">
<span style="background-color: transparent; font-family: Arial; font-size: x-small; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">Adopted from <a href="http://www.ncbi.nlm.nih.gov/pubmed/12383779">Wise (2002)</a></span></div>
<div style="background-color: transparent;">
<span style="background-color: transparent; font-family: Arial; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><br /></span></div>
<div style="background-color: transparent;">
<span style="background-color: transparent; font-family: Arial; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">In many ways this circuit starts and ends in the brainstem with the release of dopamine. Note that these are a different set of dopamine neurons than those involved in the <a href="http://blog.ketyov.com/2011/10/zombie-brain-impulsive-reactive.html">aggression circuit</a> we discussed earlier. Activating these “reward cells” with stimulation (e.g., drugs, food, sex, etc., in humans, or direct electric stimulation in animals) causes them to transmit dopamine to other regions in the cortex and subcortex such as the striatum. This reinforces the drive for future reward seeking behaviors. </span></div>
<div style="background-color: transparent;">
<span style="background-color: transparent; font-family: Arial; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><br /></span></div>
<div style="background-color: transparent;">
<span style="background-color: transparent; font-family: Arial; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">These signals converge to a set of cells in the nucleus accumbens, which is essential for determining the motivational significance of the reward stimulus, causing the person to think, “Mmmmm that was fun; I’ll do that again.” </span></div>
<div style="background-color: transparent;">
<span style="background-color: transparent; font-family: Arial; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><br /></span></div>
<div style="background-color: transparent;">
<span style="background-color: transparent; font-family: Arial; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">In cases of extreme drug abuse, simply showing pictures of drugs to an addict will engage this reward circuit. The same is true for people addicted to eating: showing them pictures of food can reengage the same reward regions as eating.</span><br />
<span style="background-color: transparent; font-family: Arial; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"> </span><br />
<span style="background-color: transparent; font-family: Arial; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">In zombies, this dopamine reward circuit is likely in overdrive. Paired with a loss of the feeding “off-switch” in the brain, this could lead to the insatiable appetite that zombies have. Of course, in humans fatty diets cause more hunger and the brain is a highly fatty substance, so unfortunately, the more the zombie eats... the more it wants… But we'll discuss that a bit later.</span><br />
<span class="Apple-style-span" style="font-family: Arial;"><span class="Apple-style-span" style="white-space: pre-wrap;"><br />We expect that if you put a zombie in an MRI machine and showed it pictures of human flesh, you would detect activation in many regions of this ventral reward circuit. In fact, these would be the same activation patterns we'd expect to see in the brain of a (living) drug addict when presented with pictures of their drug of choice.</span></span><br />
<span class="Apple-style-span" style="font-family: Arial;"><span class="Apple-style-span" style="white-space: pre-wrap;"><br /></span></span><br />
<div class="separator" style="clear: both; text-align: center;">
<a href="http://1.bp.blogspot.com/-kDQ8ijalPro/Tqr-wlGsv9I/AAAAAAAAGMY/8fpMQhKSRoI/s1600/S8_fMRI.001.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="240" src="http://1.bp.blogspot.com/-kDQ8ijalPro/Tqr-wlGsv9I/AAAAAAAAGMY/8fpMQhKSRoI/s320/S8_fMRI.001.jpg" width="320" /></a></div>
<div style="text-align: center;">
<span class="Apple-style-span" style="font-family: Arial; font-size: x-small;"><span class="Apple-style-span" style="white-space: pre-wrap;"><i>What fMRI would look like in the zombie brain.</i></span></span></div>
<span style="background-color: transparent; font-family: Arial; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><br /></span><br />
<span style="background-color: transparent; font-family: Arial; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">But why isn’t a zombie satisfied even after it has consumed an entire human on its own? Well that's a whole other blog post. Let's just say the zombie brain doesn't know or doesn't care when it's full. </span><br />
<span style="background-color: transparent; font-family: Arial; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><br /></span><br />
<span style="background-color: transparent; font-family: Arial; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">So today's lesson shows us that zombies are depraved flesh addicts who will stop at nothing to get their next fix (i.e., <i>you</i>).</span></div>
</div>Unknownnoreply@blogger.com7tag:blogger.com,1999:blog-4582649627549824354.post-34463426068325766612011-10-28T12:16:00.000-07:002011-10-31T04:54:25.301-07:00Symptom 7: Stimulus-locked attentionHead over to <a href="http://blog.ketyov.com/2011/10/zombie-brain-stimulus-locked-attention.html">Oscillatory Thoughts</a> for today's Zombie Brain factoid: Stimulus-locked Attention.Unknownnoreply@blogger.com2tag:blogger.com,1999:blog-4582649627549824354.post-69473205275432681612011-10-27T08:26:00.000-07:002011-10-27T19:22:07.976-07:00Zombie Brain: Pain Perception<div class="p1">
<i><span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;">This is part six of our multi-day series on The Zombie Brain. Be sure to visit Oscillatory Thoughts tomorrow for symptom 7!</span></i></div>
<div class="p2">
<span class="Apple-style-span" style="font-family: inherit;"><br /></span></div>
<div class="p1">
<b><span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;">Symptom 6: Pain Perception</span></b></div>
<div class="p2">
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;"><br /></span></div>
<div class="p1">
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;">Cut off an arm, yet they keep coming. Shoot them in the chest, they keep coming. Light them on fire, they keep coming. How does the zombie continue to chase us despite wounds that would cause debilitating pain in a normal person?</span></div>
<div class="p2">
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;"><br /></span></div>
<div class="p1">
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;">It's quite simple really. They’re not aware of the damage done to them. More specifically, they may not be <i>feeling</i> the damage being done.</span></div>
<div class="p2">
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;"><br /></span><br />
<div class="separator" style="clear: both; text-align: center;">
<a href="http://1.bp.blogspot.com/-AhZdfnCpu5w/TqiNktG1bUI/AAAAAAAAGLw/EiPt7P5090A/s1600/ShaunOfTheDeadToaster.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="179" src="http://1.bp.blogspot.com/-AhZdfnCpu5w/TqiNktG1bUI/AAAAAAAAGLw/EiPt7P5090A/s320/ShaunOfTheDeadToaster.png" width="320" /></a></div>
<div style="text-align: center;">
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif; font-size: x-small;"><i>That toaster's going to leave a mark!</i></span></div>
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;"><br /></span></div>
<div class="p1">
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;">Scientifically we call the sensation of painful stimuli </span><a href="http://en.wikipedia.org/wiki/Nociception" style="font-family: Arial, Helvetica, sans-serif;">nociception</a><span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;">.* </span><span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;">The physiological systems that regulate our experience of pain are incredibly complex. So I'm going to give you the <i><u>short</u></i> and simple version. </span><br />
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;"><br /></span><br />
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;">Receptors in the skin pick up mechanical, thermal, or chemical changes and relay this information to neurons in the spinal cord. This information goes up the spine through a few different routes and gets relayed to several cortical regions. The combined recruitment of these neocortical regions then gives rise to that "Ouch! F#$% that hurt!" response. </span></div>
<div class="p2">
<span class="Apple-style-span" style="font-family: inherit;"><br /></span></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="http://2.bp.blogspot.com/-4Zs6_2k7Rz8/TqeOidx8muI/AAAAAAAAGLo/zs9CGpuZt3k/s1600/Basbaum_pathways.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><span class="Apple-style-span" style="font-family: inherit;"><img border="0" height="320" src="http://2.bp.blogspot.com/-4Zs6_2k7Rz8/TqeOidx8muI/AAAAAAAAGLo/zs9CGpuZt3k/s320/Basbaum_pathways.jpg" width="280" /></span></a></div>
<div class="p2" style="text-align: center;">
<span class="Apple-style-span" style="font-family: inherit; font-size: x-small;">The pain pathways (from <a href="http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2852643/">Basbaum et al. 2009</a>)</span></div>
<div class="p2">
<span class="Apple-style-span" style="font-family: inherit;"><br /></span></div>
<div class="p1">
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;">A majority of these pain signals are processed in a forward part of the parietal cortex, known as the somatosensory cortex. This area sits right behind the region of the brain that consciously controls movements. Now the somatosensory cortex regulates our experience of all physical sensations (touch, vibration, etc.) and processes most of the conscious signals that we are aware of feeling. However, it is actually made up of two distinct areas: the primary and secondary somatosensory cortices. Each regulates the processing of different types of sensory information.</span></div>
<div class="p2">
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;"><br /></span></div>
<div class="p1">
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;">There is also a second pain pathway that regulates our rapid, unconscious experiences of pain. Most of this engages the inappropriately named "fight-or-flight" circuit via the amygdala. Signals are relayed to a few separate areas such as the cingulate (which processes conflict) and the insula (which, well, appears to do everything). It is thought that these areas regulate the emotional salience of pain. </span></div>
<div class="p2">
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;"><br /></span></div>
<div class="p1">
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;">Now let's think about this... when was the last time you saw a zombie get emotional about anything let alone a little thing like having a limb chopped off? This suggests that this second pain pathway is disrupted in the zombie brain.</span></div>
<div class="p2">
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;"><br /></span></div>
<div class="p1">
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;">It’s also clear that zombies can still move and they have an idea of basic sensations (they know where their own bodies are, and they do react reflexively to stimuli), but they don’t appear to have conscious awareness of pain and other sensations. </span><span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;">This gives us good reason to believe that the nerves that sense pain, pressure, and so on in the body are intact, because zombies do still react to stimuli. We also know that the spinal cord that transmits those sensations up to the brain (and movement signals down from it) must also be intact. Furthermore, before touch signals get to the brain, they stop in the brainstem, where they can be modulated and controlled before entering “conscious” perception.</span></div>
<div class="p2">
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;"><br /></span></div>
<div class="p2">
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;">Thus, we believe that there are a couple of vectors for the zombie’s immunity to pain.</span></div>
<div class="p2">
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;"><br /></span></div>
<div class="p1">
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;">First, their <a href="http://en.wikipedia.org/wiki/Secondary_somatosensory_cortex">secondary somatosensory</a> regions in the parietal cortices are impaired. This would minimize experiencing some types of painful sensations, but not all. Note that the <a href="http://en.wikipedia.org/wiki/Primary_somatosensory_cortex">primary somatosensory cortex</a> (regulating fine touch, sense of limbs, etc.) is still intact.</span></div>
<div class="p2">
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;"><br /></span><br />
<div class="separator" style="clear: both; text-align: center;">
<a href="http://1.bp.blogspot.com/-l9U565lG3No/TqiQ0t_9C_I/AAAAAAAAGL4/o416E2IiNqk/s1600/S5_pain.001.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="240" src="http://1.bp.blogspot.com/-l9U565lG3No/TqiQ0t_9C_I/AAAAAAAAGL4/o416E2IiNqk/s320/S5_pain.001.jpg" width="320" /></a></div>
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;"><br /></span><br />
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;"><br /></span></div>
<div class="p2">
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;">More importantly, neocortical regions like the insula and cingulate should also be obliterated in the zombie brain. This would eliminate any emotional reactions to the residual painful stimuli processed in the somatosensory cortex.</span></div>
<div class="p2">
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;"><br /></span><br />
<div class="separator" style="clear: both; text-align: center;">
<a href="http://1.bp.blogspot.com/-48136p9Ijxw/TqiSrP4wcGI/AAAAAAAAGMA/P3TyidqM4js/s1600/S5_pain_pathway2.001.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="240" src="http://1.bp.blogspot.com/-48136p9Ijxw/TqiSrP4wcGI/AAAAAAAAGMA/P3TyidqM4js/s320/S5_pain_pathway2.001.jpg" width="320" /></a></div>
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;"><br /></span><br />
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;">Thus zombies may actually feel pain and really just not give a crap about it. Just like Chuck Norris.</span><br />
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;"><br /></span></div>
<div class="p1">
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;">There you have it folks. A numb, cold-hearted creature incapable of feeling pain (please save the lawyer jokes for another forum).</span></div>
<div class="p2">
<span class="Apple-style-span" style="font-family: inherit;"><br /></span></div>
<div class="p2">
<span class="Apple-style-span" style="font-family: inherit;"><i><br /></i></span></div>
<div class="p1">
<span class="Apple-style-span" style="font-family: Arial, Helvetica, sans-serif;"><i>* Contrary to popular belief, we don't have just 5 senses. We probably have closer to 20, and most of them involve different types of physical senses. There's a sense for feeling your limbs in space (proprioception). There's a sense of fine touch and vibrations called epicritic touch. There's the feeling of heat, sharp pain, etc. </i></span></div>Unknownnoreply@blogger.com6tag:blogger.com,1999:blog-4582649627549824354.post-73737339657475308862011-10-25T08:38:00.000-07:002011-10-27T19:22:59.529-07:00Zombie Brain: Language Deficits<span class="Apple-style-span" style="background-color: white; color: #333333; font-family: Trebuchet, 'Trebuchet MS', Arial, sans-serif; font-size: 13px; line-height: 20px;">This is part four of our multi-day series on <a href="http://cognitiveaxon.blogspot.com/2011/10/living-dead-brain-what-forensic.html">The Zombie Brain</a>. </span><span class="Apple-style-span" style="background-color: white; color: #333333; font-family: Trebuchet, 'Trebuchet MS', Arial, sans-serif; font-size: 13px; line-height: 20px;">Be sure to visit <a href="http://blog.ketyov.com/">Oscillatory Thoughts</a> tomorrow for symptom 5!</span><br />
<div style="background-color: transparent;">
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 10pt; font-style: normal; font-variant: normal; font-weight: bold; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><br /></span></div>
<div style="background-color: transparent;">
<span id="internal-source-marker_0.5583381922915578" style="background-color: transparent; color: black; font-family: Arial; font-style: normal; font-variant: normal; font-weight: bold; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">Symptom 4: Language deficits</span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"> </span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">Let's face it, zombies aren’t known for their oratory skills. Usually you’ll hear nothing but a collective set of moans as they’re pounding at the barricaded doors. Keep in mind that the most fluent phrase we ever hear Tarman say in Return of the Living Dead is "Braaaaaains!"</span></div>
<div style="background-color: transparent;">
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 10pt; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><br /></span></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="http://2.bp.blogspot.com/-PIFglNE9uL8/TqXtjtq42PI/AAAAAAAAGLI/j96UENDw99w/s1600/tarman_brains.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="228" src="http://2.bp.blogspot.com/-PIFglNE9uL8/TqXtjtq42PI/AAAAAAAAGLI/j96UENDw99w/s400/tarman_brains.png" width="400" /></a></div>
<div style="background-color: transparent; text-align: center;">
<span class="Apple-style-span" style="font-family: Arial; font-size: x-small;"><span class="Apple-style-span" style="white-space: pre-wrap;">Tarman goes on a short lived speaking tour</span></span></div>
<div style="background-color: transparent; text-align: left;">
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 10pt; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><br /></span></div>
<div style="background-color: transparent;">
<span class="Apple-style-span" style="font-family: Arial; white-space: pre-wrap;">At best you’ll get a disjointed burst of individual words. For example, a somewhat intelligent zombie might utter into the walkie-talkie of a recently consumed police officer, “send... more... cops...” in order to get a new delivery of fresh meat (as observed in Return of the Living Dead). But that would be considered the Shakespeare of zombies.</span></div>
<div style="background-color: transparent;">
<span class="Apple-style-span" style="font-family: Arial; font-size: x-small;"><span class="Apple-style-span" style="white-space: pre-wrap;"><br /></span></span></div>
<div style="background-color: transparent;">
<span class="Apple-style-span" style="font-family: Arial; white-space: pre-wrap;">This type of speaking is called telegraphic speech: the words are present, but the execution is jammed. </span><span class="Apple-style-span" style="font-family: Arial;"><span class="Apple-style-span" style="white-space: pre-wrap;">Neurologically, this relates to </span></span><span class="Apple-style-span" style="font-family: Arial; white-space: pre-wrap;">a specific disorder known as <a href="http://en.wikipedia.org/wiki/Expressive_aphasia">expressive aphasia</a> or, as it is classically known, Broca’s aphasia. </span></div>
<div>
<span class="Apple-style-span" style="font-family: Arial; white-space: pre-wrap;"><br /></span></div>
<div style="background-color: transparent;">
<span class="Apple-style-span" style="font-family: Arial; white-space: pre-wrap;">Now in the normal living human, this language production ability is mediated by an area of the brain that rests just behind your temple. More often than not, just behind your <i>left</i> temple.</span></div>
<div style="background-color: transparent;">
<span class="Apple-style-span" style="font-family: Arial; white-space: pre-wrap;"><br /></span></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="http://4.bp.blogspot.com/-2U4b9jsoPJQ/TqXwMZ3sP-I/AAAAAAAAGLQ/5mVeIj-siYo/s1600/brocas.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="247" src="http://4.bp.blogspot.com/-2U4b9jsoPJQ/TqXwMZ3sP-I/AAAAAAAAGLQ/5mVeIj-siYo/s320/brocas.png" width="320" /></a></div>
<div style="background-color: transparent;">
<span class="Apple-style-span" style="font-family: Arial; white-space: pre-wrap;"><br /></span></div>
<div style="background-color: transparent;">
<span class="Apple-style-span" style="font-family: Arial; white-space: pre-wrap;">Broca's area is named after <a href="http://en.wikipedia.org/wiki/Paul_Broca">Paul Broca</a>, who described the language deficits of Patient "Tan." Tan was reportedly unable to say anything but the word "tan" after this region of the frontal cortex was damaged. (Historical side note: Tan could actually say many other things; however, they were all just vulgar profanities. Apparently French neurological societies frowned upon the idea of naming him Patient "Foutre!").</span></div>
<div style="background-color: transparent;">
<span class="Apple-style-span" style="font-family: Arial; white-space: pre-wrap;"><br /></span></div>
<div style="background-color: transparent;">
<span class="Apple-style-span" style="font-family: Arial; white-space: pre-wrap;">Okay, back to zombies! Zombies don't just have a problem producing language; they don't seem to be able to comprehend it either. </span><span class="Apple-style-span" style="font-family: Arial; white-space: pre-wrap;">They never respond to verbal commands and rarely seem to stop to read road signs (hence they are chronically lost). </span><span class="Apple-style-span" style="font-family: Arial; white-space: pre-wrap;">This inability to comprehend language reflects another type of classical deficit called <a href="http://en.wikipedia.org/wiki/Receptive_aphasia">receptive aphasia</a>, known by its more common name, </span><span class="Apple-style-span" style="font-family: Arial; white-space: pre-wrap;">Wernicke’s aphasia. You guessed it... that's because the guy who discovered it was <a href="http://en.wikipedia.org/wiki/Carl_Wernicke">Carl Wernicke</a>. </span></div>
<div style="background-color: transparent;">
<span class="Apple-style-span" style="font-family: Arial; white-space: pre-wrap;"><br /></span></div>
<div style="background-color: transparent;">
<span class="Apple-style-span" style="font-family: Arial; white-space: pre-wrap;">Wernicke's aphasia comes from damage to a different region of the brain. This sits farther back in your head, at the junction of the temporal and parietal lobes (basically behind and slightly above your ear). </span></div>
<div style="background-color: transparent;">
<span class="Apple-style-span" style="font-family: Arial; white-space: pre-wrap;"><br /></span><br />
<div class="separator" style="clear: both; text-align: center;">
<a href="http://2.bp.blogspot.com/-cUbAzbS4ZE0/TqYC27AR3cI/AAAAAAAAGLY/MbAZPS5wUFY/s1600/wernickes.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="240" src="http://2.bp.blogspot.com/-cUbAzbS4ZE0/TqYC27AR3cI/AAAAAAAAGLY/MbAZPS5wUFY/s320/wernickes.png" width="320" /></a></div>
<span class="Apple-style-span" style="font-family: Arial; white-space: pre-wrap;"><br /></span><br />
<span class="Apple-style-span" style="font-family: Arial; white-space: pre-wrap;">What does this tell us about the zombie brain? Well, it would appear that the frontal language production areas and the temporal/parietal language comprehension areas are both atrophied in the zombie cortex. Since these regions communicate with one another via a large bundle of white matter called the arcuate fasciculus, it's safe to say that this “arcuate circuit” is obliterated in the zombie brain, along with the frontal and parietal language regions. </span></div>
<div style="background-color: transparent;">
<span class="Apple-style-span" style="font-family: Arial; white-space: pre-wrap;"><br /></span><br />
<div class="separator" style="clear: both; text-align: center;">
<a href="http://3.bp.blogspot.com/-bmb-EBv6s18/TqYEi--B6hI/AAAAAAAAGLg/esEO6QklyPI/s1600/S3_language.001.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="240" src="http://3.bp.blogspot.com/-bmb-EBv6s18/TqYEi--B6hI/AAAAAAAAGLg/esEO6QklyPI/s320/S3_language.001.jpg" width="320" /></a></div>
<span class="Apple-style-span" style="font-family: Arial; white-space: pre-wrap;"><br /></span><br />
<span class="Apple-style-span" style="font-family: Arial; white-space: pre-wrap;">Damage to the frontal (Broca’s) region leads to expressive (Broca’s) aphasia, and damage to the parietal (Wernicke’s) region leads to receptive (Wernicke’s) aphasia. Thus, all language and communication skills would be severely disrupted in the zombie brain.</span><br />
<span class="Apple-style-span" style="font-family: Arial; white-space: pre-wrap;"><br /></span><br />
<span class="Apple-style-span" style="font-family: Arial; white-space: pre-wrap;"><b>Bottom line</b>: Don't try talking to a zombie. It's not worth your time.</span><br />
<span class="Apple-style-span" style="font-family: Arial; white-space: pre-wrap;"><br /></span></div>Unknownnoreply@blogger.com6tag:blogger.com,1999:blog-4582649627549824354.post-36734295115249167882011-10-24T11:19:00.001-07:002011-10-27T19:22:59.522-07:00Symptom 3: Long Term Memory LossBe sure to head over to <a href="http://blog.ketyov.com/2011/10/zombie-brain-long-term-memory-loss.html">Oscillatory Thoughts</a> for our third symptom of the Zombie Brain: Long Term Memory LossUnknownnoreply@blogger.com0tag:blogger.com,1999:blog-4582649627549824354.post-68462008421226557772011-10-23T09:01:00.000-07:002011-10-27T19:22:59.527-07:00Zombie Brain: Lumbering Walk<i>This is part three of our multi-day series on <a href="http://cognitiveaxon.blogspot.com/2011/10/living-dead-brain-what-forensic.html">The Zombie Brain</a>. Be sure to visit <a href="http://blog.ketyov.com/">Oscillatory Thoughts</a> tomorrow for symptom 3!</i><br />
<br />
<b>Symptom 2: Lumbering walk</b><br />
<br />
As soon as zombies rise from the dead, they begin walking. Well not walking... more like lumbering. Each step is slow and arduous. Their stance is wide and steady. This presents us with a very important clue about their brains.<br />
<br />
Now a lot has been said about the origins of the zombie “walk.” Given the pervasiveness of the disease, some have argued that zombie movements are like those seen in <a href="http://en.wikipedia.org/wiki/Parkinson's_disease">Parkinson’s disease</a>. Parkinson’s is a devastating neurodegenerative disorder caused by the loss of dopamine neurons in the brain that project to a group of regions collectively known as the basal ganglia. It is partly characterized by a slow decline in coordination and the ability to move (not the spastic, jerking movements of stereotype... those are a side effect of the medications).<br />
<br />
However, consider this: persons afflicted with this disease shuffle when they walk, adopting short sliding movements and a hunched posture. Shaking and tremors are also present while patients are not moving. This does not sound like the zombie movements we see on the silver screen. Zombies can move quickly when striking and show no signs of a hunched posture or tremor. Therefore, we believe that it’s time to do away with the basal ganglia theory of zombie locomotion!<br />
<br />
<div style="text-align: center;">
Example of a Parkinsonian Gait (skip to 0:30 mark)</div>
<div class="separator" style="clear: both; text-align: center;">
<iframe allowfullscreen='allowfullscreen' webkitallowfullscreen='webkitallowfullscreen' mozallowfullscreen='mozallowfullscreen' width='320' height='266' src='https://www.youtube.com/embed/j86omOwx0Hk?feature=player_embedded' frameborder='0'></iframe></div>
<br />
<br />
The lumbering zombie walk more resembles the movements characterized by damage to an area of the brain called the <a href="http://en.wikipedia.org/wiki/Cerebellum">cerebellum</a>.* The cerebellum is a little cauliflower shaped region at the back and base of your brain.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="http://4.bp.blogspot.com/-vaKYMd-4yac/TqIbX-4wLKI/AAAAAAAAGK0/7mUI3vNmjCk/s1600/S2_cerebellum.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="269" src="http://4.bp.blogspot.com/-vaKYMd-4yac/TqIbX-4wLKI/AAAAAAAAGK0/7mUI3vNmjCk/s320/S2_cerebellum.png" width="320" /></a></div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<br />
It is involved in many functions (e.g., learning, language, memory, sensation); however, it is classically described as a motor coordination region. Indeed, this “little brain” contains more than half of the neurons in your entire brain!<br />
<br />
<div style="text-align: center;">
Example of Cerebellar Ataxia Gait</div>
<div class="separator" style="clear: both; text-align: center;">
<iframe allowfullscreen='allowfullscreen' webkitallowfullscreen='webkitallowfullscreen' mozallowfullscreen='mozallowfullscreen' width='320' height='266' src='https://www.youtube.com/embed/eBvzFkcvScg?feature=player_embedded' frameborder='0'></iframe></div>
<br />
<br />
Patients with degeneration of the cerebellum exhibit a syndrome referred to as <a href="http://en.wikipedia.org/wiki/Spinocerebellar_ataxia">spinocerebellar ataxia</a>, which is characterized by uncoordinated movements of many kinds, including a wide-stance and lumbering walk.<br />
<br />
Although patients with cerebellar ataxia exhibit many coordination problems, the symptoms are alleviated somewhat with the assistance of vision. This may be another important clue about the zombie brain.<br />
<br />
Thus, we contend that zombies suffer from a severe spinocerebellar ataxia. Well, the “slow zombies” do, at least.<br />
<br />
What about fast zombies? Given the terrifyingly coordinated movements that “fast zombies” exhibit (think 28 Days Later or the recent remake of Dawn of the Dead) their cerebellums are likely intact. Thus we can also begin to develop neurological classifications of different subtypes of the zombie disorder that may give important clues to the etiology of the zombie epidemic.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="http://3.bp.blogspot.com/-VxpLPN6SDC0/TqIeHQSe4vI/AAAAAAAAGK8/LnxQmCa3afo/s1600/S2_cerebellum_comparison.001.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="300" src="http://3.bp.blogspot.com/-VxpLPN6SDC0/TqIeHQSe4vI/AAAAAAAAGK8/LnxQmCa3afo/s400/S2_cerebellum_comparison.001.jpg" width="400" /></a></div>
<br />
<br />
<i><br /></i><br />
<i>* Truth be told, when we had the <a href="http://blog.ketyov.com/2010/11/zombcon-interview-with-george-romero.html">opportunity to ask George Romero</a> why he made his ghouls walk the way they did in the Living Dead movies, he responded “They’re supposed to be dead. They’re stiff. That’s how you’d walk.” Not quite the answer that appeals to our neuroscience instincts, but a good alternative hypothesis to test in the next zombie apocalypse. Check out part of the interview below.</i><br />
<i><br /></i><br />
<i><br /></i><br />
<div class="separator" style="clear: both; text-align: center;">
<iframe allowfullscreen='allowfullscreen' webkitallowfullscreen='webkitallowfullscreen' mozallowfullscreen='mozallowfullscreen' width='320' height='266' src='https://www.youtube.com/embed/vQwBgyo1TfE?feature=player_embedded' frameborder='0'></iframe></div>
<i><br /></i>Unknownnoreply@blogger.com8tag:blogger.com,1999:blog-4582649627549824354.post-66884850194585294042011-10-22T09:36:00.000-07:002011-10-27T19:22:59.524-07:00Symptom 1: Impulsive-Reactive AggressionBe sure to head over to <a href="http://blog.ketyov.com/2011/10/zombie-brain-impulsive-reactive.html">Oscillatory Thoughts</a> for our first symptom of the Zombie Brain.Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-4582649627549824354.post-18997543760040002011-10-21T09:02:00.000-07:002011-10-27T19:22:59.518-07:00The Living Dead Brain: What Forensic Neuroscience Can Tell Us about the Zombie BrainDr. Timothy Verstynen & Dr. Bradley Voytek, <a href="http://www.zombieresearch.org/">Zombie Research Society</a><br />
<br />
<i>This is a cross-post between <a href="http://blog.ketyov.com/2010/10/zombie-neuroscience.html">Oscillatory Thoughts</a> and <a href="http://cognitiveaxon.blogspot.com/">Cognitive Axon</a>. Stay tuned to both sites over the following days leading up to Halloween for updates on our model of the zombie brain. </i><br />
<br />
<br />
<center>
<iframe allowfullscreen='allowfullscreen' webkitallowfullscreen='webkitallowfullscreen' mozallowfullscreen='mozallowfullscreen' width='374' height='190' src='https://www.youtube.com/embed/5kZqliPHaoY?rel=0' frameborder='0'></iframe>
</center>
<br />
<br />
What can <a href="http://bit.ly/mHTBkN">neuroscience teach us</a> about <a href="http://blog.ketyov.com/2011/06/using-science-to-survive-zombie.html">surviving the zombie apocalypse</a>?<br />
<br />
What makes a zombie a zombie or, more importantly, what makes a zombie not a human? Philosophers contend that a zombie lacks the qualia of experience that underlie normal consciousness.<br />
<br />
However, this is a less-than-satisfying explanation for why the lumbering, flesh-eating creatures are pounding outside the door of your country farmhouse.<br />
<br />
Beyond the (currently) immeasurable idea of consciousness or the whole supernatural “living dead” theory, zombies are characterized primarily by their highly abnormal but stereotyped behaviors. This is particularly true in more modern manifestations of the zombie genre wherein zombies are portrayed not as the reanimated dead, but rather as living humans infected by biological pathogens. They are alive, but they are certainly not like us.<br />
<br />
Neuroscience has shown that all thoughts and behaviors are associated with neural activity within the brain. Therefore, it should not be surprising that the zombie brain would look and function differently than the gray matter contained in your skull. Yet, how would one know what a zombie brain looks like?<br />
<br />
Luckily, the rich repertoire of behavioral symptoms shown in cinema gives the astute neuroscientist or neurologist clues as to the anatomical and physiological underpinnings of zombie behavior. By taking a forensic neuroscience approach, we can piece together a hypothetical picture of the zombie brain.<br />
<br />
Over the course of the next week, <i>Oscillatory Thoughts</i> and <i>Cognitive Axon</i> will team up to show our hypothetical model of the zombie brain. Each day we will present a new "symptom" associated with a zombie behavior and show its neural correlates in our simulated zombie brain.<br />
<br />
This entire endeavor is partly an academic "what if" exercise for us and partly a tongue-in-cheek critique of the methods of our profession of cognitive neuroscience. We’ll be breaking up the workload and alternating days (hey... we gotta work our real jobs too) so be sure to check both places for the newest updates on zombie neuroscience.
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<img alt="Timothy Verstynen and Bradley Voytek - Zombie Research Society zombie brain" border="0" src="http://2.bp.blogspot.com/-dfptjjQw-yE/TqENJauZbRI/AAAAAAAADsQ/BjNAvhALA30/s320/VerstynenVoytekZombieBrain.jpg" width="400" /></div>
<b><br /></b><br />
<b><br /></b><br />
<b>DISCLAIMER:</b> We need to be very clear on one point. While we sometimes compare certain symptoms in zombies to real neurological patient populations, we are in no way implying that patients with these other disorders are in some way “part zombie”. Neurological disorders have provided critical insights into how the brain gives rise to behavior and we bring them up for the sake of illustration only. Their reference in this context is in no way meant to diminish the devastating impact that neurological diseases can have on patients and their caregivers.Unknownnoreply@blogger.com3tag:blogger.com,1999:blog-4582649627549824354.post-85684855950607093722011-10-01T14:37:00.000-07:002011-10-01T14:47:10.911-07:00When good science is used badly<div class="separator" style="clear: both; text-align: center;">
<a href="http://4.bp.blogspot.com/-yMk9Am36MWs/ToeHzXuv5GI/AAAAAAAAGKo/Ua4sWnaeu1I/s1600/Brain-phone-love.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="178" src="http://4.bp.blogspot.com/-yMk9Am36MWs/ToeHzXuv5GI/AAAAAAAAGKo/Ua4sWnaeu1I/s400/Brain-phone-love.png" width="400" /></a></div>
<br />
<br />
In a recent <a href="http://www.nytimes.com/2011/10/01/opinion/you-love-your-iphone-literally.html?_r=1">New York Times op-ed piece</a>, branding guru and self-described scientist <a href="http://www.martinlindstrom.com/">Martin Lindstrom</a> gives a perfect example of why scientific tools should only be used by professional scientists and not self-trained hacks. In his article titled "You love your iPhone. Literally." Mr. Lindstrom made the case that we are not addicted to our smart phones, but that we have established a relationship with our technology that is on par with the process of "love." <br />
<br />
Mr. Lindstrom would have you believe that he's not just giving his professional opinion as a marketing consultant, but that he has scientific data to validate this claim. However, let's take a closer look at these spurious claims.<br />
<br />
First, Mr. Lindstrom describes an imaging experiment that he undertook to see if marketed brand names engage the same brain circuits as religious symbols. <br />
<blockquote>
<span class="Apple-style-span" style="background-color: white; font-family: georgia, 'times new roman', times, serif; font-size: 15px; line-height: 22px;">"A few years back, I conducted an experiment to examine the similarities between some the world’s strongest brands and the world’s greatest religions. Using functional magnetic resonance imaging (fMRI) tests, my team looked at subjects’ brain activity as they viewed consumer images involving brands like Apple and Harley-Davidson and religious images like rosary beads and a photo of the pope. We found that the brain activity was uncannily similar when viewing both types of imagery."</span></blockquote>
To someone who doesn't live in the world of brain images all day (like I do), this sounds pretty promising, right? You might think, "Huh, my brain is activated the same way when I see the <a href="http://images.google.com/search?tbm=isch&hl=en&source=hp&biw=1074&bih=762&q=apple+logo&gbv=2&oq=apple+logo&aq=f&aqi=g10&aql=&gs_sm=e&gs_upl=953l2848l0l3080l12l10l1l0l0l0l202l1134l4.4.1l9l0">Apple logo</a> as when I see the <a href="http://en.wikipedia.org/wiki/Madonna_of_Bruges">Madonna of Bruges</a>." But that is nowhere close to what these results reflect. Now, I haven't seen the details of his study, so I can't speak to the soundness of his methodology. However, I can point out two major inconsistencies in his interpretations. <br />
<br />
First, finding "uncannily similar" brain areas engaged by consumer brands and religious symbols is not that surprising. Both are visual stimuli with interpretive meaning, so we'd expect them to be encoded in the same visual networks. But that doesn't mean that you <i>value</i> them the same way. A symbol may just be a symbol as far as the brain is concerned. <br />
<br />
Second, Mr. Lindstrom is committing one of the most basic of scientific fallacies. <b> Not detecting a difference between two conditions isn't the same thing as there not being a difference.</b> It is called "arguing the null hypothesis". In science, we can't say anything definitive about differences that we don't see, only differences that we do. Just because monkeys and children pick their noses at the same rate does not mean that they're the same creature. But this is essentially Mr. Lindstrom's conclusion. <br />
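To see just how weak a "no detected difference" result is, here's a quick back-of-the-envelope sketch (my own illustration with made-up effect and sample sizes, not anything from Mr. Lindstrom's study): with a modest but real difference and a small sample, a standard two-sample test misses the difference most of the time.

```python
from math import erf, sqrt

def normal_cdf(x):
    """Standard normal CDF, built from the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def two_sample_power(effect_size, n_per_group, z_crit=1.96):
    """Approximate power of a two-sample test (normal approximation,
    two-sided alpha = 0.05)."""
    return normal_cdf(effect_size * sqrt(n_per_group / 2.0) - z_crit)

# A modest but real difference (Cohen's d = 0.3) with a small sample
# (n = 20 per condition) -- both numbers made up for illustration:
print(f"{two_sample_power(0.3, 20):.0%}")  # ~16%: the test usually misses it
```

So a study this size would fail to detect a genuine difference roughly five times out of six, which is why "we saw the same activity" proves nothing about sameness.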
<br />
Okay, so he's a bad fMRI researcher. Big deal... there are a lot of them these days. Let's look at some of Mr. Lindstrom's other data.<br />
<br />
<blockquote>
<span class="Apple-style-span" style="background-color: white; font-family: georgia, 'times new roman', times, serif; font-size: 15px; line-height: 22px;">"...I gathered a group of 20 babies between the ages of 14 and 20 months. I handed each one a BlackBerry. No sooner had the babies grasped the phones than they swiped their little fingers across the screens as if they were iPhones, seemingly expecting the screens to come to life."</span></blockquote>
<br />
This again is wrong in so many ways. As anyone who has ever interacted with children can tell you, they aren't the most coordinated of folks. In fact, the brain systems that regulate our movements aren't fully developed until early adolescence. Did Mr. Lindstrom give phones to children from countries where iPhones aren't as common for a control group? Presumably not, but that would be one way to see whether this behavior is just random grasping from people without fully formed cerebellums. <br />
<br />
Perhaps most importantly, children <i>imitate</i> adults. They adopt the behaviors of the people around them as a way of learning the world. That's a key part of development (as evidenced by these <a href="http://www.google.com/url?sa=t&source=video&cd=2&ved=0CE0QtwIwAQ&url=http%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3DgsCu5s0J_Do&ei=_n2HTvKUEoHt0gGAo7j7Dw&usg=AFQjCNE_w1VXfJkoQScv-K4_gFboEJqp6A&sig2=FBHcO_T01Cgp8vi5XaZ56A">two kids</a> who obviously don't know how to speak, but sure know how to act like it). Just because children are imitating their parents doesn't mean that they value them in the same way as Mr. Lindstrom appears to be suggesting.<br />
<br />
Finally, and perhaps most egregiously, Mr. Lindstrom reports on yet another brain imaging study. In this case he presented either a visual movie of a ringing phone or the sound of a ringing phone. <br />
<blockquote>
<span class="Apple-style-span" style="background-color: white; font-family: georgia, 'times new roman', times, serif; font-size: 15px; line-height: 22px;">"In each instance, the results showed activation in <em>both </em>the audio and visual cortices of the subjects’ brains. In other words, when they were exposed to the video, our subjects’ brains didn’t just see the vibrating iPhone, they “heard” it, too; and when they were exposed to the audio, they also “saw” it. This powerful cross-sensory phenomenon is known as synesthesia."</span></blockquote>
Had Mr. Lindstrom bothered to go to Wikipedia, he would know that this effect is <i>not</i> <a href="http://en.wikipedia.org/wiki/Synesthesia">synesthesia</a>. Synesthesia is an inherent, hard-wired "cross connection" in the brain. It's not learned. <br />
<br />
What Mr. Lindstrom is in fact reporting is the very simple result of <a href="http://en.wikipedia.org/wiki/Hebbian_learning">Hebbian learning</a>: "neurons that fire together wire together." Seeing visual areas light up with certain auditory (or tactile) stimulation is a fairly commonplace finding in the brain imaging literature. We often both see the phone light up (or vibrate) and hear it ringing at the same time. Eventually an association is formed within the brain. Mr. Lindstrom would probably see the same thing if he showed his subjects a picture of a baby crying, a doorbell, etc.<br />
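That Hebbian story is simple enough to sketch in a few lines of code. This is a toy model with entirely made-up numbers (not data from any study): pair a "sound" input with a "sight" input often enough, and the connection between them grows until the sound alone drives the visual response.

```python
def hebbian_pairing(n_pairings, eta=0.1):
    """Grow an audio->visual association weight each time sound and
    sight co-occur. (Hypothetical toy model, not anyone's actual data.)"""
    w = 0.0
    for _ in range(n_pairings):
        pre, post = 1.0, 1.0      # the ring and the glowing screen, together
        w += eta * pre * post     # "fire together, wire together"
    return w

w = hebbian_pairing(50)
visual_response = w * 1.0         # sound alone now drives the "visual" unit
print(visual_response)            # grows with every pairing
```

Nothing here is "hard-wired": the cross-modal response is purely a product of repeated co-occurrence, which is exactly why it would show up for doorbells and crying babies too.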
<br />
Finally, there's this last bit of "evidence" (quotes are mine).<br />
<blockquote>
<span class="Apple-style-span" style="background-color: white; font-family: georgia, 'times new roman', times, serif; font-size: 15px; line-height: 22px;">"But most striking of all was the flurry of activation in the insular cortex of the brain, which is associated with feelings of love and compassion. The subjects’ brains responded to the sound of their phones as they would respond to the presence or proximity of a girlfriend, boyfriend or family member."</span></blockquote>
There are a host of other areas that are also associated with "love and compassion" in the brain. There's not one single area that encodes these concepts. As far as I know, there is no definitive conclusion about where the concept of "love" is encoded in the brain.<br />
<br />
So this becomes a guilt-by-association conclusion: <i>brain area A is active when experiencing X and Y, therefore X is the same as Y (or worse, X causes Y).</i> If Mr. Lindstrom had seen the same area of the brain engaged when he presented an image of a fire-truck and an image of a tomato, it wouldn't mean that your brain thinks of the truck as being made of tomatoes (nor that tomatoes are baby fire-trucks). Sadly, this is a fallacy that many established neuroscientists also make. But that's a topic for another post.<br />
<br />
As a professional neuroscientist, my reaction to the findings Mr. Lindstrom presents in his op-ed is "So what?" Nothing he reports provides a shred of evidence that we "love" our iPhones, at least neuroscientifically speaking. Nor does he show that the experience of using your smart phone is the same as falling in love or having a religious experience. All Mr. Lindstrom demonstrated was what can happen when the tools of science are placed in the wrong hands.<br />
<br />
Perhaps all neuroimaging articles should come with the disclaimer: "Performed by trained professionals, do not try this at home."<br />
<br />Unknownnoreply@blogger.com4tag:blogger.com,1999:blog-4582649627549824354.post-63777469916469888582011-09-15T16:18:00.000-07:002011-09-15T16:19:00.076-07:00Cloud computing in the brain<div class="separator" style="clear: both; text-align: center;">
</div>
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="http://1.bp.blogspot.com/-V8rx4Y4LVd8/TnKEdhibVyI/AAAAAAAAGKk/Ym5xNGdkAA0/s1600/cloud_computing_neurons2.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="263" src="http://1.bp.blogspot.com/-V8rx4Y4LVd8/TnKEdhibVyI/AAAAAAAAGKk/Ym5xNGdkAA0/s400/cloud_computing_neurons2.png" width="400" /></a></div>
<br />
I meant to post on this earlier this summer but got distracted... work & life have a way of doing that.<br />
<br />
This past July saw one of my most grueling projects finally come to a close. Published in the Journal of Neuroscience and titled “<a href="http://www.ncbi.nlm.nih.gov/pubmed/21734297">How each movement changes the next: an experimental and theoretical study of fast adaptive priors in reaching</a>," this paper goes well beyond the topic of simple motor control and into fundamental issues about how our brains learn from experience (which is why it took over 2.5 years to get through the review process!). In particular, we show how neurons in the brain might do something akin to cloud computing. <br />
<br />
Confused? Let me explain.<br />
<br />
First, I'll have to cover a little hard science for background. What my <a href="http://keck.ucsf.edu/neurograd/faculty/sabes.html">co-author</a> and I found was that whenever we move, our brain keeps track of where we go. Over time, the motor control regions of the brain build an internal model of the recent movements we've made, and they use this memory when we plan future actions. For example, if I keep reaching for this soda can on my desk (and return it to the same position), then my brain retains the history of all the reaches I've made to that same spot, and each new reach to the soda gets a little more accurate. So in a sense, practice makes perfect. <br />
<br />
Now, here's the kicker... if I suddenly want to reach for something else ("Hey, that cookie over there looks mighty tasty!") then my brain biases this new movement in the direction of the soda can. We don't really notice it that much, but it's detectable with the fancy machines that we use to monitor your movements. The more often I reach for that soda can, the more biased my reach for the cookie gets.<br />
<br />
It turns out that this type of learning is really sophisticated and follows what appears to be rigid statistical principles. What I mean is that our brain somehow encodes our recent actions as a prior probability distribution (think of the "bell curve" you've heard about). It then integrates this prior with all the incoming sensory information you're getting from your eyes, your hand, etc. This integrated information is then used when you make your next reach. The stronger the prior, the more biased your future actions will be. The stronger the sensory input is, the less biased you'll be.<br />
<br />
For the math nerds out there, this is a form of adaptive <a href="http://en.wikipedia.org/wiki/Bayesian_inference">Bayesian inference</a>. I won't go into the awesome details of Bayesian statistics because I don't want to lose 90% of whoever actually reads these posts ("Hi Mom!"). This is a branch of mathematics that's used to filter spam from your inbox, improve images of the stars from telescopes, optimize airline travel, make video games more difficult, and power almost everything else that's cool these days. Needless to say, these are some pretty freaking sophisticated computations that our brains are doing almost effortlessly. And not just for some high-level cognitive process (I mean, this process worked in both Shakespeare's brain and in Pauly D's brain)... but for something as simple as reaching for a can of soda.<br />
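For the curious, here's roughly what that prior-plus-senses integration looks like when everything is Gaussian. This is a generic textbook sketch of precision-weighted cue combination, not the actual model from our paper, and all the means and variances below are made-up illustration numbers.

```python
def integrate(prior_mean, prior_var, sensed, sense_var):
    """Precision-weighted fusion of a Gaussian prior with a noisy observation."""
    k = prior_var / (prior_var + sense_var)      # how much to trust the senses
    post_mean = prior_mean + k * (sensed - prior_mean)
    post_var = prior_var * sense_var / (prior_var + sense_var)
    return post_mean, post_var

# Prior: recent reaches all landed at 0 degrees. New target sits at 30 degrees.
# (All numbers here are hypothetical, chosen only to show the effect.)
tight, _ = integrate(0.0, 4.0, 30.0, 16.0)    # strong prior, noisy vision
loose, _ = integrate(0.0, 100.0, 30.0, 16.0)  # weak prior, same vision
print(tight, loose)  # the strong prior drags the planned reach toward 0
```

The gain `k` captures the whole story: a tight prior (small variance) shrinks `k` and biases the reach toward where you've been reaching, while sharp sensory input (small `sense_var`) grows `k` and lets the new target win.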
<br />
Okay... hopefully I haven't lost anybody. Because here's the truly insane part. <br />
<br />
Through simulations of neural tissue, my co-author and I found that this really cool mathematical computation likely happens through a form of "<a href="http://en.wikipedia.org/wiki/Cloud_computing">cloud computing</a>" in the brain. For those of you who don't know, cloud computing is the process by which a computation is broken down into a set of little chunks and then distributed to a whole bunch of computers that live in the vast ether of the internet (<b>Note</b>: Technically what I'm talking about is more similar to a computer cluster, but "cloud computing" is the hip new thing these days). You know those nasty things called "botnets" that take down servers in foreign countries or send you all that spam in your inbox... they're cloud computing gone bad.<br />
<br />
We found that a similar principle might work in the brain. There might not be a single neuron or group of neurons that store this statistical prior. Instead, we were able to show how this memory can naturally emerge in the dynamics of the information passing <i>between</i> neurons. Thanks to Hebbian learning ("neurons that fire together wire together") our brain is able to store little bits of information distributed across a mass of connected cells. Basically populations of neurons remember their collective pattern of activity from the recent past. Over time, this distributed collective memory shapes the way the network responds to new inputs. Eventually, this learning exhibits very sophisticated properties that look almost exactly like human behavior, as well as the expectations of statistical theory. So the whole is, in fact, greater than the sum of its parts. <br />
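To make "the memory lives between the neurons" concrete, here's a minimal sketch of a distributed memory in a recurrent network. This is a classic Hopfield-style toy with made-up numbers, not our actual 180-neuron simulation, but it shows the same principle: after Hebbian learning, the stored pattern exists only in the weights connecting the units, and the population as a whole can recover it from a corrupted cue.

```python
import numpy as np

def store(patterns):
    """Hebbian outer-product rule: the memory lives in the connections."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)          # units that fire together wire together
    np.fill_diagonal(W, 0)           # no self-connections
    return W / len(patterns)

def recall(W, cue, steps=5):
    """Let the network settle: each unit listens to all the others."""
    state = cue.copy()
    for _ in range(steps):
        state = np.sign(W @ state)
        state[state == 0] = 1.0      # break ties
    return state

rng = np.random.default_rng(42)
pattern = rng.choice([-1.0, 1.0], size=32)   # a "memory" of recent activity
W = store(pattern[None, :])

noisy = pattern.copy()
noisy[:6] *= -1                      # corrupt a few units
recovered = recall(W, noisy)
print(np.array_equal(recovered, pattern))    # → True: the group restores it
```

Notice that no single entry of `W`, and no single unit, "contains" the pattern; scramble any one of them and the collective dynamics still pull the network back to the stored state.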
<br />
Let's put this all together shall we? We've got a group of neurons in your brain that are building a complex statistical model of everything you just recently did. But this model isn't encoded by the activity of any one cell or even in the response properties of a group of cells. <i>Instead this model simply exists in the abstract dynamics of how these neurons talk to one another</i>. Mind blown yet?<br />
<br />
Now to be completely honest this is hardly the first time that someone has come up with the idea that information is broken down and stored across a network of neurons (in fact there's a formal name for this called "<a href="http://en.wikipedia.org/wiki/Sparse_coding">sparse coding</a>"). But what's interesting from our study is that complicated, mathematically principled information can just emerge naturally in the brain thanks to the fact that neurons are recurrently connected (i.e., send information back and forth) and they have associative learning. This dramatically increases the complexity of information that our brain can store. Information isn't just encoded in how the cells fire, but also in how they talk to each other as a group. It's a sort of meta-level type of information storage. <br />
<br />
Let's end with this... our simulation used about 180 neurons (with 32,400 connections) and we were able to do some pretty fancy mathematical processing. The human brain has more neurons than there are stars in the Milky Way. Each of these neurons makes thousands of connections, sometimes quoted as 10,000 or so. A conservative estimate puts it at about 100 billion neurons with about 100 <i>trillion</i> synaptic connections. Think about just how complex a biological computer this system could be!<br />
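The arithmetic, for anyone who wants to check it (the per-neuron figure below is just the conservative assumption from the estimate above, not a measurement from our paper):

```python
# Our simulation: 180 units, fully connected -> 180 * 180 = 32,400 connections.
sim_neurons = 180
sim_connections = sim_neurons ** 2

# The human brain, on the conservative end of published estimates
# (the per-neuron figure is an assumption for illustration):
brain_neurons = 100e9           # ~10^11 neurons
synapses_per_neuron = 1_000     # conservative; often quoted as high as 10,000
brain_connections = brain_neurons * synapses_per_neuron

print(sim_connections)                 # 32400
print(f"{brain_connections:.0e}")      # 1e+14, i.e. ~100 trillion
```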
<br />
<div class="separator" style="clear: both; text-align: center;">
</div>
Yup.... crazy stuff like this is why I do science!Unknownnoreply@blogger.com16tag:blogger.com,1999:blog-4582649627549824354.post-5943355268043821672011-09-07T19:48:00.000-07:002011-09-08T07:58:08.187-07:00If you don't like the message, kill the messengerAs some of you may know, one of my goals in life is to facilitate the role of science in society. Last week I came across a little gem that reminds me of what we, as scientists, are up against as far as how science is discussed outside the lab.<br />
<br />
Now before I start, let me be honest. There have always been, and always will be, those who will instinctively disavow science. For some it's a matter of fear ("That appears to threaten my world view and scares me"). For others it's simple ignorance ("That doesn't easily make sense to me, so I don't believe it"). But for a few it is really more a matter of power ("By minimizing science and its scope, I can act how I want regardless of what evidence there is that I shouldn't.") <br />
<br />
The first two groups I can understand and even empathize with. It's our nature to be cautious and skeptical. Hell, as a scientist I'm trained to be skeptical and to wait for mounting evidence to make the case that I should reject my current beliefs. However, usually those who dislike science out of fear or ignorance can be reasoned with if approached in the right way. Remember that even Pope John Paul II, a man who led an institution that was terrified of the concept of evolution, eventually conceded that it exists.<br />
<br />
It is that third group of science deniers that terrifies me. And that brings me to what started this whole post. Check out this clip of Presidential candidate Rick Santorum speaking to a law school audience last week.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<object class="BLOGGER-youtube-video" classid="clsid:D27CDB6E-AE6D-11cf-96B8-444553540000" codebase="http://download.macromedia.com/pub/shockwave/cabs/flash/swflash.cab#version=6,0,40,0" data-thumbnail-src="http://2.gvt0.com/vi/iQC1L_eN3y4/0.jpg" height="266" width="320"><param name="movie" value="http://www.youtube.com/v/iQC1L_eN3y4&fs=1&source=uds" />
<param name="bgcolor" value="#FFFFFF" />
<embed width="320" height="266" src="http://www.youtube.com/v/iQC1L_eN3y4&fs=1&source=uds" type="application/x-shockwave-flash"></embed></object></div>
<br />
<br />
Let's listen very carefully to what Mr. Santorum says. First, he appeals to authority: "Because I believe what the Catholic church teaches with respect to homosexuality," he therefore should be allowed to hold his beliefs that same-sex couples should not have equal rights in terms of marriage and adoption. Now, if it were just a personal belief, that would be fine. We all have to appeal to authority at some point in order to have baseline assumptions with which to act in the world. But when faced with new evidence that those assumptions are incorrect, a logical person would change the assumption. Mr. Santorum inherently acknowledges this fact when he goes on to say that there is no evidence to refute his "bigoted" beliefs. <br />
<br />
But now listen to what Mr. Santorum says when confronted with two critical pieces of information showing that there is, in fact, evidence to refute his assumptions. First, the fact that the American Psychiatric Association (not the American Psychological Association as stated in the video, although they take the same stance) delisted homosexuality as a mental disorder from the DSM in 1973. This, by the way, was after numerous empirical studies showing that, in all other respects, homosexual individuals had no more presenting symptoms or negative life outcomes than heterosexual individuals. Second, the fact that there is overwhelming peer-reviewed research by behavioral scientists showing that children growing up in same-sex households are no different, in terms of mental health and emotional well-being, than those who grow up with heterosexual parents. <br />
<br />
Mr. Santorum replies "The American Psychological Association is made up of people who agree with the American Psychological Association." He then follows up with this gem, "A lot of psychologists don't belong to the American Psychological Association. A lot of doctors don't belong to the American Medical Association."<br />
<br />
Now keep in mind this is a serious (albeit unlikely) candidate for the GOP nomination. He's a trained lawyer and served in Congress both as a Representative and a Senator. So he should, by all accounts, be a very bright and logical man. Yet his response to just the existence of scientific evidence refuting his beliefs is the same as that taken by the lunatic anti-science fringe:<i> if you don't like the message, attack the messenger</i>. <br />
<br />
As a scientist, I'd be fine if Mr. Santorum argued with the science itself. Every study and every field has its weaknesses. Had he argued that showing a lack of a difference is not the same thing as there not being a difference, he'd actually be correct. In science, we call it "arguing the null." Had he said that the research was still preliminary because the children adopted by homosexual couples are just now reaching adulthood, that would also be a valid and reasoned response. <br />
<br />
But that wasn't Mr. Santorum's reaction. This initial reaction reveals something far deeper than a simple lack of understanding of the issue. It shows a complete and utter lack of respect for the science being discussed. In fact, he turned the appeal to authority on its head. It's okay for Mr. Santorum to hate homosexuality because the Catholic church says so, but it's not okay for scientists in the APA to feel that there's nothing wrong with homosexuality because they're just preaching to the choir. Do you see the Catch-22?<br />
<br />
If Mr. Santorum were the only major politician to act this way toward science, I could let it slide. He'd be annoying but not threatening. Unfortunately, this is the rule these days, not the exception. For many politicians, as well as media personalities and a growing minority of the U.S. population, rejecting a scientific fact <i>outright</i>, simply because scientists say it, is a badge of honor. It's something to brag about. In fact, we live in a world where most of the contenders for a major political party's presidential nomination don't believe in evolution or global climate change, both of which have thousands (that's right... <i>thousands</i>!) of empirical publications supporting them.<br />
<br />
And that brings us back to the motive. As I mentioned before, Mr. Santorum has to be a smart and logical man in order to get as far as he did. The same is true for the other leaders who instinctively attack science as a whole (with the exception of Mrs. Bachmann, who's a whole other kind of crazy). I'm afraid their attack comes from a need for power. "I can ignore the research on homosexuality and evolution because I want to court the religious vote." "I can ignore the evidence of climate change because I want to court the industries it will affect." <br />
<br />
Unfortunately, this is an endgame move in the debate. There's nothing you as a scientist can say or do that will change their beliefs, because their ignorance is a source of power. Arguing with those who take this stance is like arguing with a three-year-old holding his fingers in his ears... you just have to wait until they grow up so you can talk to them like an adult.<br />
<br />
<b>Update:</b> As if on cue, last night's GOP primary debate featured Rick Perry demonstrating my point perfectly. "Just because you have a group of scientists standing up and saying 'Here are the facts'... Galileo got outvoted for a spell." The important point being distorted by Mr. Perry is that Galileo was not outvoted by his scientific peers... he was outvoted by the Catholic Church. His "natural philosophy" peers respected and built off of his work. In fact, by taking such a stance against the empirical findings, Mr. Perry is putting himself in the same boat as those who "outvoted" Galileo.<br />
<br />
<div style="text-align: center;">
<object classid="clsid:d27cdb6e-ae6d-11cf-96b8-444553540000" height="345" id="FiveminPlayer" width="560">
<param name='allowfullscreen' value='true'/>
<param name='allowScriptAccess' value='always'/>
<param name='movie' value='http://embed.5min.com/517157087/'/>
<param name='wmode' value='opaque' />
<embed name='FiveminPlayer' src='http://embed.5min.com/517157087/' type='application/x-shockwave-flash' width='560' height='345' allowfullscreen='true' allowScriptAccess='always' wmode='opaque'>
</embed>
</object>
</div>
<br />
<br />Unknownnoreply@blogger.com1tag:blogger.com,1999:blog-4582649627549824354.post-66017761636793654302011-07-02T08:09:00.000-07:002011-07-02T09:16:35.873-07:00Virus-guided laser neurons and the need for creative explorationLike all meetings, this year's Human Brain Mapping conference was packed with the ubiquitous blend of brilliance and social awkwardness that permeates nearly every neuroscience meeting. That's why I like science. <br /><br />Real geeks, unlike chic geeks, embrace the fact that they live 90% of the time in their heads, trying to frame the world that they see into the science that they know. Either trying to understand how something works or what would happen if two disparate things were suddenly fused together to make something new.<br /><br />Most of us are constantly living in our own heads. That's why the "absent-minded professor" isn't so much a stereotype as a valid descriptor of our behavior. <br /><br />But it is this mental wandering that makes good scientists great. Daydreaming has its creative advantages. Those "what if?" or "how did that work?" scenarios that constantly bounce around within our cranium are the key to pushing science forward.<br /><br />Case in point, the story behind "virus-guided laser neurons" (also known as "<a href="http://en.wikipedia.org/wiki/Optogenetics">optogenetics</a>"). For those of you who have never heard of optogenetics, it's the mind-blowing technology by which viruses are used to insert genetic material from algae or bacteria into the neurons of living mammals. This genetic clipping changes the properties of neurons so that they will either fire or not fire in response to light pulses of particular wavelengths.<br /><br />You heard that right. Viruses change neurons so that they can be controlled by lasers... in living animals. 
I'll let you think about that one for a minute.<br /><br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="http://1.bp.blogspot.com/-YhDQn4XjG8M/Tg85Lp7xInI/AAAAAAAAGIM/Kp3sgcX9_HY/s1600/mouse_x220.jpg"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 198px; height: 200px;" src="http://1.bp.blogspot.com/-YhDQn4XjG8M/Tg85Lp7xInI/AAAAAAAAGIM/Kp3sgcX9_HY/s200/mouse_x220.jpg" border="0" alt="" id="BLOGGER_PHOTO_ID_5624777332006920818" /></a><br /><br />This technology was developed by <a href="http://www.stanford.edu/group/dlab/">Karl Deisseroth</a> and his lab at Stanford around 2004. In just a short time, this tech has revolutionized neuroscience and led many to speculate (privately and openly) that the Nobel is waiting in the future. <br /><br />Dr. Deisseroth was featured as a keynote speaker last week, and while the whole lecture was absolutely fascinating, something he said at the beginning caught my attention. Deisseroth mentioned that when his lab started working on the idea of optogenetics, they were using funds from a grant from the National Institute of Mental Health that had proposed to do something VERY different. But his lab wanted to try this new venture, and they took the risk and it paid dividends. <br /><br />"It's a great example of the need for creative exploration," Deisseroth said. <br /><br />And therein lies one of the many problems facing today's researchers. Funding agencies like the National Institutes of Health (NIH) are taking fewer and fewer risks in science. They're hedging their bets on what seems to be the safest horse and in the process stifling most of the creative processes that lead to developments like optogenetics. 
As the government cuts more funding to science and education, what's left in the pot gets distributed mainly to the established laboratories that are building on established ideas.<br /><br />Ask anyone who's written an NIH grant recently about the process. Go ahead, buy them a beer. They'll go on at length about how you need to have already finished half the experiments that you "propose" to do in the grant application. This is before it's even considered "fundable." Indeed, the average age at which researchers get their <span style="font-style:italic;">first</span> full NIH grant is now <a href="http://www.genomeweb.com/getting-your-first-grant">43 years old!</a> Consider that most professors start their first position in their mid-to-late 30s.<br /><br />As a result, most scientists in my field spend the vast majority of their time designing and performing research projects that they don't really want to do. They write grants to tell the NIH what it wants to hear. They'll write grants creatively dancing around the projects that are of interest to them, so that they can get the funds to do the real science. <br /><br />I've even had one established scientist (with decades of experience in neuroscience) tell me that he always writes grants knowing that he'll only do 50% of what he proposes. He'll then spend the other 50% of funds on projects he really wants to do. <br /><br />This problem is compounded when you consider that, in neuroscience, the current funding rates for grants at the NIH range from ~2.5-16.0%, depending on the agency (down from a stable ~20-25% in the 1990s). That's a lot of highly intelligent brain power being wasted on useless dead ends that could be spent creatively exploring something really interesting. <br /><br />Unfortunately, that's the dirty secret of the world in which we do science in this country. 
Unless it changes, the US may stop being the place where people can turn crazy ideas like making virus-guided laser-neurons into a reality.<br /><br />In the meantime, I'm going to keep on daydreaming my weird science ideas.<br /><br />(Image taken from <a href="http://www.technologyreview.com/biomedicine/23767/">Technology Review</a>)Unknownnoreply@blogger.com0