From: http://www.livescience.com/28505-map-the-brain.html



As a biomedical research scientist I am concerned about President Obama's broad new research initiative "to map the human brain."

The Brain Activity Map is a very ambitious, and perhaps even noble, effort, and I am most definitely not against imaging or nanotechnology as tools for research. But without specific goals, hypotheses or endpoints, the research effort becomes a fishing expedition. That is, if we throw enough technology at the project and get enough people involved, something is sure to come of it — maybe.

I am also not against Big Science projects, if they are based on viable precepts. However, I do think we need to have a more thoughtful discussion of the immediate and long-term issues, with a wider range of participants and perspectives, and some attention to alternatives and priorities, before we dedicate increasingly limited, long-term public funding to such an effort — starting with $100 million per year and a proposed rise to at least $300 million per year for at least 15 years.

Senior scientists in the president's administration have compared the brain-mapping initiative to the Human Genome Project, but in a recent New York Times article, John Markoff and James Gorman rightly pointed out that, "It is different, however, in that it has, as yet, no clearly defined goals or endpoint." In a subsequent interview with Jon Hamilton on National Public Radio, the director of the National Institutes of Health, Francis Collins, made the same point.

In an article last month, also in the New York Times, Tim Requarth pointed out: "Other critics say the project is too open-ended — that it makes little sense without clearly defined criteria for success. 'It's not like the Human Genome Project, where you just have to read out a few billion base pairs and you're done,' said Peter Dayan, a neuroscientist at University College London. 'For the human brain, what would you need to know to build a simulation? That's a huge research question, and it has to do with what's important to know about the brain.'"

Every scientist (including me) would love to be able to get a grant without having to specify any goals, hypotheses or endpoints, but is this a realistic way to do science?

Why is this mapping initiative more important than other possible initiatives? Is it more important than finding a cure for AIDS? More relevant than beating cancer in all its manifestations? Although the notion of mapping everything going on in the brain has curb appeal, such an open-ended endeavor calls for at least some solid evidence that it is likely to produce substantive changes in disease outcome, understanding of diseases and better public health for the nation.
Consciousness raising for neuroscience

A deep problem hampering this discussion is the near-universal lack of awareness about the limited, historically determined, very probably transient character of our prevailing assumptions about the relationship between gray matter and brain function.

The attraction of brain mapping owes much to an obsolete scientific paradigm. Attempts to map and parcellate the human (and animal) brain into morphologically and anatomically distinct areas, each with its specific function, have been around for more than a century. In the early 1800s it became scientifically fashionable to discover and "map" the functions of the brain using whatever methods and techniques were available at any given moment. This was called phrenology, and its mapping paradigm became a major focus of the neurological disciplines, leading to the doctrine of cerebral localization of functions. The phrenological trend continues to the present; its ever more sophisticated technologies mask what some of us consider an obsolescent concept (the article by Cold Spring Harbor Laboratory professor Partha Mitra in Scientific American presents a good example).

Mapping the brain with modern technology is a direct extension of that same paradigm. The paramount question here is not about the technology per se, but whether what it represents and what it measures is an accurate reflection of what we want to know about how the brain works. Given what we've learned so far, we have to ask whether the concept is valid or whether we are calling for a lot of effort and spending based on an outmoded paradigm.

Is mapping a valid concept?

Although it is well established that synaptic connections in the brain are in a state of constant change, we cannot seem to get out from under the idea that brain activity has some kind of shape — a geography that lines up with function. The brain does not sleep, and nothing ever gets turned off in the brains of living creatures. The map of what connects to what must always be changing; any one instant of imaging will represent just that instant and perhaps nothing more. A map of how "billions, if not trillions of nerve cells interact" also has to account for the role of the billions upon billions of support cells, called glia, that likewise make up the brain — yet no one associated with the mapping initiative seems to be asking what these critical cells contribute to normal and abnormal functions of the brain. Dynamics that are always in flux are not going to be characterized by temporally static or even dynamic measures, no matter how technologically sophisticated those measures may be.

"Maps" are, at absolute best, only limited approximations of the constantly intense dynamics of brain activity, structure and function. The neuroscience community cannot agree as to what it is, exactly, that should be mapped. Molecular changes? Genomic changes? Proteins? Structural changes? Electrical? Biochemical? All of those "events" involve vast numbers of signaling pathways, each of which affects the others in a vibrant, ever-changing cascade. And this doesn't even begin to address how environmental and behavioral feedback loops affect these mechanisms.

In the present state of neuroscience, there is no consensus about the best approach to mapping or about which approaches should be given the highest priority. And as Mitra notes, even if we could map the action potentials of every single neuron in the mammalian brain, how do we make the jump from those measurements to the complex behavior that emerges from them? When, and for how long, would recording have to be done to generate that information? This is no small issue. Others have also expressed concern that current imaging technologies have often been incorrectly applied, leading to the wrong conclusions about how the brain is "wired" and how it functions in a dynamic state.

How should we proceed?

Before we try to map brains (even brains of worms and fruit flies and mice), we need to work out better concepts of what needs to be measured, and then apply the appropriate technologies to measure it. As it now stands, we have high-level technology with no clear concept of what to measure and no defined goals or endpoints. Does the project simply go on forever? When will we know that we have the answers? I agree with others that despite the rhetoric of administration spokespersons and those who will benefit directly, this is not at all like the genome or moon landing projects.

In my own field of specialization, traumatic brain injury and stroke, we know that even humans with massive damage to the brain can make remarkable recoveries of function — under the right conditions — sometimes almost instantly. The problem we face is how to unlock those conditions. Brain maps cannot account for this extensive plasticity and repair at all, any more than most diseases can be attributed to the regulation and expression of just one gene — as most systems biologists will tell you if given the chance.

What practical outcomes do we expect?

Some have argued that investing in the mapping project will generate new jobs and wealth, and this could happen. The Human Genome Project is generating considerable wealth and biomedical startup companies (for example, screening genomes for individual clients) — certainly more than the initial dollars invested. However, the actual benefit to patients has so far been very limited. We now know a lot about the human genome map, but how many diseases have been cured?

New York Times reporter Gina Kolata, recently reporting on DNA testing for rare disorders, noted that sequencing the entire genome of patients with rare diseases has become so popular that costs are now down to between $7,000 and $9,000 for a family, and demand is soaring — hence the commercial value of such tests. Yet all the sequencing offers no panacea, she says: "Genetic aberrations are only found in about 25 percent of cases, less than 3 percent get better management of their disease and only about 1 percent get an actual treatment and a major benefit."

With the Brain Mapping Initiative, are we about to make a very heavy investment in a project that promises no endpoints and nothing specific in the way of actual benefit? If so, we ought to be clear about it and not let the public think that "miraculous cures" and full understanding of brain functions are just around the corner.

We need to talk

I urge a broader and more considered discussion of how we want to invest our research resources. I marvel that a small group of scientists was able to catch the president's attention and support, but is this kind of earmarking in lieu of rigorous peer review the way we want to decide allocations for research? We hate it when Congress does this (if we're not the beneficiaries), so do we want to adopt the same model? These questions should all be a part of the debate.

Whether I agree with the paradigm or not, I most certainly support those who still want to continue research on brain mapping. But we need to look again at whether it merits the disproportionate investment and prestige proposed for it, especially now, in a time of severe, perhaps permanent curtailments in biomedical research funding. This is not about big science or small science and this is not just about the $100 million kick-start — the stakes and costs will be much higher. This is about good science and bad science, or at best, not-so-good science. In the current zero-sum game of funding, many other areas of critical biomedical research, including hundreds of small, or smaller, projects with potential for important near-term clinical application, will suffer as the money goes elsewhere and as students and researchers flock to where the money is. Is this good for biomedical research? Are we sure?



 
January 4, 2013

Recently a colleague asked me to contribute suggestions for a strategic plan being developed for one of the institutes of the NIH. The idea was to come up with bold new initiatives that could set the institute off in some new directions. The NIH has been facing harsh criticism lately from policy-makers, Congress and the biomedical scientific community, all concerned that the agency is losing its way. A growing number of researchers are beginning to feel that trying to move from basic to translational research is like crossing the “Valley of Death”: any new idea or approach is immediately quashed because the institutes and their scientific advisors and reviewers are too timid to try anything novel or risky.

Sharon Begley, a former science editor and senior science writer for Newsweek (now senior health and science correspondent at Reuters), argues that the valley of death is especially perilous for biomedical researchers. She reports that "from 1996 to 1999, the U.S. Food and Drug Administration approved 157 new drugs. In the comparable period a decade later—that is, from 2006 to 2009—the agency approved 74. Not among them were any cures, or even meaningfully effective treatments, for Alzheimer’s disease, lung or pancreatic cancer, Parkinson’s disease, Huntington’s disease, or a host of other afflictions that destroy lives" [1]. Begley claimed that

the chance of FDA approval for a newly discovered molecule, targeting a newly discovered disease mechanism, is a dismal 0.6 percent. Diseases are complicated, and nature fights every human attempt to mess with what she has wrought. But frustration is growing with how few seemingly promising discoveries in basic biomedical science lead to something that helps patients, especially in what is supposed to be a golden age of genetics, neuroscience, and biomedical research in general [2].

What’s going on here? Is Big Pharma the culprit? The NIH? Or what?

·   We have met the enemy and he is us

A widely noticed recent article in Nature [3] accuses the NIH of “promoting mediocrity” and “small science” rather than supporting novel and courageous studies with low probability of success but high potential. According to Nicholson and Ioannidis, between 2002 and 2011 the NIH issued around 460,000 research grants totaling almost $200 billion. Yet when the NIH Director, Francis Collins, proposed a new National Center for Advancing Translational Sciences (NCATS), there was a tremendous hue and cry from many biomedical scientists that this would divert critical funds from basic research—despite the fact that NIH support of basic science over the last 50 years has led to few discoveries with direct benefit to patients.

I’m coming to believe that the NIH and other Federal funding agencies are not at the heart of the problem. It’s us—the scientists. On both internal and national study section reviews of research proposals, I’ve observed a growing tendency of peer reviewers to focus on the small and petty rather than on the larger issues of whether the proposed work is creative and innovative. Lip service is often paid to these last two factors, followed by a feeding frenzy to tear apart the work and find reasons to reject, often based on technical “flaws” that could easily be corrected by a phone call to the applicant asking for clarification. All it seems to take to start this cycle is one person raising a criticism—rarely does anyone challenge the critic.

It has been my impression that the younger and less experienced reviewers are often the harshest and most unsympathetic to their peers. It’s as if they have to make their bones by shredding their colleagues, especially when, God forbid, anyone has the temerity to propose solving a scientific problem with a paradigm that differs from the reviewer’s own ideas. It may be understandable that in the face of very limited funding one has to be more selective about what gets funded, but this adversarial approach to reviewing research proposals has gone too far and is eroding the morale and productivity of the scientific community as more and more young people are turning away from research careers in disgust and frustration.

I can remember a time when Study Sections were actually very collegial, and not always on the attack. They often gave investigators the benefit of the doubt and suggested specific modifications that the PI could agree to do. Certainly, the awards were still very competitive, but for the most part PIs felt that everyone was part of the same scientific community.  

The prevailing adversarial trend is exacerbated by the continuing lack of vision in the strategic planning of the research universities—which continue to think that they can count on external grants and indirect cost returns to fund their research mission, all the while putting tremendous pressure on investigators to “go where the money is” and find the funding to keep their careers alive. The mission is no longer about doing the research but about attracting research funding at any price. We all know this, but either we won’t openly admit it or we are in denial about the reality, because we haven’t planned what we’re going to do with all those research buildings, staff and faculty we’ve just thrown up all over the campus in the forlorn hope that the Feds will pick up the tab.

So where does all this leave us, especially as we face the prospect of ever-diminishing NIH and other Federal support for research? Where do we go from here? What kind of mess are we leaving for our next generation of students and biomedical researchers?

·   How can institutions foster out-of-the-box thinking?

As I looked over the strategic plan and the White Papers from various “consensus” committees, they all seemed very déjà vu and plus ça change. I think this was apparent in the publicly available materials. It was my impression that the paradigms behind these ideas continued to support the belief that if we can just map, record and classify everything, good things will emerge and somehow we will find answers. This idea is based on 18th and 19th century taxonomy with more modern tools and techniques. As famously expressed by Einstein and Frank Zappa, "information is not knowledge," no matter how much of it accumulates. Where are the "Bold Ideas" and themes that emerge from all the work, planning and White Papers of the consensus committees? Here is the website—you can judge for yourselves: http://www.nih.gov/news/health/dec2012/od-07.htm

Over 40 years ago Thomas Kuhn theorized that as any given area of science approaches maturity, its paradigms become well established and are taught to the rising generation of new scientists. This stage of development—“normal science”—“often suppresses fundamental novelties because they are necessarily subversive of its basic commitments” [4]. Thus most researchers throughout their careers are engaged in “mopping-up operations”:

The enterprise then seems an attempt to force nature into the preformed and relatively inflexible box that the paradigm supplies. No part of the aim of normal science is to call forth new sorts of phenomena; indeed those that will not fit the box are not seen at all. Nor do scientists aim to invent new theories, and they are often intolerant of those invented by others. (p. 24)

Long before Kuhn, Leo Tolstoy made the same point:

I know that most men, including those at ease with problems of the greatest complexity, can seldom accept even the simplest and most obvious truth if it be such as would oblige them to admit the falsity of conclusions which they have delighted in explaining to colleagues, which they have proudly taught to others, and which they have woven, thread by thread, into the fabric of their lives. [5]

If Kuhn and Tolstoy are right, it’s not surprising that we are now facing some very serious problems. In biomedical research, when you go to a major professional meeting like the Society for Neuroscience annual meeting, where some 15,000 posters are presented without any prior review, how much of that tremendous body of work is going to be anything more than small, “normal” science? Of the 20 million scientific papers published over the last dozen years [6], how many ever get cited? Who even reads them? Sooner or later we have to ask whether this is a good use of increasingly limited financial support for scientific research.

With this in mind, I looked over the priority items on the NIH list and selected the few that I thought most interesting, added my comments (below) and sent them to my friend.  

•   Develop more personalized, patient-centered medicine

This is a bold new idea? My generation grew up thinking that all medicine was personalized! This “novel” concept implies that we will go back in time to when physicians actually knew their patients and their history, and apply integrative diagnostic tools that take into consideration the patient as a person rather than a collection of diverse organs, each attended to by its own expert. This paradigm is now getting some attention at my institution. Out of the hundreds and hundreds of physicians at Emory, I am pleased to report that two young doctors have been assigned to the new “Center for Personalized Medicine.” I was even more pleased to learn that a third will soon be added. This is certainly a step in the right direction. You can check it out at http://www.emoryhealthcare.org/patient-centered-primary-care-clinic/providers.html.

Maybe this is the cyborg wave of the future, but this integrated care is not data- or computer-based; it’s more like a marketing ploy than full-blown personalized medicine.

As practiced by genuinely innovative institutions, personalized medicine is described by Leroy Hood, founder of the Institute for Systems Biology, in this press release:

The vision of P4 [Predictive, Preventive, Personalized and Participatory] medicine is that each patient will be surrounded by a virtual cloud of billions of data points. Advances in science and technology will reduce this enormous data dimensionality to simple hypotheses about human health and disease. The ultimate outcome is to create individualized patient disease models that are predictive and actionable. The shift to P4 Medicine will also require societal changes. [7]

Where does the physician fit into this scheme? It sounds as if all we need to figure out what’s wrong with us is a computer with lots of memory (for the billions of data points) and maybe a new, subcutaneous USB connection into our bodies, and everything will be revealed via an immediate printout we take to the pharmacy for our meds. Robotic surgery can no doubt handle the rest—as long as the patient stays connected to the computer terminal.

Still, given the over-specialization of just about everything in academia and medicine, looking at the complexities of diseases and how they affect the organism as a complex system is very much a step in the right direction. Here's where all those tools and P4 tests could be used to generate a comprehensive, computerized avatar of the patient as he or she moves through the developmental spectrum from infancy to old age. Maybe the patient-centered care approach will also require new thinking along the lines now used at the Mayo and Cleveland Clinics—a team approach that is likely to be fought tooth and nail by health care systems whose primary concern is fee-for-service medicine. This paradigm shift is clearly something the National Institutes of Health should urgently discuss and make part of its policies in planning and supporting its research mission. Given the dissociative and often unsympathetic way patients are treated these days, promoting this aim would be a really BOLD step for the NIH.

•   Invest more in systems biology research

I like this idea because to me this means moving away from the paradigm of mechanistic reductionism where very complex events are reduced to the level of a single gene, protein, receptor or organ. Such reductionism as applied to behavioral and neuroscience research has its grounding in 19th century biology and phrenology. I believe it’s time to escape this outdated paradigm. It would be a VERY BOLD step for the NIH to devote more support to holistic approaches to the study of human development, diseases and behaviors. And I’m not talking about alternative medicine here, but rather about giving more attention to a different paradigm of holistic thinking about the causes and evolution of disease, brain function and other systems. Some of these initiatives will obviously include the reconsideration of all the other "omics," along with environmental and epigenetic changes and how the changes affect disease and the course of treatment.

•   Change the nature and mission of the Study Sections

To move forward in scientific research and education, we have to come to grips with the unremitting silo-ization and micronization of research. This could mean changing the make-up and mission of the various NIH study sections AND providing more instruction and training about the specific mission to individual study section members. This change in the status quo will take a lot of courage on the part of NIH staff and the Director, but if you don't do this the NIH will continue to be seen by many as a "valley of death" for major breakthroughs in biomedical research. How many members of study sections even know what the missions of the different NIH institutes are? 

The focus on perpetuating the same small science driven by the latest gadgets is going to continue to restrict scientific advancement and the delivery of improved health care to the taxpayers footing the bill for research. We’re crippling ourselves with what we are doing. Working to change this situation—the culture in which study section priorities are set and grants are reviewed—could be THE BOLDEST step of all for the NIH.

•   More focus on plasticity

As a part of the effort to move to a systems biology paradigm, at least for the neurosciences (where I work), I think we need to precisely define what is meant by "plasticity" and how it can be applied to the study of health and disease. We have to get more serious (and scholarly) about this. What we have now, especially in the neurosciences, is nothing more than explanation by naming. There are currently 54,631 papers on PubMed with “plasticity” in their titles. Can we take it for granted that we all just know intuitively what the term means? Does anyone even care what it means as long as it continues to generate grant funding? I looked at the white paper on this topic and it’s still as unclear as ever. Given how it is used in the documents and in most published papers—what's NOT an example of plasticity?

I’ve discussed this question in a recent article because I think it has important consequences for teaching about how the brain works and for research resource allocations [8]. If we’re going to continue to allocate resources to this field, it would be at least a moderately bold step to actually ask those doing the work to come up with precise definitions of plasticity and what it means at the different levels of analyses (molecular, anatomical, physiological and behavioral). We've been batting the term around for a long time—but what has actually come of it that we didn't know before someone invented the term as a catch-all phrase? Even the new DSM-5 does better than this.

•   Re-evaluate how clinical trials are done

It would be a very big bold step for the NIH to come to grips with the question of whether the randomized, double-blind, controlled trial is the best use of public funding of clinical trials. Should we be using newer, adaptive design approaches which permit more fluidity in trial design and hypotheses, taking immediate advantage of the data as it unfolds rather than staying locked into a design until it self-destructs? Is this a better strategy for the patients in the trials and those to be enrolled in the trials? There is much debate about this but a BOLD step would be to address the question head-on and for the NIH to take the lead in the debate and planning of future trials.
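
To make concrete what "taking immediate advantage of the data as it unfolds" can look like, here is a minimal, purely illustrative sketch of one common flavor of adaptive design (Bayesian response-adaptive randomization via Thompson sampling). The response rates, patient count and arm names are all invented for illustration; this is a toy model, not a proposal for any actual trial.

```python
import random

# Toy sketch of response-adaptive randomization (Thompson sampling).
# All numbers are invented; real adaptive trials involve far more
# machinery (interim analyses, stopping rules, regulatory review).

def simulate_adaptive_trial(p_control=0.30, p_treatment=0.45, n_patients=200):
    true_p = {"control": p_control, "treatment": p_treatment}
    successes = {arm: 0 for arm in true_p}
    failures = {arm: 0 for arm in true_p}
    allocations = {arm: 0 for arm in true_p}

    for _ in range(n_patients):
        # Draw a plausible response rate for each arm from its Beta posterior,
        # then assign the next patient to the arm with the winning draw.
        draws = {arm: random.betavariate(1 + successes[arm], 1 + failures[arm])
                 for arm in true_p}
        arm = max(draws, key=draws.get)
        allocations[arm] += 1
        if random.random() < true_p[arm]:
            successes[arm] += 1
        else:
            failures[arm] += 1
    return allocations, successes

random.seed(1)
allocations, successes = simulate_adaptive_trial()
print(allocations)  # allocation drifts toward the better-performing arm
```

The point of the sketch is the contrast with a fixed design: instead of locking in a 50/50 allocation until the trial "self-destructs," the accumulating data continuously shift new patients toward the arm that appears to be working.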

•   Direct resources into making a scientific career more attractive to young people

Here are some very recent NIH remarks on helping young investigators remain in the field:

This is the key to the future of ALL the institutes and to the use of public funds to support education and research in the years ahead. I see so many signs that we are falling behind (and that paper from Nature confirms this) because we are simply not getting the kinds of students we need—students who want to pursue a career in academic research. It’s time to stop just talking about the issue and to initiate specific programs (both financial and educational) that will bring young scientists into the field, provide decent salaries, keep them engaged in their education, and provide some stability in the funding process to carry them through their early years.

·   The average age of a first NIH grant recipient is 44                  

What is the point of cranking out PhDs if we can’t support their work in any reasonable way? This is why many students I talk to have no desire at all to pursue a university research career. This is totally nuts … we all know it and we keep perpetuating the system that leads to this. Again, I have to come back to the problem of how to overcome timidity about changing the status quo (the mentality of “if it was good enough for me in my training, it should be good enough for them as well”). The community has to work harder to build those bridges over the valley of death if we are to succeed down the road.

Having been around the sciences for more than 50 years, this is what I see going on. Over the past two decades we seem to have moved from a much better place to where we are now. If there are better alternatives, we ought to be discussing them in open forums and coming up with viable priorities instead of just writing form letters to our Congressional delegations telling them how important and wonderful we are and why we need support more than anyone else with claims on the Federal budget. If we do write, at the very least we ought to tell them the truth.

REFERENCES

1. Sharon Begley. "Desperately Seeking Cures." Newsweek, May 15, 2010. http://www.thedailybeast.com/newsweek/2010/05/15/desperately-seeking-cures.html

2. Begley, op. cit.

3. Nicholson, JM and Ioannidis, JPA. "Conform and be funded." Nature, Vol. 492, 6 December 2012, pp. 34-36. http://www.nature.com/articles/492034a

See also Sharon Begley's commentary: http://www.reuters.com/article/2012/12/05/us-science-nih-innovation-idUSBRE8B412J20121205

4. Thomas Kuhn. The Structure of Scientific Revolutions. University of Chicago Press, 1962, p. 5.



5. Leo Tolstoy. What Is Art?, Chapter 14. Trans. Aylmer Maude. Oxford University Press, 1930. <http://www.quotes.net/quote/36375>

6. Nicholson and Ioannidis, p. 35.

7. Emory Health Now, “Personalized Medicine Day in Georgia,” September 9, 2011. http://www.emoryhealthsciblog.com/?p=5085

8. Stein, DG. "Concepts of central nervous system plasticity and repair and their implications for recovery after brain damage." In Zasler, ND, Katz, DI and Zafonte, RD (eds), Brain Injury Medicine: Principles and Practice, Chapter 13, p. 162. New York: Demos Publishing, 2012.

 
COMMENTARY ON: Issues and Perspectives: Can NIH Renovate the Biomedical Workforce?
FROM: sciencecareers.sciencemag.org [1]
A MODEST PROPOSAL

This recent article in Science summarizes an NIH Advisory Committee draft report [2] calling for the diversion of funding from research grants to training grants for graduate students and post-docs in the biomedical sciences. The proposal needs to be taken seriously because it represents a paradigmatic shift in science education and training policy. It could open the way to making science a profession rather than a technical craft for current and future generations of researchers. The report also recommends better pay and benefits for post-docs and staff scientists—resulting, in all likelihood, in fewer but better-paid positions.

Some members of the Advisory group argued that the current state of affairs is “dysfunctional and unsustainable” and that most graduate students do not receive the training they need to work in other roles in industry, government, policy and science writing. I agree. Nor, I would add, do they get adequate training to teach science in primary or secondary schools, community colleges and four-year colleges—the very places where early instruction in the sciences is so critical for the future of the United States in technology, biomedical research and a host of other areas of strategic interest to this nation. 
PUSHBACK

Faculty and administrators in research universities all know this to be true, so why do we need a special advisory panel to tell us what is so clear to all? Because the status quo serves senior administrators, through increased indirect cost returns on awarded grants, and it serves those faculty who want to sustain and advance their careers by using graduate students, post-docs and non-tenure-track staff scientists as cheap-labor laboratory drones. The pushback has already started, as Price reports, with establishment scientists from within and outside the advisory committee weighing in on behalf of the status quo.

As Price notes, this situation has been going on for a very long time. Keeping students and post-docs narrowly focused on their “mentor’s” research can lock them into student/fellow/staff scientist positions for 7-11 years or more, at low cost in salary and benefits. Putting more students on training grants instead of research grants could mean that some serious attention in graduate programs will have to be given to actually training students beyond the confines of an individual’s laboratory. This could mean a real shift in effort and a distraction from grant writing and getting out those publications we need for promotion and tenure (and more grants) (and more grants) (and still more grants).

Given these conditions, is it any surprise that so many laboratories are filled with students and post-docs from overseas, mostly from China and India? We all know this, but we turn a blind eye to the reality of the situation. We rejoice in our openness to the global science community and congratulate ourselves on attracting cheap, highly skilled labor, while we ignore the crisis represented by our failure to maintain the human capital needed to advance our larger national science agenda.
DEJA VU

When I came to Emory as Dean of the Graduate School, my staff and I proposed a then-new program to train graduate students in all fields across the university to be more effective teachers and communicators. The students would get stipends to take required training, which would then allow them to participate in the teaching of at least one course as they advanced through their graduate curriculum. The students were enthusiastic about the program, and the President, Board of Trustees and department chairs all approved it. As a new dean at the time, I was invited to lunch by several department chairs in the biomedical sciences to discuss the teaching-training program. After a few pleasantries, I was told that if I expected the faculty in the departments to honor what they had agreed to do, they, the Chairs, would “make my life a living hell.” They argued that even this limited training opportunity would detract from the students’ research and impair the productivity of the mentor’s laboratory and his or her ability to get grants.

This happened over 15 years ago. Yet, as Shirley Tilghman, President of Princeton University and chair of the Advisory Committee, noted, the issue of how to properly train and mentor graduate students and post-docs is still with us today.


YOU GET WHAT YOU PAY FOR


In fairness to my faculty colleagues, I realize that the issue is complex and compounded by a host of other unpleasant factors that are not of their making. The problems we face are pervasive. The May 9th issue of The Scientist [3] has an interesting “Opinion” article by Fred Southwick, a professor of medicine and Chief of Infectious Diseases at the University of Florida: “Academia Suppresses Creativity—By discouraging change universities are stunting scientific innovation, leadership and growth.” Southwick argues that “in the academic world—where much of today’s scientific innovation takes place—researchers are encouraged to maintain the status quo and not ‘rock the boat.’ This mentality is pervasive, affecting all aspects of scientific research from idea generation to funding to the training of the next generation of scientists.”

Given the tremendous pressure for scientists to obtain external grant funding, it is hardly surprising that anything that would rock the boat—like giving students a better-rounded educational experience or teaching opportunities, and helping them to learn real-world job survival skills (other than grant writing)—is considered by some to be a waste of time. Southwick also argues that grant pressures inhibit creative, high-risk, but potentially high-yield research. 


PART OF THE PROBLEM: STUDY SECTIONS AND THEIR LITTLE WAYS

I've lost count of the number of times younger faculty have told me they feel tremendous pressure “to go where the money is” rather than pursue novel ideas or follow their creative urges. There is an overwhelming emphasis on teaching and testing technical skills rather than critical and conceptual thinking that could lead to a lifetime of interesting research. The NIH is aware of this problem, but program officers are often overruled by study section members who simply cannot get beyond technical nitpicking to look at the broader issues of where their respective fields need to go. We end up practicing “safe science” to make sure we get that next round of grant funding.

It’s not surprising that the NIH has been called the “valley of death” when it comes to the development of new ideas [4]. 


THE UNIVERSITY OF OZYMANDIAS


Clearly, the pressure on the faculty to get grants relegates the teaching and training mission to the lowest possible priority because there is little, if any, reward for the faculty to put time into this very important endeavor. How many medical school faculty do we know who got tenure or support because they are excellent teachers and mentors?

Every time we turn around at Emory we see trees going down and new “research” buildings going up, but what about university support for faculty and student development? Where is the moral commitment on the part of the administration and department chairs to make the university whole? Our institution is closing more than half a dozen departments in the college of liberal arts and freezing staff and faculty salaries across the campus, while at the same time acquiring new hospitals and constructing many new research facilities, all presumably to be paid for by indirect cost recovery from Federal grants. Given the state of the US economy, some would argue that this is building a house of cards on a bed of quicksand. I'm just saying.

Fifteen years ago graduate students could count on at least five years of stipend support from the Graduate School. When they got training grants or other support, the departments were allowed to “bank” most of the released stipend funds to support their graduate programs and recruit top students. When was the last time that kind of investment in “human resources” happened anywhere? As Southwick points out, “universities can take a clue from the business world, which has (finally) realized that investment in human resources and employee development—not brick-and-mortar structures—creates successful, competitive enterprises.”

I’ve said this before [5] and I’ll say it again: the concept of the University is not all about seven-figure salaries for administrators or throwing up new buildings in the hope of generating enough indirects to cover costs, or hiring a few stars who manage to pull in five or more NIH grants to help with this, or using the latest big ticket gadgets to generate another parametric study. It’s about the generation and imparting of new knowledge, and teaching students how to generate new ideas, create hypotheses, solve problems and develop a deep conceptual and theoretical understanding of their fields.


I HAVE SEEN THE ENEMY AND HE IS US


The faculty have themselves to blame for a lot of this. By letting ourselves, by default, be complicit in accepting the status quo, we fail our students and ourselves and sacrifice the long-term strategic interests of our country for short-term personal gain. This is not how it is supposed to be.



1. Michael Price. Can NIH Renovate the Biomedical Workforce? Science: Science Careers. June 22, 2012
http://sciencecareers.sciencemag.org/career_magazine/previous_issues/articles/2012_06_22/caredit.a1200069

2. A Working Group of the Advisory Committee to the Director, National Institutes of Health. Biomedical Research Workforce Working Group Draft Report. June 14, 2012
http://acd.od.nih.gov/bmw_report.pdf

3. Fred Southwick. Opinion: Academia Suppresses Creativity—By discouraging change, universities are stunting scientific innovation, leadership, and growth. The Scientist, May 9, 2012
http://the-scientist.com/2012/05/09/opinion-academia-suppresses-creativity/

4. Sharon Begley. Where Are the Cures? Newsweek, Oct 31, 2008
http://www.thedailybeast.com/newsweek/2008/10/31/where-are-the-cures.html

“’It's called the valley of death,’ says Greg Simon, president of FasterCures, a center set up by the (Michael) Milken Institute in 2003 to achieve what its name says.”

5. Donald G. Stein. Hard Money/Soft Money: The Two Cultures. The Academic Exchange, May 2007, Vol. 9, No. 6
http://www.emory.edu/ACAD_EXCHANGE/2007/may/steinessay.html

 
I used to think that companies like Apple, Amazon, Google and Facebook were the masterminds of marketing, but an article in the May 31st New York Times has changed my opinion. In “For some, exercise may increase heart risk,” science writer Gina Kolata asks whether exercise could actually be bad for some people. She reports on data from six retrospective exercise studies involving over 1,600 people showing that, among those who exercise regularly, about 10% become worse on measures related to heart disease—blood pressure, insulin resistance, and levels of HDL cholesterol and triglycerides. About 7% grow worse on two or more of these measures. What a, um, stroke of good fortune for cardiologists, and what a fabulous marketing strategy for the field.

Most people who exercise a lot are young adults who usually don’t frequent a doctor’s office very much, and do not spend a lot of time and money being evaluated by cardiovascular and endocrinology specialists. Now, in this brave new world where exercise can be dangerous, unless you are properly evaluated, you won’t know which group you belong to—the fortunate 90% that will benefit from exercise, or the luckless 10% who would be harmed by it, or could be harmed by it over the, um, long run. So the active, healthy cardiologist avoiders will now be loping to the cardiologists as fast as their Nikes will carry them.

The folks who hate exercise will now use the report to stay away from activity, or exercise even less, not knowing which group they belong to. But for anybody who wants to be sure, the only option is more visits to the cardiologist and checkups more extensive than the 3.7-5.0 minutes usually allowed for an office visit. Sorry, couch potatoes, you don’t get a free ride on this one.

With the stroke of a pen, the relative value units[1] of millions of people will dramatically increase. Can you imagine how General Motors would respond if it found a marketing strategy that could increase its sales by millions of customers just by issuing a report? I should’ve listened to my parents when they pleaded with me to become a “heart doctor.”

With all the pressure to reduce health-care costs, the issues of maintaining physician income and increasing clinical revenue streams to safeguard the status quo are growing in urgency, so this report could not have appeared at a better time.

I now stand in awe of cardiology. With this marketing report they have struck the mother lode. Even if the studies prove to be flawed or downright wrong, their new revenue streams could be fantastic before any corrective actions can be taken.   

So congratulations, cardiologists--you are the new masters of the universe. At least until the new studies come along showing that sleeping and/or sex shortens our lifespan.

 
[1] A relative value unit is an AMA-copyrighted formula for figuring out what fees physicians can charge Medicare or other patients for their services. The more "resources" the doctors use (tests, time spent in the office with the patient, etc.), the higher the relative value of that unit. Old folks are typically worth far more in RVUs than young and healthy patients. Today, most physician incomes are based on RVUs: the more RVUs a physician produces, the higher the salary.
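
As a rough sketch of the arithmetic behind this footnote (the RVU components and conversion factor below are invented, not actual CMS values, and real Medicare payments also apply geographic adjustments to each component):

```python
# Toy illustration of the RVU idea: a service's payment is essentially
# the sum of its RVU components times a dollar conversion factor.
# All numbers here are made up for illustration.

def payment(work_rvu, practice_expense_rvu, malpractice_rvu, conversion_factor):
    total_rvu = work_rvu + practice_expense_rvu + malpractice_rvu
    return total_rvu * conversion_factor

# A brief office visit versus a resource-heavy specialist workup:
print(round(payment(0.97, 0.85, 0.07, 34.0), 2))  # ~64.26
print(round(payment(4.00, 3.20, 0.50, 34.0), 2))  # ~261.80
```

The more tests, time and procedures a visit involves, the higher the total RVUs and hence the fee; that is the mechanism the essay above is poking fun at.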
 
The Scientist is a magazine of contemporary science dealing with new findings as well as the politics of academic and industry research. I wish I had written this recent opinion piece. It highlights the critical changes in the academic culture and describes the moral failure of modern universities to support and encourage scholarly research across all academic disciplines.

We tend to think egocentrically that "our own institutions" offer the "worst" examples of commercialization and commodification of academic life, but the article by Dr. Fred Southwick of the University of Florida shows that the loss of academic integrity in the pursuit of money is more pervasive than one might imagine. As a former university administrator and a current member of the faculty, I see and feel these pressures all the time. The pressures to get the grants, "go where the money is," and bring in the indirects are particularly intense in schools of medicine--even more so in those that are not part of a state university system.

It is not at all uncommon to throw up medical buildings at mega-expense as fast as possible in the hope of covering the costs through indirect cost returns on grants generated by the faculty--while at the same time informing the staff that there can be no raises and no internal support for research and scholarship. These edifices are being constructed on quicksand--given the economic uncertainties we face these days.

Southwick hammers home this point clearly and argues that the result of all this commodification is a loss of faculty creativity and enthusiasm for taking risks into uncharted territories of scholarship and research. Southwick is also right in claiming that teaching suffers as well--especially at the undergraduate level, and especially in the arts and sciences.

I wish it were otherwise. I come from a different time when it was possible (and encouraged by Deans and university presidents) to explore and chart new paths. It was a Golden Age and I am sorry that the next generations of university faculty and students will not be able to share it.
 
My wife and I believe it is critically important to help young scientists advance their careers in our field of research.

One area that has not received much funding or attention is the intersection of women's health and neuroscience. So we asked the Society for Women's Health Research (SWHR) to help us. The SWHR decided to create four travel awards that would allow winners to present their results at any professional meeting of their choice.
"The Society for Women’s Health Research (SWHR) is proud to announce the Donald G. and Darel Stein Fellowship winners. This fellowship promotes the study of sex differences in neuroscience by affording four students the opportunity to attend and present a poster on neuroscience and sex differences at a scientific meeting."
We felt that this was a highly original and very generous use of our gift. As a member of many scientific and professional organizations, I can't think of any one of them that would do something like this. All would use such funding to make sure the recipients attended only their own meetings. We are very proud and happy to support the SWHR.
 
Several years ago, I was asked by the editor of my university's faculty newsletter, The Academic Exchange, to write a short essay on “community" in the world of academic research. She wanted my perspective on how we build a sense of community when so many faculty members are required to generate some or all of their salary and fringes from sources outside the university. This is the “hard money/soft money” issue that continues to create so much tension among faculty at many research universities including my own. It’s gotten even worse in the last five years.

How can a young investigator hope to build a career in research at a research university if all the funding for her very survival must come from grants and contracts at a time when fewer than 8% of grant applications get funded?

Why would you even try to make close friends if most of your peers, or you, are not going to be around in the next couple of years? Why do people in the Business Office of a university have more job security than a seasoned biomedical researcher? What kind of “community” is this? What are the inducements to become involved in university affairs and community when your bills are paid by the NIH or some other federal funding agency, when everyone (except the Chair, of course) in a department is basically competing for the same funding with less than a 10% chance of success?

Picture a large open—"common"—space filled with laboratory benches and procedural rooms. Now picture what it must be like if the space you get depends on how many external grant dollars you can bring in. Your space gets better with bigger grants and is publicly rescinded when a grant ends or your federal dollars begin to shrink—a public humiliation like a scarlet letter or a dunce cap.

How much of a community spirit will be built under such conditions?

Faculty in the Arts and Sciences and the law and business schools are typically on hard money lines: all or most of their salary is covered by their school and, in return, they are expected to engage fully in teaching, research, and service throughout their careers. These faculty often supplement their income with grants or awards to cover teaching or scholarly work over the summer; many travel so they can work and reflect away from the day-to-day pressures of the school year. The protection of tenure and other benefits such as sabbaticals and paid leaves become “golden handcuffs” that make it hard to leave the institution. It's not surprising that many universities are top-heavy with senior faculty, often making replenishment and rejuvenation of departments difficult. 

In sharp contrast, almost all faculty in the medical and health sciences must generate a very substantial part, if not all, of their income through externally funded research grants. Sabbaticals and other opportunities for re-training outside their institutions are rare. A researcher on a grant is expected to be there all the time, and effort reporting is relentless as granting agencies demand to know exactly how many hours/week are actually spent on the research.  

This pressure can be unrelenting, and it is one of the reasons most U.S. research labs are now staffed largely by young researchers from China and India. This doesn’t bode well for building a future research community. (We should discuss this issue in another blog.)

If the faculty are also clinicians, they must generate their income through clinical practice or a combination of practice and research grants. Their income is wholly or largely dependent on vagaries of federal or philanthropic funding priorities that may have nothing to do with the priorities of their home institution.

Background. During the Cold War, federal agencies were mandated by Congress and by presidents of both parties to outperform and outspend anything that the Soviet Union or China might do. This resulted in a tremendous growth of federal spending for research—for the arts and humanities, but especially for the sciences.

When times were good, being on soft money had its rewards. A scholar who was successful in getting grants could focus exclusively on her research and enjoy all the advantages of being at a university—freedom to select what problems to work on; access to graduate, undergraduate and post-doctoral students to perform research tasks; library and ancillary facilities; and many other amenities not necessarily found in the corporate research sector. With their external grants in hand, they were asked to do no teaching or service. Undergraduate teaching was handled by TAs or unlucky assistant professors without grants, and was often disdained as taking too much time away from research. In Arts and Sciences, faculty who got grants often put some salary on their grants so they could (justifiably) reduce their teaching loads.

Faculty with big federal grants became the darlings of the academic world (and still are, to some extent) because of their ability to generate income and status for the university as measured by the number of external dollars brought in.

Deans, too, were thrilled with this arrangement—not only were they spared the expense of paying salaries and fringes from their budgets, they received supplemental funding of 30 to 100 percent of each grant's direct costs in indirect cost returns from the granting agencies. Institutions raced to build new facilities to attract soft-money researchers, many of whom brought with them multiple grants paid for by someone else. They were the free agents of the academic world.
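
To make the indirect-cost arithmetic concrete (the dollar figure and rate below are hypothetical; actual indirect, or facilities-and-administration, rates are negotiated institution by institution):

```python
# Hypothetical illustration of indirect cost returns on a federal grant.
# Figures are invented; negotiated rates vary roughly over the 30-100%
# range mentioned above.

direct_costs = 250_000    # annual direct costs of a hypothetical grant
indirect_rate = 0.55      # hypothetical negotiated indirect (F&A) rate

indirect_return = direct_costs * indirect_rate
total_award = direct_costs + indirect_return

print(f"Indirect return to the university: ${indirect_return:,.0f}")  # $137,500
print(f"Total cost to the funding agency:  ${total_award:,.0f}")      # $387,500
```

On this arithmetic, every grant a soft-money researcher brings in pays the university a substantial premium on top of the research itself, which is why deans were so fond of the arrangement.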

The period from the 1960s to the early 1990s was pretty much a golden age for research and development, and there seemed to be no end in sight. Grant-supported faculty needed teams of students and post-docs to keep their research going while they were out applying for ever more grants. It was a never-ending cycle, turning out more and more students as more and more graduate programs were created to capture a larger market share of mostly ONR, NIH and NSF dollars. New science facilities were built on campuses across the nation, and the rising waters floated all boats, so even faculty less talented in grantsmanship had access to research laboratories, libraries and the like, albeit at a lower level.

At one point there was a doubling of the NIH budget, but this bubble burst about ten years ago. 

Federal and foundation research funding stopped growing and has stayed flat year after year, not even keeping up with inflation. Yet the pressure on biomedical research faculty to continue supporting themselves remains relentless, and has even increased as university administrations find themselves strapped with all those big buildings to maintain.

As everyone knows, these last four years have been even more difficult. Now, with the $14 trillion national debt and pressures to cut budgets stronger than ever, what can we do? 

Two cultures. Today, the system that supports some faculty with hard money while many of their colleagues depend on soft money creates a two-tiered structure inimical to the idea of creating a collegial community of scholars. 

Soft culture and community: Soft-money faculty spend almost all their time writing grant proposals (my lab has submitted four just since the beginning of this year). Almost no grant application now gets funded the first or even second time around. Each time an application is turned down, it takes about nine months to a year to go through the same review process again. And now at the NIH and other federal agencies it's “two strikes and you’re out!” 

Faculty caught in this cycle have little time for teaching or mentoring. In fact, government effort-reporting requirements prevent researchers from engaging in such activities unless they take a reduction in time from the grant support and make up the difference from university funds. In many top-level research institutions, having a grant is not enough—now if one does not have at least two federal grants (with those important indirects) and lots of publications and presentations, the possibility of tenure becomes pretty remote.  

Government regulations prohibit even writing a grant proposal using grant funds, because such time is not being spent on the research per se. Even tenured faculty in the health sciences who lose their grant support may soon find themselves out on the street, because tenure applies to the position rather than to the salary. This means that they can keep their title, but the office and lab will go to someone else.

Soft-money faculty can’t commit time to building the university “community” because that’s not what they’re paid to do. Where should their loyalties lie? With the federal agency that pays their salaries, or with a university which may provide some space (competitively based on how many grant dollars they generate), but offers neither means nor motive for collegial interaction with peers or students?  

Soft culture and courageous leadership: When I was a member of the Advisory Council of one of the National Institutes of Health, my colleagues and I would often discuss the concern that so many biomedical scientists have to go where the money is rather than follow their passions and interests, or even the logic of their research trajectory, in designing and conducting research. Dependence on scarce federal money increases anxiety while systematically reducing creativity, as investigators apply for "safe" projects. In fact, the New York Times recently quoted a research director at the National Institute on Aging as stating that "the NIH can only afford to fund 'safe science'—it no longer has the capacity to support innovative research."

"Without basic research, there can be no applications. … After all, electricity and the light bulb were not invented by incremental improvements to the candle.” --President Nicolas Sarkozy of France, addressing the International Conference on High Energy Physics in Paris. Sarkozy announced France's plan to increase spending on higher education and basic and applied research by €35 billion for the next 4 years as part of the country's bailout strategy.

In the face of ever-increasing budget constraints, and with more and more people applying for grants to keep themselves working, the study sections on which I previously served have become very conservative and timid about supporting novel or controversial research. Worse, my impression is that these peer review groups have become adversaries rather than advocates of their fellow scientists, often looking every which way to find fault with a grant proposal.

There was a time when study sections made suggestions, and if they were minor, the Program Officer would call the applicants and ask them to make the suggested changes; it was a much more collegial time. Now even the smallest flaw in a proposal, one easily fixed after a brief discussion, is used as an excuse to give a weaker priority score. An applicant who proposes something innovative is very likely to be slammed for not having any “preliminary data” to support the idea, or for not having any experience in the specific area, when the area may not yet exist. Think what this does to the spirit and the motivation to continue in a research career, especially when your colleagues in the department (and the reviewers themselves) are competing for the same pot of shrinking research dollars.

Many of my colleagues across the country have simply quit because they no longer find their work enjoyable and personally or professionally rewarding.  

The issues for the soft-money faculty themselves are clear, but what does it mean to the community to have a high proportion of soft-money faculty? Trying to build and then sustain a research community on such an unstable foundation does not seem like a good idea.

Addressing the contemporary reality. Sooner or later, some great university is going to rise to the challenge of changing the status quo and make a truly substantial commitment of its own resources to supporting scholarship and research across the institution. If we don't, how can we expect faculty to become more engaged in the life of the university at a time when higher education has become so important, in a global economy, to sustaining the economic and social competitiveness of the United States?

If universities were to acknowledge that the circumstances that brought about the soft-money bonanza no longer prevail, and were to reassume responsibility for supporting faculty salaries, grants could be substantially smaller, leaving the funding agencies more money to spread around.

This idea is not far-fetched; it is already how things work in Canada and in most of the European Union.

Daniel Greenberg, writing in the Chronicle of Higher Education (March 2, 2007; http://chronicle.com/article/A-New-Source-of-Research-Money/36474/), points out that in 2004 Harvard (with a $25.4 billion endowment) and Yale (with $15.2 billion) spent virtually none of their own money on research and development (Harvard: $0; Yale: $26 million). It’s possible that even in today’s economic climate, these two institutions are so successful at attracting and retaining faculty with major grants that they don’t need to worry about the difficult times facing everyone else.

As Greenberg writes, “Even while deploring the declines in federal research spending, major universities demonstrate no willingness to make up even some of the difference.” The situation is exactly the same today, if not worse. Some data points: (1) the recent Emory Capital Campaign boasts of raising over $1.3 billion; (2) Emory University faculty bring in well over $300 million in external funding, most of which goes to the biomedical sciences; yet (3) the university itself gives only about $1 million to the University Research Council to support faculty research across the entire institution. I think that, as an institution, we can do a lot better at supporting novel and innovative projects.
  • Do we continue to throw up research buildings in the hope of attracting well-heeled faculty with big grants, or do we begin to put more money into faculty development and support over the long haul? I especially believe that well-endowed universities like mine are morally obligated to support, materially and not just rhetorically, the scholarly mission(s) they claim as essential to their identity.
  • If a university sees itself as a major player in research and scholarship, a good part of its mission should be to fund creative and risk-taking research to a much greater extent than it is now doing.
  • It might start by providing several years of full funding for interdisciplinary, collaborative, high-risk, and innovative projects. This would require dismantling some of the academic silos we take for granted.
  • Before conducting external searches, departments and programs should first turn to Health Sciences and other science faculty, offering grant-funded researchers opportunities to redirect some of their effort to teaching. In the long run, doing good science is also about teaching, and the long run needs a lot more attention than it is getting.
  • The medical schools and health sciences could create jointly paid appointments with faculty from the Arts and Sciences—strengthening all parts of the university at a lower cost than duplicating expertise on both sides of the street.
  • One way to attract outstanding scholars and researchers is to provide essential salary support in return for greater engagement with the university’s teaching and service mission. This is nothing new. A lot of state universities have a salary base that can be supplemented with external grant support.
  • When faculty do get external funding, the university could allow their departments to bank the salary savings and apply those funds to faculty and student research, bridging support, recruitment packages and other resources.
  • Do deans and chairs really have to take everything? Some universities even give bonuses to faculty members who bring in grants, or simply give them research funding to try out far-fetched and innovative ideas. Such a program might not work in every area, and not every soft-money scholar would choose these options, but having such choices could do much to bridge the two cultures.
  • Fund some hard-money positions where soft money now predominates. This would allow department chairs and program directors to engage more of their faculty in teaching needed or innovative courses rather than farming these assignments out to graduate assistants or adjuncts. 
  • Hard-money rotating slots could be offered to soft-money faculty every third or fourth year, freeing them to teach something they might like to teach and perhaps to launch research not fundable through traditional grant mechanisms.
  • Granting paid sabbaticals to research faculty could also have major benefits, helping to retrain and reinvigorate people who have been tied to their laboratory benches year in and year out without a break. Is this any different from giving senior administrators a year off at full salary when they step down from their posts?
In sum, the two-tier system as it stands weakens rather than strengthens the goal of being a community of scholars. It also commodifies scientific research, diminishes creativity, and weakens the spirit.

The choices are there. It will take courageous leadership to make the good ones.

This post is an updated version of an article I wrote for the Emory University “Academic Exchange” newsletter for faculty in May 2007.  Permission for this adaptation was given by the Editor of the publication.
 
Why start a blog at this stage of life? 

I've done pretty well without one up to now, but my students and colleagues in the lab thought it was a good idea to put some of my rants about what's going on in science and academia to better use (and maybe give themselves a little more peace and quiet). I've been in teaching and research since the early 1960s--some would say too long--but during that time I've seen a lot and done a lot.

Do my thoughts on higher education and scholarly research carry more weight than anyone else's? Not really, but there are a lot of things that worry me about the future of education, science and our country itself, so I'd like to get some discussion going.

When I hear presidential candidates trashing universities and colleges for political gain, I think there's real cause for concern. But universities and the research establishment are by no means above reproach themselves. I think academic values need to be reconsidered. I don't think that teaching, research and scholarship are just commercial commodities to be sold like iPads or cars, but I see a lot of that going on these days. Are we losing our independence and objectivity by buying into the commercialization and commodification of academia? I think we are. I also think we need to fix some things, and they won't get fixed unless the people directly involved are willing to debate the issues and suggest specific solutions.

So I want to try this blog.

I look at this from a number of perspectives: as a college teacher and researcher at a small New England university for the first 22 years of my career; as a Dean of graduate studies and Vice Provost for research, first at a large state university and then at a well-regarded private university, for a total of almost 13 years in administration; and as a former Congressional and Fulbright Fellow who worked on Capitol Hill and spent a number of years in European universities and research centers. So maybe some of my perspectives will have value for colleagues and friends who have many more years of work ahead of them than I do.

I hope this blog will lead to some discussion and maybe even open up some ideas--outside the box of current academic values and practice--for fixing difficult situations. So let's see what happens. My first post will appear in about a week.

