By Christopher Green, August 11, 2008
I would like to reflect on the approaches taken by my colleagues, Jonathan Moreno and Margaret Kosal, at the beginning of this discussion. Specifically, they have addressed what they contend to be two important core “rate-limiters” in the future progress of neuroscience research that could have military application: funding and ethics. While I do not disagree with anything that Jonathan and Margaret say in principle, my view is that these two core issues make it clear that this discussion must become global. If it does not, we will miss the opportunity to address the fundamental issue: what are the military applications of the neurosciences? The research conditions in U.S. labs are well known, but much if not most of the important militarily relevant neuroscience research will take place overseas. If this discussion becomes global and we engage with non-U.S. researchers, then we can truly begin to address this question.
While Jonathan, Margaret, and I are all in violent agreement about the basics, we need to discover the subtle differences in our opinions to help us learn. To do this, I will purposely overstate in tone (not in facts) two key issues: First, I stand by my initial belief that the pace of discovery in the neurosciences is unlikely to accelerate faster than it is today and will not happen preferentially in the West. Second, we should not assume that the military applications of this research can be influenced by incremental additional financial support.
The pace of discovery in the neurosciences in the West is widely believed to be driven by stable and moderately well-funded Centers of Excellence. The research with applicability to military use is well known and well publicized. Those of us with access to information about classified programs are underwhelmed; there is a notable lack of significant novel research. In fact, as part of the National Academy of Sciences commission I head, the 16 members of the committee, 6 staff, and 12 peer reviewers humbly received briefings and work product from 25 military and intelligence agencies, national laboratories, universities, nongovernmental organizations, and other private institutions doing neuroscience research directly or indirectly involved in work of potential military value, and reviewed hundreds of peer-reviewed publications. The committee decided that it had no scientific or substantive reason to write a classified annex. (The peer review of the commission’s final report is complete, and the report will be released as “Emerging Cognitive Neuroscience and Related Technologies.”)
From the information that is accessible, we know the ongoing work is relatively well supported, as Margaret described. Should we expect any significant increase in a few years for “basic” research? The bigger problem is that the military and intelligence communities cannot understand the implications of any such research. As shocking as it may be, fewer than a dozen persons in intelligence and military constituencies understand the neurosciences involved, and they are happy to say so–hence, the plethora of outside reviews on the subject in the past two years. And government scientists are not well placed to accelerate application development from the basic neurosciences. Margaret has it just right: Only some form of increased exchange and communication between academia and the government in this area will work.
Additional funding alone won’t help, in any event, for a separate reason. The goal of the basic neuroscience funded today is for the most part hypothesis testing, not hypothesis generating. Grants and awards are also appropriately constrained by the Health Insurance Portability and Accountability Act (HIPAA) and ethics. As Jonathan wisely pointed out, the “first experiments” in the neuroscience arenas most talked about–psychopharmacology and aids to interrogation–have to wait for approvals and reviews. The drugs and compounds being considered today are older, and the ones we might expect to be useful for modifying emotional state will not be tested on humans in any context of reasonable funding or ethics. It is not within the realm of possibility that next-generation drugs with safe and efficacious properties for military use will be developed absent a huge increase in funding. I am happy to go on record as stating that for both scientific and ethical reasons I oppose the development of such drugs.
Funding priorities require a solid roadmap that includes research that is certain to meet prescribed goals as contained in the rules of grant applications. From the landscape we have seen, it is unlikely that a disruptive technology could escape from a basic science laboratory. (Clayton Christensen, thank you for teaching us that disruptors by definition are today’s technology turned inward.) With careful planning and human-use approvals, it is just as unlikely that a new discovery with unintended consequences will pop out to become a new military application, unless it is looked for systematically–almost (but not quite) an oxymoron. Screening for the unintended applications of a drug is not finding a surprise; it is carefully reading the data from good epidemiology.
It would be productive to turn this discussion in a different direction. Basic science funding is likely to stay stable, and little serious research will delve into the psychopharmacology of interrogation. Thus, a vector that we may wish to explore is the new data from several global laboratories indicating that culture matters in the decision to use any military application of asymmetric force. Battlefield commanders of all nations hold sacrosanct the right to determine the applications that may cause harm to those outside the bandwidth of a lethal dose 99+ weapon and generally don’t intend to develop for use materials that could cause collateral harm to civilians and noncombatants. If governments or scientists were to try to develop a system to pre-screen neuroscientific cognitive manipulators–one that would be HIPAA-approved and tested, and robust in its core science–success would be as likely as it was with mines and cluster bombs, meaning not likely. And if we did have such success, our enemies of the future would not care.