Eating Wheat: Avoiding the Bad and Getting the Good

A bagel or bowl of cereal is common for breakfast, followed by a sandwich or burger for lunch. Dinner often stars pasta, pizza, or a casserole as the main dish. One ingredient lurks in nearly every American meal.

Wheat. It’s the main ingredient in bread, the most purchased packaged food in the United States.1 It plays an integral role in many diets, but if not correctly consumed, it can damage the human body. The harmful effects include increased risk of weight gain, cardiovascular disease, and even cancer.2 To avoid adverse effects while reaping the benefits wheat offers, three factors should be considered: wheat type (whole-grain or refined), the portion size, and the accompanying ingredients.

Whole Grains Instead of Refined Grains

Guidelines by the United States Department of Agriculture (USDA) recommend that Americans consume whole-grain rather than refined wheat. Currently, the average consumption of whole-grain foods is approximately one serving a day, falling short of the recommended three servings.3 Wheat grains are divided into three parts: endosperm, germ, and bran (Figure 1). Whole-grain wheat retains the germ and bran. In contrast, refined grains have the bran and germ stripped away, leaving only the starchy endosperm, which comprises 80% of the grain. Unfortunately, this processing robs wheat of the majority of its nutrients, which are concentrated in the bran and germ.

Whole-grain wheat has nearly ten times more dietary fiber, five times as many vitamins and cancer-preventing phenolic compounds, and three times as many essential minerals, including zinc, iron, and selenium (Table 1).

The extra dietary fiber of whole-grain wheat is itself a compelling reason to choose it over refined wheat. Increased consumption of dietary fiber has been observed to improve cholesterol concentrations, lower blood pressure, and aid in weight loss. These effects all reduce the risk of coronary heart disease, the leading cause of adult deaths in the United States.4 High-fiber foods facilitate beneficial metabolic effects and control caloric intake by increasing satiety. Dietary fiber, consisting of insoluble and soluble components, promotes gastrointestinal health by acting as a prebiotic for beneficial bacteria in the colon. Both fiber types also provide cardiovascular benefits by lowering “bad” cholesterol, or LDL.

In the broader context of a person’s entire diet, high-fiber foods often have lower energy density and take longer to eat. These two traits promote satiety, curbing consumption of potentially unhealthy foods and lowering total caloric intake. Eating refined wheat, such as white bread and pasta, causes one to not only forego nutrients, but also consume more calories before feeling full. Overconsumption of calories coupled with physical inactivity are major risk factors leading to heart disease and obesity.5

Control Portion Size

In addition to considering what type of wheat one eats (e.g., whole-wheat instead of white bread for toast in the morning), an equally important factor is quantity. Regularly feasting upon large portions of even whole-grain wheat results in damaging spikes in blood sugar that can lead to the chronic state of Type 2 diabetes.6 Since diabetes is the leading cause of kidney failure in the United States and doubles the risk of stroke, its correlation with wheat consumption is important to understand.7

The biochemical phenomenon underlying this link is called insulin resistance. Insulin is a hormone that stimulates various tissues to store glucose from the blood as glycogen. When carbohydrates are digested, they are broken down into glucose, which is transported into the bloodstream, increasing blood sugar levels. This rise causes pancreatic beta cells to secrete insulin, which drives conversion of the excess glucose into glycogen. When the body’s tissues stop responding properly to insulin, the resulting condition is Type 2 diabetes.

Even though the USDA advises adding whole-grain wheat to one’s diet, its guidelines do not account for the spike in blood sugar that occurs when a large portion is eaten in a short time frame. The guidelines use a rating system called the glycemic index (GI), which is widely utilized in nutrition studies as a quality standard for carbohydrate foods.8 Wonder®, a fully enriched white bread, has a GI of 71, while bread made of 80% whole-grain and 20% refined wheat flour has a GI of 52.8 In practical terms, these values mean the white bread raises blood sugar 71% as much as a comparable amount of pure glucose, while the mostly whole-grain bread raises it only 52% as much. Given the pathogenic contribution of blood-sugar spikes described above, the lower GI of whole-wheat bread quantitatively demonstrates its superiority over white bread.

However, consider the following: the Twix candy bar has an even lower GI of 44. Watermelon has a GI of 72. How does this make sense? The glycemic index fails to account for realistic portion sizes. When the foods are empirically tested on people for their effects on blood sugar, the quantities eaten are equivalent to 50 grams of carbohydrates. Three-quarters of a king-sized Twix bar constitutes 50 grams of carbohydrate, but so do 5 cups of diced watermelon. This difference in volume is due to the fiber and water content of watermelon.

Realistically, a person is likely to eat a whole king-sized Twix bar or one cup of diced watermelon in one sitting. Adjusting for actual serving sizes and assuming linearity, the Twix bar has what is now called a glycemic load (GL) of 58.7 and the watermelon a GL of 14.4. As a more relevant implementation of GI values, glycemic load emphasizes control of portion sizes when eating carbohydrates. GI values of whole-grain wheat are generally lower than those of refined wheat, but the difference is small enough that one cup of refined-flour pasta may be better than two cups of whole-wheat-flour pasta for preventing Type 2 diabetes.
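The portion adjustment above is simple arithmetic, and a short Python sketch makes it reproducible. This follows the article’s convention of scaling each food’s GI by the ratio of carbohydrate actually eaten to the 50-gram test dose; note that published glycemic-load tables conventionally divide by 100 grams rather than by the 50-gram dose, so these figures match the text, not the standard tables.

```python
# Reproduces the portion-adjusted glycemic values quoted in the text.
# Assumes the blood-sugar response scales linearly with carbohydrate
# amount, as the article does.

TEST_DOSE_CARBS_G = 50.0  # GI is measured using a 50 g carbohydrate portion

def portion_adjusted_gi(gi: float, carbs_eaten_g: float) -> float:
    """Scale a food's GI by the carbohydrate actually eaten,
    relative to the 50 g portion used in GI testing."""
    return gi * carbs_eaten_g / TEST_DOSE_CARBS_G

# A whole king-sized Twix bar: 3/4 of a bar holds 50 g of carbs,
# so a full bar holds 50 / 0.75 ≈ 66.7 g.
twix_gl = portion_adjusted_gi(gi=44, carbs_eaten_g=50 / 0.75)

# One cup of diced watermelon: 5 cups hold 50 g of carbs, so 1 cup holds 10 g.
watermelon_gl = portion_adjusted_gi(gi=72, carbs_eaten_g=50 / 5)

print(round(twix_gl, 1))        # 58.7, as quoted in the text
print(round(watermelon_gl, 1))  # 14.4
```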

Watch Out for Accompanying Ingredients

The final factor to consider is that wheat is rarely eaten alone. In the processing and cooking that make it edible, wheat is nearly always mixed with other, potentially harmful ingredients. Most breads, pastas, pancakes, cereals, and other wheat products list at least five ingredients after the primary wheat ingredient, broadly classified as preservatives, sweeteners, emulsifiers, leavening agents, flavor enhancers, and dough conditioners. These additives influence both how one feels shortly after eating and the body’s long-term health. In particular, one should avoid partially hydrogenated oils and moderate intake of high-fructose corn syrup.

Added as dough conditioners and preservatives, partially hydrogenated oils are considerable factors in coronary artery disease, which causes at least 30,000 premature American deaths per year.9 They contain trans fats, which have been unequivocally linked to lowering “good” high-density lipoprotein (HDL) cholesterol and raising “bad” low-density lipoprotein (LDL) cholesterol. Although large companies have removed trans fats, including partially hydrogenated oils, from foods such as Kraft’s Oreos in response to mounting criticism beginning in 2005, numerous food companies still include partially hydrogenated oils in their wheat products. For example, cake mixes, packaged baked goods, and peanut butter are regularly manufactured with partially hydrogenated oils because they simplify manufacturing and reduce costs while extending the final product’s shelf life. Manufacturers obfuscate this addition by stating the trans fat content as “0g” on nutrition labels, which is allowed whenever a serving contains less than 0.5 grams of trans fat. Those sub-threshold amounts accumulate, however, when one consumes multiple servings of foods such as chips or crackers. Instead of trusting the label, check for the words “partially hydrogenated” or “shortening” in the ingredients list.
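The labeling loophole is easy to quantify. The sketch below uses an assumed worst case of 0.49 grams of trans fat per serving (just under the 0.5-gram threshold) and a hypothetical six servings; neither number comes from a real product.

```python
# Illustrates how "0 g trans fat" labels can hide real intake.
# 0.49 g/serving and 6 servings are assumed, hypothetical numbers,
# not data from any actual product.

TRANS_FAT_PER_SERVING_G = 0.49  # just under the 0.5 g labeling threshold
servings_eaten = 6              # e.g., working through a box of crackers

labeled_total_g = 0.0 * servings_eaten                    # what the label implies
actual_total_g = TRANS_FAT_PER_SERVING_G * servings_eaten

print(f"label implies {labeled_total_g:.1f} g; actual intake {actual_total_g:.2f} g")
```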

While partially hydrogenated oils are conclusively harmful, high-fructose corn syrup (HFCS) is a controversial additive. Manufacturers favor HFCS as a sweetener in wheat products for its lower cost, sweeter taste, and higher miscibility. Scientists hypothesize that corn-derived sugar has endocrine effects that lead to obesity, Type 2 diabetes, and metabolic syndrome.8 Insulin and leptin are key hormone signals regulating a person’s sense of hunger, but consumption of high-fructose corn syrup blunts these signals, preventing them from controlling calorie intake. Another consequence of foods sweetened with HFCS is plaque buildup inside the arteries.10 Nearly any sweet good made from wheat is likely to contain HFCS. Although data about its health effects are still inconclusive, HFCS is best kept to a minimum.

Being a health-conscious consumer of wheat can mean significant changes in daily choices of which foods to eat and how to eat them. Whole grains provide more fiber and health-promoting nutrients than refined grains, but the accompanying ingredients in available food choices must be considered as well. Just as importantly, wheat’s impact on blood sugar needs to be controlled through sensible portion sizes. Awareness and application of these principles are the main steps to avoiding the bad and getting the good of wheat.


  1. Nielsen Homescan Facts, The Nielsen Company. (Accessed Jan. 15, 2013).
  2. Slavin, J. L. Amer. J. Clin Nutr. 1999, 70, 459S–63S.
  3. Cleveland, L. E. J. Amer. Coll. Nutr. 2000, 19, 331–8.
  4. Anderson, J. W. Nutr. Rev. 2009, 67, 188–205.
  5. Swinburn, B. Public Health Nutr. 2007, 7, 123–46.
  6. Liu, S. J. Amer. Coll. Nutr. 2002, 21, 298–306.
  7. World Health Organization: Diabetes Fact Sheet, Media Centre. 2012 (Accessed Jan. 15, 2013).
  8. Foster-Powell, K. Amer. J. Clin Nutr. 2002, 76, 5–56.
  9. Ascherio, A. Amer. J. Clin Nutr. 1997, 66, 1006S–10S.
  10. Stanhope, K. Amer. J. Clin Nutr. 2008, 88, 1733S–7S.
  11. General Mills. What is Whole Grain, Anyway? Demystifying Whole Grains. (Accessed Jan. 15, 2013).
  12. Thompson, L. U. Contemp. Nutr. 1992, 17.


The Ghostly Haunting of Limb Lost

The brain’s neural pathways are like a city’s infrastructure. Once the routes and support structures are firmly in place, it is difficult to remove them to construct a new route. This helps explain amputees’ reports of phantom limbs and the painful sensations they radiate. How much of the pain is real and how much is psychological has yet to be determined, but treatments address both sources.

The phantom limb was first documented by Dr. S. Weir Mitchell after his observations of Civil War amputees.1 It is a fascinating enigma that has appeared throughout literature: Captain Ahab’s missing leg in Herman Melville’s Moby Dick, Captain Hook’s lost hand in J. M. Barrie’s Peter Pan, and Long John Silver’s absent leg in Robert Louis Stevenson’s Treasure Island. Why does the brain yearn for the absent limb so much that phantom sensations emerge? The answer may reside in the ascending sensory pathways from the peripheral nervous system. Once these pathways are established, the brain finds it difficult to change the input it expects from them.

During infancy, the brain examines the body to understand itself spatially and topologically, building upon this image from the senses throughout life. Interestingly, those who undergo amputations in infancy experience neither the sensation nor the pain of phantom limbs because the missing limbs were not present long enough to establish a solid pathway.2 However, for those who retain their limbs, the development of the senses in early childhood is faster than at any other point. Changing the body image at an advanced age is too drastic and demanding for the brain. One contributing factor is the elderly’s diminished brain size. On average, the brain loses 5–10% of its weight between the ages of 20 and 90, with a higher proportion lost with increasing age.3 In addition, the grooves on the brain’s surface widen while the swellings and depressions become smaller; deep grooves indicate increased surface area for synapses, the connecting spaces between neurons, to form. Moreover, the formation of neurofibrillary tangles, decayed portions of the dendrites that receive sensory information from other neurons, impedes information transmission.3 Finally, abnormally hard clusters of damaged or dying neurons, known as “senile plaques,” emerge and accumulate. Neurons are not replaced when they die, so as one gets older, one literally has less to work with. Thus, with decreased plasticity, the body image becomes fixed in the form established in earlier years.

This pathway, however, is not indestructible: amputees report that phantom limb sensations decrease with time. Thanks to its plasticity, the brain gradually “rewires” itself, abolishing old connections in favor of new, useful connections elsewhere. After an amputation, for example, patients often describe feeling the entire appendage, with the most awareness at the distal (end) portions of the limb (i.e., fingers and toes compared with the forearms and calves, respectively).4,5 This is because distal anatomical structures contain the greatest number of sensory nerves and command a larger portion of the somatosensory cortex. In time, however, the phantom limb perception shrinks until it disappears into the stump.6-8

These concepts are visualized by the sensory homunculus (Figure 1), in which the size of each appendage reflects its sensitivity and thus the concentration of neurons dedicated to it. Infants use their hands, lips, and tongue frequently in order to shape and understand their world. Since more neurons are dedicated to these extremities, it takes longer to rewire the corresponding pathways. The brain rewires the proximal portions of the limb first, so the phantom sensation along the length of the appendage seems to shrink faster than at the distal portions.

When subjects encounter identical stimuli, the sensations they experience are usually comparable. For example, when we touch a pot on a lit stove, we feel burning, not tickling. With amputees, this consistency breaks down. Each amputee’s phantom limb is unique: it can feel authentic or fake, painful or painless. There is little to suggest that patients are lying about the pain, yet it is well known that the brain frequently tricks the body.

Psychological pain can also manifest itself as physical pain. Amputee patients who feared an inability to recover were hostile to and jealous of other members of society, consequently experiencing pain in the phantom limb with these heightened emotions. However, once these patients underwent therapy and obtained a positive attitude, the pain faded.2

Traditional approaches to alleviating pain, such as injection of nerve blocks, myoelectric prostheses, and cordotomy, are largely procedural.9,10 A nerve block is an injection of a local anesthetic that stops transmission of a message along the nerve so that the brain never receives the pain signal from the stump. A myoelectric prosthesis is an artificial limb that uses electronic sensors to translate muscle and nerve activity into the intended movement. While the brain is manipulated into replacing the phantom limb with an artificial one, prosthetics often do not alleviate phantom pain; one theory holds that because visual sensory information contradicts tactile sensory information, the brain refuses to be tricked. Cordotomy is the most invasive of the listed procedures because it requires a neurosurgeon to disable certain ascending tracts in the spinal cord; thus, it is employed only in severe cancer- or trauma-related cases. Despite the variety of approaches, the results are slightly effective at best.9

Recently, researchers have turned to mind-body therapies to relieve chronic phantom pain, yielding tentatively successful results. A review by Dr. Vera Moura of the Department of Physical Medicine and Rehabilitation on Integrative Medicine at University of North Carolina Hospitals tied together studies that used hypnosis, guided imagery, and biofeedback (such as visual mirror exercises).11 These non-invasive mind-body alternatives consider the psychological aspect of pain. Hypnosis has been found to reduce postsurgical pain, so researchers attempted to transfer its effects to amputees.12 In several studies, arm amputees varying in sex and age saw a reduction in pain frequency and intensity after attending hypnosis sessions.13-15 These studies indicate that mind can truly triumph over matter, but caution must be taken because trial sizes were small and hypnosis is a murky field. Therefore, more research is necessary before any definite conclusions can be made.

Guided imagery is another mind-body approach; it extends beyond the typical use of the senses, recruiting more neural pathways than usual to create a memorable mental image. The treatment combines interactions between patient and therapist and between patient and body image.9 In Zuckweiler’s experiment, 14 patients with diverse backgrounds had 5 to 15 imagery sessions, during which they attempted to reprogram their minds to accept the new body form.16 Patients were taught Zuckweiler Image Imprinting (ZIP), which involves taking an object and storing it as a mental image. They were then asked to compare their phantom limb pain to the object in their mental image and to switch the sensations associated with the two. Over time, as the phantom sensation decreased through the use of different mental images, the discrepancy between the new body image without the limb and the old body image with the limb was reconciled. Zuckweiler’s study showed successful reduction of pain intensity within only six months. ZIP forces patients’ minds to accept their new bodies, and since the method encompasses visual, auditory, and kinesthetic learning, treatment can be customized so that patients comprehend and create new connections.

The final mind-body approach is biofeedback, of which there are two popular kinds. Thermal biofeedback teaches patients to increase the peripheral skin temperature at the stump.17 This may seem implausible, since body temperature is an autonomic function along with vital processes such as heart rate and breathing. In some instances, however, an individual can exert partial, conscious control. Although the hypothalamus is responsible for maintaining the standard body temperature of 37.0°C (98.6°F), it is possible for conscious effort to affect peripheral skin temperature. Successful patients begin to link skin temperature with pain.18 Physiologically, regulating one function often couples the response to another stimulus; for example, when thermal biofeedback was coupled with breathing relaxation techniques, the temperature of the stump increased and the muscles relaxed, decreasing the pain and thus increasing the patient’s ability to contend with the pain that remained. It is unknown, however, whether thermal biofeedback is an effective treatment for all phantom pain; like most areas of science and medicine, more research is needed.

The second biofeedback type, visual mirror feedback (VMF), uses a box with mirrors to fool the brain. A rectangular box with no top and a hole for each arm (or leg) is set in front of a patient. In the middle of the box is a one-sided mirror septum facing the intact limb (Figure 2), presenting the patient with the illusion that both appendages are whole. Dr. Ramachandran, the inventor of this technique, conducted a study in which 10 amputee patients were treated with VMF in six sessions of 5 to 15 minutes a day for several weeks.19 Every patient had a positive reaction, including reductions in pain intensity, mobility restriction, and spasms. Once again, a conscious effort was made to train the brain, so patients were able to redirect unpleasant sensations. This therapy is nearly the opposite of ZIP, since the patient pictures the limb as whole to alleviate the pain rather than ignoring it. VMF is one of the most common treatments due to its success among many different amputees.

A theory behind mind-body approaches’ emerging successes is the conscious effort patients put forth to overcome pain. In previously mentioned traditional procedural methods, patients passively receive a certain treatment and hope to obtain a positive result. In some cases, there are even negative side effects; for example, a nerve block may lead to rashes, itching, and an abnormal rise in blood sugar. Invasive procedural approaches like the cordotomy can only be attempted once. Mind-body approaches can be practiced, optimized over time, and are much safer than procedural methods.

Understanding of phantom pain has progressed significantly since its initial documentation during the Civil War. Traditional procedural methods to treat it have been developed, but recently, the psychological aspect of pain and sensation has been addressed in mind-body methods. Unfortunately, neither approach has achieved complete success, partially because of the individualistic nature of phantom limbs and the associated pain. The neurological explanations behind both phenomena are relatively unknown, but it is agreed the ghostly perceptions are a mixture of psychological and real sensations. Perhaps the most effective treatments are those that address both.


  1. Lehrer, J. Proust Was a Neuroscientist; Houghton Mifflin: Boston, 2008.
  2. Kolb, L. C. The Painful Phantom: Psychology, Physiology, and Treatment; Charles C. Thomas: Springfield, IL, 1954.
  3. Guttman, M. The Aging Brain. USC Health Magazine (Accessed Jan. 22, 2013).
  4. Newton, A. Somatosensory Map. (Accessed Jan. 18, 2013).
  5. Pain and Touch, Handbook of Perception and Cognition, 2nd ed.; Kruger, L., Ed.; Academic: San Diego, CA, 1996.
  6. Jensen, T. S. Pain. 1985, 21, 267–78.
  7. Hunter, J. P. Neuroscience. 2008, 156, 939–49.
  8. Desmond, D. M. Int. J. Rehabil. Res. 2010, 33, 279–82.
  9. Lotze, M. Nat. Neurosci. 1992, 2, 501–2.
  10. Pool, J. L. Ann. Surg. 1946, 124, 386–91.
  11. Moura, V. L. Am. J. Phys. Med. Rehabil. 2012, 8, 701–14.
  12. Black, L. M. J. Fam. Pract. 2009, 58, 155–8.
  13. Oakley, D. A. Clin. Rehabil. 2002, 18, 84–92.
  14. Bamford, C. Contemp. Hypn. 2006, 23, 115–26.
  15. Rickard, J. A. Ph.D. Dissertation, Washington State University, Pullman, WA, 2004.
  16. Zuckweiler, R. JPO. 2005, 17, 113–8.
  17. Sherman, R. A. Am. J. Phys. Med. 1986, 65, 281–97.
  18. Shaffer, F.; Moss, D. Textbook of Complementary and Alternative Medicine, 2nd ed.; Informa Healthcare: London, UK, 2006.
  19. Ramachandran, V. S. Brain. 2009, 132, 1693–710.
  20. Trivialperusal. Sensory Homunculus. (Accessed Jan. 18, 2013).
  21. Phelan, L. Mirror Box Therapy. (Accessed Jan. 18, 2013).


The Bridge from Discovery to Care: Translational Biomedical Research

Since the 1970s, both the number of molecular biology PhD scientists and the amount of biomedical research have grown rapidly, greatly expanding our knowledge of the cell.1 This explosion has led to incredible scientific achievements, including the development of the polymerase chain reaction in the 1980s and the completion of the Human Genome Project in 2003.2-4 The focus of research has shifted from single genes to all genes, from single proteins to all proteins. Neither scientists nor pharmaceutical companies, however, have been able to keep pace with the sheer quantity and complexity of modern biomedical research. Additionally, while the majority of medical researchers in the 1950s and 1960s were physician-scientists, today they are predominantly PhDs.1 Questions of basic and clinical research, once addressed side by side, are now separate.

The widening gap between scientific discovery and therapeutic impact is a result of these changes. In the United States, the dramatic increase in spending for pharmaceutical research and development has been offset by a disappointing decrease in therapeutic output (Figure 1). As this paradox becomes more apparent, translational research, which aims to convert laboratory findings into clinical successes, emerges as an increasingly important endeavor.5,6

In 2006, the U.S. National Institutes of Health (NIH), the largest source of funding for medical research in the world, focused its attention on translational research by launching the Clinical and Translational Science Awards program.7,8 However, implementing effective translational research is both time- and labor-intensive. According to Dr. Garret FitzGerald, Director of the Institute for Translational Medicine and Therapeutics at the University of Pennsylvania, challenges include a lack of human capital with translational skill sets, relevant information systems, and intellectual property incentives.9

During his leadership of the NIH from 2002 to 2008, Dr. Elias Zerhouni witnessed how a shortage of clinicians trained in research slowed the translation of scientific advances into patient care.10,11 Beyond the need for manpower, an open culture of communication between scientists and clinicians is necessary.

Drug development has traditionally been a one-way process from bench to bedside, in which scientists identify drug targets, conduct clinical tests, and develop marketable drugs. Many argue, however, that communication must run in the opposite direction, too; feedback from clinical trials and doctors is valuable because understanding their concerns allows researchers to improve drug development.12 A third challenge derives from current institutional practices and regulations. An investigator’s success is determined by his or her publication record rather than by efforts to advance medicine.13 Research funding is also granted on an individual basis, which does not promote the collaboration necessary for successful translational research. Lastly, the regulatory and patent processes governing drug development require much expertise and time to navigate, offering little incentive for researchers to become involved.1

To better integrate basic science with clinical progress, countries such as the United States are building new teams of leaders in all aspects of clinical research: medicine, pharmacology, toxicology, intellectual property, manufacturing, and clinical trial design and regulation.13 Dr. Francis Collins, Director of the NIH since 2009, has called for a partnership among academic, government, private, and patient organizations to repurpose molecular compounds that failed in their original use.15,16 As a historical precedent, Collins pointed excitedly to azidothymidine, a drug originally developed to treat cancer that later found success against HIV/AIDS.14 Tremendous potential lies in applying scientific developments to other contexts, and the NIH has already drafted policy for this purpose.15

However, the growing support for translational research does not diminish the importance of basic scientific research, which poses the most interesting questions. Translational biomedical research creates an efficient environment for scientists to work at the interface of basic science and therapeutic development and to help fulfill the social contract between scientists and citizens. The full impact of translational initiatives has yet to be seen because the success of drug development, which can take up to 20 years, cannot be evaluated easily or quickly. For now, we can hope that integrating the work of scientists and clinicians will benefit both the patients, who await treatment, and the researchers, who only dream of seeing their discoveries transformed into new therapies for disease.


  1. Butler, D. Nature. 2008, 453, 840–2.
  2. Smithsonian Institution Archives. Smithsonian Videohistory Collection: The History of PCR (RU 9577). (Accessed Jan. 15, 2013).
  3. National Center for Biotechnology Information (NCBI). Probe, Reagents for Functional Genomics: PCR. (Accessed Jan. 15, 2013).
  4. Human Genome Project Information. About the Human Genome Project. (Accessed Jan. 15, 2013).
  5. CTSI (Clinical and Translational Science Institute) at UCSF. Translational Medicine at UCSF: An Interview with Clay Johnston. (Accessed Jan. 15, 2013).
  6. Helwick, C. Anticancer Drug Development Trends: Translational Medicine. American Health & Drug Benefits. (Accessed Jan. 15, 2013).
  7. National Institutes of Health (NIH). About NIH. (Accessed Jan. 15, 2013).
  8. National Institutes of Health National Center for Advancing Translational Sciences (CTSA). About the CTSA Program. (Accessed Jan. 15, 2013).
  9. Pers. comm. Dr. Garret FitzGerald, Director of the Institute for Translational Medicine & Therapeutics at the University of Pennsylvania.
  10. NIH News. Elias A. Zerhouni to End Tenure as Director of the National Institutes of Health. (Accessed Jan. 15, 2013).
  11. Wang, S. S. Sanofi’s Zerhouni on Translational Research: No Simple Solution. The Wall Street Journal Health Blog. 2011 (Accessed Jan. 15, 2013).
  12. Ledford, H. Nature. 2008, 453, 843–5.
  13. Nature. 2008, 543, 823.
  14. TEDMED 2012. Francis Collins. (Accessed Jan. 15, 2013).
  15. Wang, S. Bridge the Gap Between Basic Research and Patient Care, NIH Head Urges. The Wall Street Journal Health Blog. (Accessed Jan. 15, 2013).