Healthcare Reforms for the Mentally Ill

Neuropsychiatric illnesses are some of the most devastating conditions in the world. Despite being non-communicable, mental and neurological conditions are estimated to account for approximately 30.8% of all years lived with disability.1 Furthermore, in developed nations like the United States, mental disorders have been reported to erode around 2.5% of the yearly gross national product, a figure that does not account for the opportunity cost borne by families who care for patients long-term.1 If left untreated, many patients with neuropsychiatric illnesses cannot find gainful employment; their aberrant behavior is stigmatized and blocks professional and personal advancement. In fact, about three times as many individuals living with mental illnesses are held in state and local prisons as in rehabilitative psychiatric institutions.2

Though the Affordable Care Act has substantially decreased the number of uninsured individuals in the U.S., there are still millions of people who fall into the Medicaid coverage gap.3 People in this group make too much money to qualify for Medicaid, but too little to qualify for the government tax credits that subsidize the purchase of an insurance plan. In an attempt to close this gap, the federal government offers aid to states to expand their Medicaid programs as needed.4 States that have accepted the federally sponsored Medicaid expansion have seen sharp reductions in their uninsured populations, which has directly improved quality of life for the least fortunate people in society. However, in the many states that continue to reject federal aid, the situation is considerably worse, especially for the mentally ill.

Mental health patients are especially vulnerable to falling into the Medicaid gap. Many patients suffering from psychiatric conditions are unable to find stable employment. According to a March 2016 report by the Department of Health and Human Services, there are 1.9 million low-income, uninsured individuals with mental health disorders who cannot access proper healthcare resources.5 These impoverished psychiatric patients are initially eligible for Medicaid. However, once their treatment takes effect and they become employed, they may pass the Medicaid income threshold. If their private health insurance does not cover the cost of their psychiatric treatment, patients relapse, creating a vicious cycle that is exceptionally difficult to break.6

Furthermore, many psychiatric illnesses first present during adolescence or early adulthood, right around the time students leave home for college. During initial presentation, then, many students lack the support system necessary to deal with their condition, causing many to drop out of college or receive poor grades. Families often chalk these conditions up to poor adjustment to a brand-new college environment, preventing psychiatric patients from receiving proper treatment.6 On their own, many students with psychiatric conditions delay seeking treatment, fearing being labeled “crazy” or “insane” by their peers.

Under the status quo, psychiatric patients face significant barriers to care. Because the Medicaid gap is unfortunately subject to political maneuvering, it is unlikely to be fixed soon. However, the United States could fund the expansion of Assertive Community Treatment programs, which provide medication, therapy, and social support in an outpatient setting.8 Such programs dramatically reduce hospitalization times for psychiatric patients, alleviating the costs of medical treatment. Funding these programs would keep insurance issues from deterring treatment.

In the current system, psychiatric patients face numerous deterrents to receiving treatment, from lack of family support to significant social stigma. Having access to health insurance be a further barrier to care is a significant oversight of the current system and ought to be corrected.

References

  1. World Health Organization. Chapter 2: Burden of Mental and Behavioural Disorders. 2001. http://www.who.int/whr/2001/chapter2/en/index3.html (accessed Mar. 20, 2016).
  2. Torrey, E. F.; Kennard, A. D.; Eslinger, D.; Lamb, R.; Pavle, J. More Mentally Ill Persons Are in Jails and Prisons Than Hospitals: A Survey of the States.
  3. Kaiser Family Foundation. Key Facts about the Uninsured Population. Aug. 5, 2015. http://kff.org/uninsured/fact-sheet/key-facts-about-the-uninsured-population/ (accessed Mar. 25, 2016).
  4. Ross, Janell. Obamacare mandated better mental health-care coverage. It hasn't happened. Oct. 7, 2015. https://www.washingtonpost.com/news/the-fix/wp/2015/10/07/obamacare-mandated-better-mental-health-care-coverage-it-hasnt-happened/ (accessed Mar. 24, 2016).
  5. Dey, J.; Rosenoff, E.; West, K. Benefits of Medicaid Expansion for Behavioral Health. https://aspe.hhs.gov/sites/default/files/pdf/190506/BHMedicaidExpansion.pdf (accessed Mar. 28, 2016).
  6. Taskiran, Sarper. Interview by Rishi Suresh. Istanbul, Mar. 3, 2016.
  7. Gonen, Oner Gurkan. Interview by Rishi Suresh. Houston, Apr. 1, 2016.
  8. Assertive Community Treatment. https://www.centerforebp.case.edu/practices/act (accessed Jan. 2017).

Telomeres: Ways to Prolong Life

Two hundred years ago, the average life expectancy hovered between 30 and 40 years, as it had for centuries before. Medical knowledge was largely limited to superstition and folk cures, and the science behind what actually caused disease and death was lacking. Since then, the average human lifespan has skyrocketed thanks to scientific advancements in health care, such as the understanding of bacteria and infection. Today, new discoveries are being made in cellular biology which, in theory, could lead us to the next revolutionary leap in lifespan. Most promising among these recent discoveries are the manipulation of telomeres to slow the aging process and the use of telomerase to identify cancerous cells.

Before understanding how telomeres can be utilized to increase the average lifespan of humans, it is essential to understand what a telomere is. When cells divide, their DNA must be copied so that all of the cells share an identical DNA sequence. However, the DNA cannot be copied all the way to the end of the strand, resulting in the loss of some DNA at the end of the sequence with every single replication.1 To prevent valuable genetic code from being cut off during cell division, our DNA contains telomeres, a meaningless combination of nucleotides at the end of our chromosomal sequences that can be cut off without consequences to the meaningful part of the DNA. Repeated cell replication causes these protective telomeres to become shorter and shorter, until valuable genetic code is eventually cut off, causing the cell to malfunction and ultimately die.1 The enzyme telomerase functions in cells to rebuild these constantly degrading telomeres, but its activity is relatively low in normal cells as compared to cancer cells.2
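The shortening-and-rebuilding dynamic described above can be sketched numerically. The figures below (starting telomere length, base pairs lost per division, critical length, telomerase re-extension) are illustrative assumptions, not measured values; real telomere lengths and attrition rates vary widely by species and cell type.

```python
# Minimal sketch of the end-replication problem (all numbers are assumed,
# illustrative values, not biological measurements).
TELOMERE_BP = 10_000      # assumed starting telomere length in base pairs
LOSS_PER_DIVISION = 100   # assumed base pairs lost each replication
CRITICAL_BP = 4_000       # assumed length below which the cell malfunctions

def divisions_until_senescence(telomere_bp=TELOMERE_BP,
                               loss=LOSS_PER_DIVISION,
                               critical=CRITICAL_BP,
                               telomerase_bp=0):
    """Count divisions until the telomere erodes past the critical length.

    telomerase_bp models re-extension by telomerase each division; if it
    fully offsets the loss, division is effectively unlimited (capped here
    so the loop terminates).
    """
    divisions = 0
    while telomere_bp > critical and divisions < 1_000:
        telomere_bp -= loss - telomerase_bp
        divisions += 1
    return divisions

print(divisions_until_senescence())                   # → 60 divisions
print(divisions_until_senescence(telomerase_bp=100))  # hits the safety cap
```

With these toy numbers, a cell lacking telomerase activity runs out of protective telomere after a fixed number of divisions, while full telomerase activity (as in cancer cells) keeps the telomere length constant indefinitely.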

The applications of telomerase manipulation have emerged fairly recently, following the discovery of the functionality of both telomeres and telomerase in the mid-1980s by Nobel Prize winners Elizabeth Blackburn, Carol Greider, and Jack Szostak.3 Blackburn discovered a sequence at the end of chromosomes that was repeated several times, but could not determine its purpose. At the same time, Szostak was observing the degradation of minichromosomes, chromatin-like structures that replicate during cell division when introduced into a yeast cell. Together, they combined their work by isolating Blackburn's repeating DNA sequences, attaching them to Szostak's minichromosomes, and then placing the minichromosomes back inside yeast cells. With the new addition to their DNA sequence, the minichromosomes did not degrade as they had before, proving that the purpose of the repeating DNA sequence, dubbed the telomere, was to protect the chromosome and delay cellular aging.

Because of the relationship between telomeres and cellular aging, many scientists theorize that cell longevity could be enhanced by finding a way to control telomere degradation and keep protective caps on the end of cell DNA indefinitely.1 Were this to be accomplished, the cells would be able to divide an infinite number of times before they started to lose valuable genetic code, which would theoretically extend the life of the organism as a whole.

In addition, studies of telomeres have revealed new ways of combating cancer. Although there are many subtypes of cancer, all involve the uncontrollable, rapid division of cells. Despite this rapid division, the telomeres of cancer cells do not shorten like those of a normal cell; otherwise, such rapid division would be impossible. Cancer cells are likely able to maintain their telomeres due to their higher levels of telomerase.3 This knowledge allows scientists to use telomerase levels as an indicator of cancerous cells, and then to target these cells. Vaccines that target telomerase production have the potential to be the newest weapon in combating cancer.2 Cancerous cells continue to proliferate at an uncontrollable rate even when telomerase production is interrupted. However, without telomerase to protect their telomeres from degradation, these cells eventually die.

As the scientific community advances its ability to control telomeres, it comes closer to controlling the process of cellular reproduction, one of the many factors associated with human aging and cancerous cells. With knowledge in these areas continuing to develop, the possibility of completely eradicating cancer and slowing the aging process is becoming more and more realistic.

References

  1. Genetic Science Learning Center. Learn.Genetics. http://learn.genetics.utah.edu (accessed Oct. 5, 2016).
  2. Shay, J. W.; Wright, W. E. Nat. Rev. Drug Discov. [Online] 2006, 5. http://www.nature.com/nrd/journal/v5/n7/full/nrd2081.html (accessed Oct. 16, 2016).
  3. The 2009 Nobel Prize in Physiology or Medicine - Press Release. The Nobel Prize. https://www.nobelprize.org/nobel_prizes/medicine/laureates/2009/press.html (accessed Oct. 4, 2016).

The Health of Healthcare Providers

A car crash. A heart attack. A drug overdose. No matter what time of day, where you are, or what your problem is, emergency medical technicians (EMTs) will be on call and ready to come to your aid. These health care providers are charged with providing quality care to maintain or improve patient health in the field, and their efforts have saved the lives of many who could not otherwise find care on their own. While these EMTs deserve praise and respect for their line of work, what they deserve even more is consideration for the health issues that they themselves face. Emergency medical technicians suffer from a host of long-term health issues, including weight gain, burnout, and psychological changes.

The daily "schedule" of an EMT is probably best characterized by its variability and unpredictability. The entirety of their day is a summation of what everyone in their area is doing, those people's health issues, and the uncertainty of life itself. While there are start and end times to their shifts, even these are not hard and fast: shifts can start early or end late based on when people call 911. An EMT can spend an entire shift on the ambulance, without time to eat a proper meal or get any sleep. These healthcare providers learn to catch a few minutes of sleep here and there when possible. Their yearly schedules are also unpredictable, with lottery systems in place to ensure that someone is working every day, at all hours, while maintaining some fairness. Most services run either 12- or 24-hour shifts, and this lottery system can leave EMTs with stacked shifts that are back to back or at least in close proximity to one another. This only increases the likelihood of sleep disorders, with 70 percent of EMTs reporting at least one sleep problem.1 While many people have experienced the effects of exhaustion and burnout due to a lack of sleep, few can say that their entire professional career has been characterized by these feelings. EMTs have been shown to be more than twice as likely as control groups to have moderate to high scores on the Epworth Sleepiness Scale (ESS), which correlates with a greater likelihood of falling asleep during daily activities such as conversing, sitting in public places, and driving.1 The restriction and outright deprivation of sleep among EMTs has been shown to cause a large variety of health problems, and appears to be the main factor in the decline of both their physical and mental health.

A regular amount of sleep is essential to maintaining a healthy body. Reduced sleep has been associated with weight gain, cardiovascular disease, and weakened immune function. Studies have shown that, at least in men, short sleep durations are linked to weight gain and obesity, potentially due to alterations in the hormones that regulate appetite.2,3 Given this trend, it is no surprise that a 2009 study found that sleep durations deviating from an ideal 7-8 hours, as well as frequent insomnia, increased the risk of cardiovascular disease. The fact that EMTs often have poor diets compounds that risk. An EMT needs to be ready to respond around the clock, which leaves little time to sit down and have a proper meal. Fast food becomes the meal of choice because it is convenient and quick. Some hospitals have attempted to improve upon this shortcoming in the emergency medical service (EMS) world by providing snacks and drinks at the hospital. This, however, creates a different issue: the snacks are high in calories. The body generally senses fullness by detecting stretch in the stomach, which signals the brain that enough food has been consumed. In a balanced diet, much of this space should be filled with fruits, vegetables, and other low-calorie foods, unless the person is an athlete with much higher energy needs. By eating smaller, high-calorie items, an EMT must eat more to feel full, exceeding their recommended daily calories in the process. The extra energy is often stored as fat, compounding the weight gain caused by sleep deprivation. Studies of the effects of restricted sleep on the immune system are less common, but one experiment demonstrated markers of systemic inflammation, which could, again, lead to cardiovascular disease and obesity.2
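The volume-versus-calories argument can be made concrete with a back-of-the-envelope calculation. The energy densities and meal volume below are assumed, illustrative figures chosen only to show the arithmetic, not dietary data.

```python
# Comparing equal-volume meals by energy density (all values are assumed,
# illustrative figures in kcal per mL; real foods vary widely).
ENERGY_DENSITY = {
    "vending_snacks": 2.0,   # e.g., chips, candy bars, pastries
    "balanced_meal": 0.8,    # e.g., lean protein, grains, vegetables
}
MEAL_VOLUME_ML = 900         # assumed stomach volume needed to feel full

for meal, density in ENERGY_DENSITY.items():
    kcal = density * MEAL_VOLUME_ML
    print(f"{meal}: {kcal:.0f} kcal to reach the same fullness")
```

Filling the same stomach volume with calorie-dense snacks yields more than double the calories of a balanced meal, which is the mechanism behind the weight gain described above.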

Mental health is not spared from complications due to long waking periods with minimal sleep. A study testing the cognitive abilities of subjects under varying amounts of sleep restriction showed that less sleep led to cognitive deficits, and that being awake for more than 16 hours led to deficits regardless of how much sleep the subject had gotten.4 This finding affects both the EMTs, who can injure themselves, and the patients, who may suffer from errors made in the field. First-year physicians, who similarly can work shifts longer than 24 hours, face an increased risk of automobile crashes and percutaneous (skin) injuries when sleep deprived.5 These injuries often happen when leaving a shift: a typical EMT shift lasts from one morning to the next, and the EMT leaves during rush hour on little to no sleep, raising the danger of falling asleep or dozing at the wheel. A similar study examined extended-duration work in critical-care units and found that long shifts increased the risk of medical errors and lapses in attention.6 In addition to the more direct mental health problems posed by the continuous strain, EMTs and others in the healthcare field also face more personal issues, including burnout and changes in behavior. A study of pediatric residents, who face similar stress and workloads, established that 20% of participants were suffering from depression and 75% met the criteria for burnout, both of which led to medical errors at work.7 A separate study found that emergency physicians suffering from burnout also reported high emotional exhaustion, depersonalization, and a low sense of accomplishment.8 While many go into the healthcare field to help others, exhaustion and desensitization create a sort of cynicism that defends against the enormous emotional burden of treating patients day in and day out.

Sleep deprivation, long work hours, and the stress that comes with the job create a poor environment for the physical and mental health of emergency medical technicians and other healthcare providers. However, a recent study has shown that downtime, especially after dealing with critical patients, leads to lower rates of depression and acute stress in EMTs.9 While this does not necessarily ameliorate post-traumatic stress or burnout, it is a start toward addressing the situation. Other possible interventions include providing balanced meals at hospitals that are readily available to EMTs, as well as an improved scheduling system that prevents or limits back-to-back shifts. These concepts can apply to others facing high workloads with abnormal sleep schedules as well, including college students, who are also at risk for mood disorders and a poorer quality of life due to the rigors of college life.10

References

  1. Pirrallo, R. G. et al. International Journal of the Science and Practice of Sleep Medicine. 2012, 16, 149-162.
  2. Banks, S. et al. J. Clin. Sleep Med. 2007, 3(5), 519-528.
  3. Watanabe, M. et al. Sleep  2010, 33(2), 161-167.
  4. Van Dongen, H. P. et al. Sleep 2003, 26(2), 117-126.
  5. Ayas, N. T. et al. JAMA 2006, 296(9), 1055-1062.
  6. Barger, L. K. et al. PLoS Med. [Online] 2006, 3(12), e487. https://dx.doi.org/10.1371%2Fjournal.pmed.0030487 (accessed Oct. 3, 2016)
  7. Fahrenkopf, A. M. et al. BMJ [Online] 2008, 336, 488. http://dx.doi.org/10.1136/bmj.39469.763218.BE (accessed Oct. 3, 2016)
  8. Ben-Itzhak, S. et al. Clin. Exp. Emerg. Med. 2015, 2(4), 217-225.
  9. Halpern, J. et al. Biomed. Res. Int. [Online] 2014, 2014. http://dx.doi.org/10.1155/2014/483140 (accessed Oct. 3, 2016)
  10. Singh, R. et al. J. Clin. Diagn. Res. [Online] 2016, 10(5), JC01-JC05. https://dx.doi.org/10.7860%2FJCDR%2F2016%2F19140.7878 (accessed Oct 3, 2016)

The Creation of Successful Scaffolds for Tissue Engineering

Abstract

Tissue engineering is a broad field with applications ranging from pharmaceutical testing to total organ replacement. Recently, there has been extensive research on creating tissue that is able to replace or repair natural human tissue. Much of this research focuses on the creation of scaffolds that can both support cell growth and successfully integrate with the surrounding tissue. This article will introduce the concept of a scaffold for tissue engineering; discuss key areas of research including biomolecule use, vascularization, mechanical strength, and tissue attachment; and introduce some important recent advancements in these areas.

Introduction

Tissue engineering relies on four main factors: the growth of appropriate cells, the introduction of the proper biomolecules to these cells, the attachment of the cells to an appropriate scaffold, and the application of specific mechanical and biological forces to develop the completed tissue.1

Successful cell culture has been possible since the 1960s, but these early methods lacked the adaptability necessary to make functioning tissues. With the introduction of induced pluripotent stem cells in 2008, however, researchers no longer face the resource limitations previously encountered. As a result, growing cells of a desired type no longer limits tissue engineering research and thus warrants less concern than other factors in contemporary tissue engineering.2,3

Similarly, the introduction of essential biomolecules (such as growth factors) to the developing tissue has generally not restricted modern tissue engineering efforts. Extensive research and knowledge of biomolecule function as well as relatively reliable methods of obtaining important biomolecules have allowed researchers to make engineered tissues more successfully emulate functional human tissue using biomolecules.4,5 Despite these advancements in information and procurement methods, however, the ability of biomolecules to improve engineered tissue often relies on the structure and chemical composition of the scaffold material.6

Cellular attachment has also been a heavily explored field of research. This refers specifically to the ability of the engineered tissue to seamlessly integrate into the surrounding tissue. Studies in cellular attachment often focus on qualities of scaffolds such as porosity as well as the introduction of biomolecules to encourage tissue union on the cellular level. Like biomolecule effectiveness, successful cellular attachment depends on the material and structure of the tissue scaffolding.7

Also critical to developing functional tissue is exposing it to the right environment. This development of tissue properties via the application of mechanical and biological forces depends strongly on finding materials that can withstand the required forces while supplying cells with the necessary environment and nutrients. Previous research in this area has examined several scaffold materials for various reasons. However, improvements to the materials or to the specific methods of development are still greatly needed in order to create functional implantable tissue. Because of the difficulty of conducting research in this area, devoted efforts to improving these methods remain critical to successful tissue engineering.

In order for a scaffold to be capable of supporting cells until the formation of a functioning tissue, it is necessary to satisfy several key requirements, principally introduction of helpful biomolecules, vascularization, mechanical function, appropriate chemical and physical environment, and compatibility with surrounding biological tissue.8,9 Great progress has been made towards satisfying many of these conditions, but further research in the field of tissue engineering must address challenges with existing scaffolds and improve their utility for replacing or repairing human tissue.

Key Research Areas of Scaffolding Design

Biomolecules

Throughout most early tissue engineering projects, researchers focused on simple cell culture surrounding specific material scaffolds.10 Promising developments such as the creation of engineered cartilage motivated further funding and interest in research. However, these early efforts missed out on several crucial factors to tissue engineering that allow implantable tissue to take on more complex functional roles. In order to create tissue that is functional and able to direct biological processes alongside nearby natural tissue, it is important to understand the interactions of biomolecules with engineered tissue.

Because the ultimate goal of tissue engineering is to create functional, implantable tissue that mimics biological systems, most important biomolecules have already been explored by researchers in medical fields outside of tissue engineering. As a result, a solid body of research exists describing the functions and interactions of various biomolecules. Given this existing information, understanding their potential uses in tissue engineering relies mainly on studying the interactions of biomolecules with materials that are not native to the body; most commonly, these non-biological materials are used as scaffolding. To complicate the topic further, biomolecules are a broad category encompassing everything from DNA to glucose to proteins. As such, it is most productive to focus on those that interact closely with engineered tissue.

One type of biomolecule that is subject to much research and speculation in current tissue engineering is the growth factor.11 Specific growth factors can have a variety of functions from general cell proliferation to the formation of blood cells and vessels.12-14 They can also be responsible for disease, especially the unchecked cell generation of cancer.15 Many of the positive roles have direct applications to tissue engineering. For example, Transforming Growth Factor-beta (TGF-β) regulates normal growth and development in humans.16 One study found that while addition of ligands to engineered tissue could increase cellular adhesion to nearby cells, the addition also decreased the generation of the extracellular matrix, a key structure in functional tissue.17 To remedy this, the researchers then tested the same method with the addition of TGF-β. They saw a significant increase in the generation of the extracellular matrix, improving their engineered tissue’s ability to become functional faster and more effectively. Clearly, a combination of growth factors and other tissue engineering methods can lead to better outcomes for functional tissue engineering.

With the utility of growth factors established, delivery methods become very important. Several methods have been shown to be effective, including delivery in a gelatin carrier.18 However, some of the most promising procedures rely on the scaffolding's properties. One set of studies mimicked the natural release of growth factors through the extracellular matrix by creating a nanofiber scaffold containing growth factors for delayed release.19 The study saw a positive influence on the behavior of cells as a result of the release of growth factor. Other methods vary physical properties of the scaffold, such as pore size, to trigger immune pathways that release regenerative growth factors, as will be discussed later. The use of biomolecules, and specifically growth factors, is heavily linked to the choice of scaffolding material and can be critical to the success of an engineered tissue.

Vascularization

Because almost no tissue can survive without proper oxygenation, engineered tissue vascularization has been a focus of many researchers in recent years to optimize the chances of engineered tissue success.20 For many of the areas of advancement, this process depends on the scaffold.21 The actual requirements for the level and complexity of vasculature vary greatly based on the type of tissue; the requirements for blood flow in the highly vascularized lungs are different from those of cortical bone.22,23 It is therefore more appropriate here to address the methods that have been developed for creating vascularized tissue rather than the designs of specific tissues.

One method that has shown great promise is the use of modified 3D printers to cast vascularized tissue.24 This method uses the relatively new printing technology to create carbohydrate glass networks in the form of the desired vascular network. The network is then coated with a hydrogel scaffold to allow cells to grow. The carbohydrate glass is then dissolved from inside of the hydrogel, leaving an open vasculature in a specific shape. This method has been successful in achieving cell growth in areas of engineered tissue that would normally undergo necrosis. Even more remarkably, the created vasculature showed the ability to branch into a more complex system when coated with endothelial cells.24

However, this method is not always applicable. Many tissue types require scaffolds that are more rigid or have different properties than hydrogels. In this case, researchers have focused on the effect of a material’s porosity on angiogenesis.7,25 Several key factors have been identified for blood vessel growth, including pore size, surface area, and endothelial cell seeding similar to that which was successful in 3D printed hydrogels. Of course, many other methods are currently being researched based on a variety of scaffolds. Improvements on these methods, combined with better research into the interactions of vascularization with biomaterial attachment, show great promise for engineering complex, differentiated tissue.

Mechanical Strength

Research has consistently demonstrated that large-scale cell culture is not limiting to bioengineering. With the introduction of technology like bioreactors or three-dimensional cell culture plates, growing cells of the desired qualities and in the appropriate form continues to become easier for researchers; this in turn allows for a focus on factors beyond simply gathering the proper types of cells.2 This is important because most applications in tissue engineering require more than just the ability to create groupings of cells—the cells must have a certain degree of mechanical strength in order to functionally replace tissue that experiences physical pressure.

The mechanical strength of a tissue is a result of many developmental factors and can be classified in different ways, often based on the type of force applied to the tissue or the amount of force the tissue is able to withstand. Regardless, mechanical strength of a tissue primarily relies on the physical strength of the tissue and its ability for its cells to function under an applied pressure; these are both products of the material and fabrication methods of the scaffolding used. For example, scaffolds in bone tissue engineering are often measured for compressive strength. Studies have found that certain techniques, such as cooking in a vacuum oven, may increase compressive strength.26 One group found that they were able to match the higher end of the possible strength of cancellous (spongy) bone via 3D printing by using specific molecules within the binding layers.27 This simple change resulted in scaffolding that displayed ten times the mechanical strength of scaffolding with traditional materials, a value within the range for natural bone. Additionally, the use of specific binding agents between layers of scaffold resulted in increased cellular attachment, the implications of which will be discussed later.27 These changes result in tissue that is more able to meet the functional requirements and therefore to be easily used as a replacement for bone. Thus, simple changes in materials and methods used can drastically increase the mechanical usability of scaffolds and often have positive effects on other important qualities for certain types of tissue.

Clearly, not all engineered tissues require the mechanical strength of bone; for contrast, the brain experiences less than one kPa of pressure, compared with the roughly 10^6 kPa that bone experiences.28 Thus, not all scaffolds must support the same pressures, and scaffolds must be made to accommodate these structural differences. Additionally, other tissues may experience forces such as tension or torsion depending on their locations within the body. Mechanical properties must therefore be considered on a tissue-by-tissue basis when determining the corresponding scaffold structure. That said, mechanical limitations are a primary factor only in engineered bone, cartilage, and cardiovascular tissue, the last of which has significantly more complicated mechanical requirements.29

Research in the past few years has investigated increasingly complex aspects of scaffold design and their effects on macroscopic physical properties. For example, it is generally accepted that pore size and the related surface area within engineered bone replacements are key to cellular attachment. However, recent advances in scaffold fabrication techniques have allowed researchers to investigate very specific properties of these pores, such as their individual geometry. One recent study found that using an inverse opal geometry for pores (an architecture known for its high strength in materials engineering) led to a doubling of mineralization within a bone engineering scaffold.30 Mineralization is a crucial quality of bone because of its contribution to compressive strength.31 This result is important because it demonstrates the recent ability of researchers to alter scaffolds on a microscopic level in order to effect macroscopic changes in tissue properties.

Attachment to Nearby Tissue

Even with an ideal design, a tissue’s success as an implant relies on its ability to integrate with the surrounding tissue. For some types of tissue, this is simply a matter of avoiding rejection by the host through an immune response.32 In these cases, it is important to choose materials with specific consideration for reducing this immune response. Over the past several decades, it has been shown that the key requirement for biocompatibility is the use of materials that are nearly biologically inert and thus do not trigger a negative response from natural tissue.33 This strategy focuses on minimizing the immune response of the tissue surrounding the implant in order to avoid complications, such as inflammation, that might be detrimental to the patient. It has been relatively effective for implants ranging from total joint replacements to heart valves.

Avoiding a negative immune response has proven successful for some medical fields. However, more complex solutions involving a guided immune response might be necessary for engineered tissue implants to survive and take on the intended function. This issue of balancing biochemical inertness and tissue survival has led researchers to investigate the possibility of using the host immune response in an advantageous way for the success of the implant.34 This method of intentionally triggering surrounding natural tissue relies on the understanding that immune response is actually essential to tissue repair. While an inert biomaterial may be able to avoid a negative reaction, it will also discourage a positive reaction. Without provoking some sort of response to the new tissue, an implant will remain foreign to bordering tissue; this means that the cells cannot take on important functions, limiting the success of any biomaterial that has more than a mechanical use.

Current studies have focused primarily on modifying surface topography and chemistry to target a positive immune reaction in the cells surrounding the new tissue. One example is the grafting of oligopeptides onto the surface of an implant to stimulate macrophage response. This method ultimately leads to the release of growth factors and greater levels of cellular attachment because of the chemical signals involved in the natural immune response.35 Another study found that the use of a certain pore size in the scaffold material led to faster and more complete healing in an in vivo study using rabbits. Upon further investigation, it was found that the smaller pore size was interacting with macrophages involved in the triggered immune response; this interaction ultimately led more macrophages to differentiate into a regenerative pathway, leading to better and faster healing of the implant with the surrounding tissue.36 Similar studies have investigated the effect of methods such as attaching surface proteins with similarly enlightening results. These and other promising studies have led to an increased awareness of chemical signaling as a method to enhance biomaterial integration with larger implications including faster healing time and greater functionality.

Conclusion

The use of scaffolds for tissue engineering has been the subject of much research because of its potential for extensive utilization in the medical field. Recent advancements have focused on several areas, particularly the use of biomolecules, improved vascularization, increases in mechanical strength, and attachment to existing tissue. Advancements in each of these fields have been closely related to the use of scaffolding. Several biomolecules, especially growth factors, have led to a greater ability for tissue to adapt as an integrated part of the body after implantation. These growth factors rely on efficient means of delivery, notably through inclusion in the scaffold, in order to have an effect on the tissue. The development of new methods and refinement of existing ones has allowed researchers to successfully vascularize tissue on multiple types of scaffolds. Likewise, better methods of strengthening engineered tissue scaffolds before cell growth and implantation have allowed for improved functionality, especially under mechanical forces. Modifications to scaffolding and the addition of special molecules have allowed for increased cellular attachment, improving the efficacy of engineered tissue for implantation. Further advancement in each of these areas could lead to more effective scaffolds and the ability to successfully use engineered tissue for functional implants in medical treatments.

References

  1. “Tissue Engineering and Regenerative Medicine.” National Institute of Biomedical Imaging and Bioengineering. N.p., 22 July 2013. Web. 29 Oct. 2016.
  2. Haycock, John W. “3D Cell Culture: A Review of Current Approaches and Techniques.” 3D Cell Culture: Methods and Protocols. Ed. John W. Haycock. Totowa, NJ: Humana Press, 2011. 1–15. Web.
  3. Takahashi, Kazutoshi, and Shinya Yamanaka. “Induction of Pluripotent Stem Cells from Mouse Embryonic and Adult Fibroblast Cultures by Defined Factors.” Cell 126.4 (2006): 663–676. ScienceDirect. Web.
  4. Richardson, Thomas P. et al. “Polymeric System for Dual Growth Factor Delivery.” Nat Biotech 19.11 (2001): 1029–1034. Web.
  5. Liao, IC, SY Chew, and KW Leong. “Aligned Core–shell Nanofibers Delivering Bioactive Proteins.” Nanomedicine 1.4 (2006): 465–471. Print.
  6. Elliott Donaghue, Irja et al. “Cell and Biomolecule Delivery for Tissue Repair and Regeneration in the Central Nervous System.” Journal of Controlled Release: Official Journal of the Controlled Release Society 190 (2014): 219–227. PubMed. Web.
  7. Murphy, Ciara M., Matthew G. Haugh, and Fergal J. O’Brien. “The Effect of Mean Pore Size on Cell Attachment, Proliferation and Migration in Collagen–glycosaminoglycan Scaffolds for Bone Tissue Engineering.” Biomaterials 31.3 (2010): 461–466. Web.
  8. Sachlos, E., and J. T. Czernuszka. “Making Tissue Engineering Scaffolds Work. Review: The Application of Solid Freeform Fabrication Technology to the Production of Tissue Engineering Scaffolds.” European Cells & Materials 5 (2003): 29–39. Print.
  9. Chen, Guoping, Takashi Ushida, and Tetsuya Tateishi. “Scaffold Design for Tissue Engineering.” Macromolecular Bioscience 2.2 (2002): 67–77. Wiley Online Library. Web.
  10. Vacanti, Charles A. 2006. “The history of tissue engineering.” Journal of Cellular and Molecular Medicine 10 (3): 569-576.
  11. Depprich, Rita A. “Biomolecule Use in Tissue Engineering.” Fundamentals of Tissue Engineering and Regenerative Medicine. Ed. Ulrich Meyer et al. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. 121–135. Web.
  12. Laiho, Marikki, and Jorma Keski-Oja. “Growth Factors in the Regulation of Pericellular Proteolysis: A Review.” Cancer Research 49.10 (1989): 2533. Print.
  13. Morstyn, George, and Antony W. Burgess. “Hemopoietic Growth Factors: A Review.” Cancer Research 48.20 (1988): 5624. Print.
  14. Yancopoulos, George D. et al. “Vascular-Specific Growth Factors and Blood Vessel Formation.” Nature 407.6801 (2000): 242–248. Web.
  15. Aaronson, SA. “Growth Factors and Cancer.” Science 254.5035 (1991): 1146. Web.
  16. Lawrence, DA. “Transforming Growth Factor-Beta: A General Review.” European cytokine network 7.3 (1996): 363–374. Print.
  17. Mann, Brenda K, Rachael H Schmedlen, and Jennifer L West. “Tethered-TGF-β Increases Extracellular Matrix Production of Vascular Smooth Muscle Cells.” Biomaterials 22.5 (2001): 439–444. Web.
  18. Malafaya, Patrícia B., Gabriela A. Silva, and Rui L. Reis. “Natural–origin Polymers as Carriers and Scaffolds for Biomolecules and Cell Delivery in Tissue Engineering Applications.” Matrices and Scaffolds for Drug Delivery in Tissue Engineering 59.4–5 (2007): 207–233. Web.
  19. Sahoo, Sambit et al. “Growth Factor Delivery through Electrospun Nanofibers in Scaffolds for Tissue Engineering Applications.” Journal of Biomedical Materials Research Part A 93A.4 (2010): 1539–1550. Web.
  20. Novosel, Esther C., Claudia Kleinhans, and Petra J. Kluger. “Vascularization Is the Key Challenge in Tissue Engineering.” From Tissue Engineering to Regenerative Medicine- The Potential and the Pitfalls 63.4–5 (2011): 300–311. Web.
  21. Drury, Jeanie L., and David J. Mooney. “Hydrogels for Tissue Engineering: Scaffold Design Variables and Applications.” Synthesis of Biomimetic Polymers 24.24 (2003): 4337–4351. Web.
  22. Lafage-Proust, Marie-Helene et al. “Assessment of Bone Vascularization and Its Role in Bone Remodeling.” BoneKEy Rep 4 (2015): n. pag. Web.
  23. Türkvatan, Aysel et al. “Multidetector CT Angiography of Renal Vasculature: Normal Anatomy and Variants.” European Radiology 19.1 (2009): 236–244. Web.
  24. Miller, Jordan S. et al. “Rapid Casting of Patterned Vascular Networks for Perfusable Engineered Three-Dimensional Tissues.” Nature Materials 11.9 (2012): 768–774. www.nature.com. Web.
  25. Lovett, Michael et al. “Vascularization Strategies for Tissue Engineering.” Tissue Engineering. Part B, Reviews 15.3 (2009): 353–370. Web.
  26. Cox, Sophie C. et al. “3D Printing of Porous Hydroxyapatite Scaffolds Intended for Use in Bone Tissue Engineering Applications.” Materials Science and Engineering: C 47 (2015): 237–247. ScienceDirect. Web.
  27. Fielding, Gary A., Amit Bandyopadhyay, and Susmita Bose. “Effects of Silica and Zinc Oxide Doping on Mechanical and Biological Properties of 3D Printed Tricalcium Phosphate Tissue Engineering Scaffolds.” Dental Materials 28.2 (2012): 113–122. ScienceDirect. Web.
  28. Engler, Adam J. et al. “Matrix Elasticity Directs Stem Cell Lineage Specification.” Cell 126.4 (2006): 677–689. ScienceDirect. Web.
  29. Bilodeau, Katia, and Diego Mantovani. “Bioreactors for Tissue Engineering: Focus on Mechanical Constraints. A Comparative Review.” Tissue Engineering 12.8 (2006): 2367–2383. online.liebertpub.com (Atypon). Web.
  30. Sommer, Marianne R. et al. “Silk Fibroin Scaffolds with Inverse Opal Structure for Bone Tissue Engineering.” Journal of Biomedical Materials Research Part B: Applied Biomaterials (2016): n. pag. Wiley Online Library. Web.
  31. Sapir-Koren, Rony, and Gregory Livshits. “Bone Mineralization and Regulation of Phosphate Homeostasis.” IBMS BoneKEy 8.6 (2011): 286–300. www.nature.com. Web.
  32. Boehler, Ryan M., John G. Graham, and Lonnie D. Shea. “Tissue Engineering Tools for Modulation of the Immune Response.” BioTechniques 51.4 (2011): 239–passim. PubMed Central. Web.
  33. Follet, H. et al. “The Degree of Mineralization Is a Determinant of Bone Strength: A Study on Human Calcanei.” Bone 34.5 (2004): 783–789. PubMed. Web.
  34. Franz, Sandra et al. “Immune Responses to Implants – A Review of the Implications for the Design of Immunomodulatory Biomaterials.” Biomaterials 32.28 (2011): 6692–6709. ScienceDirect. Web.
  35. Kao, Weiyuan John, and Damian Lee. “In Vivo Modulation of Host Response and Macrophage Behavior by Polymer Networks Grafted with Fibronectin-Derived Biomimetic Oligopeptides: The Role of RGD and PHSRN Domains.” Biomaterials 22.21 (2001): 2901–2909. ScienceDirect. Web.
  36. Bryers, James D, Cecilia M Giachelli, and Buddy D Ratner. “Engineering Biomaterials to Integrate and Heal: The Biocompatibility Paradigm Shifts.” Biotechnology and bioengineering 109.8 (2012): 1898–1911. Web.

Zika and Fetal Viruses: Sharing More Than A Motherly Bond

Zika is a blood-borne pathogen primarily transmitted through mosquito bites and sexual activities. Pregnant women infected by Zika can pass the virus to their fetus, causing microcephaly, a condition in which the baby has an abnormally small head indicative of abnormal brain development. With the outbreak of the Zika virus and its consequences for pregnant women and their babies, much research has focused on how the infection leads to microcephaly in fetuses.

Current Zika research has focused on uncovering methods for early detection of Zika in pregnant women and on educating the public about safe sexual practices, so that transmission is limited to the mosquito vector.1 However, to truly end the Zika epidemic, three critical steps need to be taken. First, researchers must determine the point at which maternal infection harms the neurological development of the fetus, to ensure treatment is administered to mothers before the brain damage becomes irreversible. Second, researchers must determine the mechanism through which Zika spreads from mother to fetus. Only then can researchers begin developing therapies to protect the fetus once the mother is already infected and start creating a preventative vaccine. Although Zika seems like a mysterious new illness, there are several other well-studied viral infections that affect pregnancies, such as cytomegalovirus (CMV). CMV infection during pregnancy also leads to severe fetal brain damage. Previous research techniques could provide clues for researchers trying to understand more about Zika, and learning more about Zika will better equip us to handle prenatal viral outbreaks in the future.

Currently, microcephaly in fetuses of Zika-infected mothers is detected by fetal ultrasound as early as 18 weeks into gestation.2 However, this is a late diagnosis of fetal Zika infection, and at that point the brain abnormalities caused by the virus are irreversible. Ultrasounds and MRI scans of infants with confirmed CMV infection can detect such neurological abnormalities as well.3 These brain lesions are likewise irreversible, making early detection a necessity for CMV infection too. Fortunately, the presence of CMV or CMV DNA in amniotic fluid can be used for early diagnosis, and current treatment options include administration of valacyclovir or hyperimmunoglobulin in the window before the fetus develops brain lesions.4 Researchers must similarly try to identify fetal Zika infection as early as possible rather than relying on fetal microcephaly as the sole diagnostic tool. One potential early-detection method is testing for Zika in the urine of pregnant women as soon as Zika symptoms appear, as opposed to screening the fetus for infection.5

Discovering the mechanism through which Zika infects the fetus is necessary to develop therapies that protect the fetus from infection. Many viruses transferred to the fetus during pregnancy do so by compromising the immune function of the placental barrier, allowing the virus to cross the placenta and infect the fetus. The syncytiotrophoblast is the epithelial covering of the placental villi, highly vascular finger-like projections that increase the surface area available for the exchange of nutrients and wastes between mother and fetus.6 One study found that infection of extravillous trophoblast cells decreased the immune function of the placenta, which increased fetal susceptibility to infection.7 Determining which cells in the placenta are infected by Zika could aid research into preventative treatments for fetal infection.

Since viruses that cross the placental barrier are able to infect the fetus, understanding the interaction between immune cells and the placental barrier is important for developing therapies against Zika that increase fetal viral resistance. In one study, researchers found that primary human trophoblast cells use cell-derived vesicles called exosomes to transfer miRNA, conferring the placenta's immune resistance to a multitude of viruses on other pregnancy-related cells.8 miRNAs regulate gene expression, and different miRNAs exist in different cells so that those cells have specific functions and defenses. Isolating these miRNA-carrying exosomes, using them to supplement placental cell strains, and then testing whether those cells are more or less susceptible to Zika could support the development of drugs that bolster the placenta's existing immune defense mechanism. Since viral diseases that cross the placenta lead to poor fetal outcomes, developing protective measures for the placenta is imperative, not only against Zika but also against new viruses for which no vaccine exists.9

Combating new and elusive viral outbreaks is difficult, and understanding and preventing viral infection in fetuses can feel like taking a shot in the dark. Although the prospects for infants infected by Zika are currently poor, combining the research done on other congenital infections paints a more complete picture of viral transmission during pregnancy. Instead of starting from scratch, scientists can use this information to determine which tests can detect Zika, which organs to examine for compromised immune function, and which types of treatment have a higher probability of effectiveness. Zika will not be the last virus that causes birth defects, but by combining the efforts of many scientists, we can get closer to stopping fetal viral infection once and for all.

References

  1. Wong, K. V. J. Epidemiol. Public Health Rev. 2016, 1.
  2. Mlakar, J., et al. N. Engl. J. Med. 2016, 374, 951-958.
  3. Malinger, G., et al. Am. J. Neuroradiol. 2003, 24, 28-32.
  4. Leruez-Ville, M., et al. Am. J. Obstet. Gynecol. 2016, 215, 462.
  5. Gourinat, A. C., et al. Emerg. Infect. Dis. 2015, 21, 84-86.
  6. Delorme-Axford, E., et al. Proc. Natl. Acad. Sci. 2013, 110, 12048-12053.
  7. Zhang, J.; Parry, S. Ann. N. Y. Acad. Sci. 2001, 943, 148-156.
  8. Mouillet, J. F., et al. Int. J. Dev. Bio. 2014, 58, 281.
  9. Mor, G.; Cardenas I. Am. J. Reprod. Immunol. 2010, 63, 425-433.

Wearable Tech is the New Black

What if our clothes could detect cancer? That may seem like a far-fetched, “only applicable in a sci-fi universe” type of concept, but such clothes do exist, and similar devices that merge technology and medicine are quite prominent today. The wearable technology industry, a field poised to grow to $11.61 billion by 2020,1 is exploding in the healthcare market as companies produce devices that help us in our day-to-day lives, such as wearable EKG monitors and epilepsy-detecting smart watches. Advancements in sensor miniaturization and integration with medical devices have opened up this interdisciplinary field by lowering costs. Wearable technology, ranging from the Apple Watch to ingestible body-monitoring pills, can be used for everything from health and wellness monitoring to early detection of disorders. But as these technologies become ubiquitous, important privacy and interoperability concerns must be addressed.

Wearable tech like the Garmin Vivosmart HR+ watch uses sensors to obtain insightful data about its wearer’s health. This bracelet-like device tracks steps walked, distance traveled, calories burned, pulse, and overall fitness trends over time.2 It transmits the information to an app on the user’s smartphone which uses various algorithms to create insights about the person’s daily activity. This data about a person’s daily athletic habits is useful to remind them that fitness is not limited to working out at the gym or playing a sport--it’s a way of life. Holding tangible evidence of one’s physical activity for the day or history of vital signs empowers patients to take control of their personal health. The direct feedback of these devices influences patients to make better choices such as taking the stairs instead of the elevator or setting up a doctor appointment early on if they see something abnormal in the data from their EKG sensor. Connecting hard evidence from the body to physical and emotional perceptions refines the reality of those experiences by reducing the subjectivity and oversimplification that feelings about personal well being may bring about.
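The "various algorithms" behind such insights can be surprisingly simple. The sketch below is a hypothetical illustration, not Garmin's actual method; the step target and the sample data are invented for the example:

```python
from statistics import mean

def daily_insight(steps_by_day, target=10000):
    """Summarize a week of step counts into a simple insight.

    steps_by_day: list of daily step counts (hypothetical data).
    target: a common (but arbitrary) daily step goal.
    Returns the average daily steps and the indices of days below target.
    """
    below_target = [i for i, steps in enumerate(steps_by_day) if steps < target]
    return mean(steps_by_day), below_target

# One hypothetical week of tracker data: average plus the "slow" days
# the app might nudge the user about.
avg, slow_days = daily_insight([12000, 8000, 9500, 15000, 7000, 11000, 10000])
```

Real products layer heart rate, sleep, and GPS data onto this kind of aggregation, but the core pattern is the same: reduce raw sensor streams to a few actionable numbers.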

Not only can wearable technology gather information from the body, but these devices can also detect and monitor diseases. Diabetes, the 7th leading cause of death in the United States,3 can be monitored via Accu-Chek, a technology that can send an analysis of blood sugar levels directly to your phone.4 Analysis software like BodyTel can also connect patients with doctors and family members interested in the data gathered from the blood test.5 Ingestible devices such as the Ingestion Event Marker take monitoring a step further: designed to monitor medication intake, these pills keep track of when and how frequently patients take their medication. The Freescale KL02 chip, another ingestible device, monitors specific organs in the body and relays each organ's status back to a Wi-Fi enabled device, which doctors can use to remotely measure the progression of an illness. They can assess the effectiveness of a treatment with quantitative evidence, which makes decision-making about future treatment plans more effective.

Many skeptics hesitate to adopt wearable technology because of valid concerns about accuracy and privacy. To ensure that medical devices are held to the same standards and are safe for patient use, the US Food and Drug Administration (FDA) has begun to implement a device approval process. Approval is granted only to devices that provably improve on the functionality of traditional medical devices and do not pose a great risk to patients if they malfunction.6 In spite of the FDA approval process, much research is needed to determine whether the information, analysis, and insights received from various wearable technologies can be trusted.

Privacy is another big issue especially for devices like fitness trackers that use GPS location to monitor user behavior. Many questions about data ownership (does the company or the patient own the data?) and data security (how safe is my data from hackers and/or the government and insurance companies?) are still in a fuzzy gray area with no clear answers.7 Wearable technology connected to online social media sites, where one’s location may be unknowingly tied to his or her posts, can increase the chance for people to become victims of stalking or theft. Lastly, another key issue that makes medical practitioners hesitant to use wearable technology is the lack of interoperability, or the ability to exchange data, between devices. Data structured one way on a certain wearable device may not be accessible on another machine. Incorrect information might be exchanged, or data could be delayed or unsynchronized, all to the detriment of the patient.
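The interoperability problem is concrete: two devices can record the same measurement in incompatible shapes. A minimal sketch of the usual remedy, mapping each format into one shared schema (both device formats here are hypothetical, not real vendor APIs):

```python
def normalize(record: dict) -> dict:
    """Map either of two hypothetical wearable formats to a shared schema."""
    if "hr_bpm" in record:          # hypothetical device A: flat keys
        return {"heart_rate": record["hr_bpm"], "ts": record["time"]}
    if "heartRate" in record:       # hypothetical device B: camelCase keys
        return {"heart_rate": record["heartRate"], "ts": record["timestamp"]}
    raise ValueError("unknown device format")

# Records from two different devices become directly comparable.
a = normalize({"hr_bpm": 72, "time": "2016-11-01T08:00"})
b = normalize({"heartRate": 75, "timestamp": "2016-11-01T08:00"})
```

Standards efforts in health IT aim to make such per-vendor translation layers unnecessary, but until then every missing mapping is a chance for data to be dropped or misread, to the detriment of the patient.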

Wearable technology is changing the way we live our lives and understand the world around us. It is modifying the way health care professionals think about patient care by emphasizing quantitative evidence for decision making over the more subjective analysis of symptoms. The ability to document numeric evidence about one’s body holds people accountable for their actions. Patients can check whether they meet their daily step target or optimal sleep count, and doctors can track the intake of a pill and see its effect on the patient’s body. For better or for worse, we won’t get the false satisfaction of achieving a fitness goal, or of believing in the success of a doctor’s recommended course of action, without tangible results. While many obstacles remain, wearable technology has improved the quality of life for many people and will continue to do so in the future.

References

  1. Hunt, Amber. Experts: Wearable Tech Tests Our Privacy Limits. http://www.usatoday.com/story/tech/2015/02/05/tech-wearables-privacy/22955707/ (accessed Oct. 24, 2016).
  2. Vivosmart HR+. https://buy.garmin.com/en-US/US/into-sports/health-fitness/vivosmart-hr-/prod548743.html (accessed Oct. 31, 2016).
  3. Statistics about Diabetes. http://www.diabetes.org/diabetes-basics/statistics/ (accessed Nov. 1, 2016).
  4. Accu-Chek Mobile. https://www.accu-chek.co.uk/gb/products/metersystems/mobile.html (accessed Oct. 31, 2016).
  5. GlucoTel. http://bodytel.com/portfolios/glucotel/ (accessed Oct. 31, 2016)
  6. Mobile medical applications guidance for industry and Food and Drug Administration staff. U. S. Food and Drug Administration, Feb. 9, 2015. http://www.fda.gov/downloads/MedicalDevices/DeviceRegulationandGuidance/GuidanceDocuments/UCM263366.pdf (accessed Oct. 17, 2016).
  7. Meingast, M.; Roosta, T.; Sastry, S. Security and Privacy Issues with Health Care Information Technology. http://www.cs.jhu.edu/~sdoshi/jhuisi650/discussion/secprivhealthit.pdf (accessed Nov. 1, 2016).

First World Health Problems

I am a first generation American, as both of my parents immigrated here from Myanmar, a third world country. There had been no occurrence of any Inflammatory Bowel Disease (IBD) in my family, yet I was diagnosed with Ulcerative Colitis at the beginning of my sophomore year of high school. Since IBD is known to be caused by a mix of genetic and environmental factors,1,2 what specifically triggered me to develop Ulcerative Colitis? Was it the food in America, the air I was exposed to, a combination of the two, or neither of them at all? Did the “environment” of the first world in the United States cause me to develop Ulcerative Colitis?

IBD is a chronic autoimmune disease, characterized by persistent inflammation of the digestive tract and classified into two separate categories: Ulcerative Colitis and Crohn’s Disease.3 Currently, there is no known cure for IBD, as its pathogenesis (i.e., the manner in which it develops) is not fully understood.1 Interestingly, the incidence of IBD has increased dramatically over the past century.1 A systematic review by Molodecky et al. showed that the incidence rate of IBD is significantly higher in Western nations.5 This may be due to better diagnostic techniques or to the growth of environmental factors that promote the disease's development. It could also suggest that certain stimuli in first world countries can trigger pathogenesis in individuals with a genetic predisposition to IBD.

Environmental factors that are believed to affect IBD include smoking, diet, geographic location, social status, stress, and microbes.1 Smoking has had varying effects on the development of IBD depending on the form; smoking is a key risk factor for Crohn’s Disease, while non-smokers and ex-smokers are usually diagnosed with Ulcerative Colitis.4 There have not been many studies investigating the causal relationship between diet and IBD due to the diversity in diet composition.1 However, since IBD affects the digestive system, diet has long been thought to have some impact on the pathogenesis of the disease.1 In first world countries, there is access to a larger variety of food, which may impact the prevalence of IBD. People susceptible to the disease in developing countries may have a smaller chance of being exposed to “trigger” foods. In addition, IBD has been found in higher rates in urban areas versus rural areas.1,4,5 This makes sense, as cities have a multitude of potential disease-inducing environmental factors including pollution, poor sanitation, and microbial exposure. Higher socioeconomic status has also been linked to higher rates of IBD.4 This may be partly due to the sedentary nature of white collar work, which has also been linked to increased rates of IBD.1 Stress used to be viewed as a possible factor in the pathogenesis of IBD, but recent evidence has indicated that it only exacerbates the disease.3 Recent research has focused on the microorganisms in the gut, called gut flora, as they seem to have a vital role in the instigation of IBD.1 In animal models, it has even been observed that pathogenesis of IBD is not possible in a germ-free environment.1 The idea of the importance of microorganisms in human health is also linked to the Hygiene Hypothesis.

The Hygiene Hypothesis states that the lack of infections in western countries is the reason for an increasing amount of autoimmune and allergic diseases.6 The idea behind the theory is that some infectious agents guard against a wide variety of immune-related disorders.6 Animal models and clinical trials have provided some evidence backing the Hygiene Hypothesis, but it is hard to causally attribute the pathogenesis of autoimmune and allergic diseases to a decrease in infections, since first world countries have very different environmental factors than third world countries.6

The increasing incidence of IBD in developed countries is not yet fully understood, but recent research points towards a complex combination of environmental and genetic factors. The rise of autoimmune disease diagnoses may also be attributed to better medical equipment and facilities and the tendency of people in more developed countries to regularly get checked by a doctor. There are many difficulties in researching the pathogenesis of IBD including isolating certain environmental factors and obtaining tissue and data from third world countries. However, there is much promising research and it might not be long until we discover a cure for IBD.

References

  1. Danese, S. et al. Autoimm Rev 2004, 3.5, 394-400.
  2. Podolsky, Daniel K. N Engl J Med 2002,  347.6, 417-29.
  3. Mayo Clinic. "Inflammatory Bowel Disease (IBD)." http://www.mayoclinic.org/diseases-conditions/inflammatory-bowel-disease/basics/definition/con-20034908 (accessed Sep. 30, 2016).
  4. CDC. "Epidemiology of the IBD." https://www.cdc.gov/ibd/ibd-epidemiology.htm (accessed Oct.17, 2016).
  5. Molodecky, N. et al. Gastroenterol 2012, 142.1, n. pag.
  6. Okada, H. et. al. Clin Exp Immuno 2010, 160, 1–9.

East Joins West: The Rise of Integrative Medicine

An ancient practice developed thousands of years ago and still used by millions of people all over the world, Traditional Chinese Medicine (TCM) has undoubtedly played a role in the field of medicine. But just what is TCM? Is it effective? And can it ever be integrated with Western medicine?

The techniques of TCM stem from the beliefs upon which it was founded. The theory of the yin and yang balance holds that all things in the universe are composed of a balance between the forces of yin and yang. While yin is generally associated with objects that are dark, still, and cold, yang is associated with items that are bright, warm, and in motion.1 In TCM, illness is believed to be a result of an imbalance of yin or yang in the body. For instance, when yin does not cool yang, yang rises and headaches, flushing, sore eyes, and sore throats result. When yang does not warm yin, poor circulation of blood, lethargy, pallor, and cold limbs result. TCM aims to determine the nature of the disharmony and correct it through a variety of approaches. As the balance is restored in the body, so is the health.2

Another fundamental concept of TCM is the idea of qi, which is the energy or vital force responsible for controlling the functions of the human mind and body. Qi flows through the body through 12 meridians, or channels, that correspond to the 12 major organ systems, and 8 extra meridians that are all interconnected with the major channels. Just like an imbalance between yin and yang, disruption to the flow causes disease, and correction of the flow restores the body to balance.2 In TCM, disease is not viewed as something that a patient has. Rather, it is something that the patient is. There is no isolated entity called “disease,” but only a whole person whose body functions may be balanced or imbalanced, harmonious or disharmonious.3 Thus, TCM practitioners aim to increase or decrease qi in the body to create a healthy yin-yang balance through various techniques such as acupuncture, herbal medicine, nutrition, and mind/body exercise (tai chi, yoga). Eastern treatments are dismissed by some as superfluous to the recovery process and even harmful if used in place of more conventional treatments. However, evidence exists indicating Eastern treatments can be very effective parts of recovery plans.

The most common TCM treatments are acupuncture, which involves inserting needles at precise meridian points, and herbal medicine, which refers to using plant products (seeds, berries, roots, leaves, bark, or flowers) for medicinal purposes. Acupuncture seeks to improve the body’s functions by stimulating specific anatomic sites—commonly referred to as acupuncture points, or acupoints. It releases the blocked qi in the body, which may be causing pain, lack of function, or illness. Although the effects of acupuncture are still being researched, results from several studies suggest that it can stimulate function in the body and induce its natural healing response through various physiological systems.4 According to the World Health Organization (WHO), acupuncture is effective for treating 28 conditions, while limited but promising evidence suggests it may be effective for many more. Acupuncture seems to have gained the most clinical acceptance as a pain reduction therapy. An international team of experts pooled the results of 29 studies on chronic pain involving nearly 18,000 participants—some had acupuncture, some had “sham” acupuncture, and some did not have acupuncture at all. Overall, the study found acupuncture treatments to be superior to both a lack of acupuncture treatment and sham acupuncture treatments for the reduction of chronic pain, suggesting that such treatments are a reasonable option for afflicted patients.5 According to a study carried out at the Technical University of Munich, people with tension headaches and/or migraines may find acupuncture to be very effective in alleviating their symptoms.6 Another study at the University of Texas M.D. Anderson Cancer Center found that twice-weekly acupuncture treatments relieved debilitating symptoms of xerostomia (severe dry mouth) among patients undergoing radiation for head and neck cancer.7 Additionally, acupuncture has been demonstrated both to enhance performance in the memory-related brain regions of mild cognitive impairment patients (who have an increased risk of progressing to Alzheimer’s disease)8 and to provide therapeutic benefits in regulating inflammation in infection and inflammatory disease.9

Many studies have also demonstrated the efficacy of herbal medicine in treating various illnesses. Recently, the WHO estimated that 80% of people worldwide rely on herbal medicines for some part of their primary health care. Researchers from the University of Adelaide have shown that a mixture of extracts from the roots of two medicinal herbs, Kushen and Baituling, works to kill cancer cells.10 Furthermore, scientists concluded that herbal plants have the potential to delay the development of diabetic complications, although more investigations are necessary to characterize this antidiabetic effect.11 Finally, a study found that Chinese herbal formulations appeared to alleviate symptoms for some patients with Irritable Bowel Syndrome, a common functional bowel disorder that is characterized by chronic or recurrent abdominal pain and does not currently have any reliable medical treatment.12

Both TCM and Western medicine seek to ease pain and improve function. Can the two be combined? TCM was largely ignored by Western medical professionals until recent years, but it is slowly gaining traction among scientists and clinicians as studies show that an integrative approach can be effective. For instance, for patients dealing with chronic pain, Western medicine can stop the pain quickly with medication or interventional therapy, while TCM can provide a longer-lasting solution to the underlying problem, with milder side effects.13 A study by Cardiff University’s School of Medicine and Peking University in China showed that combining TCM and Western medicine could offer hope for developing new treatments for liver, lung, bone, and colorectal cancers.14 Also, studies on the use of traditional Chinese medicines for the treatment of diseases such as bronchial asthma, atopic dermatitis, and IBS showed that an interdisciplinary approach to TCM may lead to the discovery of new medicines.15

TCM is still a developing field in the Western world, and more research and clinical trials on the benefits and mechanisms of TCM are being conducted. While TCM methods such as acupuncture and herbal medicine must be further examined to be accepted as credible treatment techniques in modern medicine, they have been demonstrated to treat various illnesses and conditions. Therefore, while it is unlikely for TCM to be a suitable standalone option for disease management, it does have its place in a treatment plan with potential applications alongside Western medicine. Utilizing TCM as a complement to Western medicine presents hope in increasing the effectiveness of healthcare treatment.

References

  1. Yin and Yang Theory. Acupuncture Today. http://www.acupuncturetoday.com/abc/yinyang.php (accessed Dec. 15, 2016).
  2. Lao, L. et al. Integrative Pediatric Oncology. 2012, 125-135.
  3. The Conceptual Differences between Chinese and Western Medicine. http://www.mosherhealth.com/mosher-health-system/chinese-medicine/chinese-versus-western (accessed Dec. 15, 2016).
  4. How Acupuncture Can Relieve Pain and Improve Sleep, Digestion, and Emotional Well-being. http://cim.ucsd.edu/clinical-care/acupuncture.shtml (accessed Dec. 15, 2016).
  5. Vickers, A. J. et al. Arch. Intern. Med. 2012, 172, 1444-1453.
  6. Melchart, D. et al. BMJ. 2005, 331, 376-382.
  7. Meng, Z. et al. Cancer. 2012, 118, 3337-3344.
  8. Feng, Y. et al. Magn. Reson. Imaging. 2012, 30, 672-682.
  9. Torres-Rosas, R. et al. Nat. Med. 2014, 20, 291-295.
  10. Qu, Z. et al. Oncotarget. 2016, 7, 66003-66019.
  11. Bnouham, M. et al. Int. J. Diabetes Metab. 2006, 14, 1.
  12. Bensoussan, A. et al. JAMA. 1998, 280, 1585-1589.
  13. Jiang, W. Trends Pharmacol. Sci. 2005, 26, 558-563.
  14. Combining Chinese, Western medicine could lead to new cancer treatments. https://www.sciencedaily.com/releases/2013/09/130928091021.htm (accessed Dec. 15, 2016).
  15. Yuan, R.; Yuan, L. Pharmacol. Ther. 2000, 86, 191-198.

Transplanting Time

Nowadays, it is possible for patients with organ failure to live for decades after receiving an organ transplant. Since the first successful kidney transplant in the 1950s,1,2 advances in the procedure, including the improvement of drugs that facilitate acceptance of the foreign body parts,3 have allowed surgeons to transplant a wider variety of organs, such as the heart, lungs, liver, and pancreas.2,4 Over 750,000 lives have been saved and extended through the use of organ transplants, an unthinkable feat just over 50 years ago.2 The limitations of organ transplantation, such as the shortage of available organs, along with new advancements that could improve the process, fuel ongoing discussion about the ethics of transplants.

The idea behind an organ transplant is simple. When both the recipient and the new organ are ready, surgeons detach the blood vessels attached to the failing organ before putting the new one in its place by reattaching the patient’s blood vessels to the functioning organ. To prevent rejection of the new organ, the recipient will continue to take immunosuppressant drugs.3 In exchange for this lifelong commitment, the patient often receives a longer, more enjoyable life.2

The organs used in transplants usually originate from a cadaver or a living donor.1-3 Some individuals are deterred from becoming organ donors because they are concerned that doctors will not do their best to save them if their organs are needed. This concern is further complicated by blurred definitions of “dead”; in one ethically ambiguous situation, dying patients who are brain dead may be taken off of life support so that their organs may be donated.1-3 Stories of patients who reawaken from comas after being pronounced “dead” may give some encouragement, but a patient’s family and doctors must decide when to give up that hope. Aside from organs received from the deceased, living donors, who may be family, friends, or strangers to the recipient, may donate organs that they can live without, such as a lung or a kidney.1-3 However, potentially injuring a healthy person for the sake of another may contradict the oath that doctors take, which instructs physicians to help, not harm, their patients.1

One of the most pressing issues today stems from the following question: who receives the organs? The transplant waiting list is constantly growing because the number of organs needed greatly exceeds the number of organs that are available.1-3 Unfortunately, 22 patients die every day while they are waiting for a new organ.4 Because the issue of receiving a transplant is time-sensitive, medical officials must decide who receives a transplant first. Should the person who needs a transplant most urgently take priority over another who has been on the waiting list longer? Should a child be eligible before a senior? Should a lifelong smoker be able to obtain a new lung? Currently, national policy takes different factors into account depending on the organ to be transplanted. For example, other than compatibility requirements, patients on the waiting list for liver transplants are ranked solely on their medical need and distance from the donor hospital.4 On the other hand, people waiting for kidneys are further considered based on whether they have donated a kidney previously, their age, and their time spent on the waiting list.4

Despite various efforts to increase the number of organ donors through education and legislation, the supply of organs does not meet the current and increasing need for them.1-3 As a result, other methods of obtaining these precious resources are currently being developed, one of which is the use of animal organs, a process known as xenotransplantation. Different animal cells, tissues, and organs are being researched for use in humans, giving some hope to those on the waiting list or those who do not quite qualify for a transplant.2,3 In the past, surgeons have attempted to use a primate’s heart and liver for transplantation, but the surgical outcomes were poor.2 Other applications of animal tissue are more promising, such as the use of pigs’ islet cells, which can produce insulin, in humans.2 However, a considerable risk of using these animal parts is that new diseases may be passed from animal to human. Additionally, animal rights groups have protested the use of primates as a source of whole organs.2

Another possible solution to the deficit of organs is the use of stem cells, which have the potential to grow and specialize. Embryonic stem cells can repair and regenerate damaged organs, but harvesting them destroys the source embryo.2,3 Although the embryos are created outside of humans, there are objections to their use. What differentiates a mass of cells from a living person? Fortunately, adult stem cells can be used for treatment as well.2 Researchers have developed a new method that returns adult stem cells to a state similar to that of embryonic stem cells, although whether these induced cells match the efficacy of embryonic stem cells is still unclear.7

Regardless of the continuous controversy over the ethics of transplantation, the boundaries for organ transplants are being pushed further and further. Head transplants have been attempted for over a century in other animals, such as dogs,5 but several doctors want to move on to work with humans. To attach a head to a new body, the surgeon would need to connect the old and new nerves in the spinal cord so that the patient’s brain could interact with the host body. Progress is already being made in repairing severe spinal cord injuries. In China, Dr. Ren Xiaoping plans to attempt a complete body transplant, believed by some to be currently impossible.6 There is not much information about the amount of pain that the recipient of a body transplant must endure,5 so it may ultimately decrease, rather than increase, the patient’s quality of life. Overall, most agree that it would be unethical to continue, considering the limited success of such projects and the high chance of failure and death.

Organ transplants and new developments in the field have raised many interesting questions about the ethics of the organ transplantation process. As a society, we should determine how to address these problems and set boundaries to decide what is “right.”

References

  1. Jonsen, A. R. Virtual Mentor. 2012, 14, 264-268.
  2. Abouna, G. M. Med. Princ. Pract. 2002, 12, 54-69.
  3. Paul, B. et al. Ethics of Organ Transplantation. University of Minnesota Center for Bioethics [Online], February 2004. http://www.ahc.umn.edu/img/assets/26104/Organ_Transplantation.pdf (accessed Nov. 4, 2016).
  4. Organ Procurement and Transplantation Network. https://optn.transplant.hrsa.gov/ (accessed Nov. 4, 2016).
  5. Lamba, N. et al. Acta Neurochir. 2016.
  6. Tatlow, D. K. Doctor’s Plan for Full-Body Transplants Raises Doubts Even in Daring China. The New York Times. http://www.nytimes.com/2016/06/12/world/asia/china-body-transplant.html?_r=0 (accessed Nov. 4, 2016).
  7. National Institutes of Health. stemcells.nih.gov/info/basics/6.htm (accessed Jan. 23, 2017).

 

Molecular Mechanisms Behind Alzheimer’s Disease and Epilepsy

Abstract

Seizures are characterized by periods of high neuronal activity and are caused by alterations in synaptic function that disrupt the equilibrium between excitation and inhibition in neurons. While often associated with epilepsy, seizures can also occur after brain injuries and interestingly, are common in Alzheimer’s patients. While Alzheimer’s patients rarely show the common physical signs of seizures, recent research has shown that electroencephalogram (EEG) technology can detect nonconvulsive seizures in Alzheimer’s patients. Furthermore, patients with Alzheimer’s have a 6- to 10-fold increase in the probability of developing seizures during the course of their disease compared to healthy controls.2 While previous research has focused on the underlying molecular mechanisms of Aβ tangles in the brain, the research presented here relates seizures to the cognitive decline in Alzheimer’s patients in an attempt to find therapeutic approaches that tackle both epilepsy and Alzheimer’s.

Introduction

The hippocampus is found in the temporal lobe and is involved in the creation and consolidation of new memories. It is the first part of the brain to undergo neurodegeneration in Alzheimer’s disease, and as such, the disease is characterized by memory loss. Alzheimer’s differs from other types of dementia in that patients’ episodic memories are affected strongly and quickly. Likewise, patients who suffer from epilepsy also exhibit neurodegeneration in their hippocampi and have impaired episodic memories. Such similarities led researchers to hypothesize that the two diseases share pathophysiological mechanisms. In one study, four epileptic patients exhibited progressive memory loss that clinically resembled Alzheimer’s disease.6 In another study, researchers found that seizures precede cognitive symptoms in late-onset Alzheimer’s disease.7 This led researchers to hypothesize that a high incidence of seizures increases the rate of cognitive decline in Alzheimer’s patients. However, much is yet to be discovered about the molecular mechanisms underlying seizure activity and cognitive impairments.

Amyloid precursor protein (APP) is the precursor molecule to Aβ, the polypeptide that makes up the Aβ plaques found in the brains of Alzheimer’s patients. In many Alzheimer’s labs, the J20 APP mouse model of disease is used to simulate human Alzheimer’s. These mice overexpress the human form of APP, develop amyloid plaques, and have severe deficits in learning and memory. The mice also have high levels of epileptiform activity and exhibit spontaneous seizures that are characteristic of epilepsy.11 Understanding the long-lasting effects of these seizures is important in designing therapies for a disease that is affected by recurrent seizures. Thus, comparing the APP mouse model of disease with the temporal lobe epilepsy (TLE) mouse model is essential in unraveling the mysteries of seizures and cognitive decline.

Shared Pathology of the Two Diseases

The molecular mechanisms behind the two diseases are still unknown and under much research. An early observation in both TLE and Alzheimer’s was a decrease in calbindin-D28K, a calcium-buffering protein, in the hippocampus.10 Neuronal calcium buffering and calcium homeostasis are well known to be involved in learning and memory. Calcium channels are involved in synaptic transmission, and a high calcium ion influx often results in altered neuronal excitability and calcium signaling. Calbindin acts as a buffer that binds free Ca2+ and is thus critical to calcium homeostasis.

Some APP mice have severe seizures and an extremely high loss of calbindin, while other APP mice exhibit no loss at all. The reasons behind this are unclear, but, like human patients, mice are highly variable.

The loss of calbindin in both Alzheimer’s and TLE is highly correlated with cognitive deficits. However, the molecular mechanism behind the calbindin loss is unclear. Many researchers are now working to uncover this mechanism in the hopes of preventing the calbindin loss, thereby improving therapeutic avenues for Alzheimer’s and epilepsy patients.

Seizures and Neurogenesis

The dentate gyrus is one of the two areas of the adult brain that exhibit neurogenesis.13 Understanding neurogenesis in the hippocampus can lead to promising therapeutic targets in the form of neuronal replacement therapy. Preliminary research in Alzheimer’s and TLE has shown changes in neurogenesis over the course of the disease.14 However, whether neurogenesis is increased or decreased remains a controversial topic, as studies frequently contradict each other.

Many researchers study neurogenesis in the context of different diseases. In memory research, neurogenesis is thought to be involved in both memory formation and memory consolidation.12 Alzheimer’s leads to a gradual decrease in the generation of neural progenitors, the stem cells that can differentiate to create a variety of different neuronal and glial cell types.8 Further studies have shown that the neural stem cell pool undergoes accelerated depletion due to seizure activity.15 Initially, heightened neuronal activity stimulates neural progenitors to divide at a much faster rate than in controls. This rapid division depletes the limited stem cell pool prematurely. Interestingly, this enhanced neurogenesis is detected long before other AD-linked pathologies. When the APP mice become older, the stem cell pool is depleted to a point where neurogenesis occurs much more slowly than in controls.9 This depletion is thought to underlie memory deficits, in that the APP mice can no longer consolidate new memories as effectively. The same phenomenon occurs in mice with TLE.

The discovery of this premature neurogenesis in Alzheimer’s disease has many therapeutic benefits. For one, enhanced neurogenesis can be used as a marker for Alzheimer’s long before any symptoms are present. Furthermore, targeting increased neurogenesis holds potential as a therapeutic avenue, leading to better remedies for preventing the pathological effects of recurrent seizures in Alzheimer’s disease.

Conclusion

Research linking epilepsy with other neurodegenerative disorders is still in its infancy, and leaves many researchers skeptical about the potential to create a single therapy for multiple conditions. Previous EEG studies recorded Alzheimer’s patients for a few hours at a time and found limited epileptiform activity; enhanced overnight technology has shown that about half of Alzheimer’s patients have epileptiform activity in a 24-hour period, with most activity occurring during sleep.1 Recording patients for even longer periods of time will likely raise this percentage. Further research is being conducted to show the importance of seizures in enhancing cognitive deficits and in understanding Alzheimer’s disease, and it could lead to significant therapeutic advances in the future.

References

  1. Vossel, K. A. et al. Incidence and Impact of Subclinical Epileptiform Activity. Ann. Neurol. 2016.
  2. Pandis, D.; Scarmeas, N. Seizures in Alzheimer Disease: Clinical and Epidemiological Data. Epilepsy Curr. 2012, 12, 184-187.
  3. Chin, J.; Scharfman, H. Shared cognitive and behavioral impairments in epilepsy and Alzheimer’s disease and potential underlying mechanisms. Epilepsy Behav. 2013, 26, 343-351.
  4. Carter, D. S. et al. Long-term decrease in calbindin-D28K expression in the hippocampus of epileptic rats following pilocarpine-induced status epilepticus. Epilepsy Res. 2008, 79, 213-223.
  5. Jin, K. et al. Increased hippocampal neurogenesis in Alzheimer’s disease. Proc. Natl. Acad. Sci. U.S.A. 2004, 101, 343-347.
  6. Ito, M. et al. A case series of epilepsy-derived memory impairment resembling Alzheimer disease. Alzheimer Dis. Assoc. Disord. 2009, 23, 406-409.
  7. Picco, A. et al. Seizures can precede cognitive symptoms in late-onset Alzheimer’s disease. J. Alzheimers Dis. 2011, 27, 737-742.
  8. Zeng, Q. et al. Hippocampal neurogenesis in the APP/PS1/nestin-GFP triple transgenic mouse model of Alzheimer’s disease. Neuroscience. 2016, 314, 64-74.
  9. Lopez-Toledano, M. A. et al. Adult neurogenesis: a potential tool for early diagnosis in Alzheimer’s disease? J. Alzheimers Dis. 2010, 20, 395-408.
  10. Palop, J. J. et al. Neuronal depletion of calcium-dependent proteins in the dentate gyrus is tightly linked to Alzheimer’s disease-related cognitive deficits. Proc. Natl. Acad. Sci. U.S.A. 2003, 100, 9572-9577.
  11. Research Models: J20. AlzForum: Networking for a Cure.
  12. Kitamura, T.; Inokuchi, K. Role of adult neurogenesis in hippocampal-cortical memory consolidation. Mol. Brain. 2014, 7, 13.
  13. Piatti, V. et al. Neurogenesis in the dentate gyrus: carrying the message or dictating the tone. Front. Neurosci. 2013, 7, 50.
  14. Noebels, J. A Perfect Storm: Converging Paths of Epilepsy and Alzheimer’s Dementia Intersect in the Hippocampal Formation. Epilepsia. 2011, 52, 39-46.
  15. Rogawski, M. et al., Eds. Jasper’s Basic Mechanisms of the Epilepsies, 4th ed.; Oxford University Press: USA, 2012.

Detection of Gut Inflammation and Tumors Using Photoacoustic Imaging

Abstract:

Photoacoustic imaging is a technique in which contrast agents absorb photon energy and emit signals that can be analyzed by ultrasound transducers. This method allows for unprecedented depth imaging that can provide a non-invasive alternative to current diagnostic tools used to detect internal tissue inflammation.1 The Rice iGEM team strove to use photoacoustic technology and biomarkers to develop a noninvasive method of locally detecting gut inflammation and colon cancer. As a first step, we genetically engineered Escherichia coli to express the near-infrared fluorescent proteins iRFP670 and iRFP713 and conducted tests using biomarkers to determine whether expression was confined to a single, localized area.

Introduction:

In photoacoustic imaging, laser pulses of a specific, predetermined wavelength (the excitation wavelength) activate and thermally excite a contrast agent such as a pigment or protein. The heat makes the contrast agent contract and expand, producing an ultrasonic emission at a wavelength longer than the excitation wavelength. The emission wavelength data are used to produce high-resolution, high-contrast 2D or 3D images of tissues.2

The objective of this photoacoustic imaging project is to engineer bacteria to produce contrast agents in the presence of biomarkers specific to gut inflammation and colon cancer, and ultimately to deliver the bacteria into the intestines. The bacteria will produce the contrast agents in response to certain biomarkers, and lasers will excite the contrast agents, which will emit signals in local, targeted areas, allowing for a non-invasive imaging method. Our goal is to develop a non-invasive photoacoustic imaging delivery method that uses engineered bacteria to report gut inflammation and identify colon cancer. To achieve this, we constructed plasmids that have a nitric-oxide-sensing promoter (soxR/S) or a hypoxia-sensing promoter (narK or fdhf) fused to genes encoding violacein or the near-infrared fluorescent proteins iRFP670 and iRFP713, which have emission wavelengths of 670 nm and 713 nm, respectively. Nitric oxide and hypoxia, biological markers of gut inflammation in both mice and humans, would therefore promote expression of the desired iRFPs or violacein.3,4

Results and Discussion

Arabinose

To test the inducibility and detectability of our iRFPs, we used pBAD, a promoter that is part of the arabinose operon in E. coli.5 We formed genetic circuits consisting of the pBAD expression system and iRFP670 and iRFP713 (Figure 1a). AraC, a constitutively produced transcription regulator, changes form in the presence of arabinose, allowing for activation of the pBAD promoter.

[Figure 1b]

Fluorescence levels emitted by the iRFPs increased significantly when placed in wells containing increasing concentrations of arabinose (Figure 2). This correlation suggests that our selected iRFPs fluoresce sufficiently when promoters are induced by environmental signals. The results of the arabinose assays showed that we successfully produced iRFPs; the next steps were to engineer bacteria to produce the same iRFPs under nitric oxide and hypoxia.

Nitric Oxide

The next step was to test the nitric oxide induction of iRFP fluorescence. We used a genetic circuit consisting of a constitutive promoter and the soxR gene, which in turn expresses the SoxR protein (Figure 1b). In the presence of nitric oxide, SoxR changes form to activate the promoter soxS, which activates the expression of the desired gene. The source of nitric oxide added to our engineered bacteria samples was diethylenetriamine/nitric oxide adduct (DETA/NO).

Figure 3 shows no significant difference in fluorescence/OD600 across DETA/NO concentrations. This finding implies that our engineered bacteria were unable to detect the nitric oxide biomarker and produce iRFP; future troubleshooting includes verifying promoter strength and sample conditions. Furthermore, nitric oxide has an extremely short half-life of a few seconds, which may not give most of the engineered bacteria enough time to sense it, limiting iRFP production and fluorescence.
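Throughout these assays, raw fluorescence readings are divided by the culture’s optical density at 600 nm (OD600), a standard proxy for cell density, so that a well that simply grew more cells is not mistaken for a well expressing more reporter per cell. A minimal sketch of that normalization, using entirely hypothetical plate-reader values (not our actual data), might look like this:

```python
# Hypothetical plate-reader readings: raw fluorescence and OD600 (cell density)
# for wells at increasing DETA/NO concentrations. Values are illustrative only.
samples = [
    {"deta_no_mM": 0.0, "fluorescence": 1450.0, "od600": 0.52},
    {"deta_no_mM": 0.1, "fluorescence": 1510.0, "od600": 0.55},
    {"deta_no_mM": 1.0, "fluorescence": 1480.0, "od600": 0.50},
]

def normalized_fluorescence(fluorescence: float, od600: float) -> float:
    """Divide raw fluorescence by OD600 so that denser cultures are not
    mistaken for cultures with higher per-cell reporter expression."""
    if od600 <= 0:
        raise ValueError("OD600 must be positive")
    return fluorescence / od600

for s in samples:
    norm = normalized_fluorescence(s["fluorescence"], s["od600"])
    print(f'{s["deta_no_mM"]:.1f} mM DETA/NO: {norm:.0f} a.u./OD600')
```

If the promoter were responding to the inducer, the normalized values would rise with concentration; roughly flat values, as in Figure 3, indicate no detectable induction.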

[Figure 1c]

Hypoxia

We also tested the induction of iRFP fluorescence with the hypoxia-inducible promoters narK and fdhf. We expected iRFP production and fluorescence to increase when using the narK and fdhf promoters in anaerobic conditions (Figure 1c and d).

However, we observed the opposite result: fluorescence decreased for both iRFP constructs under both promoters when exposed to hypoxia (Figure 4). This finding suggests that our engineered bacteria were unable to detect the hypoxia biomarker and produce iRFP; future troubleshooting includes verifying promoter strength and sample conditions.

Future Directions

Further studies include testing the engineered bacteria co-cultured with colon cancer cells and developing other constructs that will enable bacteria to sense carcinogenic tumors and make them fluoresce for imaging and treatment purposes.

Violacein has anti-cancer therapy potential

Violacein is a fluorescent pigment suitable for in vivo photoacoustic imaging in the near-infrared range and shows anti-tumoral activity.6 It has high potential for future work in bacterial tumor targeting. We have succeeded in constructing violacein using Golden Gate shuffling7 and intend to use it in experiments such as the nitric oxide and hypoxia assays we used for iRFP670 and iRFP713.

Invasin can allow for targeted cell therapy

Certain bacteria are able to invade mammalian cells using invasin, a bacterial surface protein that binds beta-1 integrins on the host cell.8-9 If we engineer E. coli that carry invasin as well as the genetic circuits capable of sensing nitric oxide and/or hypoxia, we can potentially allow the E. coli to invade colon cells and release contrast agents for photoacoustic imaging, or therapeutic agents such as violacein, only in the presence of specific biomarkers.10 Additionally, if we engineer the invasin-bearing bacteria to invade only colon cancer cells and not normal cells, this approach would potentially allow for localized targeting and treatment of cancerous tumors. This design lets us create scenarios with parameters more similar to the conditions observed in the human gut, since we will be unable to test our engineered bacteria in an actual human gut.

Acknowledgements:

The International Genetically Engineered Machine (iGEM) Foundation (igem.org) is an independent, non-profit organization dedicated to education and competition, the advancement of synthetic biology, and the development of an open community and collaboration.

This project would not have been possible without the patient instruction and generous encouragement of our Principal Investigators (Dr. Beth Beason-Abmayr and Dr. Jonathan Silberg, BioSciences at Rice), our graduate student advisors and our undergraduate team. I would also like to thank our iGEM collaborators.

This work was supported by the Wiess School of Natural Sciences and the George R. Brown School of Engineering and the Departments of BioSciences, Bioengineering, and Chemical and Biomolecular Engineering at Rice University; Dr. Rebecca Richards-Kortum, HHMI Pre-College and Undergraduate Science Education Program Grant #52008107; and Dr. George N. Phillips, Jr., Looney Endowment Fund.

If you would like to know more information about our project and our team, please visit our iGEM wiki at 2016.igem.org/Team:Rice.

References

  1. Ntziachristos, V. Nat. Methods. 2010, 7, 603-614.
  2. Weber, J. et al. Nat. Methods. 2016, 13, 639-650.
  3. Archer, E. J. et al. ACS Synth. Biol. 2012, 1, 451-457.
  4. Höckel, M.; Vaupel, P. J. Natl. Cancer Inst. 2001, 93, 266-276.
  5. Guzman, L. M. et al. J. Bacteriol. 1995, 177, 4121-4130.
  6. Shcherbakova, D. M.; Verkhusha, V. V. Nat. Methods. 2013, 10, 751-754.
  7. Engler, C. et al. PLOS ONE. 2009, 4, 1-9.
  8. Anderson, J. et al. Sci Direct. 2006, 355, 619-627.
  9. Arao, S. et al. Pancreas. 2000, 20, 619-627.
  10. Jiang, Y. et al. Sci. Rep. 2015, 19, 1-9.

Venom, M.D.: How Some of the World’s Deadliest Toxins Fight Cancer

Nature, as mesmerizing as it can be, is undeniably hostile. There are endless hazards, both living and nonliving, scattered throughout all parts of the planet. At first glance, the world seems to be quite unwelcoming. Yet through science, humans find ways to survive nature and gain the ability to see its beauty. A fascinating way this is achieved involves taking one deadly element of nature and utilizing it to combat another. In labs and universities across the world today, scientists are fighting one of the world’s most devastating diseases, cancer, with a surprising weapon: animal toxins.

Various scientists around the globe are collecting venomous or poisonous animals and studying the biochemical weapons they synthesize. In their natural forms, these toxins could kill or cause devastating harm to the human body. However, by closely inspecting the chemical properties of these toxins, we have uncovered many potential ways they could help us understand, treat, and cure various diseases. These discoveries have shed new light on many of the deadly animals we have here on Earth. Mankind may have gained new friends—ones that could be crucial to our survival against cancer and other illnesses.

Take the scorpion, for example. This arachnid exists in hundreds of forms across the globe. Although its stinger is primarily used for killing prey, it is often used for defense against other animals, including humans. Most scorpion stings result in nothing more than pain, swelling, and numbness around the affected area. However, some species of scorpions are capable of causing more severe symptoms, including death.1 One such species, Leiurus quinquestriatus (more commonly known as the “deathstalker scorpion”), possesses some of the most potent venom on the planet.2 Yet despite its potency, deathstalker venom is a prime target for cancer research. One team of scientists from the University of Washington used chlorotoxin, a component of the venom, to assist in gene therapy (the insertion of genes to fight disease) against glioma, a widespread and fatal brain cancer. Chlorotoxin has two important properties that make it effective in fighting glioma. First, it selectively binds to a surface protein found on many tumor cells. Second, chlorotoxin inhibits the spread of tumors by disabling their metastatic ability. The scientists combined the toxin with nanoparticles in order to increase the effectiveness of the gene therapy.3,4

Other scientists found a different way to treat glioma using deathstalker venom. Researchers at the Transmolecular Corporation in Cambridge, Massachusetts produced a synthetic version of the venom and attached it to a radioactive isotope of iodine, I-131. The resulting compound was able to find and kill glioma cells by releasing radiation, most of which was absorbed by the cancerous cells.5 Other scorpion species are aiding cancer research as well, such as Centruroides tecomanus in Mexico. This species’ venom contains peptides that specifically target lymphoma cells and kill them by damaging their ion channels. The selective nature of these peptides makes them especially promising as a cancer treatment, as they leave healthy cells untouched.6

Scorpions have demonstrated tremendous medical potential, but they are far from the only animals that could contribute to the fight against cancer. Another animal that may help us overcome this disease is the wasp. To most people, wasps are nothing more than annoying pests that disturb our outdoor life. Wasps are known for their painful stings, which they use both for defense and for hunting. Yet science has shown that the venom of these insects may have medicinal properties. Researchers from the Institute for Biomedical Research in Barcelona investigated a peptide found in wasp venom for its ability to treat breast cancer. The peptide kills cancer cells by puncturing their outer membranes. To be useful in treatment, however, it must target cancer cells specifically. The scientists solved this specificity problem by conjugating the venom peptide with a targeting peptide that binds cancer cells.7 Similar techniques were used in Brazil, where scientists at São Paulo State University studied Polybia paulista, another member of the wasp family. This animal’s venom contains MP1, a peptide that likewise destroys the cell’s plasma membrane. In a healthy cell, certain membrane components (namely, the phospholipids phosphatidylserine (PS) and phosphatidylethanolamine (PE)) face the interior of the cell. In a cancerous cell, however, these components are flipped to the outer side of the membrane. In a series of simulations, MP1 was observed to selectively and aggressively attack membranes that displayed PS and PE on their outer surface. Evidently, targeted administration of wasp toxins is a viable strategy against cancer.8

Amazingly enough, the list of cancer-fighting animals at our disposal does not end here. One of the most feared creatures on Earth, the snake, is also under scientific investigation for possible medical breakthroughs. One group of scientists discovered that a compound in the venom of the Southeast Asian pit viper Calloselasma rhodostoma binds to a platelet receptor protein called CLEC-2, causing the blood to clot. A molecule expressed by cancer cells, podoplanin, binds to CLEC-2 in a similar manner, also causing blood clotting. Why does this matter? Tumors induce blood clots to shield themselves from the immune system, allowing them to grow freely. They also induce the formation of lymphatic vessels to support their survival. The interaction between CLEC-2 and podoplanin is vital for the formation of both these blood clots and lymphatic vessels, and is thus critical to the persistence of tumors. A drug that inhibits this interaction could therefore be highly effective in cancer treatment and prevention.9 Research on the snake venom may help us develop such an inhibitor.

Even though there may be deadly animals roaming the Earth, it is important to remember that they have done more for us than most people realize. So next time you see a scorpion crawling around or a wasp buzzing in the air, react with appreciation, rather than with fear. Looking at our world in this manner will make it seem like a much friendlier place to live.

References

  1. Mayo Clinic. http://www.mayoclinic.org/diseases-conditions/scorpion-stings/home/ovc-20252158 (accessed Oct. 29, 2016).
  2. Ross, L. K. Leiurus quinquestriatus (Ehrenberg, 1828). The Scorpion Files, 2008. http://www.ntnu.no/ub/scorpion-files/l_quinquestriatus_info.pdf (accessed Nov. 3, 2016).
  3. Kievit, F. M. et al. ACS Nano. 2010, 4 (8), 4587–4594.
  4. University of Washington. Scorpion Venom With Nanoparticles Slows Spread Of Brain Cancer. ScienceDaily, April 17, 2009. www.sciencedaily.com/releases/2009/04/090416133816.htm
  5. Health Physics Society. Radioactive Scorpion Venom For Fighting Cancer. ScienceDaily, June 27, 2006. www.sciencedaily.com/releases/2006/06/060627174755.htm
  6. Investigación y Desarrollo. Scorpion venom is toxic to cancer cells. ScienceDaily, May 27, 2015. www.sciencedaily.com/releases/2015/05/150527091547.htm
  7. Moreno, M. et al. J Control Release. 2014, 182, 13-21.
  8. Leite, N. B. et al. Biophysical Journal. 2015, 109 (5), 936-947.
  9. Suzuki-Inoue, K. et al. Journal of Biological Chemistry. 2010, 285, 24494-24507.

 

Biological Bloodhounds: Sniffing Out Cancer

Fifty years ago, doctors needed to see cancer to diagnose it, and by then it was usually too late to do anything about it. Newer tests have made cancer detection easier and more precise, but preventable cases continue to slip through the cracks, often with fatal consequences. A new test, however, has the potential to stop these missed diagnoses: it can detect cancer from a single drop of blood, and it may finally allow us to ensure patients receive care when they need it.

Blood platelets are a major component of blood, best known for their ability to stop bleeding by clotting injured blood vessels. However, platelets are far more versatile than previously understood. When cancer forms in the human body, tumors shed molecules such as proteins and RNA directly into the bloodstream. Blood platelets that come into contact with these shed molecules absorb them, altering the platelets’ own RNA. People with cancer therefore have platelets that carry information about the specific cancer present. These “educated” platelets are called tumor-educated platelets, or TEPs. Recently, TEPs have been used to aid in the detection of specific cancers, and even to identify their locations.1

In a recent study, a group of scientists investigated how TEPs could be used to diagnose cancer. The scientists took blood platelets from healthy individuals and from patients with either advanced or early stages of six different types of cancer, and compared their platelet RNA. The researchers found that those with cancer had different amounts of certain platelet RNA molecules; for example, the levels of dozens of specific non-protein-coding RNAs were altered in patients with cancer. Further analysis of hundreds of different RNA levels from the nearly 300 patients in the study enabled the scientists to distinguish a cancer-associated RNA profile from a healthy one. Using these results, the team created an algorithm that could classify whether or not someone had cancer with up to 96% accuracy.1

Not only could TEPs distinguish between healthy individuals and those with cancer, but they could also identify the cancer’s location. The patients in the study had one of six types of cancer: non-small-cell lung cancer, breast cancer, pancreatic cancer, colorectal cancer, glioblastoma, or hepatobiliary cancer. The scientists analyzed the TEP RNA profiles associated with each type of cancer and created an algorithm to predict tumor locations. The TEP-trained algorithm correctly identified the location of these six cancer types 71% of the time.1

The authors of the study noted that this is the first bloodborne factor that can both diagnose cancer and pinpoint the location of the primary tumor. It is possible that in the near future, TEP-based tests could lead to a new class of extremely accurate liquid biopsies. Today, many cancer tests are costly, invasive, or painful. For example, lung cancer tests require an X-ray, sputum cytology examination, or tissue biopsy. X-rays and sputum cytology must be performed after symptoms present, and often yield misleading results. Biopsies are more accurate, but are also painful and relatively dangerous. TEP-based blood tests have the potential to both obviate the need for these techniques and provide more granular, clinically useful information. They can be performed before symptoms appear, at low cost, and with minimal patient discomfort, making them an ideal tool for catching a growing tumor early.

The information that TEPs have revealed has opened a gate to many potential breakthroughs in the detection of cancer. With high accuracy and an early detection time, cancer blood tests have the potential to save many lives in the future.

References

  1. Best, M. et al. Cancer Cell. 2015, 28, 676.
  2. Marquedant, K. Tumor RNA within Platelets May Help Diagnose and Classify Cancer, Identify Treatment Strategies. Massachusetts General Hospital.

Graphene Nanoribbons and Spinal Cord Repair

The same technology that has been used to strengthen polymers1, de-ice helicopter wings2, and create more efficient batteries3 may one day help those with damaged or even severed spinal cords walk again. The Tour Lab at Rice University, headed by Dr. James Tour, is harnessing the power of graphene nanoribbons to create a special new material called Texas-PEG that may revolutionize the way we treat spinal cord injuries; one day, it may even make whole body transplants a reality.

Dr. Tour, the T.T. and W.F. Chao Professor of Chemistry, Professor of Materials Science and NanoEngineering, and Professor of Computer Science at Rice University, is a synthetic organic chemist whose work focuses on nanotechnology. He holds over 120 patents, has published over 600 papers, and was inducted into the National Academy of Inventors in 2015.4 His lab is currently working on several projects, such as investigating applications of graphene, creating and testing nanomachines, and synthesizing and imaging nanocars. The Tour Lab first produced graphene nanoribbons while working with carbon nanotubes back in 2009.5 The team found a way to “unzip” carbon nanotubes into thin strips called graphene nanoribbons by injecting sodium and potassium atoms between the layers of stacked nanotubes until the tubes split open. “We fell upon the graphene nanoribbons,” says Dr. Tour. “I had seen it a few years ago in my lab but I didn’t believe it could be done because there wasn’t enough evidence. When I realized what we had, I knew it was enormous.”

This discovery was monumental: graphene nanoribbons have since been used in a variety of applications because of their novel characteristics. Less than 50 nm wide (about the width of a virus), graphene nanoribbons are 200 times stronger than steel and are excellent conductors of heat and electricity. They can make materials significantly stronger or electrically conductive without adding much weight. It wasn’t until many years after their initial discovery, however, that the lab found that graphene nanoribbons could be used to help heal severed spinal cords.

The idea began after one of Dr. Tour’s students read on Reddit about European research into head and whole-body transplants. This research focused on taking a brain-dead patient with a healthy body and pairing that body with a patient who has brain activity but has lost bodily function. The biggest challenge, however, was fusing the spine back together. The neurons in the two separated parts of the spinal cord could not communicate with one another, and as a result, the animals involved in whole-body and head transplant experiments regained only about 10% of their original motor function. The graduate student contacted the European researchers, who then proposed using the Tour lab’s graphene nanoribbons in their research, as Dr. Tour’s team had already shown that neurons grow very well along graphene.

“When a spinal cord is severed, the neurons grow from the bottom up and the top down, but they pass like ships in the night; they never connect. But if they connect, they will be fused together and start working again. So the idea was to put very thin nanoribbons in the gap between the two parts of the spinal cord to get them to align,” explains Dr. Tour. Nanoribbons are extremely conductive, so when their edges are activated with polyethylene glycol, or PEG, they form an active network that allows the spinal cord to reconnect. This material is called Texas-PEG, and although it is only about 1% graphene nanoribbons, this is still enough to create an electric network through which the neurons in the spinal cord can connect and communicate with one another.

The Tour lab tested this material on rats by severing their spinal cords and applying Texas-PEG to see how much mobility they recovered. The rats scored about 19/21 on a mobility scale after only three weeks, a remarkable improvement over the 10% recovery in previous European trials. “It was just phenomenal. There were rats running away after 3 weeks with a totally severed spinal cord! We knew immediately that something was happening because one day they would touch their foot and their brain was detecting it,” says Dr. Tour. The first human trials will begin in 2017 overseas. Due to FDA regulations, it may be a while before we see trials in the United States, but the FDA will accept data from successful trials in other countries. Graphene nanoribbons may one day become a viable treatment option for spinal injuries.

This isn’t the end of Dr. Tour’s research with graphene nanoribbons. “We’ve combined our research with neurons and graphene nanoribbons with antioxidants: we inject antioxidants into the bloodstream to minimize swelling. All of this is being tested in Korea on animals. We will decide on an optimal formulation this year, and it will be tried on a human this year,” Dr. Tour explained. Most of all, Dr. Tour and his lab would like to see their research with graphene nanoribbons used in the United States to help quadriplegics who suffer from limited mobility due to spinal cord damage. What began as a lucky discovery now has the potential to change the lives of thousands.

References

  1. Wijeratne, Sithara S., et al. Sci. Rep. 2016, 6.
  2. Raji, Abdul-Rahman O., et al. ACS Appl. Mater. Interfaces. 2016, 8 (5), 3551-3556.
  3. Salvatierra, Rodrigo V., et al. Adv. Energy Mater. 2016, 6 (24).
  4. National Academy of Inventors. http://www.academyofinventors.org/ (accessed Feb. 1, 2017).
  5. Zehtab Yazdi, Alireza, et al. ACS Nano. 2015, 9 (6), 5833-5845.

Microbes: Partners in Cancer Research

To millions around the world, the word ‘cancer’ evokes sorrow and fear. For decades, scientists have been trying to combat this disease, but to no avail. Despite the best efforts of modern medicine, about 46% of patients diagnosed with cancer still pass away as a direct result of the disease.1 However, the research performed by Dr. Michael Gustin at Rice University may change the field of oncology forever.

Cancer is a complex and multifaceted disease that is currently not fully understood by medical doctors and scientists. Tumors vary considerably between different types of cancers and from patient to patient, further complicating the problem. Understanding how cancer develops and responds to stimuli is essential to producing a viable cure, or even an individualized treatment.

Dr. Gustin’s research delves into the heart of this problem. The complexity of the human body and its component cells is currently beyond the scope of any one unifying model. For this reason, starting basic research with human subjects would be impractical. Researchers turn instead to simpler eukaryotes in order to understand the signaling pathways involved in the cell cycle and how cells respond to stress.2 Through years of hard work, Dr. Gustin’s studies have made major contributions to the field of oncology.

Dr. Gustin studied a species of yeast, Saccharomyces cerevisiae, and its response to osmolarity. His research uncovered the high osmolarity glycerol (HOG) pathway and mitogen-activated protein kinase (MAPK) cascade, which work together to maintain cellular homeostasis. The HOG pathway is much like a “switchboard [that] control[s] cellular behavior and survival within a cell, which is regulated by the MAPK cascade through the sequential phosphorylation of a series of protein kinases that mediates the stress response.”3 These combined processes allow the cell to respond to extracellular stress by regulating gene expression, cell proliferation, and cell survival and apoptosis. To activate the transduction pathway, the sensor protein Sln1 recognizes a stressor and subsequently phosphorylates, or activates, a receiver protein that mediates the cellular response. This signal transduction pathway leads to the many responses that protect a cell against external stressors. These same protective processes, however, allow cancer cells to shield themselves from the body’s immune system, making them much more difficult to attack.

Dr. Gustin has used this new understanding of the HOG pathway to expand his research into similar pathways in other organisms. Fascinatingly, expressing human orthologs of HOG1 proteins in yeast cells stimulated the same pathway despite the vast evolutionary distance between yeast and mammals. Beyond its evolutionary implications, this result illustrates that the “[HOG] pathway defines a central stress response signaling network for all eukaryotic organisms”.3 Much has already been learned from studies of Saccharomyces cerevisiae, and yet researchers have recently identified an even more representative organism. This fungus, Candida albicans, is the new model under study by Dr. Gustin and serves as the next step toward a working model of cancer and its responses to stressors. Its more complex responses to signaling make it a better working model than Saccharomyces cerevisiae.4 Research on Candida albicans has already added to the research community’s wealth of information, taking great strides toward eventual human applications in medicine. For example, biological therapeutics designed to combat breast cancer cells have already been tested on both Candida albicans biofilms and breast cancer cells with great success.5

This research could eventually be applied to improving current chemotherapy techniques. Current chemotherapy uses cytotoxic chemicals that damage and kill cancerous cells, thereby controlling the size and spread of tumors. Many of these drugs disrupt the cell cycle, preventing cancerous cells from proliferating efficiently. Alternatively, a more aggressive treatment can induce apoptosis, or programmed cell death, in the cancerous cell.6 In both cases, the chemotherapy targets the signaling pathways that control the vital processes of the cancer cell. Dr. Gustin’s research thus plays a vital role in future chemotherapy technologies and the struggle against mutant cancer cells.

According to Dr. Gustin, current chemotherapy is only effective locally: drug toxicity is highest at the site of administration, and the treatment often fails to incapacitate cancer cells farther away. As a result, distant cancer cells are given the opportunity to develop cytoprotective mechanisms that increase their resistance to the drug.7 Currently, a major goal of Dr. Gustin’s research is to discover how and why certain cancer cells are more resistant to chemotherapy. The long-term goal is to understand the major pathways involved in cancer cells’ resistance to apoptosis, and eventually to produce a therapeutic product that targets the crucial pathways and inhibitors. With its specificity, such a drug would vastly increase treatment efficacy and provide humanity with a vital tool against cancer, saving countless lives.

References   

  1. American Cancer Society. https://www.cancer.org/latest-new/cancer-facts-and-figures-death-rate-down-25-since-1991.html (accessed Feb. 3, 2017).
  2. Radmaneshfar, E.; Kaloriti, D.; Gustin, M.; Gow, N.; Brown, A.; Grebogi, C.; Thiel, M. PLOS ONE. 2013, 8, e86067.
  3. Brewster, J.; Gustin, M. Sci. Signal. 2014, 7, re7.
  4. Rocha, C.R.; Schröppel, K.; Harcus, D.; Marcil, A.; Dignard, D.; Taylor, B.N.; Thomas, D.Y.; Whiteway, M.; Leberer, E. Mol. Biol. Cell. 2001, 12, 3631-3643.
  5. Malaikozhundan, B.; Vaseeharan, B.; Vijayakumar, S.; Pandiselvi, K.; Kalanjiam, R.; Murugan, K.; Benelli, G. Microbial Pathogenesis. 2017, 102, n.p. Manuscript in progress.
  6. Shapiro, G.; Harper, J. J Clin Invest. 1999, 104, 1645-1653.
  7. Das, B.; Yeger, H.; Baruchel, H.; Freedman, M.H.; Koren, G.; Baruchel, S. Eur. J. Cancer. 2003, 39, 2556-2565.

Visualizing the Future of Medicine

What do you do when you get sick? Most likely you schedule a doctor’s appointment, show up, and spend ten to fifteen minutes with the doctor. The physician quickly scans your chart and combines your account of your illness with your medical history and his or her observations, so that you leave with diagnosis and prescription in hand. While few give this seemingly routine process a second thought, the very way in which healthcare providers approach the doctor-patient experience is evolving. There is a growing interest in the medical humanities, a more interdisciplinary study of illness. According to Baylor College of Medicine, the aim of the medical humanities is “understanding the profound effects of illness and disease on patients, health professionals, and the social worlds in which they live and work.”1 Yet medical humanities is something of a catch-all term: it encompasses disciplines including literature, anthropology, sociology, philosophy, the fine arts, and even “science and technology studies.”1 This nuanced approach to medicine is exactly what Dr. Kirsten Ostherr, one of the developers of Rice University’s medical humanities program, promotes.

Dr. Ostherr uses this interdisciplinary approach to study the intersection of technology and medicine. She has researched historical medical visualizations in media such as art and film and their application to medicine today. Dr. Ostherr, who received her PhD in American Studies and Media Studies from Brown University, first became interested in medicine and media while working at the Department of Public Health at Oregon Health Sciences University, where researchers were using the humanities as a lens through which to analyze health data. “I noticed that the epidemiologists there used narrative to make sense of data, and that intrigued me,” she said. This inspired Dr. Ostherr to use her background in media and public health to explore how film and media have affected medicine, and to predict where the future of medical media lies.

While the integration of medicine and media may seem revolutionary, it is not a new concept. In her book, Medical Visions, Dr. Ostherr says that “We know we have become a patient when we are subjected to a doctor’s clinical gaze,” a gaze that is powerfully humanizing and can “transform subjects into patients.”2 With the integration of technology and medicine, this “gaze” has extended to include the visualizations vital to understanding the patient and decoding disease. Visualizations have been a part of the doctor-patient experience for longer than one might think, from X-rays in 1912 to the electronic medical records used by physicians today.3

In her book, Dr. Ostherr traces and analyzes a series of medical visualizations throughout history. Her research begins with the scientific films of the early twentieth century and their attempt to bridge the gap between scientific knowledge and the general public.2 The use of film in medical education was also significant in the 20th century; these technical films helped facilitate the globalization of health and media in the postwar era. Another form of medical visualization emerged with the advent of medicine on television. At the intersection of entertainment and education, the medical documentary evolved into “health information programming” in the 1980s, which in turn gave rise to medical reality television.2 The history of this diverse and expanding media, she says, proves that the use of visualizations in healthcare and our daily lives has made medicine “a visual science.”

One of the main takeaways from Dr. Ostherr’s historical analysis is the deep-rooted role visualizations have played in spreading medical knowledge to the average person. While skeptics may argue against this characterization, “this is a broad social change that is taking place,” Dr. Ostherr said, citing new research on human-centered design and the use of visual arts in medical training. “It’s the future of medicine,” she said. There is already evidence that such a change is taking place: the method of recording patient information in health records has begun to change. In recent years there has been a movement to adopt electronic health records because of their potential to save the healthcare industry millions of dollars and improve efficiency.4 Yet recent studies show that the current systems are not as effective as predicted.5 Online patient portals allow patients to keep up with their health information, view test results, and even communicate with their healthcare providers. While these portals can involve patients as active participants in their care, they can also be quite technical.6 As a result, there is a push to develop electronic health records written in more readily understandable language.

To conduct further research in this field, including the development of better, easier-to-understand electronic health records, Dr. Ostherr co-founded and directs the Medical Futures Lab. The lab draws resources from Baylor College of Medicine, the University of Texas Health Science Center, and Rice University, and its diverse team ranges from humanities scholars to doctors to computer scientists.7 The use of technology in medicine has continued to develop rapidly alongside the increasing demand for personalized, humanizing care. While the two may seem inherently in conflict, Dr. Ostherr believes medicine needs the “right balance of high tech and high touch,” which is what her team at the Medical Futures Lab (MFL) works to find. The MFL team works on projects heavily focused on deconstructing and reconstructing the role of the patient in education and diagnosis.7

The increasingly integrated humanistic and scientific approach to medicine is revolutionizing healthcare. As the Medical Futures Lab explores the relationship between personal care and technology, the world of healthcare is undergoing a broad cultural shift. Early in their medical education, physicians are now taught the value of incorporating the humanities and social sciences into their training, and that science can only teach one so much about the doctor-patient relationship. For Dr. Ostherr, the question moving forward will be “what is it that is uniquely human about healing?” What are the limitations of technology in healing, and what parts of the healing process can be done only by the human body? According to Dr. Ostherr, the history of visualizations in medicine can serve as a roadmap and an inspiration for the evolution and implementation of new media and technology in transforming the medical subject into the patient.

References

  1. Baylor University Medical Humanities. http://www.baylor.edu/medical_humanities/ (accessed Nov. 27, 2017).
  2. Ostherr, K. Medical visions: producing the patient through film, television, and imaging technologies; Oxford University Press: Oxford, 2013.
  3. History of Radiography. https://www.nde-ed.org/EducationResources/CommunityCollege/Radiography/Introduction/history.htm (accessed Jan. 2017).
  4. Abelson, R.; Creswell, J. In Second Look, Few Savings From Digital Health Records. New York Times [Online], January 11, 2013. http://www.nytimes.com/2013/01/11/business/electronic-records-systems-have-not-reduced-health-costs-report-says.html (accessed Jan 2017).
  5. Abrams, L. The Future of Medical Records. The Atlantic [Online], January 17, 2013 http://www.theatlantic.com/health/archive/2013/01/the-future-of-medical-records/267202/ (accessed Jan. 25, 2017).
  6. Rosen, M. D. L. High Tech, High Touch: Why Technology Enhances Patient-Centered Care. Huffington Post [Online], December 13, 2012. http://www.huffingtonpost.com/lawrence-rosen-md/health-care-technology_b_2285712.html (accessed Jan 2017).
  7. Medical Futures Lab. http://www.medicalfutureslab.org/ (accessed Dec 2017).

The Fight Against Neurodegeneration

“You know that it will be a big change, but you really don’t have a clue about your future.” A 34-year-old postdoctoral researcher at the Telethon Institute of Genetics and Medicine in Italy at the time, Dr. Sardiello had made a discovery that would change his life forever. Eight years later, Dr. Sardiello is the principal investigator of a lab at the Jan and Dan Duncan Neurological Research Institute (NRI), where he continues the work that brought him and his lab to America.

Throughout his undergraduate career, Sardiello knew he wanted to be involved in biology and genetics research, but his passion was truly revealed in 2000, the year he began his doctoral studies. That year, the full DNA sequence of the common fruit fly was released, constituting the first complete genome of a complex organism. At the time, Sardiello was working in a lab that used fruit flies as a model, and this milestone served to spur his interest in genetics. As the golden age of genetics began, so did Sardiello’s love for the subject, leading to his completion of a PhD in Genetic and Molecular Evolution at the Telethon Institute of Genetics and Medicine. It was at this institute that his team made the discovery that would bring him to America: the function of Transcription Factor EB, known as TFEB.

Many knew of the existence of TFEB, but no one knew its function. Dr. Sardiello and his team changed that. In 2009, they discovered that the gene is the master regulator of lysosomal biogenesis and function; in other words, TFEB works as a genetic switch that turns on the production of new lysosomes.1 Before this discovery, lysosomes were commonly regarded as the incinerator or garbage can of the cell: specialized containers that simply dispose of cellular waste. With the discovery of TFEB’s function, we now know that lysosomes play a much more active role in catabolic pathways and the maintenance of cell homeostasis. Sardiello’s groundbreaking findings were published in Science, one of the most prestigious peer-reviewed journals in the scientific world. Speaking about his success, Sardiello said, “The bottom line was that there was some sort of feeling that a big change was about to come, but we didn’t have a clue what. There was just no possible measure at the time.”

Riding the success of his paper, Sardiello moved to the United States and established his own lab with the purpose of defeating the family of diseases known as the neuronal ceroid lipofuscinoses (NCLs). NCLs are genetic diseases caused by the malfunction of lysosomes: waste accumulates in the cell and eventually blocks cell function, leading to cell death. While NCLs cause cell death throughout the body, neurons, unlike most cell types, do not regenerate, so the diseases are primarily neurodegenerative. While there are many variants of NCLs, all result in premature death after the loss of neural functions such as sight, motor ability, and memory.

“With current technology,” Sardiello said, “the disease is incurable, since it is genetic. In order to cure a genetic disease, you have to somehow bring the correct gene into every single cell of the body.” With our current understanding of biology, this is impossible. Instead, doctors can work to treat the disease and halt the progression of its symptoms. Essentially, his lab has found a way to use TFEB to enhance the function of lysosomes and thereby fight the progression of the NCL diseases.

In addition to genetic enhancement, Sardiello is also focusing on finding drugs that will activate TFEB and thereby increase lysosomal function. To test these new methods, the Sardiello lab uses mouse models that reproduce most of the symptoms seen in NCL patients. “Our current results indicate that drug therapy for NCLs is viable, and we are working to incorporate these strategies into clinical therapy,” Sardiello said. So far, the lab has identified three drugs or drug combinations that may be viable treatments for this otherwise incurable disease.

While it might be easy to talk about NCLs and other diseases in terms of their definitions and effects, it is important to realize that behind every disease are real people and real patients. The goal of the Sardiello Lab is not just to do science and advance humanity, but also to help patients and give them hope. One such patient is a boy named Will Herndon. Will was diagnosed with NCL type 3, and his story is one of resilience, strength, and hope.

When Will was diagnosed with Batten disease at the age of six, the doctors informed him and his family that there was little they could do; at the time, there was little to no viable research in the field. Despite facing a terminal illness, however, Will and his parents never lost sight of what was most important: hope. Where others might have given up, Missy and Wayne Herndon instead founded The Will Herndon Research Fund, also known as HOPE, in 2009, playing a large role in bringing Dr. Sardiello and his lab to the United States. Each year, the foundation holds a fundraiser to raise awareness and money toward defeating the NCL diseases. At its inception, the fundraiser had only a couple hundred attendees; now, only half a decade later, thousands of like-minded people arrive each year to support Will and others with the same disease. “Failure is not an option,” Missy Herndon said forcefully during the 2016 banquet. “Not for Will, and not for any other child with Batten disease.” It was clear from the strength of her words that she believed in the science and in the research.

“I have a newborn son,” Sardiello said, recalling the speech. “I can’t imagine going through what Missy and Wayne had to. I felt involved and I felt empathy, but most of all, I felt respect for Will’s parents. They are truly exceptional people and go far and beyond what anyone can expect of them. In face of adversity, they are tireless, they won’t stop, and their commitment is amazing.”

When one hears about science and labs, it usually brings to mind arrays of test tubes and flasks or the futuristic possibilities of science. In all of this, one tends to forget about the people behind the test bench: the scientists that conduct the experiments and uncover the next step in the collective knowledge of humanity, people like Dr. Sardiello. However, Sardiello isn’t alone in his endeavors, as he is supported by the members of his lab.

The researchers in Sardiello’s lab hail from four different countries, all working toward a common cause: Parisa Lombardi from Iran; Lakshya Bajaj, Jaiprakash Sharma, and Pal Rituraj from India; Abdallah Amawi from Jordan; and, of course, Marco Sardiello and Alberto di Ronza from Italy. Despite vast differences in geography and culture, the chemistry among the team was palpable; their paths to America varied, but their conviction that they had a responsibility to help other people and defeat disease was always the same.

Humans have always been predisposed to move forward. It is because of this propensity that we have been able to eradicate diseases and reshape the environments that surround us. Behind all of our achievements lies scientific advancement, and behind it are the people we so often forget. Science shouldn’t be detached from the humans working to advance it, but rather integrated with the men and women working to make the world a better place. Dr. Sardiello and his lab represent the constant innovation and curiosity of the research community, ideals that are validated in the courage of Will Herndon and his family. In many ways, the Sardiello lab embodies what science truly represents: humans working for something far greater than themselves.

References

  1. Sardiello, M.; Palmieri, M.; di Ronza, A.; Medina, D.L.; Valenza, M.; Alessandro, V. Science. 2009, 325, 473-477.

 

Comment

Cognitive Neuroscience: A Glimpse of the Future

Comment

Cognitive Neuroscience: A Glimpse of the Future

Catalyst Volume 10

Cognitive Neuroscience is a branch of science that addresses the processes in the brain that occur during cognitive activity. The discipline addresses how psychological and cognitive activities are caused by and correlated to the neural connections in our brain. It bridges psychology and neuroscience.

Dr. Simon Fischer-Baum, an assistant professor and researcher at Rice University, co-directs the neuroplasticity lab at the BioScience Research Collaborative. He received his B.A. in Neuroscience and Behavior from Columbia University in 2003 and his Ph.D. in Cognitive Sciences from Johns Hopkins University in 2010.

Dr. Fischer-Baum describes his research as the “intersection of psychology and neuroscience and computer science to some extent.” He is interested in how we understand and pronounce a word once we see it, and he also studies memory and how information is encoded in the brain. In his opinion, functional magnetic resonance imaging (fMRI) and other tools of cognitive neuroscience are highly relevant to cognitive psychology, despite a common perception to the contrary; he sees a “serious disconnect” resulting from the belief that the methods and findings of cognitive neuroscience do not apply to cognitive psychology. Cognitive psychologists have long been attempting to characterize the different levels of processing and how information travels between these levels, and cognitive neuroscience can help achieve these goals through the use of fMRI.

fMRI shows which parts of the brain are active while the subject performs a task. During any task, multiple regions of the brain are involved, with each region processing different types of information. For example, reading a word involves processing both visual information and meaning; when you are reading a word, multiple regions of the brain are active. However, one limitation of fMRI is that while it shows which regions of the brain are active, it does not convey what function each region is carrying out. One of the main objectives of Dr. Fischer-Baum’s work is to develop new computational methods to decode what fMRI data tell us about the tasks the brain is performing. “I want to be able to take patterns of activity and decode and relate it back to the levels of representation that cognitive psychologists think are going on in research,” Dr. Fischer-Baum explains.
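The kind of decoding described above is often framed as pattern classification: given voxel activity patterns labeled by experimental condition, learn to predict the condition that produced a new pattern. The sketch below is purely illustrative, a toy nearest-centroid decoder on invented "voxel" data, and does not represent the lab's actual methods; all names and numbers are made up for the example.

```python
import math
from collections import defaultdict

def train_centroids(patterns, labels):
    """Average the voxel activity patterns recorded for each condition."""
    sums, counts = defaultdict(lambda: None), defaultdict(int)
    for p, lab in zip(patterns, labels):
        sums[lab] = p if sums[lab] is None else [a + b for a, b in zip(sums[lab], p)]
        counts[lab] += 1
    return {lab: [v / counts[lab] for v in s] for lab, s in sums.items()}

def decode(centroids, pattern):
    """Label a new pattern with the condition whose centroid is closest."""
    dist = lambda c: math.sqrt(sum((a - b) ** 2 for a, b in zip(pattern, c)))
    return min(centroids, key=lambda lab: dist(centroids[lab]))

# Toy training data: four "voxels", two hypothetical conditions
patterns = [[1.0, 0.1, 0.9, 0.0], [0.9, 0.0, 1.1, 0.1],
            [0.0, 1.0, 0.1, 0.9], [0.1, 0.9, 0.0, 1.1]]
labels = ["visual", "visual", "semantic", "semantic"]
cents = train_centroids(patterns, labels)
print(decode(cents, [1.0, 0.0, 1.0, 0.0]))  # "visual"
```

Real analyses work with tens of thousands of voxels and use cross-validation to confirm that decoding accuracy beats chance.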

Recently, Dr. Fischer-Baum published a study of a patient who suffered severe written-language impairments after a hemorrhagic stroke. Although this patient’s reading of familiar words improved over the years, he still had difficulty processing abstract letter identity information for individual letters. Someone who can use abstract letter representations recognizes letters independent of case or font; in other words, this person can identify letters regardless of whether they are upper case, lower case, or in a different font. In the patient studied, Dr. Fischer-Baum’s team observed contralesional reorganization: the functions of compromised orthography-processing regions (regions that process the set of conventions for writing a language) in the left hemisphere had shifted to homologous regions in the right hemisphere. Through the use of fMRI, the research team determined that the patient’s residual reading ability was supported by functional take-over, in which the functions of injured regions are taken over by healthy brain regions. These results were found by scanning the patient’s brain as he read and comparing the data with those of a control group of healthy young adults with normal brain function.

While Dr. Fischer-Baum has made substantial progress in this project, the research has not been without challenges. The project began in 2013 and took three years to complete, a long time in Dr. Fischer-Baum’s field of study. Because students rotated in and out while working on various parts of the project, none of the Rice University co-authors worked on it at the same time as their peers, and many never met. In addition, the project’s interdisciplinary approach required the input of many collaborators with different abilities. The Rice undergraduates who worked on the project came from different majors, though most were from the Cognitive Sciences and Statistics Departments. Since the students came from different backgrounds, they had different approaches to solving problems, which at times led to miscommunication and friction among the students and researchers on the project.

Another major setback occurred in bringing ideas to fruition. “You realize quickly when you begin a project that there are a million different ways to solve the problem that you are researching, and trying to decide which is the right or best way can sometimes be difficult,” Dr. Fischer-Baum said. As a result, there have been many false starts, and it has taken a long time to get work off the ground. How did Dr. Fischer-Baum get past this problem? “Time, thinking, discussion, and brute force,” he chuckled. “You realize relatively quickly that you need to grind it out and put in effort in order to get the job done.”

Despite these obstacles, Dr. Fischer-Baum has undertaken other projects to keep his mind busy. In one, he works with stroke patients with reading or writing deficits to understand how written language is broken down in the mind, studying specific patterns in the patients’ brain activity to investigate how reading and writing ability differ from each other. In another project, he works with Dr. Paul Englebretson of the Linguistics Department to study the brain activity of blind people as they read Braille. “There is a lot of work on how the reading system works, but a lot of it is based on the perspective of reading by sight,” Dr. Fischer-Baum acknowledged. “I am very interested to see how the way we read is affected by properties of our visual system. Comparing sight and touch can show how much senses are a factor in reading.”

Ultimately, Dr. Fischer-Baum conducts his research with several goals in mind. The first is to build an approach to cognitive neuroscience that is relevant to the kinds of theories that we have in the other cognitive sciences, especially cognitive psychology. “While it feels like studying the mind and studying the brain are two sides of the same coin and that all of this data should be relevant for understanding how the human mind works, there is still a disconnect between the two disciplines,” Dr. Fischer-Baum remarked. He works on building methods in order to bridge this disconnect.

Beyond these goals for advancing the field of cognitive neuroscience, Dr. Fischer-Baum’s research has clinical implications as well. Insight into brain plasticity following strokes can be used to build better treatment and recovery programs. Although the research requires further development, the similarity between different regions and their adaptations following injury can lead to a better understanding of the behavioral and neural differences in patterns of recovery. Additionally, Dr. Fischer-Baum aims to understand the relationship between spontaneous and treatment-induced recovery, and how patterns of language recovery differ depending on the type and location of the initial brain injury. Through the combined use of cognitive psychology and fMRI data, the brains of different stroke patients can be mapped, and the data can be used to create more successful treatment-induced methods of language recovery. By virtue of Dr. Fischer-Baum’s research, not only can cognitive neuroscience be applied to many other disciplines, but it can also significantly improve the lives of millions of people around the world.

 

Comment

Haptics: Touching Lives

Comment

Haptics: Touching Lives

Every day you use a device that has haptic feedback: your phone. Every little buzz for a notification, key press, or failed unlock is an example of haptic feedback. Haptics is essentially tactile feedback, a form of physical feedback that uses vibrations. The field is undergoing massive development, and applications of haptic technology are expanding rapidly. Up-and-coming uses for haptics include navigational cues while driving, video games, virtual reality, robotics, and, as in Dr. O’Malley’s case, the medical field, with prostheses and medical training tools.

Dr. Marcia O’Malley has been involved in the biomedical field ever since working in an artificial knee implant research lab as an undergraduate at Purdue University. While in graduate school at Vanderbilt University, she worked in a lab focused on human-robot interfaces, where she spent her time designing haptic feedback devices. Dr. O’Malley currently runs the Mechatronics and Haptic Interfaces (MAHI) Lab at Rice University, and she was recently awarded a million-dollar National Robotics Initiative grant for one of her projects. The MAHI Lab “focuses on the design, manufacture, and evaluation of mechatronic or robotic systems to model, rehabilitate, enhance or augment the human sensorimotor control system.”1 Her current research is focused on prosthetics and rehabilitation with an effort to include haptic feedback. She is currently working on the MAHI EXO-II. “It’s a force feedback exoskeleton, so it can provide forces, it can move your limb, or it can work with you,” she said. The primary project involving this exoskeleton is focused on “using electrical activity from the brain captured with EEG… and looking for certain patterns of activation of different areas of the brain as a trigger to move the robot.” In other words, Dr. O’Malley is attempting to enable exoskeleton users to control the device through brain activity.
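As a rough intuition for EEG-triggered control, one can imagine the simplest possible detector: fire when the power of the signal in a sliding window crosses a threshold. This is a hypothetical stand-in, not the MAHI Lab's actual signal processing, which looks for spatial patterns of activation across brain areas; every value and name below is invented for illustration.

```python
def eeg_trigger(samples, window=5, threshold=1.0):
    """Return the first sample index where the mean squared EEG amplitude
    over a sliding window exceeds the threshold, else None.
    A toy stand-in for detecting a pattern of brain activation."""
    for i in range(len(samples) - window + 1):
        power = sum(s * s for s in samples[i:i + window]) / window
        if power > threshold:
            return i
    return None

rest = [0.1, -0.2, 0.15, -0.1, 0.05] * 2   # low-amplitude baseline signal
intent = [1.5, -1.4, 1.6, -1.5, 1.4]       # high-amplitude burst of activity
print(eeg_trigger(rest + intent))          # → 8: fires as the burst enters the window
```

A real system would band-pass filter the signal, compare activity across many electrodes, and be validated against false triggers before ever moving a robot.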

Dr. O’Malley is also conducting another project, funded by the National Robotics Initiative grant, to develop a haptic cueing system that aids medical students training for endovascular surgery. The idea for this system came from two sources. The first was her prior research with joysticks: she worked on a project that used a force-feedback joystick to swing a ball to hit targets.2 Through this research, Dr. O’Malley found that “we could measure people’s performance, we could measure how they used the joystick, how they manipulated the ball, and just from different measures about the characteristics of the ball movement, we could determine whether you were an expert or a novice at the task… If we use quantitative measures that tell us about the quality of how they’re controlling the tools, those same measures correlate with the experience they have.” After talking to surgeons, Dr. O’Malley found that these techniques for measuring movement could work well for training surgeons.
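One family of quantitative movement measures like those described above is smoothness metrics based on jerk, the third derivative of position: smoother, more expert-like trajectories have lower jerk. The following is a minimal, illustrative sketch, not the published study's actual metric, with made-up trajectory data.

```python
def jerk_rms(positions, dt):
    """Root-mean-square jerk (third derivative of position) of a 1-D
    trajectory sampled every dt seconds. Lower values indicate smoother,
    more expert-like tool movement."""
    vel = [(b - a) / dt for a, b in zip(positions, positions[1:])]
    acc = [(b - a) / dt for a, b in zip(vel, vel[1:])]
    jerk = [(b - a) / dt for a, b in zip(acc, acc[1:])]
    return (sum(j * j for j in jerk) / len(jerk)) ** 0.5

dt = 0.1
smooth = [0.5 * t * t for t in [i * dt for i in range(20)]]      # constant acceleration
jittery = [s + (0.05 if i % 2 else -0.05) for i, s in enumerate(smooth)]
print(jerk_rms(smooth, dt) < jerk_rms(jittery, dt))  # True
```

A trajectory with constant acceleration has essentially zero jerk, while the jittery one scores far higher, the kind of separation that lets a metric distinguish expert from novice tool control.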

The second impetus for this research came from an annual conference about haptics and force feedback. At the conference she noticed that more and more people were moving towards wearable haptics, such as the Fitbit, which vibrates on your wrist. She also saw that everyone was using these vibrational cues to give directional information. However, “nobody was really using it as a feedback channel about performance,” she said. These realizations led to the idea of the vibrotactile feedback system.

Although the project is still in its infancy, the current anticipated product is a virtual reality simulator which will track the movements of the tool. According to Dr. O’Malley, the technology would provide feedback through a single vibrotactile disk worn on the upper limb. The disk would use a voice coil actuator that moves perpendicular to the wearer’s skin. Dr. O’Malley is currently working with Rice psychologist Dr. Michael Byrne to determine which frequency and amplitude to use for the actuator, as well as the timing of the feedback to avoid interrupting or distracting the user.
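As a back-of-the-envelope illustration of the cue itself, a vibrotactile burst can be described by just a frequency, an amplitude, and a duration, the same parameters Drs. O'Malley and Byrne are tuning. The values below are placeholders, not the project's chosen settings.

```python
import math

def vibration_burst(freq_hz, amplitude, duration_s, sample_rate=1000):
    """Sample a sinusoidal drive signal for a voice coil actuator:
    a short burst at the chosen frequency and amplitude."""
    n = int(duration_s * sample_rate)
    return [amplitude * math.sin(2 * math.pi * freq_hz * i / sample_rate)
            for i in range(n)]

# A 250 Hz, 100 ms cue; frequencies in this range are often cited as
# being near the skin's peak vibrotactile sensitivity.
burst = vibration_burst(freq_hz=250, amplitude=1.0, duration_s=0.1)
print(len(burst))  # 100 samples at the default 1 kHz sample rate
```

In practice the interesting design questions are exactly the ones the article names: which frequency and amplitude the wearer perceives reliably, and when to deliver the burst so it informs rather than distracts.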

Ultimately, this project would measure the medical students’ smoothness and precision while using tools, as well as give feedback to the students regarding their performance. In the future, it could also be used in surgeries during which a doctor operates a robot and receives force feedback through similar haptics. During current endovascular surgery, a surgeon uses screens that project a 2D image of the tools in the patient. Incorporating 3D views would need further FDA approval and could distract and confuse surgeons given the number of screens they would have to monitor. This project would offer surgeons a simpler way to operate. From exoskeletons to medical training, there is a huge potential for haptic technologies. Dr. O’Malley is making this potential a reality.

References

  1. Mechatronics and Haptic Interfaces Lab Home Page. http://mahilab.rice.edu (accessed Nov. 7, 2016).
  2. O’Malley, M. K. et al. J. Dyn. Sys., Meas., Control. 2005, 128 (1), 75-85.

Comment

The Depressive Aftermath of Brain Injury

Comment

The Depressive Aftermath of Brain Injury

One intuitively knows that experiencing a brain injury is often painful and terrifying; the fact that it can lead to the onset of depression, however, is a lesser known but equally serious concern. Dr. Roberta Diddel, a clinical psychologist and member of the adjunct faculty in the Psychology Department at Rice University, focuses on the treatment of individuals with mental health issues and cognitive disorders. In particular, she administers care to patients with cognitive disorders due to traumatic brain injury (TBI). Dr. Diddel acquired a PhD in clinical psychology from Boston University and currently runs a private practice in Houston, Texas. Patients who experience TBI often experience depression; Dr. Diddel uses her understanding of how this disorder comes about to create and administer potential treatments.

Traumatic brain injury (TBI) affects each patient differently based on which region of the brain is damaged. If a patient has a cerebellar stroke, affecting the region of the brain which regulates voluntary motor movements, he or she might experience dizziness and have trouble walking. However, that patient would be able to take a written test because the injury has not affected higher order cognitive functions such as language processing and critical reasoning.

Dr. Diddel said, “Where you see depression the most is when there is a more global injury, meaning it has affected a lot of the brain. For example, if you hit your forehead in a car accident or playing a sport, you’re going to have an injury to the front and back parts of your brain because your brain is sitting in cerebrospinal fluid, causing a whiplash of sorts. In turn, this injury will cause damage to your frontal cortex, responsible for thought processing and problem solving, and your visual cortex, located in the back of your brain. When your brain is bouncing around like that, you often have swelling which creates intracranial pressure. Too much of this pressure prevents the flow of oxygen-rich blood to the brain. That can cause more diffuse brain injury.”

In cases of severe brain injury, such as head trauma from an explosion or a bullet, surgeons may remove blood clots that have formed in order to relieve intracranial pressure, and repair skull fractures.4 They may also remove a section of the skull for weeks or months at a time to let the brain swell unconstrained by the small cranial cavity. That procedure alone significantly reduces the damage from such injuries and is especially useful on the battlefield, where urgent care trauma centers may not be available.

Depression is a common result of TBI. The Diagnostic and Statistical Manual of Mental Disorders (DSM) defines depression as a loss of interest or pleasure in daily activities for more than two weeks, a change in mood, and impaired function in society.1 These symptoms are caused by biochemical deficiencies in the brain that disrupt the nervous system. Usually, depression involves physical changes in the prefrontal cortex, the area of the brain associated with decision-making, social behavior, and personality. People with depression feel overwhelmed and anxious, lose their appetite, and lack energy, often because of depleted serotonin levels. The disorder is a mixture of chemical imbalance and state of mind; if the brain is not functioning correctly, a depressed state of mind follows.

Dr. Diddel mentioned that in many of her depressed patients, their lack of motivation prevents them from addressing and improving their toxic mindset. “If you’re really feeling bad about your current situation, you have to be able to say ‘I can’t give in to this. I have to get up and better myself and my surroundings.’ People that are depressed are struggling to do that,” she said.

The causes of depression vary from patient to patient and often depend on genetic predisposition to the disease. Depression can arise from physical changes in the brain, such as alterations in the levels of catecholamines, neurotransmitters that work throughout the sympathetic and central nervous systems. Catecholamines such as dopamine, norepinephrine, and epinephrine, like serotonin, are released during times of positive stimulation and help increase activity in specific parts of the brain; a decrease in these chemicals after an injury can affect emotion and thought processes. Emotionally, the patient might have a hard time dealing with a new disability or a change in societal role due to the trauma. Additionally, patients who carried genes predisposing them to depression before the injury are more prone to suffering from the disorder after the injury.2,3

Depression is usually treated with some form of therapy or antidepressant medication. In cognitive behavior therapy (CBT), the psychologist tries to change the perceptions and behaviors that exacerbate a patient’s depression. Generally, the doctor starts by attempting to change the patient’s behavior, because it is the aspect of his or her current situation that can most readily be changed. Dr. Diddel suggests such practices to her patients, saying things like “I know you don’t feel like it, but I want you to go out and walk every day.” Walking or any form of exercise increases catecholamines, which in turn increases the activity of serotonin in the brain and improves the patient’s mood. People who exercise as part of their treatment regimen are also less likely to experience another episode of depression.

The efficacy of antidepressant medication varies from patient to patient, depending on the severity of the depression. People with mild to moderate depression generally respond better to CBT because the treatment aims to change their mindset and how they perceive the world around them. CBT can gradually resolve a patient’s depression as he or she perceives surrounding stimuli differently, gets out and moves more, and pursues healthy endeavors. Psychologists usually begin with CBT; if the patient does not respond well, medication is added. Some medications increase serotonin levels, while others target serotonin, dopamine, and norepinephrine together; as a result, they boost the levels of neurotransmitters that increase arousal and dampen negative emotions. Patients with moderate to severe depression usually respond better to antidepressant medication, which can restore ideal levels of neurotransmitters and in turn encourage the patient to practice healthier behavior.

According to the Centers for Disease Control and Prevention, the U.S. saw about 2.5 million cases of traumatic brain injury in 2010 alone.5 That number rises every year, and with it the number of patients who suffer from depression in the aftermath.5 Though the disorder has been studied for decades and treatment options and medications are available, depression remains an enigma to physicians and researchers alike. No two brains are wired the same, making it very difficult to devise a treatment plan with a guaranteed success rate. The work of researchers and clinical psychologists like Dr. Diddel, however, aims to improve the treatments currently available. While no two patients are the same, understanding each individual’s depression and tailoring treatment to the specific case can vastly improve the patient’s outcome.

References

  1. American Psychiatric Association. Diagnostic and statistical manual of mental disorders (5th ed.). Washington, DC, 2013.
  2. Fann, J. Depression After Traumatic Brain Injury. Model Systems Knowledge Translation Center [Online]. http://www.msktc.org/tbi/factsheets/Depression-After-Traumatic-Brain-Injury (accessed Dec. 28, 2016).
  3. Fann, J.R., Hart, T., Schomer, K.G. J. Neurotrauma. 2009, 26, 2383-2402.
  4. Mayo Clinic Staff. Traumatic Brain Injury. Mayo Clinic, May 15, 2014. http://www.mayoclinic.org/diseases-conditions/traumatic-brain-injury/basics/treatment/con-20029302 (accessed Dec. 29, 2016).
  5. Injury Prevention and Control. Centers for Disease Control and Prevention. https://www.cdc.gov/traumaticbraininjury/get_the_facts.html (accessed Dec. 29, 2016).

Comment