Rewriting the Genetic Code


DNA, or deoxyribonucleic acid, is at the root of all life as we know it. Using just four component nucleotide bases, DNA contains all the information needed to build any protein. To make a protein, DNA is transcribed into mRNA, which is in turn “translated” into protein by molecules called ribosomes. More specifically, ribosomes “read” the mRNA in groups of three bases called codons, each of which codes for one of the 20 standard amino acids. The set of rules that determines which codon codes for which amino acid is called the genetic code, and it is common to almost all known organisms: everything from bacteria to humankind shares the same protein expression schema. It is this common code that allows synthetic biologists to insert one organism’s genes into another to create transgenic organisms; generally, a gene is expressed the same way no matter which organism possesses it. Interestingly, some scientists are attempting to alter this biological norm: rather than modifying genes, they are modifying the genetic code itself in order to create genetically recoded organisms, or GROs.

This modification is possible due to the redundancy of the genetic code. Because there are 64 possible codons and only 20 standard amino acids, many amino acids are specified by more than one codon. Theoretically, then, researchers should be able to swap every instance of a particular codon with a synonymous one without harming a cell, then repurpose the eliminated codon.1 This was proven possible in a 2013 paper by Lajoie et al. published in Science; in it, a team of scientists working with E. coli cells substituted all instances of the codon UAG, which signals cells to stop translation of a protein, with the functionally equivalent sequence UAA. They then deleted release factor 1 (RF1), the protein that gives UAG its stop function. Finally, they reassigned UAG to code for a non-standard amino acid (NSAA) of their choice.1
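The swap-then-reassign idea can be sketched in a few lines of Python. The codon table below is a tiny excerpt of the real 64-codon genetic code, and the "NSAA" label is a stand-in for whichever non-standard amino acid a researcher might choose.

```python
# Toy model of codon reassignment. CODON_TABLE is a small excerpt of the
# standard genetic code, not the full 64-codon table.
CODON_TABLE = {
    "AUG": "Met", "UUU": "Phe", "UUC": "Phe",  # Phe: two synonymous codons
    "GGU": "Gly", "GGC": "Gly",
    "UAG": "STOP", "UAA": "STOP",
}

def translate(mrna, table):
    """Read an mRNA three bases at a time until a stop codon."""
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        aa = table[mrna[i:i + 3]]
        if aa == "STOP":
            break
        protein.append(aa)
    return protein

def recode(mrna):
    """Swap every in-frame UAG for the synonymous stop codon UAA."""
    codons = [mrna[i:i + 3] for i in range(0, len(mrna), 3)]
    return "".join("UAA" if c == "UAG" else c for c in codons)

# After recoding, UAG is free to mean something new in the GRO.
gro_table = dict(CODON_TABLE)
gro_table["UAG"] = "NSAA"  # hypothetical non-standard amino acid

wild = "AUGUUUGGUUAG"  # Met-Phe-Gly, then stop
print(translate(wild, CODON_TABLE))        # ['Met', 'Phe', 'Gly']
print(translate(recode(wild), gro_table))  # same protein from the recoded gene
```

Because UAA already means "stop," the recoded gene yields the same protein, which is exactly why the swap is harmless.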

In a more recent paper, Ostrov et al. took this recoding even further by excising seven codons from the E. coli genome, reducing it to 57 codons. Because these codons appear 62,214 times in the E. coli genome, the researchers could not excise them individually with typical gene-editing strategies. Instead, they resorted to synthesizing long stretches of the modified genome, inserting them into the bacteria, and testing to make sure the modifications were not lethal. At the time of publication, they had tested 63% of the recoded genes and found that most of the changes had not significantly impaired the bacteria’s fitness, indicating that such large-scale changes to the genetic code are feasible.2
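To get a feel for the scale of the problem, here is a sketch of the counting step: tallying every in-frame occurrence of the targeted codons across a set of coding sequences. The seven-codon set follows the 57-codon design described in the paper, but the two short genes are invented for illustration.

```python
from collections import Counter

# The seven codons slated for removal in the 57-codon design (DNA form),
# listed here for illustration.
FORBIDDEN = {"AGA", "AGG", "AGC", "AGT", "TTA", "TTG", "TAG"}

def count_forbidden(coding_sequences):
    """Tally every in-frame occurrence of a forbidden codon."""
    counts = Counter()
    for cds in coding_sequences:
        for i in range(0, len(cds) - 2, 3):
            codon = cds[i:i + 3]
            if codon in FORBIDDEN:
                counts[codon] += 1
    return counts

# Two made-up genes; the real genome has 62,214 such sites to rewrite.
genes = ["ATGTTAAGACGTTAG", "ATGAGCGGTTTGTAA"]
print(count_forbidden(genes))
```

Scanning is the easy part; replacing tens of thousands of sites at once is what forced the researchers toward wholesale genome synthesis.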

Should Ostrov’s team succeed in their recoding, the resulting GRO would have a number of possible applications. One is the creation of virus-resistant cells.1,2,3 Because a GRO’s protein expression machinery is modified, viral DNA injected into recoded bacteria would be improperly translated if it contained any of the repurposed codons. Lajoie’s team demonstrated such resistance in an experiment that infected E. coli modified to have no UAG codons with two types of viruses: a T7 virus that contained UAG codons in critical genes, and a T4 virus that did not. As expected, the modified cells showed resistance to T7, but were infected normally by T4. The researchers concluded that more extensive genetic code modifications would probably make the bacteria entirely immune to viral infection.2 Using such organisms in lieu of unmodified ones in bacteria-dependent processes like cheese- and yogurt-making, biogas manufacturing, and hormone production would reduce costs by eliminating the hassle and expense associated with viral infection.3,4 It should be noted that while GROs would theoretically be resistant to infection, they would also be unable to “infect” anything themselves: GRO genes taken up by other organisms would likewise be improperly translated, making horizontal gene transfer impossible. GROs are therefore “safe” in the sense that, unlike other GMOs, they cannot spread their genes to organisms in the wild.5
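The mistranslation argument can be demonstrated with a toy translator. The miniature codon table and the sequences below are invented for illustration; the point is only that the same viral message decodes differently in the two hosts.

```python
# Why a UAG-dependent virus fails inside a recoded host.
# TABLE is a tiny illustrative codon table, not the real genetic code.
TABLE = {"AUG": "Met", "UUU": "Phe", "GGG": "Gly",
         "UAG": "STOP", "UAA": "STOP"}

def translate(mrna, table):
    """Read an mRNA three bases at a time until a stop codon."""
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        aa = table[mrna[i:i + 3]]
        if aa == "STOP":
            break
        protein.append(aa)
    return protein

gro_table = dict(TABLE)
gro_table["UAG"] = "NSAA"  # UAG reassigned in the GRO; RF1 is deleted

viral_gene = "AUGUUUUAGGGG"  # a T7-like gene that relies on UAG to stop

print(translate(viral_gene, TABLE))      # normal host: ['Met', 'Phe']
print(translate(viral_gene, gro_table))  # GRO: read-through, garbled protein
```

In the GRO the ribosome reads straight through the viral stop signal, so the virus gets a longer, wrong protein and cannot replicate properly.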

GROs could also be used to make novel proteins. The eliminated codons could be repurposed to code for NSAAs, making it possible to produce a range of proteins with expanded chemical properties, free from the limits imposed by the 20 standard amino acids.2,3 These proteins could then be used in medical or industrial applications. For example, the biopharmaceutical company Ambrx develops proteins containing NSAAs for use as medicines to treat cancer and other diseases.6

While GROs can do incredible things, they are not without their drawbacks. Proteins produced by the modified cells could turn out to be toxic, and if GROs managed to escape from the lab into the wild, they could flourish due to their resistance to viral infection.2,3 To prevent this scenario, Ostrov’s team has devised a failsafe. In previous experiments, the researchers modified bacteria so that two essential genes, adk and tyrS, depend on NSAAs to function. Because NSAAs are not found in the wild, this modification effectively confines the bacteria to the lab, and it is difficult for the organisms to thwart this containment strategy spontaneously. Ostrov et al. intend to implement this failsafe in their 57-codon E. coli.2

Genetic recoding is an exciting development in synthetic biology, one that offers a new paradigm for genetic modification. Though the field is still young and the sheer amount of DNA changes needed to recode organisms poses significant challenges to the creation of GROs, genetic recoding has the potential to yield tremendously useful organisms.

References

  1. Lajoie, M. J., et al. Science 2013, 342, 357-360.
  2. Ostrov, N., et al. Science 2016, 353, 819-822.
  3. Bohannon, J. Biologists are close to reinventing the genetic code of life. Science, Aug. 18, 2016. http://www.sciencemag.org/news/2016/08/biologists-are-close-reinventing-genetic-code-life (accessed Sept. 29, 2016).
  4. Diep, F. What are genetically recoded organisms? Popular Science, Oct. 15, 2013. http://www.popsci.com/article/science/what-are-genetically-recoded-organisms (accessed Sept. 29, 2016).
  5. Commercial and industrial applications of microorganisms. http://www.contentextra.com/lifesciences/files/topicguides/9781447935674_LS_TG2_2.pdf (accessed Nov. 1, 2016).
  6. Ambrx. http://ambrx.com/about/ (accessed Oct. 6, 2016).

CRISPR: The Double-Edged Sword of Genetic Engineering

The phrase “genetic engineering” often brings to mind a mad scientist manipulating mutant genes or a Frankenstein-like creation emerging from a test tube. Brought to the forefront by heated debates over genetically modified crops, genetic engineering has long been viewed as a difficult, risky process fraught with errors.1

The advent of CRISPR, short for clustered regularly interspaced short palindromic repeats, turned the world of genetic engineering on its head. Pioneered as an editing tool by Jennifer Doudna of Berkeley2 in 2012 and praised as the “Model T” of genetic engineering,3 CRISPR is both transforming what it means to edit genes and raising difficult ethical and moral questions about genetic engineering as a discipline.

CRISPR itself is no new discovery. The repeats are sequences used by bacteria and other microorganisms to protect against viral infections. Upon invasion by a virus, the CRISPR machinery identifies DNA segments from the invading virus, processes them into “spacers” stored between short palindromic repeats, and inserts them into the bacterial genome.4 When the bacterial DNA undergoes transcription, the resulting RNA is a single-chain molecule that acts as a guide to destroy viral material. In a way, the RNA functions as a blacklist for the bacterial cell: re-invasion attempts by the same virus are quickly identified using the DNA record and subsequently stopped.

That same blacklist enables CRISPR to be a powerful engineering tool. The spacers act as easily identifiable flags in the genome, allowing for high precision when manipulating individual nucleotide sequences in genes.5 The old biotechnology toolkit can be pictured as a confused traveler holding an inaccurate map, with only a general location and a vague description of the person to meet; CRISPR provides a mugshot of that person and the precise coordinates of where to find them. Scientists have taken advantage of this precision and now use modified proteins, such as Cas9, to activate gene expression rather than cut the DNA,6 an innovative style of genetic engineering. Traditional genetic engineering can be a shot in the dark, but with the accuracy of CRISPR, off-target mutations are very rare.7 For the first time, scientists can pinpoint the exact location of genes, cut the desired sequence, and leave little collateral damage. Another benefit of CRISPR is the reincorporation of genes lost through breeding or evolution, bringing back extinct qualities. For example, scientists have succeeded in introducing mammoth genes into living elephant cells.8,9 Even better, CRISPR is inexpensive, costing around $75 to edit a gene at Baylor College of Medicine,10 and accessible to anyone with biological expertise, starting with graduate students.11 The term “Model T of genetic engineering” could hardly be more appropriate.
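The targeting step can be caricatured in code: scan a DNA strand for a site matching a 20-nucleotide guide sequence followed by an NGG PAM (the motif standard Cas9 requires), and report where the cut would fall, about 3 bp upstream of the PAM. The sequences below are invented, and a real search would also handle the reverse strand and near-matches.

```python
def find_cut_sites(genome, guide):
    """Indices where Cas9 would cut: guide match followed by an NGG PAM."""
    sites = []
    n = len(guide)
    for i in range(len(genome) - n - 2):
        # genome[i+n] is the 'N' of the PAM and may be any base
        if genome[i:i + n] == guide and genome[i + n + 1:i + n + 3] == "GG":
            sites.append(i + n - 3)  # blunt cut ~3 bp upstream of the PAM
    return sites

genome = "TTTACGTACGTACGTACGTACGTAGGAAA"  # made-up sequence with one target
guide = "ACGTACGTACGTACGTACGT"            # 20-nt guide
print(find_cut_sites(genome, guide))      # [20]
```

The "mugshot plus coordinates" analogy maps directly onto the two conditions in the `if`: the guide is the mugshot, the PAM check pins down the exact address.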

CRISPR stretches the boundaries of bioengineering. One enterprising team from China, led by oncologist Dr. Lu You, has already begun trials on humans. They plan to inject cells modified with the CRISPR-Cas9 system into patients with metastatic non-small cell lung cancer, patients who otherwise have little hope of survival.12 T cells, critical immune mediators, will be extracted from each patient and edited with the CRISPR-Cas9 system. CRISPR will identify and “snip out” a gene that encodes PD-1, a protein that acts as a check on the T cell’s capacity to launch an immune response. Essentially, Lu’s team is creating super T cells, ones that have no mercy for any suspicious activity. The operation is risky: CRISPR’s mechanisms are not thoroughly understood, and mistakes in gene editing could have drastic consequences.11 In addition, the super T cells could attack in an autoimmune reaction, leading to degradation of critical organs. In an attempt to prevent such a response, Lu’s team will extract the T cells from the tumor itself, as those cells have likely already specialized in attacking cancer cells. To ensure patient safety, the team will examine the effects of three different dosage regimens on ten patients, watching closely for side effects and responsiveness.

Trials involving such cutting-edge technology raise many questions. With its ease of use and accessibility, CRISPR has the potential to become a tool worthy of science-fiction horror. Several ethics groups have raised concerns over inappropriate use of CRISPR: they worry that the technology could be wielded by amateurs and thus yield dangerous results. In spite of these concerns, China greenlit direct editing of human embryos, creating international uproar and prompting a moratorium on further human embryo testing.13 Yet such editing could lead to new breakthroughs: CRISPR could reveal how genes regulate early embryonic development, leading to a better understanding of infertility and miscarriages.14

This double-edged nature defines CRISPR-Cas9’s increasing relevance at the helm of bioengineering. The momentum behind CRISPR and its seemingly endless applications continue to broach long-unanswered questions in biology, disease treatment, and genetic engineering. Still, with that momentum comes caution: with CRISPR’s discoveries come increasingly blurred ethical distinctions.

References

  1. Union of Concerned Scientists. http://www.ucsusa.org/food_and_agriculture/our-failing-food-system/genetic-engineering/risks-of-genetic-engineering.html#.WBvf3DKZPox (accessed Sept. 30, 2016).
  2. Doudna, J. Nature [Online] 2015, 7583, 469-471. http://www.nature.com/news/genome-editing-revolution-my-whirlwind-year-with-crispr-1.19063 (accessed Oct. 5, 2016).
  3. Mariscal, C.; Petropanagos, A. Monash Bioeth. Rev. [Online] 2016, 102, 1-16. http://link.springer.com/article/10.1007%2Fs40592-016-0062-2 (accessed Oct. 5, 2016).
  4. Broad Institute. https://www.broadinstitute.org/what-broad/areas-focus/project-spotlight/questions-and-answers-about-crispr (accessed Sept. 30, 2016).
  5. Pak, E. CRISPR: A game-changing genetic engineering technique. Harvard Medical School, July 31, 2015. http://sitn.hms.harvard.edu/flash/2014/crispr-a-game-changing-genetic-engineering-technique/ (accessed Sept. 30, 2016)
  6. Hendel, A. et al. Nature Biotech. 2015, 33, 985-989.
  7. Kleinstiver, B. et al. Nature. 2016, 529, 490-495.
  8. Shapiro, B. Genome Biology. [Online] 2015, 228. N.p. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4632474/ (accessed Nov. 1, 2016).
  9. Reardon, S. Nature. [Online] 2016, 7593, 160-163. http://www.nature.com/news/welcome-to-the-crispr-zoo-1.19537 (accessed Nov. 1, 2016).
  10. Baylor College of Medicine. https://www.bcm.edu/research/advanced-technology-core-labs/lab-listing/mouse-embryonic-stem-cell-core/services/crispr-service-schedule (accessed Sept. 30, 2016).
  11. Ledford, H. Nature. [Online] 2015, 7554, 20-24. http://www.nature.com/news/crispr-the-disruptor-1.17673 (accessed Oct. 11, 2016).
  12. Cyranoski, D. Nature. [Online] 2016, 7613, 476-477. http://www.nature.com/news/chinese-scientists-to-pioneer-first-human-crispr-trial-1.20302 (accessed Sept. 30, 2016).
  13. Kolata, G. Chinese Scientists Edit Genes of Human Embryos, Raising Concerns. The New York Times, April 23, 2015. http://www.nytimes.com/2015/04/24/health/chinese-scientists-edit-genes-of-human-embryos-raising-concerns.html?_r=0 (accessed Oct. 2, 2016).
  14. Stein, R. Breaking Taboo, Swedish Scientist Seeks To Edit DNA of Healthy Human Embryos. NPR, Sept. 22, 2016. http://www.npr.org/sections/health-shots/2016/09/22/494591738/breaking-taboo-swedish-scientist-seeks-to-edit-dna-of-healthy-human-embryos (accessed Sept. 22, 2016).

Healthcare Reforms for the Mentally Ill

Neuropsychiatric illnesses are some of the most devastating conditions in the world. Despite being non-communicable, mental and neurological conditions are estimated to account for approximately 30.8% of all years lived with disability.1 Furthermore, in developed nations like the United States, mental disorders have been reported to erode around 2.5% of the yearly gross national product, a figure that fails to account for the opportunity cost borne by families who care for patients long-term.1 If left untreated, many patients with neuropsychiatric illnesses cannot find gainful employment; their aberrant behavior is stigmatized and blocks professional and personal advancement. In fact, about three times as many individuals living with mental illnesses are held in state and local prisons as in rehabilitative psychiatric institutions.2

Though the Affordable Care Act has substantially decreased the number of uninsured individuals in the U.S., there are still millions of people who fall into the Medicaid gap.3 People in this group make too much money to qualify for Medicaid, but too little to qualify for government tax credits when purchasing an insurance plan. In an attempt to fix this hole, the federal government offers aid to states to expand their Medicaid programs as needed.4 States that have accepted the federally sponsored Medicaid expansion have seen sharp reductions in their uninsured populations, which has directly improved quality of life for the least fortunate people in society. However, in the many states that continue to reject federal aid, the situation is considerably worse, especially for the mentally ill.

Mental health patients are especially vulnerable to falling into the Medicaid gap. Many patients suffering from psychiatric conditions are unable to find stable employment. According to a March 2016 report by the Department of Health and Human Services, there are 1.9 million low-income, uninsured individuals with mental health disorders who cannot access proper healthcare resources.5 These impoverished psychiatric patients are initially eligible for Medicaid. However, once their treatment takes hold and they become employed, they may pass the Medicaid income threshold. If their private health insurance does not cover the cost of their psychiatric treatments, patients may relapse, creating a vicious cycle that is exceptionally difficult to break out of.6

Furthermore, many psychiatric illnesses first present during adolescence or early adulthood, right around the time students leave home for college. During initial presentation, then, many students lack the support system necessary to deal with their condition, causing many to drop out of college or receive poor grades. Families back home often chalk these conditions up to poor adjustment to a brand-new college environment, preventing psychiatric patients from receiving proper treatment.6 On their own, many students with psychiatric conditions delay seeking treatment, fearing being labeled “crazy” or “insane” by their peers.

Under the status quo, psychiatric patients face significant barriers to care. Because the Medicaid gap is unfortunately subject to political maneuvering, it probably will not be closed soon. However, the United States could fund the expansion of Assertive Community Treatment programs, which provide medication, therapy, and social support in an outpatient setting.8 Such programs dramatically reduce hospitalization times for psychiatric patients, alleviating the costs of medical treatment. Funding these programs would keep insurance issues from deterring patients from treatment.

In the current system, psychiatric patients face numerous deterrents to receiving treatment, from lack of family support to significant social stigma. Having access to health insurance be a further barrier to care is a significant oversight of the current system and ought to be corrected.

References

  1. World Health Organization. The World Health Report 2001, Chapter 2: Burden of Mental and Behavioural Disorders. http://www.who.int/whr/2001/chapter2/en/index3.html (accessed March 20, 2016).
  2. Torrey, E. F.; Kennard, A. D.; Eslinger, D.; Lamb, R.; Pavle, J. More Mentally Ill Persons Are in Jails and Prisons Than Hospitals: A Survey of the States.
  3. Kaiser Family Foundation. Key Facts about the Uninsured Population. Aug. 5, 2015. http://kff.org/uninsured/fact-sheet/key-facts-about-the-uninsured-population/ (accessed March 25, 2016).
  4. Ross, J. Obamacare mandated better mental health-care coverage. It hasn't happened. The Washington Post, Oct. 7, 2015. https://www.washingtonpost.com/news/the-fix/wp/2015/10/07/obamacare-mandated-better-mental-health-care-coverage-it-hasnt-happened/ (accessed March 24, 2016).
  5. Dey, J.; Rosenoff, E.; West, K. Benefits of Medicaid Expansion for Behavioral Health. https://aspe.hhs.gov/sites/default/files/pdf/190506/BHMedicaidExpansion.pdf (accessed March 28, 2016).
  6. Taskiran, S. Personal interview with Rishi Suresh. Istanbul, March 3, 2016.
  7. Gonen, O. G. Personal interview with Rishi Suresh. Houston, April 1, 2016.
  8. Assertive Community Treatment. https://www.centerforebp.case.edu/practices/act (accessed Jan. 2017).

Modeling Climate Change: A Gift From the Pliocene

Believe it or not, we are still recovering from the most recent ice age, which occurred between 21,000 and 11,500 years ago. And yet, in the past 200 years, the Earth's average global temperature has risen by 0.8 ºC, a rate more than ten times faster than the average ice-age recovery rate.1 This increase in global temperature, which shows no signs of slowing down, will have tremendous consequences for our planet’s biodiversity and overall ecology.
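As a back-of-envelope check on that rate comparison: the deglacial figure below (roughly 4 ºC of recovery spread over about 10,000 years) is a typical textbook value, used here only to show the order of magnitude.

```python
# Compare the modern warming rate with an illustrative ice-age recovery rate.
modern_rate = 0.8 / 200      # deg C per year over the past 200 years (from the text)
recovery_rate = 4.0 / 10000  # deg C per year, illustrative deglacial value
print(round(modern_rate / recovery_rate))  # roughly 10x faster
```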

Climate change is caused by three main factors: changes in the position of the Earth’s continents, variations in the Earth’s orbital positions, and increases in the atmospheric concentration of “greenhouse gases”, such as carbon dioxide.2 In the past 200 years, the Earth’s continents have barely moved and its orbit around the sun has not changed.2 Therefore, to explain the 0.8 ºC increase in global average temperature that has occurred, the only reasonable conclusion is that there has been a change in the concentration of greenhouse gases.

Decades of research by the Intergovernmental Panel on Climate Change (IPCC) support this conclusion. The IPCC Fourth Assessment Report concluded that the increase in global average temperature is very likely due to the observed increase in anthropogenic greenhouse gas concentrations. The report also predicts that global temperatures will increase by between 1.1 ºC and 6.4 ºC by the end of the 21st century.2

Though we know what is causing the warming, we are unsure of its effects. Geologists and geophysicists at the US Geological Survey (USGS) are attempting to address this uncertainty through the Pliocene Research, Interpretation, and Synoptic Mapping (PRISM) program.3

The middle of the Pliocene Epoch occurred roughly 3 million years ago, a relatively short time on the geological time scale. Between the Pliocene and our current Holocene Epoch, the continents have barely drifted, the planet has maintained a near-identical orbit around the sun, and the types of organisms living on Earth have remained relatively constant.2 From these three commonalities, we can draw three conclusions. Because the continents have barely drifted, global heat distribution through oceanic circulation is the same. Because the planet’s orbit is essentially the same, glacial-interglacial cycles have not been altered. And because the types of organisms have remained relatively constant, the biodiversity of the Pliocene is comparable to our own.

While the two epochs share many similarities, the main difference between them is that the Pliocene was about 4 ºC warmer at the equator and 10 ºC warmer at the poles.4 Because the Pliocene had conditions similar to today’s but was warmer, it is likely that by the end of the century our planet’s ecology will begin to look like the Pliocene’s. This idea is supported by research done through the USGS’s PRISM program.3

It is a unique and exciting opportunity to be able to study a geological era so similar to our own and apply discoveries we make from that era to our current environment. PRISM is using multiple techniques to extract as much data about the Pliocene as possible. The concentration of magnesium ions, the number of carbon double bonds in organic structures called alkenones, and the concentration and distribution of fossilized pollen all provide a wealth of information that can be used to inform us about climate change. However, the single most useful source of such information comes from planktic foraminifera, or foram.5

Foram, abundant during the Pliocene era, are unicellular, ocean-dwelling organisms adorned with calcium shells. Fossilized foram are extracted from deep-sea core drilling. The type and concentration of the extracted foram reveal vital information about the temperature, salinity, and productivity of the oceans during the foram’s lifetime.5 By performing factor analysis and other statistical analyses on this information, PRISM has created a model of the Pliocene that covers both oceanic and terrestrial areas, providing a broad view of our planet as it existed 3 million years ago. Using the information provided by this model, scientists can determine where temperatures will increase the most and what impact such a temperature increase will have on life that can exist in those areas.
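The statistical step can be caricatured as a "transfer function": weight each species' known modern temperature preference by its abundance in the fossil assemblage. The species names are real foram taxa, but the preference values and counts below are invented, and PRISM's actual factor analysis is far more sophisticated.

```python
# Hypothetical modern calibration: each species' preferred
# sea-surface temperature in deg C (values invented for illustration).
MODERN_PREFERENCE = {
    "G. ruber": 26.0,      # warm-water species
    "G. bulloides": 12.0,  # temperate species
    "N. pachyderma": 4.0,  # polar species
}

def estimate_sst(assemblage):
    """Abundance-weighted mean of the species' temperature preferences."""
    total = sum(assemblage.values())
    return sum(MODERN_PREFERENCE[sp] * n for sp, n in assemblage.items()) / total

# A made-up fossil assemblage from one core sample.
fossil_counts = {"G. ruber": 60, "G. bulloides": 30, "N. pachyderma": 10}
print(estimate_sst(fossil_counts))  # 19.6
```

A warm-dominated assemblage pulls the estimate toward tropical temperatures; repeating this across many core sites is what lets PRISM map whole-ocean conditions.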

Since its inception in 1989, PRISM has predicted two main trends with demonstrated accuracy. The first is that average temperatures will increase most at the poles, with areas nearest the equator experiencing the smallest increase.5 The second is that tropical plants will expand outward from the equator, taking root in the middle and higher latitudes.5

There are some uncertainties associated with the research behind PRISM. Several assumptions were made, such as uniformitarianism, the idea that the same natural laws and physical processes that operate now also operated in the past. The researchers also assumed that the ecological tolerances of certain key species, such as foram, have not significantly changed in the last 3 million years. Even with these normalizing assumptions, an important discrepancy exists between the Pliocene and our Holocene: the Pliocene reached its temperature gradually and remained relatively stable throughout, while our temperatures are increasing at a much more rapid rate.

The film industry has fetishized climate change, predicting giant hurricanes and an instant ice age, as seen in the films 2012 and The Day After Tomorrow. Fortunately, nothing as cataclysmic will occur. However, a rise in global average temperature and a change in our ecosystems is nothing to be ignored or dismissed as normal. It is only through the research done by the USGS via PRISM and similar systems that our species can be prepared for the coming decades of change.

References

  1. Earth Observatory. http://earthobservatory.nasa.gov/Features/GlobalWarming/page3.php (accessed Oct. 1, 2016).
  2. Pachauri, R.K., et. al. IPCC 4th Assessment 2007, 104.
  3. PRISM4D Collaborating Institutions. Pliocene Research, Interpretation and Synoptic Mapping. http://geology.er.usgs.gov/egpsc/prism/ (accessed Oct. 3, 2016).
  4. Monroe, R. What Does 400 PPM Look Like? https://scripps.ucsd.edu/programs/keelingcurve/2013/12/03/what-does-400-ppm-look-like/ (accessed Oct. 19, 2016).
  5. Robinson, M. M. Am. Sci. 2011, 99, 228.

Telomeres: Ways to Prolong Life

Two hundred years ago, the average life expectancy oscillated between 30 and 40 years, as it had for centuries before. Medical knowledge was largely limited to superstition and folk cures, and the science behind what actually caused disease and death was lacking. Since then, the average human lifespan has skyrocketed thanks to scientific advancements in health care, such as an understanding of bacteria and infection. Today, new discoveries in cellular biology could, in theory, lead us to the next revolutionary leap in lifespan. Most promising among these recent discoveries are the manipulation of telomeres to slow the aging process and the use of telomerase to identify cancerous cells.

Before understanding how telomeres can be used to increase the average human lifespan, it is essential to understand what a telomere is. When cells divide, their DNA must be copied so that all of the cells share an identical DNA sequence. However, the DNA cannot be copied all the way to the end of the strand, so some DNA at the end of the sequence is lost with every replication.1 To prevent valuable genetic code from being cut off during cell division, our chromosomes end in telomeres, repetitive non-coding stretches of nucleotides that can be trimmed without consequence to the meaningful part of the DNA. Repeated cell replication causes these protective telomeres to become shorter and shorter, until valuable genetic code is eventually cut off, causing the cell to malfunction and ultimately die.1 The enzyme telomerase rebuilds these constantly degrading telomeres, but its activity in normal cells is low compared to that in cancer cells.2
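The countdown described above is easy to simulate. The base-pair numbers below are round illustrative values, not measured ones.

```python
def divisions_until_senescence(telomere_bp, loss_per_division, rebuilt_per_division=0):
    """Count divisions until the telomere runs out (None if telomerase keeps pace)."""
    if rebuilt_per_division >= loss_per_division:
        return None  # the cap is fully maintained: no replicative limit
    divisions = 0
    while telomere_bp > 0:
        telomere_bp -= loss_per_division - rebuilt_per_division
        divisions += 1
    return divisions

print(divisions_until_senescence(10000, 100))       # normal cell: 100 divisions
print(divisions_until_senescence(10000, 100, 50))   # some telomerase: 200
print(divisions_until_senescence(10000, 100, 100))  # cap maintained: None
```

The third case is the cancer-like regime discussed later in the article: when rebuilding keeps pace with loss, the replicative clock never runs out.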

The applications of telomerase manipulation emerged fairly recently, following the discovery of the function of both telomeres and telomerase in the mid-1980s by Nobel Prize winners Elizabeth Blackburn, Carol Greider, and Jack Szostak.3 Blackburn discovered a sequence at the end of chromosomes that was repeated several times, but could not determine its purpose. At the same time, Szostak was observing the degradation of minichromosomes, chromatin-like structures that replicate during cell division when introduced into a yeast cell. The two combined their work by isolating Blackburn’s repeating DNA sequences, attaching them to Szostak’s minichromosomes, and placing the minichromosomes back inside yeast cells. With the new addition to their DNA sequence, the minichromosomes did not degrade as they had before, proving that the purpose of the repeating DNA sequence, dubbed the telomere, was to protect the chromosome and delay cellular aging.

Because of the relationship between telomeres and cellular aging, many scientists theorize that cell longevity could be enhanced by finding a way to control telomere degradation and keep protective caps on the end of cell DNA indefinitely.1 Were this to be accomplished, the cells would be able to divide an infinite number of times before they started to lose valuable genetic code, which would theoretically extend the life of the organism as a whole.

In addition, studies of telomeres have revealed new ways of combatting cancer. Although there are many subtypes of cancer, all involve the uncontrollable, rapid division of cells. Despite this rapid division, the telomeres of cancer cells do not shorten like those of normal cells; if they did, such sustained division would be impossible. Cancer cells are likely able to maintain their telomeres because of their elevated levels of telomerase.3 This knowledge allows scientists to use telomerase levels as an indicator of cancerous cells, and then to target those cells. Vaccines that target telomerase production have the potential to be the newest weapon in combatting cancer.2 Cancerous cells continue to proliferate at an uncontrollable rate even when telomerase production is interrupted, but without telomerase to protect their telomeres from degradation, these cells eventually die.

As the scientific community advances its ability to control telomeres, it comes closer to controlling the process of cellular reproduction, one of the many factors associated with human aging and cancerous cells. With knowledge in these areas continuing to develop, the possibility of completely eradicating cancer and slowing the aging process is becoming more and more realistic.

References

  1. Genetic Science Learning Center. Learn.Genetics. http://learn.genetics.utah.edu (accessed Oct. 5, 2016).
  2. Shay, J. W.; Wright, W. E. NRD. [Online] 2006, 5. http://www.nature.com/nrd/journal/v5/n7/full/nrd2081.html (accessed Oct. 16, 2016).
  3. The 2009 Nobel Prize in Physiology or Medicine - Press Release. The Nobel Prize. https://www.nobelprize.org/nobel_prizes/medicine/laureates/2009/press.html (accessed Oct. 4, 2016).

Astrocytes: Shining the Spotlight on the Brain’s Rising Star

We have within us the most complex and inspiring stage to ever be set: the human brain. The cellular components of the brain act as players, interacting through chemical and electrical signaling to elicit emotions and convey information. Although most of our attention has in the past been focused on neurons, which were erroneously presumed to act alone in their leading role, scientists are slowly realizing that astrocytes—glial cells in the brain that were previously assumed to only have a supportive role in association with neurons—are so much more than merely supporting characters.

Though neurons are the stars, most of the brain is actually composed of supportive cells such as microglia, oligodendrocytes, and, most notably, astrocytes. Astrocytes, whose formal name is a misnomer (modern imaging reveals a branch-like rather than star-like shape), mature into one of three types found in the grey matter, white matter, or retina. Structurally, the grey matter variant exhibits bushy, root-like tendrils and a spherical shape; the white matter variant, commonly found in the hippocampus, favors finer extensions called processes; and the retinal variant features an elongated structure.¹

Functionally, astrocytes were long believed to play a solely supportive role, as they constitute a large percentage of the glial cells present in the brain. Glial cells are essentially all of the non-neuronal cells in the brain that assist in basic functioning; they themselves are not electrically excitable. However, current research suggests that astrocytes play far more than a supporting role. Astrocytes and neurons directly interact to interpret stimuli and store memories,⁴ among many other tasks still being discovered.

Although astrocytes are not electrically excitable, they communicate with neurons via calcium signaling and the neurotransmitter glutamate.² Upon excitation, intracellular calcium stores in astrocytes are released and propagate in waves through neighboring astrocytes and neurons. A neuron experiences a responsive increase in intracellular calcium only if it is directly touching an affected astrocyte, as the signal is communicated via gap junctions rather than synapses. Such signaling is unidirectional: calcium excitation can move from astrocyte to neuron, but not from neuron to astrocyte.³ The orientation of astrocytes in different regions of the brain and their proximity to neurons allow them to form close communication networks that help information travel throughout the central nervous system.
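These propagation rules can be sketched as a graph traversal over a hypothetical toy network (the cell names A1–A3, N1, N2 are invented for illustration):

```python
# Minimal sketch of the calcium-signaling rules described above.
# Astrocyte-astrocyte links (gap junctions) propagate the wave onward;
# astrocyte -> neuron contacts are one-way dead ends: a neuron receives
# the calcium increase but never passes it back or onward.
from collections import deque

def calcium_wave(start, astro_links, astro_neuron_contacts):
    """Return the set of cells reached by a wave starting at one astrocyte."""
    excited = {start}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        # the wave spreads astrocyte-to-astrocyte...
        for nbr in astro_links.get(cell, []):
            if nbr not in excited:
                excited.add(nbr)
                queue.append(nbr)
        # ...and into directly touching neurons (not re-queued: unidirectional)
        excited.update(astro_neuron_contacts.get(cell, []))
    return excited

astro_links = {"A1": ["A2"], "A2": ["A1", "A3"], "A3": ["A2"]}
contacts = {"A2": ["N1"], "A3": ["N2"]}
print(sorted(calcium_wave("A1", astro_links, contacts)))
# ['A1', 'A2', 'A3', 'N1', 'N2']
```

Note that neurons are added to the excited set but never enqueued, which encodes the one-way astrocyte-to-neuron rule from the text.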

Astrocytes in the hippocampus play a role in memory development. They act as intermediary cells in a neural inhibitory circuit that utilizes acetylcholine, glutamate, and gamma-aminobutyric acid (GABA) to solidify experiential learning and memory formation. Disruption of cholinergic signaling (signaling involving acetylcholine) prevents the formation of memories in the dentate gyrus of the hippocampal formation. Astrocytes act as mediators that convert cholinergic inputs into glutamatergic activation of neurons.⁴ Without the assistance of astrocytic networks in close association with neurons, memory formation and long-term potentiation would be far less efficient, if possible at all.

Astrocytes’ ability to interpret and release chemical neurotransmitters, especially glutamate, allows them to regulate the intensity of synaptic firing in neurons.⁵ Increased glutamate uptake by astrocytes reduces synaptic strength in associated neurons by decreasing the neuronal concentration of glutamate.⁶ Regulation of synaptic firing strength is crucial for healthy brain function: if synapses fire too often or too powerfully, they may overwhelm the brain; conversely, if synapses fire too infrequently or too weakly, messages might not make their way through the central nervous system. The ability of astrocytes to modulate synaptic activity through selective glutamate interactions puts them in an integral position to assist in consistent and efficient transmission of information throughout the human body.

Through regulation of neurotransmitters and psychoactive chemicals in the brain, astrocytes help maintain homeostasis in the central nervous system. Potassium buffering and pH balancing are the major ways astrocytes maintain optimal conditions for brain function.⁷ Astrocytes compensate for the slow re-uptake of potassium by neurons, clearing the extracellular space of free potassium in response to neuronal activity. Re-uptake of these ions is extremely important to brain function, as synaptic transmission relies on rapidly changing membrane potentials along neuronal axons.

Due to their role in synaptic regulation and their critical position in the brain network, astrocytes also have the potential to aid in therapies for neurological disorders. For example, epileptic seizures have been found to be related to an excitatory loop between neurons and astrocytes. Focal ictal discharges, the brain activity responsible for epileptic seizures, correlate with hyperactivity in neurons as well as an increase in intracellular calcium in nearby astrocytes; the calcium oscillations then spread to neighboring astrocyte networks, perpetuating the ictal discharge and continuing the seizure. Astrocytes in epileptic brain tissues exhibit structural changes that may favor such a positive feedback loop. Inhibition of calcium uptake in astrocytes, and the consequent decrease in release of glutamate and ATP, is linked to suppression of ictal discharges, and therefore to a decrease in the severity and occurrence of epileptic seizures.⁸

Furthermore, astrocyte activity also plays a role in the memory loss associated with Alzheimer’s disease. Although astrocytes in the hippocampus contain low levels of the neurotransmitter GABA under normal conditions, hyperactive astrocytes near amyloid plaques in affected individuals exhibit increased levels of GABA that are not evident in other types of glial cells. GABA is the main inhibitory neurotransmitter in the brain, and abnormal increases in GABA are associated with Alzheimer’s disease; introducing antagonist molecules has been shown to reduce memory impairment, but at the cost of inducing seizures.⁹ Since there is a shift in GABA release by astrocytes between healthy and diseased individuals, astrocytes could serve as the key to remedying neurodegenerative conditions like Alzheimer’s.

In addition to aiding in treatment of neurological disorders, astrocytes may also help stroke victims. Astrocytes can support damaged neurons by donating their mitochondria to them.¹⁰ Mitochondria produce adenosine triphosphate (ATP) and act as the energy powerhouse of eukaryotic cells; active cells like neurons cannot survive without them. Usually, neurons accommodate their exceptionally large energy needs by multiplying their intracellular mitochondria via fission. However, when neurons undergo stress or damage, as in the case of stroke, they are left without their source of energy. New research suggests that astrocytes come to the rescue by releasing their own mitochondria into the extracellular environment in response to high levels of the enzyme CD38, so that damaged neurons can absorb the free mitochondria and survive.¹¹ Astrocytes also help restore neuronal mitochondria and ATP production post-insult by means of lactate shuttles, in which astrocytes generate lactate through anaerobic respiration and then pass it to neurons, where it serves as a substrate for oxidative metabolism.¹² Such a partnership between astrocytes and neurons presents researchers with the option of using astrocyte-targeted therapies to salvage neuronal systems in stroke victims and others afflicted by ailments associated with mitochondrial deficiencies in the brain.

Essentially, astrocytes are far more than the background supporters they were once thought to be. Before modern technological developments, their capabilities and potential went largely unnoticed. Astrocytes interact both directly and indirectly with neurons through chemical signaling to create memories, interpret stimuli, regulate signaling, and maintain a healthy central nervous system. A greater understanding of the critical role astrocytes play in the human brain could allow scientists to develop astrocyte-targeted therapies. As astrocytes slowly inch their way into the spotlight of neuroscientific research, there is still much to be discovered.

References

  1. Kimelberg, H. K.; Nedergaard, M. Neurotherapeutics 2010, 7, 338-353.
  2. Schummers, J. et al. Science 2008, 320, 1638-1643.
  3. Nedergaard, M. Science 1994, 263, 1768+.
  4. Ferrarelli, L. K. Sci. Signal. 2016, 9, ec126.
  5. Gittis, A. H.; Brasier, D. J. Science 2015, 349, 690-691.
  6. Pannasch, U. et al. Nat. Neurosci. 2014, 17, 549+.
  7. Kimelberg, H. K.; Nedergaard, M. Neurotherapeutics 2010, 7, 338-353.
  8. Gomez-Gonzalo, M. et al. PLoS Biol. 2010, 8.
  9. Jo, S. et al. Nat. Med. 2014, 20, 886+.
  10. VanHook, A. M. Sci. Signal. 2016, 9, ec174.
  11. Hayakawa, K. et al. Nature 2016, 535, 551-555.
  12. Genc, S. et al. BMC Syst. Biol. 2011, 5, 162.

Surviving Without the Sixth Sense

Though references to the “sixth sense” often bring images of paranormal phenomena to mind, the scientific world has bestowed this title upon our innate awareness of our own bodies in space. Proprioception, the official name of this sense, is what allows us to play sports and navigate in the dark. Like our other five senses, our capability for spatial awareness has become so automatic that we hardly ever think about it. But scientists at the National Institutes of Health (NIH) have made breakthroughs in understanding a genetic disorder that deprives people of this sense, leading to skeletal abnormalities, balance difficulties, and even the inability to discern some forms of touch.1

The gene PIEZO2 has been associated with the body’s ability to sense touch and coordinate physical actions and movement. While there is not a substantial amount of research about this gene, previous studies on mice show that it is instrumental in proprioception.2 Furthermore, NIH researchers have recently attributed a specific phenotype to a mutation in PIEZO2, opening a potential avenue to unlock its secrets.

Pediatric neurologist Carsten G. Bönnemann, a senior investigator at the NIH National Institute of Neurological Disorders and Stroke, had been studying two patients with remarkably similar cases when he met Alexander Chesler at a lecture. Chesler, an investigator at the NIH National Center for Complementary and Integrative Health, joined Bönnemann in performing a series of genetic and practical tests to investigate the disorder.1

The subjects examined were an 8-year-old girl and an 18-year-old woman from different backgrounds and geographical areas. Even though these patients were not related, they both exhibited a set of similar and highly uncommon phenotypes. For example, each presented with scoliosis (unusual sideways spinal curvature) accompanied by fingers, feet, and hips that could bend at atypical angles. In addition to these physical symptoms, the patients experienced difficulty walking, a substantial lack of coordination, and unusual responses to physical touch.1 These symptoms are the result of PIEZO2 mutations that block the gene’s normal activity or production. Using full genome sequencing, researchers found that both patients have at least one recessively-inherited nonsense variant in the coding region of PIEZO2.1

Because these patients represent the first well-documented cases of specific proprioceptive disorders, there is not an abundance of research about the gene itself. Available studies convey that PIEZO2 encodes a mechanosensitive protein - that is, one that generates electrical nerve signals in response to detected changes in factors such as cell shape.2 This function underlies many of our physical capabilities, including spatial awareness, balance, hearing, and touch. In fact, PIEZO2 has been found to be expressed in neurons that control mechanosensory responses, such as perception of light touch, in mice. Past studies found that removing the gene in mouse models caused severe limb defects.2 Since this gene is highly homologous between humans and mice (the two versions are 95% similar), many researchers assumed that humans could not live without it either. According to Bönnemann and Chesler, however, it is clear that this PIEZO2 mutation does not cause a similar fate in humans.
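A nonsense variant is a mutation that introduces a premature stop codon, truncating the protein. The idea can be illustrated with a toy scan; the sequences below are invented, and only the standard stop codons (TAA, TAG, TGA on the DNA coding strand) are real:

```python
# Illustrative only: scan a coding sequence for a premature stop codon, the
# hallmark of a nonsense variant like those found in the patients' PIEZO2
# genes. The sequences are made up for demonstration.
STOP_CODONS = {"TAA", "TAG", "TGA"}

def first_premature_stop(coding_seq):
    """Return the codon index of a stop before the final codon, else None."""
    codons = [coding_seq[i:i + 3] for i in range(0, len(coding_seq) - 2, 3)]
    for i, codon in enumerate(codons[:-1]):   # exclude the normal final stop
        if codon in STOP_CODONS:
            return i
    return None

wild_type = "ATGGCTGGAACCGGTTAA"        # ends with the normal stop codon
mutant    = "ATGGCTTGAACCGGTTAA"        # a G->T change creates an early TGA
print(first_premature_stop(wild_type))  # None
print(first_premature_stop(mutant))     # 2
```

A variant caller working on real sequencing data is far more involved, but the classification logic (does the change create an in-frame stop before the natural one?) is essentially this check.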

Along with laboratory work, Bönnemann and Chesler employed techniques to investigate the tangible effects of the mutations. Using a control group for comparison, the researchers presented the patients with a set of tests that examined their movement and sensory abilities. The results were startling, to say the least. The patients revealed an almost total lack of proprioception when blindfolded: they stumbled and fell while walking and could not determine which way their joints were moving without looking. In addition, both failed to successfully move a finger from their noses to a target. The absence of certain sensory abilities was also astonishing. Neither patient could feel the vibrations of a tuning fork pressed against their skin, differentiate between the ends of a caliper pressed against their palms, or sense a soft brush across their palms and the bottoms of their feet. Furthermore, when this same soft brush was swept across hairy skin, both patients claimed that the sensation was prickly. This particular result revealed that the subjects were largely missing brain activation in the region linked to physical sensation, yet they appeared to have an emotional response to the brushing across hairy skin; these brain patterns directly contrasted with those of the control group. Additional tests revealed that the patients’ detection of pain, itch, and temperature was normal compared to the control group, and that they possessed nervous system capabilities and cognitive functions appropriate for their ages.1

Because the patients are still able to function in daily life, it is apparent that the nervous system has alternate pathways that allow them to use sight to largely compensate for their lack of proprioception.3,4 Through further research, scientists could tap into these alternate pathways when designing therapies for similar patients. Additionally, the common physical features of both patients suggest that PIEZO2 mutations could underlie the observed musculoskeletal disorders.3 This implies that proprioception itself is necessary for normal musculoskeletal development; it is possible that the abnormalities developed over time as a result of the patients’ postural responses and compensations for their deficiencies.4

In an era when our lives depend so heavily on our abilities to maneuver our bodies and coordinate movements, the idea of lacking proprioception is especially concerning. Bönnemann and Chesler’s discoveries open new doors for further investigation of PIEZO2’s role in the nervous system and musculoskeletal development, and they can also aid in better understanding a variety of other neurological disorders. But there is still much unknown about the full effects of PIEZO2 mutations. For example, we do not know whether the musculoskeletal abnormalities injure the spinal cord, whether the mutation poses additional consequences for the elderly, or whether women are more susceptible to the disorder than men. Furthermore, it is very likely that there are numerous other patients around the world who present symptoms similar to those of the 8-year-old girl and 18-year-old woman observed by Bönnemann and Chesler. While researchers work towards a better understanding of the disease and develop specific therapies, these patients must rely on other coping mechanisms, such as vision, to accomplish even the most basic daily activities. After all, contrary to popular perception, the sixth sense is not the ability to see ghosts; it is so much more.

References

  1. Chesler, A. T. et al. N. Engl. J. Med. 2016, 375, 1355-1364.
  2. Woo, S. et al. Nat. Neurosci. 2015, 18, 1756-1762.
  3. “‘Sixth sense’ may be more than just a feeling”. National Institutes of Health. https://www.nih.gov/news-events/news-releases/Sixth-sense-may-be-more-just-feeling (accessed Sept. 22, 2016).
  4. Price, Michael. “Researchers discover gene behind ‘sixth sense’ in humans”. Science Magazine. http://www.sciencemag.org/news/2016/09/researchers-discover-gene-behind-sixth-sense-humans (accessed Sept. 22, 2016).

The Health of Healthcare Providers

A car crash. A heart attack. A drug overdose. No matter what time of day, where you are, or what your problem is, emergency medical technicians (EMTs) will be on call and ready to come to your aid. These health care providers are charged with providing quality care to maintain or improve patient health in the field, and their efforts have saved the lives of many who could not otherwise find care on their own. While these EMTs deserve praise and respect for their line of work, what they deserve even more is consideration for the health issues that they themselves face. Emergency medical technicians suffer from a host of long-term health issues, including weight gain, burnout, and psychological changes.

The daily "schedule" of an EMT is probably best characterized by its variability and unpredictability. The entirety of their day is a summation of what everyone in their area is doing, those people's health issues, and the uncertainty of life itself. While there are start and end times to their shifts, even these are not hard and fast--shifts can start early or end late based on when people call 911. An EMT can spend an entire shift on the ambulance, without time to eat a proper meal or get any sleep; these healthcare providers learn to catch a few minutes of sleep here and there when possible. Their yearly schedules are also unpredictable, with lottery systems in place to ensure that someone is working every day, at all hours, while maintaining some fairness. Most services have either 12- or 24-hour shifts, and this lottery system can leave EMTs with shifts that are back to back or at least in close proximity to one another. This only increases the likelihood of sleep disorders, with 70 percent of EMTs reporting at least one sleep problem.1 While many people have experienced the effects of exhaustion and burnout due to a lack of sleep, few can say that their entire professional career has been characterized by these feelings. EMTs have been shown to be more than twice as likely as control groups to have moderate to high scores on the Epworth Sleepiness Scale (ESS), which is correlated with a greater likelihood of falling asleep during daily activities such as conversing, sitting in public places, and driving.1 The restriction and outright deprivation of sleep in EMTs has been shown to cause a large variety of health problems, and seems to be the main factor in the decline of both physical and mental health for EMTs.
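For context, the ESS is a short questionnaire: eight dozing-likelihood items, each self-rated 0-3 and summed to a 0-24 total. A minimal sketch of the scoring follows; the category cutoffs are common conventions, not thresholds taken from the cited study:

```python
# Sketch of Epworth Sleepiness Scale (ESS) scoring. The 8-item, 0-3-per-item
# structure and the 0-24 total are standard; the category labels and cutoffs
# below are common conventions, assumed here for illustration.
def ess_total(item_scores):
    """Sum 8 self-ratings (0 = would never doze ... 3 = high chance of dozing)."""
    if len(item_scores) != 8 or any(s not in (0, 1, 2, 3) for s in item_scores):
        raise ValueError("ESS requires exactly 8 items scored 0-3")
    return sum(item_scores)

def ess_category(total):
    if total >= 16:
        return "severe excessive daytime sleepiness"
    if total >= 10:
        return "excessive daytime sleepiness"
    return "normal range"

scores = [2, 1, 3, 1, 2, 0, 2, 1]  # hypothetical self-report
print(ess_total(scores), "->", ess_category(ess_total(scores)))
# 12 -> excessive daytime sleepiness
```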

A regular amount of sleep is essential to maintaining a healthy body. Reduced sleep has been associated with weight gain, cardiovascular disease, and weakened immune function. Studies have shown that, at least in men, short sleep durations are linked to weight gain and obesity, potentially due to alterations in the hormones that regulate appetite.2,3 Given this trend, it is no surprise that a 2009 study found that sleep durations deviating from an ideal 7-8 hours, as well as frequent insomnia, increased the risk of cardiovascular disease.

The fact that EMTs often have poor diets compounds that risk. An EMT needs to be ready to respond around the clock, which means there really isn’t any time to sit down and have a proper meal. Fast food becomes the meal of choice due to its convenience, both in availability and speed. Some hospitals have attempted to improve upon this shortcoming in the emergency medical service (EMS) world by providing snacks and drinks at the hospital. This, however, creates a different issue due to the high-calorie nature of these snacks. The body generally senses that it is full by detecting stretch in the stomach, which signals the brain that enough food has been consumed. In a balanced diet, much of this space should be filled with fruits, vegetables, and other low-calorie items, unless you are an athlete who uses far more energy. By eating smaller, high-calorie items, an EMT needs to eat more in order to feel full, exceeding their recommended daily calories; the extra energy often gets stored as fat, compounding the weight gain due to sleep deprivation. Studies of the effects of restricted sleep on the immune system are less common, but one experiment demonstrated elevated markers of systemic inflammation, which could, again, lead to cardiovascular disease and obesity.2
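The satiety arithmetic can be made concrete with a back-of-the-envelope calculation; the fill mass and energy densities below are rough assumptions for illustration, not nutrition data:

```python
# Back-of-the-envelope version of the satiety argument above: feeling "full"
# is modeled as a fixed mass of food stretching the stomach, so total
# calories scale with energy density. All numbers are assumptions.
STOMACH_FILL_G = 800  # assumed grams of food needed to feel full

def calories_to_feel_full(kcal_per_gram):
    """Calories consumed by the time stomach stretch signals 'full'."""
    return STOMACH_FILL_G * kcal_per_gram

snacks = calories_to_feel_full(4.5)   # assumed density of packaged snacks
produce = calories_to_feel_full(0.5)  # assumed density of fruit/vegetables
print(snacks, produce)                # 3600.0 400.0
```

Under these (illustrative) numbers, filling up on calorie-dense snacks delivers several times a reasonable daily calorie budget, which is the mechanism behind the weight-gain claim.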

Mental health is not spared from complications due to long waking periods with minimal sleep. A study testing the cognitive abilities of subjects under varying amounts of sleep restriction found that less sleep led to cognitive deficits, and that being awake for more than 16 hours led to deficits regardless of how much sleep the subject had gotten.4 This finding affects both the EMTs, who can injure themselves, and the patients, who may suffer from errors made in the field. First-year physicians, who can similarly work shifts of over 24 hours, face an increased risk of automobile crashes and percutaneous (skin) injuries when sleep deprived.5 These injuries often happen when leaving a shift: a typical EMT shift lasts from one morning to the next, and the EMT will leave during rush hour on little to no sleep, increasing the danger of falling asleep or dozing at the wheel. A similar study examined extended-duration work in critical-care units and found that long shifts increased the risk of medical errors and lapses in attention.6 In addition to the more direct mental health problems posed by the continuous strain, EMTs and others in the healthcare field also face more personal issues, including burnout and changes in behavior.

A study on pediatric residents, who face similar stress and workloads, established that 20% of participants were suffering from depression and 75% met the criteria for burnout, both of which led to medical errors at work.7 A separate study found that emergency physicians suffering from burnout also faced high emotional exhaustion, depersonalization, and a low sense of accomplishment.8 While many go into the healthcare field to help others, exhaustion and desensitization create a sort of cynicism that defends against the enormous emotional burden of treating patients day in and day out.

Sleep deprivation, long work duration, and the stress that comes with the job contribute to a poor environment for the physical and mental health of emergency medical technicians and other healthcare providers. However, a recent study has shown that downtime, especially after dealing with critical patients, led to lower rates of depression and acute stress in EMTs.9 While this does not necessarily ameliorate post-traumatic stress or burnout, it is a start to addressing the situation. Other possible interventions would include providing more balanced meals at hospitals that are readily available to EMTs, as well as an improved scheduling system that prevents or limits back to back shifts. These concepts can apply to others facing high workloads with abnormal sleeping schedules as well, including college students, who are also at risk for mood disorders and a poorer quality of life due to the rigors of college life.10

References

  1. Pirrallo, R. G. et al. Sleep Breath. 2012, 16, 149-162.
  2. Banks, S. et al. J. Clin. Sleep Med. 2007, 3(5), 519-528.
  3. Watanabe, M. et al. Sleep  2010, 33(2), 161-167.
  4. Van Dongen, H. P. et al. Sleep 2003, 26(2), 117-126.
  5. Ayas, N. T. et al. JAMA 2006, 296(9), 1055-1062.
  6. Barger, L. K. et al. PLoS Med. [Online] 2006, 3(12), e487. https://dx.doi.org/10.1371%2Fjournal.pmed.0030487 (accessed Oct. 3, 2016)
  7. Fahrenkopf, A. M. et al. BMJ [Online] 2008, 336, 488. http://dx.doi.org/10.1136/bmj.39469.763218.BE (accessed Oct. 3, 2016)
  8. Ben-Itzhak, S. et al. Clin. Exp. Emerg. Med. 2015, 2(4), 217-225.
  9. Halpern, J. et al. Biomed. Res. Int. [Online] 2014, 2014. http://dx.doi.org/10.1155/2014/483140 (accessed Oct. 3, 2016)
  10. Singh, R. et al. J. Clin. Diagn. Res. [Online] 2016, 10(5), JC01-JC05. https://dx.doi.org/10.7860%2FJCDR%2F2016%2F19140.7878 (accessed Oct 3, 2016)

Zika and Fetal Viruses: Sharing More Than A Motherly Bond

Zika is a blood-borne pathogen primarily transmitted through mosquito bites and sexual activities. Pregnant women infected by Zika can pass the virus to their fetus, causing microcephaly, a condition in which the baby has an abnormally small head indicative of abnormal brain development. With the outbreak of the Zika virus and its consequences for pregnant women and their babies, much research has focused on how the infection leads to microcephaly in fetuses.

Current Zika research has focused on uncovering methods for early detection of Zika in pregnant women and on educating the public about safe sexual practices, so that transmission is limited to the mosquito vector.1 However, to truly end the Zika epidemic, three critical steps must be taken. First, researchers must determine the point at which maternal infection harms the neurological development of the fetus, to ensure treatment is administered to mothers before the brain damage becomes irreversible. Second, researchers must determine the mechanism through which Zika spreads from mother to fetus. Only then can researchers develop therapies that protect the fetus once the mother is already infected, and begin creating a preventative vaccine. Although Zika seems like a mysterious new illness, several other well-studied viral infections affect pregnancies, such as cytomegalovirus (CMV), which also leads to severe fetal brain damage when contracted during pregnancy. Previous research techniques could provide clues for researchers trying to understand Zika, and learning more about Zika will better equip us to handle prenatal viral outbreaks in the future.

Currently, microcephaly in fetuses of Zika-infected mothers is detected by ultrasound as early as 18 weeks into the gestation period.2 However, this is a late diagnosis of fetal Zika infection, and by this point the brain abnormalities caused by the virus are irreversible. Ultrasounds and MRI scans of infants with confirmed CMV infection can detect similar neurological abnormalities.3 These brain lesions are likewise irreversible, making early detection a necessity for CMV infections as well. Fortunately, the presence of CMV or CMV DNA in amniotic fluid can be used for early diagnosis, and current treatment options include administration of valacyclovir or hyperimmunoglobulin in the window before the fetus develops brain lesions.4 Researchers must likewise try to identify fetal Zika infection as early as possible rather than relying on fetal microcephaly as the sole diagnostic tool. One potential early-detection method is testing for Zika in the urine of pregnant women as soon as symptoms appear, rather than screening the fetus for infection.5

Discovering the mechanism through which Zika infects the fetus is necessary to develop therapies that protect the fetus from infection. Many viruses that transfer to the fetus during pregnancy do so by compromising the immune function of the placental barrier, allowing the virus to cross the placenta and infect the fetus. The syncytiotrophoblast is the epithelial covering of the placental embryonic villi, highly vascular finger-like projections that increase the surface area available for the exchange of nutrients and wastes between mother and fetus.6 One study found that infection of extravillous trophoblast cells decreased the immune function of the placenta, which increased fetal susceptibility to infection.7 Determining which cells in the placenta are infected by Zika could aid research into preventative treatments for fetal infection.

Since viruses that cross the placental barrier are able to infect the fetus, understanding the interaction between immune cells and the placental barrier is important for developing therapies against Zika that increase fetal viral resistance. In one study, researchers found that primary human trophoblast cells use cell-derived vesicles called exosomes to transfer miRNAs that confer the placenta's broad viral resistance to other pregnancy-related cells.8 miRNAs regulate gene expression, and different miRNAs exist in different cells so that those cells have specific functions and defenses. Isolating these miRNA-carrying exosomes, using them to supplement placental cell strains, and then testing whether those cells are more or less susceptible to Zika could support the development of drugs that bolster the placenta's existing immune defenses. Since viral diseases that cross the placenta lead to poor fetal outcomes, developing protective measures for the placenta is imperative, not only against Zika but also against new viruses that lack vaccines.9

Combating new and elusive viral outbreaks is difficult, and understanding and preventing viral infection in fetuses can feel like taking a shot in the dark. Although the prospects for infants infected by Zika are currently poor, combining the research done on other congenital infections paints a more complete picture of viral transmission during pregnancy. Instead of starting from scratch, scientists can use this information to determine which tests can detect Zika, which organs to examine for compromised immune function, and which types of treatment have a higher probability of effectiveness. Zika will not be the last virus that causes birth defects, but by combining the efforts of many scientists, we can get closer to stopping fetal viral infection once and for all.

References

  1. Wong, K. V. J. Epidemiol. Public Health Rev. 2016, 1.
  2. Mlakar, J., et al. N. Engl. J. Med. 2016, 374, 951-958.
  3. Malinger, G., et al. Am. J. Neuroradiol. 2003, 24, 28-32.
  4. Leruez-Ville, M., et al. Am. J. Obstet. Gynecol. 2016, 215, 462.
  5. Gourinat, A. C., et al. Emerg. Infect. Dis. 2015, 21, 84-86.
  6. Delorme-Axford, E., et al. Proc. Natl. Acad. Sci. 2013, 110, 12048-12053.
  7. Zhang, J.; Parry, S. Ann. N. Y. Acad. Sci. 2001, 943, 148-156.
  8. Mouillet, J. F., et al. Int. J. Dev. Bio. 2014, 58, 281.
  9. Mor, G.; Cardenas I. Am. J. Reprod. Immunol. 2010, 63, 425-433.

Wearable Tech is the New Black

What if our clothes could detect cancer? That may seem like a far-fetched, “only applicable in a sci-fi universe” type of concept, but such clothes do exist, and similar devices that merge technology and medicine are actually quite prominent today. The wearable technology industry, a field poised to grow to $11.61 billion by 2020,1 is exploding in the healthcare market as numerous companies produce devices that help us in our day-to-day lives, such as wearable EKG monitors and epilepsy-detecting smartwatches. Advancements in sensor miniaturization and integration with medical devices have greatly expanded this interdisciplinary field by lowering costs. Wearable technology ranging from the Apple Watch to consumable body-monitoring pills can be used for everything from health and wellness monitoring to early detection of disorders. But as these technologies become ubiquitous, there are important privacy and interoperability concerns that must be addressed.

Wearable tech like the Garmin Vivosmart HR+ watch uses sensors to obtain insightful data about its wearer’s health. This bracelet-like device tracks steps walked, distance traveled, calories burned, pulse, and overall fitness trends over time.2 It transmits the information to an app on the user’s smartphone, which uses various algorithms to generate insights about the person’s daily activity. This data about a person’s daily athletic habits serves as a reminder that fitness is not limited to working out at the gym or playing a sport--it’s a way of life. Holding tangible evidence of one’s physical activity for the day or history of vital signs empowers patients to take control of their personal health. The direct feedback of these devices encourages patients to make better choices, such as taking the stairs instead of the elevator or setting up a doctor’s appointment early on if they see something abnormal in the data from their EKG sensor. Connecting hard evidence from the body to physical and emotional perceptions refines the reality of those experiences by reducing the subjectivity and oversimplification that feelings about personal well-being may bring about.
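The aggregation step described above can be sketched in a few lines. This is a toy illustration only, not Garmin's actual algorithms; the function name, threshold, and sample data are all hypothetical:

```python
# Toy sketch: turning raw wearable samples into the kind of daily
# summary a companion app might show. All names and thresholds here
# are hypothetical, not any vendor's real API.

def daily_summary(steps_per_hour, resting_hr_samples, step_goal=10000):
    """Aggregate one day of tracker samples into simple, app-style insights."""
    total_steps = sum(steps_per_hour)
    avg_resting_hr = sum(resting_hr_samples) / len(resting_hr_samples)
    return {
        "total_steps": total_steps,
        "goal_met": total_steps >= step_goal,
        "avg_resting_hr": round(avg_resting_hr, 1),
        # A sustained resting heart rate above ~100 bpm is one kind of
        # anomaly a wearer might bring to a doctor.
        "hr_flag": avg_resting_hr > 100,
    }

print(daily_summary([1000] * 12 + [0] * 12, [62, 64, 61, 66, 63]))
# → {'total_steps': 12000, 'goal_met': True, 'avg_resting_hr': 63.2, 'hr_flag': False}
```

The value of such a summary is less in the arithmetic than in the feedback loop: the wearer sees a concrete number rather than a vague feeling about how active their day was.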

Not only can wearable technology gather information from the body, but these devices can also detect and monitor diseases. Diabetes, the 7th leading cause of death in the United States,3 can be monitored via Accu-Chek, a technology that can send an analysis of blood sugar levels directly to your phone.4 Analysis software like BodyTel can also connect patients with doctors and family members who are interested in the data gathered from the blood test.5 Ingestible devices such as the Ingestion Event Marker take monitoring a step further. Designed to monitor medication intake, the pills keep track of when and how frequently patients take their medication. The Freescale KL02 chip, another ingestible device, monitors specific organs in the body and relays each organ’s status back to a Wi-Fi enabled device, which doctors can use to remotely measure the progression of an illness. They can assess the effectiveness of a treatment with quantitative evidence, which makes decision-making about future treatment plans more effective.

Many skeptics hesitate to adopt wearable technology because of valid concerns about accuracy and privacy. To make sure medical devices are kept to the same standards and are safe for patient use, the US Food and Drug Administration (FDA) has begun to implement a device approval process. Approval is only granted to devices that provably improve the functionality of traditional medical devices and do not pose a great risk to patients if they malfunction.6 In spite of the FDA approval process, much research is needed to determine whether the information, analysis, and insights received from various wearable technologies can be trusted.

Privacy is another major issue, especially for devices like fitness trackers that use GPS location to monitor user behavior. Many questions about data ownership (does the company or the patient own the data?) and data security (how safe is my data from hackers, the government, or insurance companies?) remain in a fuzzy gray area with no clear answers.7 Wearable technology connected to online social media sites, where one’s location may be unknowingly tied to his or her posts, can increase the chance for people to become victims of stalking or theft. Lastly, another key issue that makes medical practitioners hesitant to use wearable technology is the lack of interoperability, or the ability to exchange data, between devices. Data structured one way on a certain wearable device may not be accessible on another machine. Incorrect information might be exchanged, or data could be delayed or unsynchronized, all to the detriment of the patient.
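The interoperability problem boils down to the same measurement arriving in incompatible shapes. A minimal sketch, with two entirely made-up payload formats (neither is a real vendor's schema), shows the kind of normalization layer that would be needed:

```python
from datetime import datetime, timezone

# Hypothetical payloads from two wearables reporting the same reading;
# neither dict is a real vendor format.
device_a = {"heart_rate_bpm": 72, "timestamp": "2016-11-01T08:30:00Z"}
device_b = {"hr": 72, "ts": 1477989000, "units": "bpm"}  # Unix epoch seconds

def normalize(record):
    """Map either payload onto one shared schema so downstream software
    can consume readings from both devices."""
    if "heart_rate_bpm" in record:
        return {"hr_bpm": record["heart_rate_bpm"], "time": record["timestamp"]}
    t = datetime.fromtimestamp(record["ts"], tz=timezone.utc)
    return {"hr_bpm": record["hr"], "time": t.strftime("%Y-%m-%dT%H:%M:%SZ")}

print(normalize(device_a) == normalize(device_b))  # → True
```

Without an agreed-upon shared schema, every pair of devices needs its own translation, which is exactly the maintenance burden that makes clinicians wary.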

Wearable technology is changing the way we live our lives and understand the world around us. It is modifying the way health care professionals think about patient care by emphasizing quantitative evidence for decision making over the more subjective analysis of symptoms. The ability for numeric evidence about one’s body to be documented holds people accountable for their actions. Patients can check to see if they meet their daily step target or optimal sleep count, and doctors can track the intake of a pill and see its effect on the patient’s body. For better or for worse, we won’t get the false satisfaction of achieving our fitness goal or of believing in the success of a doctor’s recommended course of action without tangible results. While we have many obstacles to overcome, wearable technology has improved the quality of life for many people and will continue to do so in the future.

References

  1. Hunt, Amber. Experts: Wearable Tech Tests Our Privacy Limits. http://www.usatoday.com/story/tech/2015/02/05/tech-wearables-privacy/22955707/ (accessed Oct. 24, 2016).
  2. Vivosmart HR+. https://buy.garmin.com/en-US/US/into-sports/health-fitness/vivosmart-hr-/prod548743.html (accessed Oct. 31, 2016).
  3. Statistics about Diabetes. http://www.diabetes.org/diabetes-basics/statistics/ (accessed Nov. 1, 2016).
  4. Accu-Chek Mobile. https://www.accu-chek.co.uk/gb/products/metersystems/mobile.html (accessed Oct. 31, 2016).
  5. GlucoTel. http://bodytel.com/portfolios/glucotel/ (accessed Oct. 31, 2016).
  6. Mobile medical applications guidance for industry and Food and Drug Administration staff. U. S. Food and Drug Administration, Feb. 9, 2015. http://www.fda.gov/downloads/MedicalDevices/DeviceRegulationandGuidance/GuidanceDocuments/UCM263366.pdf (accessed Oct. 17, 2016).
  7. Meingast, M.; Roosta, T.; Sastry, S. Security and Privacy Issues with Health Care Information Technology. http://www.cs.jhu.edu/~sdoshi/jhuisi650/discussion/secprivhealthit.pdf (accessed Nov. 1, 2016).

Algae: Pond Scum or Energy of the Future?

In many ways, rising fuel demands indicate positive development--a global increase in energy accessibility. But as the threat of climate change from burning fuel begins to manifest, it spurs the question: How can the planet meet global energy needs while sustaining our environment for years to come? While every person deserves access to energy and the comfort it brings, the population cannot afford to stand by as climate change brings about ecosystem loss, natural disaster, and the submersion of coastal communities. Instead, we need a technological solution which will meet global energy needs while promoting ecological sustainability. When people think of renewable energy, they tend to picture solar panels, wind turbines, and corn-based ethanol. But what our society may need to start picturing is that nondescript, green-brown muck that crowds the surface of ponds: algae.

Conventional fuel sources, such as oil and coal, produce energy when the carbon they contain combusts upon burning. Problematically, these sources have sequestered carbon for millions of years, hence the term fossil fuels. Releasing this carbon now increases atmospheric CO2 to levels that our planet cannot tolerate without a significant change in climate. Because fossil fuels form directly from the decomposition of plants, live plants also produce the compounds we normally burn to release energy. But, unlike fossil fuels, living biomass photosynthesizes up to the point of harvest, taking CO2 out of the atmosphere. This coupling between the uptake of CO2 by photosynthesis and the release of CO2 by combustion means using biomass for fuel should not add net carbon to the atmosphere.1 Because biofuel provides the same form of energy through the same processes as fossil fuel, but uses renewable resources and does not increase atmospheric carbon, it can viably support both societal and ecological sustainability.
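The carbon-neutrality argument above is really just bookkeeping, and it can be made concrete with a toy mass balance. This is an idealized sketch (real life-cycle analyses also count farming, processing, and transport emissions), with made-up unit quantities:

```python
# Toy carbon bookkeeping under an idealized assumption: combustion
# releases exactly the carbon the fuel contains, and nothing else
# in the fuel cycle is counted.

def net_atmospheric_co2(combustion_release, contemporary_uptake):
    """Net CO2 added to today's atmosphere by one fuel cycle (arbitrary units)."""
    return combustion_release - contemporary_uptake

# Fossil fuel: the carbon was fixed millions of years ago, so there is
# no contemporary photosynthetic uptake to offset the release.
fossil = net_atmospheric_co2(combustion_release=100, contemporary_uptake=0)

# Biofuel: the biomass drew down roughly the same CO2 while growing,
# making the cycle approximately carbon neutral.
biofuel = net_atmospheric_co2(combustion_release=100, contemporary_uptake=100)

print(fossil, biofuel)  # → 100 0
```

The asymmetry is entirely in the second argument: both fuels release carbon when burned, but only living biomass removed an offsetting amount from the modern atmosphere.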

If biofuel can come from a variety of sources such as corn, soy, and other crops, then why should we consider algae in particular? Algae populations can double every few hours, a growth rate that will be crucial for meeting current energy demands.2 And beyond just their power in numbers, algae provide energy more efficiently than other biomass sources, such as corn.1 Fat composes up to 50 percent of their body weight, making them the most productive provider of plant oil.3,2 Compared to traditional vegetable biofuel sources, algae can provide up to 50 times more oil per acre.4 Also, unlike other sources of biomass, using algae for fuel will not detract from food production. One of the primary drawbacks of growing biomass for fuel is that it competes for agricultural land and draws from resources that would otherwise be used to feed people.3 Not only do algae avoid this dilemma by growing on arid, otherwise unusable land or on water, but they also need not compete with overtaxed freshwater resources: algae proliferate easily in saltwater and even wastewater.4 Furthermore, introducing algae biofuel into the energy economy would not require a systemic change in infrastructure, because it can be processed in existing oil refineries and sold in existing gas stations.2

However, algae biofuel has yet to make its grand entrance into the energy industry. When oil prices rose in 2007, interest shifted towards alternative energy sources. U.S. energy autonomy and the environmental consequences of carbon emission became key points of discussion. Scientists and policymakers alike were excited by the prospect of algae biofuel, and research on algae drew governmental and industrial support. But as U.S. fossil fuel production increased and oil prices dropped, enthusiasm waned.2

Many technical barriers must be overcome to achieve widespread use of algae, and progress has been slow. For example, algae’s rapid growth rate is both its asset and its Achilles’ heel. Areas colonized by algae can easily become overcrowded, which blocks access to sunlight and causes large amounts of algae to die off. Therefore, in order to farm algae as a fuel source, technology must be developed to regulate its growth.3 Unfortunately, the question of how to sustainably grow algae has proved troublesome to solve. Typically, algae for biofuel use is grown in reactors in order to control growth rate. But the ideal reactor design has yet to be developed, and in fact, some current designs use more energy than the algae yield produces.5

Although algae biofuel faces technological obstacles and dwindling government interest, many scientists today still see algae as a viable and crucial solution for future energy sustainability. UC San Diego houses the California Center for Algae Biotechnology, and Dr. Stephen Mayfield, a molecular biologist at the center, has worked with algae for over 30 years. In that time he has helped start four companies, including Sapphire Energy, founded in 2007, which focuses on developing algae biofuels. After receiving $100 million from venture capitalists in 2009, Sapphire Energy built a 70,000-square-foot lab in San Diego and a 220-acre farm in New Mexico. They successfully powered cars and jets with algae biofuel, drawing attention and $600 million in further funding from ExxonMobil. Although diminished interest then stalled production, algal researchers today believe people will come to understand the potential of using algae.2 The Mayfield Lab currently works on developing genetic and molecular tools to make algae fuel a viable means of energy production.4 They grow algae, extract its lipids, and convert them to gasoline, jet, and diesel fuel. Mayfield believes his lab will reach a price as low as 80 to 85 dollars per barrel as it continues research on large-scale biofuel production.1

The advantage of growing algae for energy production lies not only in its renewability and carbon neutrality, but also its potential for other uses. In addition to just growing on wastewater, algae can treat the water by removing nitrates.5 Algae farms could also provide a means of carbon sequestration. If placed near sources of industrial pollution, they could remove harmful CO2 emissions from the atmosphere through photosynthesis.4 Additionally, algae by-products are high in protein and could serve as fish and animal feed.5

At this time of increased energy demand and dwindling fossil fuel reserves, climate change concerns caused by increased atmospheric carbon, and an interest in U.S. energy independence, we need energy sources that are economically viable but also renewable and carbon neutral.4 Algae holds the potential to address these needs. Its rapid growth and photosynthetic ability mean its use as biofuel will be a sustainable process that does not increase net atmospheric carbon. The auxiliary benefits of using algae, such as wastewater treatment and carbon sequestration, increase the economic feasibility of adopting algae biofuel. While technological barriers must be overcome before algae biofuel can be implemented on a large scale, demographic and environmental conditions today indicate that continued research will be a smart investment for future sustainability.

References

  1. Deaver, Benjamin. Is Algae Our Last Chance to Fuel the World? Inside Science, Sep. 8, 2016.
  2. Dineen, Jessica. How Scientists Are Engineering Algae To Fuel Your Car and Cure Cancer. Forbes UCVoice, Mar. 30, 2015.
  3. Top 10 Sources for Biofuel. Seeker, Jan. 19, 2015.
  4. California Center for Algae Biotechnology. http://algae.ucsd.edu/. (accessed Oct. 16, 2016).
  5. Is Algae the Next Sustainable Biofuel? Forbes StatoilVoice, Feb. 27, 2015. (republished from Dec. 2013)

First World Health Problems

I am a first generation American, as both of my parents immigrated here from Myanmar, a third world country. There had been no occurrence of any Inflammatory Bowel Disease (IBD) in my family, yet I was diagnosed with Ulcerative Colitis at the beginning of my sophomore year of high school. Since IBD is known to be caused by a mix of genetic and environmental factors,1,2 what specifically triggered me to develop Ulcerative Colitis? Was it the food in America, the air I was exposed to, a combination of the two, or neither of them at all? Did the “environment” of the first world in the United States cause me to develop Ulcerative Colitis?

IBD is a chronic autoimmune disease, characterized by persistent inflammation of the digestive tract and classified into two separate categories: Ulcerative Colitis and Crohn’s Disease.3 Currently, there is no known cure for IBD, as its pathogenesis (i.e., the manner in which it develops) is not fully understood.1 Interestingly, the incidence of IBD has increased dramatically over the past century.1 A systematic review by Molodecky et al. showed that the incidence rate of IBD was significantly higher in Western nations.5 This may be due to better diagnostic techniques or to the increased prevalence of environmental factors that promote the disease’s development, and it suggests that certain stimuli in first world countries may trigger pathogenesis in individuals with a genetic predisposition to IBD.

Environmental factors that are believed to affect IBD include smoking, diet, geographic location, social status, stress, and microbes.1 Smoking has varying effects on the development of IBD depending on the form of the disease; smoking is a key risk factor for Crohn’s Disease, while Ulcerative Colitis is usually diagnosed in non-smokers and ex-smokers.4 There have not been many studies investigating the causal relationship between diet and IBD due to the diversity of diet composition.1 However, since IBD affects the digestive system, diet has long been thought to have some impact on the pathogenesis of the disease.1 In first world countries, there is access to a larger variety of food, which may impact the prevalence of IBD; people susceptible to the disease in developing countries may have a smaller chance of being exposed to “trigger” foods. In addition, IBD has been found at higher rates in urban areas than in rural areas.1,4,5 This makes sense, as cities have a multitude of potential disease-inducing environmental factors, including pollution, poor sanitation, and microbial exposure. Higher socioeconomic status has also been linked to higher rates of IBD.4 This may be partly due to the sedentary nature of white collar work, which has likewise been linked to increased rates of IBD.1 Stress used to be viewed as a possible factor in the pathogenesis of IBD, but recent evidence indicates that it only exacerbates the disease.3 Recent research has focused on the microorganisms in the gut, called gut flora, as they seem to have a vital role in the instigation of IBD.1 In animal models, it has even been observed that pathogenesis of IBD is not possible in a germ-free environment.1 The importance of microorganisms in human health is also linked to the Hygiene Hypothesis.

The Hygiene Hypothesis states that the lack of infections in western countries is the reason for an increasing amount of autoimmune and allergic diseases.6 The idea behind the theory is that some infectious agents guard against a wide variety of immune-related disorders.6 Animal models and clinical trials have provided some evidence backing the Hygiene Hypothesis, but it is hard to causally attribute the pathogenesis of autoimmune and allergic diseases to a decrease in infections, since first world countries have very different environmental factors than third world countries.6

The increasing incidence of IBD in developed countries is not yet fully understood, but recent research points towards a complex combination of environmental and genetic factors. The rise in autoimmune disease diagnoses may also be attributed to better medical equipment and facilities, as well as the tendency of people in more developed countries to get checked regularly by a doctor. There are many difficulties in researching the pathogenesis of IBD, including isolating individual environmental factors and obtaining tissue and data from third world countries. However, much promising research is underway, and it may not be long until we discover a cure for IBD.

References

  1. Danese, S., et al. Autoimmun. Rev. 2004, 3, 394-400.
  2. Podolsky, D. K. N. Engl. J. Med. 2002, 347, 417-429.
  3. Mayo Clinic. Inflammatory Bowel Disease (IBD). http://www.mayoclinic.org/diseases-conditions/inflammatory-bowel-disease/basics/definition/con-20034908 (accessed Sep. 30, 2016).
  4. CDC. Epidemiology of the IBD. https://www.cdc.gov/ibd/ibd-epidemiology.htm (accessed Oct. 17, 2016).
  5. Molodecky, N., et al. Gastroenterology 2012, 142, n. pag.
  6. Okada, H., et al. Clin. Exp. Immunol. 2010, 160, 1-9.

Corals in Hot Water, Literally

Coral reefs support more species per unit area than any other marine environment, provide over half a billion people worldwide with socio-economic benefits, and produce an estimated $30 billion USD annually.1 Many people do not realize that these diverse ecosystems are at risk of extinction as a result of human activity--the Caribbean has already lost 80% of its coral cover in the past few decades,2 and some estimates report that at least 60% of all coral will be lost by 2030.1 One of the most direct threats to the health of these fragile ecosystems is the enormous amount of carbon dioxide and methane that has spilled into the atmosphere, warming the planet and its oceans at unprecedented rates.

Corals are Cnidarians, the phylum characterized by simple, radially symmetrical anatomy. Corals reproduce either asexually or sexually and create stationary colonies made up of hundreds of genetically identical polyps.3 The major reef-building corals belong to a sub-order of corals called Scleractinia. These corals contribute substantially to the reef framework and are key species in building and maintaining the structural complexity of the reef.3 The survival of this group is of particular concern, since mass die-offs of these corals affect the integrity of the reef. Corals form a symbiosis with tiny single-celled algae of the genus Symbiodinium. This symbiotic relationship supports incredible levels of biodiversity, and it is a beautifully intricate relationship that is quite fragile to sudden environmental change.3

The oceans absorb nearly half of the carbon dioxide in the atmosphere through chemical processes that occur at their surface.4 Dissolved carbon dioxide combines with water molecules to form carbonic acid, which dissociates into bicarbonate and carbonate ions. Carbonate is an important building block that many marine organisms combine with calcium to secrete their calcareous shells or skeletons. The increase of carbon dioxide in the atmosphere shifts this chemical equilibrium, creating higher levels of carbonic acid and leaving less carbonate available for calcification.4 Carbonic acid increases the acidity of the ocean, and this phenomenon has been shown to affect the skeletal formation of juvenile corals.5 Acidification weakens the structural integrity of coral skeletons and contributes to heightened dissolution of carbonate reef structure.3
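The equilibrium chemistry behind this shift can be written out explicitly (standard seawater carbonate chemistry):

```latex
\begin{align*}
\mathrm{CO_2 + H_2O} &\rightleftharpoons \mathrm{H_2CO_3} && \text{dissolved CO$_2$ hydrates to carbonic acid}\\
\mathrm{H_2CO_3} &\rightleftharpoons \mathrm{H^+ + HCO_3^-} && \text{first dissociation: bicarbonate}\\
\mathrm{HCO_3^-} &\rightleftharpoons \mathrm{H^+ + CO_3^{2-}} && \text{second dissociation: carbonate}\\
\mathrm{Ca^{2+} + CO_3^{2-}} &\rightleftharpoons \mathrm{CaCO_3} && \text{calcification (coral skeleton)}
\end{align*}
```

Adding CO2 drives the first two equilibria to the right, releasing H+ and lowering pH; the extra H+ then recombines with carbonate ions, pulling the second dissociation back to the left and leaving less carbonate for corals to build skeletons from.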

The massive influx of greenhouse gases into our atmosphere has also caused the planet to warm very quickly. Corals are in hot water, literally. Warmer ocean temperatures have deadly effects on corals and stress the symbiosis that corals have with the algae that live in their tissues. Though coral can procure food by snatching plankton and other organisms with protruding tentacles, they rely heavily on the photosynthesizing organism Symbiodinium for most of their energy supply.3 Symbiodinium provides fixed carbon compounds and sugars necessary for coral skeletal growth. The coral provides the algae with a fixed position in the water column, protection from predators, and supplementary carbon dioxide.3 Symbiodinium live under conditions that are 1 to 2° C below their maximum upper thermal limit. Under warmer conditions due to climate change, sea surface temperatures can rise a few degrees above their maximum thermal limit. This means that a sudden rise in sea temperatures can stress Symbiodinium by causing photosynthetic breakdown and the formation of reactive oxygen species that are toxic to corals.3 The algae leave or are expelled from the coral tissues as a mechanism for short-term survival in what is known as bleaching. Coral will die from starvation unless the stressor dissipates and the algae return to the coral’s tissues.3

Undoubtedly, the warming of the seas is one of the most widespread threats to coral reef ecosystems. However, other threats combined with global warming may have synergistic effects that heighten the vulnerability of coral to higher temperatures. These threats include coastal development that either destroys local reefs or displaces sediment to nearby reefs, smothering them. Large human populations near coasts expel high amounts of nitrogen and phosphorous into the ecosystem, which can increase the abundance of macroalgae and reduce hard coral cover. Increased nutrient loading has been shown to be a factor contributing to a higher prevalence of coral disease and coral bleaching.6 Recreational fishing and other activities can cause physical injury to coral making them more susceptible to disease. Additionally, fishing heavily reduces population numbers of many species of fish that keep the ecosystem in balance.

The first documented global bleaching event, in 1998, killed off an estimated 16% of the world’s reefs; the third global bleaching event began only last year.1 Starting in mid-2015, an El Niño Southern Oscillation (ENSO) weather event spurred hot sea surface temperatures that decimated coral reefs across the Pacific, starting with Hawaii, then hitting places like American Samoa, Australia, and reefs in the Indian Ocean.7 The aftermath in the Great Barrier Reef is stunning; the northern portion of the reef experienced an average of 67% mortality.8 Some of these reefs, such as the ones surrounding Lizard Island, have been reduced to coral skeletons draped in macroalgae. With climate change, it is expected that ENSO events will become more frequent, and reefs around the world will be exposed to greater thermal stress.1

Some scientists are hopeful that corals may be able to acclimatize in the short term and adapt in the long term to warming ocean temperatures. The key to this process lies in the genetic type of Symbiodinium that reside in the coral tissues. There are over 250 identified types of Symbiodinium, and genetically similar types are grouped into clades A-I. The different clades of these algae have the potential to affect the physiological performance of their coral host, including responses to thermotolerance, growth, and survival under more extreme light conditions.3 Clade D symbiont types are generally more thermotolerant than those in other clades. Studies have shown a low abundance of Clade D organisms living in healthy corals before a bleaching event, but after bleaching and subsequently recovering, the coral has a greater abundance of Clade D within its tissues.9,10 Many corals are generalists and have the ability to shuffle their symbiont type in response to stress.11

However, there is a catch. Though some algal members of Clade D are highly thermotolerant, they are also known as selfish opportunists. The reason healthy, stress-free corals generally do not have a symbiosis with this clade is that it tends to hoard the energy and organic compounds it creates from photosynthesis and shares fewer products with its coral host.3

Approaches that seemed too radical a decade ago are now widely considered the only means to save coral reefs from the looming threat of extinction. Ruth Gates, a researcher at the Hawaii Institute of Marine Biology, is exploring the idea of assisted evolution in corals. Her experiments include breeding individual corals in the lab, exposing them to an array of stressors, such as higher temperatures and lower pH, and picking the hardiest survivors to transplant to reefs.12 In other areas of the globe, scientists are breeding coral larvae in labs and then releasing them onto degraded reefs, where they will hopefully settle and form colonies.

Governments and policymakers can create policies that have a significant impact on the health of reefs. The creation of marine protected areas that heavily regulate or outlaw the harvesting of marine species offers sanctuary to a stressed and threatened ecosystem.3 There is still a long way to go, and the discoveries being made so far about coral physiology and resilience are proving that the coral organism is incredibly complex.

The outlook on the future of healthy reefs is bleak; rising fossil fuel consumption rates mock the global goal of keeping rising temperatures below two degrees Celsius. Local stressors such as overfishing, pollution, and coastal development cause degradation of reefs worldwide. Direct human interference in the acclimatization and adaptation of corals may be instrumental to their survival. Rapid transitions to cleaner sources of energy, the creation of more marine protection areas, and rigid management of reef fish stocks may ensure coral reef survival. If humans fail in this endeavor, one of the most biodiverse and productive ecosystems on earth that has persisted for millions of years may come crashing to an end within our lifetime.

References

  1. Cesar, H.; Burke, L.; Pet-Soede, L. The Economics of Worldwide Coral Reef Degradation; Cesar Environmental Economics Consulting: Arnhem, The Netherlands, 2003. http://pdf.wri.org/cesardegradationreport100203.pdf (accessed Dec. 14, 2016).
  2. Gardner, T. A., et al. Science 2003, 301, 958-960.
  3. Sheppard, C.; Davy, S.; Pilling, G. The Biology of Coral Reefs; Biology of Habitats Series; Oxford University Press, 2009.
  4. Branch, T. A., et al. Trends Ecol. Evol. 2013, 28, 178-185.
  5. Foster, T., et al. Sci. Adv. 2016, 2, e1501130.
  6. Vega Thurber, R. L., et al. Glob. Change Biol. 2013, 20, 544-554.
  7. NOAA Coral Watch. NOAA Declares Third Ever Global Coral Bleaching Event. Oct. 8, 2015. http://www.noaanews.noaa.gov/stories2015/100815-noaa-declares-third-ever-global-coral-bleaching-event.html (accessed Dec. 15, 2016).
  8. ARC Centre of Excellence for Coral Reef Studies. Life and Death after the Great Barrier Reef Bleaching. Nov. 29, 2016. https://www.coralcoe.org.au/media-releases/life-and-death-after-great-barrier-reef-bleaching (accessed Dec. 13, 2016).
  9. Jones, A. M., et al. Proc. R. Soc. B 2008, 275, 1359-1365.
  10. Silverstein, R., et al. Glob. Change Biol. 2014, 1, 236-249.
  11. Correa, A. S.; Baker, A. C. Glob. Change Biol. 2010, 17, 68-75.
  12. Mascarelli, M. Nature 2014, 508, 444-446.

East Joins West: The Rise of Integrative Medicine

An ancient practice developed thousands of years ago and still used by millions of people all over the world, Traditional Chinese Medicine (TCM) has undoubtedly played a role in the field of medicine. But just what is TCM? Is it effective? And can it ever be integrated with Western medicine?

The techniques of TCM stem from the beliefs upon which it was founded. The theory of the yin and yang balance holds that all things in the universe are composed of a balance between the forces of yin and yang. While yin is generally associated with objects that are dark, still, and cold, yang is associated with objects that are bright, warm, and in motion.1 In TCM, illness is believed to be a result of an imbalance of yin or yang in the body. For instance, when yin does not cool yang, yang rises, resulting in headaches, flushing, sore eyes, and sore throats. When yang does not warm yin, poor circulation of blood, lethargy, pallor, and cold limbs result. TCM aims to determine the nature of the disharmony and correct it through a variety of approaches. As balance is restored in the body, so is health.2

Another fundamental concept of TCM is the idea of qi, the energy or vital force responsible for controlling the functions of the human mind and body. Qi flows through the body through 12 meridians, or channels, that correspond to the 12 major organ systems, and 8 extra meridians that are all interconnected with the major channels. Just as with an imbalance between yin and yang, disruption to this flow causes disease, and correction of the flow restores the body to balance.2 In TCM, disease is not viewed as something that a patient has. Rather, it is something that the patient is. There is no isolated entity called “disease,” but only a whole person whose body functions may be balanced or imbalanced, harmonious or disharmonious.3 Thus, TCM practitioners aim to increase or decrease qi in the body to create a healthy yin-yang balance through various techniques such as acupuncture, herbal medicine, nutrition, and mind/body exercise (tai chi, yoga). Some dismiss Eastern treatments as superfluous to the recovery process, and even harmful if used in place of more conventional treatments. However, evidence indicates that Eastern treatments can be very effective parts of recovery plans.

The most common TCM treatments are acupuncture, which involves inserting needles at precise meridian points, and herbal medicine, which refers to using plant products (seeds, berries, roots, leaves, bark, or flowers) for medicinal purposes. Acupuncture seeks to improve the body’s functions by stimulating specific anatomic sites--commonly referred to as acupuncture points, or acupoints. It releases the blocked qi in the body, which may be causing pain, lack of function, or illness. Although the effects of acupuncture are still being researched, results from several studies suggest that it can stimulate function in the body and induce its natural healing response through various physiological systems.4 According to the World Health Organization (WHO), acupuncture is effective for treating 28 conditions, while limited but probable evidence suggests it may be effective for many more. Acupuncture seems to have gained the most clinical acceptance as a pain reduction therapy. An international team of experts pooled the results of 29 studies on chronic pain involving nearly 18,000 participants--some had acupuncture, some had “sham” acupuncture, and some did not have acupuncture at all. Overall, the study found acupuncture treatments to be superior to both a lack of acupuncture treatment and sham acupuncture treatments for the reduction of chronic pain, suggesting that such treatments are a reasonable option for afflicted patients.5 According to a study carried out at the Technical University of Munich, people with tension headaches and/or migraines may find acupuncture to be very effective in alleviating their symptoms.6 Another study, at the University of Texas M.D. Anderson Cancer Center, found that twice-weekly acupuncture treatments relieved debilitating symptoms of xerostomia--severe dry mouth--among patients undergoing radiation for head and neck cancer.7 Additionally, acupuncture has been demonstrated both to enhance performance in the memory-related brain regions of mild cognitive impairment patients (who have an increased risk of progressing towards Alzheimer’s disease),8 and to provide therapeutic advantages in regulating inflammation in infection and inflammatory disease.9

Many studies have also demonstrated the efficacy of herbal medicine in treating various illnesses. The WHO has estimated that 80% of people worldwide rely on herbal medicines for some part of their primary health care. Researchers from the University of Adelaide have shown that a mixture of extracts from the roots of two medicinal herbs, Kushen and Baituling, kills cancer cells.10 Other scientists have concluded that herbal plants have the potential to delay the development of diabetic complications, although more investigation is needed to characterize this antidiabetic effect.11 Finally, a study found that Chinese herbal formulations appeared to alleviate symptoms for some patients with irritable bowel syndrome, a common functional bowel disorder characterized by chronic or recurrent abdominal pain that currently lacks any reliable medical treatment.12

Both TCM and Western medicine seek to ease pain and improve function. Can the two be combined? TCM was largely ignored by Western medical professionals until recent years, but it is slowly gaining traction among scientists and clinicians as studies show that an integrative approach can be effective. For patients dealing with chronic pain, for instance, Western medicine can stop the pain quickly with medication or interventional therapy, while TCM can offer a longer-lasting remedy for the underlying illness with milder side effects.13 A study by Cardiff University’s School of Medicine and Peking University in China showed that combining TCM and Western medicine could offer hope for developing new treatments for liver, lung, bone, and colorectal cancers.14 Studies on the use of traditional Chinese medicines to treat diseases such as bronchial asthma, atopic dermatitis, and IBS likewise suggest that an interdisciplinary approach to TCM may lead to the discovery of new medicines.15

TCM is still a developing field in the Western world, and more research and clinical trials on its benefits and mechanisms are underway. While TCM methods such as acupuncture and herbal medicine must be examined further before they are accepted as credible techniques in modern medicine, they have been shown to help treat various illnesses and conditions. TCM is therefore unlikely to be a suitable standalone option for disease management, but it has a place in treatment plans alongside Western medicine. Used as a complement to Western medicine, TCM offers hope for increasing the effectiveness of healthcare.

References

  1. Yin and Yang Theory. Acupuncture Today. http://www.acupuncturetoday.com/abc/yinyang.php (accessed Dec. 15, 2016).
  2. Lao, L. et al. Integrative Pediatric Oncology. 2012, 125-135.
  3. The Conceptual Differences between Chinese and Western Medicine. http://www.mosherhealth.com/mosher-health-system/chinese-medicine/chinese-versus-western (accessed Dec. 15, 2016).
  4. How Acupuncture Can Relieve Pain and Improve Sleep, Digestion, and Emotional Well-being. http://cim.ucsd.edu/clinical-care/acupuncture.shtml (accessed Dec. 15, 2016).
  5. Vickers, A. J. et al. Arch. Intern. Med. 2012, 172, 1444-1453.
  6. Melchart, D. et al. BMJ. 2005, 331, 376-382.
  7. Meng, Z. et al. Cancer. 2012, 118, 3337-3344.
  8. Feng, Y. et al. Magn. Reson. Imaging. 2012, 30, 672-682.
  9. Torres-Rosas, R. et al. Nat. Med. 2014, 20, 291-295.
  10. Qu, Z. et al. Oncotarget. 2016, 7, 66003-66019.
  11. Bnouham, M. et al. Int. J. Diabetes Metab. 2006, 14, 1.
  12. Bensoussan, A. et al. JAMA. 1998, 280, 1585-1589.
  13. Jiang, W. Trends Pharmacol. Sci. 2005, 26, 558-563.
  14. Combining Chinese, Western medicine could lead to new cancer treatments. https://www.sciencedaily.com/releases/2013/09/130928091021.htm (accessed Dec. 15, 2016).
  15. Yuan, R.; Lin, Y. Pharmacol. Ther. 2000, 86, 191-198.


Transplanting Time


Nowadays, patients with organ failure can live for decades after receiving an organ transplant. Since the first successful kidney transplant in the 1950s,1,2 advances in the procedure, including better drugs that help the body accept foreign organs,3 have allowed surgeons to transplant a wider variety of organs, such as the heart, lungs, liver, and pancreas.2,4 Over 750,000 lives have been saved or extended through organ transplants, an unthinkable feat just over 50 years ago.2 Yet limitations such as the shortage of available organs, along with new advances that could improve the process, keep the ethics of transplantation under ongoing discussion.

The idea behind an organ transplant is simple. When both the recipient and the new organ are ready, surgeons detach the blood vessels from the failing organ, put the new organ in its place, and reattach the patient’s blood vessels to it. To prevent rejection of the new organ, the recipient must take immunosuppressant drugs for life.3 In exchange for this lifelong commitment, the patient often receives a longer, more enjoyable life.2

The organs used in transplants usually come from a cadaver or a living donor.1-3 Some individuals are deterred from becoming organ donors by the concern that doctors will not do their best to save them if their organs are needed. This concern is further complicated by blurred definitions of “dead”: in one ethically ambiguous situation, dying patients who are brain dead may be taken off life support so that their organs can be donated.1-3 Stories of patients who reawaken from comas after being pronounced “dead” may give some encouragement, but a patient’s family and doctors must decide when to give up that hope. Aside from organs received from the deceased, living donors, who may be family, friends, or strangers to the recipient, may donate organs that they can live without, such as a lung or a kidney.1-3 However, potentially injuring a healthy person for the sake of another may contradict the physician’s oath to help, not harm, patients.1

One of the most pressing issues today stems from the following question: who receives the organs? The transplant waiting list is constantly growing because the number of organs needed greatly exceeds the number of organs that are available.1-3 Unfortunately, 22 patients die every day while they are waiting for a new organ.4 Because the issue of receiving a transplant is time-sensitive, medical officials must decide who receives a transplant first. Should the person who needs a transplant the most have greater priority over another who has been on the waiting list longer? Should a child be eligible before a senior? Should a lifelong smoker be able to obtain a new lung? Currently, national policy takes different factors into account depending on the organ to be transplanted. For example, other than compatibility requirements, patients on the waiting list for liver transplants are ranked solely on their medical need and distance from the donor hospital.4 On the other hand, people waiting for kidneys are further considered based on whether they have donated a kidney previously, their age, and their time spent on the waiting list.4
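To see how multi-factor ranking of this kind works in principle, here is a deliberately simplified sketch in Python. It is not the actual OPTN allocation algorithm: the field names, weights, and scores are invented purely to illustrate how urgency, waiting time, and distance could be combined into a single priority ordering.

```python
# Toy illustration of multi-factor transplant ranking (NOT the real OPTN
# policy). Candidates are scored on medical urgency, waiting time, and
# proximity to the donor hospital, then sorted by score.
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    urgency: int        # 1 (low) .. 10 (critical); invented scale
    days_waiting: int
    distance_km: float  # distance from donor hospital

def priority_score(c: Candidate) -> float:
    # Invented weights: urgency dominates; waiting time and proximity break ties.
    return c.urgency * 100 + c.days_waiting * 0.5 - c.distance_km * 0.1

candidates = [
    Candidate("A", urgency=9, days_waiting=30, distance_km=500),
    Candidate("B", urgency=9, days_waiting=400, distance_km=50),
    Candidate("C", urgency=4, days_waiting=900, distance_km=10),
]
ranked = sorted(candidates, key=priority_score, reverse=True)
print([c.name for c in ranked])  # B outranks A: same urgency, longer wait, closer
```

Even this toy version shows why allocation is contentious: changing a single weight reorders the list, which is exactly the kind of value judgment the real policies must encode organ by organ.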

Despite various efforts to increase the number of organ donors through education and legislation, the supply of organs does not meet the current and increasing need for them.1-3 As a result, other methods of obtaining these precious resources are currently being developed, one of which is the use of animal organs, a process known as xenotransplantation. Different animal cells, tissues, and organs are being researched for use in humans, giving some hope to those on the waiting list or those who do not quite qualify for a transplant.2,3 In the past, surgeons have attempted to use a primate’s heart and liver for transplantation, but the surgical outcomes were poor.2 Other applications of animal tissue are more promising, such as the use of pigs’ islet cells, which can produce insulin, in humans.2 However, a considerable risk of using these animal parts is that new diseases may be passed from animal to human. Additionally, animal rights groups have protested the use of primates as a source of whole organs.2

Another possible solution to the deficit of organs is the use of stem cells, which have the potential to grow and specialize. Embryonic stem cells can repair and regenerate damaged organs, but harvesting them destroys the source embryo.2,3 Although the embryos are created outside of humans, there are objections to their use. What differentiates a mass of cells from a living person? Fortunately, adult stem cells can be used for treatment as well.2 Researchers have developed a new method that causes adult stem cells to return to a state similar to that of the embryonic stem cells, although the efficacy of the induced adult stem cells compared to the embryonic stem cells is still unclear.7

Regardless of the continuing controversy over the ethics of transplantation, the boundaries of organ transplantation are being pushed further and further. Head transplants have been attempted for over a century on other animals, such as dogs,5 but several doctors want to move on to humans. To attach a head to a new body, the surgeon would need to connect the old and new nerves in the spinal cord so that the patient’s brain could control the host body. Progress is already being made in repairing severe spinal cord injuries. In China, Dr. Ren Xiaoping plans to attempt a complete body transplant, believed by some to be currently impossible.6 Little is known about the amount of pain the recipient of a body transplant would endure,5 so the procedure may ultimately decrease, rather than increase, the patient’s quality of life. Overall, most agree that it would be unethical to proceed, considering the limited success of such projects and the high chance of failure and death.

Organ transplants and new developments in the field have raised many interesting questions about the ethics of the organ transplantation process. As a society, we should determine how to address these problems and set boundaries to decide what is “right.”

References

  1. Jonsen, A. R. Virtual Mentor. 2012, 14, 264-268.
  2. Abouna, G. M. Med. Princ. Pract. 2002, 12, 54-69.
  3. Paul, B. et al. Ethics of Organ Transplantation. University of Minnesota Center for Bioethics [Online], February 2004. http://www.ahc.umn.edu/img/assets/26104/Organ_Transplantation.pdf (accessed Nov. 4, 2016).
  4. Organ Procurement and Transplantation Network. https://optn.transplant.hrsa.gov/ (accessed Nov. 4, 2016).
  5. Lamba, N. et al. Acta Neurochir. 2016.
  6. Tatlow, D. K. Doctor’s Plan for Full-Body Transplants Raises Doubts Even in Daring China. The New York Times. http://www.nytimes.com/2016/06/12/world/asia/china-body-transplant.html?_r=0 (accessed Nov. 4, 2016).
  7. National Institutes of Health. stemcells.nih.gov/info/basics/6.htm (accessed Jan. 23, 2017).

 


Venom, M.D.: How Some of the World’s Deadliest Toxins Fight Cancer


Nature, as mesmerizing as it can be, is undeniably hostile. There are endless hazards, both living and nonliving, scattered throughout all parts of the planet. At first glance, the world seems to be quite unwelcoming. Yet through science, humans find ways to survive nature and gain the ability to see its beauty. A fascinating way this is achieved involves taking one deadly element of nature and utilizing it to combat another. In labs and universities across the world today, scientists are fighting one of the world’s most devastating diseases, cancer, with a surprising weapon: animal toxins.

Various scientists around the globe are collecting venomous or poisonous animals and studying the biochemical weapons they synthesize. In their natural forms, these toxins could kill or cause devastating harm to the human body. However, by closely inspecting the chemical properties of these toxins, we have uncovered many potential ways they could help us understand, treat, and cure various diseases. These discoveries have shed a new light on many of the deadly animals we have here on Earth. Mankind may have gained new friends—ones that could be crucial to our survival against cancer and other illnesses.

Take the scorpion, for example. This arachnid exists in hundreds of species across the globe. Although its stinger is primarily used for killing prey, it is often used for defense against other animals, including humans. Most scorpion stings cause nothing more than pain, swelling, and numbness at the site. However, some species are capable of causing more severe symptoms, including death.1 One such species, Leiurus quinquestriatus (more commonly known as the deathstalker scorpion), possesses one of the most potent venoms on the planet.2 Yet despite its potency, deathstalker venom is a prime target for cancer research. A team of scientists from the University of Washington used the chlorotoxin in the venom to assist in gene therapy (the insertion of genes to fight disease) against glioma, a widespread and fatal brain cancer. Chlorotoxin has two important properties that make it effective in fighting glioma. First, it selectively binds to a surface protein found on many tumor cells. Second, it inhibits the spread of tumors by disabling their metastatic ability. The scientists combined the toxin with nanoparticles to increase the effectiveness of the gene therapy.3,4

Other scientists have found a different way to treat glioma using deathstalker venom. Researchers at the Transmolecular Corporation in Cambridge, Massachusetts produced an artificial version of the venom and attached it to a radioactive isotope of iodine, I-131. The resulting compound was able to find and kill glioma cells by releasing radiation, most of which was absorbed by the cancerous cells.5 Other scorpion species are aiding cancer research as well, such as Centruroides tecomanus in Mexico. This species’ venom contains peptides that specifically target lymphoma cells and kill them by damaging their ion channels. The selectivity of these peptides makes them especially attractive as a cancer treatment because they leave healthy cells untouched.6

Scorpions have demonstrated tremendous medical potential, but they are far from the only animals that could contribute to the fight against cancer. Another animal that may help us overcome this disease is the wasp. To most people, wasps are nothing more than annoying pests that disturb our outdoor life, known for the painful stings they use both for defense and for hunting. Yet science has shown that the venom of these insects may have medicinal properties. Researchers from the Institute for Biomedical Research in Barcelona investigated a peptide found in wasp venom for its ability to treat breast cancer. The peptide kills cancer cells by puncturing the cell membrane; to be useful in treatment, it must target cancer cells specifically. The scientists solved this specificity problem by conjugating the venom peptide with a targeting peptide that binds cancer cells.7 Similar techniques were used in Brazil, where scientists at São Paulo State University studied another member of the wasp family, Polybia paulista. This animal’s venom contains MP1, a peptide that likewise attacks the plasma membrane. In a healthy cell, certain membrane components, namely the phospholipids phosphatidylserine (PS) and phosphatidylethanolamine (PE), face the interior of the cell; in a cancerous cell, they sit on the outer surface of the membrane. In a series of simulations, MP1 selectively and aggressively attacked membranes with PS and PE on the outside. Evidently, targeted administration of wasp toxins is a viable strategy against cancer.8

Amazingly enough, the list of cancer-fighting animals at our disposal does not end here. One of the most feared creatures on Earth, the snake, is also under scientific investigation for possible medical breakthroughs. One group of scientists discovered that a compound in the venom of the Southeast Asian pit viper Calloselasma rhodostoma binds to a platelet receptor protein called CLEC-2, causing the blood to clot. A different molecule, podoplanin, which is expressed by cancer cells, binds CLEC-2 in a similar manner and likewise causes clotting. Why does this matter? Tumors induce blood clots to shield themselves from the immune system, allowing them to grow freely, and they induce the formation of lymphatic vessels to assist their survival. The interaction between CLEC-2 and podoplanin is vital for the formation of both these clots and lymphatic vessels, and is thus critical to the persistence of tumors. A drug that inhibits this interaction could therefore be effective in cancer treatment and prevention, and research on the snake venom may help us develop such an inhibitor.9

Even though there may be deadly animals roaming the Earth, it is important to remember that they have done more for us than most people realize. So next time you see a scorpion crawling around or a wasp buzzing in the air, react with appreciation, rather than with fear. Looking at our world in this manner will make it seem like a much friendlier place to live.

References

  1. Mayo Clinic. http://www.mayoclinic.org/diseases-conditions/scorpion-stings/home/ovc-20252158 (accessed Oct. 29, 2016).
  2. Ross, L. K. Leiurus quinquestriatus (Ehrenberg, 1828). The Scorpion Files, 2008. http://www.ntnu.no/ub/scorpion-files/l_quinquestriatus_info.pdf (accessed Nov. 3, 2016).
  3. Kievit, F. M. et al. ACS Nano. 2010, 4, 4587-4594.
  4. University of Washington. Scorpion Venom With Nanoparticles Slows Spread Of Brain Cancer. ScienceDaily, April 17, 2009. www.sciencedaily.com/releases/2009/04/090416133816.htm
  5. Health Physics Society. Radioactive Scorpion Venom For Fighting Cancer. ScienceDaily, June 27, 2006. www.sciencedaily.com/releases/2006/06/060627174755.htm
  6. Investigación y Desarrollo. Scorpion venom is toxic to cancer cells. ScienceDaily, May 27, 2015. www.sciencedaily.com/releases/2015/05/150527091547.htm
  7. Moreno, M. et al. J. Control. Release. 2014, 182, 13-21.
  8. Leite, N. B. et al. Biophys. J. 2015, 109, 936-947.
  9. Suzuki-Inoue, K. et al. J. Biol. Chem. 2010, 285, 24494-24507.

 


Fire the Lasers


Imagine a giant solar harvester in geosynchronous orbit that, using solar energy, beams radiation to a single point 36,000 km away. It would look like a space weapon straight out of Star Wars. Surprisingly, this concept might be the next “moonshot” project that humanity needs to move forward. In space-based solar power generation, a solar harvester like the one above would generate direct current (DC) from solar radiation using photovoltaic cells and then convert it into microwaves. These microwaves would be beamed to a rectifying antenna (or rectenna) on the ground, which would convert them back into DC. Finally, a converter would change the DC to AC to be supplied to the grid.1
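The end-to-end yield of this chain is just the product of each stage's efficiency, which is why every conversion step matters. The sketch below walks through the chain with illustrative efficiency figures chosen for the example; they are assumptions, not measured values from any specific SSPS design.

```python
# Sketch of the SBSP energy chain: sunlight -> DC (photovoltaic) -> microwave
# -> capture at the rectenna -> DC -> grid AC. All efficiencies are
# illustrative assumptions for this example only.
stage_efficiency = {
    "photovoltaic (sunlight -> DC)": 0.30,
    "DC -> microwave conversion":    0.80,
    "beam capture at rectenna":      0.90,
    "rectenna (microwave -> DC)":    0.85,
    "DC -> AC inversion":            0.95,
}

power_mw = 5000.0  # assumed solar power intercepted by the collector, in MW
for stage, eff in stage_efficiency.items():
    power_mw *= eff
    print(f"after {stage}: {power_mw:.0f} MW")

# End-to-end efficiency is the product of the stage efficiencies, about 17%
# here: roughly 872 MW delivered to the grid from 5000 MW of sunlight.
```

The multiplicative structure explains why SBSP research concentrates on the weakest links, such as DC-to-microwave conversion and beam capture, rather than on any one stage in isolation.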

With ever-increasing global energy consumption and rising concerns about climate change driven by the burning of fossil fuels, interest in alternative energy sources is growing. Although renewable energy technology improves every year, its current capacity is not enough to obviate the need for fossil fuels. Wind and solar sources currently have capacity factors (the ratio of an energy source’s actual output over a period of time to its potential output) of around 34 and 26 percent, respectively; nuclear and coal, by comparison, have capacity factors of about 90 and 70 percent.2 Generation of energy using space solar power satellites (SSPSs) could pave the way to a cleaner future. Unlike traditional solar power, which relies on favorable weather conditions, SSPSs would allow continuous, green energy generation.
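The capacity-factor definition above is easy to make concrete: divide the energy actually produced over a period by what the plant would produce running flat out the whole time. A minimal sketch, with illustrative numbers:

```python
# Capacity factor = actual energy produced over a period divided by the
# energy that would be produced at continuous full rated output.
def capacity_factor(actual_mwh: float, rated_mw: float, hours: float) -> float:
    return actual_mwh / (rated_mw * hours)

HOURS_PER_YEAR = 8760

# Illustrative example: a 1 MW wind turbine producing 2978.4 MWh in a year
# has a capacity factor of about 0.34, matching the typical wind figure above.
print(capacity_factor(2978.4, rated_mw=1.0, hours=HOURS_PER_YEAR))
```

An orbiting collector in continuous sunlight is attractive precisely because its capacity factor would approach 1, weather and night no longer apply.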

Although space-based solar power (SBSP) might sound novel, scientists have been flirting with the idea since Dr. Peter Glaser introduced the concept in 1968. Essentially, SBSP systems can be characterized by three elements: a large solar collector in geostationary orbit fitted with reflective mirrors, wireless transmission via microwave or laser, and a receiving station on Earth equipped with rectennas.3 Such an implementation would require complete proficiency in reliable space transportation, efficient power generation and capture, practical wireless power transmission, economical satellite design, and precise satellite-antenna calibration. Collectively, these goals might seem insurmountable, but taken separately, each is feasible. Using the principles of optics, scientists are optimizing space-station design to maximize energy collection.4 Advances in rectennas now allow the capture of even weak, ambient microwaves.5 With the pace of progress quickening every year, it is easy to feel that the future of renewable energy is rapidly approaching. However, these advancements will remain confined to the literature unless there is a global push to deploy SBSP.

The Japan Aerospace Exploration Agency (JAXA) has taken the lead in translating SBSP from the page to the launch pad. With few fossil fuel resources of its own and in desperate need of alternative energy sources after the 2011 incident at the Fukushima Daiichi nuclear plant, Japan has proposed a 25-year technological roadmap to a one-gigawatt SSPS station. To accomplish this incredible feat, Japan plans to deploy a 10,000-metric-ton solar collector in geostationary orbit around Earth.6 Surprisingly, the difficult part is not building and launching the giant solar collector; it is the technical challenge of transmitting the energy back to Earth accurately and efficiently. This is where JAXA has focused its research.

Historically, wireless power transmission has been accomplished via laser or microwave. The two are similar in many ways, but for SBSP, microwaves are the clear winner: their longer wavelengths (usually between five and ten centimeters, versus roughly one micrometer for lasers) penetrate Earth’s atmosphere far better.7 Accordingly, JAXA has focused on generating powerful, accurately aimed microwaves. It has developed kW-class high-power microwave transmission using phased, synchronized power-transmitting antenna panels and, to work around the limitations of current communication technologies, advanced retrodirective systems that allow high-accuracy beam pointing.8 In 2015, JAXA delivered 1.8 kilowatts accurately to a rectenna 55 meters away, which, according to JAXA, is the first time that so much power has been transmitted with such precision. Although this may seem insignificant compared to the 36,000-km transmission required from geosynchronous orbit, it is a huge achievement: it demonstrates that large-scale wireless transmission is a realistic option for powering electric cars, transmission towers, and even satellites. Continuing along its roadmap, JAXA plans to conduct the first microwave power transmission in space by 2018.
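Why beam pointing over 36,000 km is so hard can be seen from a back-of-the-envelope diffraction estimate. The sketch below uses the standard diffraction-limit formula for the half-angle of a circular aperture's main lobe; the 5.8 GHz frequency and 1 km transmitter diameter are assumed round numbers for illustration, not JAXA specifications.

```python
# Rough diffraction-limited beam-spread estimate for microwave power
# beaming from geostationary orbit. Frequency and aperture diameter are
# assumed illustrative values, not parameters of any actual system.
c = 3.0e8            # speed of light, m/s
f = 5.8e9            # assumed transmit frequency, Hz
D = 1000.0           # assumed transmitting aperture diameter, m
R = 3.6e7            # transmission distance, m (geostationary orbit)

wavelength = c / f                   # ~5.2 cm
theta = 1.22 * wavelength / D        # main-lobe half-angle, radians
spot_radius_m = theta * R            # ground spot radius, ~2.3 km here

print(f"wavelength: {wavelength * 100:.1f} cm")
print(f"ground spot radius: {spot_radius_m / 1000:.1f} km")
```

Even with a kilometer-scale transmitter, the ground spot spans kilometers, which is why SBSP concepts pair enormous orbital antennas with equally enormous rectenna farms, plus retrodirective pointing to keep the beam centered.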

Although the challenges ahead for space-based solar power are enormous in both economic and technical terms, the results could be revolutionary. Much as the introduction of coal and oil did, practical SBSP systems would completely alter human civilization. With continuous green energy generation, SBSP could resolve our energy conflicts and allow progression to the next phase of civilization. If everything goes well, air pollution and oil spills may become things of the past.

References

  1. Sasaki, S. IEEE Spectr. 2014, 51, 46-51.
  2. EIA (U.S. Energy Information Administration). www.eia.gov/electricity/monthly (accessed Oct. 29, 2016).
  3. Seboldt, W. Acta Astronaut. 2004, 55, 389-399.
  4. Yang, Y. et al. Acta Astronaut. 2016, 121, 51-58.
  5. Wang, R. et al. IEEE Trans. Microw. Theory Tech. 2014, 62, 1080-1089.
  6. Sasaki, S. Japan Demoes Wireless Power Transmission for Space-Based Solar Farms. IEEE Spectrum [Online], March 16, 2015. http://spectrum.ieee.org/ (accessed Oct. 29, 2016).
  7. Summerer, L. et al. Concepts for wireless energy transmission via laser. European Space Agency (ESA) Advanced Concepts Team [Online], 2009. https://www.researchgate.net/profile/Leopold_Summerer/ (accessed Oct. 29, 2016).
  8. Japan Aerospace Exploration Agency. Research on Microwave Wireless Power Transmission Technology. http://www.ard.jaxa.jp/eng/research/ssps/hmi-mssps.html (accessed Oct. 29, 2016).

 


GMO: How Safe is Our Food?


For thousands of years, humans have genetically enhanced other living beings through the practice of selective breeding. Sweet corn and seedless watermelons at local grocery stores as well as purebred dogs at the park are all examples of how humans have selectively enhanced desirable traits in other living creatures. In his 1859 book On the Origin of Species, Charles Darwin discussed how selective breeding by humans had been successful in producing change over time. As technology improves, our ability to manipulate plants and other organisms by introducing new genes promises both new innovations and potential risks.

Genetically modified organisms (GMOs) are plants, animals, or microorganisms whose genetic material (DNA) has been artificially manipulated to produce a desired trait. Recombinant genetic engineering allows chosen genes, even those from unrelated species, to be transferred from one organism into another.1 Genetically modified crops are usually engineered to increase yield and to introduce resistance against disease: crops made resistant to viruses and insect pests suffer less damage, resulting in higher yields.

Genetic enhancement has moved beyond selective breeding as gene-transfer technology has become capable of directly altering genomic sequences. Using a “cut and paste” mechanism, a desired gene can be excised from a source organism with restriction enzymes and then inserted into a bacterial host using DNA ligase. Once the new gene is introduced, the cells carrying the inserted (“recombinant”) DNA can be bred into a strain that replicates and expresses the desired gene product.1 Through this process, researchers have produced insect-resistant tomatoes, corn, and potatoes. Our ability to modify crops has improved yields and nutrient content in a given environment, becoming the keystone of modern agriculture.2 Despite these positive developments, skepticism persists regarding the safety and societal impact of GMOs.
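The “cut and paste” step can be pictured as string surgery on the DNA sequence. The toy Python sketch below models just that one idea: a restriction enzyme recognizes a specific site (here EcoRI's GAATTC, which cuts between the G and the AATTC) and the gene is pasted in at the cut. The sequences are invented for illustration, and real cloning also involves sticky-end pairing and ligase chemistry that strings cannot capture.

```python
# Toy string model of restriction-enzyme "cut and paste" gene insertion.
# EcoRI recognizes GAATTC and cuts between G and AATTC; sequences below
# are invented for illustration only.
ECORI_SITE = "GAATTC"

def insert_gene(plasmid: str, gene: str, site: str = ECORI_SITE) -> str:
    """Cut the plasmid at the first recognition site and paste the gene in."""
    cut = plasmid.index(site) + 1   # +1: EcoRI cuts just after the leading G
    return plasmid[:cut] + gene + plasmid[cut:]

plasmid = "ATGCGAATTCGGTA"   # invented vector containing one EcoRI site
gene = "TTTAAACCC"           # invented "gene of interest"

recombinant = insert_gene(plasmid, gene)
print(recombinant)  # ATGCG + TTTAAACCC + AATTCGGTA
```

The key property the model does capture is specificity: the cut happens only at the recognition sequence, which is what lets researchers place a gene at a chosen location rather than at random.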

The technological advancement from selective breeding to genetic engineering has opened up a plethora of possibilities for the future of food. As scientific capabilities expand, ethics and ideals surrounding the invasive nature of the production of GMOs have given rise to concerns about safety and long-term impacts. According to the Center for Food Safety, GMO seeds are used in 90 percent of corn, soybeans, and cotton grown in the United States.2 Because GMO crops are so prevalent, any negative ecological interactions involving a GMO product could prove devastating for the environment.

While the dangers of genetic modification are still being weighed, genetic engineering has proven benefits for human health and the farming industry. Genetically modified foods maintain a longer shelf life, which allows surplus foodstuffs to be transported safely to countries without access to nutrient-rich foods. Genetic engineering has also supplemented staple crops with vital minerals and nutrients, helping fight worldwide malnutrition. Golden Rice, for example, is a genetically modified variant of rice that biosynthesizes beta-carotene, a precursor of vitamin A.3 It is intended for production and consumption in areas with a shortage of dietary vitamin A, a deficiency that kills 670,000 children each year. Despite the controversial risks, genetic engineering of crops promises to keep increasing the availability and durability of food.

References

  1. Learn.Genetics. http://learn.genetics.utah.edu/content/science/gmfoods/ (accessed Sep. 20, 2016).
  2. Fernandez-Cornejo, J.; Wechsler, S. J. Adoption of Genetically Engineered Crops in the U.S.: Recent Trends in GE Adoption. USDA ERS. https://www.ers.usda.gov/data-products/adoption-of-genetically-engineered-crops-in-the-us/recent-trends-in-ge-adoption.aspx (accessed Sep. 30, 2016).
  3. Charles, D. In A Grain Of Golden Rice, A World Of Controversy Over GMO Foods. NPR. http://www.npr.org/sections/thesalt/2013/03/07/173611461/in-a-grain-of-golden-rice-a-world-of-controversy-over-gmo-foods (accessed Sep. 24, 2016).


Green Sea Turtles: A Shell of What They Once Were


Sea turtles appear in many cultures and myths, often as beloved symbols of longevity and wisdom. Yet in spite of this cultural respect, green sea turtles have gradually become endangered due to factors such as nesting-habitat loss, pollution, egg harvesting, climate change, and boat strikes. Now there’s a new, even more dangerous threat on the block: herpes. And no, it’s not the herpes you’re thinking of. This kind, known as fibropapillomatosis (FP), is much, much worse.

FP has been observed across all species of sea turtles for years, but it has recently become especially widespread among green sea turtles (Chelonia mydas). The alarming incidence of FP is exacerbating the decline of this already vulnerable population. Among green sea turtles, the number of cases of FP increased 6000% from the 1980s to the mid-1990s, with FP becoming so globally pervasive that the outbreak has been classified as “panzootic,” the animal equivalent of “pandemic.” Now, you might think, “That sounds bad, but why are these turtles dying?” In humans, herpes is unpleasant, but it is seldom life-threatening. Unfortunately, in green sea turtles, the outlook isn’t nearly as optimistic. FP causes the development of tumors on the soft tissues, the shells, and even the eyes of infected turtles. When these growths are left untreated, they can grow to immense sizes, impairing the animal's vital activities, such as breathing and swallowing. So, while the tumors aren’t directly lethal, they invite hordes of secondary infections and pathogens that ultimately result in death.

To make matters worse, treatment for FP is still in development. A landmark study identified the specific pathogen responsible for FP as Chelonid herpesvirus 5 (ChHV5), a close relative of human genital herpes.1 This discovery was the first step to a cure, but it raised an important question - how had this variant of herpesvirus become so prevalent? Until recently, the answer to that question was elusive.

Fortunately, several recent discoveries offered new explanations for FP’s rise. One study reported a significant positive correlation between serum concentrations of heavy metals and the severity of FP, as well as a significant negative correlation between serum cholesterol concentrations and FP.2 In a related finding, a team at the University of São Paulo discovered that many green sea turtles have been exposed to organochlorine compounds, which are known to have carcinogenic effects.3 Further research could potentially determine a direct causal relationship between the development of FP and exposure to heavy metals or organochlorine compounds. If such a relationship were found, projects that strive to decrease the prevalence of said compounds in the turtles’ habitats could prove effective in mitigating the spread of FP.
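The kind of analysis behind these serum studies can be sketched with a toy calculation. The numbers below are invented purely for illustration (the cited papers report their own measurements and statistics); the sketch just shows how a correlation coefficient links a contaminant concentration to a tumor-severity score.

```python
# A minimal sketch of a correlation analysis, assuming hypothetical data:
# serum copper concentration (µg/dL) vs. an FP tumor-severity score (0-3).
# None of these values come from the cited studies.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Invented example data for six turtles
copper = [40, 55, 62, 70, 85, 90]      # hypothetical serum concentrations
severity = [0, 1, 1, 2, 2, 3]          # hypothetical severity scores

print(round(pearson_r(copper, severity), 2))  # → 0.96, a strong positive correlation
```

A coefficient near +1, as here, is what “significant positive correlation” describes; the negative cholesterol correlation in the study would show up as a value near −1. Correlation alone, of course, cannot establish the causal link the article notes is still missing.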

So what’s the prognosis for the green sea turtle? Unfortunately, even knowing what we now know, it may not be good. A study by Jones et al. found that almost all of the infected turtles are juveniles, potentially creating a big problem for the population.4 Jones believes the most optimistic explanation for this trend is that current adults and hatchlings have never been exposed to the disease, so only one generation (the juveniles) has been infected. Another optimistic possibility is that once infected turtles recover from the disease, they will simply acquire immunity as adults. However, there is another, devastating possibility: all of the affected juveniles will perish before they reach adulthood, leaving only the unaffected alive and dooming the species. In a heartbreaking aside, Jones reported that FP “grows on their [the turtles’] eyes, they can't see predators, they can't catch food, so sometimes they slowly starve to death — it's not a nice thing for the turtles to experience. Severely affected turtles are quite skinny and have other pathogens affecting them – that’s why they die.”

Eradicating such a devastating disease will no doubt take many more years of specialized research, and significant efforts are needed immediately to rehabilitate the green sea turtle population. Luckily, conservation groups such as The Turtle Hospital, located in the Florida Keys, are making an active effort to save infected sea turtles. They perform surgeries that remove FP tumors, rehabilitate the turtles, and then release them back into the wild. In addition, they collaborate with universities to study the virus and educate the public on sea turtle conservation. To date, the Turtle Hospital has successfully treated and released over 1,500 sea turtles. Through the hard work of conservation organizations and researchers across the globe, we may still be able to save the green sea turtle.

References

  1. Jacobson, E. R., et al. Dis. Aquat. Organ. 1991, 12.
  2. Carneiro da Silva, C., et al. Aquat. Toxicol. 2016, 170, 42-51.
  3. Sánchez-Sarmiento, A. M., et al. J. Mar. Biol. Assoc. U. K. 2016, 1-9.
  4. Jones, K., et al. Vet. J. 2016, 212, 48-57.
  5. Borrowman, K. Electronic Theses and Dissertations. 2008.
  6. Monezi, T. A., et al. Vet. Microbiol. 2016, 186, 150-156.
  7. Herbst, L. H., et al. Dis. Aquat. Organ. 1995, 22.
  8. The Turtle Hospital. Rescue, Rehab, Release. http://www.turtlehospital.org/about-us/ (accessed Oct. 4, 2016).

 

Engineering Eden: Terraforming a Second Earth

Today’s world is faced with thousands of complex problems that seem to be insurmountable. One of the most pressing is the issue of the environment and how our over-worked planet can sustain such an ever-growing society. Our major source of energy is finite and rapidly depleting. Carbon dioxide emissions have passed the “irreversibility” threshold. Our oceans and atmosphere are polluted, and scientists predict a grim future for Mother Earth if humans do not change our wasteful ways. A future similar to the scenes of “Interstellar” or “Wall-E” is becoming increasingly less fictitious. While most of the science world is turning to alternative fuels and public activism as vehicles for change, some radical experts in climate change and astronomy suggest relocation to a different planet: Mars. The Mars rover, Curiosity, presents evidence that Mars has the building blocks of a potential human colony, such as the presence of heavy metals and nutrients nestled in its iconic red surface. This planet, roughly similar in location, temperature, and size to Earth, seems to have the groundwork to be our next home. Now we must ponder: perhaps our Earth was not meant to sustain human life for eternity. Perhaps we are living at the tail end of our time on Earth.

Colonizing Mars would be a project beyond any in human history, and the rate-limiting step of this process would be developing an atmosphere that could sustain human, animal, and plant life. The future of mankind on Mars is contingent on developing a breathable atmosphere, so humans and animals could thrive without the assistance of oxygen tanks, and vegetation could grow without the assistance of a greenhouse. The Martian atmosphere contains almost no oxygen; it is roughly 96 percent carbon dioxide. It is also only about one percent as dense as Earth’s atmosphere, so it provides no protection from the Sun’s radiation. Our atmosphere, armed with a thick layer of ozone, absorbs or deflects the majority of radiation before it hits our surface. Even if a human could breathe on the surface of Mars, he or she would die from radiation poisoning or cancer. Fascinating ways to address this have been discussed, one being mass hydrogen bombing across the entire surface of the planet, creating an atmosphere of dust and debris thick enough to block ultraviolet radiation. This feat could also be accomplished by physically harnessing nearby asteroids and catapulting them into the surface. The final popular idea is the use of mega-mirrors to capture the energy of the Sun, warming the surface enough to release greenhouse gases from deep within the soil.1

However, bioengineers have suggested another way of colonizing Mars--a way that does not require factories or asteroids or even human action for that matter. Instead, we would use genetically modified plants and algae to build the Martian atmosphere. The Defense Advanced Research Projects Agency (DARPA) is pursuing research in developing these completely new life forms.2 These life forms would not need oxygen or water to survive, but instead would synthesize a new atmosphere given the materials already on Mars. The bioengineering lab at DARPA has developed software called DTA GView, which has been called a “Google Maps of Genomes.” It acts as a library of genes, and DARPA has identified genes that could be inserted into extremophile organisms. A bacterium called Chroococcidiopsis is resistant to wide temperature changes and hypersalinity, two conditions found on Mars.3 Carnobacterium spp. have proven to thrive under low pressure and in the absence of oxygen. These two organisms could potentially be genetically engineered to live on Mars and add vital life-sustaining molecules to the atmosphere.

Other scientific developments must occur before these organisms are ready to pioneer the human future on Mars. Curiosity must send Earth more data regarding what materials are present in Mars’ soil, and we must study how to choose, build, and transport the ideal candidate to Mars. Plus, many argue that our scientific research should be focused on healing our current home instead of building a new one. If we are willing to invest the immense scientific capital required to terraform another planet, we would likely also be able to remedy the problem of pollution here on Earth. However, in such a challenging time, we must venture to new frontiers, and the bioengineers at DARPA have given us an alternative method to go where no man or woman has ever gone before.

References

  1. The Ethics of Terraforming Mars: A Review. iGEM Valencia Team, 2010, 1-12 (accessed Nov. 2, 2016).
  2. Terraforming Mars With Microbes. http://schaechter.asmblog.org/schaechter/2013/06/terraforming-mars-with-microbes.html (accessed Nov. 4, 2016).
  3. We Are Engineering the Organisms That Will Terraform Mars. http://motherboard.vice.com/read/darpa-we-are-engineering-the-organisms-that-will-terraform-mars (accessed Nov. 4, 2016).
