Saturday, 30 August 2014

Treating Brain Tumours With Proton Beam Therapy

Dominating the news recently has been the Ashya King case and its implications for certain therapies used to treat brain tumours. Ashya King is the five-year-old son of Brett and Naghemeh King. Whilst Ashya was being treated in a UK hospital, his parents took him from the hospital without the consent of his doctors and journeyed overseas to seek the treatment they wanted for their son. Mr and Mrs King preferred proton beam therapy - this directly targets cancer cells and is considered to have fewer side effects than conventional radiation/X-ray therapy. However, proton beam therapy is not currently available in the UK for the treatment of tumours like Ashya's - so far it has only been used to treat certain eye conditions. The government has said that proton beam therapy will be introduced for combating cancers like Ashya's by 2018. Understandably, Ashya's parents simply cannot wait that long for the treatment; it is important to intervene as early as possible when tackling a tumour as severe as Ashya's.

What is proton beam therapy?


X-rays have long been used to treat cancerous tumours. Given in sufficiently high doses, X-rays can kill tumour cells; however, what many people view as a fundamental flaw is that healthy tissue can be exposed to radiation of similar intensity. This risks healthy tissue being destroyed, which can in turn be detrimental to health. A related problem is that those receiving X-ray therapy may be given a lower dose than doctors would like, because the risk to healthy tissue compromises the treatment. Therefore many people believe proton beam therapy to be the better option, as even at high doses there is greatly reduced damage to healthy tissues (and, more importantly, the vital organs). Many patients desire proton beam therapy partly because of this.


Recalling fundamental chemistry, protons are positively charged subatomic particles with a relative charge of +1. 'Orbiting' electrons, by contrast, have a relative charge of -1, and thus the two subatomic particles attract each other. This rather basic intuition allows us to make sense of how proton beam therapy works. As charged particles such as protons are fired past other molecules or atoms, they attract the electrons which orbit those atoms. This causes the atoms to ultimately lose electrons and, as a result, become positively charged: they are now ions. This process is therefore called ionisation, and it changes the chemical properties of the atom. This is vitally important when targeting cancer cells. As the proton beam strikes the cells, molecules within them become damaged, or ionised. It matters which molecules are affected: damaging the DNA of a cancer cell, for example, destroys its functionality. As a result, key processes such as mitosis (cellular division) and DNA replication become disrupted. With a high enough dose, no matter how hard repair enzymes work, their efforts become futile. Cancer cells are less capable of repairing damage to their organelles and molecules than healthy cells, which means there is less concern for the normal cells bombarded by the proton beam. Fundamentally, the proton beam creates a selection pressure: cancer cells are more likely to be destroyed, and their numbers subsequently decline.
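In shorthand (my own notation, not the article's), a single ionising collision can be written as

$$ p^{+} + \mathrm{A} \;\longrightarrow\; p^{+} + \mathrm{A}^{+} + e^{-} $$

where a fast proton strips an electron from atom A, leaving a positive ion behind, while the proton keeps most of its energy and carries on to ionise further atoms along its track.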

Although both X-rays and proton beams aim to target cancer cells, the precision and accuracy of the proton beam is considered greater: the 'distribution of protons can be directed and deposited in tissue volumes'. Another drawback of X-rays is that, as they 'lack charge and mass', their use results in radiation being deposited 'in normal tissues near the body's surface'. This ultimately poses a risk of genetic mutation to healthy cells.

Interestingly, protons can be accelerated or decelerated as and when required - X-rays, unlike protons, cannot be 'energised to specific velocities'. This property allows physicians some control over how deeply the proton beam penetrates. As physical particles, protons slow down as they penetrate further into tissue, coming into contact with more and more electrons as they go. As the protons decelerate, they eventually stop at their designated site - the cancer cells. This is not the case with X-rays, where some radiation passes through the tumour tissue and deposits an "exit dose" in the healthy tissue beyond it before exiting the patient completely.
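To visualise why protons stop where X-rays don't, here is a deliberately crude toy model in Python (my own sketch with made-up coefficients - not clinical physics). The proton curve mimics the 'Bragg peak': energy deposition climbs as the proton slows, then cuts off at the end of its range, so there is no exit dose:

```python
import numpy as np

# Toy depth-dose curves (illustrative only -- invented coefficients,
# not clinical physics).
depth = np.linspace(0.1, 15.0, 300)   # depth into tissue, cm
proton_range = 10.0                   # assumed tumour depth, cm

# X-rays: dose falls off roughly exponentially with depth, so tissue
# beyond the tumour still receives an "exit dose".
xray_dose = np.exp(-0.05 * depth)

# Protons: energy loss per cm grows as the proton slows down (it spends
# longer near each electron), peaking sharply just before it stops --
# the Bragg peak. Beyond the stopping point the dose is zero.
residual = np.clip(proton_range - depth, 1e-3, None)
proton_dose = np.where(depth < proton_range, residual ** -0.6, 0.0)
proton_dose /= proton_dose.max()      # normalise so the peak dose is 1

beyond = depth > 12.0                 # 2 cm past the assumed tumour
print(f"X-ray dose 12 cm deep:  {xray_dose[beyond][0]:.2f}")
print(f"Proton dose 12 cm deep: {proton_dose[beyond][0]:.2f}")  # 0.00
```

The last two lines make the contrast: 2 cm beyond the assumed tumour depth, the modelled proton dose is zero, while the X-ray beam is still depositing over half of its entrance dose on the way out.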


I can sympathise with why Ashya's parents want to pursue proton beam therapy. With fewer side effects and less damage to normal tissue, it is highly likely that Ashya would enjoy 'a better quality of life during and after proton treatment'.



Credit to the BBC for their updates on the Ashya King case and to The National Association for Proton Therapy for their article 'How Proton Treatment Works'. Read more on the subject here.


Image: Patient receiving radiation therapy - BBC

Thursday, 28 August 2014

Understanding Congenital Heart Disease

To many, the heart is undoubtedly one of the most essential organs of the body, supplying all major tissues with the oxygen and glucose that allow body cells to respire - to generate ATP. Just as important, however, is how the heart develops during pregnancy, and how effective it is as an organ. Unfortunately, children are sometimes born with congenital heart defects, meaning the heart's duty to pump blood around the body sufficiently can become compromised. These conditions are relatively rare, which has made heart development a very interesting topic of study for scientists, students, doctors and specialists. Since the heart is a relatively complex organ, giving rise to our double circulatory system, rare conditions help scientists appreciate its development. More crucially, they may trigger the development of new treatments for these defects. An article in the Biological Sciences Review helped me understand what exactly we mean by congenital heart disease.

To start with, an interesting statistic you may want to bear in mind is that 'between 1% and 5% of the human population are born with structural or functional problems with their hearts'. You may consider this a small percentage, but it equates to a very large number: even 1% of the roughly 130 million babies born worldwide each year is over a million children. And a statistic isn't always representative - many congenital heart defects in infants go undetected, which means the actual percentage could be higher. Even more interesting, congenital heart disease is the 'most common non-infectious cause of child death'.

The three main types of congenital heart disease you will want to know about are:
  • Septation defects: This is where there is an error in the separation of the different parts (chambers) of the heart
  • Unilateral blocks: Defects of the heart valves
  • Routing abnormalities: Errors in connecting the chambers of the heart to the correct major blood vessels, or even a failure to connect at all.
How does the mammalian heart form? First of all, it's important to appreciate how blood is pumped around the body in a fully developed heart. Here I've included an animation which shows how this is achieved. Notice how important the valves are in controlling the flow and volume of blood through the four chambers. Credit is given to The Children's Hospital of Philadelphia for this animation.



As the heart is the first organ the embryo requires, it develops quite early, starting as a 'crescent-shaped structure at about 2 weeks of gestation'. From this, a straight tube structure takes form, which becomes a Y-shaped tube as a join forms at its centre. What is amazing is that even at this stage of development the heart beats, 'as early as day 22 of gestation'.

The next phase of development involves determining the positions of the different parts of the heart: chambers need to be positioned correctly relative to each other. This is what is known as "cardiac looping", and it involves the heart muscle bending in a very particular, genetically controlled manner. This stage of heart formation is crucial, as malformations can lead to routing problems in which blood vessels fail to attach to the correct chambers of the heart.

After this stage, the chambers need to be formed. At around 6-8 weeks into pregnancy, the atria and ventricles become separated from each other as different areas of heart tissue become distinct. At this point, valves are also formed in conjunction with the chambers. You may have heard of a 'hole in the heart' - a hole through the atrial septum - a condition which can arise at this stage of development. What is intriguing is that, despite this seemingly worrying condition, problems often don't arise until adulthood (it is asymptomatic in childhood); it is the most common congenital heart defect diagnosed in adulthood. Of course it would be better diagnosed sooner, but as mentioned, many heart defects go unnoticed during early childhood.

However, a gap between the atria isn't always a bad thing - not in foetuses, that is. Foetuses possess specialised structures in their hearts which aid development and which adults do not have. The foramen ovale is 'a small gap through which blood passes from the right atrium into the left atrium'. Why is this? A foetus is supplied with blood that has already been oxygenated by the placenta, from the mother, so there is no need to route this blood through lungs that are not yet functional. In a foetus, then, the right atrium receives oxygenated blood which can be passed into the left atrium and then the left ventricle, to be pumped to the rest of the body. This gap normally closes shortly after birth; if it fails to, this is another way a 'hole in the heart' can remain. Surgery is usually used to treat the condition.

How can we treat heart defects? In order to pursue a treatment, we must start with extensive, long-term research into the genes and signalling pathways used in heart development. Scientists have started to examine heart defects induced by specific mutations in the genetic code, in DNA. Since a single gene codes for the production of a single polypeptide, a change in phenotype can be traced back to a change in the genetic sequence, or genotype.
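To make that genotype-phenotype link concrete, here is a minimal sketch in Python (my own illustration, not from the article; only a handful of real codons are included). It shows how a single-base point mutation changes the amino acid a codon encodes - the GAG to GTG substitution used here is the real change behind sickle-cell anaemia:

```python
# Toy illustration: a single-base (point) mutation changes which amino
# acid a codon encodes, altering the polypeptide -- a change in genotype
# producing a change in phenotype. Only a few real codons are listed.
CODON_TABLE = {"GAG": "Glu", "GTG": "Val", "AAA": "Lys", "TGA": "STOP"}

def translate(dna: str) -> str:
    """Translate a DNA coding strand, three bases (one codon) at a time."""
    peptide = []
    for i in range(0, len(dna) - 2, 3):
        amino_acid = CODON_TABLE[dna[i:i + 3]]
        if amino_acid == "STOP":
            break
        peptide.append(amino_acid)
    return "-".join(peptide)

normal = "GAGAAA"   # codons GAG, AAA -> Glu-Lys
mutant = "GTGAAA"   # one base changed (A -> T): Val-Lys
print(translate(normal), "->", translate(mutant))  # Glu-Lys -> Val-Lys
```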

Scientists have introduced a method of identifying which processes regulate heart formation - this is called mutant screening. There are two possible ways of carrying out this method:


  1. 'Forward genetics' - Individuals are studied for a particular characteristic, or phenotype. Their genetic sequences are then analysed in the hope of identifying the gene responsible for a particular abnormality (a simple sketch of this logic follows after this list).
  2. 'Reverse genetics' - This is the converse approach, where the effect of a known gene on a characteristic is investigated. In mice, this could involve deliberately inducing a mutation (using a mutagen such as ENU*) or removing a gene to see whether this causes a change in development. Removing a gene means a particular protein is no longer synthesised, which is useful when looking at how the heart develops without that protein present.
*ENU = N-ethyl-N-nitrosourea (causes a point mutation)
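As a concrete (and entirely hypothetical) illustration of the forward-genetics logic, here is a small Python sketch - the individuals and variant names are invented for the example, not taken from the article:

```python
# Toy 'forward genetics' screen: start from the phenotype, then hunt for
# a variant carried by every affected individual and no unaffected one.
# Hypothetical data; variant names are invented for illustration.
individuals = [
    # (set of gene variants carried, has a heart defect?)
    ({"Nkx2.5_mut", "variant_A"}, True),
    ({"Nkx2.5_mut", "variant_B"}, True),
    ({"variant_A"},               False),
    ({"variant_B"},               False),
]

affected   = [variants for variants, defect in individuals if defect]
unaffected = [variants for variants, defect in individuals if not defect]

# Candidate genes: present in all affected, absent from all unaffected.
candidates = set.intersection(*affected) - set.union(*unaffected)
print(candidates)  # {'Nkx2.5_mut'}
```

Real screens work across whole genomes and messy inheritance patterns, but the principle is the same: let the phenotype pick out the candidate gene.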


One example stated in the article is the gene Nkx2.5 in mice: removing it caused faulty heart development, and mutations in the same gene in humans correlate with familial atrial septal ('hole in the heart') defects. Another example is the protein TBX1, encoded by a certain gene: its loss leads to certain congenital heart defects in humans.

Genetic screening for heart defects is advancing fast. It has allowed us to screen embryos for diseases, and in the future it could mean correcting abnormalities during pregnancy.

Max Brödel inspired anatomical drawing of the heart, showing ventricular and atrial chambers.



Credit is given to Katherine Powell, who writes for the Biological Sciences Review (Volume 26, Number 1), for her published article.

Wednesday, 20 August 2014

Whole Functional Organ Grown In Animal

It's always exciting to hear of new 'world firsts' in the medical world, and what caught my eye recently were the strides being made in the area of organ synthesis. An article from the BBC caught my attention: for the first time, a functional organ has been successfully grown inside an animal. Now, you may have heard before of organs being synthesised in the laboratory environment, outside a living body - but that has changed. Scientists at the University of Edinburgh encouraged a group of cells to develop into a thymus gland. The thymus is a notable part of the immune system, where T-cells mature and grow; T-cells are part of a much larger operation fighting disease and foreign microbes. The results of the study were published in the scientific journal Nature Cell Biology. The cells were implanted into mice, where they proliferated in number and organised themselves into a thymus gland.

But where did the cells come from? In fact, scientists were able to use mouse embryonic stem cells, which are pluripotent. This useful property allows scientists to re-programme the cells to develop into almost any cell type. Choosing exactly which cell type to culture is very important when developing a particular organ.

It is important to realise that this study is still in its early stages, and only with rigorous testing and further research will scientists be able to trial this technique on humans (or human tissue). Problems are likely to arise with regenerated organs, such as the risk of rejection by the patient's immune system. After all, embryonic stem cells were used - in patients, adult stem cells (ideally the patient's own) would be more desirable to avoid rejection. Additionally, scientists will need to be wary that the cells could divide uncontrollably to form a cancer.


Some have compared this study to a breakthrough last year, when a brain the size of a human foetus's was synthesised. Implanting a brain into a living body, however, proves very difficult, whereas the thymus can be seen as a simpler organ to replicate - which is why it was used in this study. For example, the thymus is essentially a mass of tissue; it isn't divided into separate chambers like the brain, and its only two main regions are the cortex and the medulla. (Wikipedia article - Thymus)


In my view, this step marks the start of a new age of regenerative medicine. Replicating one organ raises the obvious question: can we make any organ? Theoretically, yes. However, I imagine every organ poses its own problems when attempting to sculpt its shape. Organs such as the heart and the stomach have intricate contours and shapes specific to their function - the rugae on the stomach lining, for example, contribute to a larger surface area for digestion.

An interesting point in the article was made about the potential of the findings: where could a newly grown thymus be most useful in our society? In Britain, and perhaps in other parts of the world, the population is increasingly ageing. Newly grown thymus glands, or simply thymus tissue, could be used to replace those of the elderly. It is known that with old age the immune system tends to weaken, partly due to the shrinking of the thymus.

As I have mentioned, the field of regenerative medicine has advanced at a remarkable rate. Patients have already received newly grown tracheae and blood vessels. This has been achieved so far by 'seeding' patient cells onto a scaffold which slowly disintegrates over time, leaving developed tissue behind.

But is this better than organ transplantation? Dr Paolo de Coppi of Great Ormond Street Hospital suggests: "Research such as this demonstrates that organ engineering could, in the future, be a substitute for transplantation."


Credit to James Gallagher, Health Editor for the BBC, for his article, which can be read in further detail here.

Sunday, 17 August 2014

The History of Anaesthesia!

Recently I undertook some work experience at Leicester Royal Infirmary, where I learned about the roles and duties not just of doctors, but also of the many people who make up a surgical team and who accompany consultants in their work. On the first day, I remember that the consultant anaesthetist I was shadowing in the MRI scanning unit was very enthusiastic about his field. In fact, he was so committed to informing me what exactly anaesthesia is that he gave me a two-hour lecture on the history of anaesthesia! After all, MRI scans can take up to 40 minutes, which left us plenty of time to discuss aspects of medicine and how anaesthetics came about in today's medical world. I must admit it was very interesting, and I thought it would be fitting to talk a bit more about it here.

Well, fundamentally we need to understand what exactly an anaesthetic is. General anaesthetics are 'medications used to cause a loss of consciousness', according to NHS Choices. It is widely accepted that these drugs interrupt the transmission of signals along the nerves of the body. This explains why, during surgery, the presence of an endotracheal tube in the throat fails to trigger a choking reflex in the patient. Anaesthetic drugs can be administered in one of two main ways: intravenously as a liquid (through a cannula), or as a vapour through a breathing mask. Giving the anaesthetic by injection is generally quicker and 'smoother', but the vapour is a very suitable option for anyone not comfortable around needles or the idea of injection.


In today's medical world, anaesthetists have access to quite a diverse array of anaesthetic drugs, such as desflurane and isoflurane (in combination with nitrous oxide). By contrast, in the 19th century, between 1835 and 1845, there is evidence that people attempted to use anaesthetic agents, but these early formulations didn't have a significant effect on the medical field of the time. It was William Morton who administered the first ether anaesthetic - a medical breakthrough. The patient, Gilbert Abbott, received the anaesthetic on 16th October 1846 so that a lump on his jaw could be surgically removed. This took place at Massachusetts General Hospital, in Boston. After this pivotal event, ether anaesthetics were increasingly used in hospitals for surgeries including amputations and tooth extractions. For surgeons at least, this new revolution was considered a step forward. In those times, the scope of surgery was quite limited compared to what it is today - operating on the head, for example, was especially avoided. Patients were conscious, after all: awake, if you like. Strapped down to the table, many would faint at knowing their leg would simply be cut off, or at the sight of a newly sharpened knife. I'd like to share one of the terrible stories of early surgery, originally published in the New York Herald on 21st July 1841. I found this excerpt on The Royal College of Anaesthetists website.


"The case was an interesting one of a white swelling, for which the thigh was to be amputated. The patient was a youth of about fifteen, pale, thin but calm and firm. One Professor felt for the femoral artery, had the leg held up for a few moment to ensure the saving of blood, the compress part of the tourniquet was placed upon the artery and the leg held up by an assistant. The white swelling was fearful, frightful. A little wine was given to the lad; he was pale but resolute; his Father supported his head and left hand. A second Professor took the long, glittering knife, felt for the bone, thrust in the knife carefully but rapidly. The boy screamed terribly; the tears went down the Father’s cheeks. The first cut from the inside was completed, and the bloody blade of the knife issued from the quivering wound, the blood flowed by the pint, the sight was sickening; the screams terrific; the operator calm."



Reading this re-emphasises for me how fortunate we are to have access to anaesthetic drugs for surgery - especially very invasive procedures. To the great appreciation of surgeons, they could now afford to be more accurate, precise and careful when performing operations, to get the optimum result. The 'doors were opened' to many other parts of the body, such as the brain, which could now be safely operated on. But surgery improved not only with anaesthetics, but also with the introduction of sterile surgery, catalysed by Joseph Lister in the 1860s. Lister used methods of cleaning and dressing wounds with a solution of carbolic acid, which saw a decrease in patients suffering gangrene after surgery. Even the idea of surgeons wearing clean gloves is credited to Lister, as is the sterilisation of surgical instruments, much like today.
A year after the introduction of ether anaesthetics, different agents that could be inhaled were formulated - one example being chloroform, first used by James Simpson, Professor of Obstetrics in Edinburgh. However, this particular agent had a few known flaws. One surprising effect was that sudden death could occur, usually in very nervous patients (the first such deaths were seen in 1848). Another was that it could induce late-onset liver damage. Despite this, it proved quite popular due to its ease of use and effectiveness.
Soon enough, many different anaesthetic agents were being used - even cocaine became useful as a local anaesthetic from 1877! By the early 1900s, minimally toxic anaesthetic drugs were being used.
It is important to realise, however, that anaesthesia isn't only about the drug given to the patient. Whilst on my work experience, I learned that the anaesthetist in theatre has a very important duty to monitor the patient's holistic condition - their breathing and whether they are experiencing any pain being two main examples. Endotracheal tubes, placed into the mouth and descending into the windpipe, were the next invention, becoming increasingly used in the 1920s and 1930s as the techniques were perfected. Furthermore, you may be surprised to hear that intravenous methods of delivering anaesthetic didn't come about until the 1930s - these delivered the drug more smoothly and quickly, and were more pleasant for those who detested the more traditional inhalation agents. Even more progress came as muscle relaxants such as curare (actually a poison!) became more and more useful over the course of the 1940s and 1950s. Despite all these advancements, the anaesthetic considered a revolution in today's world is halothane. Much easier to use and therefore more practical, it is probably the most widely used category of anaesthetic. Since the mid-1950s, this group of agents has improved in potency and become safer after years of refinement.
With a mortality risk of less than 1 in 250,000 from undergoing anaesthesia, it is comforting to patients that anaesthesia in the modern age is considered very safe. This is indeed one area of medicine I have become very interested in recently - reading about its history is especially useful, as is knowing how it has evolved as a field over the last 150 years.


More information on what anaesthesia is can be found on the NHS Choices website here.



Full credit is given to Dr D J Wilkinson, past honorary treasurer of The Royal College of Anaesthetists, who gave permission for an article titled "The History of Anaesthesia" to be published on The Royal College of Anaesthetists website. To read in more detail about what I have talked about, you can see the original text here.