Trephination Timeline

  • 6500 BCE

    First evidence of the surgical procedure of trephination found in France.

  • 5000 BCE

    Evidence of the surgical procedure of trephination found in China.

  • 950 CE - 1400 CE

    Evidence of the surgical procedure of trephination found in Mesoamerica.

Subungual hematoma

A subungual hematoma is a collection of blood (hematoma) underneath a toenail or fingernail (black toenail). It can be extremely painful for an injury of its size, although otherwise it is not a serious medical condition.

Subungual hematoma
Other names: Runner's toe, tennis toe, skier's toe
Subungual hematoma of a toe
Specialty: Internal medicine, podiatry
Symptoms: Discoloration of the nail, pain
Risk factors: Poorly fitting footwear; overtraining, particularly hiking and running
Treatment: Usually not needed; blood drainage or nail removal in serious cases
Prognosis: Usually self-resolving as the nail grows out

Mental Disorders

I am currently a history major and a psychology minor at the University of New Mexico (UNM). I plan to combine the two disciplines by charting mental disorders as they have been medically perceived and historically recorded through time. The first historical record of mental disorders dates back 10,000 years; however, I will focus primarily on the period from 400 B.C. to the present. Because I cannot possibly include every recorded episode, I will provide the most significant examples that were, and are, widely accepted. These works are intended to summarize and educate readers on mental disorders throughout the majority of documented history, and to explain why it is important that these areas continue to be researched and developed. This essay will provide a concise historiography of mental disorders that incorporates perceived causes, as well as treatments, as both progress through documented history.

This act of 'trephination' seems a bit screwy: during this period, if something was wrong with a person's head, they simply drilled a hole and let out the 'bad'.

The documented history of mental disorders stretches back over the last "10,000 years, with the timeline holding evident through evidence of Trephination throughout Europe," with more than "200 skulls being found from Scandinavia to the Balkans" (Pitsios, 239). This ancient technique was the first surgical operation and served numerous purposes. Its evolution through the periods advanced the technical means, knowledge, and therapeutic needs as time elapsed. "Hippocrates was the one to classify for the first time the kinds of cranial fractures and define the conditions and circumstances for carrying out trepanning" (Pitsios, 239). The historiographical challenge in charting how historians wrote about mental disorders is that they did not do so until the late 19th and early 20th centuries. I have therefore derived my historiographical information from the theories and models of behavior that were practiced at the time.

"In 400 B.C. Hippocrates suggested that mental disorders are caused by both biological and psychological attributes" (Barlow, 30). Hippocrates, also known as Hippocrates II, was a Greek physician who is considered one of the most outstanding figures in the history of medicine, as well as "the father of western medicine" (Barlow, 14). During this period, mental disorders were thought to be cured by drilling holes through specific parts of the skull, allowing the pressure to be released along with any bad spirits. However, during the "18th and 19th century, trephination operations had been rejected as a therapeutic surgical method, due to the high mortality that reached 100% at the time" (Pitsios, 240). The way mental disorders were perceived during this period may seem barbaric and brutal to us now, but at the time this method was the most advanced surgical treatment, and it served as the platform from which psychological research would spring.

In 200 A.D. Galen, a Roman physician, adopted and incorporated Hippocrates' ideas, creating a long-lasting school of thought in the field of psychopathology. He suggested that normal and abnormal behaviors are related to four bodily fluids, or humors, which needed to be maintained at specific levels for people to function 'normally'. The four humors were choleric (yellow bile, fire), melancholic (black bile, earth), sanguine (blood, air), and phlegmatic (phlegm, water). This biological tradition would continue into the 19th century. The ways in which people of this time perceived mental disorders were primitive, but progressing. Advances in thought, practice, and knowledge led thinkers of this period to build on certain aspects and discard others. Believing that mental disorders were caused purely by physiological factors, they rested on the biological tradition practiced over the previous few hundred years.

Evil or misunderstood? Individuals in this period were thought to be possessed by the supernatural rather than mentally ill.

The documented history of mental disorders seems to have taken a roughly nine-hundred-year hiatus, as the timeline does not pick up again until the 1300s. Due to the lack of attention from historians, the majority of the information in these works comes from those within the medical and psychological fields, as they were the only people documenting this topic. Mental disorders in this period were blamed on demons and witches. Superstition was widespread, and exorcisms were performed to rid victims of 'evil spirits'. In the "late Middle Ages, mental illness was not recognized as such. Instead, the mentally disturbed were accused of witchcraft" (Spanos, 417). In the late 14th century, the religious authorities in charge began supporting these superstitions due to their rising popularity in European society.

At this point in time mental disorders were not directly documented as such; however, indirectly related books on witches were published. The Malleus Maleficarum (Latin for 'The Hammer of Witches') was written by Heinrich Kramer in 1486. The book's main purpose was to refute all arguments against the existence of witchcraft and to instruct magistrates on how to identify, interrogate, and convict witches. Almost a century later, in 1580, Jean Bodin published De la démonomanie des sorciers (French for 'On the Demon-Mania of Witches'). At this point in the history of thought, then, abnormal or deviant behavior was blamed not on mental disorders but on witchcraft.

The bizarre behaviors exhibited by the mentally ill were viewed as the work of the devil; witches and demons were thought to possess the afflicted. Treatments for these mental disorders relied on religion alone to cure the individual: "exorcism, in various religious rituals were performed in an effort to rid the victim of evil spirits. Shaving the pattern of the cross in the hair on the victims head, and securing sufferers to a wall near the front of the church so that they might benefit from hearing Mass" (Barlow, 10). These superstitions carried through the next few centuries, as evil and madness were blamed on witchcraft and sorcery in the 15th century. This 'causality' even spilled across the Atlantic, as "evident by the Salem, Massachusetts witch trials in the late 17th century which resulted in the hanging deaths of numerous women" (Barlow, 10). Because many of those accused of having these powers or possessions would openly admit to "having carried out impossible acts such as flying through the air, they were deluded, and probably many were schizophrenic" (Spanos, 417).

The treatments for possession that were unrelated to religion were even stranger, at least at first glance. One treatment was to "suspend the possessed person over a pit full of poisonous snakes so that it might scare the evil spirits right out of their bodies" (Barlow, 11). Oddly enough, this tactic sometimes worked: "strange behaving individuals would suddenly come to their senses and experience relief from their symptoms, even if only temporarily" (Barlow, 11). Other treatments consisted of "dunking the person(s) in ice-cold water" as an element of shock (Barlow, 11). In the 1500s Paracelsus, a Swiss physician and astrologer, was a pioneer in several aspects of the medical revolution of the Renaissance. He proposed that it is not the devil or evil spirits that affect people's psychological functioning, but the moon and the stars. From 1400 to 1800, those thought to have mental disorders were also treated with bloodletting and leeches to rid the body of unhealthy fluids and restore its chemical balance.

The biological school of thought continued to "wax and wane during the times of Hippocrates and Galen but was reinvigorated in the 19th century because of two factors" (Barlow, 14): first, the discovery of the cause of syphilis, and second, strong support from the most influential American psychiatrist of the time, John P. Grey. Considered "the champion of the biological tradition in the United States" (Barlow, 14), Grey was the head of New York's Utica Hospital. He believed that insanity was caused by physical attributes, a view that de-emphasized psychological treatments. The fields of psychopathology, psychology, and psychiatry were now heading in a scientific direction, whereas before they had been seen as spiritual in both cause and treatment. This gave way to moral therapy.

In 1793 Philippe Pinel, a French physician, became highly influential in the development of a more humane psychological approach to the care and safekeeping of psychiatric patients, known today as moral therapy. He introduced moral therapy and implemented more humane practices in French mental institutions. "In the 19th-century mental disorders were attributed to mental or emotional stress, so patients were often treated sympathetically in a restful and hygienic environment" (Barlow, 17). This new approach was a stark departure from previous practices.

In 1848 Dorothea Dix, an American advocate for the mentally ill, lobbied state legislatures and the United States Congress, helping to create the first generation of American mental asylums and successfully campaigning for humane treatment in U.S. mental institutions. Thus far in the history of mental disorders, documentation had conveyed only how these issues were perceived and dealt with; now, for the first time, it was believed that patients needed not just treatment but also compassion. This shift in thought was a drastic break from what came before, but it would prove a vital component through to the present.

Another important figure of this period is Emil Kraepelin, one of the founding fathers of modern psychiatry, who published work on diagnosis, classifying numerous psychological disorders from a biological perspective. Before this, diagnosis, classification, and stratification had not been incorporated. Words could now be assigned to behaviors, grouping combined symptoms into separate and specific categories. This was another huge step in a progressive direction for psychologists, one that would prove lasting and impactful.

Across the globe, numerous individuals began to develop different hypotheses, theories, and new branches of the field of mental disorders. In 1900 Sigmund Freud, an Austrian neurologist and the founder of psychoanalysis (a clinical method for treating psychopathology through dialogue between a patient and a psychoanalyst), published The Interpretation of Dreams. In 1904 Ivan Pavlov, a Russian physiologist known primarily for his work on classical conditioning, received the Nobel Prize for his studies of the physiology of digestion. In the view of behavioral psychologists, those who suffered from certain mental disorders could be treated with Pavlov's classical conditioning, and practitioners of this time began attempting to alter psychoses through conditioning and extinction.

The term behaviorism was coined by John B. Watson, an American psychologist who established the psychological school of behaviorism. He advocated a change in psychology in his lecture "Psychology as the Behaviorist Views It," delivered at Columbia University in 1913. In 1920 Watson conducted his 'Little Albert' experiments on conditioned fear, which used classical conditioning. Operant conditioning, by contrast, was a term coined later by B.F. Skinner, who elaborated its principles in his 1938 book The Behavior of Organisms, in which reinforcement and shaping were used to 'correct' an individual's behavior.

A seemingly 'shocking' treatment: can electric shock therapy be beneficial in treating mental disorders?

Going slightly back in time to 1930, electric shock treatments, as well as brain surgery, began to be used to treat psychopathology. Other than psychosurgery such as trephination, ECT is the most controversial treatment for mental disorders. The ideas behind it go back even further; I have jumped around in my chronology so as not to disrupt the continuity of behaviorism. Benjamin Franklin "accidentally discovered, and then confirmed experimentally in the 1750s, that a mild electric shock produced a brief convulsion and memory loss, but otherwise did little harm." Soon after, a friend of Franklin's, a Dutch physician, proposed that this shock might serve as a treatment for depression. Going from drilling holes in people's heads through trephination to shocking people in order to 'cure' them of their mental disorders took only two millennia, and the latter would eventually gain traction.

In the 1950s mental hospitals began to incorporate the practice of electric shock therapy; however, they used it more as a tool for obedience and abuse than as a cure for mental disorders. In present-day electroconvulsive therapy (ECT), patients are anesthetized to reduce discomfort and given muscle-relaxing drugs to prevent bones from breaking during the convulsions. These seizures are said to help those who suffer from depression because ECT increases levels of serotonin, blocks stress hormones, and promotes neurogenesis in the hippocampus. In essence, it gives the individual more 'feel-good' neurotransmitters, blocks the negative chemicals, and revitalizes the parts of the brain that deal with emotion; it just took two and a half centuries to get it right (Barlow).

Finally, historiographic information on mental disorders began to be published. The International Statistical Institute adopted the first International Classification of Diseases (ICD) in 1893, and in 1952 the first edition of the Diagnostic and Statistical Manual (DSM-I) was published in the United States. As the years progressed, both have been revised and continue to expand their classifications through multiple editions: the ICD-10 is the most current globally, while the DSM-5 is used primarily in the United States. These manuals capture how mental disorders were thought about at the time of their publication; as the manuals progress, so do the ways in which people perceive mental illness.

Studies of mental disorders today debate whether these disorders are innate or learned, essentially revisiting the topic's founding question of biological versus psychological causation. One of the most studied areas in the field currently is psychopathy, with criminal offenders receiving the most attention. With them in mind, treatment programs have been developed, as well as professionally accepted methods of diagnosis. This only adds to the confusion surrounding mental disorders, given that those who have received 'treatment' for psychopathy prove more likely to re-offend. Some psychologists predict that microchips will be developed and implanted in specific locations of the brain in those diagnosed with psychopathy, allowing them to function 'normally' by changing their neural wiring. This raises moral and ethical questions about such a hypothetical practice. Everything stated in this paragraph can be cited to (Hare).

Research using Robert Hare's 20-item Psychopathy Checklist has shown that significant leaders in global society display traits of psychopathy. The highest (and worst) possible score is 40, while an individual must score at least 26 to be diagnosed as a psychopath. According to a study by Dr. Kevin Dutton at Oxford University, "Donald Trump scored slightly higher than Adolf Hitler on this scale, and, Hillary Clinton scored between Napoleon Bonaparte and Nero". These mental disorders, apparently, do not always carry a negative connotation, depending on the political and societal structures in which they exist.
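The checklist arithmetic described above can be sketched in a few lines. This is an illustrative sketch only, assuming 20 items each rated 0-2 (for a maximum of 40) and the cutoff of 26 mentioned above; the example ratings are invented, and this is in no sense Hare's actual instrument or a clinical tool.

```python
# Illustrative sketch of checklist-style scoring: 20 items, each
# rated 0, 1, or 2, summed and compared against a diagnostic cutoff.
# Invented ratings; not Hare's published instrument.

def checklist_score(ratings, cutoff=26):
    """Sum 20 item ratings (each 0-2) and compare the total to a cutoff."""
    if len(ratings) != 20:
        raise ValueError("expected 20 item ratings")
    if any(r not in (0, 1, 2) for r in ratings):
        raise ValueError("each item must be rated 0, 1, or 2")
    total = sum(ratings)  # maximum possible: 20 items x 2 = 40
    return total, total >= cutoff

total, meets_cutoff = checklist_score([2] * 10 + [1] * 5 + [0] * 5)
print(total, meets_cutoff)  # 25 False
```

The point of the sketch is simply that a total near the ceiling is rare: even rating half the items at the maximum leaves this hypothetical profile below the cutoff.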

Mental disorders throughout history and into the present continue to be researched and developed in vastly different ways; however, "the history of psychiatry teaches us to doubt it, by emphasizing the infinitely variable and fluctuating character of psychiatric entities" (Borch-Jacobsen, 19). Like the rest of the categories studied by historians, it is paradoxical and dizzyingly contradictory. We as future historians must be aware of these aspects in order for progress to ensue. "The idea that emotional reactions occur reflexively and involuntarily in response to internal and external stimuli persists in the present and continues to make possible and plausible the concept of mood disorders" (Jansson, 399).

The historiography of thought in psychology: as time continues to pass, mental disorders are thought of in numerous different ways.

(1) Barlow, Durand, and Hofmann. Abnormal Psychology: An Integrative Approach. 8th ed. Cengage Learning, 2018.

(2) Borch-Jacobsen, Mikkel. History of the Human Sciences 14, no. 2 (May 2001): 19-38.

(3) Jansson, Asa. "Mood Disorders and the Brain: Depression, Melancholia, and the Historiography of Psychiatry." Medical History 55, no. 3 (2011): 393-99. doi:10.1017/S0025727300005469.

(4) Pitsios, Theodoros, and Vasiliki Zafiri. "Cases of Trephination in Ancient Greek Skulls." International Journal of Caring Sciences 5, no. 3 (2012): 239-45.

(5) Spanos, Nicholas P. "Witchcraft in Histories of Psychiatry: A Critical Analysis and an Alternative Conceptualization." Psychological Bulletin 85, no. 2 (1978): 417-39. doi:10.1037/0033-2909.85.2.417.

Trepanation, the process of making a burr hole in the skull to access the brain, is an ancient form of a primitive craniotomy. There is widespread evidence of contributions made to this practice by ancient civilizations in Europe, Africa, and South America, where archaeologists have unearthed thousands of trepanned skulls dating back to the Neolithic period. Little is known about trepanation in China, and it is commonly believed that the Chinese used only traditional Chinese medicine and nonsurgical methods for treating brain injuries. However, a thorough analysis of the available archeological and literary evidence reveals that trepanation was widely practiced throughout China thousands of years ago. A significant number of trepanned Chinese skulls have been unearthed showing signs of healing and suggesting that patients survived after surgery. Trepanation was likely performed for therapeutic and spiritual reasons. Medical and historical works from Chinese literature contain descriptions of primitive neurosurgical procedures, including stories of surgeons, such as the legendary Hua Tuo, and surgical techniques used for the treatment of brain pathologies. The lack of translation of Chinese reports into the English language and the lack of publications on this topic in the English language may have contributed to the misconception that ancient China was devoid of trepanation. This article summarizes the available evidence attesting to the performance of successful primitive cranial surgery in ancient China.

Conflict of interest statement: The authors declare that the article content was composed in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

A brief reflection on the not-so-brief history of the lobotomy

The slow rise and rapid fall of the lobotomy can give us pause to wonder: In the future, what practice of today will be looked on with horror?

“Each physician has a different nature. One believes in the principle: primum non nocere (do no harm). The other says: melius anceps remedium quam nullum (better a dangerous remedy than nothing). I lean toward the second category.” —Gottlieb Burckhardt, the father of psychosurgery (1891)[1]

Psychosurgery—an ill-defined combination of neurosurgery and psychiatry—has long been one of the most controversial fields in medicine. It has captivated the minds of both the physician and the philosopher alike, having a complicated history of medical uncertainty and ethical divide. Perhaps one of the most familiar terms within the field of psychosurgery is lobotomy—a word that has been broadly used to describe various procedures such as the leucotomy, the topectomy, and the neuroinjection of different sclerosing agents.[2]

Origins of psychosurgery
The origins of psychosurgery can be traced back to antiquity, with evidence of Stone Age craniotomies dating as far back as 5100 BCE.[3] Archaeological findings suggest that prehistoric shamans could access the brain through trephination, a process that involves the drilling or incision of a hole in the skull using a bladed surgical tool.[4] Trephination has been well documented throughout early history leading into premodern times—not only in medical literature but also in certain works of visual art.[5] For example, Renaissance painter Hieronymus Bosch depicts psychosurgical trephination in one of his most popular works, The Extraction of the Stone of Madness (circa 1494). Clearly, there has been a longstanding interest in the brain–behavior relationship and the potential role for psychosurgery in manipulation of this complex connection.

The Extraction of the Stone of Madness by Hieronymus Bosch.
Reproduced with permission of the Museo Nacional del Prado, Madrid.

It wasn’t until the mid-19th century that psychosurgery took on a more familiar form, when the scientific community became interested in the hallmark neuropsychiatric case of Phineas Gage, a 25-year-old railroad worker who was speared by a rod measuring 109 cm long and 3 cm thick through his prefrontal cortex during an unfortunate workplace explosion.[6,7] To the surprise of the masses, Gage walked away from the incident without any remarkable somatic complaints, but to those who knew him well, the Gage who survived the explosion was not the Gage they had known before. Once an upstanding model citizen, he had become easily irritable, disinhibited, and extremely labile.[8] Gage’s physician followed his case closely and published the following description:

Previous to his injury, although untrained in the schools, he possessed a well-balanced mind, and was looked upon by those who knew him as a shrewd, smart businessman, very energetic and persistent in executing all his plans of operation. In this regard his mind was radically changed, so decidedly that his friends and acquaintances said he was “no longer Gage.”[9]

The case of Phineas Gage spurred an entire field of research into the specific functioning of different parts of the brain, and how this might be related to the clinicopathology of various psychiatric diseases with similarly disinhibited presentations.

Inception of lobotomy
Inspired by an emerging understanding of the frontal lobe and its undeniable force in shaping human behavior, Swiss psychiatrist Gottlieb Burckhardt was the first-known physician to translate theories about the brain–behavior connection into a targeted surgical practice. Working with a small cohort of severely schizophrenic patients who were refractory to other treatment measures, Burckhardt removed segments of a patient’s brain to treat the psychiatric disease and to change the patient, in his words, from “an excited to a quieter demented [schizophrenic] person.”[1] In his landmark research, which he reported in 1891, Burckhardt performed and documented multiple open-brain surgical procedures on six schizophrenic patients over the span of 10 years—with varying degrees of success. His results ranged from patients being successfully “quieted” by the procedure (which was the case for three of the six patients) to one patient passing away from postoperative complications.[6] While Burckhardt intended for the utility of his surgery to be “at most palliative,” his research was harshly rejected by the medical community for being highly disturbing and grossly ineffective. So, Burckhardt abandoned his research after the publication of his results, and psychosurgical exploration faded into the background for some decades.[10]

In the early 1930s, psychosurgery experienced a sudden and surprisingly swift revival. In Europe, Portuguese neurologist António Egas Moniz and his neurosurgical colleague Almeida Lima were experimenting with connections between the frontal cortices and the thalamus, and began to slowly reintroduce some of the principles of Burckhardt’s research.[11] To further refine Burckhardt’s surgical technique, the duo developed a more targeted, specific process called the leucotomy, which involved inserting a small surgical rod with a retractable wire loop (called the leucotome) into the brain. The instrument could then be used to cavitate areas of white matter, with the express intent of altering a patient’s disposition.[6,11] With a body of research that was very much in its infancy and without having produced convincing results to support their new technique, Moniz and Lima began promoting the controversial procedure across Europe with charisma and political savvy. Indeed, it was then that the lobotomy began to gain acceptance as a primary treatment for psychiatric disease—even though Moniz and Lima kept poor records of patient follow-up and even had returned some patients to asylums postoperatively, never to be seen again.[11]

As the lobotomy was popularized across Europe, the procedure was also being introduced to an eager North American medical audience. Neurologist Walter Freeman and neurosurgeon James Watts championed this migration, aiming to improve on the results of their international colleagues.[6] The duo modified the procedure so that it required nothing more than a small 1 cm burr hole that could be drilled superior to the zygomatic arch for the insertion of the leucotome. This undoubtedly made the procedure much simpler and slightly less invasive, but it still came with the inherent postoperative risks of seizure disorders, infections, and even death.[6,12] Furthermore, Freeman eventually found himself mesmerized by the work of an Italian colleague who had developed a transorbital approach to the procedure that required nothing more than a simple, ice pick–like instrument that could be tapped through the orbital bone and swept across the prefrontal cortex. He quickly and eagerly adopted this method in the late 1930s.[13]

The work of Freeman and Watts had simplified the lobotomy so much that Freeman began performing the procedure without the help of his neurosurgical colleague and without the sterile field that was often required in the operating room.[5] This served to distance Watts from the pair’s research, as he was disturbed by the crude nature of the transorbital approach and unimpressed with the substandard, nonsterile perioperative care that Freeman was providing. In time, the duo severed their ties, but Freeman continued with his passionate crusade to popularize the transorbital lobotomy throughout North America.[2] Tens of thousands of psychiatric patients underwent the procedure—with varying degrees of success—until the lack of evidence supporting the lobotomy finally caught up with Freeman and his psychosurgical colleagues.

Downfall of the lobotomy
While the rise of the lobotomy was slow and sequential, its demise seems to have happened all at once. Amid growing doubt about the procedure, Moniz was awarded the 1949 Nobel Prize for Physiology or Medicine for his earlier work with the contentious surgery. In an instant, the global medical community cast its critical eye on the research of Burckhardt, Moniz and Lima, and Freeman and Watts, and so began the downfall.[6] Critics challenged that the lobotomy did not “confer the greatest benefit to mankind”—a stated criterion for the Nobel Prize—but rather argued that it caused a more grievous harm.[14] An impressive library of antilobotomy literature was quickly formed.

It wasn’t until chlorpromazine was introduced into the psychopharmaceutical market that the lobotomy was truly depopularized. Chlorpromazine was the first psychotherapeutic drug that had been approved to treat schizophrenia with positive effect, and during its first year on the market, it was administered to an estimated 2 million patients.[15] With a safer, more reliable option now readily available for the entire medical community, the lobotomy officially fell out of favor.

This article has been peer reviewed.


1. Burckhardt G. Ueber Rindenexcisionen, als Beitrag zur operativen Therapie der Psychosen [About cortical excision, as a contribution to surgical treatment of psychosis]. Allgemeine Zeitschrift für Psychiatrie und psychisch-gerichtliche Medicin [General journal for psychiatry and mental forensic medicine]. 1891;47:463-548. German.

2. Kucharski A. History of frontal lobotomy in the United States, 1935-1955. Neurosurgery. 1984;14:765-772.

3. Alt KW, Jeunesse C, Buitrago-Téllez CH, et al. Evidence for stone age cranial surgery. Nature. 1997;387:360.

4. Rifkinson-Mann S. Cranial surgery in ancient Peru. Neurosurgery. 1988;23:411-416.

5. Faria MA. Violence, mental illness, and the brain—a brief history of psychosurgery: part 1—from trephination to lobotomy. Surg Neurol Int. 2013;4:49.

6. Mashour G, Walker E, Martuza R. Psychosurgery: Past, present, and future. Brain Res Brain Res Rev. 2005;48:409-419.

7. Ordia JI. Neurologic function seven years after crowbar impalement of the brain. Surg Neurol. 1989;32:152-155.

8. Damasio H, Grabowski T, Frank R, et al. The return of Phineas Gage: Clues about the brain from the skull of a famous patient. Science. 1994;264:1102-1105.

9. Harlow JM. Recovery from the passage of iron bar through the head. Publ Mass Med Soc. 1868;2:327-347.

10. Joanette Y, Stemmer B, Assal G, et al. From theory to practice: The unconventional contribution of Gottlieb Burckhardt to psychosurgery. Brain Lang. 1993;45:572-587.

11. Valenstein ES. Great and desperate cures: The rise and decline of psychosurgery and other radical treatments for mental illness. New York: Basic Books; 1986.

12. Freeman W, Watts JW. Prefrontal lobotomy in the treatment of mental disorders. South Med J. 1937;30:23-31.

13. Pressman JD. Sufficient promise: John F. Fulton and the origins of psychosurgery. Bull Hist Med. 1988;62:1-22.

14. Lindsten J, Ringertz N. The Nobel Prize in Physiology or Medicine, 1901-2000. 26 June 2001.

15. Feldman RP, Goodrich JT. Psychosurgery: A historical review. Neurosurgery. 2001;48:647-659.

Mr Gallea is a third-year medical student at the University of British Columbia.

Primary Care Procedures: Trephination of Subungual Hematoma

Subungual hematoma is a fairly common condition. The severe pain that results, caused by the buildup of pressure in a closed space, persists for days if the condition is not treated. However, the blood under the nail can be easily removed, and the pain almost completely relieved, by timely nail trephination. Here I describe techniques that have worked well in my practice.

Figure 1 – The subungual hematoma on this patient’s left thumb would be classed as complex, on account of the damage to the cuticle. (Courtesy of Alexander K. C. Leung, MD)

Subungual hematomas may be simple or complex. Complex hematomas are accompanied by a fracture, nail base dislocation, tissue loss, or skin laceration (Figure 1). Simple hematomas are characterized by an intact nail and nail margins with no other associated injury. 1

Although most subungual hematomas that appear simple are not accompanied by fracture, it is usually wise to obtain radiographs to be sure. However, some authorities suggest that radiographs are unnecessary in patients who exhibit no worrisome findings after the hematoma is drained. 2

When a sudden darkening appears beneath a nail following an injury, the diagnosis of subungual hematoma is fairly straightforward. If the patient has no history of significant trauma, consider other conditions that may have a similar appearance, such as subungual melanoma, subungual nevus, and Kaposi sarcoma. 1

PREPARATION FOR DRAINAGE
Nail trephination can be successfully performed up to 36 hours after injury, and possibly even later, because the blood under the nail will not coagulate during this period. 3 An underlying fracture is not considered a contraindication to nail trephination. 3

Before drainage, prepare the nail with povidone-iodine solution or alcohol. If the only procedure to be performed is trephination, local anesthesia is generally not necessary.

Some authorities have recommended removing the nail plate and repairing the nail bed for subungual hematomas that involve more than 50% of the nail. Because nail bed repair is difficult at best, and because the nail itself acts as an anatomical splint, this recommendation seems to add risk and pain with little benefit. Better data support the less invasive approach.1 If the nail base is dislocated, however, as is often the case when a crush injury involves a tuft fracture, I do remove the nail and repair the bed.


Figure 2 – An electrocautery unit such as this may be used to drain a subungual hematoma by melting a hole in the nail.

There are a variety of drainage methods. One of the techniques most commonly taught to new practitioners is to employ heat to melt a hole in the nail. A heated paper-clip tip or a portable medical electrocautery unit may be used (Figure 2). 3 Some clinicians feel that trephination accomplished through the use of heat is more painful than other methods. There is also a possibility that the heat will cause the blood to coagulate and thus limit drainage. However, I have not found this to be a problem.

At least 2 medical devices for draining subungual hematomas quickly and painlessly, and without heat, have been described. The first is a medical drill (PathFormer). 4 Although I have no experience with this device, it is reported to be quite effective and painless. The second device, a carbon-dioxide laser, has also been used to drain subungual hematomas without pain. 2 This might be a good choice for a dermatologist or primary care provider who already has one in the office. Despite their advantages, the cost of both these devices would likely be an obstacle.

My preferred method for draining a subungual hematoma is to use an 18-gauge needle as a twist drill; this method employs easily accessed equipment and is practically painless. After applying a topical antiseptic, such as povidone-iodine solution, position a hypodermic needle with the tip in the center of the hematoma and hold the hub between the index finger and thumb (Figure 3). Then roll the needle back and forth so that it slowly bores into the nail plate. Within less than a minute, blood should start to emerge from the hole. At this point the tip of the needle is within the hematoma and has not touched the sensitive nail bed (Figure 4). Continue drilling until the hole has widened sufficiently or until the first sign of discomfort from the patient (which will be a signal that the needle has touched the nail bed). Up to this point, the procedure is typically painless. The experience of draining 3 or 4 hematomas in this manner provides a good feel for that point just before the nail bed is reached. Stopping there makes for a completely painless procedure.

Figure 3 – To drain a subungual hematoma with an 18-gauge needle, hold the hub between your thumb and index fingers and position the tip in the center of the hematoma. Then roll the needle back and forth so that it slowly bores into the nail plate.

Figure 4 – When draining a subungual hematoma with a needle, try to stop just before the needle reaches the sensitive nail bed.


Figure 5 –The longitudinal ridging evident on this patient's fingernail is the result of a prior subungual hematoma with fracture.

After the hematoma has been drained, use a 4 × 4-in gauze pad to wick up as much blood as possible. One source has suggested using a capillary tube for this purpose. 2

Finally, apply a sterile dressing. Consider sending the patient home with a sterile needle to use should dried blood block the hole. Antibiotics may be prescribed but are generally unnecessary even if there is an accompanying fracture.

Be sure to warn the patient that the nail may be lost, although eventually a new one will grow to replace it. Even more important is to warn the patient that there is a 2% to 15% risk of permanent nail deformity as a result of the initial injury to the nail bed (Figure 5). 1

Simple subungual hematomas rarely require further care. Complex hematomas that are sutured or involve fractures of the distal tuft will require monitoring of wound healing, suture removal, and/or referral to an orthopedist.

A brief history of epilepsy and its therapy in the Western Hemisphere

The history of epilepsy and its treatment in the Western world dates back at least 4 millennia to the ancient civilizations of the Middle East. Past and present treatments have been empirical, usually reflecting the prevailing views of epilepsy, be they medical, theological or superstitious. Ancient physicians relied on clinical observation to distinguish between epileptic syndromes and infer their cause. Early pathophysiological theories of epilepsy correctly identified the brain as the site of the problem, but emphasized incorrect causes such as an excess of phlegm in the brain. Treatments consisted of prescribed diets or living conditions, occasional surgery such as bloodletting or skull trephination, and medicinal herbs. These treatments, often ineffective, had the intellectual advantage of being based on pathophysiological principles, unlike current, more empirical, therapies. The unfortunate but widely held view of epilepsy as being due to occult or evil influences gained adherents even in the medical world during ancient times, and the later acceptance of Christianity allowed theological interpretations of seizures as well. Magical or religious treatments were more frequently prescribed as a result, practices which persist to this day. In the Renaissance an attempt was made to view epilepsy as a manifestation of physical illness rather than a moral or occult affliction, but it was during the Enlightenment that epilepsy was viewed along more modern lines, helped by advances in anatomy and pathology and the development of chemistry, pharmacy and physiology. The idea that focal irritation may cause seizures came about from clinical and experimental work, and was supported by the successful control of seizures by the (sedative) bromides and barbiturates in the late 19th century.
The introduction of phenytoin showed that non-sedative drugs could be effective in controlling seizures as well, and the development of in vivo seizure models widened the scope of pharmaceutical agents tested for their efficacy against epilepsy. Increasing knowledge of the cellular mechanisms of epilepsy will, hopefully, allow the development and introduction of drugs with increasing specificity against seizure activity and the development of epilepsy.

The 19th and 20th Century Treatments

During the late 19th and early 20th centuries, treatments for severe depression generally weren't enough to help patients.

Desperate for relief, many people turned to lobotomies, which are surgeries to destroy the brain's prefrontal lobe. Though reputed to have a "calming" effect, lobotomies often caused personality changes, a loss of decision-making ability, poor judgment, and sometimes even death.  

Electroconvulsive therapy (ECT), which is an electrical shock applied to the scalp in order to induce a seizure, was also sometimes used for patients with depression.

In the 1950s and 60s, doctors divided depression into subtypes of "endogenous" and "neurotic" or "reactive." Endogenous depression was thought to result from genetics or some other physical defect, while the neurotic or reactive type of depression was believed to be the result of some outside problems such as a death or loss of a job.

The 1950s were an important decade in the treatment of depression because doctors noticed that a tuberculosis medication called isoniazid seemed to be helpful in treating depression in some people. Where depression treatment had previously focused only on psychotherapy, drug therapies now started to be developed and added to the mix.

In addition, new schools of thought, such as cognitive behavioral and family systems theory, emerged as alternatives to psychodynamic theory in depression treatment.

One of the first drugs to emerge for the treatment of depression was known as Tofranil (imipramine), which was then followed by a number of other medications categorized as tricyclic antidepressants (TCAs). Such drugs provided relief for many people with depression but were often accompanied by serious side effects that included weight gain, tiredness, and the potential for overdose.  

Other antidepressants later emerged, including Prozac (fluoxetine) in 1987, Zoloft (sertraline) in 1991, and Paxil (paroxetine) in 1992. These medications, known as selective serotonin reuptake inhibitors (SSRIs), target serotonin levels in the brain and usually have fewer side effects than their predecessors.

Newer antidepressant drugs that have emerged in the past couple of decades include atypical antidepressants such as Wellbutrin (bupropion), Trintellix (vortioxetine), and serotonin-norepinephrine reuptake inhibitors (SNRIs).

Trephination Timeline - History

If you have read the medical news lately, you may have seen a headline titled "Skeleton May Show Ancient Brain Surgery." This article was about an 1,800-year-old skeleton found in Veria, Greece. The skeleton was of a woman about 25 years of age who suffered severe head trauma and underwent cranial surgery; unfortunately, the evidence shows that she did not survive.

There is an interesting history of skull surgery, known as trepanation, which comes from the Greek word trypanon, meaning auger or borer. Cranial trepanation has caught the interest of surgeons and archaeologists since the 1860s, when it was first realized that ancient humans had scraped or cut holes in the skulls of living persons in France and Peru.

Trepanation is a serious surgical procedure even in this day and age; could it have taken place as a routine operation as long ago as 2000 BC? We do have a historical record of thousands of skulls with evidence of this surgery. Sometimes historical records suggest a reality that we find hard to accept.

Maybe the romantic in us wants to believe that our ancestors could accomplish this, but logic tells us that they didn't have the technology or medical understanding to perform this surgery. They must have done it on dying or dead patients; that would be the logical answer. Yet historical evidence proves beyond any doubt that patients not only were alive when they had cranial surgery but survived in most cases, and many endured several of these operations over a lifetime.

In studies of healing patterns after primitive trepanations, some assumptions can be made:

If there is no sign of biological activity around the surgical site, then death was almost immediate.

If there is a discrete ring of superficial osteoporosis around the wound, then it is likely that the patient lived 1 to 4 weeks postoperatively.

When the edge of the wound reaches an equilibrium, calcium is deposited, new bone forms radial striations, and eventually the edge consolidates; in such cases the patient survived several months postoperatively. (Marino p946)

Why would primitive cultures of France, nearly 4000 years ago, practice trepanation? The suggested reasons for this surgery are numerous but not substantiated. Researchers over the last century and a half have speculated that cranial surgery was done in cases of trauma from battle or accident, cranial infections, headaches, mental disease, and religious rituals. (Marino p944) Rituals involving the opening of the skull were believed to facilitate the exit of evil spirits that caused epilepsy. This seems plausible because in almost every age and culture epileptic seizures were believed to be the work of evil spirits. (Finger p915)

Some of these reasons for trepanation, though logical, do not hold up under scrutiny. There is no gender difference in the distribution of the older French skulls; if combat had caused the injuries, we would expect more males to be candidates for this procedure. Also, if war were a major cause of head injury, there would be more surgeries to the left side of the skull, where a blow from a right-handed adversary would land. (Clower p1421)

In the study of trepanation over the last one hundred and fifty years, two men stand out: Dr. Paul Broca (1824-1880) and Dr. Victor Horsley (1857-1916). Dr. Broca was not the first person to find, examine, or collect trepanned skulls, but he was the first person to understand and explain what he saw. Horsley's interest amounted to little more than a passing fancy, but his theories regarding the origins of the practice of trepanation contributed significantly to our understanding. Unfortunately, neither Broca's nor Horsley's theories have withstood the test of time.

The theories of Broca and Horsley remain widely cited in the anthropological and archeological literature. (Finger p911) Scientists still compare and contrast Horsley's empirical-surgical theory of trepanation with the more anthropological-medical approach chosen by Broca, who attempted to connect seizure disorders in children to supernatural events. (Finger p916) "For Broca, the major stumbling block proved to be the lack of solid evidence to prove that young people were routinely chosen for the operation." Without the age factor, his theory is more plausible.

For Horsley, the idea that the openings were above the motor cortex proved problematic. Without this feature, his notion of traumatic injury also seems more reasonable. (Finger p916) It is interesting that Horsley was one of the first researchers to conclude that the "motor cortex" is smaller than he originally thought and probably did not extend back to include the parietal lobe. Horsley's later motor cortex mapping research helped to undermine the very trepanation theory he had proposed.(Finger p915)

Horsley's general thesis, that blows to the skull with or without epilepsy might have been the initial reason trepanation was performed, is more likely. The best empirical support for the skull fracture theory comes not from French anthropological sites, but from skulls found in Peru that he did not examine. (Finger p915) Peruvian skulls have a male-to-female ratio that is approximately 4:1, about half of the skulls have facial area damage, and they have significantly more trepanations on the left side. This suggests that Peruvian physicians saw many more head injuries caused by combat among right-handed warriors. (Finger p916) Notably missing from the 20th-century scientific literature is evidence that trepanation was performed for religious, magical, or cultural reasons.

Why did these patients survive cranial surgery? In the cases of cranial surgery documented by French anthropologists, spanning 4,000 years, I have not read of a solid, defensible hypothesis. For the cases documented in Peru until 500 years ago, I have some ideas. Survival of surgery is a quality-of-life issue. The citizens of pre-Columbian Peru had a substantially higher quality of life than their counterparts in Medieval and Renaissance Europe.

Examination of Peruvian skulls by today's physicians reveals that these cranial surgeries rarely became infected, and most patients survived. Even more impressive are the skulls exhibiting successful cranioplasties (plates inserted into the trephination holes) made of silver and gold, which were placed with such skill that the bone healed around them. (Marino p942; this reference has pictures of skulls with gold cranioplasties that are well worth the trip to a medical library to see) In contrast, during the 18th century, trephination of the cranium in Europe reached a nearly 100% fatality rate. (Marino p945) Comparing the two cultures may give a clue to why the Peruvian patient's quality of life was better and therefore why he or she was more likely to survive.

If you are reading this from a North American point of view, you probably don't have a preconceived view of life in South America one thousand years ago; this is a good situation. To better understand the relative timelines and pre-Columbian empires, a short review is appropriate so as not to confuse the different cultures. Reviewing the map from north to south: the Aztecs settled in what is now central Mexico on small islands in Lake Texcoco, where they founded the city of Tenochtitlan (circa AD 1300), now Mexico City. They created a cultural and political empire during the 15th century. Farther south, the Maya controlled southern Mexico from about 50 BC until the Spanish conquest in the 16th century. The Maya empire reached its cultural and political zenith about AD 550-900. They controlled the area of southern Mexico and Honduras.

The Inca empire, which we are interested in, was by far the largest pre-Columbian state, extending from Peru to Chile and including western and central South America. This area was developed by the Chavin-Sechin (900 to 200 BC), the Huari-Tiahuanaco (750 BC to AD 1000), and the Moche-Chimú (200 BC to AD 1400) cultures. (Marino p941) During each of these periods the population reached higher levels of culture under paternal monarchs, and each of these cultures was based on agricultural socialism. (Marino p942) Historically, the Incas came late on the scene. The expansion of the Inca empire was achieved in some part by military conquest. Not all groups were brought into the realm by direct military action; many joined in alliances with the Incas as the result of peaceful overtures from the expanding state. Others joined out of fear that military intervention would result if an invitation to peaceful alliance were rejected. During this time the population detribalized and culture soared. (Marino p942) Quality of life was improving because of "wise and benevolent rulers."

Before Francisco Pizarro's conquest of the Incas, their empire was equivalent in area to France, Belgium, Holland, Italy, and Switzerland combined, measuring approximately 980,000 km2. (Marino p941) At its height the Inca empire had an estimated 12 million people in much of what is now Peru and Ecuador and large parts of Chile, Bolivia, and Argentina. At the beginning of the Renaissance (circa AD 1500) there were about 73 million people living in Europe. (Manchester p47)

It may be harder for you to understand Europe of AD 1000 to 1500; you have to abandon your high school and Hollywood version of Medieval Europe and dig deep to develop a realistic world view. With the fall of the Roman Empire, social structure and public works infrastructure collapsed as barbarian hordes overran Europe. As Europe emerged from the Dark Ages, life was not good even in the best of times for the average person.

European political institutions evolved over the centuries. Medievalism was born in the decaying ruins after the barbarian tribes had overwhelmed the Roman Empire. A new aristocracy of nomadic tribal leaders eventually became the ruling nobles of Europe. These militant lords, enriched by plunder and conquest, were not "paternal" leaders.

Cities in Europe and Peru were not related in structure or function. In Europe, people lived in walled towns for protection. In Peru, the detribalized population was united; cities were cultural and religious centers, and people lived in the surrounding countryside. The wall around a town in Europe was its first line of defense. Therefore the land within was very valuable, and not an inch of it could be wasted. The twisting streets were extremely narrow and were not paved. Doors opened directly onto streets which were filthy; urine and solid waste were simply dumped out windows. Sunlight rarely reached the ground level, because the second story of each building always extended out over the first story, and the third story extended over the second, nearly meeting the building on the other side of the street. (Manchester p48)

The walled town was not typical of Europe, though. Between 80 and 90 percent of the population lived in villages of fewer than a hundred people. These villages were fifteen or twenty miles apart, surrounded by endless forest. (Manchester p53) Unless a person was a noble or priest, his or her mental geography limited their world to what they knew. If war took a man even a short distance from his nameless village, the chances of his returning were slight, and finding his way back alone was virtually impossible. "Each hamlet was inbred, isolated, unaware of the world beyond the most familiar local landmark." (Manchester p21)

Cities in Peru did not have the cramped population and unsanitary conditions of Europe. Nor did they have the pollution-producing industries emerging in Europe. These people were engaged in the cooperative efforts of agriculture, mining, herding, and fishing. They had a rural lifestyle in small villages across the high plateaus and coastal lowlands. Their cities appear to have been cultural centers that people traveled to while living in the outlying countryside. Because even the remote mountain villages were tied to the rest of the empire by an intricate road system of approximately 20,000 km, with rapid messenger service to communicate across the empire, the pre-Columbian people had a much broader mental geography.

In Europe at the end of the Dark Ages, agriculture and transportation of foodstuffs were inefficient, and the population was never fed adequately from year to year. Famines, the Black Death, and recurring pandemics repeatedly thinned the population of Europe at least once a generation after 1347. (Manchester p5) The Peruvians demonstrated knowledge of the contagion mechanisms of typhus (which would be understood in Europe only at the beginning of the twentieth century). They fought it with isolation measures and recognized the role of body lice in its spread. It is also evident that they understood the means by which malaria, endemic on the Peruvian coast, was spread. Houses were routinely built in the high and sandy part of the valleys, outside the access radius of the mosquito vectors. (Marino p942) Tuberculosis, whose cause and spread depend essentially on poor social conditions, was not endemic in their culture; Europe was not so lucky.

There are numerous reports in historical chronicles that refer to the pharmacological wealth of South America that was used by the pre-Columbian cultures. Many of these drugs could help a patient survive trepanation. The most obvious would be drugs that could be used for anesthesia. This could have been accomplished with drugs known to be used by the Incas, such as coca, datura, or yuca. It is known that alcoholic beverages such as chicha, made of fermented corn, were given to patients, causing a relaxed or sedated state. The next most obvious drug choice would seem to be an antiseptic to prevent infection, such as Peru balsam, tannin, saponins, or cinnamic acid. These were available and used for embalming the dead, and they may have been used in surgery. It would be prudent to have a good drug to control bleeding; this could have been done with herbal extracts of Indian ratania root, pumachuca shrub, and preparations high in tannic acid. (Marino p947) Beyond surgery, a drug used then as well as today to control malaria is quinine. It is well known that they used the bark of the cinchona tree as a source of quinine to treat malaria. (Marino p943) The tragedy of the pre-Columbian historical period is the lack of written records (Marino p942), which would have provided remarkable insights into early surgeons and their medical practices.

Clower, William & Finger, Stanley, Discovering Trepanation: The Contribution of Paul Broca, Neurosurgery, Vol. 49, No. 6, p. 1417-1425, December 2001.

Finger, Stanley & Clower, William T., Victor Horsley on "Trephining in Pre-historic Times," Neurosurgery, Vol. 48, No. 4, p. 911-918, April 2001.

Manchester, William, A World Lit Only By Fire, Little, Brown and Company, Boston, 1993.

Marino, Raul & Gonzales-Portillo, Marco, Preconquest Peruvian Neurosurgeons: A Study of Inca and Pre-Columbian Trephination and the Art of Medicine in Ancient Peru, Neurosurgery, Vol. 47, No. 4, p. 940-955, October 2000.


A short history of brain research

Despite the fact that the understanding of the human brain is still in its infancy, it appears that brain surgery is one of the oldest of the practiced medical arts. Evidence of “trepanation” can be found in archaeological remains dating back to the Neolithic period – around 10,000 BC. Trepanation (also known as trepanning, trephination, trephining or burr hole) is surgery in which a hole is drilled into the skull to expose the brain.

Cave paintings from the late Stone Age suggest that people believed the practice would cure epileptic seizures, migraines and mental disorders, perhaps in the belief that the operation would allow evil spirits to escape. There is also some evidence that such surgery was undertaken to prevent blood clots forming and to remove bone fragments following a head injury.

The following list details some of the key events and discoveries that have helped shaped our understanding of the brain today:

Hippocrates, the father of modern medical ethics, wrote many texts on brain surgery. Born on the Aegean Island of Cos in 470 BC, Hippocrates was quite familiar with the clinical signs of head injuries and he was the first known person to speculate that the two halves of the brain were capable of independent processing, which he termed "mental duality".

The study of the brain suffered a setback in the seventeenth century when René Descartes, the French philosopher and founding father of modern medicine, was forced to do a deal with the Pope in order to get the bodies he needed for dissection. The Pope agreed on the understanding that Descartes would not have anything to do with the soul, mind or emotions, as those were seen as the realm of the church. Unfortunately, this agreement set the tone for Western science for the next two centuries, dividing the human experience into two distinct and separate spheres that could never overlap. Even today many people are sceptical of illnesses that are defined as being psychosomatic (illnesses where the symptoms are caused by mental processes of the sufferer).

Franz Joseph Gall, a German anatomist, founded the science of phrenology, which holds that a person’s character can be determined by reading the configuration of bumps on the skull.

As peculiar as this theory may seem, it was widely accepted at the time. At the height of the phrenology craze, some people suggested that politicians should be chosen based on the shape of their skulls while others claimed to be able to detect signs of latent delinquency in children based on the bumps on their heads.

A North American railway worker by the name of Phineas Gage suffered damage to the frontal lobe of his brain when it was pierced by a metal rod that shot through his skull during an explosion.

Although Gage survived the accident, he experienced profound mood and behaviour changes. A quiet, industrious worker before the accident, Gage became a surly, aggressive man who could not hold down a job.

This famous case, now found in countless neuroscience textbooks, was an important milestone in the study of the brain’s anatomy because it suggested that important parts of the personality reside in the frontal lobe. These findings indirectly led to the development of the procedure called lobotomy, which was based on the theory that the removal of portions of the frontal lobe could cure mental derangement and depression.

Charles Darwin published his book “The Expression of the Emotions in Man and Animals” in which he traces the origins of emotional responses and facial expressions in humans and animals, making note of the striking similarities between species. Later, in an unpublished notebook, Darwin proposes the theory that blushing is a clear indication of consciousness. He notes that of all the animals, only humans blush and claims that this is because they are the only ones capable of self-consciously imagining what others are thinking of them.

