Since the dawn of time, people of all cultures have sought to immortalize themselves and cheat death.
From mystical elixirs and occult rituals to state-of-the-art medical advancements, the science of longevity has been studied for centuries — and now it may finally be yielding results. Humans can now manage chronic diseases and hack their physical bodies to defy the effects of aging.
Understanding how this became possible requires looking back. By tracing the history of longevity, you can see which ideas have endured, which have faded, and which continue to shape the way people approach health and aging today.
Read on to learn about the perennial quest to outlast mortality, how virtually every culture across the globe has approached it, and what might actually work.
Ancient roots of the quest for longevity
Ancient civilizations sought immortality for millennia, and revisiting these early roots offers insight into today’s longevity trends.
Since life expectancy in ancient times was far lower than it is today, people often believed that conquering death would make them immortal and, by extension, divine. They spared no expense in trying to do so.
The following section examines some of these earlier efforts, as reflected in myths, legends, and cultural beliefs. They reveal how the roots of the modern longevity journey were planted, providing lessons that people have adapted and reshaped over time.
Longevity myths and legends in ancient cultures
One of the most striking and prevalent myths is the “fountain of youth.” Popularized in the 1400s and 1500s by Spanish explorers, the legend held that whoever found and drank from this magical spring would live forever or be cured of mortal illness.
In the Hindu religion, there are multiple references to a plant known as “Soma,” which was believed to grant immortality to whoever ingested it. Expanding on this, an entire branch of Ayurvedic medicine — known as “Rasayana,” or rejuvenation — aims to tackle the mysteries of aging and is meticulously detailed in countless Ayurvedic texts.
The Greeks, too, revered immortality. Ancient Greek myths frequently reference “ambrosia” and “nectar,” the food and drink of the gods that granted them immortality.
The catch, as with many Greek legends, was that anyone who ate these foods was condemned to keep doing so for eternity, lest their life force fade away.
Early medical practices aimed at prolonging life
As the mystical pursuit of immortality gave way to more practical approaches, ancient healers began to look inward — into the body, mind, and natural world — for answers to achieving long life. Across cultures, early medical traditions emerged, each offering its own philosophy of balance and preservation.
Ayurvedic medicine
Ayurvedic medicine takes a holistic approach to dhatusamya, or homeostasis. Its practitioners believed that adhering to a set of core tenets helped maintain balance among the body, mind, and spirit, which was considered essential for preserving health and preventing illness.
These comprehensive tenets were:
- Diet. Taking care of one’s diet and eating mindfully.
- Sleep. Sleeping well.
- Abhyanga. Regular oil massages.
- Panchakarma. Cleansing and detoxification therapies.
- Mindfulness. Practices such as yoga and controlled breathing.
- Self-care. Keeping regular self-care routines for morning, midday, and evening.
Notably, many of these practices are still recommended by Western physicians today, albeit from a more modern, evidence-based perspective.
Traditional Chinese medicine
Traditional Chinese medicine also offers many practices aimed at promoting longevity and minimizing age-related decline. Herbal remedies have long been used because they were believed to be potent anti-inflammatory agents that could help people live longer.
Some of these herbal remedies are:
- Ginseng
- Turmeric
- Goji
- Astragalus
- Green tea
Many studies in recent years have examined extracts from these herbs and found preliminary evidence linking their consumption to decreased mortality.
Scholars of the time also relied heavily on their limited knowledge of human anatomy and physiology to develop remedies for their patients, many of them derived from known medicinal plants.
Greek/Western medicine
Hippocrates, regarded as the father of Western medicine, was one of the first to turn away from the divine for notions of healing and immortality and to look instead toward the human body.
His philosophy centered on the four bodily “humors” — blood, phlegm, yellow bile, and black bile — whose balance was thought to preserve good health and whose imbalance led to disease. While archaic, his observations paved the way for scholars who succeeded him, such as Aristotle and Galen, to expand the exploration of human anatomy and medicine.
Primitive as they were, these observations gave early scholars considerable insight into the inner workings of the human body and how its external environment could help or harm it.
Cultural views on aging and the afterlife
Ayurveda as a whole does not view aging as a disease or a problem needing to be “fixed,” but rather as one of the natural processes of living and, eventually, dying.
Jara, or aging, is considered comparable to hunger, thirst, or sleep, and is composed of four distinct entities shaped by physical, emotional, metabolic, immune, and even psychic factors.
Similarly, the ancient Egyptians believed in spiritual rather than physical reincarnation: a person would either ascend into the afterlife if they had lived a good and moral life or cease to exist entirely if they had been wicked.
This philosophy was also fitting, given that ancient Egyptians lived comparatively short lives; it incentivized them to do good in the years they did have.
Life expectancy through history
While many schools of thought on aging existed, often intertwined with spiritual and religious beliefs, most lacked any medical or scientific knowledge.
For much of history, survival itself was a daily challenge. From predators to famines that killed by the thousands, the human race has overcome numerous obstacles to reach its current state.
These challenges shaped human life expectancy and influenced the ways people adapted in their pursuit of longer lives. This pattern is evident in the next section, spanning prehistoric, medieval, and even modern times.
Prehistoric and ancient life expectancy
Hunter-gatherer societies faced poor hygiene, heavy burdens of infection, malnutrition, and ineffective medical practices, challenges that translated into especially high infant mortality.
Even for those who made it past infancy, life expectancy hovered around 10-30 additional years, capping most lives at around 40. Interestingly, studies have found that around half of those who reached 20 lived to 60.
This eventually enabled stable multigenerational support for younger people and thus the transfer of skills and experience.
As tribes and villages coalesced into small towns and communities, people slowly learned to navigate these challenges. This newfound stability enabled communities to connect, trade, and thrive alongside one another, leading to the rise of civilization as it is known today.
Medieval era: Plagues and plateaus
People in the 14th century lived under conditions not dissimilar to those of their ancient predecessors, and life was far from easy. While written records are sparse, archeological data and historical accounts paint a grim picture, with climate conditions that often led to crop failures and famine.
Most communities subsisted on herding and farming, and relied on unclean water sources for their everyday needs. Much of the population lived in poverty with extremely limited access to formal medical care, which was rudimentary at best.
This setting was an ideal petri dish for infectious diseases to wreak havoc, which they did, including:
- Tuberculosis
- Pneumonia
- Smallpox
- Viral illnesses
- Dysentery
- Various gastrointestinal diseases
All of these were omnipresent and usually fatal for common people, whose immune systems were compromised by poverty, malnutrition, and unsanitary conditions.
It is thus no surprise that when the Bubonic Plague, also known as the Black Death, broke out, it had devastating effects on the populations of Europe and the Middle East.
While estimates vary, the Bubonic Plague is believed to have killed between 25 and 50 million people in Europe between 1346 and 1353, wiping out almost half the population of the continent and nearly a third of the population of the Middle East.
Enlightenment and early scientific breakthroughs
After the Black Death, many parts of Europe struggled to recover. Yet amid the hardships of that recovery, discoveries were made that would not bear fruit for decades or even centuries.
One example is the earliest microscope, created around 1590 by Hans and Zacharias Janssen.
This led to the discovery of the “cell,” a term coined by Robert Hooke in 1665 to describe the microstructure of cork. His contemporary, Antonie van Leeuwenhoek, developed the most powerful microscope of his time, which allowed him to describe, for the first time, microscopic structures such as bacteria, protists, and sperm cells.
The primitive state of medicine at the time did not favor patients, as most therapies relied on herbal remedies and practices such as bloodletting to “treat” illness. Doctors, or rather apothecaries, were often too expensive for the layman, so folk cures and old wives’ tales flourished.
Despite these antiquated practices, this era also gave rise to various medical discoveries and numerous theories that are still in use today.
The turning point: Modern medicine and the Industrial Revolution
During the Industrial Revolution, life expectancy began to rise for the first time in human history, thanks to advances in sanitation, medicine, and living standards. At that time, the average person could expect to live only about 32 years, whereas now someone born in 2021 might reach 70 or more!
This remarkable leap reflects how technological and social progress transformed people’s lives and extended their lifespans. Thanks largely to inventions such as the steam engine, the telegraph, and the “spinning jenny,” used for spinning cotton and wool, manufacturing became much more efficient.
This massive shift led to mass migration events from rural towns to industrial urban centers, bringing about significant socioeconomic, technological, and cultural changes.
Medicine also underwent significant changes during this period, keeping pace with the broader transformations of the Industrial Revolution. Below are some key examples of how the revolution reshaped the medical world:
Vaccination and infectious disease control
As societies advanced, one of the greatest threats to human longevity remained invisible: the spread of infectious diseases.
For centuries, outbreaks of smallpox, tuberculosis, and other deadly illnesses devastated communities despite rising living standards. The realization that prevention could be more powerful than cure marked a new chapter in humanity’s pursuit of long life, as medical innovators began to turn the tide against disease and lay the foundation for modern immunology.
Variolation
Multiple texts describe a practice known as variolation. Arguably the earliest form of vaccination, it was widely practiced in South Asia, particularly in Bengal, to immunize people against smallpox.
It involved taking material from the pustules or scabs of sick people and grinding it into powders to be inhaled, or even fashioning it into beaded amulets or bracelets, to inoculate healthy people and protect them from disease.
The practice eventually spread to Europe, where smallpox had also been ravaging the population. Although it was not a foolproof method of achieving immunity, variolation had a case-fatality rate roughly 10 times lower than smallpox itself, and even many members of the aristocracy relied on it.
Proper vaccination
In 1796, Edward Jenner performed the first successful vaccination against smallpox.
His success was the direct result of years of experiments, prompted by reports that milkmaids who had contracted cowpox appeared immune to smallpox.
Despite the criticism and scrutiny he received from the public and even his peers, Jenner’s discovery kick-started the vaccine boom of the following two centuries and led to the eventual eradication of smallpox in 1980.
Vaccine boom
Jenner’s breakthrough sparked a global wave of scientific exploration aimed at preventing other deadly diseases.
Tuberculosis (TB)
Among them was tuberculosis (TB), one of the most widespread and devastating airborne infectious diseases, with some of the earliest records of its existence dating back roughly 4,000 years to ancient Egypt. Often referred to as “consumption” or “wasting disease” due to its catastrophic effects, tuberculosis was considered a sure death sentence for most of history.
Research on a TB vaccine had been underway since the early 1900s, culminating in the Bacille Calmette-Guérin (BCG) vaccine, developed by French scientists Albert Calmette and Camille Guérin. The first vaccine to protect against tuberculosis, it remains in use today and has helped dramatically reduce global TB rates, although its efficacy was not proven in large clinical trials until many decades after its introduction.
Another remarkable achievement of this vaccine boom was the development of the polio vaccine.
Polio
Poliomyelitis is a highly infectious illness that disproportionately affects children, with symptoms ranging from a mild stomach flu to severe paralysis and even death. It is an ancient illness, with records dating back to ancient Egypt.
In the early 20th century, polio was ubiquitous in the United States, with outbreaks occurring each summer.
It was not until the efforts of Jonas Salk in the 1950s that a vaccine was developed with astounding results; the rate of paralytic polio in the United States dropped from 13.9 cases per 100,000 people in 1954 to 0.8 cases per 100,000 by 1961.
The development of the oral polio vaccine further suppressed this illness, and by 2002, the Americas, Europe, and the Western Pacific regions were declared polio-free.
Today, polio has been nearly eradicated globally, with only a handful of endemic areas remaining. Vaccination has spared an estimated 20 million people from paralysis and saved around 1.5 million lives.
Discovery of antibiotics
In 1928, penicillin, one of the very first antibiotics, was discovered in Alexander Fleming’s lab by pure chance.
Arriving at a critical time, the discovery transformed the course of medical science, saving countless lives and paving the way for further organic-based medications and, eventually, synthetic ones as well.
The first medicine discovered to have antituberculous effects was streptomycin, in 1945. The drug’s success was short-lived, however, as resistance to it quickly developed. Resistance was curbed by pairing streptomycin with a drug known as para-aminosalicylic acid, a combination that provided patients with a lasting cure.
Within a few years, the “modern” TB drugs — rifampicin, isoniazid, pyrazinamide, and ethambutol — had all been developed. These same drugs are used for TB treatment today as the RIPE regimen, usually taken in courses of 6 to 9 months.
While tuberculosis is still highly prevalent and remains the world’s top infectious killer, the WHO estimates that between 2000 and 2016, around 53 million TB-related deaths were prevented thanks to advances in TB vaccination, treatment, and detection.
Early roots of modern medicine
From the 18th century until the Industrial Revolution, medical practice remained relatively primitive in its accuracy and quality.
Many of the medical texts developed in the preceding decades were still of some use, but they were outdated, and the technology available at the time further limited the potential for discovery. Despite this, much of today’s medical practice had its foundations set during this period.
In the late 1800s, Louis Pasteur developed his groundbreaking “germ theory,” which for the first time established the link between microbes, infections, and disease.
This subsequently led to the invention of “pasteurization,” the process by which food, mainly milk and dairy products, is made safe for human consumption by being treated to remove disease-causing organisms. Pasteur is also credited with developing early versions of vaccines for diseases such as rabies and anthrax, and as a result, his contributions have revolutionized clinical medicine.
Some years later, Joseph Lister built upon these discoveries along with his own observations and applied them to surgical settings. As a result of his experiments, he developed the first antiseptic protocols for performing surgery on soldiers returning from battle.
These procedures represent the cornerstones of modern surgical antiseptic care.

The age of diagnostics and disease management
The Industrial Revolution ushered in a wave of medical discoveries and inventions, but some of the most notable breakthroughs came around the turn of the century. These were central to extending human life expectancy because they transformed how illnesses are detected, treated, and prevented.
Below are some of the key medical inventions that helped people better recognize and manage a wide range of diseases.
Innovations in medical imaging and early detection
One striking example is the advent of radiology, pioneered in 1895 by Professor Wilhelm Conrad Roentgen. Within a year of its discovery, more than a thousand papers had been published on its use in clinical imaging.
For the first time, this breakthrough allowed doctors to examine disease processes inside the living body without resorting to dissection and autopsy. It laid the foundation for later developments, such as mammography and, eventually, the CT scanners and MRIs of the 1970s, which are now indispensable for modern screening, diagnosis, and prevention.
In addition to medical imaging, discoveries were also being made in the fields of diagnostics and laboratory testing.
The centrifuge, originally used in dairies to separate cream from milk, was first put to laboratory use in 1869 to separate nucleic acids from white blood cells, and it remains instrumental in present-day diagnostics.
Such advances in laboratory medicine have been pivotal to the detection and understanding of both infectious and non-communicable diseases.
Chronic disease management and treatments
Non-communicable diseases such as heart disease, strokes, high blood pressure, cancer, and diabetes were, and still are, major sources of illness and disability.
You often hear stories of seemingly healthy people, even professional athletes, suddenly dropping dead from a heart attack or stroke. Yet the management and treatment of these diseases have advanced dramatically, particularly for diabetes and high blood pressure.
Diabetes
One striking example of progress in this field is diabetes treatment. Diabetes was long understood as a disease of “sugar loss” from the body and was often treated, ineffectively, with extreme carbohydrate-restricted diets and very grim outcomes.
The discovery of insulin in 1921 marked a breakthrough in this field and enabled the effective reduction of blood glucose levels for the first time.
In the decades that followed, newer formulations of insulin were developed, as well as oral antidiabetic agents, which provided patients with even better, more personalized control of their condition. These developments have led to significant progress in reducing diabetes-related complications and mortality.
Hypertension
Similar strides have been made in treating hypertension, also known as high blood pressure, which was considered an untreatable illness up until the 1950s.
During that period, the first antihypertensive drug trials demonstrated that medication could lower blood pressure and reduce the incidence of stroke and heart failure. A multitude of studies followed, establishing guidelines for target blood pressures and successfully reducing mortality from complications such as stroke.
Nowadays, physicians can assess a patient’s risk with remarkable accuracy by calculating scores based on laboratory investigations and medical history, often right from their phones.
For conditions that require constant monitoring, such as diabetes, patients no longer have to rely on frequent finger pricks or blood tests. Continuous glucose monitors now track blood sugar around the clock, and insulin pumps deliver insulin through a small cannula under the skin as needed.
As these two conditions are closely interrelated and often comorbid, the strides made in their management over the past hundred years have been nothing short of remarkable and essential to increasing people’s longevity.
Breakthroughs in organ replacement and transplant
Around the same time, researchers were experimenting with possible therapies for patients with advanced kidney disease. Between the 1910s and the early 1950s, hemodialysis evolved from a niche idea into a widely accepted treatment.
What used to be a morbid condition that meant almost certain death only a few decades prior had become a manageable ailment, adding significant years of life to those who might have had only months or even weeks to live without it.
In parallel, leaps and bounds were being made in organ donation and transplant medicine. While accounts of skin grafting date back to the Ebers Papyrus (1550 B.C.), the first verifiable skin transplant occurred in 1869, and the technique was used primarily to treat severe burns during World War I and World War II.
It wasn’t until 1959, when the first successful human kidney transplant was performed between fraternal twin brothers, that this fledgling field truly took off. Within a decade, Dr. Christiaan Barnard had performed the first successful human heart transplant in Cape Town, South Africa.
Together, such inventions revolutionized modern medicine and enabled the creation of more comprehensive guidelines for screening and early detection.
Within a century, medicine had evolved to perform what had seemed, only a short while earlier, pure science fiction.
Read more: A Science-Based Guide on How to Prevent Cervical Cancer
Longevity in the 21st century: Tech, data, and biohacking
With the turn of the millennium and the advent of the digital age, technological advancements have paved the way for numerous developments in the fields of personalized healthcare, laboratory testing, real-time health monitoring, and telemedicine. This has made healthcare more precise and accessible, enabling people to lead longer, healthier lives.
Take a look at some of these breakthroughs that mark remarkable progress in the human longevity journey.
Advances in lab medicine
Over the past few decades, the quest for longevity has reached new heights.
Advancements in tissue engineering have enabled researchers to develop “lab-grown” organoids and tissue cultures that mimic real organs and facilitate the safe testing of new pharmaceuticals and drug-delivery systems, with uncanny accuracy.
It has also opened the door to entirely lab-grown organs that could replace diseased ones for transplant and regenerative purposes. Within a generation, these approaches could eliminate the need for live test subjects, such as lab mice and rats, whose use has long raised ethical concerns.
Personalized healthcare
Another point of distinction is the level of personalization and customization available.
Nowadays, people have numerous ways to track their health and wellness markers, facilitated by products such as wearable devices that monitor steps, heart rate, and sleep, all of which support healthy aging.
Read more: The Impact of Stress and Lack of Sleep on Skin Aging
Beyond tracking, these technologies are paving the way for truly personalized healthcare. The data they collect allows doctors to tailor treatments and lifestyle plans to each individual’s needs. For instance, diabetes care is no longer one-size-fits-all; practitioners can now leverage continuous data from glucose monitors, diet logs, and genetic testing to recommend adjustments tailored to each patient.
Some apps, such as telemedicine and online therapy applications, also connect people directly with healthcare professionals, further streamlining consultation and follow-up for patients.
Medical tourism has also boomed in recent years, with people traveling across the globe for medical treatments, helping individuals find the specific care they desire.
This phenomenon has led to the development of health spas and wellness clinics that offer personalized care tailored to people’s needs — or even whims.
Unlike hospitals, these resorts aim to provide luxurious services for their guests, such as IV rejuvenation drips, full-body scans, plasma transfusions, and cold plunge baths. Guests often invest large sums in these services in pursuit of personal longevity, at premium prices and with unclear results.
Buyer beware
In a similar vein, many have attempted to capitalize on such advancements in an effort to achieve immortality themselves. One striking example is Bryan Johnson, a 47-year-old entrepreneur who has dedicated his life and wealth to maximizing his lifespan.
While Johnson claims to be “the healthiest person alive,” many of the therapies he tests have weak evidence and are not readily accessible to the general public. While he has received plenty of criticism for his experiments on himself and his son, Johnson seems to be steadfast in his belief that humans can “conquer death.”
This reflects a broader problem in the longevity market. Some emerging technologies and biomedical treatments are promoted as revolutionary despite limited or inconclusive proof of their effectiveness. From high-end biohacking regimens to so-called “anti-aging” products, the promise of extended youth often outpaces the science.
To this day, many fraudsters peddle products they claim will “enhance vigor” or “reverse aging.” These products are marketed in a way that capitalizes on people’s fear of aging and death, and often contain harmful substances and chemicals for the sake of making a quick buck.
These examples underscore the need for caution and critical thinking when approaching medical technologies or longevity products that lack solid scientific backing.
Societal implications of longevity
As people live longer, the knock-on effects of aging populations will pose growing challenges.
Longer lifespans will necessitate the adaptation of entire societies to an increasingly aging population, with significant implications for healthcare systems, economic structures, government policies, and personal relationships.
Discover how this effect is already evident in countries with the most pronounced longevity.
Aging populations and global health
While longevity has long been sought after, once achieved it is not all it’s cracked up to be, especially when large portions of a population attain it.
Two clear examples are Japan and South Korea, which consistently rank among the countries with the highest life expectancies in the world; Japan is home to one of the legendary “Blue Zones” in Okinawa. These regions offer a unique perspective on what it means for a large slice of the population to have “achieved” longevity.
Despite the perceived individual benefits of living longer, this longevity reverberates throughout society, especially as more and more people enter their retirement years.

The burden of aging
Almost a third of Japan’s population is elderly (65 and older), while South Korea has around 15% in that age group, a share projected to more than double over the next three decades.
These demographics place a significant burden on the rest of the population: the more people enter their retirement years, the more strain falls on the working-age population to make up the difference.
These shifts can harm the economy through higher labor costs, decreased productivity, and blunted business expansion, which can prove disastrous in the long run.
The problem is compounded by the gap between workers and pensioners, as workers’ efforts shift from generating business profits to bolstering the pensions of the elderly, who can no longer work.
An increase in aging populations also has further ripple effects on communities and, in particular, healthcare systems.
Elderly people often live with multiple comorbid chronic conditions, such as diabetes, heart disease, and neurological decline. This places significant pressure on healthcare systems and on caregivers, who must look after the elderly while also managing their own jobs and lives.
In South Korea, the situation is even more dire. The steady decline in fertility rates since the 1980s has been attributed to high living costs, insufficient childcare and parental leave support, and unfair hiring and workplace practices towards women. These factors have led many younger people to turn away from having children and even from getting married.
Quality vs quantity
These countries serve as cautionary tales about what happens when large numbers of people live long lives without the proper societal guardrails in place.
As more young people choose to have fewer children, it is only a matter of time before aging parents are left to fend for themselves, with no one to care for them, directly or indirectly.
In fact, studies have shown that these two entities — quality and quantity of life — are not separate, but rather complementary. Nurturing strong interpersonal relationships and prioritizing one’s physical and mental health are associated with lower mortality rates and higher quality of life indices.
It is therefore imperative that people shift their focus from just living longer to living better, fuller lives.
In conclusion
The human pursuit of longevity is by no means a new phenomenon. Instead, it is as old as history itself. The tools and the terminologies have been updated, but the spirit is still the same.
In many ways, humans as a species have already achieved a certain level of longevity. Humans now live for longer than they ever have, and while life expectancies have plateaued recently, the fact of the matter is that most people will live much longer than most of their ancestors did.
History shows you which paths to longevity have endured and are supported by the most substantial evidence. With these technological advancements and medical miracles being performed today, the sky truly is the limit for what humanity can achieve in the field of longevity.
Once again, this article reminds you that longevity isn’t just about adding years to life — it’s about adding life to those years. Beyond simply living longer, it’s equally important to live healthier and more fulfilling lives, nurturing not only physical well-being but also mental, spiritual, and social wellness.
If you want to see more resources on the history of longevity, check out the Longevity Science Labs. The lab uses the research of the Institute for Life Management Science to produce courses, certifications, podcasts, videos, and other tools. Visit the Longevity Science Labs today.
Photos by:
- Wellcome Collection (Wellcome Trust, United Kingdom)
- Rawpixel
- Wellcome Collection via Rawpixel
- jcomp on Freepik


