
History of Medicine


Introduction
The practice of medicine—the science and art of preventing, alleviating, and curing disease—is one of the oldest professional callings. Since ancient times, healers with varying degrees of knowledge and skills have sought to restore the health or relieve the distress of the sick and injured.
Often, that meant doing little more than offering sympathy to the patient while nature took its course. Today, however, practitioners of medicine have several millennia of medical advances on which to base their care of patients.

Physicians and Their Training
In most developed countries, physicians are the key health care providers. Through intensive schooling and training, they become highly skilled. Those who want to practice medicine must meet educational, professional, and ethical standards before their state, province, or country awards them a license to treat patients. Approaches to medical education, licensing, and practice vary throughout the world. This article focuses on physicians' training and practice within the United States.

Medical School
In 2008 there were 129 accredited medical schools granting doctor of medicine (M.D.) degrees and 25 colleges granting doctor of osteopathic medicine (D.O.) degrees in the United States. M.D.s, also known as allopathic physicians, treat disease mainly with drugs, surgery, and other well-established remedies. D.O.s, like their M.D. colleagues, are fully licensed medical physicians and surgeons who practice in most clinical specialties. Osteopathic medicine emphasizes the interrelationships of bodily structures and functions, and its practitioners receive special training in manipulative treatments of the musculoskeletal system.
All medical schools require applicants to take the Medical College Admission Test (MCAT), a four-and-one-half-hour, computer-based exam that assesses problem-solving ability, critical thinking, writing skills, and knowledge of scientific concepts. In addition to scoring well on the MCAT, medical school applicants generally must have completed undergraduate courses in biology, physics, chemistry (including organic chemistry), and English and earned a bachelor's degree.
In 2007 about 42,000 students applied to allopathic medical schools; just under 18,000 were accepted. Approximately 12,000 students applied for some 4,400 spots in osteopathic schools. Medical-school opportunities for women improved significantly in the late 20th century. In 2006, 48.8 percent of allopathic medical school graduates were women (up from 5.5 percent in 1962), and women constituted 47.5 percent of about 2,800 osteopathic graduates in 2006 (up from less than 2 percent of 427 total graduates in 1968).
Although programs aimed at increasing the diversity of the medical school student body have made some inroads, enrollment of minorities other than Asians remains far from optimal. In 2006 approximately 7.2 percent of students entering allopathic schools were black, 7.5 percent Hispanic, and 20 percent Asian; the corresponding percentages for osteopathic students were 4 percent, 4 percent, and 18 percent. Because racial and ethnic minorities were projected to constitute half of the U.S. population by 2050, and research had shown that greater diversity within the physician workforce improves access to health care for the underserved, the Association of American Medical Colleges (AAMC) and the American Medical Association (AMA) committed themselves to increasing the number of minority physicians and to raising the general cultural competence of all physicians.
During the first two years of a typical four-year curriculum, medical students concentrate on the sciences basic to medicine—anatomy, biochemistry, physiology, microbiology, pathology, pharmacology, and the behavioral sciences. They learn about normal structure and function of the human body and how those are altered by disease. They gain firsthand knowledge of the intricate human anatomy by dissecting dead bodies (cadavers). First- and second-year students also spend some of their time in a hospital observing experienced doctors and learning the fundamentals of taking a medical history and examining patients. In the third year, they gain extensive experience with patients in hospital, clinic, and office settings in internal medicine, family medicine, pediatrics, obstetrics and gynecology, surgery, and psychiatry. Fourth-year students spend most of their time in a clinical setting; they also take elective courses. The osteopathic curriculum is virtually the same as the allopathic, except that osteopathic students also learn muscle and joint manipulation.
Medical education is expensive. In 2007 the average cost of tuition, fees, and health insurance for a first-year student at a private allopathic medical school was about $40,000 and for a state resident attending a tax-supported public medical school about $22,000. Annual tuition and fees for D.O. programs ranged from $20,000 to $35,000. According to AAMC statistics, approximately 85 percent of M.D.s were graduating with significant debt (between $120,000 and $160,000). Even though physicians' eventual earnings are among the highest of any occupation, their debt was increasing at a much higher annual rate than their incomes. A 2007 AAMC report concluded that without drastic changes, significant numbers of young physicians could be faced with unmanageable debt.

Graduate Medical Education
Newly graduated doctors must complete a residency in the specialty they have chosen before they can be licensed to practice. Most find a residency through the computer-coordinated National Resident Matching Program (NRMP). In 2008 about 50,000 applicants sought about 25,000 residency positions through the match.
The residency served by a new doctor is a fast-paced medical apprenticeship. For M.D.s, this paid, on-the-job training lasts from three to seven years, depending on the specialty. Residencies in the surgical specialties are the longest—five or more years. Those in internal medicine, family medicine, pediatrics, and psychiatry are generally three years. Doctors who choose to specialize further enter post-residency fellowships (the NRMP conducts matches for fellowship positions in 34 subspecialties).
Most graduates of osteopathic colleges serve a 12-month internship followed by a two- to six-year residency in one of nearly 700 American Osteopathic Association (AOA)-approved programs. Graduates of medical schools outside the United States and Canada (both U.S. citizens and non-U.S. citizens) may apply for residencies after they have met certain qualifications and been certified by the Educational Commission for Foreign Medical Graduates. Between 2004 and 2008 about 4,300 international medical graduates per year found residencies through the NRMP.
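The "computer-coordinated" match described above is built on stable-matching algorithms of the deferred-acceptance (Gale-Shapley) family; since the late 1990s the NRMP has used the applicant-proposing Roth-Peranson variant, which also handles program quotas, couples, and partial rank lists. The short Python sketch below is a minimal, purely illustrative one-position-per-program version with hypothetical applicant and program names, not the NRMP's actual implementation.

# Minimal sketch of applicant-proposing deferred acceptance (Gale-Shapley).
# Toy version: each program has one position, and applicants rank only
# programs that appear in program_prefs. Names are hypothetical.

def deferred_acceptance(applicant_prefs, program_prefs):
    """Return a stable one-to-one matching {applicant: program}."""
    # Each program's ranking of applicants, for fast comparisons.
    rank = {p: {a: i for i, a in enumerate(prefs)}
            for p, prefs in program_prefs.items()}
    free = list(applicant_prefs)                   # applicants still seeking a spot
    next_choice = {a: 0 for a in applicant_prefs}  # index of next program to try
    held = {}                                      # program -> tentatively held applicant

    while free:
        a = free.pop()
        if next_choice[a] >= len(applicant_prefs[a]):
            continue                               # applicant exhausted their rank list
        p = applicant_prefs[a][next_choice[a]]
        next_choice[a] += 1
        if a not in rank[p]:
            free.append(a)                         # program did not rank this applicant
        elif p not in held:
            held[p] = a                            # position was open; hold tentatively
        elif rank[p][a] < rank[p][held[p]]:
            free.append(held[p])                   # program prefers the new applicant
            held[p] = a
        else:
            free.append(a)                         # program keeps its current applicant

    return {a: p for p, a in held.items()}


if __name__ == "__main__":
    applicants = {"Ana": ["City", "County"], "Ben": ["City", "County"]}
    programs = {"City": ["Ben", "Ana"], "County": ["Ana", "Ben"]}
    print(deferred_acceptance(applicants, programs))
    # -> {'Ben': 'City', 'Ana': 'County'}

On the tiny example above, the sketch matches Ben to City and Ana to County, a stable outcome in which no applicant and program would both prefer each other over their assigned partners.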

Licensing of Physicians
Each of the 50 states, the District of Columbia, and the U.S. territories has its own requirements for obtaining a medical license, its own board that grants the license, and its own laws governing the practice of medicine. Because of the wide variations in the licensing process among states, few generalizations can be made about it. Since 1994, however, all states have required M.D.s to pass the United States Medical Licensing Examination (USMLE). The first two steps of the three-step test are typically taken during the second and fourth years of medical school and the third step during residency. Most state licensing boards for osteopaths require applicants to pass the Comprehensive Osteopathic Medical Licensing Examination, a three-part test comparable to the USMLE.

Certification of Specialists
Medical specialists are not certified by their state but by the profession itself. Each major allopathic specialty has its own certification board. The American Board of Medical Specialties (ABMS) oversees 24 of those boards and assists them in developing standards to certify doctors in 146 specialties and subspecialties. Certification by an ABMS member board involves rigorous testing and peer evaluation; about 85 percent of licensed M.D.s are certified by at least one ABMS board. D.O.s gain certification from one of 18 specialty boards overseen by the AOA.

What Doctors Do
Diagnosis
Physicians do several things in the course of serving a patient. The first is to make a diagnosis, i.e., identify the exact nature of the problem. They obtain a medical history by asking pertinent questions about a patient's current and past health and illnesses, personal habits, lifestyle, family, and job or school. The history will indicate how and when the patient became sick and whether or not he or she ever previously had the same or a similar condition.
The history will reveal symptoms (abnormal body changes that patients detect) such as pain, fatigue, loss of appetite, and nausea. These are clues that help doctors decide what signs (abnormal body changes that physicians detect) they should look for. By simple inspection doctors may observe a rash or swelling of the ankles. They may discover a cyst or tumor by palpation (feeling the body tissue); detect a heart murmur by listening to the chest with a stethoscope (auscultation); or determine the presence of infection in the body through blood chemistry studies. A patient's complaints of abdominal pain, appetite loss, nausea, and vomiting (symptoms), for example, would prompt the physician to check for fever, tenderness in the lower right portion of the abdomen, and a high white blood cell count (signs), which together would point to a diagnosis of probable appendicitis. (See also diagnosis.)

Treatment
A well-trained physician selects the form of therapy best suited to correct a particular illness in a given patient. The clear choice for the patient with appendicitis would be surgery to remove the appendix. A delay in surgery could result in rupture of the appendix and serious infection. Because of the risk of infection during and after the operation, the physician is also likely to prescribe antibiotic therapy.

Increasingly, doctors are practicing “evidence-based medicine”—an approach that involves the conscientious use of results from well-controlled clinical trials in making treatment decisions. Proponents of evidence-based medicine consider it the most objective way to ensure that patients receive optimal care. In recent years, many specialties have issued detailed “clinical practice guidelines” based on the best evidence available. These compilations enable clinicians to readily choose treatments backed by solid science.
Before a physician proceeds with a treatment, he or she must (by law) convey the details of the planned procedure (including all the risks) to the patient in language the patient comprehends. The patient then must sign an “informed consent” document acknowledging that he or she is going ahead with the treatment with full awareness of the potential outcomes.

Prognosis
The prognosis is the doctor's prediction of the probable outcome of a case based on his or her knowledge of the disease, previous experiences treating it, and the health status of the patient. When appendicitis strikes an otherwise healthy child, the doctor can usually make an optimistic prognosis. If, however, an obese adult with high blood pressure develops appendicitis that is not diagnosed promptly, the potential for rupture of the appendix and infection is high. Moreover, the patient's excess weight and elevated blood pressure increase the risk of surgical complications. In this case the prognosis would be uncertain, as the doctor would not be able to predict how the patient would endure surgery, how extensive infection might be, or how long the healing process would take.

Types of Practice
Some doctors choose to go into private practice, setting up individual offices and catering to the medical needs of as many people as they can. Private practitioners are paid on a fee-for-service basis (patients' health insurance reimburses them for each service provided). Such solo practices give physicians a great deal of independence and allow close rapport with patients. At the same time, solo practitioners tend to work extremely long hours and are virtually always “on call.”
More commonly, recently licensed doctors enter a group practice in which they share support staff, patients, and certain medical responsibilities with other doctors. Group practices tend to be able to afford expensive equipment that private practitioners generally would not have. Doctors in a group may be paid on a traditional fee-for-service basis, or they may be salaried.
New doctors may take a salaried position within a health maintenance organization (HMO). HMOs provide medical treatment on a prepaid basis; enrolled patients pay a fixed monthly fee regardless of whether they need medical care that month. HMO members must choose a primary care physician, who acts as the “gatekeeper” for the organization by determining what tests and treatments patients do and do not need and when they should see a specialist. For doctors, the advantages of such a practice include regular hours of work, steady income, availability of sophisticated diagnostic equipment, and easy access to group consultation. (See also health maintenance organization.)

Levels of Care and Coverage
In the United States and many other countries physicians provide three levels of care. The primary level is the broadest and meets the complete medical needs of the vast majority of people. Primary care is most often provided by general practitioners such as family practice doctors, internists, or pediatricians, who are patients' first contact with the medical system. Secondary care is provided by a physician who acts as a consultant at the request of the primary physician; the provider of this level of care might be a radiologist, cardiologist, urologist, or dermatologist. The third level, tertiary care, is usually based at a teaching hospital or center that has the personnel, facilities, and technology for special investigations and treatments; comprehensive cancer care is an example.
The levels of care patients receive depend on their insurance coverage. The United States is the only wealthy industrialized country without some form of universal health care (coverage that extends to all citizens). In 2007, 84.7 percent of the population was covered by private or public health insurance; the rest either had no health insurance or were underinsured. The underinsured included middle- and higher-income people, half of whom went without needed care because of the cost. Massachusetts and Vermont both approved strategies for achieving universal coverage, but neither plan had been fully implemented by 2008.

Too Few Physicians?
From the late 1970s to the mid-1990s, several expert panels forecast a glut of physicians in the United States in the new millennium. Consequently, medical schools capped enrollment, and Congress limited its funding of medical residencies. The experts, however, miscalculated. Between 2000 and 2007, 15 states and 16 medical specialty boards issued reports detailing how looming or already-evident physician shortfalls would affect them. Three national reports projected that by 2020 there would be as many as 200,000 too few physicians to meet the needs of the aging population. To remedy the shortfall, medical schools began increasing enrollment. In 2007 allopathic medical schools admitted 17,800 students, the largest class ever, while first-year enrollment in osteopathic medical schools reached nearly 4,500 students (up from just under 3,000 students in 2000).
In addition to the existing and anticipated shortages, physicians were also seriously maldistributed. The ratios of physicians to population were highest in the New England and Middle Atlantic states (Massachusetts had 443 doctors per 100,000 population) and lowest in the South Central and Mountain states (Mississippi had 182 doctors per 100,000). In 2005 nearly 60 million Americans lived in government-designated health professional shortage areas (HPSAs). The National Health Service Corps (NHSC), now part of the Health Resources and Services Administration, was created in 1970 to address the physician distribution problem. The NHSC offers primary care physicians incentives, such as help with repayment of student loans, in exchange for service in HPSAs. From its inception through 2007, 27,000 NHSC-recruited doctors had served up to five million people who were uninsured or underinsured or faced language, geographic, or cultural barriers to care.

Changing Face of Medicine
Americans were not receiving their medical care only from allopathic and osteopathic physicians; since the early 1990s, surveys had shown that 30 percent to 60 percent of adult Americans were using complementary and alternative medicine (CAM). Complementary treatments are used in conjunction with conventional medicine; alternative treatments are used in place of it. Many of these unconventional therapies had gained such wide acceptance that, according to a 2008 editorial in the Journal of the American Medical Association, they had become “part of the conventional medicine armamentarium.”
The number of practitioners of CAM is unknown, and the quality and quantity of their training varies widely. They include chiropractors, naprapaths, massage therapists, Chinese herbalists, energy healers, homeopaths, acupuncturists, yoga therapists, and others; they treat back problems, allergies, arthritis, insomnia, headaches, high blood pressure, depression, and even cancer. In 1992 the U.S. National Institutes of Health, the country's premier medical research establishment, created the Office of Alternative Medicine to study therapies outside the realm of conventional medicine. The office was expanded to the National Center for Complementary and Alternative Medicine (NCCAM) in 1998. In its first decade NCCAM sponsored more than 1,200 clinical trials assessing the validity of widely used but unproven therapies.
The Internet and other communications technologies have profoundly altered the practice of medicine. They allow rapid dissemination of information, which has the potential to increase physicians' efficiency in caring for patients as well as enhance the quality of care provided. In the burgeoning field of telemedicine, physicians provide consultative services and exchange medical information via electronic communications. Doctors can send live video and high-resolution images to distant locations and examine patients who are hundreds or thousands of miles away. Telemedicine has the potential to alleviate regional inequalities in medicine and bring the advantages of urban medical centers to those with little or no access to care.
The Internet has also empowered patients, who have virtually unlimited access to medical information. While many sites provide valuable information, others may disseminate unreliable or misleading information. The National Institutes of Health recommends that consumers ask questions about the sites they visit: Who runs the site? Who pays for it? What is the basis of the information? How current is it? The National Library of Medicine offers MEDLINEplus, a resource that links the general public to sites that have been carefully reviewed.

History of Medicine in Brief
Evidence of attempts to care for the sick and injured predates written records. Skulls found in Europe and South America dating as far back as 10,000 BC have shown that the practice of trepanning, or trephining (removal of a portion of the skull bone), was not uncommon. This operation, performed by many primitive peoples, including American Indians, was probably done to release evil spirits that were thought to be the source of illness; yet, in many cases, it proved to be the medically correct thing to do. Opening the skull can relieve pressure and pain caused by brain tumors and head injuries.
Indeed, much of early medicine was closely identified with pagan religions and superstitions. Illness was attributed to angry gods or evil spirits; prayers, incantations, and other rituals were used to appease the gods or ward off demons—and thereby drive off disease. Nonetheless, the ancients did not entirely lack valid medical knowledge. In fact, through observation and experience, they acquired considerable wisdom about sickness and its prevention and relief. (See also health.)
The book of Leviticus in the Old Testament described quarantine regulations and sanitary practices that were used to prevent the spread of leprosy and bubonic plague. The ancient Romans realized the importance of sanitation to health and built sewers, systems that drained waste water from public baths, and aqueducts that provided clean water.

Egyptian Medicine
The ancient Egyptians were among the first to use certain herbs and drugs, including castor oil, senna, and opium. They also set and splinted fractured bones using techniques remarkably similar to those of modern medicine. Egyptians were reputed to be skilled diagnosticians; a medical papyrus from about 1600 BC, which is believed to be a copy of a text from about 3000 BC, described 58 sick patients, of whom 42 were given specific diagnoses. Although the Egyptians practiced mummification, which involved removing and dehydrating most of the internal organs of the dead, they apparently did not study those organs, as their anatomical knowledge was quite limited.

Greek and Roman Medicine
Hippocrates (460–375 BC), known as the “father of Western medicine,” was an admired physician and teacher who rejected the notion that disease was punishment sent by the gods; rather, he believed it had natural causes. Hippocrates put forth a doctrine that attributed health and disease to four bodily humors, or fluids: blood, black bile, yellow bile, and phlegm. He believed that the humors were well balanced in a healthy person, but various disturbances or imbalances in them caused disease. At that time, his humoral theory seemed highly scientific. In fact, doctors diagnosed and treated illnesses based on the four humors well into the 19th century.
Knowing that he could not cure most diseases, Hippocrates tended to recommend conservative measures such as exercise, rest, and cleanliness. By contrast, for fever, which he thought was caused by an excess of blood in the body, he recommended the drastic measure of bloodletting. The practice of bloodletting (or bleeding), which was thought to have many therapeutic effects, was used for more than two thousand years and undoubtedly hastened the deaths of countless patients who might otherwise have recovered.
Hippocrates is best known today for his ethical code (Hippocratic Oath), which continues to be used by the medical profession as a guide to appropriate conduct. The oath is a pledge doctors make to always use their knowledge and best judgment for the benefit of their patients and to never harm or injure those in their care.
For a brief period after Hippocrates' death, two Greek physician-scholars living in Alexandria, Herophilus and Erasistratus, performed the first known systematic dissections of human bodies. They dissected virtually every organ, including the brain, and recorded what they learned. Despite their dedication to the science of anatomy, these pioneers had little influence on the subsequent practice of medicine. By 150 BC, dissection of human cadavers was banned throughout the Hellenistic world, and any writings they left behind were lost when Alexandria's library was destroyed in the 3rd century AD.
One of those trained in Hippocratic medicine was Claudius Galen (129–circa 216 AD), a Greek who traveled widely and became the most renowned physician in Rome. Although Galen accepted and embellished the four-humors doctrine, he also made important discoveries. He performed systematic experiments on animals (including apes, monkeys, dogs, pigs, snakes, and lions), which involved both dissection and vivisection (live dissection). He treated gladiators and took advantage of the opportunity to study the internal organs and muscles of the wounded. Galen recognized connections between bodily structures and functions; for example, he demonstrated that a severed spinal cord led to paralysis. He recognized that the heart circulated blood through the arteries but did not understand that it circulated in only one direction (see blood). Galen produced a prodigious body of medical scholarship that was adhered to by medical practitioners for 1,600 years. Unfortunately, his erroneous beliefs as well as his accurate insights were perpetuated.

Arabian Medicine
After the breakup of the Roman Empire, the tradition of Greek medicine continued in the universities of the Arab world. The Persian physician Rhazes, or Ar-Razi (circa 865–923), is credited with being the first to distinguish between the highly contagious viral diseases smallpox and measles. He also recognized the need for sanitation in hospitals. Probably the most important physician at the beginning of the second millennium was the Persian Avicenna (Ibn Sina; 980–1037). His monumental Canon of Medicine, a five-volume encyclopedia of case histories and therapeutic instructions, was long considered an absolute medical authority in both Eastern and Western traditions.

Renaissance Medicine
It was not until the Renaissance that Europeans began to seek a truly scientific basis for medical knowledge instead of relying on Galen's teachings. The Flemish physician Andreas Vesalius discovered many new principles of anatomy through dissections, which he compiled in his profusely illustrated Seven Books on the Structure of the Human Body, published in 1543.
Ambroise Paré (1510–1590), a Frenchman, practiced as an army surgeon and became an expert at treating battlefield wounds. He proved that tying blood vessels was a better method of stopping profuse bleeding than cauterizing them with hot oil or a hot iron—a discovery that spared countless soldiers terrible pain and suffering.

17th- and 18th-Century Medicine
Based on painstaking observations of his own veins and study of the blood vessels of sheep, English physician William Harvey determined that blood was pumped away from the heart via the arteries and was returned to it by way of the veins. Groundbreaking as this discovery was, Harvey could not explain how blood passed from the arteries to the veins. Four years after Harvey's death in 1657, the Italian researcher Marcello Malpighi, with the aid of a microscope, identified and described the pulmonary and capillary network that connected small arteries with small veins. (See also circulatory system; Harvey, William; heart.)
The art of surgery developed in England during the 17th and 18th centuries, at a time when elsewhere in Europe operations were being performed mainly by barbers. William Cheselden, a surgeon and anatomist, was known for his swift and skillful operations; it was reported that he could perform a lithotomy (removal of a stone from the urinary bladder) in 54 seconds.
Edward Jenner, an English country physician, noticed that women who milked cows often caught cowpox (a relatively mild illness) but never got the much more virulent human disease smallpox. Based on that observation, he began developing the world's first vaccine against a virulent infectious disease. In 1796 Jenner inoculated an eight-year-old boy who had never had smallpox with material taken from cowpox lesions on the hands of a dairymaid. Several weeks later, he exposed the boy to smallpox, and the child remained healthy.

19th-Century Medicine
Before the mid-1800s surgery had to be performed without anesthesia. Patients may have been given a blow on the head, a dose of opium, or a swig of whiskey or rum—at best, minimally effective means of reducing pain. The best surgeons, therefore, were those who completed their work in the least amount of time. Early in the century British and American scientists began experimenting with two pain-numbing substances, the gas nitrous oxide and the liquid solvent ether. In 1846, before a large group of doctors at Massachusetts General Hospital in Boston, William Morton (who did not have a medical degree but had apprenticed with a dentist) demonstrated ether anesthesia in a patient undergoing surgery to remove a tumor from his neck. The resoundingly successful operation was painless for the patient. Word of this achievement spread quickly, and soon dentists and surgeons on both sides of the Atlantic were using anesthesia. In 1847 chloroform was introduced and became the anesthetic of choice.
Certainly one of the most important advances of the 19th century was the development and acceptance of the “germ theory of disease.” In the 1840s Ignaz Semmelweis, a young physician working in a hospital in Vienna, recognized that doctors who performed autopsies and then delivered babies were responsible for spreading puerperal (childbed) fever, an often deadly infection of the reproductive organs. After Semmelweis ordered doctors to wash their hands with a chlorinated lime solution before entering the maternity ward, deaths from puerperal fever plummeted (see Semmelweis, Ignaz).
French chemist and microbiologist Louis Pasteur first learned about germs by studying the fermentation of beer, wine, and milk. He went on to explore infectious diseases in farm animals and develop vaccines against anthrax in sheep, erysipelas in swine, and chicken cholera in poultry. Finally Pasteur turned his attention to rabies in humans. His crowning achievement was the development of a vaccine against the always-fatal viral infection caused by bites of rabid animals. In 1885 Pasteur was urged by doctors to give his experimental vaccine, which had only been tested in dogs, to a young boy who had been bitten more than a dozen times by a rabid dog. He administered a series of 13 daily injections of increasingly virulent material obtained from the spinal cords of rabid rabbits. The child endured the prolonged and painful treatment and made a full recovery. (See also bacteria; disease, human; Pasteur, Louis.)
German physician Robert Koch discovered that dormant anthrax spores could remain in the blood of sheep for years and, under the right conditions, develop into the infectious organisms that caused deadly anthrax outbreaks (see Koch, Robert). In 1876, when he presented his findings on the anthrax disease cycle to doctors in Breslau, Germany, an eminent pathologist commented: “I regard it as the greatest discovery ever made with bacteria and I believe that this is not the last time that this young Robert Koch will surprise and shame us by the brilliance of his investigations.” He was right. Koch went on to discover the bacteria responsible for tuberculosis (TB; 1882) and human cholera (1883) and to do groundbreaking research on leprosy, bubonic plague, and malaria.
The French military physician Charles Louis Alphonse Laveran discovered the disease-causing protozoan Plasmodium, the mosquito-borne parasite responsible for malaria. At the turn of the century, the American army doctor Walter Reed headed a team of physicians who proved that yellow fever was also transmitted by mosquitoes.
The first serious studies of mental disease were conducted during the 19th century. The French neurologist Jean-Martin Charcot used hypnosis as a tool to probe the troubled minds of mental patients. His student Sigmund Freud developed the psychoanalytic technique for treating mental illness (see Freud, Sigmund; psychoanalysis).


20th- and 21st-Century Medicine
In 1900 the average life expectancy of persons born in the United States was 47 years; by the end of the century it was 77 years. The U.S. Centers for Disease Control and Prevention (CDC) attributed 25 of those 30 additional years of life that Americans had gained to 10 momentous 20th-century public health achievements:
control of infectious diseases

immunizations

the decline in deaths from heart disease and stroke

safer and healthier foods

healthier mothers and babies

increased safety of motor vehicles

safer workplaces

family planning

fluoridation of drinking water

the recognition of tobacco use as a health hazard
Paul Ehrlich's discovery in 1910 of Salvarsan, the first drug effective against syphilis, inaugurated the era of modern antimicrobial drug therapy. The sulfa drugs, which provided strong protection against streptococci and other bacteria, were introduced in the 1930s. In 1938 Ernst Chain and Howard Florey began isolating and purifying the active substance produced by the Penicillium mold that Alexander Fleming had discovered 10 years earlier; their product, the antibiotic penicillin, is still widely used today. In 1943 Selman Waksman's laboratory discovered streptomycin, a powerful antibiotic that led to the control of TB. (See also antibiotic; Ehrlich, Paul; Fleming, Alexander.)
In the early 1920s researchers Frederick Banting and Charles Best isolated the hormone insulin, which they used to save the lives of young people with diabetes mellitus. At the time, diabetes mainly affected children and adolescents; because their bodies did not produce insulin, which the body needs to convert food into energy, they died. Shortly after this triumph, the pharmaceutical manufacturer Eli Lilly and Company began large-scale production of cow and pig insulin, which helped turn diabetes (the type now known as type 1) from a fatal into a manageable disease that allowed young people to live into adulthood. By the end of the century, type 2 diabetes (in which the body is unable to properly utilize the insulin it produces) had become a public health threat of epidemic proportions; 3.8 million people worldwide died from its complications each year. (See also Banting, Frederick; diabetes; hormones.)
The eradication of smallpox, one of the most deadly and debilitating scourges the world had ever known, represents one of the greatest accomplishments in modern medicine, science, and public health. Thanks to widespread vaccination, smallpox was eliminated from Europe, North America, Australia, and New Zealand by 1950 and from most of South and Central America by 1959. In 1967 the World Health Organization (WHO) launched a campaign to eradicate the disease that still infected up to 15 million people annually in 31 countries. Ten years later, the last case of smallpox was diagnosed in a young man in Somalia, and in 1980 WHO declared smallpox eradicated from the planet. Because humans were the only natural reservoir of the smallpox virus, once it was gone, people no longer needed to be vaccinated against it. The only remaining specimens of the virus were retained in high-security laboratories in the United States and Russia. (See also vaccine.)
A global polio eradication initiative was begun in 1988, at which time about 350,000 children in 125 countries on five continents were crippled each year by the highly contagious viral disease that attacks the nervous system. By 1999 the number of cases had been reduced by 99 percent, and by the end of 2006, only four countries—India, Nigeria, Pakistan, and Afghanistan—still had endemic polio (uninterrupted transmission of the wild polio virus). Although the campaign faced major setbacks and did not achieve the original goal of eradication by 2005, its sponsors (WHO, the United Nations Children's Fund, CDC, and Rotary International) predicted that with newly acquired tools and tactics, commitment from governments, and adequate funding, a polio-free world could be achieved by 2013.
The early 1980s saw the emergence of the deadly new disease acquired immunodeficiency syndrome (AIDS), caused by the human immunodeficiency virus (HIV), which rapidly grew into a global pandemic. Thanks to the development of life-prolonging drugs, by the mid-1990s, HIV/AIDS was no longer a death sentence in wealthy countries. In poor countries, however, the pandemic continued to wreak havoc. In 2001 more than 28 million people in sub-Saharan Africa were living with HIV/AIDS, but fewer than 40,000 had access to drug treatment. At the same time, much of Africa and many developing countries were profoundly affected by malaria and TB. The three pandemic diseases killed more than six million people every year.
In 2002 the Global Fund to Fight AIDS, Tuberculosis and Malaria was created to dramatically increase resources to combat the trio of devastating diseases. By mid-2008 the fund had distributed $6.2 billion to 136 countries. At least 1.75 million people were receiving drug treatment for HIV/AIDS, 3.9 million were receiving TB treatment, and 59 million insecticide-treated mosquito nets had been distributed to families in malaria-ridden countries. The program estimated it had saved more than 1.5 million lives.
Late in the 20th century, antimicrobial drugs employed to treat common infections were becoming increasingly ineffective, allowing the return of previously conquered diseases and the emergence of virulent new infections. By 2007 about five percent of the nine million new cases of TB in the world each year were resistant to at least two of the four standard drugs; treatment with other drugs was 200 times more expensive and had lower cure rates. In 2007 the CDC reported that the bacterium methicillin-resistant Staphylococcus aureus (MRSA) was responsible for more than 90,000 serious infections and 19,000 deaths in the United States annually. For years MRSA had been a problem in health care institutions such as hospitals, nursing homes, and dialysis centers, where it had grown increasingly resistant to commonly used antibiotics. Formerly it primarily infected people with weakened immune systems; by 2007, however, 13 percent of cases were occurring in healthy people living in the community.

In 1953 British graduate student Francis Crick and American research fellow James Watson identified the double-helix structure of DNA, a discovery that helped explain how genetic information is passed along. Exactly 50 years later, the Human Genome Project was completed. The 13-year international collaboration of more than 2,800 researchers, one of the boldest scientific undertakings in history, identified all human genes (about 22,000) and determined the sequences of the 3 billion chemical base pairs that make up human DNA. The genetic information provided by the project has enabled researchers to pinpoint errors in genes that cause or contribute to disease. In the future, having the tools to know the precise genetic make-up of individuals will enable clinicians to deliver truly personalized medicine.
As an increasing number of genetic tests have become commercially available—some of which can be lifesaving—new ethical questions have been raised about the best ways to deliver them and how the genetic information they provide should be used by insurers, employers, courts, schools, adoption agencies, and the military. In response to those concerns, in 2008 the U.S. Congress passed and President George W. Bush signed into law the Genetic Information Nondiscrimination Act, which prohibits insurance companies and employers from discriminating on the basis of information derived from genetic tests.

