World and Ukrainian Medicine in the 19th–20th Centuries
Lecture plan
I. MEDICINE IN THE 19th CENTURY
1. Development of Physiology.
2. Verification of the germ theory.
3. Discoveries in clinical medicine and anaesthesia.
4. Advances at the end of the century.
II. MEDICINE IN THE 20th CENTURY
1. Infectious diseases and chemotherapy.
2. Sulfonamide drugs, antibiotics.
3. Development of Immunology. Immunization against viral diseases.
4. Development of Endocrinology.
5. Development of Surgery in the 20th century.
Development of Physiology
By the beginning of the 19th century, the structure of the human body was almost fully known, thanks to new methods of microscopy and injection. Even the body’s microscopic structure was understood. But as important as anatomical knowledge was an understanding of physiological processes, which were rapidly being elucidated, especially in Germany. There, physiology became established as a distinct science under the guidance of Johannes Müller, who was a professor at Bonn and then at the University of Berlin. An energetic worker and an inspiring teacher, he described his discoveries in a famous textbook, Handbuch der Physiologie des Menschen (“Manual of Human Physiology”), published in the 1830s.
Among Müller’s illustrious pupils were Hermann von Helmholtz, who made significant discoveries relating to sight and hearing and who invented the ophthalmoscope; and Rudolf Virchow, one of the century’s great medical scientists, whose outstanding achievement was his conception of the cell as the centre of all pathological changes. Virchow, a pathologist and statesman, pioneered the modern concept of pathological processes by applying the cell theory to explain the effects of disease in the organs and tissues of the body. He emphasized that diseases arose not in organs or tissues in general but primarily in their individual cells. His Die Cellularpathologie, published in 1858, dealt the deathblow to the outmoded view that disease is due to an imbalance of the four humours.
In France the most brilliant physiologist of the time was Claude Bernard, whose many important discoveries were the outcome of carefully planned experiments. His researches clarified the role of the pancreas in digestion, revealed the presence of glycogen in the liver, and explained how the contraction and expansion of the blood vessels are controlled by vasomotor nerves. He proposed the concept of the internal environment—the chemical balance in and around the cells—and the importance of its stability. His Introduction à l’étude de la médecine expérimentale (1865; An Introduction to the Study of Experimental Medicine) is still worthy of study by all who undertake research.
Verification of the germ theory
Perhaps the overarching medical advance of the 19th century, certainly the most spectacular, was the conclusive demonstration that certain diseases, as well as the infection of surgical wounds, were directly caused by minute living organisms. This discovery changed the whole face of pathology and effected a complete revolution in the practice of surgery.
A 19th-century pioneer in this field, regarded by some as the founder of the parasitic theory of infection, was the Italian bacteriologist Agostino Bassi, who showed that a disease of silkworms was caused by a fungus that could be destroyed by chemical agents. In discovering that numerous diseases are caused by microorganisms, Bassi anticipated the work of Louis Pasteur by 10 years.
The main credit for establishing the science of bacteriology must be accorded to the French chemist Louis Pasteur. It was Pasteur who, by a brilliant series of experiments, proved that the fermentation of wine and the souring of milk are caused by living microorganisms. His work led to the pasteurization of milk and solved problems of agriculture and industry as well as those of animal and human diseases. He successfully employed inoculations to prevent anthrax in sheep and cattle, chicken cholera in fowl, and finally rabies in humans and dogs. The latter resulted in the widespread establishment of Pasteur institutes.
Pasteur studied alcoholic fermentation and lactic fermentation in sour milk; he found that both fermentations were caused by minute organisms, and were hastened by exposure to the air. He proved that the microscopic organisms were not spontaneously generated but were introduced by air.
After a study of the diseases of wines, which had most important practical bearings, an opportunity arose that changed the whole course of his career and profoundly influenced the development of medical science.
A disease of the silkworm had for some years been ruining one of the most important industries in France, and in 1865 the government asked Pasteur to give up his laboratory work and teaching and to devote all his energies to investigating this disease and its causes. Notwithstanding the many difficulties and obstacles the problem presented, Pasteur carried his silkworm studies to a successful conclusion.
From Pasteur, Joseph Lister derived the concepts that enabled him to introduce the antiseptic principle into surgery. In 1865 Lister, a professor of surgery at Glasgow University, began placing an antiseptic barrier of carbolic acid between the wound and the germ-containing atmosphere. Infections and deaths fell dramatically, and his pioneering work led to more refined techniques of sterilizing the surgical environment.
Although pain had been banished from the operating room after different methods of anaesthesia had been introduced, the spectre of infection still remained in pre-Listerian days. Erysipelas, pyemia, septicemia, and hospital gangrene were endemic in most surgical wards. The man who changed all this was Joseph Lister.
Lister saw that sepsis was the principal obstacle to any great advance in surgery. Noting that closed wounds did not suppurate while open ones exposed to the air did, he concluded that suppuration was in some manner due to contact with the air, but that the air alone did not cause suppuration.
He found the solution of his problem in the work of Louis Pasteur on fermentation and putrefaction; it was not the air but the germs in the air that produced suppuration. He saw at once that putrefaction could only be avoided by preventing germs from gaining access to wounds. He looked around for a suitable antiseptic, and chose carbolic acid. With it Lister made his first antiseptic dressing in March, 1865.
The case was a compound fracture of the leg, the sort of wound which previously had almost invariably become infected, often with fatal results. He washed the wound out with the carbolic solution, and applied a piece of lint soaked with the solution over it. Healing was astonishingly good and Lister was encouraged to try this method in other cases. By March 1867 he was able to report a total of eleven cases of compound fracture treated by the antiseptic method, with nine recoveries, one amputation and one death. This was an unprecedented result.
He became so obsessed with the fear that microbes might fall upon the wound during an operation that he introduced in 1870 the carbolic spray to purify the atmosphere. He clung obstinately to this practice for 17 years but finally admitted that it was superfluous.
In 1877, after an absence of 25 years, Lister returned as professor of clinical surgery at King’s College Hospital, London. He occupied the chair of surgery for 15 years.
Obstetrics had already been robbed of some of its terrors by Alexander Gordon at Aberdeen, Scotland, Oliver Wendell Holmes at Boston, and Ignaz Semmelweis at Vienna and Pest (Budapest), who advocated disinfection of the hands and clothing of midwives and medical students who attended confinements. These measures produced a marked reduction in cases of puerperal fever, the bacterial scourge of women following childbirth.
Another pioneer in bacteriology was the German physician Robert Koch, who showed how bacteria could be cultivated, isolated, and examined in the laboratory. A meticulous investigator, Koch discovered the organisms of tuberculosis, in 1882, and cholera, in 1883. By the end of the century many other disease-producing microorganisms had been identified.
Robert Koch was a prominent German bacteriologist, the founder of modern microbiology. He was born in 1843 and died in 1910. As a young doctor, Koch carried out many experiments on mice in a small laboratory. In 1882 he discovered the tuberculosis bacillus. In a report to the Berlin Physiological Society, Koch described in detail the morphology of tuberculosis bacilli and the ways to reveal them. This discovery made Koch known all over the world. In 1884 Koch published his book on cholera, which presented the investigations he had carried out during the cholera epidemics in Egypt and India. From the intestines of people with cholera Koch isolated a small comma-shaped bacterium, and he determined that these bacteria spread through drinking water. In 1905 Koch received the Nobel Prize for his important scientific discoveries.
As soon as Koch came to Alexandria, he and his two assistants, Gaffky and Fischer, began their investigations. In the blood, kidneys, spleen, liver, and lungs of people who had died of cholera Koch found many microorganisms, but none of them was the agent of cholera. In the walls of the intestines and in stools, however, Koch always found a microorganism that looked like a comma. He tried many times to grow this bacterium on gelatin, but failed; he also inoculated experimental animals with it many times, but none became ill with cholera. As the epidemic subsided in Egypt, Koch went to India to continue his investigations there. In Calcutta Koch often walked along the muddy streets where the poor lived. Once he saw some muddy water on the ground near a small house.
Koch looked into that water and thought he saw those “commas” there. He took some of the water, examined it under the microscope many times, and found the same bacteria that he had so often revealed in people with cholera. Koch also established that animals could not catch this disease. The source of the disease was the water that people drank.
DISCOVERIES IN CLINICAL MEDICINE AND ANAESTHESIA
There was perhaps some danger that in the search for bacteria other causes of disease would escape detection. Many physicians, however, were working along different lines in the 19th century. Among them were a group attached to Guy’s Hospital, in London: Richard Bright, Thomas Addison, and Sir William Gull. Bright contributed significantly to the knowledge of kidney diseases, including Bright’s disease, and Addison gave his name to disorders of the adrenal glands and the blood. Gull, a famous clinical teacher, left a legacy of pithy aphorisms that might well rank with those of Hippocrates.
In Dublin Robert Graves and William Stokes introduced new methods in clinical diagnosis and medical training, while in Paris a leading clinician, Pierre-Charles-Alexandre Louis, was attracting many students from America by the excellence of his teaching.
The most famous contribution by the United States to medical progress at this period was undoubtedly the introduction of general anaesthesia, a procedure that not only liberated the patient from the fearful pain of surgery but also enabled the surgeon to perform more extensive operations. The discovery was marred by controversy: Crawford Long, Gardner Colton, and Horace Wells were all claimants for priority.
Crawford Long, American physician, is traditionally considered the first to have used ether as an anesthetic in surgery. He observed that persons injured in “ether frolics” (social gatherings of people who were in a playful state of ether-induced intoxication) seemed to suffer no pain, and in 1842 he painlessly removed a tumour from the neck of a patient to whom he had administered ether.
Gardner Colton, American anesthetist and inventor, was among the first to utilize the anesthetic properties of nitrous oxide in medical practice. While studying medicine in New York (without taking a degree), Colton learned that the inhalation of nitrous oxide, or laughing gas, produced exhilaration. After a public demonstration of its effects in New York City proved to be a financial success, he began a lecture tour of other cities; and after a dentist suggested the use of the gas as an anesthetic, Colton safely used it in extracting thousands of teeth.
Horace Wells, American dentist, was a pioneer in the use of surgical anesthesia. While practicing in Hartford, Connecticut, in 1844, Wells noted the pain-killing properties of nitrous oxide (“laughing gas”) during a laughing-gas road show and thereafter used it in performing painless dental operations. He was allowed to demonstrate the method at the Massachusetts General Hospital in January 1845, but when the patient proved unresponsive to the gas, Wells was exposed to ridicule.
It was William Thomas Morton who, on Oct. 16, 1846, at Massachusetts General Hospital, in Boston, first demonstrated before a gathering of physicians the use of ether as a general anaesthetic. He is credited with gaining the medical world’s acceptance of surgical anesthesia. The news quickly reached Europe, and general anaesthesia soon became prevalent in surgery.
At Edinburgh, the professor of midwifery, James Young Simpson, had been experimenting upon himself and his assistants, inhaling various vapours with the object of discovering an effective anaesthetic. He was the first to use chloroform in obstetrics and the first in Britain to use ether. In November 1847 chloroform was tried with complete success, and soon it was preferred to ether and became the anaesthetic of choice.
ADVANCES AT THE END OF THE CENTURY
Patrick Manson, a British pioneer in tropical medicine, showed in China, in 1877, how insects can carry disease and how the embryos of the Filaria worm, which can cause elephantiasis, are transmitted by the mosquito. Manson explained his views to a British army surgeon, Ronald Ross, then working on the problem of malaria, and Ross discovered the malarial parasite in the stomach of the Anopheles mosquito in 1897.
In Cuba, Carlos Finlay expressed the view, in 1881, that yellow fever is carried by the Stegomyia mosquito. Following his lead, the Americans Walter Reed, William Gorgas, and others were able to conquer the scourge of yellow fever in Panama and made possible the completion of the Panama Canal by reducing the death rate there from 176 per 1,000 to 6 per 1,000.
Other victories in preventive medicine ensued, because the maintenance of health was now becoming as important a concern as the cure of disease; and the 20th century was to witness the evolution and progress of national health services in a number of countries.
In addition, spectacular advances in diagnosis and treatment followed the discovery of X rays by Wilhelm Conrad Röntgen, in 1895, and of radium by Pierre and Marie Curie in 1898. Before the turn of the century, too, the vast new field of psychiatry had been opened up by Sigmund Freud.
The tremendous increase in scientific knowledge during the 19th century radically altered and expanded the practice of medicine. Concern for upholding the quality of services led to the establishment of public and professional bodies to govern the standards for medical training and practice.
Infectious diseases and chemotherapy
In the years following the turn of the century, ongoing research concentrated on the nature of infectious diseases and their means of transmission. Increasing numbers of pathogenic organisms were discovered and classified. Some, such as the rickettsias, which cause diseases like typhus, were smaller than bacteria; some were larger, such as the protozoans that engender malaria and other tropical diseases. The smallest to be identified were the viruses, producers of many diseases, among them mumps, measles, German measles, and poliomyelitis; and in 1910 Peyton Rous showed that a virus could also cause a malignant tumour, a sarcoma in chickens.
There was still little to be done for the victims of most infectious organisms beyond drainage, poultices, and ointments, in the case of local infections, and rest and nourishment for severe diseases. The search for treatments proceeded along two lines: vaccines and chemical remedies.
Germany was well to the forefront in medical progress. The scientific approach to medicine had been developed there long before it spread to other countries, and postgraduates flocked to German medical schools from all over the world. The opening decade of the 20th century has been well described as the golden age of German medicine. Outstanding among its leaders was Paul Ehrlich.
Sulfonamide drugs
In 1932 the German bacteriologist Gerhard Domagk announced that the red dye Prontosil is active against streptococcal infections in mice and humans. Soon afterward French workers showed that its active antibacterial agent is sulfanilamide. In 1936 the English physician Leonard Colebrook and his colleagues provided overwhelming evidence of the efficacy of both Prontosil and sulfanilamide in streptococcal septicemia (bloodstream infection), thereby ushering in the sulfonamide era. New sulfonamides, which appeared with astonishing rapidity, had greater potency, wider antibacterial range, or lower toxicity. Some stood the test of time; others, like the original sulfanilamide and its immediate successor, sulfapyridine, were replaced by safer and more powerful successors.
Antibiotics
A dramatic episode in medical history occurred in 1928, when Alexander Fleming noticed the inhibitory action of a stray mold on a plate culture of staphylococcus bacteria in his laboratory at St. Mary’s Hospital, London. Many other bacteriologists must have made the observation, but none had realized the possible implications. The mold was a strain of Penicillium—P. notatum—which gave its name to the now-famous drug penicillin. In spite of his conviction that penicillin was a potent antibacterial agent, Fleming was unable to carry his work to fruition, mainly because biochemists at the time were unable to isolate it in sufficient quantities or in a sufficiently pure form to allow its use on patients.
Ten years later Howard Florey, Ernst Chain, and their colleagues at Oxford University took up the problem again. They isolated penicillin in a form that was fairly pure (by standards then current) and demonstrated its potency and relative lack of toxicity. By then World War II had begun, and techniques to facilitate commercial production were developed in the United States. By 1944 adequate amounts were available to meet the extraordinary needs of wartime.
Antituberculous drugs
While penicillin is the most useful and the safest antibiotic, it suffers from certain disadvantages. The most important of these is that it is not active against Mycobacterium tuberculosis, the bacillus of tuberculosis. In view of the importance of tuberculosis as a public health hazard, this is a serious defect. The position was rapidly rectified when, in 1944, Selman Waksman, Albert Schatz, and Elizabeth Bugie announced the discovery of streptomycin from cultures of a soil organism, Streptomyces griseus, and stated that it was active against M. tuberculosis. Subsequent clinical trials amply confirmed this claim. Streptomycin suffers, however, from the great disadvantage that the tubercle bacillus tends to become resistant to it. Fortunately, other drugs became available to supplement it, the two most important being para-aminosalicylic acid (PAS) and isoniazid. With a combination of two or more of these preparations, the outlook in tuberculosis improved immeasurably. The disease was not conquered, but it was brought well under control.
Other antibiotics
Penicillin is not effective over the entire field of microorganisms pathogenic to humans. During the 1950s the search for antibiotics to fill this gap resulted in a steady stream of them, some with a much wider antibacterial range than penicillin (the so-called broad-spectrum antibiotics) and some capable of coping with those microorganisms that are inherently resistant to penicillin or that have developed resistance through exposure to penicillin.
This tendency of microorganisms to develop resistance to penicillin at one time threatened to become almost as serious a problem as the development of resistance to streptomycin by the bacillus of tuberculosis. Fortunately, early appreciation of the problem by clinicians resulted in more discriminate use of penicillin. Scientists continued to look for means of obtaining new varieties of penicillin, and their researches produced the so-called semisynthetic antibiotics, some of which are active when taken by mouth, while others are effective against microorganisms that have developed resistance to the earlier form of penicillin.
Immunology
Dramatic though they undoubtedly were, the advances in chemotherapy still left one important area vulnerable, that of the viruses. It was in bringing viruses under control that advances in immunology – the study of immunity – played such a striking part. One of the paradoxes of medicine is that the first large-scale immunization against a viral disease was instituted and established long before viruses were discovered. When Edward Jenner introduced vaccination against the virus that causes smallpox, the identification of viruses was still 100 years in the future. It took almost another half century to develop methods of producing antiviral vaccines that were both safe and effective.
In the meantime, however, the process by which the body reacts against infectious organisms to generate immunity became better understood. In Paris, Élie Metchnikoff had already detected the role of white blood cells in the immune reaction, and Jules Bordet had identified antibodies in the blood serum. The mechanisms of antibody activity were used to devise diagnostic tests for a number of diseases. In 1906 August von Wassermann gave his name to the blood test for syphilis, and in 1908 the tuberculin test – the skin test for tuberculosis – came into use. At the same time, methods of producing effective substances for inoculation were improved, and immunization against bacterial diseases made rapid progress.
Antibacterial vaccination
Typhoid
In 1897 the English bacteriologist Almroth Wright introduced a vaccine prepared from killed typhoid bacilli as a preventive of typhoid. Preliminary trials in the Indian army produced excellent results, and typhoid vaccination was adopted for the use of British troops serving in the South African War. Unfortunately, the method of administration was inadequately controlled, and the government sanctioned inoculations only for soldiers that “voluntarily presented themselves for this purpose prior to their embarkation for the seat of war.” The result was that, according to the official records, only 14,626 men volunteered out of a total strength of 328,244 who served during the three years of the war. Although later analysis showed that inoculation had had a beneficial effect, there were 57,684 cases of typhoid – approximately one in six of the British troops engaged – with 9,022 deaths.
It is perhaps a sign of the increasingly critical outlook that developed in medicine in the post-1945 era that experts continued to differ on some aspects of typhoid immunization. There was no question as to its fundamental efficacy, but there was considerable variation of opinion as to the best vaccine to use and the most effective way of administering it. Moreover, it was often difficult to decide how much of the decline in typhoid was attributable to improved sanitary conditions and how much to the greater use of the vaccine.
Tetanus
The other great hazard of war that was brought under control in World War I was tetanus. This was achieved by the prophylactic injection of tetanus antitoxin into all wounded men. The serum was originally prepared by the bacteriologists Emil von Behring and Shibasaburo Kitasato in 1890 – 92, and the results of this first large-scale trial amply confirmed its efficacy. (Tetanus antitoxin is a sterile solution of antibody globulins – a type of blood protein – from immunized horses or cattle.)
It was not until the 1930s, however, that an efficient vaccine, or toxoid, as it is known in the cases of tetanus and diphtheria, was produced against tetanus. (Tetanus toxoid is a preparation of the toxin – or poison – produced by the microorganism; injected into humans, it stimulates the body’s own defences against the disease, thus bringing about immunity.) Again, a war was to provide the opportunity for testing on a large scale, and experience with tetanus toxoid in World War II indicated that it gave a high degree of protection.
Diphtheria
The story of diphtheria is comparable to that of tetanus, though even more dramatic. First, as with tetanus antitoxin, came the preparation of diphtheria antitoxin by Behring and Kitasato in 1890. As the antitoxin came into general use for the treatment of cases, the death rate began to decline. There was no significant fall in the number of cases, however, until a toxin–antitoxin mixture, introduced by Behring in 1913, was used to immunize children. A more effective toxoid was introduced by the French bacteriologist Gaston Ramon in 1923, and with subsequent improvements this became one of the most effective vaccines available in medicine. Where mass immunization of children with the toxoid was practiced, as in the United States and Canada beginning in the late 1930s and in England and Wales in the early 1940s, cases of diphtheria and deaths from the disease became almost nonexistent. In England and Wales, for instance, the number of deaths fell from an annual average of 1,830 in 1940 – 44 to zero in 1969. Administration of a combined vaccine against diphtheria, pertussis (whooping cough), and tetanus (DPT) is recommended for young children. Although an increasing number of dangerous side effects from the DPT vaccine have been reported, it continues to be used in most countries because of the protection it affords.
BCG vaccine for tuberculosis
In 1908 Albert Calmette, a pupil of Pasteur, and Camille Guérin produced an avirulent (weakened) strain of the tubercle bacillus. About 13 years later, vaccination of children against tuberculosis was introduced, with a vaccine made from this avirulent strain and known as BCG (bacillus Calmette-Guérin) vaccine. Although it was adopted in France, Scandinavia, and elsewhere, British and U.S. authorities frowned upon its use on the grounds that it was not safe and that the statistical evidence in its favour was not convincing.
Immunization against viral diseases
The first of the viral vaccines to result from these advances was for yellow fever, developed by the microbiologist Max Theiler in the late 1930s. About 1945 the first relatively effective vaccine was produced for influenza; in 1954 the American physician Jonas E. Salk introduced a vaccine for poliomyelitis; and in 1960 an oral poliomyelitis vaccine, developed by the virologist Albert B. Sabin, came into wide use.
These vaccines went far toward bringing under control three of the major diseases of the time, although, in the case of influenza, a major complication is the disturbing proclivity of the virus to change its character from one epidemic to another. Even so, sufficient progress has been made to ensure that a pandemic like the one that swept the world in 1918 – 19, killing more than 15,000,000 people, is unlikely to occur again. Centres are now equipped to monitor outbreaks of influenza throughout the world in order to establish the identity of the responsible viruses and, if necessary, take steps to produce appropriate vaccines.
During the 1960s effective vaccines came into use for measles and rubella (German measles). Both evoked a certain amount of controversy. In the case of measles, it was contended in the Western world that, if acquired in childhood, it is not a particularly hazardous malady, and the naturally acquired disease evokes permanent immunity in the vast majority of cases. Conversely, the vaccine induces a certain number of adverse reactions, and the duration of the immunity it produces is problematical. In the end the official view was that universal measles vaccination is to be commended.
The immune response
With advances in cell biology in the second half of the 20th century came a more profound understanding of both normal and abnormal conditions in the body. Electron microscopy enabled observers to peer more deeply into the structures of the cell, and chemical investigations revealed clues to their functions in the cell’s intricate metabolism. The overriding importance of the nuclear genetic material DNA (deoxyribonucleic acid) in regulating the cell’s protein and enzyme production lines became evident. A clearer comprehension also emerged of the ways in which the cells of the body defend themselves by modifying their chemical activities to produce antibodies against injurious agents.
In some conditions viruses invade the genetic material of cells and distort their metabolic processes. Such viruses may lie dormant for many years before becoming active. This may be the underlying cause of many cancers, in which cells escape from the usual constraints imposed upon them by the normal body. The dreaded affliction of acquired immune deficiency syndrome (AIDS) is caused by a virus that has a long dormant period and then attacks the cells that produce antibodies. The result is that the affected person is not able to generate an immune response to infections or malignancies.
Endocrinology
At the beginning of the 20th century, endocrinology was in its infancy. Indeed, it was not until 1905 that Ernest H. Starling, one of the many brilliant pupils of Edward Sharpey-Schafer, the dean of British physiology during the early decades of the century, introduced the term hormone for the internal secretions of the endocrine glands. In 1891 the English physician George Redmayne Murray achieved the first success in treating myxedema (the common form of hypothyroidism) with an extract of the thyroid gland. Three years later, Sharpey-Schafer and George Oliver demonstrated in extracts of the adrenal glands a substance that raised the blood pressure; and in 1901 Jokichi Takamine, a Japanese chemist working in the United States, isolated this active principle, known as epinephrine or adrenaline.
Insulin
During the first two decades of the century, steady progress was made in the isolation, identification, and study of the active principles of the various endocrine glands, but the outstanding event of the early years was the discovery of insulin by Frederick Banting, Charles H. Best, and J.J.R. Macleod in 1921. Almost overnight the lot of the diabetic patient changed from a sentence of almost certain death to a prospect not only of survival but of a long and healthy life.
For more than 30 years, some of the greatest minds in physiology had been seeking the cause of diabetes mellitus. In 1889 the German physicians Joseph von Mering and Oskar Minkowski had shown that removal of the pancreas in dogs produced the disease. In 1901 the American pathologist Eugene L. Opie described degenerative changes in the clumps of cells in the pancreas known as the islets of Langerhans, thus confirming the association between failure in the function of these cells and diabetes. Sharpey-Schafer concluded that the islets of Langerhans secrete a substance that controls the metabolism of carbohydrate. Then Banting, Best, and Macleod, working at the University of Toronto, succeeded in isolating the elusive hormone and gave it the name insulin.
Insulin was available in a variety of forms, but synthesis on a commercial scale was not achieved, and the only source of the hormone was the pancreas of animals. One of its practical disadvantages is that it has to be given by injection; consequently an intense search was conducted for some alternative substance that would be active when taken by mouth. Various preparations – oral hypoglycemic agents, as they are known – appeared that were effective to a certain extent in controlling diabetes, but evidence indicated that these were only of value in relatively mild cases of the disease. For the person with advanced diabetes, a normal, healthy life remained dependent upon the continuing use of insulin injections.
Cortisone
Another major advance in endocrinology came from the Mayo Clinic, in Rochester, Minn. In 1949 Philip S. Hench and his colleagues announced that a substance isolated from the cortex of the adrenal gland had a dramatic effect upon rheumatoid arthritis. This was compound E, or cortisone, as it came to be known, which had been isolated by Edward C. Kendall in 1935. Cortisone and its many derivatives proved to be potent as anti-inflammatory agents. Although it is not a cure for rheumatoid arthritis, as a temporary measure cortisone can often control the acute exacerbation caused by the disease and can provide relief in other conditions, such as acute rheumatic fever, certain kidney diseases, certain serious diseases of the skin, and some allergic conditions, including acute exacerbations of asthma. Of even more long-term importance is the valuable role it has as a research tool.
Vitamins
In the field of nutrition, the outstanding advance of the 20th century was the discovery and the appreciation of the importance to health of the “accessory food factors,” or vitamins. Various workers had shown that animals did not thrive on a synthetic diet containing all the correct amounts of protein, fat, and carbohydrate; they even suggested that there must be some unknown ingredients in natural food that were essential for growth and the maintenance of health. But little progress was made in this field until the classical experiments of the English biologist F. Gowland Hopkins were published in 1912. These were so conclusive that there could be no doubt that what he termed “accessory substances” were essential for health and growth.
The name vitamine was suggested for these substances by the biochemist Casimir Funk in the belief that they were amines, certain compounds derived from ammonia. In due course, when it was realized that they were not amines, the term was altered to vitamin.
Tropical medicine
The first half of the 20th century witnessed the virtual conquest of three of the major diseases of the tropics: malaria, yellow fever, and leprosy. At the turn of the century, as for the preceding two centuries, quinine was the only known drug to have any appreciable effect on malaria. With the increasing development of tropical countries and rising standards of public health, it became obvious that quinine was not completely satisfactory. Intensive research between World Wars I and II indicated that several synthetic compounds were more effective. The first of these to become available, in 1934, was quinacrine (known as mepacrine, Atabrine, or Atebrin). In World War II it amply fulfilled the highest expectations and helped to reduce disease among Allied troops in Africa, Southeast Asia, and the Far East. A number of other effective antimalarial drugs subsequently became available.
Surgery in the 20th century
Changes before World War I
The opening decade of the 20th century was a period of transition. Flamboyant exhibitionism was falling from favour as surgeons, through experience, learned the merits of painstaking, conscientious operation – treating the tissues gently and carefully controlling every bleeding point. The individualist was not submerged, however, and for many years the development of the various branches of surgery rested on the shoulders of a few clearly identifiable men. Teamwork on a large scale arrived only after World War II. The surgeon, at first, was undisputed master in his own wards and theatre. But as time went on and he found he could not solve his problems alone, he called for help from specialists in other fields of medicine and, even more significantly, from his colleagues in other scientific disciplines.
The increasing scope of surgery led to specialization. Admittedly, most general surgeons had a special interest, and for a long time there had been an element of specialization in such fields as ophthalmology, orthopedics, obstetrics, and gynecology; but before long it became apparent that, to achieve progress in certain areas, surgeons had to concentrate their attention on that particular subject.
Abdominal surgery
By the start of the 20th century, abdominal surgery, which provided the general surgeon with the bulk of his work, had grown beyond infancy, thanks largely to the Viennese surgeon Theodor Billroth. In 1881 he had performed the first successful removal of part of the stomach for cancer. His next two cases were failures, and he was stoned in the streets of Vienna. Yet he persisted, and by 1891 he had carried out 41 more of these operations with 16 deaths – a remarkable achievement for that era.
Peptic ulcers (gastric and duodenal) appeared on the surgical scene (perhaps as a new disease, but more probably because they had not been diagnosed previously), and in 1881 Ludwig Rydygier cured a young woman of her gastric ulcer by removing it. Bypass operations – gastroenterostomies – soon became more popular, however, and enjoyed a vogue that lasted into the 1930s, even though fresh ulcers at the site of the juncture were not uncommon.
Neurosurgery
Though probably the most demanding of all the surgical specialties, neurosurgery was nevertheless one of the first to emerge. The techniques and principles of general surgery were inadequate for work in such a delicate field. William Macewen, a Scottish general surgeon of outstanding versatility, and Victor Alexander Haden Horsley, the first British neurosurgeon, showed that the surgeon had much to offer in the treatment of disease of the brain and spinal cord. Macewen, in 1893, recorded 19 patients operated on for brain abscess, 18 of whom were cured; at that time most other surgeons had 100 percent mortality rates for the condition. His achievement remained unequaled until the discovery of penicillin.
An American, Harvey Williams Cushing, almost by himself consolidated neurosurgery as a specialty. From 1905 on, he advanced neurosurgery through a series of operations and through his writings. Tumours, epilepsy, trigeminal neuralgia, and pituitary disorders were among the conditions he treated successfully.
Radiology
In 1895 a development at the University of Würzburg had far-reaching effects on medicine and surgery, opening up an entirely fresh field of the diagnosis and study of disease and leading to a new form of treatment, radiation therapy. This was the discovery of X rays by Wilhelm Conrad Röntgen, a professor of physics. Within months of the discovery there was an extensive literature on the subject: Robert Jones, a British surgeon, had localized a bullet in a boy’s wrist before operating; stones in the urinary bladder and gallbladder had been demonstrated; and fractures had been displayed.
Experiments began on introducing substances that are opaque to X rays into the body to reveal organs and formations, both normal and abnormal. Walter Cannon, a Boston physiologist, used X rays in 1898 in his studies of the alimentary tract. Friedrich Voelcker, of Heidelberg, devised retrograde pyelography (introduction of the radiopaque medium into the kidney pelvis by way of the ureter) for the study of the urinary tract in 1905; in Paris in 1921, Jean Sicard X-rayed the spinal canal with the help of an oily iodine substance, and the next year he did the same for the bronchial tree; and in 1924 Evarts Graham, of St. Louis, used a radiopaque contrast medium to view the gallbladder. Air was also used to provide contrast; in 1918, at Johns Hopkins, Walter Dandy injected air into the ventricles (liquid-filled cavities) of the brain.
World War I
The battlefields of the 20th century stimulated the progress of surgery and taught the surgeon innumerable lessons, which were subsequently applied in civilian practice. Regrettably, though, the principles of military surgery and casualty evacuation, which can be traced back to the Napoleonic wars, had to be learned over again.
World War I broke, quite dramatically, the existing surgical hierarchy and rule of tradition. No longer did the European surgeon have to waste his best years in apprenticeship before seating himself in his master’s chair. Suddenly, young surgeons in the armed forces began confronting problems that would have daunted their elders. Furthermore, their training had been in “clean” surgery performed under aseptic conditions. Now they found themselves faced with the need to treat large numbers of grossly contaminated wounds in improvised theatres. They rediscovered debridement (the surgical excision of dead and dying tissue and the removal of foreign matter).
Between the world wars
The years between the two world wars may conveniently be regarded as the time when surgery consolidated its position. A surprising number of surgical firsts and an amazing amount of fundamental research had been achieved even in the late 19th century, but the knowledge and experience could not be converted to practical use because the human body could not survive the onslaught. In the years between World Wars I and II, it was realized that physiology – in its widest sense, including biochemistry and fluid and electrolyte balance – was of major importance along with anatomy, pathology, and surgical technique.
The problem of shock
The first problem to be tackled was shock, which was, in brief, found to be due to a decrease in the effective volume of the circulation. To combat shock, the volume had to be restored, and the obvious substance was blood itself. In 1901 Karl Landsteiner, then in Austria, discovered the ABO blood groups, and in 1914 sodium citrate was added to freshly drawn blood to prevent clotting. Blood was occasionally transfused during World War I, but three-quarters of a pint was considered a large amount. These transfusions were given by directly linking the vein of a donor with that of the recipient. The continuous drip method, in which blood flows from a flask, was introduced by Hugh Marriott and Alan Kekwick at the Middlesex Hospital, London, in 1935.
As blood transfusions increased in frequency and volume, blood banks were required. Although it took another world war before these were organized on a large scale, the first tentative steps were taken by Sergey Sergeyevich Yudin, of Moscow, who, in 1933, used cadaver blood, and by Bernard Fantus, of Chicago, who, four years later, used living donors as his source of supply. Saline solution, plasma, artificial plasma expanders, and other solutions are now also used in the appropriate circumstances.
Sometimes after operations (especially abdominal operations), the gut becomes paralyzed. It is distended, and quantities of fluid pour into it, dehydrating the body. In 1932 Owen Wangensteen, at the University of Minnesota, advised decompressing the bowel, and in 1934 two other Americans, Thomas Miller and William Abbott, of Philadelphia, invented an apparatus for this purpose, a tube with an inflatable balloon on the end that could be passed into the small intestine. The fluid lost from the tissues was replaced by a continuous intravenous drip of saline solution on the principle described by Rudolph Matas, of New Orleans, in 1924. These techniques dramatically improved abdominal surgery, especially in cases of obstruction, peritonitis (inflammation of the abdominal membranes), and acute emergencies generally, since they made it possible to keep the bowel empty and at rest.
Anesthesia and thoracic surgery
The strides taken in anesthesia from the 1920s onward allowed surgeons much more freedom. Rectal anesthesia had never proved satisfactory, and the first improvement on the combination of nitrous oxide, oxygen, and ether was the introduction of the general anesthetic cyclopropane by Ralph Waters of Madison, Wis., in 1933. Soon afterward, intravenous anesthesia was introduced; John Lundy of the Mayo Clinic brought to a climax a long series of trials by many workers when he used Pentothal (thiopental sodium, a barbiturate) to put a patient peacefully to sleep. Then, in 1942, Harold Griffith and G. Enid Johnson, of Montreal, produced muscular paralysis by the injection of a purified preparation of curare. This was harmless since, by then, the anesthetist was able to control the patient’s respiration.
These advances allowed thoracic surgery to move into modern times. In the 1920s, operations had been performed mostly for infective conditions and as a last resort. The operations necessarily were unambitious and confined to collapse therapy, including thoracoplasty (removal of ribs), apicolysis (collapse of a lung apex and artificially filling the space), and phrenic crush (which paralyzed the diaphragm on the chosen side); to isolation of the area of lung to be removed by first creating pleural adhesions; and to drainage.
The technical problems of surgery within the chest were daunting until Harold Brunn of San Francisco reported six lobectomies (removals of lung lobes) for bronchiectasis with only one death. (In bronchiectasis one or more bronchi or bronchioles are chronically dilated and inflamed, with copious discharge of mucus mixed with pus.) The secret of Brunn’s success was the use of intermittent suction after surgery to keep the cavity free of secretions until the remaining lobes of the lung could expand to fill the space. In 1931 Rudolf Nissen, in Berlin, removed an entire lung from a girl with bronchiectasis. She recovered to prove that the risks were not as bad as had been feared.
Cancer of the lung has become a major disease of the 20th century; perhaps it has genuinely increased, or perhaps modern techniques of diagnosis reveal it more often. As far back as 1913 a Welshman, Hugh Davies, removed a lower lobe for cancer, but a new era began when Evarts Graham removed a whole lung for cancer in 1933. The patient, a doctor, was still alive at the time of Graham’s death in 1957.
The thoracic part of the esophagus is particularly difficult to reach, but in 1909 the British surgeon Arthur Evans successfully operated on it for cancer. But results were generally poor until, in 1944, John Garlock, of New York, showed that it is possible to excise the esophagus and to bring the stomach up through the chest and join it to the pharynx. Lengths of colon are also used as grafts to bridge the gap.
World War II and after
Once the principles of military surgery were relearned and applied to modern warfare, instances of death, deformity, and loss of limb were reduced to levels previously unattainable. This was due largely to a thorough reorganization of the surgical services, adapting them to prevailing conditions, so that casualties received the appropriate treatment at the earliest possible moment. Evacuation by air (first used in World War I) helped greatly in this respect. Diagnostic facilities were improved, and progress in anesthesia kept pace with the surgeon’s demands. Blood was transfused in adequate – and hitherto unthinkable – quantities, and the blood transfusion service as it is known today came into being.
Surgical specialization and teamwork reached new heights with the creation of units to deal with the special problems of injuries to different parts of the body. But the most revolutionary change was in the approach to wound infections brought about by the use of sulfonamides and (after 1941) of penicillin. The fact that these drugs could never replace meticulous wound surgery was, however, another lesson learned only in the bitter school of experience.
Heart surgery
The attitude of the medical profession toward heart surgery was for long overshadowed by doubt and disbelief. Wounds of the heart could be sutured (first done successfully by Ludwig Rehn, of Frankfurt am Main, in 1896); the pericardial cavity – the cavity formed by the sac enclosing the heart – could be drained in purulent infections (as had been done by Larrey in 1824); and the pericardium could be partially excised for constrictive pericarditis when it was inflamed and constricted the movement of the heart (this operation was performed by Rehn and Sauerbruch in 1913). But little beyond these procedures found acceptance.
Yet, in the first two decades of the 20th century, much experimental work had been carried out, notably by the French surgeons Théodore Tuffier and Alexis Carrel. Tuffier, in 1912, operated successfully on the aortic valve. In 1923 Elliott Cutler of Boston used a tenotome, a tendon-cutting instrument, to relieve a girl’s mitral stenosis (a narrowing of the mitral valve between the upper and lower chambers of the left side of the heart) and in 1925, in London, Henry Souttar used a finger to dilate a mitral valve in a manner that was 25 years ahead of its time. Despite these achievements, there was too much experimental failure, and heart disease remained a medical, rather than surgical, matter.
Resistance began to crumble in 1938, when Robert Gross successfully tied off a persistent ductus arteriosus (a fetal blood vessel between the pulmonary artery and the aorta). It was finally swept aside in World War II by the remarkable record of Dwight Harken, who removed 134 missiles from the chest – 13 in the heart chambers – without the loss of one patient.
Until 1953, however, the techniques all had one great disadvantage: they were done “blind.” The surgeon’s dream was to stop the heart so that he could see what he was doing and be allowed more time in which to do it. In 1952 this dream began to come true when Floyd Lewis, of Minnesota, reduced the temperature of the body so as to lessen its need for oxygen while he closed a hole between the two upper heart chambers, the atria. The next year John Gibbon, Jr., of Philadelphia brought to fulfilment the research he had begun in 1937; he used his heart–lung machine to supply oxygen while he closed a hole in the septum between the atria.
Unfortunately, neither method alone was ideal, but intensive research and development led, in the early 1960s, to their being combined as extracorporeal cooling. That is, the blood circulated through a machine outside the body, which cooled it (and, after the operation, warmed it); the cooled blood lowered the temperature of the whole body. With the heart dry and motionless, the surgeon operated on the coronary arteries; he inserted plastic patches over holes; he sometimes almost remodelled the inside of the heart. But when it came to replacing valves destroyed by disease, he was faced with a difficult choice between human tissue and man-made valves, or even valves from animal sources.
Organ transplantation
In 1967 surgery arrived at a climax that made the whole world aware of its medicosurgical responsibilities when the South African surgeon Christiaan Barnard transplanted the first human heart. Reaction, both medical and lay, contained more than an element of hysteria. Yet, in 1964, James Hardy, of the University of Mississippi, had transplanted a chimpanzee’s heart into a man; and in that year two prominent research workers, Richard Lower and Norman E. Shumway, had written: “Perhaps the cardiac surgeon should pause while society becomes accustomed to resurrection of the mythological chimera.” Research had been remorselessly leading up to just such an operation ever since Charles Guthrie and Alexis Carrel, at the University of Chicago, perfected the suturing of blood vessels in 1905 and then carried out experiments in the transplantation of many organs, including the heart.
New developments in immunosuppression (the use of drugs to prevent organ rejection) have advanced the field of transplantation enormously. Kidney transplantation is now a routine procedure that is supplemented by dialysis with an artificial kidney (invented by Willem Kolff in wartime Holland) before and after the operation; mortality has been reduced to about 10 percent per year. Rejection of the transplanted heart by the patient’s immune system was overcome to some degree in the 1980s with the introduction of the immunosuppressant cyclosporine; records show that many patients have lived for five or more years after the transplant operation.
The complexity of the liver and the unavailability of supplemental therapies such as the artificial kidney have contributed to the slow progress in liver transplantation (first performed in 1963 by Thomas Starzl). An increasing number of patients, especially children, have undergone successful transplantation; however, a substantial number may require retransplantation due to the failure of the first graft.
Lung transplants (first performed by Hardy in 1963) are difficult procedures, and much progress is yet to be made in preventing rejection. A combined heart-lung transplant is still in the experimental stage, but it is being met with increasing success; two-thirds of those receiving transplants are surviving, although complications such as infection are still common. Transplantation of all or part of the pancreas is not completely successful, and further refinements of the procedures (first performed in 1966 by Richard Lillehei) are needed.
The Kharkiv State Medical University is one of the oldest higher educational establishments in Ukraine. It was founded in 1805 as the Medical Faculty of Kharkiv University. At present, over 600 teachers work at the departments of this medical university.
In 2000 Odessa State Medical University marked its 100th anniversary. It was the first university in Ukraine to offer medical education in English, beginning in 1996.
Modern Ukrainian medicine
The first steps towards modern Ukrainian medicine were made in 1898–1910, when the first scientific associations of Ukrainian doctors were established: the Ukrainian Scientific Society in Kyiv and the Shevchenko Scientific Society in Lviv. The first medical works in Ukrainian were published, and the first disease prevention and treatment institutions of a clearly Ukrainian orientation were established. At the same time, Ukrainian doctors made themselves heard at European medical forums in Paris, Madrid, Prague, and Belgrade, and the health service of the Ukrainian Galician Army established a new Ukrainian military medicine.
In January 1918, the first medical journal in Eastern Ukraine, “Ukrainski Medychni Visti”, was published. In its editorial “Our tasks today”, Ovksentiy Korchak-Chepurkovskyi, the oldest Ukrainian professor of hygiene and the founder of social hygiene, wrote the following: “Our main task is to develop Ukrainian national medicine as a science and practical field of knowledge”. To achieve this goal, it was necessary to “open our scientific and educational medical establishments, draw upon the experience of medicine…”.
Korchak-Chepurkovskyi organized and headed the first Ukrainian medical university department.
He was one of the founders of the Ukrainian Academy of Sciences, where he established a medical section that performed the functions of a center for the development of Ukrainian medical science, and he organized a health research department, a prototype of the later academic institutes. He also worked on Ukrainian medical terminology as well as on the health and demography of the Ukrainian population.
Among the first national scientific schools were those of surgeons (founded by Yevhen Cherniakhivskyi), obstetrician-gynecologists (Oleksander Krupskyi), physician-gerontologists (Ivan Bzylevych), otolaryngologists (Oleksander Puchkivskyi), microbiologists (Marko Neshadymenko), and others.
The National Museum of Medicine of Ukraine is a modern, well-equipped exposition showing the development of medicine and public health in Ukraine from ancient times to the present day. The need for such a museum is obvious: it serves as a base for teaching the history of medicine and other disciplines in medical schools.
The National Museum of Medicine of Ukraine was established in 1973. It is located in the building of the former anatomical theatre of Kyiv University. The first chief of the museum was its founder, Honoured Science Worker of Ukraine, Doctor of Medical Sciences, Professor O.A. Grando.
The creators of the museum took a fresh approach to its organization. Poster exhibitions, original interiors, portraits of famous scientists and physicians, dioramas of the most significant events in Ukrainian medicine – all this makes the exhibition bright and exciting. The National Museum of Medicine of Ukraine is the largest of all European medical museums.
Especially painful pages of our history, the Holodomor of 1932–1933 and Chernobyl, are also reflected in the halls of the museum.
Mykola Amosov was a Ukrainian doctor, heart surgeon, and inventor, known for devising several surgical procedures for treating heart defects.
In 1955 he was the first in Ukraine to begin treating heart diseases surgically, and in 1958 he was one of the first in the Soviet Union to introduce the method of artificial blood circulation into practice. In 1963 Amosov was the first in the Soviet Union to perform mitral valve replacement, and in 1965 he created and introduced into practice antithrombotic heart valve prostheses, the first in the world. Amosov also elaborated a number of new methods of surgical treatment of heart lesions and an original model of a heart-lung machine.
His work on the surgical treatment of heart diseases won him the State Prize of Ukraine (1988) as well as gold and silver medals (1978) of the Exhibition of Economic Achievements of the USSR.
The clinic established by Amosov performed about 7,000 lung resections and more than 95,000 operations for heart diseases, including about 36,000 operations with extracorporeal blood circulation.
In 1983 Amosov’s cardiac surgery clinic was reorganized into the Kyiv Research Institute of Cardiovascular Surgery, which also served as the Ukrainian Republican Cardiovascular Surgery Center. Each year the institute performed about 3,000 heart operations, including over 1,500 with extracorporeal blood circulation. Amosov was the first director of the institute and, from 1988, its honorary director.
In 1955 Amosov also created and headed the first chair of thoracic surgery for postgraduate studies in the USSR and, later, the chair of anesthesiology. These chairs have trained more than 700 specialists for Ukraine and the other republics.
Along with surgery, Amosov paid much attention to contemporary problems of biological, medical, and psychological cybernetics. From 1959 to 1990 he headed the Department of Biological Cybernetics in the Institute of Cybernetics. Under Amosov’s leadership, fundamental studies of the self-regulation of the heart were conducted, problems of the machine diagnosis of heart disease were studied, physiological models of the “internal environment” were elaborated, and basic mental functions and some socio-psychological mechanisms of human behavior were modeled on computers. Amosov’s innovative approach and original views were widely recognized in Ukraine and abroad. For his research in the field of biocybernetics he was awarded the State Prize of Ukraine in 1978 and 1997.
By order of the Cabinet of Ministers of Ukraine № 128-p of 12 March 2003, the Institute of Cardiovascular Surgery of the Academy of Medical Sciences of Ukraine was named after Academician Mykola Amosov.
Volodymyr Filatov was a Ukrainian ophthalmologist and surgeon best known for his development of tissue therapy. He introduced the tube flap grafting method, corneal transplantation, and the preservation of grafts from cadaver eyes. He founded the Filatov Institute of Eye Diseases and Tissue Therapy in Odessa. Filatov is also credited with restoring Vasily Zaytsev’s sight after Zaytsev suffered an eye injury in a mortar attack during the Battle of Stalingrad.
Filatov attempted his first corneal transplantation on 28 February 1912, but the graft grew opaque. After numerous attempts over the course of many years, he achieved a successful transplantation of a cornea from a deceased person on 6 May 1931.
The Filatov Institute of Eye Diseases and Tissue Therapy of the Academy of Medical Sciences of Ukraine is known throughout the world as one of the leading ophthalmologic centers of Europe and is the principal institution in Ukraine for ophthalmology and tissue therapy.