History of Ukrainian medicine. Medicine of the New Time. 20th-century medicine. Contemporary medicine and health protection in Ukraine.
HISTORY OF UKRAINIAN MEDICINE
1. Folk medicine
The history of medicine in Ukraine begins with the history of folk medicine.
The origin of Ukrainian medicine may be traced back to the folk medicine of the Kyivan Rus epoch. It later developed as monastery medicine and the medicine of the Cossack state.

Medical schools
The first medical hospitals in Kyivan Rus were founded in the 11th century, mostly in the form of almshouses attached to churches.
In 1594 the Zamostya Academy was organized in the city of Zamostya (Zamość, not far from Lviv) on the initiative of Count Yan Zamoyskyi.
Yan Zamoyskyi had graduated from the University of Padua and decided to open a similar school in his homeland. Pope Clement VIII confirmed the Academy's status and gave it the right to confer the degrees of doctor of philosophy, law and medicine. The medical faculty of the Academy was weaker than that of Cracow: only one or two professors taught medicine there.
The ties between the Zamostya Academy and the University of Padua remained strong for many years. For example, the rector of the Academy once asked Padua's medical faculty for advice about the causes and treatment of kovtun (plica polonica), which at that time was widespread in Halychyna, especially among the Hutsuls of the mountainous regions of the Carpathians.
The Academy existed for only 190 years, but it played a positive role in the dissemination of scientific medical knowledge among the population.
Some graduates of the Academy, especially Ukrainians and Belarusians, entered the service of feudal lords. Others continued their education at the universities of Italy, where they received the degree of doctor of medicine.
Among the most famous Ukrainian doctors of medicine trained in Italy was Yuriy Drohobych-Kotermak (1450-1494).
Yuriy Drohobych-Kotermak was a philosopher, astrologer, writer, medical doctor, rector of the University of Bologna, professor of the Krakow Academy, and the first Ukrainian author of a printed book, “Iudicium Pronosticon Anni Currentis” (1483).
In 1478 Drohobych received his doctorate in philosophy, but he continued his studies. This time he took up medicine.
At that time the natural philosophy disciplines were closely connected. Almost all contemporary philosophers demonstrated equally strong knowledge of astronomy and medicine, which allowed university professors to transfer from one department to another. Similar methods were used in teaching both disciplines: the reading and interpretation of Latin translations of the Greek and Arab classical authors.
Medicine was considered the key to understanding nature.
Shortly after Drohobych completed his medical studies, he was offered a position teaching astronomy at Bologna University. At the beginning of 1481, the student body of the University elected Drohobych rector of the school of Medicine and Liberal Arts.
In 1486 Drohobych returned to Krakow. He started his medical practice and also taught medicine at Krakow University. Similar to his peers from Bologna, he based his lectures on the works of Hippocrates, Galen, and Avicenna.
A few years later, he received his professorship in medicine and became physician to the Polish king Casimir IV Jagiellon. In 1492 he became the Dean of the Department of Medicine. It was customary at that time for professors to hold off-site meetings to discuss with students issues that did not fit the official scientific doctrine.
Copernicus attended Drohobych's meetings; however, it is not certain whether Drohobych had an influence on him.

The history of Ukraine in the 16th-17th centuries is characterized by the struggle of Ukrainians for their independence.
In the 16th and 17th centuries new hospitals were built, and many physicians gave first aid to the inhabitants of Ukraine and to the soldiers of Bogdan Khmelnitsky's troops.

Medical assistance in the army of Bogdan Khmelnitsky
At the end of the 16th century the main Cossack hospital was the one at the Trakhtemyriv monastery on the Dnieper.
Military hospitals were also housed in monasteries: the Lebedyn monastery near Chyhyryn and the Levkiv monastery near Ovruch. The monasteries willingly took care of the Cossacks.
In Cossack hospitals, unlike in civilian hospitals in towns and villages, the disabled found refuge and the wounded were treated. These were the first military hospitals in Ukraine.


Cossack medicine is also very interesting: its practice is full of mysteries and legends.
Wormwood (polyn), thyme, mint and borschivnyk (hogweed) were the main components of the Cossack pipe mixture; according to legend, that is why Cossacks almost never suffered from asthma and bronchitis.
Moreover, such smoking was believed to reduce blood pressure, calm the nerves, and improve appetite, sleep and even eyesight.

In this form the Cossacks received vitamins and other necessary substances lacking in their ordinary food.
Epifany Slavinetsky
Among the famous doctors of that time it is important to mention Epifany Slavinetsky.
In the 1620s, he attended the Kyiv Brotherhood School and later continued his education abroad.
Epifany was one of the most educated people of his period to come from Central and Eastern Europe; he mastered Latin, Polish, Ancient Greek and Hebrew.
Epifany Slavinetsky translated Andreas Vesalius's book on anatomy into Slavonic.

Epifany Slavinetsky revising service-books
Kyiv-Mohyla Academy
The Kyiv-Mohyla Academy, first opened in 1615 as the Kyiv Brotherhood School, played a significant role in the training of medical staff once hospital medical schools had been organized: over 14 years (1784-1798) more than 300 students of the Academy entered medical schools.
In 1632 the Kyiv Pechersk Lavra school and Kyiv Brotherhood School merged into the Kyiv-Mohyla Collegium (Latin: Collegium Kiyovense Mohileanum). The Collegium was named after Petro Mohyla.

Among the famous graduates of the Kyiv-Mohyla Academy are Peter Doroshenko, Philip Orlik, Yuri Khmelnitsky, Paul Teterya, Gregory Skovoroda, Ivan Skoropadskyi, Ivan Mazepa and others.
Many graduates of the Academy continued to enrich their knowledge abroad and received their doctor's degrees there. Many former students of the Academy became well-known scientists, among them the epidemiologist D. S. Samoilovych, the obstetrician N. M. Ambodyk-Maximovych, the paediatrician S. F. Chotovytsky, the anatomist O. Shumlyansky and many others.

The main building of Kiev-Mohyla Academy in the seventeenth century.
In 1886 the first bacteriological station was organized in Odessa; it was of great importance in the development of microbiology and epidemiology. The famous scientists I. I. Mechnikov and M. F. Gamaliya worked at this station and achieved much in their investigations. In spite of unfavorable conditions for the successful development of the natural sciences in the Russian Empire, many outstanding scientists worked in Ukraine. The brilliant scientist M. I. Pirogov and his followers (V. O. Karavayev, O. F. Shimanovsky, M. V. Sklifosovsky and others) made a valuable contribution to the development of Ukrainian medicine.

The painting “Future doctors”
In the 18th century medical schools were the main institutions for the training of doctors; about 2,000 men received the rank of doctor there. The period of training at these schools varied from 5 to 10 years. Among the professors and scientists of these medical schools, O. Shumlanskiy and M. Terehovskiy deserve special mention.
In 1793 O. Shumlanskiy (1748-1795) completed his scientific work devoted to the structure of the kidney.
He was the first to describe the microscopic structure of the kidney, establishing that the Malpighian body was not a gland, as was believed at that time.
M. Terehovsky (1740-1796) proved in his doctoral thesis that microorganisms in stagnant water do not arise spontaneously but come from outside.
Nestor Maksymovitch-Ambodyck (1744-1812) published a medical dictionary in which he introduced many new terms; he also published books on botany.

At the end of the 18th and during the 19th century medical faculties were formed at the universities of Kharkiv, Kyiv, Lviv and Odessa, and the total number of physicians in Ukraine increased. Zemstvo medicine was widely practiced at that time.
During the Crimean War (1853-1856), on Pirogov's initiative, the first detachment of nurses was trained and sent to Sevastopol to help its defenders. This marked the beginning of the organization of the Red Cross.
Kharkiv State Medical University
The Kharkiv State Medical University is one of the oldest higher educational establishments in Ukraine.
It was founded in 1805 as the Medical Faculty of the Kharkov University. At present, over 600 teachers work at the departments of this Medical University.

Odessa State Medical University
In 2000 the Odessa State Medical University completed its 100 years of existence. Odessa State Medical University was the first university in Ukraine to begin medical education in the English medium, in 1996.

Modern Ukrainian medicine
The first steps towards modern Ukrainian medicine were made in 1898-1910, when the first scientific associations of Ukrainian doctors were established: the Ukrainian Scientific Society in Kyiv and the Shevchenko Scientific Society in Lviv. The first works on medicine in Ukrainian were published, and the first disease prevention and treatment institutions of a clearly Ukrainian orientation were established. At the same time, Ukrainian doctors made themselves heard at European medical forums in Paris, Madrid, Prague and Belgrade, and the health service of the Ukrainian Halytska Army laid the foundations of a new Ukrainian military medicine.
In January 1918 the first medical journal in Eastern Ukraine, “Ukrainski Medychni Visti”, was published. In its editorial, “Our tasks today”, Ovksentiy Korchak-Chepurkovskyi, the oldest Ukrainian professor-hygienist and the founder of social hygiene, wrote the following: “Our main task is to develop Ukrainian national medicine as a science and a practical field of knowledge”. To achieve this goal, it was necessary to “open our scientific and educational medical establishments, draw upon the experience of medicine…”.
Korchak-Chepurkovskyi organized and headed the first Ukrainian medical university department.
He was one of the founders of the Ukrainian Academy of Sciences, where he established a medical section that performed the functions of a centre for the development of Ukrainian medical science, and organized a health research department, a prototype of the later academic institutes. He also researched Ukrainian medical terminology as well as the health and demography of the Ukrainian population.
Among the first national scientific schools were those of surgeons (founded by Yevhen Cherniakhivskyi), obstetrician-gynecologists (Oleksander Krupskyi), physician-gerontologists (Ivan Bazylevych), otolaryngologists (Oleksander Puchkivsky), microbiologists (Marko Neshadymenko) and others.
The Ukrainian National Museum of Medicine
The National Museum of Medicine of Ukraine is a modern, well-equipped exposition showing the path of medicine and public health development in Ukraine from ancient times to the present day. The need for such a museum is obvious: it is the base for teaching the history of medicine and other disciplines in medical schools.
The National Museum of Medicine of Ukraine was established in 1973. It is located in the building of the former anatomical theatre of Kyiv University. The first chief of the museum was its founder, Honoured Science Worker of Ukraine, Doctor of Medical Sciences, Professor O.A. Grando.
The creators of the museum took a fresh approach to its organization. Poster exhibitions, original interiors, portraits of famous scientists and physicians, dioramas of the most significant events in Ukrainian medicine – all this makes the exhibition bright and exciting. The National Museum of Medicine of Ukraine is the largest of all European medical museums.
Especially painful pages of our history, the Holodomor of 1932-1933 and Chernobyl, are also reflected in the halls of the museum.

The Ukrainian National Museum of Medicine
Mykola Amosov
Mykola Amosov was a Ukrainian doctor, heart surgeon and inventor, known for devising several surgical procedures for treating heart defects.
In 1955 he was the first in Ukraine to begin treating heart diseases surgically. In 1958 he was one of the first in the Soviet Union to introduce the method of artificial blood circulation into practice; in 1963 he was the first in the Soviet Union to perform mitral valve replacement; and in 1965 he created and introduced into practice antithrombotic heart valve prostheses, the first in the world. Amosov elaborated a number of new methods of surgical treatment of heart lesions and an original model of heart-lung machine.
His work on the surgical treatment of heart diseases won the State Prize of Ukraine (1988) and gold and silver medals (1978) of the Exhibition of Economic Achievements of the USSR.
The clinic established by Amosov performed about 7,000 lung resections and more than 95,000 operations for heart diseases, including about 36,000 operations with extracorporeal blood circulation.
In 1983 Amosov's cardiac surgery clinic was reorganized into the Kyiv Research Institute of Cardiovascular Surgery, which also served as the Ukrainian republican cardiovascular surgical centre. Each year the institute performed about 3,000 heart operations, including over 1,500 with extracorporeal blood circulation. Amosov was the first director of the Institute and, from 1988, its Honorary Director.
In 1955 Amosov also created and headed the first chair of thoracic surgery in the USSR for postgraduate studies, and later the chair of anesthesiology. These chairs have trained more than 700 specialists for Ukraine and other republics.
Volodymyr Petrovych Filatov

Volodymyr Filatov was a Ukrainian ophthalmologist and surgeon best known for his development of tissue therapy. He introduced the tube flap grafting method, corneal transplantation and the preservation of grafts from cadaver eyes. He founded The Filatov Institute of Eye Diseases & Tissue Therapy in Odessa. Filatov is also credited with restoring Vasily Zaytsev's sight when the latter suffered an injury to his eyes from a mortar attack during the Battle of Stalingrad.
Filatov attempted his first corneal transplantation on 28 February 1912, but the graft grew opaque. After numerous attempts over the course of many years, on 6 May 1931 he achieved a successful transplantation of a cornea taken from a deceased person.
MEDICINE IN THE NEW TIME
In 1575 Pare published The Collected Works of Surgery. This work was initially attacked by the French Faculty of Physicians, but thanks to the support of the king, Henry III, the book began to spread Pare's ideas across Europe. Over time these ideas began to change the way that surgeons approached their work – especially when treating wounds and performing amputations. Surgeons now knew that for operations to be successful they would have to combat pain, infection and bleeding using methods similar to those used by Pare.
Pare’s method, although groundbreaking, still left some problems to be solved in the future.
* Even though Pare’s use of a digestive (ointment) when treating wounds reduced the risk of infection, many patients still died from infection as effective antiseptics had not yet been invented.
* Pare’s method of using silk thread to tie off arteries could actually cause infection. Instruments used during operations were often unclean – there was no knowledge of germs – therefore bacteria on those instruments (and on the silk thread) were often transferred to the wound and sealed inside.
William Harvey was born in England in 1578. He studied medicine at Padua University between 1598 and 1602. He was very interested in anatomy, particularly the work of Vesalius. After leaving university he worked as a doctor at St Bartholomew’s Hospital, London, and then as a lecturer in anatomy at the Royal College of Physicians. He was also physician to both James I and Charles I.
Although Vesalius had proven that some of Galen’s ideas were incorrect, Galen’s explanation of the function of the heart was still accepted. Galen said that blood was made in the liver, and got into the arteries through holes in the septum of the heart. He said that blood was continually being made – to make up for the fact that it was used up by the body.
William Harvey observed how blood flowed around the body. Drawings like this demonstrate that veins have valves and return blood to the heart.
Like Pare and Vesalius, Harvey believed in the importance of careful observation, dissection and experiments in order to improve his knowledge of how the body worked. In 1615 Harvey began to work on the idea that blood circulated around the body. Around this time, water pumps were invented. This gave Harvey the idea that perhaps the heart worked in the same way as a water pump, and pumped blood around the body.
Harvey wanted to study the body as a living system, so he needed to dissect things which were still alive. He chose to study cold-blooded animals like frogs because their hearts beat slowly. This enabled him to see each separate expansion and contraction of the heart. He also dissected the bodies of dead criminals to ensure that the human heart was the same as that of the live animals he had studied.
Harvey’s study of beating hearts showed him that the heart was pushing out large volumes of blood. He proved that each push happened at the same time as the pulse which could be felt at the neck and at the wrist. He realised that so much blood was being pumped out by the heart, that it could not be used up and replaced by new blood as Galen had said. This suggested that there was a fixed amount of blood in the body, and that it was circulating.
Harvey now needed to prove his theory. By trying to pump liquids the wrong way past the valves in veins and arteries, Harvey proved that they were all ‘one-way’ systems. This confirmed his theory that blood flowed out from the heart through the arteries and flowed back through the veins to the heart, where it was recycled. He also devised a simple experiment that anyone could use on themselves to prove that blood only flows one way through the veins. By bandaging the upper arm, the valves show up as nodules on the vein. If a finger is pushed along the vein from one valve to the next, away from the heart, that section of vein will be emptied of blood, and it will stay empty until the finger is taken off. Harvey published his theory of the circulation of blood in his book, On the Motion of the Heart, in 1628. He included a sketch of how to perform this simple experiment, to prove his theory to readers.
Harvey’s theory met with opposition because it suggested that if there was a fixed amount of blood in the body, then there was no need for the practice of blood letting. Blood letting was a very common and well respected medical practice, which had been used ever since ancient times (e.g. in the theory of the Four Humours). After his book was published he actually lost patients, as his ideas were considered strange for the time. Despite this, soon after his death his theory was widely accepted. Over the next 300 years it was used to build up knowledge of what the blood did in various parts of the body as it circulated. Harvey made a great contribution to medical knowledge, but it was not until the 1900s that this knowledge was used in medical practice. Medical practices in the Renaissance were not changed by Harvey’s work: blood letting continued to be popular, and it was only in the 1900s that doctors realised the importance of checking a patient’s blood flow by taking their pulse.
The portrayal of the history of medicine becomes more difficult in the 19th century. Discoveries multiply, and the number of eminent doctors is so great that the history is apt to become a series of biographies. Nevertheless, it is possible to discern the leading trends in modern medical thought.
MEDICINE IN THE 19-20TH CENTURIES
Development of Physiology
By the beginning of the 19th century, the structure of the human body was almost fully known, due to new methods of microscopy and of injections. Even the body’s microscopic structure was understood. But as important as anatomical knowledge was an understanding of physiological processes, which were rapidly being elucidated, especially in Germany. There, physiology became established as a distinct science under the guidance of Johannes Müller, who was a professor at Bonn and then at the University of Berlin. An energetic worker and an inspiring teacher, he described his discoveries in a famous textbook, Handbuch der Physiologie des Menschen (“Manual of Human Physiology”), published in the 1830s.

Among Müller’s illustrious pupils were Hermann von Helmholtz, who made significant discoveries relating to sight and hearing and who invented the ophthalmoscope; and Rudolf Virchow, one of the century’s great medical scientists, whose outstanding achievement was his conception of the cell as the centre of all pathological changes. Virchow, German pathologist and statesman, one of the most prominent physicians of the 19th century, pioneered the modern concept of pathological processes by his application of the cell theory to explain the effects of disease in the organs and tissues of the body. He emphasized that diseases arose, not in organs or tissues in general, but primarily in their individual cells. Virchow’s work Die Cellularpathologie, published in 1858, gave the deathblow to the outmoded view that disease is due to an imbalance of the four humours.

Hermann von Helmholtz

Rudolf Virchow
In France the most brilliant physiologist of the time was Claude Bernard, whose many important discoveries were the outcome of carefully planned experiments. His researches clarified the role of the pancreas in digestion, revealed the presence of glycogen in the liver, and explained how the contraction and expansion of the blood vessels are controlled by vasomotor nerves. He proposed the concept of the internal environment—the chemical balance in and around the cells—and the importance of its stability. His Introduction à l’étude de la médecine expérimentale (1865; An Introduction to the Study of Experimental Medicine) is still worthy of study by all who undertake research.

Verification of the germ theory
Perhaps the overarching medical advance of the 19th century, certainly the most spectacular, was the conclusive demonstration that certain diseases, as well as the infection of surgical wounds, were directly caused by minute living organisms. This discovery changed the whole face of pathology and effected a complete revolution in the practice of surgery.
The idea that disease was caused by entry into the body of imperceptible particles was of ancient date. It had been expressed by the Roman encyclopaedist Varro as early as 100 BC, by Fracastoro in 1546, by Athanasius Kircher and Pierre Borel about a century later, and by Francesco Redi, who in 1684 wrote his Osservazioni intorno agli animali viventi che si trovano negli animali viventi (“Observations on Living Animals Which Are to Be Found Within Other Living Animals”), in which he sought to disprove the idea of spontaneous generation. Everything must have a parent, he wrote; only life produces life.
A 19th-century pioneer in this field, regarded by some as the founder of the parasitic theory of infection, was Agostino Bassi of Italy, who showed that a disease of silkworms was caused by a fungus that could be destroyed by chemical agents.
Agostino Bassi, pioneer Italian bacteriologist, anticipated the work of Louis Pasteur by 10 years in discovering that numerous diseases are caused by microorganisms.
In 1807 he began an investigation of the silkworm disease mal de segno (commonly known as muscardine), which was causing serious economic losses in Italy and France. After 25 years of research and experimentation, he was able to demonstrate that the disease was contagious and was caused by a microscopic, parasitic fungus. He concluded that the organism, later named Botrytis paradoxa (now Beauveria bassiana), was transmitted among the worms by contact and by infected food.
Bassi announced his discoveries in Del mal del segno, calcinaccio o moscardino (1835; “The Disease of the Sign, Calcinaccio or Muscardine”) and proceeded to make the important generalization that many diseases of plants, animals, and man are caused by animal or vegetable parasites. Thus, he preceded both Pasteur and Robert Koch in formulating a germ theory of disease. He prescribed methods for the prevention and elimination of muscardine, the success of which earned him considerable honours.
The main credit for establishing the science of bacteriology must be accorded to the French chemist Louis Pasteur. It was Pasteur who, by a brilliant series of experiments, proved that the fermentation of wine and the souring of milk are caused by living microorganisms. His work led to the pasteurization of milk and solved problems of agriculture and industry as well as those of animal and human diseases. He successfully employed inoculations to prevent anthrax in sheep and cattle, chicken cholera in fowl, and finally rabies in humans and dogs. The latter resulted in the widespread establishment of Pasteur institutes.

Louis Pasteur
Louis Pasteur was born in 1822, the son of a tanner. He showed considerable talent early, and at the age of twenty he received his bachelor of science degree. He attended the lectures on chemistry at the Sorbonne and was appointed a laboratory assistant. He studied crystals, but soon Pasteur’s interest was turned away from the study of crystals to the investigation of fermentation.
In 1857 Louis Pasteur sent to the Lille Scientific Society a paper on “Alcoholic Fermentation” in which he concluded that “the splitting up of sugar into alcohol and carbonic acid is correlative to a phenomenon of life”. A new era in medicine dates from this publication.
The facts that fevers were catching, that epidemics spread, that infection could remain attached to articles of clothing, and so on, all gave support to the view that the actual cause was something alive, a contagium vivum. It was really a very old view, first clearly expressed by Fracastorius, the Veronese physician of the sixteenth century, who spoke of the seeds of contagion passing from one person to another and was the first to draw a parallel between the processes of contagion and the fermentation of wine. But it was the study of the processes of fermentation that led Pasteur to the sure ground on which we now stand.
Pasteur studied alcoholic fermentation and lactic fermentation in sour milk; he found that both fermentations were caused by minute organisms, and were hastened by exposure to the air. He proved that the microscopic organisms were not spontaneously generated but were introduced by air.
All along, the analogy between disease and fermentation must have been in Pasteur’s mind; and then came the suggestion: “What would be most desirable is to push those studies far enough to prepare the road for a serious research into the origin of various diseases”. If the changes in lactic, alcoholic and butyric fermentations are due to minute living organisms, why should not the same tiny creatures make the changes which occur in the body in the putrid and suppurative diseases?
Fermentations of every sort held Pasteur’s attention during the years 1857 to 1863. This was the period of consolidation of his observations, which led ultimately to recognition of the germ theory of disease. For each type of fermentation there was a specific organism. Diseases of beer and wine could be traced to undesirable microorganisms, which set up fermentations of their own and interrupted the activities of the yeast. Spoilage of vinegar, he also found, was due to uncontrolled growth of a specific organism. The difficulty was overcome by sterilization at a temperature of about 130 degrees Fahrenheit, stopping its further development. Thus “pasteurization” was invented.
So impressed was he with the analogy between fermentation and the infectious diseases that, in 1863, he assured the French Government of his ambition “to arrive at the knowledge of the causes of putrid and contagious diseases”.
After a study of the diseases of wines, which had most important practical bearings, an opportunity arose which changed the whole course of his career and profoundly influenced the development of medical science.
A disease of the silkworm had, for some years, ruined one of the most important industries in France, and in 1865 the Government asked Pasteur to give up his laboratory work and teaching, and to devote his whole energies to the task of investigating this disease and its causes. Notwithstanding all the difficulties and obstacles encountered in the problem, Pasteur carried his silkworm studies to a successful conclusion.
From Pasteur, Joseph Lister derived the concepts that enabled him to introduce the antiseptic principle into surgery. In 1865 Lister, a professor of surgery at Glasgow University, began placing an antiseptic barrier of carbolic acid between the wound and the germ-containing atmosphere. Infections and deaths fell dramatically, and his pioneering work led to more refined techniques of sterilizing the surgical environment.

Joseph Lister
Although pain had been banished from the operating room after different methods of anaesthesia had been introduced, the spectre of infection still remained in pre-Listerian days. Erysipelas, pyemia, septicemia and hospital gangrene were endemic in most surgical wards. The man who changed all this was Joseph Lister.
In Edinburgh he received the position of lecturer on surgery at the College of Surgeons and assistant surgeon to the Royal Infirmary.
Lister soon became a very busy man: he taught surgery at the College of Surgeons, operated at the Royal Infirmary and worked in the laboratory, studying particularly inflammation, gangrene and the coagulation of the blood and publishing papers on these subjects.
In 1860, at the age of 33, he went to Glasgow where he was appointed professor of surgery at the University. There he found the same scourges haunting the surgical wards – suppuration and gangrene.
He was struck by the fact that simple fractures healed without complications, whereas compound fractures with laceration of the skin were followed by suppuration and often gangrene and death. Furthermore, inflammation, or even suppuration, was sure to follow any wound.
Lister saw that sepsis was the principal obstacle to any great advance in surgery. Finally, noting that closed wounds did not suppurate while open ones exposed to the air did, he concluded that suppuration was in some manner due to contact with the air, but that the air alone did not cause suppuration.
He found the solution of his problem in the work of Louis Pasteur on fermentation and putrefaction; it was not the air but the germs in the air that produced suppuration. He saw at once that putrefaction could only be avoided by preventing germs from gaining access to wounds. He looked around for a suitable antiseptic, and chose carbolic acid. With it Lister made his first antiseptic dressing in March, 1865.
The case was a compound fracture of the leg, the sort of wound which previously had almost invariably become infected, often with fatal results. He washed the wound out with the carbolic solution, and applied a piece of lint soaked with the solution over it. Healing was astonishingly good and Lister was encouraged to try this method in other cases. By March 1867 he was able to report a total of eleven cases of compound fracture treated by the antiseptic method, with nine recoveries, one amputation and one death. This was an unprecedented result.
In April, 1867, he was able to write: “Since the antiseptic treatment has been brought into full operation… my wards, though in other respects under precisely the same circumstances as before, have completely changed their character, so that during the last nine months not a single instance of pyemia, hospital gangrene, or erysipelas has occurred in them.” Lister was only 40 when he wrote these words.
The antiseptic doctrine did not have a sympathetic reception in England: it was attacked by some medical men. Lister nevertheless went ahead with his experiments to improve his method. After a while he stopped using undiluted carbolic acid to purify recent wounds because he found that it caused superficial sloughing. A five per cent watery solution proved to be strong enough for his purposes.
In 1869, Lister returned to Edinburgh as professor of clinical surgery, remaining in Edinburgh nine years. He continued to work on his antiseptic methods, carried out laboratory experiments on putrefaction and fermentation, and began his important studies on ligatures. Noting that infection often came from ligatures, he soaked first silk ligatures and later catgut in carbolic acid before employing them and found that this method prevented putrefaction.
He became so obsessed with the fear that microbes might fall upon the wound during an operation that he introduced in 1870 the carbolic spray to purify the atmosphere. He clung obstinately to this practice for 17 years but finally admitted that it was superfluous.
In 1877, after an absence of 25 years, Lister returned as professor of clinical surgery at King’s College Hospital, London. He occupied the chair of surgery for 15 years.
English surgeons in general remained hostile to Lister’s doctrine. As late as 1880 in all the British Isles there were only one or two clinics where his methods were used. But abroad Lister’s methods were promptly and thoroughly tested, and his discovery confirmed. Surgeons not only adopted Lister’s methods of controlling surgical infections, but they greatly improved upon them.
Slowly but surely Lister’s great eminence was recognised at home. In 1883 he was elected president of the Royal Society. Lister died in 1912.
Obstetrics had already been robbed of some of its terrors by Alexander Gordon at Aberdeen, Scotland, Oliver Wendell Holmes at Boston, and Ignaz Semmelweis at Vienna and Pest (Budapest), who advocated disinfection of the hands and clothing of midwives and medical students who attended confinements. These measures produced a marked reduction in cases of puerperal fever, the bacterial scourge of women following childbirth.
Another pioneer in bacteriology was the German physician Robert Koch, who showed how bacteria could be cultivated, isolated, and examined in the laboratory. A meticulous investigator, Koch discovered the organisms of tuberculosis, in 1882, and cholera, in 1883. By the end of the century many other disease-producing microorganisms had been identified.
Robert Koch was a prominent German bacteriologist, the founder of modern microbiology. He was born in 1843 and died in 1910. When Koch became a doctor, he carried out many experiments on mice in a small laboratory. In 1882 Koch discovered the tuberculosis bacillus. In his report to the Berlin Physiological Society Koch described in detail the morphology of tuberculosis bacilli and the ways to reveal them. This discovery made Koch known all over the world. In 1884 Koch published his book on cholera, which included the results of his research carried out during the cholera epidemics in Egypt and India. From the intestines of people with cholera Koch isolated a small comma-shaped bacterium and determined that these bacteria spread through drinking water. In 1905 Koch received the Nobel Prize for his important scientific discoveries.

In 1883 Koch went to Egypt to study cholera. At that time there was a widespread epidemic of cholera in Egypt.
Nobody knew the origin of this disease, and there were no protective measures against it.
The disease spread very rapidly from one place to another, and thousands of people died; yet sometimes people who were in constant contact with the diseased did not catch cholera.
As soon as Koch came to Alexandria, he and his two assistants, Gaffky and Fischer, began their investigations. In the blood, kidneys, spleen, liver and lungs of the people who had died of cholera Koch found many microorganisms, but none of them was the agent of cholera. In the walls of the intestines and in stools, however, Koch always found a microorganism which looked like a comma. Many times Koch tried to grow this bacterium on gelatin, but he failed; many times he inoculated experimental animals with this bacterium, but none became ill with cholera. As the epidemic of cholera subsided in Egypt, Koch went to India to continue his investigations there. In Calcutta Koch often walked along the muddy streets where the poor lived. Once Koch saw some muddy water on the ground near a small house.
Koch looked into that water and thought he saw those “commas” there. He took some of this water, analysed it under the microscope many times, and found there the same bacteria which he had so many times revealed in the people with cholera. Koch also established that animals could not catch this disease: the source of the disease was the water which people drank.
DISCOVERIES IN CLINICAL MEDICINE AND ANAESTHESIA
There was perhaps some danger that in the search for bacteria other causes of disease would escape detection. Many physicians, however, were working along different lines in the 19th century. Among them were a group attached to Guy’s Hospital, in London: Richard Bright, Thomas Addison, and Sir William Gull. Bright contributed significantly to the knowledge of kidney diseases, including Bright’s disease, and Addison gave his name to disorders of the adrenal glands and the blood. Gull, a famous clinical teacher, left a legacy of pithy aphorisms that might well rank with those of Hippocrates.
In Dublin Robert Graves and William Stokes introduced new methods in clinical diagnosis and medical training, while in Paris a leading clinician, Pierre-Charles-Alexandre Louis, was attracting many students from America by the excellence of his teaching.
The most famous contribution by the United States to medical progress at this period was undoubtedly the introduction of general anaesthesia, a procedure that not only liberated the patient from the fearful pain of surgery but also enabled the surgeon to perform more extensive operations. The discovery was marred by controversy: Crawford Long, Gardner Colton, and Horace Wells are all claimants for priority.
Crawford Long, an American physician, is traditionally considered the first to have used ether as an anesthetic in surgery. He observed that persons injured in “ether frolics” (social gatherings of people who were in a playful state of ether-induced intoxication) seemed to suffer no pain, and in 1842 he painlessly removed a tumour from the neck of a patient to whom he had administered ether.
Gardner Colton, an American anesthetist and inventor, was among the first to utilize the anesthetic properties of nitrous oxide in medical practice. While studying medicine in New York (without taking a degree), Colton learned that the inhalation of nitrous oxide, or laughing gas, produced exhilaration. After a public demonstration of its effects in New York City proved to be a financial success, he began a lecture tour of other cities. When a dentist suggested the use of the gas as an anesthetic, Colton went on to use it safely in extracting thousands of teeth.
Horace Wells, an American dentist, was a pioneer in the use of surgical anesthesia. While practicing in Hartford, Connecticut, in 1844, Wells noted the pain-killing properties of nitrous oxide (“laughing gas”) during a laughing-gas road show and thereafter used it in performing painless dental operations. He was allowed to demonstrate the method at the Massachusetts General Hospital in January 1845, but when the patient proved unresponsive to the gas, Wells was exposed to ridicule.
It was William Thomas Morton who, on October 16, 1846, at Massachusetts General Hospital in Boston, first demonstrated before a gathering of physicians the use of ether as a general anaesthetic. He is credited with gaining the medical world’s acceptance of surgical anesthesia. The news quickly reached Europe, and general anaesthesia soon became prevalent in surgery.
At Edinburgh, the professor of midwifery, James Young Simpson, had been experimenting upon himself and his assistants, inhaling various vapours with the object of discovering an effective anaesthetic. He was the first to use chloroform in obstetrics and the first in Britain to use ether. In November 1847 chloroform was tried with complete success, and it soon came to be preferred to ether as the anaesthetic of choice.
ADVANCES AT THE END OF THE CENTURY
Patrick Manson, a British pioneer in tropical medicine, showed in China, in 1877, how insects can carry disease and how the embryos of the Filaria worm, which can cause elephantiasis, are transmitted by the mosquito. Manson explained his views to a British army surgeon, Ronald Ross, then working on the problem of malaria, and Ross discovered the malarial parasite in the stomach of the Anopheles mosquito in 1897.
In Cuba, Carlos Finlay expressed the view, in 1881, that yellow fever is carried by the Stegomyia mosquito. Following his lead, the Americans Walter Reed, William Gorgas, and others were able to conquer the scourge of yellow fever in Panama and made possible the completion of the Panama Canal by reducing the death rate there from 176 per 1,000 to 6 per 1,000.
Other victories in preventive medicine ensued, because the maintenance of health was now becoming as important a concern as the cure of disease; and the 20th century was to witness the evolution and progress of national health services in a number of countries.
In addition, spectacular advances in diagnosis and treatment followed the discovery of X rays by Wilhelm Conrad Röntgen, in 1895, and of radium by Pierre and Marie Curie in 1898. Before the turn of the century, too, the vast new field of psychiatry had been opened up by Sigmund Freud.
The tremendous increase in scientific knowledge during the 19th century radically altered and expanded the practice of medicine. Concern for upholding the quality of services led to the establishment of public and professional bodies to govern the standards for medical training and practice.
The 20th century has produced such a plethora of discoveries and advances that in some ways the face of medicine has changed out of all recognition. In 1901, for instance, in the United Kingdom the expectation of life at birth, a primary indicator of the effect of health care on mortality (but also reflecting the state of health education, housing, and nutrition), was 48 years for males and 51.6 years for females. After steady increases, by the 1980s life expectancy had reached 71.4 years for males and 77.2 years for females. Other industrialized nations showed similar dramatic increases. Indeed, the outlook has so altered that, with the exception of diseases such as cancer and AIDS, attention has become focused on morbidity rather than mortality, and the emphasis has changed from keeping people alive to keeping them fit.
The rapid progress of medicine in this era was reinforced by enormous improvements in communication between scientists throughout the world. Through publications, conferences, and later computers and electronic media, they freely exchanged ideas and reported on their endeavours. No longer was it common for an individual to work in isolation. Although specialization increased, teamwork became the norm. It consequently has become more difficult to ascribe medical accomplishments to particular individuals.
In the first half of the century, emphasis continued to be placed on combating infection, and notable landmarks were also attained in endocrinology, nutrition, and other areas. In the years following World War II, insights derived from cell biology altered basic concepts of the disease process; new discoveries in biochemistry and physiology opened the way for more precise diagnostic tests and more effective therapies; and spectacular advances in biomedical engineering enabled the physician and surgeon to probe into the structures and functions of the body by non-invasive imaging techniques like ultrasound (sonar), computerized axial tomography (CAT), and nuclear magnetic resonance (NMR). With each new scientific development, medical practices of just a few years earlier became obsolete.
Infectious diseases and chemotherapy
In the years following the turn of the century, ongoing research concentrated on the nature of infectious diseases and their means of transmission. Increasing numbers of pathogenic organisms were discovered and classified. Some, such as the rickettsias, which cause diseases like typhus, were smaller than bacteria; some were larger, such as the protozoans that engender malaria and other tropical diseases. The smallest to be identified were the viruses, producers of many diseases, among them mumps, measles, German measles, and poliomyelitis; and in 1910 Peyton Rous showed that a virus could also cause a malignant tumour, a sarcoma in chickens.
There was still little to be done for the victims of most infectious organisms beyond drainage, poultices, and ointments, in the case of local infections, and rest and nourishment for severe diseases. The search for treatments aimed at both vaccines and chemical remedies.
Germany was well to the forefront in medical progress. The scientific approach to medicine had been developed there long before it spread to other countries, and postgraduates flocked to German medical schools from all over the world. The opening decade of the 20th century has been well described as the golden age of German medicine. Outstanding among its leaders was Paul Ehrlich.
While still a student, Ehrlich carried out some work on lead poisoning from which he evolved the theory that was to guide much of his subsequent work—that certain tissues have a selective affinity for certain chemicals. He experimented with the effects of various chemical substances on disease organisms. In 1910, with his colleague Sahachiro Hata, he conducted tests on arsphenamine, once sold under the commercial name Salvarsan. Their success inaugurated the chemotherapeutic era, which was to revolutionize the treatment and control of infectious diseases. Salvarsan, a synthetic preparation containing arsenic, is lethal to the microorganism responsible for syphilis. Until the introduction of penicillin, Salvarsan or one of its modifications remained the standard treatment of syphilis and went far toward bringing this social and medical scourge under control.
Sulfonamide drugs
In 1932 the German bacteriologist Gerhard Domagk announced that the red dye Prontosil is active against streptococcal infections in mice and humans. Soon afterward French workers showed that its active antibacterial agent is sulfanilamide. In 1936 the English physician Leonard Colebrook and his colleagues provided overwhelming evidence of the efficacy of both Prontosil and sulfanilamide in streptococcal septicemia (bloodstream infection), thereby ushering in the sulfonamide era. New sulfonamides, which appeared with astonishing rapidity, had greater potency, a wider antibacterial range, or lower toxicity. Some stood the test of time; others, like the original sulfanilamide and its immediate successor, sulfapyridine, were replaced by safer and more powerful successors.
Antibiotics
A dramatic episode in medical history occurred in 1928, when Alexander Fleming noticed the inhibitory action of a stray mold on a plate culture of staphylococcus bacteria in his laboratory at St. Mary’s Hospital, London. Many other bacteriologists must have made the observation, but none had realized the possible implications. The mold was a strain of Penicillium—P. notatum—which gave its name to the now-famous drug penicillin. In spite of his conviction that penicillin was a potent antibacterial agent, Fleming was unable to carry his work to fruition, mainly because biochemists at the time were unable to isolate it in sufficient quantities or in a sufficiently pure form to allow its use on patients.

Alexander Fleming
Ten years later Howard Florey, Ernst Chain, and their colleagues at Oxford University took up the problem again. They isolated penicillin in a form that was fairly pure (by the standards then current) and demonstrated its potency and relative lack of toxicity. By then World War II had begun, and techniques to facilitate commercial production were developed in the United States. By 1944 adequate amounts were available to meet the extraordinary needs of wartime.
Antituberculous drugs
While penicillin is the most useful and the safest antibiotic, it suffers from certain disadvantages. The most important of these is that it is not active against Mycobacterium tuberculosis, the bacillus of tuberculosis. In view of the importance of tuberculosis as a public health hazard, this is a serious defect. The position was rapidly rectified when, in 1944, Selman Waksman, Albert Schatz, and Elizabeth Bugie announced the discovery of streptomycin from cultures of a soil organism, Streptomyces griseus, and stated that it was active against M. tuberculosis. Subsequent clinical trials amply confirmed this claim. Streptomycin suffers, however, from the great disadvantage that the tubercle bacillus tends to become resistant to it. Fortunately, other drugs became available to supplement it, the two most important being para-aminosalicylic acid (PAS) and isoniazid. With a combination of two or more of these preparations, the outlook in tuberculosis improved immeasurably. The disease was not conquered, but it was brought well under control.
Other antibiotics
Penicillin is not effective over the entire field of microorganisms pathogenic to humans. During the 1950s the search for antibiotics to fill this gap resulted in a steady stream of them, some with a much wider antibacterial range than penicillin (the so-called broad-spectrum antibiotics) and some capable of coping with those microorganisms that are inherently resistant to penicillin or that have developed resistance through exposure to penicillin.
This tendency of microorganisms to develop resistance to penicillin at one time threatened to become almost as serious a problem as the development of resistance to streptomycin by the bacillus of tuberculosis. Fortunately, early appreciation of the problem by clinicians resulted in more discriminate use of penicillin. Scientists continued to look for means of obtaining new varieties of penicillin, and their researches produced the so-called semisynthetic antibiotics, some of which are active when taken by mouth, while others are effective against microorganisms that have developed resistance to the earlier form of penicillin.
Immunology
Dramatic though they undoubtedly were, the advances in chemotherapy still left one important area vulnerable, that of the viruses. It was in bringing viruses under control that advances in immunology – the study of immunity – played such a striking part. One of the paradoxes of medicine is that the first large-scale immunization against a viral disease was instituted and established long before viruses were discovered. When Edward Jenner introduced vaccination against the virus that causes smallpox, the identification of viruses was still 100 years in the future. It took almost another half century to discover an effective method of producing antiviral vaccines that were both safe and effective.
In the meantime, however, the process by which the body reacts against infectious organisms to generate immunity became better understood. In Paris, Élie Metchnikoff had already detected the role of white blood cells in the immune reaction, and Jules Bordet had identified antibodies in the blood serum. The mechanisms of antibody activity were used to devise diagnostic tests for a number of diseases. In 1906 August von Wassermann gave his name to the blood test for syphilis, and in 1908 the tuberculin test – the skin test for tuberculosis – came into use. At the same time, methods of producing effective substances for inoculation were improved, and immunization against bacterial diseases made rapid progress.
Antibacterial vaccination
Typhoid
In 1897 the English bacteriologist Almroth Wright introduced a vaccine prepared from killed typhoid bacilli as a preventive of typhoid. Preliminary trials in the Indian army produced excellent results, and typhoid vaccination was adopted for the use of British troops serving in the South African War. Unfortunately, the method of administration was inadequately controlled, and the government sanctioned inoculations only for soldiers that “voluntarily presented themselves for this purpose prior to their embarkation for the seat of war.” The result was that, according to the official records, only 14,626 men volunteered out of a total strength of 328,244 who served during the three years of the war. Although later analysis showed that inoculation had had a beneficial effect, there were 57,684 cases of typhoid – approximately one in six of the British troops engaged – with 9,022 deaths.
A bitter controversy over the merits of the vaccine followed, but before the outbreak of World War I immunization had been officially adopted by the army. Comparative statistics would seem to provide striking confirmation of the value of antityphoid inoculation, even allowing for the better sanitary arrangements in the latter war. In the South African War the annual incidence of enteric infections (typhoid and paratyphoid) was 105 per 1,000 troops, and the annual death rate was 14.6 per 1,000; the comparable figures for World War I were 2.35 and 0.139, respectively.
It is perhaps a sign of the increasingly critical outlook that developed in medicine in the post-1945 era that experts continued to differ on some aspects of typhoid immunization. There was no question as to its fundamental efficacy, but there was considerable variation of opinion as to the best vaccine to use and the most effective way of administering it. Moreover, it was often difficult to decide to what extent the decline in typhoid was attributable to improved sanitary conditions and to what extent to the greater use of the vaccine.
Tetanus
The other great hazard of war that was brought under control in World War I was tetanus. This was achieved by the prophylactic injection of tetanus antitoxin into all wounded men. The serum was originally prepared by the bacteriologists Emil von Behring and Shibasaburo Kitasato in 1890 – 92, and the results of this first large-scale trial amply confirmed its efficacy. (Tetanus antitoxin is a sterile solution of antibody globulins – a type of blood protein – from immunized horses or cattle.)
It was not until the 1930s, however, that an efficient vaccine, or toxoid, as it is known in the cases of tetanus and diphtheria, was produced against tetanus. (Tetanus toxoid is a preparation of the toxin – or poison – produced by the microorganism; injected into humans, it stimulates the body’s own defences against the disease, thus bringing about immunity.) Again, a war was to provide the opportunity for testing on a large scale, and experience with tetanus toxoid in World War II indicated that it gave a high degree of protection.
Diphtheria
The story of diphtheria is comparable to that of tetanus, though even more dramatic. First, as with tetanus antitoxin, came the preparation of diphtheria antitoxin by Behring and Kitasato in 1890. As the antitoxin came into general use for the treatment of cases, the death rate began to decline. There was no significant fall in the number of cases, however, until a toxin–antitoxin mixture, introduced by Behring in 1913, was used to immunize children. A more effective toxoid was introduced by the French bacteriologist Gaston Ramon in 1923, and with subsequent improvements this became one of the most effective vaccines available in medicine. Where mass immunization of children with the toxoid was practiced, as in the United States and Canada beginning in the late 1930s and in England and Wales in the early 1940s, cases of diphtheria and deaths from the disease became almost nonexistent. In England and Wales, for instance, the number of deaths fell from an annual average of 1,830 in 1940 – 44 to zero in 1969. Administration of a combined vaccine against diphtheria, pertussis (whooping cough), and tetanus (DPT) is recommended for young children. Although an increasing number of dangerous side effects from the DPT vaccine have been reported, it continues to be used in most countries because of the protection it affords.
BCG vaccine for tuberculosis
If, as is universally accepted, prevention is better than cure, immunization is the ideal way of dealing with diseases caused by microorganisms. An effective, safe vaccine protects the individual from disease, whereas chemotherapy merely copes with the infection once the individual has been affected. In spite of its undoubted value, however, immunization has been a recurring source of dispute. Like vaccination against typhoid (and against poliomyelitis later), tuberculosis immunization evoked widespread contention.
In 1908 Albert Calmette, a pupil of Pasteur, and Camille Guérin produced an avirulent (weakened) strain of the tubercle bacillus. About 13 years later, vaccination of children against tuberculosis was introduced, with a vaccine made from this avirulent strain and known as BCG (bacillus Calmette-Guérin) vaccine. Although it was adopted in France, Scandinavia, and elsewhere, British and U.S. authorities frowned upon its use on the grounds that it was not safe and that the statistical evidence in its favour was not convincing.
One of the stumbling blocks in the way of its widespread adoption was what came to be known as the Lübeck disaster. In the spring of 1930, 249 infants were vaccinated with BCG vaccine in Lübeck, Ger.; by autumn, 73 of the 249 were dead. Criminal proceedings were instituted against those responsible for giving the vaccine. The final verdict was that the vaccine had been contaminated, and the BCG vaccine itself was exonerated from any responsibility for the deaths. A bitter controversy followed, but in the end the protagonists of the vaccine won when a further trial showed that the vaccine was safe and that it protected four out of five of those vaccinated.
Immunization against viral diseases
With the exception of smallpox, it was not until well into the 20th century that efficient viral vaccines became available. In fact, it was not until the 1930s that much began to be known about viruses. The two developments that contributed most to the rapid growth in knowledge after that time were the introduction of tissue culture as a means of growing viruses in the laboratory and the availability of the electron microscope. Once the virus could be cultivated with comparative ease in the laboratory, the research worker could study it with care and evolve methods for producing one of the two requirements for a safe and effective vaccine: either a virus that was so attenuated, or weakened, that it could not produce the disease for which it was responsible in its normally virulent form; or a killed virus that retained the faculty of inducing a protective antibody response in the vaccinated individual.
The first of the viral vaccines to result from these advances was for yellow fever, developed by the microbiologist Max Theiler in the late 1930s. About 1945 the first relatively effective vaccine was produced for influenza; in 1954 the American physician Jonas E. Salk introduced a vaccine for poliomyelitis; and in 1960 an oral poliomyelitis vaccine, developed by the virologist Albert B. Sabin, came into wide use.
These vaccines went far toward bringing under control three of the major diseases of the time although, in the case of influenza, a major complication is the disturbing proclivity of the virus to change its character from one epidemic to another. Even so, sufficient progress has been made to ensure that a pandemic like the one that swept the world in 1918 – 19, killing more than 15,000,000 people, is unlikely to occur again. Centres are now equipped to monitor outbreaks of influenza throughout the world in order to establish the identity of the responsible viruses and, if necessary, take steps to produce appropriate vaccines.
During the 1960s effective vaccines came into use for measles and rubella (German measles). Both evoked a certain amount of controversy. In the case of measles in the Western world, it was contended that, if acquired in childhood, it is not a particularly hazardous malady and that the naturally acquired disease evokes permanent immunity in the vast majority of cases. Conversely, the vaccine induces a certain number of adverse reactions, and the duration of the immunity it produces is problematical. In the end the official view was that universal measles vaccination is to be commended.
The situation with rubella vaccination was different. This is a fundamentally mild affliction, and the only cause for anxiety is its proclivity to induce congenital deformities if a pregnant woman should acquire the disease. Once an effective vaccine was available, the problem was the extent to which it should be used. Ultimately the consensus was reached that all girls who had not already had the disease should be vaccinated at about 12 years. In the United States children are routinely immunized against measles, mumps, and rubella at the age of 15 months.
The immune response
With advances in cell biology in the second half of the 20th century came a more profound understanding of both normal and abnormal conditions in the body. Electron microscopy enabled observers to peer more deeply into the structures of the cell, and chemical investigations revealed clues to their functions in the cell’s intricate metabolism. The overriding importance of the nuclear genetic material DNA (deoxyribonucleic acid) in regulating the cell’s protein and enzyme production lines became evident. A clearer comprehension also emerged of the ways in which the cells of the body defend themselves by modifying their chemical activities to produce antibodies against injurious agents.
Up until the turn of the century, immunity referred mostly to the means of resistance of an animal to invasion by a parasite or microorganism. Around mid-century there arose a growing realization that immunity and immunology cover a much wider field and are concerned with mechanisms for preserving the integrity of the individual. The introduction of organ transplantation, with its dreaded complication of tissue rejection, brought this broader concept of immunology to the fore.
At the same time, research workers and clinicians began to appreciate the far-reaching implications of immunity in relation to endocrinology, genetics, tumour biology, and the biology of a number of other maladies. The so-called autoimmune diseases are caused by an aberrant series of immune responses by which the body’s own cells are attacked. Suspicion is growing that a number of major disorders such as diabetes, rheumatoid arthritis, and multiple sclerosis may be caused by similar mechanisms.
In some conditions viruses invade the genetic material of cells and distort their metabolic processes. Such viruses may lie dormant for many years before becoming active. This may be the underlying cause of many cancers, in which cells escape from the usual constraints imposed upon them by the normal body. The dreaded affliction of acquired immune deficiency syndrome (AIDS) is caused by a virus that has a long dormant period and then attacks the cells that produce antibodies. The result is that the affected person is not able to generate an immune response to infections or malignancies.
Endocrinology
At the beginning of the 20th century, endocrinology was in its infancy. Indeed, it was not until 1905 that Ernest H. Starling, one of the many brilliant pupils of Edward Sharpey-Schafer, the dean of British physiology during the early decades of the century, introduced the term hormone for the internal secretions of the endocrine glands. In 1891 the English physician George Redmayne Murray achieved the first success in treating myxedema (the common form of hypothyroidism) with an extract of the thyroid gland. Three years later, Sharpey-Schafer and George Oliver demonstrated in extracts of the adrenal glands a substance that raised the blood pressure; and in 1901 Jokichi Takamine, a Japanese chemist working in the United States, isolated this active principle, known as epinephrine or adrenaline.
Insulin
During the first two decades of the century, steady progress was made in the isolation, identification, and study of the active principles of the various endocrine glands, but the outstanding event of the early years was the discovery of insulin by Frederick Banting, Charles H. Best, and J.J.R. Macleod in 1921. Almost overnight the lot of the diabetic patient changed from a sentence of almost certain death to a prospect not only of survival but of a long and healthy life.

Frederick Banting

Charles H. Best

J.J.R. Macleod
For more than 30 years, some of the greatest minds in physiology had been seeking the cause of diabetes mellitus. In 1889 the German physicians Joseph von Mering and Oskar Minkowski had shown that removal of the pancreas in dogs produced the disease. In 1901 the American pathologist Eugene L. Opie described degenerative changes in the clumps of cells in the pancreas known as the islets of Langerhans, thus confirming the association between failure in the function of these cells and diabetes. Sharpey-Schafer concluded that the islets of Langerhans secrete a substance that controls the metabolism of carbohydrate. Then Banting, Best, and Macleod, working at the University of Toronto, succeeded in isolating the elusive hormone and gave it the name insulin.
Insulin was available in a variety of forms, but synthesis on a commercial scale was not achieved, and the only source of the hormone was the pancreas of animals. One of its practical disadvantages is that it has to be given by injection; consequently an intense search was conducted for some alternative substance that would be active when taken by mouth. Various preparations – oral hypoglycemic agents, as they are known – appeared that were effective to a certain extent in controlling diabetes, but evidence indicated that these were only of value in relatively mild cases of the disease. For the person with advanced diabetes, a normal, healthy life remained dependent upon the continuing use of insulin injections.
Cortisone
Another major advance in endocrinology came from the Mayo Clinic, in Rochester, Minn. In 1949 Philip S. Hench and his colleagues announced that a substance isolated from the cortex of the adrenal gland had a dramatic effect upon rheumatoid arthritis. This was compound E, or cortisone, as it came to be known, which had been isolated by Edward C. Kendall in 1935. Cortisone and its many derivatives proved to be potent as anti-inflammatory agents. Although it is not a cure for rheumatoid arthritis, as a temporary measure cortisone can often control the acute exacerbation caused by the disease and can provide relief in other conditions, such as acute rheumatic fever, certain kidney diseases, certain serious diseases of the skin, and some allergic conditions, including acute exacerbations of asthma. Of even more long-term importance is the valuable role it has as a research tool.
Sex hormones
Not the least of the advances in endocrinology was the increasing knowledge and understanding of the sex hormones. This culminated in the application of this knowledge to the problem of birth control. After an initial stage of hesitancy, the contraceptive pill, with its basic rationale of preventing ovulation, was accepted by the vast majority of family-planning organizations and many gynecologists as the most satisfactory method of contraception. Its risks, practical and theoretical, introduced a note of caution, but this was not sufficient to detract from the wide appeal induced by its effectiveness and ease of use.
Vitamins
In the field of nutrition, the outstanding advance of the 20th century was the discovery and the appreciation of the importance to health of the “accessory food factors,” or vitamins. Various workers had shown that animals did not thrive on a synthetic diet containing all the correct amounts of protein, fat, and carbohydrate; they even suggested that there must be some unknown ingredients in natural food that were essential for growth and the maintenance of health. But little progress was made in this field until the classical experiments of the English biologist F. Gowland Hopkins were published in 1912. These were so conclusive that there could be no doubt that what he termed “accessory substances” were essential for health and growth.
The name vitamine was suggested for these substances by the biochemist Casimir Funk in the belief that they were amines, certain compounds derived from ammonia. In due course, when it was realized that they were not amines, the term was altered to vitamin.
Once the concept of vitamins was established on a firm scientific basis it was not long before their identity began to be revealed. Soon there was a long series of vitamins, best known by the letters of the alphabet after which they were originally named when their chemical identity was still unknown. By supplementing the diet with foods containing particular vitamins, deficiency diseases such as rickets (due to deficiency of vitamin D) and scurvy (due to lack of vitamin C, or ascorbic acid) practically disappeared from Western countries, while deficiency diseases such as beriberi (caused by lack of vitamin B1, or thiamine), which were endemic in Eastern countries, either disappeared or could be remedied with the greatest of ease.
The isolation of vitamin B12, or cyanocobalamin, was of particular interest because it almost rounded off the fascinating story of how pernicious anemia was brought under control. Throughout the first two decades of the century, the diagnosis of pernicious anemia, like that of diabetes mellitus, was nearly equivalent to a death sentence. Unlike the more common form of so-called secondary anemia, it did not respond to the administration of suitable iron salts, and no other form of treatment touched it; hence, the grimly appropriate title of pernicious anemia.
In the early 1920s, George R. Minot, one of the many brilliant investigators that Harvard University has contributed to medical research, became interested in work being done by the American pathologist George H. Whipple on the beneficial effects of raw beef liver in severe experimental anemia. With a Harvard colleague, William P. Murphy, he decided to investigate the effect of raw liver in patients with pernicious anemia, and in 1926 they were able to announce that this form of therapy was successful. The validity of their findings was amply confirmed, and the fear of pernicious anemia came to an end.
As so often happens in medicine, many years were to pass before the rationale of liver therapy in pernicious anemia was fully understood. In 1948, however, almost simultaneously in the United States and Britain, the active principle, cyanocobalamin, was isolated from liver, and this vitamin became the standard treatment for pernicious anemia.
Malignant disease
While progress was the hallmark of medicine after the beginning of the 20th century, there is one field in which a gloomier picture must be painted, that of malignant disease, or cancer. It is the second most common cause of death in most Western countries in the second half of the 20th century, being exceeded only by deaths from heart disease. Some progress, however, has been achieved. The causes of the various types of malignancies are not known, but many more methods are available for attacking the problem; surgery remains the principal therapeutic standby, but radiotherapy and chemotherapy are increasingly used.
Soon after the discovery of radium was announced, in 1898, its potentialities in treating cancer were realized; in due course it assumed an important role in therapy. Simultaneously, deep X-ray therapy was developed, and with the atomic age came the use of radioactive isotopes. (A radioactive isotope is an unstable variant of a substance that has a stable form; during the process of breaking down, the unstable form emits radiation.) High-voltage X-ray therapy and radioactive isotopes have largely replaced radium. Whereas irradiation long depended upon X rays generated at 250 kilovolts, machines capable of producing X rays generated at 8,000 kilovolts and betatrons of up to 22,000,000 electron volts (22 MeV) have come into clinical use.
The most effective of the isotopes is radioactive cobalt. Telecobalt machines (those that hold the cobalt at a distance from the body) are available containing 2,000 curies or more of the isotope, an amount equivalent to 3,000 grams of radium and sending out a beam equivalent to that from a 3,000-kilovolt X-ray machine.
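The figures quoted above are mutually consistent if one assumes the historical rule of thumb (an assumption of this sketch, not stated in the text) that, gamma output for gamma output, 1 curie of cobalt-60 is equivalent to roughly 1.5 grams of radium; by definition, 1 curie is the activity of 1 gram of radium-226, but each cobalt-60 disintegration yields more penetrating gamma energy:

\[
2{,}000~\text{Ci of }{}^{60}\text{Co} \times 1.5~\tfrac{\text{g radium-equivalent}}{\text{Ci}} \approx 3{,}000~\text{g of radium.}
\]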
Of even more significance have been the developments in the chemotherapy of cancer. Nothing remotely resembling a chemotherapeutic cure has been achieved, but in certain forms of malignant disease, such as leukemia, which cannot be treated by surgery, palliative effects have been achieved that prolong life and allow the patient in many instances to lead a comparatively normal existence.
Fundamentally, however, perhaps the most important advance of all in this field has been the increasing appreciation of the importance of prevention. The discovery of the relationship between cigarette smoking and lung cancer is the classic example. Less publicized, but of equal import, is the continuing supervision of new techniques in industry and food manufacture in an attempt to ensure that they do not involve the use of cancer-causing substances.
Tropical medicine
The first half of the 20th century witnessed the virtual conquest of three of the major diseases of the tropics: malaria, yellow fever, and leprosy. At the turn of the century, as for the preceding two centuries, quinine was the only known drug to have any appreciable effect on malaria. With the increasing development of tropical countries and rising standards of public health, it became obvious that quinine was not completely satisfactory. Intensive research between World Wars I and II indicated that several synthetic compounds were more effective. The first of these to become available, in 1934, was quinacrine (known as mepacrine, Atabrine, or Atebrin). In World War II it amply fulfilled the highest expectations and helped to reduce disease among Allied troops in Africa, Southeast Asia, and the Far East. A number of other effective antimalarial drugs subsequently became available.
An even brighter prospect – the virtual eradication of malaria – was opened up by the introduction, during World War II, of the insecticide DDT (1,1,1-trichloro-2,2-bis[p-chlorophenyl]ethane, or dichlorodiphenyltrichloroethane). It had long been realized that the only effective way of controlling malaria was to eradicate the anopheline mosquitoes that transmit the disease. Older methods of mosquito control, however, were cumbersome and expensive. The lethal effect of DDT on the mosquito, its relative cheapness, and its ease of use on a widespread scale provided the answer. An intensive worldwide campaign, sponsored by the World Health Organization, was planned and went far toward bringing malaria under control.
The major problem encountered with respect to effectiveness was that the mosquitoes were able to develop a resistance to DDT; but the introduction of other insecticides, such as dieldrin and lindane (BHC), helped to overcome this difficulty. In recent years the use of these and other insecticides has been strongly criticized by ecologists, however.
Yellow fever is another mosquito-transmitted disease, and the prophylactic value of modern insecticides in its control was almost as great as in the case of malaria. The forest reservoirs of the virus present a more difficult problem, but the combined use of immunization and insecticides did much to bring this disease under control.
Until the 1940s the only drugs available for treating leprosy were the chaulmoogra oils and their derivatives. These, though helpful, were far from satisfactory. In the 1940s the group of drugs known as the sulfones appeared, and it soon became apparent that they were infinitely better than any other group of drugs in the treatment of leprosy. Several other drugs later proved promising. Although there is as yet no known cure – in the strict sense of the term – for leprosy, the outlook has so changed that there are good grounds for believing that this age-old scourge can be brought under control and the victims of the disease saved from those dreaded mutilations that have given leprosy such a fearsome reputation throughout the ages.
Surgery in the 20th century
The opening phase
Three seemingly insuperable obstacles beset the surgeon in the years before the mid-19th century: pain, infection, and shock. Once these were overcome, the surgeon believed that he could burst the bonds of centuries and become the master of his craft. There is more, however, to anesthesia than putting the patient to sleep. Infection, despite first antisepsis (destruction of microorganisms present) and later asepsis (avoidance of contamination), is still an ever-present menace; and shock continues to perplex physicians. But in the 20th century, surgery has progressed farther, faster, and more dramatically than in all preceding ages.
The shape of surgery that entered the new century was clearly recognizable as the forerunner of today’s, blurred and hazy though the outlines may now seem. The operating theatre still retained an aura of the past, when the surgeon played to his audience and the patient was little more than a stage prop. In most hospitals it was a high room lit by a skylight, with tiers of benches rising above the narrow, wooden operating table. The instruments, kept in glazed or wooden cupboards around the walls, were of forged steel, unplated, and with handles of wood or ivory.
The means to combat infection hovered between antisepsis and asepsis. Instruments and dressings were mostly sterilized by soaking them in dilute carbolic acid (or other antiseptic), and the surgeon often endured a gown freshly wrung out in the same solution. Asepsis gained ground fast, however. It had been born in the Berlin clinic of Ernst von Bergmann where, in 1886, steam sterilization had been introduced. Gradually, this led to the complete aseptic ritual, which has as its basis the bacterial cleanliness (as opposed to social cleanliness) of everything that comes in contact with the wound. Hermann Kümmell, of Hamburg, devised the routine of “scrubbing up”. In 1890 William Stewart Halsted, of Johns Hopkins University, had rubber gloves specially made for operating, and in 1896 Johannes von Mikulicz-Radecki, a Pole working at Breslau, Ger., invented the gauze mask.
Many surgeons, brought up in a confused misunderstanding of the antiseptic principle – believing that carbolic would cover a multitude of sins, many of which they were ignorant of committing – failed to grasp what asepsis was all about. Thomas Annandale, for example, blew through his catheters to make sure that they were clear, and many an instrument, dropped accidentally, was simply given a quick wipe and returned to use. Tradition died hard, and asepsis had an uphill struggle before it was fully accepted. “I believe firmly that more patients have died from the use of gloves than have ever been saved from infection by their use,” wrote W.P. Carr, an American, in 1911. Over the years, however, a sound technique was evolved as the foundation for the growth of modern surgery.
Anesthesia, at the turn of the century, progressed slowly. Few physicians made a career of the subject, and frequently the patient was rendered unconscious by a student, a nurse, or a porter wielding a rag and bottle. Chloroform was overwhelmingly more popular than ether, on account of its ease of administration, despite the fact that it was liable to kill by stopping the heart.
Although by the end of the first decade, nitrous oxide (laughing gas) combined with ether had displaced – but by no means entirely – the use of chloroform, the surgical problems were far from ended. For years to come the abdominal surgeon besought the anesthetist to deepen the level of anesthesia and thus relax the abdominal muscles; the anesthetist responded to the best of his ability, acutely aware that the deeper he went, the closer the patient was to death. When other anesthetic agents were discovered, the anesthetist came into his own, and many advances in spheres such as brain and heart surgery would have been impossible without his skill.
The third obstacle, shock, is perhaps the most complex and the most difficult to define satisfactorily. The only major cause properly appreciated at the start of the 20th century was loss of blood, and once that had occurred nothing, in those days, could be done. And so, the study of shock – its causes, its effects on human physiology, and its prevention and treatment – became all-important to the progress of surgery.
In the latter part of the 19th century, then, surgeons had been liberated from the age-old bogies of pain, pus, and hospital gangrene. Hitherto, operations had been restricted to amputations, cutting for stone in the bladder, tying off arterial aneurysms (bulging and thinning of artery walls), repairing hernias, and a variety of procedures that could be done without going too deeply beneath the skin. But the anatomical knowledge, a crude skill derived from practice on dead bodies, and above all the enthusiasm, were there waiting. Largely ignoring the mass of problems they uncovered, surgeons launched forth into an exploration of the human body.
They acquired a reputation for showmanship; but much of their surgery, though speedy and spectacular, was rough and ready. There were a few who developed supreme skill and dexterity and could have undertaken a modern operation with but little practice; indeed, some devised the very operations still in use today. One such was Theodor Billroth, head of the surgical clinic at Vienna, who collected a formidable list of successful “first” operations. He represented the best of his generation—a surgical genius, an accomplished musician, and a kind, gentle man who brought the breath of humanity to his work. Moreover, the men he trained, including von Mikulicz, Vincenz Czerny, and Anton von Eiselsberg, consolidated the brilliant start that he had given to abdominal surgery in Europe.
Changes before World War I
The opening decade of the 20th century was a period of transition. Flamboyant exhibitionism was falling from favour as surgeons, through experience, learned the merits of painstaking, conscientious operation – treating the tissues gently and carefully controlling every bleeding point. The individualist was not submerged, however, and for many years the development of the various branches of surgery rested on the shoulders of a few clearly identifiable men. Teamwork on a large scale arrived only after World War II. The surgeon, at first, was undisputed master in his own wards and theatre. But as time went on and he found he could not solve his problems alone, he called for help from specialists in other fields of medicine and, even more significantly, from his colleagues in other scientific disciplines.
The increasing scope of surgery led to specialization. Admittedly, most general surgeons had a special interest, and for a long time there had been an element of specialization in such fields as ophthalmology, orthopedics, obstetrics, and gynecology; but before long it became apparent that, to achieve progress in certain areas, surgeons had to concentrate their attention on that particular subject.
Abdominal surgery
By the start of the 20th century, abdominal surgery, which provided the general surgeon with the bulk of his work, had grown beyond infancy, thanks largely to Billroth. In 1881 he had performed the first successful removal of part of the stomach for cancer. His next two cases were failures, and he was stoned in the streets of Vienna. Yet, he persisted and by 1891 had carried out 41 more of these operations with 16 deaths – a remarkable achievement for that era.
Peptic ulcers (gastric and duodenal) appeared on the surgical scene (perhaps as a new disease, but more probably because they had not been diagnosed previously), and in 1881 Ludwig Rydygier cured a young woman of her gastric ulcer by removing it. Bypass operations – gastroenterostomies – soon became more popular, however, and enjoyed a vogue that lasted into the 1930s, even though fresh ulcers at the site of the juncture were not uncommon.
The other end of the alimentary tract was also subjected to surgical intervention; cancers were removed from the large bowel and rectum with mortality rates that gradually fell from 80 to 60 to 20 to 12 percent as the surgeons developed their skill. In 1908 the British surgeon Ernest Miles carried out the first abdominoperineal resection for cancer of the rectum; that is, the cancer was attacked both from the abdomen and from below through the perineum (the area between the anus and the genitals), either by one surgeon, who actually did two operations, or by two working together. This technique formed the basis for all future developments.
Much of the new surgery in the abdomen was for cancer, but not all. Appendectomy became the accepted treatment for appendicitis (in appropriate cases) in the United States before the close of the 19th century; but in Great Britain surgeons were reluctant to remove the organ until 1902, when King Edward VII’s coronation was dramatically postponed on account of his appendicitis. The publicity attached to his operation caused the disease and its surgical treatment to become fashionable—despite the fact that the royal appendix remained in the King’s abdomen; the surgeon, Frederic Treves, had merely drained the abscess.
Neurosurgery
Though probably the most demanding of all the surgical specialties, neurosurgery was nevertheless one of the first to emerge. The techniques and principles of general surgery were inadequate for work in such a delicate field. William Macewen, a Scottish general surgeon of outstanding versatility, and Victor Alexander Haden Horsley, the first British neurosurgeon, showed that the surgeon had much to offer in the treatment of disease of the brain and spinal cord. Macewen, in 1893, recorded 19 patients operated on for brain abscess, 18 of whom were cured; at that time most other surgeons had 100 percent mortality rates for the condition. His achievement remained unequaled until the discovery of penicillin.
An American, Harvey Williams Cushing, almost by himself consolidated neurosurgery as a specialty. From 1905 on, he advanced neurosurgery through a series of operations and through his writings. Tumours, epilepsy, trigeminal neuralgia, and pituitary disorders were among the conditions he treated successfully.
Radiology
In 1895 a development at the University of Würzburg had far-reaching effects on medicine and surgery, opening up an entirely fresh field of the diagnosis and study of disease and leading to a new form of treatment, radiation therapy. This was the discovery of X rays by Wilhelm Conrad Röntgen, a professor of physics. Within months of the discovery there was an extensive literature on the subject: Robert Jones, a British surgeon, had localized a bullet in a boy’s wrist before operating; stones in the urinary bladder and gallbladder had been demonstrated; and fractures had been displayed.

Wilhelm Conrad Röntgen
Experiments began on introducing substances that are opaque to X rays into the body to reveal organs and formations, both normal and abnormal. Walter Cannon, a Boston physiologist, used X rays in 1898 in his studies of the alimentary tract. Friedrich Voelcker, of Heidelberg, devised retrograde pyelography (introduction of the radiopaque medium into the kidney pelvis by way of the ureter) for the study of the urinary tract in 1905; in Paris in 1921, Jean Sicard X-rayed the spinal canal with the help of an oily iodine substance, and the next year he did the same for the bronchial tree; and in 1924 Evarts Graham, of St. Louis, used a radiopaque contrast medium to view the gallbladder. Air was also used to provide contrast; in 1918, at Johns Hopkins, Walter Dandy injected air into the ventricles (liquid-filled cavities) of the brain.
The problems of injecting contrast media into the blood vessels took longer to solve, and it was not until 1927 that António Moniz, of Lisbon, succeeded in obtaining pictures of the arteries of the brain. Eleven years later, George Robb and Israel Steinberg of New York overcame some of the difficulties of cardiac catheterization (introduction of a small tube into the heart by way of veins or arteries) and were able to visualize the chambers of the heart on X-ray film. After much research, a further refinement came in 1962, when Frank Sones and Earl K. Shirey of Cleveland showed how to introduce the contrast medium into the coronary arteries.
World War I
The battlefields of the 20th century stimulated the progress of surgery and taught the surgeon innumerable lessons, which were subsequently applied in civilian practice. Regrettably, though, the principles of military surgery and casualty evacuation, which can be traced back to the Napoleonic wars, had to be learned over again.
World War I broke, quite dramatically, the existing surgical hierarchy and rule of tradition. No longer did the European surgeon have to waste his best years in apprenticeship before seating himself in his master’s chair. Suddenly, young surgeons in the armed forces began confronting problems that would have daunted their elders. Furthermore, their training had been in “clean” surgery performed under aseptic conditions. Now they found themselves faced with the need to treat large numbers of grossly contaminated wounds in improvised theatres. They rediscovered debridement (the surgical excision of dead and dying tissue and the removal of foreign matter).
The older surgeons cried “back to Lister”, but antiseptics, no matter how strong, were no match for putrefaction and gangrene. One method of antiseptic irrigation – devised by Alexis Carrel and Henry Dakin and called the Carrel–Dakin treatment – was, however, beneficial, but only after the wound had been adequately debrided. The scourges of tetanus and gas gangrene were controlled to a large extent by antitoxin and antiserum injections, yet surgical treatment of the wound remained an essential requirement.
Abdominal casualties fared badly for the first year of the war, because experience in the utterly different circumstances of the South African War had led to a belief that these men were better left alone surgically. Fortunately, the error of continuing with such a policy 15 years later was soon appreciated, and every effort was made to deliver the wounded men to a suitable surgical unit with all speed. Little progress was made with chest wounds beyond opening up the wound even further to drain pus from the pleural cavity between the chest wall and the lungs.
Perhaps the most worthwhile and enduring benefit to flow from World War I was rehabilitation. For almost the first time, surgeons realized that their work did not end with a healed wound. In 1915 Robert Jones set up special facilities for orthopedic patients, and at about the same time Harold Gillies founded British plastic surgery in a hut at Sidcup, Kent. In 1917 Gillies popularized the pedicle type of skin graft (the type of graft in which skin and subcutaneous tissue are left temporarily attached for nourishment to the site from which the graft was taken). Since then plastic surgery has given many techniques and principles to other branches of surgery.
Between the world wars
The years between the two world wars may conveniently be regarded as the time when surgery consolidated its position. A surprising number of surgical firsts and an amazing amount of fundamental research had been achieved even in the late 19th century, but the knowledge and experience could not be converted to practical use because the human body could not survive the onslaught. In the years between World Wars I and II, it was realized that physiology – in its widest sense, including biochemistry and fluid and electrolyte balance – was of major importance along with anatomy, pathology, and surgical technique.
The problem of shock
The first problem to be tackled was shock, which was, in brief, found to be due to a decrease in the effective volume of the circulation. To combat shock, the volume had to be restored, and the obvious substance was blood itself. In 1901 Karl Landsteiner, then in Austria, discovered the ABO blood groups, and in 1914 sodium citrate was added to freshly drawn blood to prevent clotting. Blood was occasionally transfused during World War I, but three-quarters of a pint was considered a large amount. These transfusions were given by directly linking the vein of a donor with that of the recipient. The continuous drip method, in which blood flows from a flask, was introduced by Hugh Marriott and Alan Kekwick at the Middlesex Hospital, London, in 1935.

Karl Landsteiner
As blood transfusions increased in frequency and volume, blood banks were required. Although it took another world war before these were organized on a large scale, the first tentative steps were taken by Sergey Sergeyevich Yudin, of Moscow, who, in 1933, used cadaver blood, and by Bernard Fantus, of Chicago, who, four years later, used living donors as his source of supply. Saline solution, plasma, artificial plasma expanders, and other solutions are now also used in the appropriate circumstances.
Sometimes after operations (especially abdominal operations), the gut becomes paralyzed. It is distended, and quantities of fluid pour into it, dehydrating the body. In 1932 Owen Wangensteen, at the University of Minnesota, advised decompressing the bowel, and in 1934 two other Americans, Thomas Miller and William Abbott, of Philadelphia, invented an apparatus for this purpose, a tube with an inflatable balloon on the end that could be passed into the small intestine. The fluid lost from the tissues was replaced by a continuous intravenous drip of saline solution on the principle described by Rudolph Matas, of New Orleans, in 1924. These techniques dramatically improved abdominal surgery, especially in cases of obstruction, peritonitis (inflammation of the abdominal membranes), and acute emergencies generally, since they made it possible to keep the bowel empty and at rest.
Anesthesia and thoracic surgery
The strides taken in anesthesia from the 1920s onward allowed surgeons much more freedom. Rectal anesthesia had never proved satisfactory, and the first improvement on the combination of nitrous oxide, oxygen, and ether was the introduction of the general anesthetic cyclopropane by Ralph Waters of Madison, Wis., in 1933. Soon afterward, intravenous anesthesia was introduced; John Lundy of the Mayo Clinic brought to a climax a long series of trials by many workers when he used Pentothal (thiopental sodium, a barbiturate) to put a patient peacefully to sleep. Then, in 1942, Harold Griffith and G. Enid Johnson, of Montreal, produced muscular paralysis by the injection of a purified preparation of curare. This was harmless since, by then, the anesthetist was able to control the patient’s respiration.
If there was one person who was aided more than any other by the progress in anesthesia, it was the thoracic (chest) surgeon. What had bothered him previously was the collapse of the lung, which occurred whenever the pleural cavity was opened. Since the end of the 19th century, many and ingenious methods had been devised to prevent this from happening. The best known was the negative pressure cabinet of Ernst Ferdinand Sauerbruch, then at Mikulicz’ clinic at Breslau; the cabinet was first demonstrated in 1904 but was destined soon to become obsolete.
The solution lay in inhalational anesthesia administered under pressure. Indeed, when Théodore Tuffier, in 1891, successfully removed the apex of a lung for tuberculosis, this was the technique that he used; he even added an inflatable cuff around the tube inserted in the trachea to ensure a gas-tight fit. Tuffier was ahead of his time, however, and other surgeons and research workers wandered into confused and complex byways before Ivan Magill and Edgar Rowbotham, working at Gillies’ plastic-surgery unit, found their way back to the simplicity of the endotracheal tube and positive pressure. In 1931 Ralph Waters showed that respiration could be controlled either by squeezing the anesthetic bag by hand or by using a small motor.
These advances allowed thoracic surgery to move into modern times. In the 1920s, operations had been performed mostly for infective conditions and as a last resort. The operations necessarily were unambitious and confined to collapse therapy, including thoracoplasty (removal of ribs), apicolysis (collapse of a lung apex and artificially filling the space), and phrenic crush (which paralyzed the diaphragm on the chosen side); to isolation of the area of lung to be removed by first creating pleural adhesions; and to drainage.
The technical problems of surgery within the chest were daunting until Harold Brunn of San Francisco reported six lobectomies (removals of lung lobes) for bronchiectasis with only one death. (In bronchiectasis one or more bronchi or bronchioles are chronically dilated and inflamed, with copious discharge of mucus mixed with pus.) The secret of Brunn’s success was the use of intermittent suction after surgery to keep the cavity free of secretions until the remaining lobes of the lung could expand to fill the space. In 1931 Rudolf Nissen, in Berlin, removed an entire lung from a girl with bronchiectasis. She recovered to prove that the risks were not as bad as had been feared.
Cancer of the lung has become a major disease of the 20th century; perhaps it has genuinely increased, or perhaps modern techniques of diagnosis reveal it more often. As far back as 1913 a Welshman, Hugh Davies, removed a lower lobe for cancer, but a new era began when Evarts Graham removed a whole lung for cancer in 1933. The patient, a doctor, was still alive at the time of Graham’s death in 1957.
The thoracic part of the esophagus is particularly difficult to reach, but in 1909 the British surgeon Arthur Evans successfully operated on it for cancer. But results were generally poor until, in 1944, John Garlock, of New York, showed that it is possible to excise the esophagus and to bring the stomach up through the chest and join it to the pharynx. Lengths of colon are also used as grafts to bridge the gap.
World War II and after
Once the principles of military surgery were relearned and applied to modern warfare, instances of death, deformity, and loss of limb were reduced to levels previously unattainable. This was due largely to a thorough reorganization of the surgical services, adapting them to prevailing conditions, so that casualties received the appropriate treatment at the earliest possible moment. Evacuation by air (first used in World War I) helped greatly in this respect. Diagnostic facilities were improved, and progress in anesthesia kept pace with the surgeon’s demands. Blood was transfused in adequate – and hitherto unthinkable – quantities, and the blood transfusion service as it is known today came into being.
Surgical specialization and teamwork reached new heights with the creation of units to deal with the special problems of injuries to different parts of the body. But the most revolutionary change was in the approach to wound infections brought about by the use of sulphonamides and (after 1941) of penicillin. The fact that these drugs could never replace meticulous wound surgery was, however, another lesson learned only in the bitter school of experience.
When the war ended, surgeons returned to civilian life feeling that they were at the start of a completely new, exciting era; and indeed they were, for the intense stimulation of the war years had led to developments in many branches of science that could now be applied to surgery. Nevertheless, it must be remembered that these developments merely allowed surgeons to realize the dreams of their fathers and grandfathers; they opened up remarkably few original avenues. The two outstanding phenomena of the 1950s and 1960s – heart surgery and organ transplantation – both originated in a real and practical manner at the turn of the century.
Support from other technologies
At first, perhaps, the surgeon tried to do too much himself, but before long his failures taught him to share his problems with experts in other fields. This was especially so with respect to difficulties of biomedical engineering and the exploitation of new materials. The relative protection from infection given by antibiotics and chemotherapy allowed the surgeon to become far more adventurous than hitherto in repairing and replacing damaged or worn-out tissues with foreign materials. Much research was still needed to find the best material for a particular purpose and to make sure that it would be acceptable to the body.
Plastics, in their seemingly infinite variety, have come to be used for almost everything from suture material to heart valves; for strengthening the repair of hernias; for replacement of the head of the femur (first done by the French surgeon Jean Judet and his brother Robert-Louis Judet in 1950); for replacement of the lens of the eye after extraction of the natural lens for cataract; for valves to drain fluid from the brain in patients with hydrocephalus; and for many other applications. This is a far cry, indeed, from the unsatisfactory use of celluloid to restore bony defects of the face by the German surgeon Fritz Berndt in the 1890s. Inert metals, such as vitallium, have also found a place in surgery, largely in orthopedics for the repair of fractures and the replacement of joints.
The scope of surgery was further expanded by the introduction of the operating microscope. This brought the benefit of magnification particularly to neurosurgery and to ear surgery. In the latter it opened up a whole field of operations on the eardrum and within the middle ear. The principles of these operations were stated in 1951 and 1952 by two German surgeons, Fritz Zöllner and Horst Wullstein; and in 1952 Samuel Rosen of New York mobilized the footplate of the stapes to restore hearing in otosclerosis – a procedure attempted by the German Jean Kessel in 1876.
Although surgeons aim to preserve as much of the body as disease permits, they are sometimes forced to take radical measures to save life; when, for instance, cancer affects the pelvic organs. Pelvic exenteration (surgical removal of the pelvic organs and nearby structures) in two stages was devised by Allen Whipple of New York City, in 1935, and in one stage by Alexander Brunschwig, of Chicago, in 1937. Then, in 1960, Charles S. Kennedy, of Detroit, after a long discussion with Brunschwig, put into practice an operation that he had been considering for 12 years: hemicorporectomy—surgical removal of the lower part of the body. The patient died on the 11th day. The first successful hemicorporectomy (at the level between the lowest lumbar vertebra and the sacrum) was performed 18 months later by J. Bradley Aust and Karel B. Absolon, of Minnesota. This operation would never have been possible without all the technical, supportive, and rehabilitative resources of modern medicine.
Heart surgery
The attitude of the medical profession toward heart surgery was for long overshadowed by doubt and disbelief. Wounds of the heart could be sutured (first done successfully by Ludwig Rehn, of Frankfurt am Main, in 1896); the pericardial cavity – the cavity formed by the sac enclosing the heart – could be drained in purulent infections (as had been done by Larrey in 1824); and the pericardium could be partially excised for constrictive pericarditis when it was inflamed and constricted the movement of the heart (this operation was performed by Rehn and Sauerbruch in 1913). But little beyond these procedures found acceptance.

Yet, in the first two decades of the 20th century, much experimental work had been carried out, notably by the French surgeons Théodore Tuffier and Alexis Carrel. Tuffier, in 1912, operated successfully on the aortic valve. In 1923 Elliott Cutler of Boston used a tenotome, a tendon-cutting instrument, to relieve a girl’s mitral stenosis (a narrowing of the mitral valve between the upper and lower chambers of the left side of the heart) and in 1925, in London, Henry Souttar used a finger to dilate a mitral valve in a manner that was 25 years ahead of its time. Despite these achievements, there was too much experimental failure, and heart disease remained a medical, rather than surgical, matter.
Resistance began to crumble in 1938, when Robert Gross successfully tied off a persistent ductus arteriosus (a fetal blood vessel between the pulmonary artery and the aorta). It was finally swept aside in World War II by the remarkable record of Dwight Harken, who removed 134 missiles from the chest – 13 in the heart chambers – without the loss of one patient.
After the war, advances came rapidly, with the initial emphasis on the correction or amelioration of congenital defects. Gordon Murray, of Toronto, made full use of his amazing technical ingenuity to devise and perform many pioneering operations. And Charles Bailey of Philadelphia, adopting a more orthodox approach, was responsible for establishing numerous basic principles in the growing specialty.
Until 1953, however, the techniques all had one great disadvantage: they were done “blind.” The surgeon’s dream was to stop the heart so that he could see what he was doing and be allowed more time in which to do it. In 1952 this dream began to come true when Floyd Lewis, of Minnesota, reduced the temperature of the body so as to lessen its need for oxygen while he closed a hole between the two upper heart chambers, the atria. The next year John Gibbon, Jr., of Philadelphia brought to fulfilment the research he had begun in 1937; he used his heart–lung machine to supply oxygen while he closed a hole in the septum between the atria.
Unfortunately, neither method alone was ideal, but intensive research and development led, in the early 1960s, to their being combined as extracorporeal cooling. That is, the blood circulated through a machine outside the body, which cooled it (and, after the operation, warmed it); the cooled blood lowered the temperature of the whole body. With the heart dry and motionless, the surgeon operated on the coronary arteries; he inserted plastic patches over holes; he sometimes almost remodelled the inside of the heart. But when it came to replacing valves destroyed by disease, he was faced with a difficult choice between human tissue and man-made valves, or even valves from animal sources.
Organ transplantation
In 1967 surgery arrived at a climax that made the whole world aware of its medicosurgical responsibilities when the South African surgeon Christiaan Barnard transplanted the first human heart. Reaction, both medical and lay, contained more than an element of hysteria. Yet, in 1964, James Hardy, of the University of Mississippi, had transplanted a chimpanzee’s heart into a man; and in that year two prominent research workers, Richard Lower and Norman E. Shumway, had written: “Perhaps the cardiac surgeon should pause while society becomes accustomed to resurrection of the mythological chimera.” Research had been remorselessly leading up to just such an operation ever since Charles Guthrie and Alexis Carrel, at the University of Chicago, perfected the suturing of blood vessels in 1905 and then carried out experiments in the transplantation of many organs, including the heart.

Christiaan Barnard
New developments in immunosuppression (the use of drugs to prevent organ rejection) have advanced the field of transplantation enormously. Kidney transplantation is now a routine procedure that is supplemented by dialysis with an artificial kidney (invented by Willem Kolff in wartime Holland) before and after the operation; mortality has been reduced to about 10 percent per year. Rejection of the transplanted heart by the patient’s immune system was overcome to some degree in the 1980s with the introduction of the immunosuppressant cyclosporine; records show that many patients have lived for five or more years after the transplant operation.
The complexity of the liver and the unavailability of supplemental therapies such as the artificial kidney have contributed to the slow progress in liver transplantation (first performed in 1963 by Thomas Starzl). An increasing number of patients, especially children, have undergone successful transplantation; however, a substantial number may require retransplantation due to the failure of the first graft.
Lung transplants (first performed by Hardy in 1963) are difficult procedures, and much progress is yet to be made in preventing rejection. A combined heart-lung transplant is still in the experimental stage, but it is being met with increasing success; two-thirds of those receiving transplants are surviving, although complications such as infection are still common. Transplantation of all or part of the pancreas is not completely successful, and further refinements of the procedures (first performed in 1966 by Richard Lillehei) are needed.
HISTORY OF REGIONAL MEDICINE AND HISTORY OF THE UNIVERSITY
1. History of Ternopil State Medical University
In 1773, by the decision of the Austrian Empress Maria Theresa, a medical school, the Collegium Medicum, was founded in Lviv. It offered a two-year course of study: during the first year students studied anatomy, general pathology, general and special surgery, and the study of medicines and compounding; in the second year they studied clinical medicine, special therapy and surgery, desmurgy, and theoretical and practical obstetrics. The training was not only theoretical but also practical. Students at this school were taught by highly qualified professors, including one of its organizers and its principal teacher, Dr. Andrew Krupynskyy, who taught anatomy, obstetrics, general pathology, and therapy. Krupynskyy was a highly educated person.
Prince Constantine Ostrog in 1570 established the “Rus hospital” along with the Church of the Nativity. The hospital was later moved closer to the Kremenets gate.
Thus, two hospitals already existed in the early days of the city. Their staff apparently consisted of monks, especially the Ukrainian Basilian Fathers.

The first mention of doctors dates from 1830: the names of the physician Massynh and the pharmacist Fuchs stand under an appeal to the Emperor asking permission to open a high school in the city.

I. Horbachevsky Ternopil State Medical University (TSMU)
Ternopil, an ancient picturesque city, lies on the banks of the quiet river Seret. It is one of the administrative, economic and cultural centres of Halychyna and it is called the capital of Halytske Podillya.

The educational process is planned and coordinated by the Educational Department of the University, which directs all the activities of the Medical Faculty, the Faculty of Pharmacy, the Faculty of Dentistry, the Faculty of Foreign Students, the Institute of Nursing, and the Post-Graduate Faculty so as to improve the efficacy of training future specialists and interns.
The University provides the full succession of higher medical education: junior medical specialist, bachelor, doctor-specialist, master, and post-graduate student. As a higher school of the IV-th accreditation level, it trains specialists in 16 specialities, 8 through the Mastership department and 11 through the Post-graduate department. The educational process is provided by 410 teachers, among them 79 Doctors of Medical Sciences and 265 Candidates of Sciences. The pedagogical staff includes 1 Corresponding Member of the Academy of Medical Sciences of Ukraine, 8 Honoured Workers of Science and Technology, and 3 Honoured Inventors of Ukraine. The quality index of the scientific teaching staff is 80.9 percent, one of the highest among the higher medical educational establishments in Ukraine.

The professors deliver lectures; instructors teach students at practical lessons and seminars
Since 1999, prominent scientists from Ukrainian medical institutions have been delivering telecommunication lectures to the students.
All of the teachers take part in the educational methodical conferences, “round table” discussions and young teachers’ seminars.
Specific attention is paid to improving the professional training of the teaching staff of the University. Every 3-5 years, or more frequently in case of necessity, they take specialization and thematic advanced training courses at Bogomolets National Medical University, at the P.L. Shupik Medical University of Post-Graduate Education in Kyiv, and at other educational establishments of Ukraine.

[Photo: The Human Anatomy Department Museum]
[Photo: During practical lessons at the Human Anatomy Department, students learn the structure of the human body]
Since 1999 a considerable part of the teaching staff has attended advanced English courses to improve their command of the language. Having received a certificate, they begin teaching their subjects to foreign students in English.
Regular sessions of the central and profile cyclic methodical commissions are held in the methodical office. The most important questions concerning teaching methods, the organization of course and state examinations, and the creation of integrated educational curricula are discussed there.
[Photo: Students learn the diagnostic method of cystoscopy during the Urology course]
[Photo: Acquaintance with the ultrasound diagnostic method]
All the plans, programmes and curricula created at the departments are kept there. Every year the departmental teachers review and update the curricula, enriching them with new information.
Much work is carried out at the University to equip the lecture halls and classrooms to modern requirements. Up-to-date technical training aids are installed, and the departments are constantly supplied with new equipment, reagents and computers necessary for the training process. The vivarium keeps various animals used for experiments in practical classes.


[Photo: Practical lessons at the surgical clinics are held at the patient’s bedside]
[Photo: Demonstration of the pleural puncture method]
[Photo: Students learn the technique of tracheal intubation during the course of Resuscitation and Anesthesiology]
The teaching staff takes an active part in creating standard educational curricula, most of which are approved by the Ministry of Public Health. A number of departments have published textbooks, manuals and multimedia computer programs; others are still developing them.
The teaching staff pays particular attention to the conditions for qualified training of foreign students. They are provided with all the necessary educational literature, and the most highly skilled and friendly teachers guide these future specialists, helping them to acquire the necessary knowledge.
The pedagogical staff numbers 408 teachers, among them 71 Doctors of Sciences, 59 professors, 124 associate professors, and 261 Candidates of Sciences. It also includes 1 Corresponding Member of the Academy of Medical Sciences of Ukraine, 8 Honoured Workers of Science and Technology of Ukraine, 3 Honoured Inventors of Ukraine, and 4 Honoured Physicians of Ukraine.
The teaching staff develops modern technologies and methods, carries out numerous scientific projects, and ensures effective educational and treatment processes. Every department has drawn up its own optimal plan of lectures, practical classes and seminars in accordance with the general standard curricula, coordinated by the cyclic methodical commission.

Since 2005 students have studied according to the credit-module system, which provides for recognition of their degrees in all countries of the European Union. Since 2006 the ‘one-day’ method and the Z-system of training have been in use at the University; under the latter, students study clinical aspects together with theoretical subjects from their first years.
The Medical Faculty at TSMU consists of the following departments:
Department of Internal Medicine, Propedeutics and Phthisiology
Department of Internal Medicine with Clinical Immunology and Allergology
Department of Infectious Diseases with Epidemiology and Dermatovenerology
Department of Neurology, Psychiatry, Narcology, and Medical Psychology
Department of Pediatrics and Pediatric Surgery
Department of General and Operative Surgery with Topographic Anatomy, Traumatology and Orthopaedics
Department of Surgery with Urology and Anaesthesiology
Department of Obstetrics and Gynaecology
Department of Otorhinolaryngology, Ophthalmology, and Neurosurgery
Department of Oncology, Radiology Diagnostics and Therapy, and Radiation Medicine
Department of Ambulatory Care and Family Medicine with Medical Equipment
Department of Medical Rehabilitation and Sports Medicine
Department of Pharmacology with Clinical Pharmacology, Pharmacy, and Pharmacotherapy
Department of Pathological Anatomy with Dissection Course and Forensic Medicine
Department of Obstetrics and Gynaecology of FPE
Department of Surgery, Traumatology, and Orthopaedics of FPE
Department of Pediatrics of FPE
Department of Therapeutics and Family Medicine of FPE
Department of Medical Diagnostics and Emergency Care of FPE

The International Relations Department was formed in 2000 to improve and widen partnership contacts with European and American schools. In 2005 the department was reorganized into a separate independent unit of the University.
The department has helped to conclude cooperation agreements between Ternopil State Medical University and its partners: the University of South Carolina Upstate (USA), Vienna Medical University (Austria), Greenville Technical College (USA), the Medical University of Silesia (Poland), Charles University (Czech Republic), the Slovak Medical University (Slovak Republic), Moscow Medical Stomatological University (Russia), and the Technical University of Dresden (Germany).
The teaching staff of TSMU takes part in international medical conferences, congresses and symposia in 18 countries of the world. The International Relations Department has also helped to receive at TSMU official delegations from such countries as India, Pakistan, Malaysia, Sudan, Vietnam and China.

The clinical departments of TSMU are specialized centers which provide the population with highly specialized medical aid. Such centers include: the regional and city gastroenterological departments, the regional and city cardiological departments, the regional immunological department, the regional thoracic department, the regional vascular department, the regional and city neurosurgical departments, the department of minimally invasive surgery, the regional center of eye microsurgery, the regional otolaryngological department, the regional neurological and psychiatric center, the regional oncology dispensary, and the regional TB dispensary. Students and doctors receive clinical training and practice in hospitals with 540 patient beds in the vicinity of the University.
2. Dr. I. Horbachevsky as one of the most famous scientists of his time.
By the decision of the Cabinet of Ministers of Ukraine in 1992 the Ternopil State Medical University was named after Academician Ivan Horbachevsky.

IVAN HORBACHEVSKY
Born: 15.05.1854 (Ternopil region, Ukraine)
Died: 24.05.1942 (Prague)
Field of activity: Organic Chemistry, Biochemistry, Medical Chemistry, Public Health.
Doctor of Medicine; Professor and Head of the Medical Chemistry Department, Dean of the Faculty of Medicine and Rector of the Czech University in Prague; Member of the Sanitary Council of the Czech Kingdom; Member of the Highest Health Council of the Austro-Hungarian Empire in Vienna; Member of the Technical Investigation Council in Vienna; life Member of the House of Lords of the Austrian Parliament; the 1st Minister of Health of the Austro-Hungarian Empire; Rector of the Free Ukrainian University in Prague; Member of the Ukrainian Academy of Sciences; Full and Honorary Member of the T. Shevchenko Scientific Society, Ukraine.
Dr.Ivan Horbachevsky (Horbaczewski) was one of the most famous scientists of his time in the field of chemical organic synthesis. His investigations were a revolution in medical, organic and biological chemistry.

ACTIVITIES OF DR. I. HORBACHEVSKY IN AUSTRIA
In 1877 Ivan Horbachevsky, a young graduate of the University of Vienna and Doctor of Medical Sciences, was appointed assistant at the Institute of Medicinal Chemistry of the University of Vienna. In 1882 he became the first scientist in the world to synthesize uric acid from urea and the amino acid glycine. This discovery brought great renown to Austrian science and to the University of Vienna.
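In modern notation the overall transformation may be summarized by the following balanced equation (a schematic reconstruction given here for illustration, not Horbachevsky’s original formulation):

C2H5NO2 (glycine) + 3 CH4N2O (urea) → C5H4N4O3 (uric acid) + 3 NH3 (ammonia) + 2 H2O (water)

In practice the synthesis was achieved by the careful melting of a mixture of the two reagents, as the paper mentioned below describes.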
In his works Dr. Ivan Horbachevsky explored the causes and pathogenesis of gout and the mechanisms of catabolism of mononucleotides, the constituents of nucleic acids.
His hypotheses as to the nature and causes of pellagra were confirmed by the next generation of scientists and provided the groundwork for developing a rational system of human nutrition.
Another of his achievements was determining the origin of uric acid in the organism.
His works on the conversion of nucleic acids into end products are highly regarded from the point of view of the regulation of nucleic acid synthesis and decomposition, which shaped our ideas about life at the molecular level. Owing to his great managerial and leadership skills, Dr. Horbachevsky was offered a position in the Highest Sanitary Council in Vienna and later became its President.
As one of the most outstanding scientific and public figures of his time, Dr. Horbachevsky was appointed the first Minister of Health in 1918, thus becoming the founder of the Ministry of Health in Austria, the first ministry of health in the world.
For his outstanding achievements in chemical and medical science and public health, Dr. Horbachevsky was elected as a life member of the House of Lords of the Austrian Parliament and an advisor to the Austrian royal court.
In 1884 Dr. Horbachevsky became the first ever professor of medical chemistry at the Czech University in Prague. Though still very young, he had earned a scientific reputation with his paper on the preparation of uric acid by the careful melting of a mixture of glycine and urea, published in German in just 40 lines (it was the young author’s third publication).

3. THE IMPACT OF DR. HORBACHEVSKY ON THE DEVELOPMENT OF UKRAINIAN SCIENCE AND CULTURE
He contributed greatly to establishing the Ukrainian University in Lviv and was elected a member of the T. Shevchenko Scientific Society in Lviv.
The brilliant scientist was one of the founders and later the first President of the Ukrainian Medical Association. In 1926 and 1932 he organized the 1st and 2nd Ukrainian Scientific Congresses. His successful activities raised Ukrainian science to the world level.
Dr. Horbachevsky made great efforts to create a national medical school. His leadership qualities and persistent work contributed greatly to establishing the Ukrainian Free University in Prague.