The Articles of Union that bound Yale College and the Connecticut Medical Society together in the creation of the Medical Institution of Yale College were signed in 1810, but preparations delayed the school’s opening until November 1813. The first medical school in Connecticut and the sixth in the United States, the Medical Institution initially drew students primarily from Connecticut and the New England region. In the school’s earliest years, a faculty of five taught just five courses: Theory and Practice of Medicine; Surgery and Midwifery; Anatomy; Chemistry, Pharmacy, and Mineralogy; and Materia Medica (which mainly covered the use and preparation of medicinal plants). For those who wished to practice medicine, the school offered a level of formal training that went beyond the traditional apprenticeship system, the most common form of medical training in America in the 18th and early 19th centuries. The Medical Institution also provided an educational solution for those unable or unwilling to train abroad or at one of the handful of other medical schools in the United States, the oldest of which were located in Philadelphia and New York.

1836–1860
A fledgling medical school gains a surer footing

The Medical Institution had many successes in its first decades, but by mid-century the increasing professionalization of medicine forced attendant changes in medical education. Yale, like many other American medical schools, struggled to make its curriculum requirements more stringent while maintaining student enrollment. The school continued to be run jointly by the Connecticut Medical Society (which had the deciding vote in matters of governance) and by Yale faculty (which included professors of both Yale College and the Medical Institution). In 1845, the election of Charles Hooker, M.D., as dean of the Medical Institution marked the first appointment of a dean at any of Yale’s graduate schools. Cortlandt Van Rensselaer Creed, M.D., became the first African American to earn a degree from the Medical Institution when he graduated in 1857. In 1839, the school began to require a thesis for the M.D. degree, a requirement that is still in place today. (The oldest extant bound thesis written by a Yale medical student, entitled De Calculo Vesicae, focuses on bladder stones and dates to 1837.)

1861–1885
The Civil War, and a new ideal in American
medical education

During the Civil War, the staff of the Knight U.S. Army General Hospital in New Haven, under the direction of Pliny Jewett, M.D., an 1840 graduate of the Medical Institution, provided care for more than 25,000 wounded Union soldiers. The period following the war proved difficult for the Medical Institution, due not only to the costs exacted by America’s bloody internecine struggle, but also to the conditions arising from the changing landscape of American medicine. The school’s continued improvement of educational standards and intense competition with the growing number of medical schools in other states drove student enrollments to the lowest point in its history. Debt mounted and financial problems were nearly constant. A lack of support from Yale College—both financial and institutional—seemed to augur oblivion for the medical school. But this period also marked the first steps toward a new ideal in American medical education—an ideal that rejected the old system of apprenticeship, embraced science rooted in the experimental method, and affirmed the importance of scientific research in medicine.

1886–1910
Major advances with the dawn
of a new century

In 1886, Herbert E. Smith, M.D., began his second year as dean of the medical school (by then known as the Medical Department of Yale College). It was not an auspicious time to lead the school: Yale’s Medical Department had hit bottom in both student enrollment and financial resources. Having studied at the University of Heidelberg, Smith was a proponent of the German approach to medical education, with its heavy emphasis on research. During Smith’s years as dean (1885–1910), professor Arthur W. Wright, Ph.D., who in 1861 had been one of three Yale students to receive the first Ph.D. degrees conferred in the United States, published the first X-ray image in America. The Medical Department made educational requirements for admission more stringent, lengthened the course of study, and expanded the curriculum to more closely model the German example by emphasizing research and clinical instruction. With clinical education an increasingly important feature of medical education, Smith began the process by which the medical school became intimately connected with the New Haven Hospital (now Yale-New Haven Hospital), as it remains today.

1911–1935
The Flexner Report and the debut of
the ‘Yale System’

The 1910 Flexner report, an unsparing assessment of medical education in America, caused upheaval in medical schools, forcing them to adopt the report’s recommendations or die: by 1920, nearly half of the 155 schools in North America were gone. At Yale, the report was instead the prelude to a vast transformation. The University made a financial commitment to its medical school unprecedented in its 100-year history. Under the visionary leadership of Dean Milton C. Winternitz, M.D., the School of Medicine refashioned itself and rose to national prominence. Winternitz outlined the school’s modern footprint and instituted the “Yale System” of medical education, which prizes students’ independence and their original research. A symbol of the new optimism, Sterling Hall of Medicine was dedicated on February 23, 1925. Key funds from the Rockefeller Foundation remade the school’s clinical departments, making Yale one of only a few medical schools at the time to adopt the “full-time” system, in which faculty received salaries to support themselves without relying on income from private practices.

1936–1960
Wartime spurs a national commitment to science

The Yale Poliomyelitis Study Unit (YPSU), formed in 1931, took a community-based approach to unraveling the causes of polio, then an epidemic disease. The YPSU’s John R. Paul, M.D., and James D. Trask, M.D., were the first in several decades to isolate poliovirus from living patients, which opened a new stage in polio research. Another YPSU member, Dorothy M. Horstmann, M.D., made the important discovery that the virus is present in the blood in the disease’s early stages, thereby enabling researchers to develop a vaccine for the disease. With America’s entry into World War II, President Franklin D. Roosevelt instituted the Office of Scientific Research and Development (OSRD) to harness research in support of the Allied effort. With OSRD support, Louis S. Goodman, M.A., M.D., and Alfred Gilman, Ph.D., studied chemical warfare agents and serendipitously discovered that nitrogen mustards used in chemical warfare were remarkably good at killing certain cancerous tumors. In 1942, this work led to the first intravenous chemotherapy treatment of a cancer patient, marking the birth of medical oncology.

1961–1985
The birth of Medicare, and the rise
of molecular biology

Following World War II, federal funding for biomedical research exploded, and in 1965, the U.S. government’s Medicare program provided millions of Americans lacking health insurance with access to medical care. At the School of Medicine, a huge influx of grants from the National Institutes of Health, combined with a significant increase in clinical income, drove a massive expansion in which existing departments grew and new ones were formed. After Watson and Crick discovered the structure of DNA in 1953, medical research was rapidly and utterly transformed by molecular biology, which offered powerful new tools to identify cellular mechanisms at work in health and disease. In 1979, Joan A. Steitz, Ph.D., discovered snRNPs (“snurps”), RNA–protein complexes in the cell’s nucleus that perform a crucial step in the transfer of DNA information into messenger RNA (mRNA). Besides illuminating how mRNA is spliced together before being translated into proteins, Steitz’s research on snRNPs has thrown new light on autoimmune diseases, and has helped to clarify how splicing lends extra versatility to genes, a process that is essential in the immune system.

1986–2010
Laying the groundwork for the medicine
of tomorrow

The last 20 years have seen breathtaking advances in molecular biology and genetics—most notably the publication of the complete sequence of the human genome in 2001—achievements that promise to lead to important insights into human disease and new, targeted therapies. Today, the ability to quickly and inexpensively sequence complete human genomes heralds the dawn of a long-awaited “personalized” approach to medicine, in which a patient’s genetic makeup helps to determine optimal treatment strategies. Among the important recent discoveries of School of Medicine faculty is the 1997 publication by Arthur L. Horwich, M.D., and colleagues of the atomic structure of a molecular protein-folding machine that is essential to normal cell function. Faulty protein folding is a feature of neurodegenerative diseases such as Alzheimer’s disease. In 2007, Yale University acquired the 136-acre West Campus. With 20 buildings and over 1.5 million square feet of space, nearly a third of which is devoted to laboratories, West Campus will be home to five new scientific institutes, and state-of-the-art facilities for genomics, gene expression analysis, and drug discovery.