In the early years of the AIDS crisis, when fear and misunderstanding accompanied any mention of the disease, schools across the nation faced a decision: whether to allow students diagnosed with AIDS to attend classes. In October 1985, a New York school district barred children from attending classes after officials learned that their mothers’ boyfriends had been diagnosed with the disease. When a different New York district admitted a student with AIDS around that same time, attendance dropped by 25%, despite the fact that the specific school the child was attending was kept confidential. In Swansea, Massachusetts, school officials decided to “do the right thing” by admitting a teenager living with AIDS—only two families decided to keep their children from school after the decision. A year earlier, in late 1984, a Dade County, Florida school admitted triplets who had been diagnosed with AIDS, but kept the siblings isolated from the rest of the students.
While new controversies sprang up around the nation, one school in Central Indiana shot to the forefront of the debate in the summer of 1985. Ryan White, a 7th grade student in Howard County, was diagnosed with AIDS in December 1984 after contracting the disease from a contaminated hemophilia treatment. For several months, he was too ill to return to school, but in the spring of 1985 he began voicing his desire to return to his normal life by resuming classes at Western Middle School. When his mother approached school officials about this possibility, she was met with resistance. Concerns about the health of other students, and that of Ryan himself, whose immune system had been ravaged by his illness, gave officials pause. In one of the earliest news articles about the issue, Western School Superintendent J.O. Smith asked:
You tell me. What would you do? . . . I don’t know. We’ve asked the State Board of Health. We’re expecting something from them. But nobody has anything to go by. Everybody wanted to know what they’re doing in other places. But we don’t have any precedent for this.
He was right. While a few schools had faced similar situations, the issues surrounding a child with AIDS attending school, namely the risk this posed to other students, were far from settled. At the time, new and conflicting information emerged at a dizzying pace. Most reports held that AIDS was not transmissible through casual contact, but others implied that transmission through saliva could not be ruled out, which would have made the disease a far greater threat. With so much information—and misinformation—in the news cycle, the desire to hear from health authorities on the topic was understandable.
Three months later, the Board of Health released a document containing detailed guidelines for children with AIDS attending school:
AIDS/ARC children should be allowed to attend school as long as they behave acceptably . . . and have no uncoverable sores or skin eruptions. Routine and standard procedures should be used to clean up after a child has an accident or injury at school.
Despite this recommendation, Western School Corporation officials continued to deny Ryan admittance to class. Instead, they set up a remote learning system. From the confines of his bedroom, Ryan dialed in to his classes via telephone and listened to his teachers lecture. He missed out on visual aids, class participation, and sometimes the lectures themselves, as the line was often garbled or disconnected.
A November ruling, this time by the Department of Education, confirmed the Board of Health’s assertion that Ryan should be admitted to class:
The child is to be admitted to the regular classrooms of the school at such times as the child’s health allows in accordance with the Indiana State Board of Health guidelines.
Ryan returned to school for one day before the school filed an appeal and he was once again removed from class. A series of rulings, appeals, and other legal filings followed, ultimately ending when the Indiana Court of Appeals declined to hear further arguments and Ryan finally got what he and his family had fought so hard for—returning to classes for good. However, upon his August 25, 1986 return, Ryan faced intense discrimination from classmates and other community members. Addressing the Presidential Commission on the HIV Epidemic in 1988, Ryan recalled some of the more poignant moments from his time in Kokomo:
Some restaurants threw away my dishes, my school locker was vandalized inside and folders were marked ‘fag’ and other obscenities. I was labeled a troublemaker, my mom an unfit mother, and I was not welcome anywhere. People would get up and leave so they would not have to sit anywhere near me. Even at church, people would not shake my hand.
Because of these negative hometown experiences and his desire to evade oppressive media coverage, Ryan asked his mother if they could move out of Howard County. When the family decided to settle in Cicero, they couldn’t have known how drastically different their lives were about to become.
Tony Cook, who was the Hamilton Heights High School principal in the 1980s and is now a State Representative, heard through informal channels that Ryan’s family was moving into his school district in April 1987. The degree of media coverage surrounding Ryan’s battle to attend classes meant that Cook was well aware that his community’s reaction to the White family’s arrival would be heavily scrutinized. Thus, he set out on an AIDS educational crusade the likes of which had not been seen before.
With the backing of his superintendent and school board, Cook quickly decided that not only would Ryan be admitted to the school, but no restrictions would be placed on what Ryan could do there (while at Western Middle School, Ryan had not been allowed to attend gym class, had used a separate restroom, and had eaten off disposable trays with plastic utensils). After gathering AIDS-related materials from the Indiana State Board of Health, the Centers for Disease Control, major newspapers, and scientific journals, Cook turned what was supposed to be his summer break into a months-long educational campaign.
Throughout the following months, Cook spoke about AIDS at Kiwanis groups, Rotary Clubs, churches, and to any group that asked. He sat in living rooms and at kitchen tables throughout the community, personally addressing the concerns of fellow citizens. The school developed a collection of AIDS education materials that students could check out. Cook contacted members of the student government to ask them to act as student ambassadors, advocating on Ryan’s behalf with their fellow students and the media. The school staff went through additional training to prepare them for the possibility of a blood or other biohazard spill. By the time the school year came around, Cicero, Arcadia, and the surrounding area had some of the best-informed populations when it came to AIDS.
The first few days of the 1987-1988 school year at Hamilton Heights High School were peppered with convocations in which Cook addressed each grade level to assuage any remaining concerns over sharing classrooms and hallways with Ryan. Students were encouraged to ask questions, and support was provided for any who felt uncomfortable with the situation. The administration also offered to change class schedules to avoid conflicts.
On Ryan’s first day of class, which was a week after school started, the campaign seemed to have been successful. As the press surrounded him on his way out, he smiled and said, “It went really great—really. Everybody was real nice and friendly.” Later, when speaking in front of the Presidential Commission on the HIV Epidemic, Ryan attributed his positive experiences at Hamilton Heights directly to the education campaign:
I am a normal, happy teenager again . . . I’m just one of the kids, and all because the students at Hamilton Heights High School listened to the facts, educated their parents and themselves, and believed in me . . . Hamilton Heights High School is proof that AIDS education in schools works.
When reflecting on the experience in a recent interview, Representative Cook spoke to the power of education to overcome even the most intense fear: “Yes, there were some folks that were uneasy and nervous, but we did see education overcome. And we saw a community that . . . trusted us.” One obstacle Ryan and the school faced was the sheer amount of publicity surrounding his move to Hamilton County. Hamilton Heights High School was an open campus, with students traveling between three different buildings throughout the day, which would have made having members of the media on campus both distracting and potentially dangerous. But restricting access altogether also wasn’t possible, as Ryan was a nationally known figure by this time. The compromise was to hold weekly press conferences during which Ryan, student ambassadors, and faculty could answer questions and update the press on the goings-on at the school, a practice that persisted throughout Ryan’s first full semester at Hamilton Heights.
After that first semester, the media began to lose interest in the story as it became increasingly apparent that a mass walk-out or other dramatic event would not take place. The first time Tony Cook met Ryan, Cook asked why Ryan wanted so badly to attend school. During our interview with Representative Cook, he recalled that the fifteen-year-old Ryan, who by that time had been in the middle of a media storm for nearly two years, replied, “I just want to be a normal kid . . . I may die. So, for me, it’s important that I try to experience the high school experience as well as I can.” At Hamilton Heights High School, Ryan was able to do just that.
In the years following Ryan’s acceptance into Hamilton Heights High School, Ryan, Tony Cook, and others who had been involved in the educational program traveled around the country advocating for increased AIDS education. By August 1988, just one year after Ryan’s first day at Hamilton Heights, the Children’s Museum of Indianapolis began developing an exhibit centering on the issue:
While Ryan White zips around the country speaking out for AIDS education, the students of Hamilton Heights High School are telling children visiting The Children’s Museum in Indianapolis what it was like accepting Ryan into school . . . ‘I think everyone was uneasy at first,’ said one student on the videotape about Ryan’s coming to the school. ‘Education eased a lot of people’s minds,’ said another student.
Ryan White died on April 15, 1990, after being admitted to Riley Hospital for Children with a respiratory tract infection. In 2001, Ryan’s mother, Jeanne, donated the contents of his bedroom to the Children’s Museum of Indianapolis, where they have been painstakingly recreated as part of the “Power of Children” exhibit. The museum also houses thousands of letters written to Ryan and his family throughout his illness, which visitors can read and even help transcribe online.
If you scour Scott’s Official History of the American Negro in the World War, On the Trail of the Buffalo Soldier, The Encyclopedia of African American Military History, The African American Encyclopedia, and the Who’s Who of the Colored Race, Dr. Joseph Ward’s name is nowhere to be found. This is a concerning omission, given that his leadership at Tuskegee, Alabama’s Veterans Hospital No. 91 helped prove to some white Jim Crow Southerners, medical practitioners, U.S. military officials, and even President Calvin Coolidge that African Americans were fit to manage large institutions. His significance is two-fold: in an era when African Americans were often excluded from medical treatment, Ward made care accessible to those in Indianapolis and, on a much larger scale, to Southern veterans.
Born in Wilson, North Carolina to Mittie Ward and Napoleon Hagans, Joseph traveled as a young man to Indianapolis in search of better opportunities. In the Circle City, he attended Shortridge High School and worked as the personal driver of white physician George Hasty. According to the African American newspaper The Freeman, Dr. Hasty “‘said there was something unusual in the green looking country boy, and to the delight of Joe as he called him, he offered to send him to school.'” By the 1890s, Ward had earned his degree from Indiana Medical College and practiced medicine in his adopted city. In 1899, The Freeman remarked “The fact that he has risen from the bottom of poverty, th[r]ough honorable poverty, without any assistance, is sufficient evidence to justify our belief in his success in the future.”
Barred from treating Black patients in city hospitals due to institutionalized discrimination, he opened Ward’s Sanitarium and Nurses’ Training School on Indiana Avenue around 1907, which soon garnered the praise of white physicians. He also convinced administrators at the segregated City Hospital to allow Ward’s Black nursing students to attend courses. By enabling them to pass the same state licensing test as white students, he opened professional opportunities to African American women in an era in which they were often relegated to domestic service and manual labor.
Dr. Ward became as foundational to Indianapolis’s rich Black history as The Freeman publisher Dr. George Knox and entrepreneur Madam C.J. Walker, whom Ward helped get her professional start. He gave back to his city by helping found the African American Senate Avenue YMCA. During World War I, Ward temporarily left his practice to serve in France with the 92nd Division Medical Corps, where he worked as ward surgeon of Base Hospital No. 49. Again, his diligence propelled him to excellence, and he became one of only two African Americans to achieve the rank of Major in World War I. In 1924, Dr. Ward’s name was etched into the annals of history when he became the first African American commander of the segregated Veterans Hospital No. 91 at Tuskegee, Alabama. Ward’s decision to accept the position was itself an act of bravery, coming in the face of hostility from white residents, politicians, and the Ku Klux Klan.
Initially, the Veterans Bureau placed the new hospital under the control of a white staff, despite having promised Black personnel they would manage it. After seemingly talking out of both sides of their mouths, Bureau officials bowed to the unrelenting protest of African Americans across the country and gradually began replacing white staff with Black staff. This decision essentially pulled the pin from a grenade. Vanessa Northington Gamble contended in Making A Place for Ourselves: The Black Hospital Movement, 1920-1945 that “White Tuskegeeans saw the fight over the hospital as a ‘test of the supremacy of the Anglo-Saxon race’ and were prepared to win the battle by any means necessary.” When African American bookkeeper John C. Calhoun arrived at the hospital to replace his white predecessor, he was handed a letter that warned:
WE UNDERSTAND YOU ARE REPORTING TO HOSPITAL TO ACCEPT DISBURSING OFFICERS JOB, IF YOU VALUE YOUR WELFARE DO NOT TAKE THIS JOB BUT LEAVE AT ONCE FOR PARTS FROM WHENCE YOU CAME OR SUFFER THE CONSEQUENCES, KKK.
He took heed, and an hour after Calhoun fled, approximately 50,000 Klan members marched on Tuskegee and burned a forty-foot cross, before silently marching near the veterans’ hospital. Although violence was avoided, one “fair-skinned” man reportedly “infiltrated the Klan by passing as white” and learned they planned to kill a Black leader and blow up the Tuskegee Institute. The community at large expressed their disapproval of Black leadership by protesting at the White House. Southern politicians did so by writing pieces for the local papers, like State Senator R. H. Powell, who insisted in The Montgomery Advertiser “We know that a bunch of negro officers, with uniforms and big salaries and the protection of Uncle Sam . . . will quickly turn this little town into a place of riot such as has been experienced in so many places where there has occurred an outbreak between the races.”
But President Calvin Coolidge’s Republican administration stood up to the Klan and continued to replace white staff with Black personnel. In a nod to the Confederacy’s defeat in the Civil War, The Buffalo American wrote that the Klan’s demonstration “proved to be another ‘lost cause’ and Negro workers continued to arrive.” With Dr. Ward’s appointment, the hospital’s staff was composed entirely of Black personnel. The hospital’s pioneering practitioners treated Southern Black veterans, many of whom suffered from PTSD following WWI service. Under Ward’s leadership, the Buffalo American reported, patients “are happy, content and enjoying the best of care at the hands of members of their own race who are inheritently [sic] interested in their welfare.” The Montgomery Advertiser noted in 1935 that No. 91 was among the largest veterans hospitals in the country, offering 1,136 beds and carrying a monthly wait list of about 375 patients. In addition to neuropsychiatric treatment, the hospital’s library hosted a bibliotherapy program, and patients could view moving pictures and attend dances. The sprawling complex also provided job opportunities for Black laborers, waiters, stenographers, plumbers, and electricians.
In describing his leadership, Ward’s colleagues recalled that his purpose was firm, demeanor alert, and interactions with subordinates fair. Ward reportedly “amassed an enviable reputation in the Tuskegee community. His legendary inspection tours on horseback and his manly fearlessness in dealing with community groups at a time when there was a fixed subordinate attitude in Negro-white relations are two of the more popular recollections.” He proved so adept as a leader that the War Department promoted him to Lieutenant Colonel. A 1929 editorial for the Journal of the National Medical Association praised Ward for his ability “to win over to your cause the White South.” The author added that Ward “has served as an inspiration to the members of the staff of the hospital. He has stimulated original observation and contributions” and noted “‘Those who led the opposition to the organization of a Negro personnel openly and frankly acknowledge their mistake and their regret for the earlier unfortunate occurrences.'”
President Coolidge affirmed these characterizations in an address to Congress. Howard University conferred an honorary Master of Arts degree upon Ward for honoring his profession “under pioneer conditions of extraordinary difficulty.” The accolades go on. Ward met this praise with characteristic humility, stating in The Buffalo American on October 30, 1924, “‘My associates have worked as though they realized that not only them personally, but the entire group was on trial and whatever success we have had was due to that spirit.’”
Years after Ward’s appointment, racial tension had not entirely dissipated. In 1936, a federal grand jury charged Ward and thirteen others on the hospital’s staff with “conspiracy to defraud the Government through diversion of hospital supplies.” After more than eleven years of service, the esteemed leader was dismissed “under a cloud,” and he pleaded guilty to the charges in 1937. Black newspapers provided a different perspective on Ward’s rapid fall from grace. According to The New York Age, Black Republicans viewed the “wholesale indictment of the Negro personnel” at Veterans Hospital No. 91 as an attempt by Southern Democrats to replace Black staff with white and “rob Negroes of lucrative jobs.” The paper added that these Southern Democrats tried to “take advantage of the administration of their own party in Washington and oust colored executives on charges they would not have dared to file under a Republican regime.” These Black employees, the paper alleged, became the “hapless victims of dirty politics.” Given the white community’s previous attempts to usurp control of the veterans hospital, one is tempted to see truth in this interpretation. After his dismissal, Ward quietly returned home to Indianapolis and resumed his private practice, which had moved to Boulevard Place. He practiced there until at least 1949 and died in Indianapolis in 1956.
The struggle for leadership of the new veterans hospital shifted the threat of African American autonomy from theoretical to real for the white Jim Crow South. It exposed the white community’s capacity to organize against the possibility of that autonomy, as well as the Black community’s determination to demand its own governance, efforts Dr. Ward ensured were not made in vain. The young man who journeyed out of the South in search of better opportunities later returned to create them for others. Yet somehow his efforts are virtually absent from the historical record. With the help of doctoral student Leon Bates, IHB is changing that this summer by commemorating Lt. Col. Joseph H. Ward with a historical marker.
 “Dr. Joseph H. Ward,” The Freeman: An Illustrated Colored Newspaper (Indianapolis), July 22, 1899, 1, accessed Google News.
“Maj. Ward Back from U.S. Work,” The Indianapolis Star, June 29, 1919, accessed Newspapers.com; “Dr. Joseph H. Ward,” The Freeman, July 22, 1899, 1.
 Gamble, 90.
 Quotation from Gamble, 92.
 “Making Good at ‘The Tuskegee’ United States Veterans’ Hospital, No. 91,” The Buffalo (New York) American, 6, accessed Newspapers.com.
Dr. Clifton O. Dummett and Eugene H. Dibble, “Historical Notes on the Tuskegee Veterans Hospital,” Journal of the National Medical Association 54, no. 2 (March 1962), 135.
 Editorial, “The U.S. Veterans’ Hospital, Tuskegee, Ala., Colonel Joseph Henry Ward,” Journal of the National Medical Association 21, no. 2 (1929): 65-66.
“Dr. Dibble Succeeds Col. Ward as Head of Tuskegee Hospital,” The Pittsburgh Courier, accessed Newspapers.com; “Colonel Indicted in Food Stealing,” The Montgomery Advertiser, July 10, 1936, accessed Newspapers.com; “Two Plead Guilty in Hospital Case,” The Montgomery Advertiser, March 25, 1936, accessed Newspapers.com.
 “Charge Southern Democrats Seek Control of Veterans Hospital at Tuskegee, As 9 Others Are Indicted,” The New York Age, October 3, 1936, accessed Newspapers.com.
As a researcher, few things are more disheartening than coming across that blemish on an otherwise inspiring legacy. But it happens more often than not in the messiness of human history. Events and actors often occupy an ambiguous position between right and wrong, progressive and stagnant, heroic and indifferent. We wish the loose ends of the stories could be tied up into one neat moral bow, but often it’s more complex. In wrestling with this phenomenon, I concluded two things: that context is everything and that we must remember that the historical figures we idolize—and sometimes demonize—were, in fact, evolving humans. The visionary and controversial leadership of Indianapolis Rev. Oscar McCulloch and Gary, Indiana Rep. Katie Hall inspired these conclusions.
In the early 20th century, Oscar McCulloch’s misguided attempt to ease societal ills was used to strip Americans of their reproductive rights. Born in Fremont, Ohio in 1843, McCulloch studied at the Chicago Theological Seminary before assuming a pastorship at a church in Sheboygan, Wisconsin. He moved to Indianapolis in 1877 to serve as pastor of Plymouth Congregational Church, situated on Monument Circle. On the heels of the economic depression triggered by the Panic of 1873, he implemented his Social Gospel mission, seeking to ease financial hardship by applying the biblical principles of generosity and altruism. To the capital city, Brent Ruswick stated in his Indiana Magazine of History article, McCulloch “brought a blend of social and theological liberalism and scientific enthusiasm to his work in Indianapolis.” He also brought a deep sense of empathy for the impoverished and soon founded and coordinated charitable institutions in the city, like the Indianapolis Benevolent Society and the Flower Mission Society.
In 1878, McCulloch encountered the Ishmael family, living in abject poverty. He described them in his diary:
composed of a man, half-blind, a woman, and two children, the woman’s sister and child, the man’s mother, blind, all in one room six feet square. . . . When found they had no coal, no food. Dirty, filthy because of no fire, no soap, no towels.
Disturbed by the encounter, McCulloch headed to the township trustee’s office to research the Indianapolis family, who lived on land known as “Dumptown” along the White River, as well as in predominantly African American areas like Indiana Avenue, Possum Hollow, Bucktown, and Sleigho. He discovered that generations of Ishmaels had depended upon public relief. According to Ruswick, McCulloch came to believe that the Ishmaels, “suffering from the full gamut of social dysfunctions,” were not “worthy people suffering ordinary poverty but paupers living wanton and debased lives.” Over the course of ten years, the pastor sought to discover why pauperism reoccurred generationally, examining 1,789 ancestors of the Ishmaels, beginning with their 1840 arrival in Indiana.
The blemish. McCulloch’s nationally renowned 1888 “Tribe of Ishmael: A Study in Social Degradation” concluded that heredity and environment were responsible for social dependence. He noted that the Ishmaels “so intermarried with others as to form a pauper ganglion of several hundreds,” that they were comprised of “murderers, a large number of illegitimacies and of prostitutes. They are generally diseased. The children die young.” In order to survive, the Ishmaels stole, begged, “gypsied” East and West, and relied on aid from almshouses, the Woman’s Reformatory, House of Refuge and the township. Assistance, he reasoned, only encouraged paupers like the Ishmaels to remain idle, to wander, and to propagate “similarly disposed children.” In fact, those benevolent souls who gave to “begging children and women with baskets,” he alleged, had a “vast sin to answer for.” McCulloch’s sentiment echoes modern arguments about who is entitled to public assistance.
In addition to revoking aid, McCulloch believed the drain on private and public resources in future generations could be stymied by removing biologically-doomed children from the environment of poverty. Ruswick noted that McCulloch, in the era of Darwin’s Natural Selection, believed “pauperism was so strongly rooted in a person’s biology that it could not be cured, once activated” and that charities should work to prevent paupers from either having or raising children. This line of thought foreshadowed Indiana’s late-1890s sterilization efforts and 1907 Eugenics Law. The Charity Organization Society, consulting McCulloch‘s “scientific proof,” decided to remove children from families with a history of pauperism and vagrancy, essentially trampling on human rights for the perceived good of society.
But McCulloch had a change of heart. He began to rethink the causes of poverty, believing environmental and social factors were to blame rather than biological determinism. Ruswick notes that “Witnessing the rise of labor unrest in the mid-1880s, both within Indianapolis and nationwide, McCulloch began to issue calls for economic and social justice for all poor.” To the ire of many of his Indianapolis congregants, the pastor defended union demonstrations and pro-labor parties. He no longer traced poverty to heredity, but to an unjust socioeconomic system that locked generations in hardship. McCulloch believed that these hardships could be reversed through legislative reform and organized protest. To his dismay, McCulloch’s new ideology reportedly resulted in his church being “‘broken up.’”
In a nearly complete reversal of his stance on pauperism, McCulloch wrote a statement titled “The True Spirit of Charity Organization” in 1891, just prior to his death. He opined:
I see no terrible army of pauperism, but a sorrowful crowd of men, women and children. I propose to speak of the spirit of charity organization. It is not a war against anybody. . . . It is the spirit of love entertaining this world with the eye of pity and the voice of hope. . . . It is, then, simply a question of organization, of the best method for the restoration of every one.
But after McCulloch’s death, Arthur H. Estabrook, a biologist at the Carnegie Institution’s Eugenics Record Office, repurposed McCulloch’s social study (notably lacking scientific methodology) into the scientific basis for eugenics. Historian Elsa F. Kramer wrote that Estabrook revised McCulloch’s “casual observations of individual feeblemindedness” into support for reforms that “included the institutionalization of adult vagrants, the prevention of any possibility of their future reproduction, and the segregation of their existing children—all to protect the integrity of well-born society’s germ-plasm.” McCulloch had unwittingly provided a basis for preventing those with “inferior” genetics from having children in the name of improving the human race. Kramer notes that co-opting the Ishmael studies for this purpose reflected “the changing social context in which the notes were written.” In fact, Estabrook resumed the Ishmael studies in 1915 because “of their perceived value to eugenic arguments on racial integrity.”
McCulloch’s work influenced Charles B. Davenport’s report to the American Breeders Association and Dr. Harry C. Sharp’s “Indiana Plan,” an experimental program that utilized sterilization to curtail unwanted behaviors of imprisoned Indiana men. Sharp also promoted Indiana’s 1907 Eugenics Law, the first in the U.S., which authorized a forced sterilization program “to prevent procreation of confirmed criminals, idiots, imbeciles and rapists” in state institutions. Twelve states enacted similar laws by 1913 and approximately 2,500 Hoosiers were sterilized before the practice ceased in 1974. Even though McCulloch moved away from his problematic beliefs, for decades they were utilized to rob Americans of the ability to have a family. His legacy proved to be out of his hands.
The complexities of African American Rep. Katie Hall’s legacy could not be more different. In 1983, Rep. Hall built on a years-long struggle to create a federal holiday honoring the civil rights legacy of the late Dr. Martin Luther King, Jr. on his birthday. Each year since Dr. King’s assassination in 1968, U.S. Representative John Conyers had introduced a bill to make Dr. King’s January 15 birthday a national holiday. Many became involved in the growing push to commemorate Dr. King with a holiday, including musician Stevie Wonder and Coretta Scott King, Dr. King’s widow. But it was the Gary, Indiana leader who spent the summer of 1983 on the phone with legislators to whip votes and who successfully led several hearings called to measure Americans’ support of a holiday in memory of King’s legacy. Hall was quoted in the Indianapolis News about her motivation:
‘The time is before us to show what we believe— that justice and equality must continue to prevail, not only as individuals, but as the greatest nation in this world.’
Representative Hall knew the value of the Civil Rights Movement firsthand. Born in Mississippi in 1938, she was barred from voting there by Jim Crow laws. Seeking better opportunities, Hall moved her family to Gary in 1960. She trained as a school teacher at Indiana University and taught social studies in Gary public schools. As a politically engaged citizen, Hall campaigned to elect Gary’s first Black mayor, Richard Hatcher. In 1974, she won a seat in the Indiana House of Representatives; two years later, she was elected to the Indiana Senate. While in the Indiana General Assembly, Hall supported education measures, healthcare reform, labor interests, and protections for women, such as sponsoring a measure to “fund emergency hospital treatment for rape victims,” including those who could not afford to pay. She broke barriers again in 1982, when she became the first Black Hoosier elected to represent Indiana in Congress.
The blemish. In 1987, voters elected Hall Gary city clerk, and it was in this position that her career became mired in scandal. In 2001, suspended city clerk employees alleged that Hall and her daughter and chief deputy, Junifer Hall, pressured them to donate to Katie’s political campaign or face termination. Dionna Drinkard and Charmaine Singleton said they were suspended after not selling tickets at a fundraiser for Hall’s reelection campaign. Although the two were suspended, the Halls continued to list them as active employees, which meant Drinkard was unable to collect unemployment. After hearing testimony from employees who stated that the Halls forced them to sell candy and staff fundraisers to maintain employment, a federal grand jury indicted the Halls in U.S. District Court on racketeering and perjury counts, as well as more than a dozen other charges. Allegedly, the Halls added pressure by scheduling fundraisers just before pay day. Investigators discovered cases of ghost-employment, noting that employees listed on the office’s 2002 budget included a former intern who was killed in 1999, a student who had worked for the clerk part time one summer two years earlier, and Indiana’s Miss Perfect Teen, who was listed as a “maintenance man.”
According to the Munster Times, the Halls alleged their arrest was racially motivated and their lawyers (one of whom was Katie’s husband, John) claimed that “the Halls only did what white politicians have done for decades.” Josie Collins countered in an editorial for the Times that “if they do the crime, they should do the time. This is not an issue of racial discrimination. It is an issue of illegal use of the taxpayers’ money.” Whether or not the Halls’ allegation held water, it is clear from phone recordings between Junifer and an employee, as well as the “parade of employees past and present” who testified against the Halls, that they broke the law.
In 2003, the Halls pled guilty to a federal mail fraud charge stemming from the extortion of thousands of dollars from employees; in exchange, their other charges were dropped. They also admitted to providing Katie’s other daughter, Jacqueline, with an income and benefits, despite the fact that she did not actually work for the city clerk. The Halls immediately resigned from office. In 2004, they seemed to resist taking accountability for their criminal actions and filed a countersuit, in which they claimed that Gary Mayor Scott King and the Common Council refused to provide them with a competent lawyer regarding “the office’s operation.” The Munster Times noted, “The Halls said they wouldn’t have broken the law if the city of Gary had provided them sound advice.” Instead, they lost their jobs and claimed to suffer from “‘extreme mental stress, anxiety, depression, humiliation and embarrassment by the negative publication of over 500 news articles.'” For this, they asked the court to award them $21 million.
The City of Gary deemed the Halls’ Hail Mary pass “frivolous,” and a “‘form of harassment,'” arguing that “the Halls had no one to blame for their troubles but themselves.” The countersuit was dismissed. Junifer served a 16-month sentence at the Pekin Federal Correctional Institution in Pekin, Illinois. Katie Hall was placed on probation for five years. According to the Munster Times, one observer at her trial noted:
‘We are seeing the destruction of an icon.’
Thus ended Katie Hall’s illustrious political career, in which she worked so hard to break racial barriers and honor the legacy of Dr. Martin Luther King Jr. This leads to the perhaps unanswerable question: “Why?” Maybe in the early 2000s no one was immune from being swept into Gary’s notoriously corrupt political system. This system arose from the city’s segregated design, one which afforded white residents significantly more opportunities than Black residents. Possibly, the Halls sought to create their own advantages, at the expense of others. Either way, it is understandable that some Gary residents opposed the installation of a historical marker commemorating her life and work.
In many ways, McCulloch’s and Hall’s stories are not unique. It seems almost inevitable that over such prolific careers, one will make morally or ethically questionable decisions, or at least be accused of doing so. Take African American physician Dr. Joseph Ward, who established a sanitarium in Indianapolis to treat Black patients after being barred from practicing in City Hospital. He forged professional opportunities for aspiring African American nurses in an era when Black women were often relegated to domestic service and manual labor. In 1924, Dr. Ward became the first African American commander of the segregated Veterans Hospital No. 91 at Tuskegee, Alabama. With his appointment, the hospital’s staff was composed entirely of Black personnel. Ward’s decision to accept the position was itself an act of bravery, coming on the heels of hostility from white residents, politicians, and the Ku Klux Klan. The blemish. In 1937, before a federal grand jury, he pled guilty to “conspiracy to defraud the Government through diversion of hospital supplies.” The esteemed leader was dismissed “under a cloud” after over eleven years of service. However, African American newspapers attributed his fall from grace to political and racial factors. According to The New York Age, Black Republicans viewed the “wholesale indictment of the Negro personnel” at Veterans Hospital No. 91 as an attempt by Southern Democrats to replace Black staff with white, to “rob Negroes of lucrative jobs.” Again, context comes into play when making sense of blemishes.
If nothing else, these complex legacies are compelling and tell us something about the period in which the figures lived. Much like our favorite fictional characters—Walter White, Don Draper, Daenerys Targaryen—controversial figures like Katie Hall and Oscar McCulloch captivate us not because they were perfect or aspirational, but because they took risks and were complex, flawed, and impactful. They were human.
Humorist Will Cuppy’s witticisms tended toward, as his biographer Wes Gehring put it, “dark comedy that flirts with nihilism.” Cuppy’s The Decline and Fall of Practically Everybody, published posthumously in 1950, spent four months on the New York Times best-seller list and enjoyed eighteen reprints in hardback. Decline and Fall typified Cuppy’s life’s work, which satirized human nature and utilized footnotes to great comedic effect. He spent sixteen years researching the historic figures featured in Decline and Fall, but, after years of battling depression, passed away before its publication.
The Auburn, Indiana native spent a lot of time on his grandmother’s South Whitley farm. There, he developed a love of animals and a curiosity about life. According to an oft-cited anecdote, Cuppy found himself wondering whether fish think, and no one he knew was curious enough to wonder along with him. In search of more inquisitive conversationalists, Cuppy moved out of Indiana as soon as he could. Upon graduation from Auburn High School, Cuppy departed for the University of Chicago, where he would spend the next twelve years taking a wide array of courses. He completed his B.A. in philosophy and planned to get his Ph.D. in Elizabethan literature.
While at university, Cuppy worked for the school paper. As a result, the University of Chicago Press hired Cuppy to “create some old fraternity traditions for what was then a relatively new college” to give the school more of an old East Coast university feel. This assignment evolved into Cuppy’s first book, Maroon Tales, published in 1910. Eventually, Cuppy’s college friend Burton Rascoe invited him out to New York City, where Rascoe was an editor and literary critic for the New York Tribune. After agreeing to move to New York City, Cuppy decided to get his M.A. in literature and leave the University of Chicago rather than complete his Ph.D. He was ready to move on.
In 1921, Cuppy moved into a tarpaper and tin shack on Jones’ Island in New York. Suffering from hypersensitivity to sound, Cuppy wished to escape the noise of the city. He lived on the island year-round for eight years, with occasional visits to the city for supplies. The men of the Coast Guard station a few hundred feet down the beach befriended him, shared food with him, and even fixed his typewriter. Cuppy called his beach home Tottering-on-the-Brink, giving insight into his mental health. But despite his seclusion, Cuppy’s career progressed. By 1922, he was writing occasionally for the New York Tribune, and in 1926 he joined the staff there as a book reviewer (by which time the Tribune had become the New York Herald Tribune).
Then, in 1929, Cuppy had to leave his shack because New York designated the area to become a state park, although he received permission to visit his hermitage for irregular vacations. Cuppy moved to an apartment in Greenwich Village, but even after he left his residence at Jones’ Island he would sometimes be referred to—and refer to himself—as a hermit because he continued to maintain an isolated lifestyle. Predictably, Cuppy found it difficult to stand the noise of the humming city. He tended to sleep during the day and work during the night to minimize his exposure to the cacophonous sounds. When it all got to be too much, Cuppy would blow on a noisemaker as hard as he could out an open window.
Cuppy published a book about his experience living on Jones’ Island in 1929, How to Be a Hermit (Or A Bachelor Keeps House). The book was a best-seller—reprinted six times in six months—and put Cuppy on the map as a humorist and author. In traditional Cuppy fashion, he quipped “I hear there’s a movement among them [architects] to use my bungalow as a textbook example of what’s wrong with their business. The sooner the better—that will give the dome of St. Paul’s a rest.” And then there was this telling jest:
Coffee! With the first nip of the godlike brew I decide not to jump off the roof until things get worse—I’ll give them another week or so.
Cuppy followed up Hermit with How to Tell Your Friends from the Apes in 1931.
The 1930s were a busy time for Cuppy. In 1930, he tried to establish himself as a comic lecturer; however, after a brief stint of talks, it appeared the venture did not work out. A few years later Cuppy hosted a short-lived radio program on NBC called “Just Relax.” It proved too difficult to sustain a radio program with Cuppy’s singular brand of comedy and socially anxious tendencies—radio executives simply told him he wasn’t funny. Though his program didn’t last, Cuppy continued to appear in radio broadcasts sporadically through the years. He went on the radio to promote his next book, How to Become Extinct (1941).
Numerous reviews of mystery and crime novels had garnered Cuppy the distinction of being “America’s mystery story expert” as early as 1935. It was earned—in the course of his career Cuppy published around 4,000 book reviews. He privately admitted that his heart wasn’t in it and he’d never particularly enjoyed the mystery and detective genre, but reviewing these books in his New York Herald Tribune column “Mystery and Adventure” was Cuppy’s steadiest income stream over the years. Nevertheless, in the 1940s Cuppy used his genre expertise to edit three anthologies of mystery and crime fiction. His freelance writing also picked up in this decade. National publications like McCall’s Magazine, The New Yorker, College Humor, For Men, and The Saturday Evening Post printed Cuppy’s essays that would later be compiled in his books, like How to Attract the Wombat (1949).
In a reflection that brings to mind Hoosier novelist Kurt Vonnegut Jr., Cuppy was fond of saying:
I’m billed as a humorist, but of course I’m a tragedian at heart.
One gets the sense from reading Cuppy’s material that he used humor as a coping mechanism. Quoting Cuppy, Gehring wrote that the dark humorist “believed humor sprang from ‘rage, hay fever, overdue rent, and miscellaneous hell.’” You could say that, like his humor, Cuppy’s life was tragic. Though he had long suffered from depression, multiple sources noted Cuppy’s declining health in mid-1949. Then, threatened with eviction from his Greenwich Village apartment and reeling from the end of a decades-long friendship, Cuppy followed through on decades of casual talk about self-harm. He died by suicide on September 19, 1949. He was buried in Auburn, Indiana’s Evergreen Cemetery.
After Cuppy died, his editor, Fred Feldkamp, took on the task of assembling Cuppy’s numerous notes into The Decline and Fall of Practically Everybody. Cuppy took his research seriously, and this is where Cuppy’s extensive education shined through. He would spend months researching a single short essay, reading everything he could find on the topic and amassing sometimes hundreds of notecards on each subject. Having worked on Decline and Fall for a whopping sixteen years before his death, Cuppy had collected many boxes of notecards filled with research. Decline and Fall was an immediate success when it was published in 1950. Locally, the Indianapolis News named it one of the best humor books of the year, and listed it as the top best-seller in Indianapolis in non-fiction for the year. In 1951, Feldkamp used more of Cuppy’s notes to edit and publish How to Get from January to December; it was the final publication in Cuppy’s name.
Cuppy’s style was characterized by a satirical take on nature and historical figures. Footnotes were his comedic specialty—they were such a successful trademark that he was sometimes hired to add his touch of footnote flair to the works of fellow humorists. In Decline and Fall there is one footnote in particular which is emblematic of Cuppy’s unique dark humor: “It’s easy to see the faults in people, I know; and it’s harder to see the good. Especially when the good isn’t there.” Before the publication of Decline and Fall, Cuppy was frequently asked why he always wrote about animals—when would he write about people? But, of course, he had been lampooning humanity all along.
Perhaps it makes sense, then, that in the last decade of his life, Cuppy befriended William Steig, the man who would go on to create the character Shrek in his 1990 children’s book of the same name. A young cartoonist, Steig was hired to illustrate How to Become Extinct and Decline and Fall. Cuppy and Steig struck up an extensive correspondence, and Cuppy influenced Steig’s style. The notion of a humorous curmudgeon living in isolation and drawn out into the world by both necessity and outgoing friends strikes a familiar chord that echoes in Shrek.
Cuppy was a famous humorist in his time, and the acclaim of his better-known comedy contemporaries, like P. G. Wodehouse and James Thurber, certainly helped to heighten his renown. When Decline and Fall came out, a reviewer for the New York Times insisted that “certain people, at least, thought [Cuppy] among the funniest men writing in English.” Beyond his work as a humorist, Cuppy’s career as a literary critic had been impactful; the managing editor of the Detroit Free Press wrote that he had “given up reading whodunits” after Cuppy’s death because he didn’t trust any other critic to guide his mystery selections. The sadly serious humorist is less widely known today, but his quips seem more relevant than ever.
Be sure to see Will Cuppy’s state historical marker at the site of his childhood home in Auburn after it is unveiled in August.
Wes D. Gehring, Will Cuppy, American Satirist: A Biography (Jefferson, NC: McFarland & Company, 2013).
Norris W. Yates, “Will Cuppy: The Wise Fool as Pedant,” in The American Humorist: Conscience of the Twentieth Century (Ames: Iowa State University Press, 1964).
Al Castle, “Naturalist Humor in Will Cuppy’s How to Tell Your Friends from the Apes,” Studies in American Humor 2, no. 3 (1984): 330-336.
See Part I to learn about the unparalleled professional accomplishments of Dr. Helene Knabe.
Who entered Dr. Helene Knabe’s rooms at Indianapolis’s Delaware Flats and brutally cut her throat from ear to ear? The killer was skilled enough to cut her on one side first, missing her carotid artery and cutting deep enough to cause her to choke on her blood. The second cut just nicked the carotid artery and cut into the spine.
Officials followed a variety of leads regarding the gruesome crime. The first person on the list, suspiciously, was an African American janitor named Jefferson Haynes, who lived below her. Second on the list was a Greek prince who was seen mailing a letter near her apartment. This absurd line of inquiry was continued for months by the very people who should have been advancing the case. Police Chief Martin Hyland reasoned that she had committed suicide: at 5’6″ and 150 pounds, he believed she was strong enough to have warded off any attacker, and thus must have taken her own life.
Also problematic was the handling of evidence, which was left in a room where anyone could access it. Although fingerprinting was in its infancy, officials ignored a bloody fingerprint, despite Dr. Knabe having no blood on her hands. Police and some physicians believed despondency over her unproven sexual preference or financial situation caused her to take her own life. Even Detective William Burns, known as America’s Sherlock Holmes, publicly stated that, based solely on the evidence in the newspapers, he believed she killed herself.
Local, state, national, and even some international press ran stories about Dr. Knabe. Indianapolis newspapers were surprisingly fair in their coverage and published editorial and opinion pieces that were overwhelmingly complimentary of Dr. Knabe and her professional achievements. Although these newspapers interviewed people who believed Dr. Knabe got what she deserved, they did not give these sentiments undue attention or sensationalize them.
Thankfully, the coroner, Dr. Charles O. Durham, determined that Dr. Knabe was murdered. Dr. Durham noted she had defense wounds on her arms and he was adamant that she could not have made both cuts. He also noted several factors he considered “strongly presumptive of murder,” including the position of the hands, which had been closed after death; the absence of a plausible suicide weapon; and the fact that many witnesses had seen a man that night around the apartment building. Dr. Durham’s findings negated rumors regarding Dr. Knabe’s sexuality and finances, which police felt could have contributed to her death by her own hand.
In response to Dr. Durham’s findings, female doctors who were Dr. Knabe’s friends actively tried to help find her killer. They hired private investigator Detective Harry Webster, paying his fees out of their own pockets and through donations; Webster also contributed work at his own expense. Almost fifteen months after her death, two men were indicted by a grand jury based on Detective Webster’s findings. The prosecution believed that Dr. William B. Craig was engaged to Dr. Knabe, a fact he vehemently denied, and that he wanted out of the relationship. As Dean of Students, lecturer, and financial stakeholder in the Indiana Veterinary College, he would have been very familiar with zoology and the “sheep’s cut,” the type of cut reported to have killed her.
Dr. Craig met Dr. Knabe in 1905 and maintained, at the very least, a friendship with her. He recommended her for the position of Chair of Hematology and Parasitology at the veterinary college in 1909. Shortly before her death, Dr. Craig and Dr. Knabe seemed to be in the middle of an ongoing dispute. Dr. Knabe went to the IVC to see about changing her lecture time with Dr. Craig so that she could attend her course at the Normal College. When a colleague asked for his answer, Dr. Craig became enraged, said “Oh, f—! Tell her to go to hell!” and stormed out of the room. The night before Dr. Knabe died, Dr. Craig’s housekeeper overheard them arguing and heard Dr. Knabe say, “But you can continue to practice and so can I!” Police had in their possession a letter in which Dr. Knabe told a friend she was getting married, to a man who, she confided, had an “ungovernable temper.” At the time of her death, Dr. Knabe, an accomplished seamstress and dressmaker, had commissioned a costly dress, a further indication that she was preparing to marry.
The second man indicted, Alonzo M. Ragsdale, was an undertaker and Dr. Knabe’s business associate. Dr. Knabe often joked with Ragsdale that when she died, she would be sure to give him her business. And so she did: her cousin Augusta appointed Ragsdale undertaker and estate executor. He was accused of concealing evidence against Dr. Craig in the form of the kimono Dr. Knabe was wearing at the time of her death. It was said he had laundered it in an effort to rid it of blood stains.
In the words of Ms. Frances Lee Watson, Clinical Professor of Law at IUPUI, “She was screwed from day one.” Dr. Knabe was never treated as a victim; she was treated as a villain. Society in general could not understand a woman wanting to work in a field that was sometimes unpleasant and coarse. In the media and by some of her peers, Dr. Knabe was chastised for being assertive in her career and pursuing her dreams. Her character was summarily attacked because she expected equality with her peers, male or female. Because she was a 35-year-old woman and a physician living in a small apartment, rather than a grand home with a husband and children, Dr. Knabe was automatically judged unhappy. And because Alonzo Ragsdale, in addition to being indicted, proved an unscrupulous estate executor, the public believed her to be an unsuccessful, pauper physician.
The truth was Dr. Knabe had many revenue streams from jobs that she loved: practitioner, instructor, and artist. She planned to continue her work and make herself even more financially stable. By looking at her financial records, Dr. Charles Durham proved that she was financially sound, bringing in over $150 per month. The public did not know for many months that Dr. Knabe chose to send most of her disposable income back to her uncle because he was no longer able to work.
None of these facts mattered. The defense attacked Dr. Knabe’s personal character in the courtroom, claiming she was an aggressive and masculine woman. The character witnesses, who sought to discredit Dr. Craig, suddenly moved out of state or could not be found. A key witness who positively identified Dr. Craig changed his story, and Dr. Craig’s own housekeeper, who had signed an affidavit stating she saw him return late and leave early with a bundle of clothes the night Dr. Knabe died, refused to come to the courthouse.
Consequently, the state’s case fell apart and after nine days the prosecution could not make a connection between Dr. Craig and the evidence. In an unusual move, the judge stepped in as the thirteenth juror and instructed the jury to acquit Dr. Craig. Normally a judge provided this instruction only when a technical error was committed, which was not the situation in this trial. He did rule that the prosecution had proven Dr. Knabe had been murdered, but that they had no real evidence against Dr. Craig.
Because there was now nothing to be an accessory to, the charges against Ragsdale were dropped. No one was ever convicted of Dr. Knabe’s murder. Oddly enough, after the trial Ragsdale declared Dr. Knabe’s estate insolvent without collecting all debts. Many of her personal items did not sell and their whereabouts were undocumented. The probate records submitted over three years to the courts contained erroneous calculations that went unnoticed, and several hundred dollars were never reconciled.
Dr. Knabe was buried in an unmarked grave at Crown Hill. Over the years, newspapers have revisited her case, but in 1977 her case file was destroyed in a flood. Unfortunately, the sensationalizing of Dr. Knabe’s death has obscured her legacy as a tenacious, committed, and savvy physician in a field dominated by men.
The Steuben County Asylum near I-69 in northeastern Indiana represents two contrasting ideals of poverty care. On the one hand, this imposing building on the rural landscape embodied the modern ideal of an end to poverty through scientific principles. Although the U.S. industrial economy of the later 19th century was marked by frequent panics and recessions, a new poor care system held out the hope that all indigent persons could be retrained and readied to work in the modern industrial world. The new system would provide a safety net supporting people through the hard years and would help impoverished people develop improved habits in a healthy and orderly atmosphere. On the other hand, this building symbolized failure and loss of place in the community. To be a resident of this facility required separation from society and often induced a lifelong stigma of shame.
These institutions represented both a severe solution meant to frighten the “lazy” into working harder and a belief in a safety net to support those living on the margins. This was especially important in an era when layoffs were not supplemented by benefits like workers’ compensation. Across rural America, there was fear associated with the various names for asylums: almshouses, county farms and infirmaries, poor farms, county homes, workhouses, and “the pogey.”
Traditional poor relief (after private charities and local churches were exhausted) fell to local government. This was called “outdoor relief” because the poor or destitute were helped where they lived. To contain costs, the sheriff might “warn out” (or throw out) potential pauper residents to discourage poor people from staying there. Officials often employed this method to keep immigrants, especially the Irish, from settling in their town.
If a family could not care for an indigent resident, a landowner might take that person in on the lowest bid for room and board. By the 1820s, this informal arrangement was rapidly supplanted by an increasingly standardized system recognizing one place as a county poorhouse. The professionalization of these institutions focused on isolating each class of patient from what social reformers thought was the cause of their ailments or bad habits. The system was intended to instill a culture of order on the disorder of residents’ lives; this enforced order, reformers believed, would help cure the issues they faced. However, most residents used the farm only for periodic stays during times of unemployment and sickness.
In line with the rest of the nation, Indiana initiated its statewide system of county poor asylums. In 1821, the state legislature approved Indiana’s first poorhouse in Knox County. Following the national standards for poorhouse improvements, promoted in prescriptive literature, many counties built what were called “model homes” by the later nineteenth century. These were modern buildings constructed to meet the current standards of that time. These asylums even provided libraries for residents to use in preparation for a changed life outside the asylum.
Many rural almshouses were working farms, providing food for residents and a profit to the county government. The Democrat newspaper of Huntington County praised its superintendent in 1871 for keeping the farm as an “almost self-sustaining . . . charitable institution.” Efficiency and thrift were valued far higher than any other management trait. These practices led to abuses of a very vulnerable group in society. To create a more orderly life for their residents, almshouses increased the level of isolation and separation in the homes. This policy is reflected in the houses’ physical form as it changed during the 19th century.
The Democrat provided a brief glimpse into the Huntington County almshouse during February of 1871. The paper listed 18 assorted inmates, but ten or twelve more typically resided there during the year. Most residents were temporarily admitted during sicknesses and job slowdowns. Most poorhouses apparently hosted a few long-term residents, and sometimes children were born there too. The farm around the almshouse provided work for residents capable of manual labor. One resident at the Huntington Almshouse was the full-time farm hand. Others worked on the farm or in the almshouse kitchen.
Between 1830 and 1900, four stages in almshouse design demonstrated a stronger commitment to scientific poor care. The first stage involved converting a portion of a private house to accommodate paupers placed in the home owner’s care. The owners made no effort to separate the residents, who were assigned farm work, as able, to help earn their keep. Will Carleton’s famous 1872 poem “Over the Hill to the Poor House” was inspired by his experience at just such a home in Hillsdale, Michigan. A lack of family support, as well as old age, landed even elderly mothers in the poor house.
In the second stage, the county purchased a farm to use for the care of the poor. Additional buildings might be constructed as dormitories for the majority of the residents.
The next stage was the first real attempt at building a custom facility for poor care. The Steuben County Asylum, built in 1885, appears to match this third stage. The strong center area indicates there was a public entrance with rooms for the County Superintendent of the Poor. There is room enough to separate men from women and to create the ordered environment that could be both helpful and oppressive.
The fourth stage is the full-scale, scientifically approved poor house. As can be seen in the illustration above, this facility is a massive element in the landscape. The very obvious three-part construction is easy to recognize. Some of you will have seen buildings like this around rural Indiana. They seem out-of-place among local farms. They may be marked by a road name such as Asylum Road or County Farm Road. The well-used 1911 textbook The Almshouse: Construction and Management (written in Indiana) noted that asylums must be near the center of the region they serve, allow for complete segregation of the sexes, provide an abundance of sunlight and fresh air, and be designed for convenient access for administrators to the whole house.
They are designed to house men and women in completely separate wings with public space in a center section. Usually, the County Superintendent of the Poor lived in the upstairs of the center section. Larger homes had infirmaries for men and for women. This feature became more common in the early 20th century as the almshouse became more of an old age home rather than a place of refuge from destitution.
When I first started researching this theme, I interviewed staff at the Steuben County Asylum, which had been completely converted to a senior rest home. The problem was that many elderly residents refused to consider living there because of the memory of what that building had once meant. Even in the 1980s, seniors associated residency in the poorhouse with a loss of freedom and personal dignity. The company managing the care facility failed to grasp the hold the County Asylum still had on that generation’s memory. Ironically, the current generation of seniors (Baby Boomers) might laugh at residing in a former poorhouse, perhaps as a way of poking fun at their elders’ fears.
County poorhouses should remain a visual reminder of the hazards inherent in reform efforts. Even with good intentions, abuses of vulnerable people occurred. The poorhouse had little to regulate it except mixed national ideals and local attitudes. Torn between purposes of punishment and rescue, poorhouses failed to cure poverty. The complexity of poverty caused reformers and politicians endless pains. We might gain some comfort that citizens and politicians before us found poverty as difficult to manage as we do now.
 David Wagner, The Poorhouse: America’s Forgotten Institution, (Lanham MD: Rowman & Littlefield Pub., 2005), 19.
 Kayla Hassett, “The County Home in Indiana: A Forgotten Response to Poverty and Disability,” (Masters Thesis, Ball State University, May 2013), 13.
See report by Henry N. Sanborn, "Institution Libraries: The Outlook in Indiana," Forty-Third Annual Meeting, Conference of Charities and Correction (Indianapolis, IN, 1916), 367-371.
"The County Alms-House: Its General Condition; The Number and Character of its Inmates," The Democrat (Huntington, Indiana), February 2, 1871, accessed via www.poorhousestory.com.
A privately run research website titled Poorhouse Story provides images, primary sources, and readings on poorhouses and related agencies around the United States. It can be accessed at http://www.poorhousestory.com.
The black snake undulated between the two women, winding back and forth, circling overhead. A lascivious leer seemed to be affixed to the snake’s mouth as it weaved, moving the women closer, but then winding between and pulling them apart. Augusta Knabe could not bear to see this horrible apparition between them. She reached for her cousin.
Augusta lost her grip on Helene and sat up in bed, struggling to catch her breath. She pushed her sweat-drenched hair back and collected herself. What a horrible dream! Augusta felt guilty that she had not accepted her cousin's offer of tea the previous afternoon. She was sure the dream was her penance for wanting to avoid late-afternoon traffic and enjoy the comfort of her home after shopping. Augusta promised herself she would stop by Helene's flat after school the very next afternoon and take her to tea. Despite this promise, Augusta passed the rest of the night fitfully.
Augusta's cousin, Helene Elise Hermine Knabe, yearned to be a doctor. In Germany, women were not admitted to medical school until 1900, and in the German state of Prussia, where she lived, not until 1908. Her father, Otto Windschild, left her mother when Knabe was an infant, and after her mother died she was raised by her uncle. Given her humble upbringing, becoming a doctor seemed more a dream and less a reality with each passing year.
When Augusta informed Helene that women were allowed to attend medical school in America, Helene's life changed forever; she moved to Indianapolis in 1896. The motto she had heard most often growing up was "You cannot be a master in anything unless you know every detail of the work," and no one applied this maxim more faithfully than Knabe. To prepare for school, she spent four years in domestic and seamstress work in order to learn English from the upper class. She then attended Butler University for a term to supplement her self-directed learning and prepare for the rigors of medical school.
In 1900, Knabe entered the co-educational Medical College of Indiana (MCI). She was required to attend classes, dissect every part of a cadaver, maintain a 75% grade in all classes, refrain from drinking, and work fourteen-hour days. During this time, she continued working as a seamstress to supplement her income. Knabe also put her drawing skills to use by providing illustrations for several medical textbooks, including detailed sketches for anatomy, surgery, and pathology slides.
Knabe proved a trailblazer with her medical school accomplishments. Dr. Frank B. Wynn, the Director of Pathology at MCI, appointed her curator of the pathology museum, and she was consequently placed in charge of the pathology labs at the school. Much to the chagrin of many of her male peers, Dr. Wynn chose her as his only preceptee for the year. She began teaching underclassmen, an unheard-of honor for a student. On April 22, 1904, Knabe became one of two women to graduate from MCI. She threw herself wholeheartedly into her profession, burning the candle at both ends to gain a foothold in practice, networking, and skills.
Dr. Knabe stayed on in her positions as lab curator and clinical professor—for which she was not paid. Appointed a deputy state health officer in 1905 by Dr. J. N. Hurty, the Secretary of the Indiana State Board of Health (ISBH), Dr. Knabe became the first woman to hold this office in Indiana. Part of her duties involved investigating suspected epidemics, such as typhoid and diphtheria, and making recommendations to reverse unsanitary conditions. Dr. Knabe routinely traveled the state to work with the public and doctors, and processed hundreds of pathological samples.
Despite Dr. Knabe's expertise, Dr. Hurty did not hire her as superintendent of the lab. Instead, he chose Dr. T. V. Keene, despite the fact that Keene had not even applied for the job. As the laboratory grew, Dr. Knabe became Assistant Bacteriologist and was expected to work longer hours and spend more time in the field. During her work at the ISBH, Dr. Knabe presented papers and worked with the public in diagnosis and education. Local papers interviewed her for her thoughts on how to make Indianapolis a more beautiful and clean city.
Dr. Knabe also kept current on new methods, most notably by studying with Dr. Anna Wessels Williams of the New York Research Laboratory. Dr. Williams was brilliant in her own right: building on Negri's research, she originated the rapid diagnosis of rabies, and she co-developed the diphtheria antitoxin. Dr. Knabe proved the widespread existence of rabies in Indiana. From this work, she implemented ways to prevent the spread of rabies by educating the public about the disease and its consequences.
Widely accepted as the state expert on rabies, Dr. Knabe was promoted to acting superintendent and paid $1,400 annually. Dr. Hurty promised her the superintendent position and an increase to $1,800 or $2,000. Over a year later Dr. Hurty told Dr. Knabe that there was no money for her salary increase and that because she was a woman she could not command the amount of money the position should pay anyway. Dr. Knabe contacted the newspaper and tendered her resignation, citing discrimination and broken promises.
Dr. Hurty had searched for what he considered "a real capable man," actively recruiting Dr. Simmonds as the new superintendent. Additionally, although Dr. Hurty had told Dr. Knabe the state had no money for her raise, he informed Dr. Simmonds he would be paid $2,000 the first year and $3,000 in the second. Even the first-year figure was a 43% increase over Dr. Knabe's salary. The final slap in the face came from Dr. Simmonds himself in the first 1909 Indiana State Board of Health bulletin: he published Dr. Knabe's findings about rabies in Indiana and elsewhere without crediting her.
Leaving the oppressive atmosphere of state employment proved the best thing for Dr. Knabe; her dedication to medicine was rejuvenated. She opened her own private practice and continued her rabies research at $75 or more per case. While many female physicians shied away from accepting male patients, fearing they would not be taken seriously or might be attacked, Dr. Knabe insisted on having a phone installed in her apartment in case any patient needed her. She would always answer a knock or a call, regardless of the hour. Quite often she treated people for free or accepted payment by barter; this is how she acquired a piano and the lessons to go with it.
One of her biggest achievements came when she became the first woman elected to the faculty of the Indiana Veterinary College (IVC), where she chaired Parasitology and Hematology. Dr. Knabe's tenure at the IVC predates that of any other recognized woman department chair at a veterinary college in the United States before 1920.
Demonstrating her commitment to social feminism, Dr. Knabe bucked trends at every turn through her work in sex education. She served as medical director and Associate Professor of Physiology and Hygiene, a field known today as sex education, at the Normal College of the North American Gymnastics Union in Indianapolis. She also networked with women's clubs and the Flanner House to create and teach hygiene and sanitation practices to all ethnic groups across the State of Indiana, especially African American communities.
The same night that Augusta dreamt about the black snake, a person entered Dr. Knabe’s rooms at the Delaware Flats and brutally cut her throat from ear to ear. The killer was skilled enough to cut her on one side first, missing her carotid artery and cutting deep enough to cause her to choke on her blood. The second cut just nicked the carotid artery and cut into the spine. See Part II to learn how Dr. Knabe’s non-conformist lifestyle and work as a female physician would be used against her in the bungled pursuit of her killer.
Shells rained down on men who had endured disease, the obliteration of their comrades, sleep deprivation, the constant shriek of ammunition, and the literal smell of death. Burrowed into dirt along the Western Front, these Allied men slugged it out in a battle of wills and weaponry until they defeated the Central Powers in 1918. Many of the American troops that helped ensure victory in World War I, as well as the surgeons waiting on hand in ambulances to treat shrapnel-torn faces, were there, in part, because of the efforts of Indiana dentist Otto King. For Dr. King, dentistry went beyond staving off cavities and engineering attractive smiles. He applied his dental skills to a greater good both abroad and at home, and encouraged the nation’s dentists to follow suit. This meant mobilizing dentists to treat war-induced maxillofacial trauma and establishing free dental clinics for poor children who missed school due to untreated oral issues.
After graduating from the Northwestern University Dental School in 1897, Dr. King practiced dentistry in his hometown of Huntington, Indiana. He assumed a national leadership role in his profession in 1913, when he was elected the general secretary of what is today known as the American Dental Association (ADA). The role of general secretary was equivalent to that of a modern executive director. Dr. King helped transform dentistry from a trade to a profession through the establishment of The Journal of the American Dental Association (JADA), published in Huntington and distributed nationally. Dentists across the country—notably those in remote areas—could learn about best practices, research findings, educational and professional opportunities, and new dental theories through articles like “The Functions of Dentistry and Medicine in Race Betterment” (1914) and “Commercialism vs. Professional Ethics” (1915).
Editor King also used the journal to mobilize dentists for World War I service and to publish findings on war injuries, such as Leo Eloesser's 1917 "Gunshot Wounds and Lesions Produced by Shell and Shrapnel in the Jaws and Face." Just days before the U.S. entered World War I by declaring war on Germany in 1917, Dr. King gave an interview printed in newspapers across the country about the Preparedness League of American Dentists, an extension of the ADA. The emergence of trench warfare during the "gravest crisis" in history created an urgent need for dentists on the frontlines, and the Preparedness League worked to recruit dentists for Army and Navy service from every state, as well as Puerto Rico, the Philippine Islands, and Canada. Dr. King did his part when Colonel Kean ordered him, in July 1917, to choose dentists to serve at Base Hospital No. 32, located in Indianapolis.
Dr. King explained that the league would respond to this need by securing “in each locality, a nucleus of the trained dental specialists, who will assist in the instructions of the members of the unit along the lines of war dental surgery, as a measure of preparedness against war and to co-operate in treatment of wounds of the jaws and face, in case of actual warfare.” He stated that “Whereas Red Cross base hospitals are being formed, we are, as fast as possible, organizing dental units in connection therewith and co-operation is established between the organizations.”
The New York Times reported that the Preparedness League outfitted dental ambulances sent to the warfront to reach patients in “out-of-the way places.” In addition to treating the wounded, these ambulances isolated men in order to prevent the spread of diseases like mumps and German measles. The Times reported that “at least 20 per cent of the men are incapacitated and kept from active service on account of illness finding its source in diseased conditions of the mouth.” These state-of-the-art ambulances included a fountain cuspidor, electric lathe, sanitation cabinet, steam sterilizer, nitrous oxide, vulcanizer, and a typewriter on which to record treatment findings. A secondary use for these ambulances involved relief work in France, where the Red Cross mobilized dentists to treat the teeth of children.
Because of his work with the Preparedness League, Dr. King was appointed as one of twelve members on the Committee on Dentistry, General Medical Board of the Council of National Defense. He headed the Committee on Publicity, a subcommittee tasked with recruiting dentists for Army service. Dr. King utilized JADA for this purpose, including a blank application to the Dental Reserve Corps and publishing pieces like “You Can Help Win the War!—An Appeal for Prompt Individual Service by Every Member” and “How May You Assist the Medical Department of the United States Army?”
In one article, U.S. Army dentist Dr. John S. Marshall detailed the morbid gamut of dental injuries awaiting military personnel, including:
blows upon the face from the closed fist; kicks of horse or mule; the impact of some heavy missile propelled with considerable force; the extraction of teeth, tho this is rare; a fall from a horse, bicycle, or a gun-carriage; the passage of a wheel over the face; and gunshot injury in line of duty or from accident, or design, with suicidal intent or otherwise.
He lamented the devastation wrought by shell fragments, which "tear away the soft tissues and underlying bone, leaving a hideous and ghastly wound." Because of these traumas, "the Oral Surgeon has during this World War come into his own." These surgeons not only performed life-saving procedures, but also helped restore facial features in what the Chicago Tribune described in 1918 as "a new branch of the healing art—that of plastic surgery." The Decatur, Illinois Daily Review noted that efforts to reverse oral disfigurement "have given the dentists a new distinction."
In addition to injuries, success on the battlefield was impeded by defective teeth, as they hindered the ability to eat and subsequently weakened the fighting force. Dr. King thus noted the importance of the U.S. Army Dental Corps, stating “It is truly said that an army fights on its stomach and teeth . . . As monitors of the teeth, the dentists are supervisors of the stomach, hence, the army is helpless without our professional officers.”
But just getting troops onto the battlefield proved to be a challenge. Dr. King utilized his prominence in the profession to convince dentists to treat recruits barred from service due to dental issues—at no cost. He warned that “more than 2,000 applicants for enlistment were in danger of being refused entrance into the fighting force of the nation because of defective teeth.” In April 1917, he volunteered to personally treat rejected recruits and he convinced local dentists to prepare the mouths of two rejected recruits. Under his direction, the ADA hosted a “Help Win the War” convention in 1918, which featured a series of clinics about dental treatment and military recruitment.
The Chicago Tribune reported that by the time of the conference, Preparedness League members had performed more than 500,000 free operations on recruits, enabling the men to pass the military’s physical examination. A letter printed in the JADA encouraged dental colleges, dispensaries, and hospital clinics to work with the Preparedness League to treat the mouths of recruits. The author lamented that the criteria for military acceptance included only “a mouth free from disease producing conditions and four (4) opposing molars, two on either side . . . This requirement is a joke but we can change it no doubt, if desired.”
With the conclusion of the Great War, Dr. King intensified his efforts to bring his "great humanitarian mission" to fruition. This involved educating the public about the importance of dental prevention, particularly among children. He noted that most infectious diseases, such as diphtheria and smallpox, entered through the nose and mouth, making the maintenance of a healthy oral environment crucial. He observed that many children missed school due to infections and malnutrition caused by defective teeth, but their parents lacked the resources to treat the maladies. Dr. King hoped to prevent these painful and disruptive dental issues by educating children about hygiene, through demonstrations and nursery rhymes, and by offering free preventative treatment. In an address about oral hygiene, Dr. King proclaimed that "For years we have been trying to dam back or cure diseased bodies, due to neglected Oral Hygiene conditions, but overlooking the source or beginning of life as represented in childhood as the place to teach and establish preventative medicine."
Dr. King helped establish free clinics on the East Coast and implored the public and lawmakers to invest in their establishment, stating in 1917 that “Disease is a social menace, an enemy of the State.” In a 1920 criticism of American dental care, he noted that “The children of our country deserve as effective physical care as the livestock.” He anticipated backlash for proposing free dental clinics, but argued that “socialized health” should be wielded as a weapon against “capitalized disease.” Dr. King’s dogged belief that dentistry could uplift humanity radiated from the trenches of Gallipoli to classrooms in New York.
Learn more about the extraordinary Dr. Otto King with IHB’s new historical marker.
With a staff of over 155 and over 170 volunteers, today's Fort Wayne Visiting Nurse is a far cry from its humble beginnings. In 1888, a group of Fort Wayne women organized the Ladies Relief Union with a mission to "help the sick poor of Ft Wayne." Calling themselves the Visiting Nurse Committee, they soon discovered a link between poverty and disease. Dr. Jessie Calvin, a Fort Wayne sanitation and indoor-plumbing pioneer, encouraged women's church groups to raise money for a qualified nurse who could meet community needs.
Prior to the 1860s, nursing was “typically considered a domestic responsibility provided in the home by family members.” Nursing as a profession evolved after the Civil War, when women gained experience caring for wounded soldiers. Historian Clifton J. Philips noted that in the post-war period, women’s religious orders were “especially active in establishing hospitals in an attempt to extend to the general public, and the poor in particular, some of the services formerly rendered to sick and wounded soldiers.” As hospitals materialized, so too did nursing groups and training programs. By 1897, The Fort Wayne Journal-Gazette noted that “modern nurses” wished to “be given her proper position as a skilled assistant in serious illness.”
The paper added that:
The daily or visiting nurse is a recent development of modern nursing and meets the needs of many people who find it inconvenient to have a nurse stopping in the house and requiring more or less attention from servants perhaps already overtaxed. The visiting nurse comes in for an hour or so every day to perform those services for which her skill is needed.
On March 1, 1900, an organizational meeting was held in Fort Wayne and the Visiting Nurse League became a reality. At a salary of $10.00 a week, Josephine Shatzer was hired as the League’s first nurse. During harsh weather she took a trolley, but normally she could be seen making her rounds on her bicycle. Regardless of transportation, it was clear that she wasted no time. On her first day she saw six patients. Next she established a baby milk station at First Presbyterian Church, instructing new moms how to prepare formula.
She also volunteered at free immunization clinics, bathed patients, delivered meals, changed bedding, dressed wounds, cared for the elderly and ill in their homes and endeared herself to all she served. At the end of her first year she had made hundreds of calls, utilizing supplies donated by local churches, relief societies, and drug stores. For those patients who could pay, the charge of a one-hour visit was fifteen cents.
The Fort Wayne Daily News praised the program in 1900, noting that the league “found favor with all classes of people” and that visits to the “sick poor” conveyed not only “help and benefit, but hope and good cheer to every member of the family.” In 1913, the Public Health Nursing Association appointed a visiting nurse to serve African American patients at the Flanner Guild. Dr. Calvin continued to guide the League, through the years of World War I and the 1918 flu epidemic, which took the lives of 3,266 Hoosiers.
By 1919, a nurse named Dixon had reached a salary of $100.00 a month. League expenditures stood at $1,300 annually in 1920, and in 1922 the Community Chest offered its support. Insurance companies began to hire nurses from local visiting nurse groups to assess policyholders who were ill, paying the League seventy-five cents a visit. By 1923, the League had reorganized to serve in an advisory, instructional capacity focused on health teaching. The Great Depression wrought poor health conditions: eight nurses made over 29,000 visits to 4,477 patients, and one nurse made 3,255 orthopedic visits to 104 crippled children, many of whom were victims of polio. The Sisters of Saint Joseph Hospital provided free hospitalization in the pediatric ward for those who could not pay. In the 1940s, World War II increased demands on nursing schools to produce eligible Nurse Corps candidates.
In 1954, the agency changed its name to Visiting Nurse Service, Inc. By 1956, it had undertaken a program caring for stroke victims in their homes, which also collected data on medication, exercise, loss of function, and the need for expanded therapy services. By 1962, the agency's director, Eva Rosser, had introduced a State Board of Health-funded program focused on treating the chronically ill in their homes instead of in a hospital or nursing home. Later the group provided care through home health aides, and the agency became certified for Medicare in July 1966.
With certification came additional paperwork. According to History of Visiting Nurse, by 1983, "a record number of visits for a single month occurred (2,200), due to earlier hospital releases and greater technology used in the home." During this period, the glucometer was used for the first time, registered nurses were trained to perform phlebotomy services, and around-the-clock care was made available for all patients. In 1984, the Medicare Hospice Benefit became available and Visiting Nurse Service merged with Hospice of Fort Wayne, Parkview, and Lutheran Hospices. The agency introduced computerized billing and, by the end of the decade, services for the frail and disabled. By 1990, Hospice service visits totaled 38,177, a forty percent increase over the previous year.
History of Visiting Nurse noted that in 1990 the agency moved to the "Moellering Unit of the nearly vacant former Lutheran Hospital." In 1995, the 1984 merger dissolved and Visiting Nurse Service and Hospice became a free-standing agency. By February 2001, a new Hospice Home facility had opened, and in 2006 a building expansion added patient rooms. In 2011, nurse practitioners joined the staff and the "Watchful Passage" program began, in which trained volunteers remained at patients' bedsides during the last few days of life. In 2018, Visiting Nurse staff and volunteers could stand tall, celebrating 130 years of community service.
Spanish Influenza hit Indiana in September 1918. While the virus had been killing otherwise healthy soldiers and civilians in war-affected parts of the world since the spring, most Hoosiers assumed that fall that they were safe. Still, newspaper headlines made people nervous, and health officials suspected that the mysterious flu was on their doorstep.
In April 1917, the United States joined the Allied effort. Residents of Indianapolis, like most Hoosiers, largely united around the war effort and organized in its support. In addition to registering for military service, the National Guard, and the Red Cross, they organized Liberty Loan drives to raise funds and knitting circles to make clothing for their soldiers. Farmers, grain dealers, and bankers met to assure adequate production and conservation of food. They improved the roads in order to mobilize goods for the war effort, including a road to nearby Fort Benjamin Harrison, located just nine miles northeast of downtown Indianapolis. This penchant for organization would prove extremely valuable throughout the bleak coming months.
The U.S. Army had constructed Fort Benjamin Harrison over a decade earlier with the intention of stationing one infantry regiment there. However, with America's entry into the war, Fort Ben (as it was colloquially known) became an important training site for soldiers and officers. It also served as a mobilization center for both Army and National Guard units. In August 1918, just prior to the flu outbreak, the War Department announced that the majority of the fort would be converted into General Hospital 25. The Army planned for the hospital to receive soldiers native to Indiana, Kentucky, and Illinois who would be returning from the front wounded, disabled, or suffering from "shell shock." By September, the newly established hospital was ready to receive a few hundred "wounded soldiers returning from France." But the soldiers stationed there, preparing to receive casualties, began to fall ill themselves.
On September 26, 1918, the front page of the Indianapolis News announced unidentified cases of illness in training detachments stationed at the Indiana School for the Deaf, the Hotel Metropole, and Fort Benjamin Harrison. The detachment at the deaf school denied that men were infected with the deadly Spanish Influenza that was on the rise as soldiers returned to the U.S. from the front. The medical officers instead claimed “the ailment here is not as serious as that prevailing in the east.”
Despite this reassurance, the high number of cases was alarming. The major in command of the detachment issued a quarantine. The lieutenant from the hotel detachment also claimed that none of the illnesses there were caused by Spanish influenza. He referred to the cases as “stage fright,” as opposed to a full outbreak of the disease. At Fort Benjamin Harrison, sixty men suffered from influenza, but the Indianapolis News reported “none has been diagnosed as Spanish influenza and no case is regarded as serious.” The medical officers there reported, “An epidemic is not feared.”
While the front page reassured the city's residents that there was nothing to fear and that the military had everything under control, a small article tucked away on page twenty-two hinted at the magnitude of the coming pandemic. Twenty-seven-year-old Walter Hensley of Indianapolis had died of Spanish Influenza at a naval training detachment on the Great Lakes. His body arrived in the city for burial soon after; his funeral would be the first of many for otherwise healthy young military men. Only a few weeks later, Indianapolis would record over 6,000 cases, with Fort Benjamin Harrison caring for over 3,000 patients in a 300-bed facility before the end of the epidemic.
Indianapolis was not alone in its unpreparedness, as little was known about the strange flu. Influenza was certainly not uncommon, but most flu viruses killed the very young, sick, and elderly. The 1918 influenza, on the other hand, killed otherwise healthy young adults ages twenty to forty – precisely the ages of those crowded into military camps around the world. Furthermore, the disease could spread before symptoms appeared. Infected soldiers and other military personnel with no symptoms amassed in barracks and tents, on trains and ships, and in hospitals and trenches. As troops moved across the globe, so did the flu. It took on the name “Spanish influenza,” because unlike France and England, Spain did not censor reports of the outbreak.
While many modern historians and epidemiologists now believe the pandemic likely began in a crowded army camp at Fort Riley, Kansas, Americans in 1918 feared its spread from Europe and took some unlikely precautions. On July 3, 1918, the South Bend News-Times assured its readers that a Spanish passenger liner that had arrived in an Atlantic port "was thoroughly fumigated and those on board subjected to thorough examination by federal and state health officers." Such measures did little to stop the flu, however, and by September 14 the South Bend newspaper reported on several East Coast deaths from Spanish influenza. On the same day, the Indianapolis News printed a notice from Surgeon-General Rupert Blue, head of the U.S. Public Health Service, offering advice for preventing infection. These public notices became routine over the following months of the pandemic. Among methods listed for preventing the spread of the disease, Blue recommended "rest in bed, fresh air, abundant food, with Dover's powders for the relief of pain." He also warned of the "danger of promiscuous coughing and spitting."
Over the next few days, newspapers reported that the nation’s training camps were infected. On September 17, the Richmond Palladium and the Indianapolis News reported “approximately four thousand men are in quarantine today as the result of Spanish influenza breaking out in the aviation camp of the naval training station” on the Great Lakes in Illinois. The following day, the South Bend News-Times reported that “Spanish influenza now has become epidemic in three army camps” with 1,500 cases in Massachusetts, 1,000 in Virginia, and 350 in New York. The military scrambled to meet the needs of the infected, and anxious citizens awaited a response from the government’s health services.
On September 19, 1918, Surgeon-General Blue sent a telegraph to the head health officer of each state requesting they immediately conduct a survey to determine the prevalence of influenza. In response, Dr. John Hurty, Indiana’s Secretary of the Board of Health, telephoned the local health officials in each city requesting a report. Hurty warned that the flu was “highly contagious,” but stated that “quarantine is impractical,” according to the Indianapolis News. Instead, he offered Hoosiers this advice:
Avoid crowds . . . until the danger of this thing is past. The germs lurk in crowded street cars, motion picture houses and everywhere there is a crowd. They float on dust, and therefore avoid dust. The best thing to do is to keep your body in a splendid condition and let it do its own fighting after you exercise the proper caution of exposure.
One week later, hundreds of men were sick with influenza in Indiana training camps. Again Hurty offered the best advice that he could while advising citizens to remain calm. However, he had to admit: “It has invaded several of our training camps and will doubtless become an epidemic in civil life.” He advised:
If all spitting would immediately cease, and if all coughers and sneezers would hold a cloth or paper handkerchief over their noses and mouths when coughing or sneezing, then influenza and coughs and colds would almost disappear. We also must not forget to tone up our physical health, for even a few and weak microbes may find lodgment in low toned bodies. To gain high physical tone, get plenty of sleep in a well ventilated bedroom. Don’t worry, don’t feast, don’t hurry, don’t fret. Look carefully after elimination. Eat only plain foods. Avoid riotous eating of flesh. Go slow on coffee and tea. Avoid alcohol in every form. Cut out all drugs and dopes . . . Frown on public spitters and those who cough and sneeze in public without taking all precautions.
Most notably, in this same September 26 front page article in the Indianapolis News, Hurty stated that Indiana had “only mild cases . . . and not deaths.” This would soon change.
Despite these public reassurances, Hurty and other Indianapolis civic leaders knew they needed to do more to prepare. Since little was known about how the flu spread, these men tried to keep the city safe using their intuition. A clean city seemed like a safer city, so they organized a massive clean up. On September 27, the Indianapolis News reported:
To prevent a Spanish Influenza epidemic in Indianapolis, Mayor Charles W. Jewett today directed Dr. Herman G. Morgan, secretary of the city board of health, to order all public places – hotel lobbies, theaters, railway stations and street cars – placed at once in thorough sanitary condition by fumigation and cleansing.
The article noted that in other cities local officials had been unable to prevent widespread infection and that Indianapolis should learn from their failures and “get busy now with every preventative measures that can be put in operation to make conditions sanitary so that infection will not spread.”
By the end of the month, influenza had reached the civilian population. Officials continued to discourage people from gathering in crowds and encouraged anyone with a cough or cold to stay home. The News reported that Indianapolis movie houses had begun showing films on screens in front of the buildings instead of inside the theaters.
Meanwhile, the number of infected men at Fort Benjamin Harrison rose. By the end of September, officers in charge of the base hospital reported that there were “about 500 cases of respiratory disease” at the camp. Although newspapers still reported that it was unclear whether these illnesses were indeed Spanish influenza, it was clear that the situation was growing dire. Because so many nurses had believed Indiana safe from the pandemic and volunteered to fight the virus out east, the fort’s hospital had only twenty trained nurses to care for the hundreds of sick men. The Indianapolis News reported that enlisted soldiers were “being employed as nurses” and that one battalion of engineers had been completely quarantined. Notices of soldiers dying from influenza and related pneumonia began to fill the pages of Indianapolis newspapers.
By October 1, the number of sick men at Fort Benjamin Harrison rose to 650. The Indianapolis News reported, “No new troops are arriving at the engineer camp,” and “fifty engineers were lent to the base camp hospital yesterday to act as orderlies and clerks and to release medical corps for service as nurses.” The article concluded, “The hospital needs a number of trained nurses.” While the bodies of Hoosier soldiers stationed at camps around the country arrived in the city, Fort Benjamin Harrison had yet to lose one of its own. Less than a week later, that changed.
On Sunday night, October 6, 1918, ten soldiers died in the fort’s hospital, bringing the week’s total to forty-one deceased soldiers. Four civilians died from influenza and six more from the ensuing pneumonia. At the fort, officials reported 172 new cases of influenza (bringing the total to 1,653 sick soldiers). Of these, the base hospital was attempting to care for 1,300 men.
In response, Dr. Morgan announced “a sweeping order prohibiting gatherings of five or more persons.” The front page of the News read, “PUBLIC MEETINGS ARE FORBIDDEN,” and noted that all churches, schools, and theaters were closed until further notice. Only gatherings related to the war effort were exempt, such as work at manufacturing plants and Liberty loan committee meetings. The prominent doctor even discouraged people from gathering at the growing number of funerals, encouraging only close family members to attend. In October of 1918, Indianapolis must have looked like a ghost town.
The sick desperately needed nurses, and nowhere more than at Fort Benjamin Harrison. Two front page Indianapolis News headlines for October 7 read, “Ft. Harrison Soldiers in Dire Need of Nurses,” and “Graduate Nurses Are Needed for Soldiers.” The News reported that at Fort Ben “soldier boys are dying for lack of trained help” and that the “few nurses in service are worn to the point of exhaustion.” Officers of the local Red Cross worked to redirect nurses who were awaiting transport overseas to the local effort against influenza, while the women of the motor corps of the Indianapolis Red Cross were busy transporting needed supplies by automobile.
The rest of the newspaper that day was filled with reports on school closings, cancelled meetings, the numbers of sick in various counties, and funerals. The plague was peaking, and Fort Benjamin Harrison suffered the most. While most residents of Indiana stayed far away from the infected camp, the brave women of Lutheran Hospital in Fort Wayne took their nursing skills into the heart of the epidemic. The Fort Wayne Sentinel reported on October 7, the same day the Red Cross called urgently for nurses, “10 Local Nurses Respond.” The paper continued:
Willing to risk their lives in the nation’s service in helping combat the ravages of Spanish influenza, ten Lutheran hospital nurses left the city . . . for Fort Benjamin Harrison, near Indianapolis, Ind., where they will enter service in the military base hospital, which is very urgently in need of qualified nurses to aid in fighting the epidemic.
The following day, the Sentinel published a picture of the brave nurses and the local paper praised their “patriotic devotion to place their training at the disposal of their government even at the risk of their lives.”
The same day, a medical officer from the fort hospital told the Indianapolis Star that several trained nurses had reported for duty “within the last few hours to relieve the situation” and that “everything that can be done for the boys is being done.” The Star reported that the officer was responding to “wild rumors” that the soldiers were not getting adequate care. However, the Indiana Red Cross and Board of Health knew that more nurses were needed. On October 11, the Fort Wayne Sentinel shared Dr. Hurty’s report that “during last night thirty soldiers had succumbed to the ravages of the epidemic at Fort Harrison, some of them expiring before their uniforms could be removed from them.” One of the men was Captain C. C. Turner of the medical reserve who had been sent to the fort from another camp only a few days before to help combat the influenza outbreak. His records had not even arrived yet and his relatives could not be contacted.
The situation at the fort prompted Dr. Morgan and several other leading doctors of the city to issue a statement. The doctors praised the efforts of the hospital staff and volunteers. They stated:
The medical staff of Camp Benjamin Harrison has succeeded in fourteen days in expanding a hospital of about 250 beds to one of 1,700 beds by occupying the well-built brick structures formerly used as barracks. These they were able to equip adequately with the assistance of the American Red Cross which . . . proved itself able to supply every demand made by the army on the same day the request was made.
The doctors reported that the hospital had treated 2,500 patients in the previous two weeks. Despite their heroic efforts, the epidemic persisted.
The city also bolstered its efforts as the number of infected rose to 1,536 civilians. On October 11, the Indianapolis News reported 441 new cases of influenza in a twenty-four hour period. In response, Dr. Morgan announced that the city board of health “enlarged the order against public gatherings of every description” and that the Indianapolis police department would enforce the order. “Dry beer saloons,” which were prohibition-era gathering places, were closed. Department stores were prohibited from having sales and would be closed completely if found too crowded. Finally, the board of health directed its officers to post cards reading “Quarantine, Influenza,” on houses containing a sick person. The next twenty-four hours brought the city 250 new cases and the fort 47 new cases of Spanish flu. In that same period, twenty-four young men died at Fort Benjamin Harrison. The epidemic was peaking.
A week later, there was some evidence that the virus had begun to relax its grip on the fort, if not the city. The Indianapolis News reported that while the previous twenty-four hours had brought twenty-eight deaths to the city, the fort suffered only four. And while the city reported 252 new civilian cases, the fort reported only twelve new cases. Since the fort was struck by influenza before the city, civilians must have seen this decrease at the fort as a good sign. The plague had almost run its course.
On October 30, Dr. Hurty announced that the closing ban would be lifted in Indianapolis. Newspapers reported the lowest number of new cases since the start of the deadly month, and Fort Harrison reported that not one person had died in the previous twenty-four hours. Schools could reopen on Monday, November 4, and people with no cold symptoms could ride street cars and attend movie theaters. Through the end of 1918 and the beginning of 1919, there were small resurgences of the epidemic. Morgan ordered the wearing of gauze masks in public and discouraged gatherings. However, the worst had passed, and the war had ended.
As Indianapolis began to return to normal, the damage was assessed. On November 24, 1918, the Indianapolis Star tallied the state’s loss at 3,266 Hoosiers, mostly young men and women. This massive loss of citizens in their prime left 3,020 children orphaned. The War Department also assessed the losses at Fort Benjamin Harrison. The Surgeon General reported that General Hospital 25 at the fort treated a total of 3,116 cases of influenza and 521 cases of related pneumonia. The hard work of the medical staff and brave volunteers transformed a fort designed to care for a few hundred injured men into a giant hospital caring for thousands.
The city also benefited from the leadership of the committed men of Indianapolis and the State Board of Health, as well as from cooperative citizens. According to the University of Michigan Center for the History of Medicine, “In the end, Indianapolis had an epidemic death rate of 290 per 100,000 people, one of the lowest in the nation.” The center attributes the city’s relative success to “how well Indianapolis as well as state officials worked together to implement community mitigation measures against influenza,” whereas in other cities “squabbling among officials and occasionally business interests hampered effective decision-making.” Indianapolis leaders presented a united front, shop and theater owners complied despite personal loss, and brave men and women volunteered their services at risk to their own lives. Somehow only one of the heroic volunteer nurses stationed at Fort Benjamin Harrison lost her life.
On May 6, 1919, the Indianapolis News replaced columns of text detailing the influenza-related losses with jubilant articles about the city’s preparations for Welcome Home Day. Trains unloaded Hoosier soldiers still carrying their regimental colors. Indianapolis decked herself out in red, white, and blue. On May 7, 1919, 20,000 men and women walked in the welcome parade that stretched for 33 blocks. Many, like the men and women of Hospital No. 32, had trained and mobilized at Fort Benjamin Harrison. Many had survived the Spanish influenza, nursed the sick, or lost a friend to the pandemic. Not a single article mentioned it. The city was ready to move toward peace and healing.