“Nestled in the wooded hills of southern Indiana, lies a land of fantasy. . . where it’s Christmas every day.”
Indiana has its fair share of uniquely named towns – Gnaw Bone, Popcorn, Pinhook, Needmore, and Pumpkin Center to name a few. But perhaps the most well-known idiosyncratic place name is Santa Claus in Spencer County, Indiana.
So, how did we get this intriguing sobriquet? Before we get there, we should cover some of the history of the area. The Shawnee, Miami, and Delaware tribes first stewarded the land that later became Spencer County. At the turn of the 19th century, many members of these tribes joined Tecumseh’s confederation to oppose white encroachment. However, U.S. policy, embodied in the Treaty of Fort Wayne in 1803 and the Treaty of Vincennes in 1804, opened the land to white settlement. Crossing over from Kentucky, white settlers had established permanent homes by 1810 in the Indiana Territory near Rockport on the Ohio River, 17 miles southwest of modern-day Santa Claus. But when settlers decided to incorporate their new town in the mid-nineteenth century, they did not originally pay such homage to the Christmas holiday.
As with many place names, the origin of the name Santa Claus is mostly the stuff of legend. The Indiana State University Folklore Archive has preserved three versions of the story behind the name Santa Claus. Below is one example:
Several families settled in the area and decided that they should have a name for their community. They decided on Santa Fe. They applied for a post office to make it official. On Christmas of 1855, everyone was greatly excited at the thought of going to their own brand new post office for their Christmas cards and gifts instead of having to ride to Dale. Unfortunately, a large white envelope with important seals arrived the day before Christmas to reveal that a town in Indiana already was named Santa Fe. Determined to get their post office just as quickly as possible, the citizens of Santa Fe decided to discuss the matter that very night, Christmas Eve. While they were meeting, the whole world outdoors became filled with an intense, blinding light, and a little boy came rushing in. ‘The Star, the Christmas star is falling!’ Everyone rushed out just in time to see a flaming mass shooting down from the heavens and crash into a low distant hill. They considered it an omen of good fortune. Returning to the meeting, it seemed the most natural thing for all the folk to agree that the name Santa Fe should be changed to Santa Claus.
This account is certainly embellished to some extent, seeing as the “Christmas Star” (which appears roughly every twenty years, when Jupiter and Saturn align in the winter sky) made its last appearance in 2020 and did not, in fact, fall from the sky in 1855. However, it gives us an idea of what Santa Claus citizens themselves believe to be their origin story.
However it happened, the townsfolk eventually decided on Santa Claus as a replacement name, and the Santa Claus post office was officially established on May 21, 1856.
For years, however, the strangely named town was just that – a town with a strange name. It wasn’t until Santa Claus Postmaster James Martin began answering letters written to Saint Nick in the early 20th century that the town began truly embracing its merry moniker. It’s unclear when or why letters to the man at the North Pole began arriving at the Santa Claus, Indiana post office, but in 1914 Martin began writing back, and the tradition only grew from there.
Mail clerks around the country began rerouting letters simply addressed “Santa Claus” to the Indiana town for Martin to handle. Parents began writing notes with enclosed letters or packages to be stamped with the Santa Claus postmark and sent back, making the letters and gifts under the tree on Christmas morning that much more authentic.
By 1928, Martin and his clerks were, not unlike Santa and his elves, handling thousands of letters every holiday season and were garnering enough attention to catch the eye of Robert Ripley of Ripley’s Believe It or Not. Before Ripley’s was an after-school TV show, and before it was a coffee table book you bought at your school’s annual Scholastic Book Fair, it was a syndicated newspaper panel that shared interesting tidbits and oddities from around the world. And on January 7, 1930, the oddity in question was none other than Santa Claus, Indiana.
It was a brief mention, but it was enough. The next Christmas, Martin reported that the number of parcels and letters coming through his post office had grown exponentially, adding:
I guess my name ought to be Santa Claus, because I have to pay out of my own pocket for handling all this mail. I’ve hired six clerks to help out and I recon it’s going to cost $200. But it advertises the town and besides lots of folks from all around come out to the store to see us sending out the mail.
With great fame comes great scrutiny, or at least it did in this case. By 1931, the Associated Press reported that officials in Washington were considering changing the name of the town as the stress put on the postal system during the holiday season was becoming too much to handle. Christmas lovers across the country bemoaned the potential loss, but none so loudly as the citizens of Santa Claus, who contacted their U.S. Senator James Watson and U.S. Representative John Boehne, of Indiana.
Watson and Boehne got to work for their constituents. Representative Boehne notified the Post Office Department that the entire Indiana delegation would oppose the name change if it were to go forward. Senator Watson took a more direct route, going straight to Postmaster General Walter Brown to assure him that “The people won’t want it changed” and that “The name must not be changed nor the office abolished.”
In the end, of course, the citizens were able to preserve their beloved town’s name, and the tradition continued to grow.
Entrepreneurs, hoping to cash in on the Christmas spirit, began to take notice of the small town. In 1935, Vincennes speculator Milt Harris founded the business called Santa Claus of Santa Claus, Incorporated. Harris erected Santa’s Candy Castle, the first tourist attraction in town. Built to look like a fairy castle and filled with candy from project sponsor Curtiss Candy Company, the Candy Castle was the centerpiece of what Harris dubbed Santa Claus Town, a little holiday village of sorts made up of his business ventures. The castle would eventually be joined by Santa’s Workshop and a toy village.
Across town, a different, similarly named business, Santa Claus, Incorporated, brainchild of Chicago businessman Carl Barrett, built another Yuletide monument: a 22-foot-tall statue of Santa Claus purportedly made of solid granite. This colossal Kris Kringle was the start of a second Christmas-themed landmark, this one called Santa Claus Park. All of this in a town of fewer than 100 people.
Both attractions were dedicated during the Christmas season of 1935, but all the holiday spirit in the world wasn’t enough to keep the peace between Harris and Barrett.
With two similarly named companies developing two competing attractions, conflict was perhaps inevitable. Barrett filed suit against Harris, alleging that the latter had no right to use a name so similar to that of Santa Claus, Incorporated. Meanwhile, Harris filed suit against Barrett because Barrett had bought, and was building Santa Claus Park on, land that the previous owner had leased to Harris.
A judge put an injunction on Santa Claus Park, meaning Barrett could not move forward with development. Eventually, this tongue twister of a case went all the way to the Indiana Supreme Court, which ruled in 1940 that both companies could keep using their names and overturned the injunction, meaning that the plans for Santa Claus Park could move forward, regardless of Harris’s lease.
However, the protracted legal battle, combined with wartime rationing, which impacted tourism due to gasoline and tire shortages, took a toll on both attractions. By 1943, cracks ran through the base of the giant Santa Statue and the Candy Castle had closed its doors.
With the end of the war came new opportunities. In 1946, retired Evansville industrialist and father of nine, Louis Koch, opened Santa Claus Land after being disappointed that the town had little to offer visiting children hoping to catch a glimpse of the jolly man in the red suit. This theme park, reportedly the first amusement park in the world with a specific theme, included a toy shop, toy displays, a restaurant, themed rides and, of course, Saint Nicholas.
This was no run-of-the-mill Santa Claus, though. Jim Yellig would become, according to the International Santa Claus Hall of Fame, “one of the most beloved and legendary Santas of all time.” Yellig had donned the red and white suit at the Candy Castle and volunteered to answer letters to Santa for years before becoming the resident Santa at the new park, a position he held for 38 years. During his tenure as Saint Nick, Yellig heard the Christmas wishes of over one million children.
Throughout “Santa Jim’s” tenure, Santa Claus Land continued to grow, thanks in large part to Louis Koch’s son, Bill Koch, who took over operation of the park soon after its founding. By 1957, the park offered a “miniature circus,” a wax museum, Santa’s Deer Farm, and an outdoor amphitheater. Live entertainment, such as a water ski show, followed, and in the early 1970s the park added rides such as Dasher’s Seahorses, Comet’s Rockets, Blitzen’s Airplanes, and Prancer’s Merry-Go-Round. In 1984, the Koch family expanded beyond a strictly Christmas-themed park to include Halloween and Fourth of July sections and changed the park’s name to Holiday World. Still in operation today as Holiday World & Splashin’ Safari, the theme park, which features what are considered some of the best wooden roller coasters in the world, welcomes over 1 million people per year.
Today, the town of Santa Claus is more “Christmas-y” than ever. Many of its 2,400 residents live in Christmas Lake Village or Holiday Village on streets with names like Poinsettia Drive, Candy Cane Lane, or Evergreen Plaza. The Candy Castle was renovated and reopened in 2006 and is known for its wide selection of cocoas and its Frozen Hot Chocolate. Carl Barrett’s 22-foot Santa Statue was restored by Holiday World in 2011 and now welcomes tourists from all over the world. Visitors to Holiday World can stay at Lake Rudolph Campground and RV Park or Santa’s Lodge. Every Christmas season, the small town comes alive with festivals, parades, and even Christmas fireworks. And, of course, dedicated volunteers still answer children’s letters to Santa, even if they sound a little different than they used to.
Many companies choose a face for their brand and then build a mythology around it. For example, the Converted Rice Company marketed their new parboiled, vacuum-dried rice as the homey-sounding “Uncle Ben’s Rice.” The company used the racially charged nomenclature “uncle” and an image of a distinguished-looking African American man to imply that the product would be like a friendly servant for the housewife. The company has claimed at various times that “Uncle Ben” was a respected rice grower or a hotel maitre d’, but more likely he never existed — much like Mr. Clean, Sara Lee, or Mr. Goodwrench.
While there are plenty of questions surrounding his origin story, the man called “Dr. Scholl” was not only the founder of one of the most famous companies in the world and the inventor of many of its products, but also a master of advertising who changed the business in innovative ways. Scholl may (or may not) have been a quack doctor, but he was a crackerjack businessman.
William Mathias Scholl was born on a farm in Kankakee, LaPorte County, Indiana in 1882.* According to the 1900 census, William spent his youth working as a laborer on his parents’ farm, along with many siblings. Sometime around 1900, Scholl moved to Chicago and found a job as a salesman at the popular Ruppert’s Shoe Store on Madison Street. There, he encountered the variety of foot problems faced by his customers and became interested in podiatry. That same year, secondary sources claim, he enrolled in medical school at Loyola University. This has been hotly debated.
Despite investigations beginning in the 1920s and continuing today, it is still unclear whether Scholl graduated with a medical degree around 1904 as he claimed. The Scholl College of Podiatric Medicine in Chicago supports the Scholl Museum, which is dedicated to memorializing his achievements and authoritatively refers to him as “Dr. William Mathias Scholl.” However, the records of the American Medical Association tell a different story. According to Robert McClory’s investigative piece for the Chicago Reader in 1994:
“Visit the recently opened Scholl Museum . . . and you’ll find the doctor and his achievements raised to almost mythic levels . . . But check through the old AMA records and you’ll read about a man whose credentials are ‘entirely irregular,’ whose methods smack ‘strongly of quackery,’ and whose products ‘cannot be recommended’.”
There are also questions about his state medical license, as well as a later degree he claimed from the Chicago Medical College, an institution described by the American Medical Association as “low grade.” The AMA described Scholl’s “whole record” as “entirely irregular.”
Dr. Scholl, or “Dr.” Scholl, built an empire that made his name recognizable all over the world. Degree in hand or not, at the turn of the twentieth century, young Scholl was busy inventing various devices intended to alleviate foot pain. One such device was the “Foot-Eazer,” which was a hit with Ruppert’s Shoe Store customers. Supposedly, one customer offered him several thousand dollars to help him start a business. Scholl declined the offer but was inspired to strike out on his own.
In 1904, Scholl set up shop in a small office in a building at 283-285 E. Madison Street in Chicago – the first location of the Scholl Manufacturing Company. By the next year, he had begun innovating new advertising techniques. Scholl would purportedly travel to various shoe stores, ask for the manager, and set a human foot skeleton on the counter. He used the skeleton to show how complicated and delicate the foot’s many tiny bones are, bearing so much weight and taking so much abuse, and then demonstrated how his products would support and comfort them.
Whether or not his products worked, his strategy of marketing directly to the store manager did. In addition to charging for the construction of the product, he also charged for consultations and fittings. Business boomed, and in 1907 he moved into five rooms in a building on Schiller Street which had been abandoned by Western Wheel Works, a bicycle company. Almost immediately, he purchased the building and expanded the factory until it took up the entire block. The building still stands and is in use as the Cobbler Square apartment complex — a nod to its former use.
By 1908, Scholl was using advertisements in trade journals to continue marketing his products directly to shoe store owners and managers. His approach at this point was to set up a booth at various fairs and train these prospective clients on how to talk about the Foot-Eazer “from a scientific perspective.” The ad below addresses these shoe store managers with several lofty promises about the Foot-Eazer:
“It will pay you well to be an expert in correcting foot troubles. . . you can sell a pair to one customer out of every three. Your profit is a dollar a pair – if you have 3000 customers that’s a thousand dollars for you . . .You will understand the science of it the moment you see it . . . as I have been allowed sweeping patents on it no one else can make anything like it.”
Scholl explained to his clients that his product was backed by “science,” that it would make them rich, and that he was the only one who could provide it.
By 1909 he was recruiting teams of salespeople to approach the store owners for him. He set up a correspondence course to teach them the anatomy of the foot and the “science” behind his products. The course was called “Practipedics” and was described as “The Science of Giving Foot Comfort and Correcting the Cause of Foot and Shoe Troubles Based on the Experience, Inventions and Methods of Dr. William M. Scholl.” The ads from this period show that he was marketing these classes and sales opportunities to both men and women, an interesting approach for a time when few women worked outside the home. The ad below shows a woman studying the Foot-Eazer and promises that “This Alone Should Pay Your Rent.”
From here, Scholl’s business expanded even more quickly. By the time the U.S. entered World War One, Scholl was marketing to three different audiences — managers and owners of shoe stores, retail customers, and potential sales recruits — all through extensive advertising. Hoosier State Chronicles has a wealth of examples of ads for Scholl’s products, for stores selling them, and even for the Practipedics course. Indiana shoe stores often advertised special days where Scholl’s salespeople, presented as medical experts in foot care, would be at the store for personal fittings. In a 1917 issue of the Indianapolis News, the New York Store advertised their latest shoe styles and noted that they carried “A Complete Line of Dr. Scholl’s Foot Comfort Appliances.” In 1920, the South Bend Shoe Company advertised in the South Bend News-Tribune: “Foot Expert Here . . . A specialist from Chicago loaned to this store by Dr. Wm. M. Scholl the recognized foot authority.” This “expert” was most likely trained via correspondence course or week-long class and almost certainly never met Scholl.
Sometimes all three of Scholl’s audiences were targeted in one message, such as in the advertisement below from the Indianapolis News. First, the ad promises foot comfort to the average reader and pedestrian and explains the product while emphasizing the availability of “medically” trained dealers. Second, it advertises Marott’s Shoe Shop on East Washington, whose owners would have to stock up on Scholl’s products and provide the “foot expert.” Finally, the ad explains to shoe dealers and other potential Scholl’s salespeople how to register for the next Scholl’s training course in Indianapolis. Additionally, Marott’s Shoe Shop was a “Dr. Scholl’s Foot Comfort Store,” which was supposed to consistently staff such “trained” foot experts — not just for special events. In Marott’s advertisement, which ran below the Scholl’s advertisement, the store claims that “Dr Scholl’s Foot Appliances are handled exclusively in Indianapolis by Marott’s Shoe Shop.” However, a search of Hoosier State Chronicles shows several other Indianapolis stores shilling for Scholl — including the New York Store from the advertisement above.
Another Indiana “Dr. Scholl’s Foot Comfort Store” was the Lion Store in Hammond. It was one of many stores around the country to participate in Scholl’s marketing plan for “Foot Comfort Week,” advertising its participation and “foot expert” in the Hammond Times on June 12, 1917. Even general clothing stores participated in the marketing scheme. On June 21, 1917, the E. C. Minas Company, which called itself “Hammond’s Greatest Department Store,” advertised “Foot Comfort Week” in the Hammond Times, an event the ad claimed was happening “throughout the continent.” The store noted that it carried “the complete line” of Scholl’s appliances and “experts at fitting them to individual needs.” Later ads for the week-long event featured more outrageous marketing schemes, such as advertisements for “Prettiest Foot” contests. Search Hoosier State Chronicles for more.
By the end of the war, Scholl’s company was established across the U.S., Europe, Egypt, and even Australia. He had also established a podiatry college and written a textbook. However, medical doctors working in the field were quick to criticize Scholl’s entangled business and medical operations and began to publicly question his qualifications. In 1923, the National Association of Chiropodists passed a resolution condemning Scholl’s work and banning him from advertising in their publications. Again, Robert McClory’s investigative article is the best source for more information on the controversy stirred up around Scholl’s standing in the medical community.
Scholl was not slowed by the naysaying in the least. He continued to invent and patent foot products and to open new stores around the world. According to McClory:
“In his lifetime Scholl would create more than 1,000 patented ointments, sprays, cushions, pads, supports, shields, springs and other mechanical and chemical gizmos for the feet. Eventually the Scholl empire would include more than 400 outlet stores and employ some 6,000 people worldwide.”
According to a short essay by Fred Cavinder in Forgotten Hoosiers (2009), during World War II, the Scholl plant in England made surgical and hospital equipment while the Chicago plant converted to the manufacture of military equipment. Cavinder writes, “As World War II ended, Dr. Scholl invented the compact display fixture with the familiar blue and yellow colors.”
Scholl remained connected to the northwest region of Indiana throughout his life. He resided primarily in a single rented room at the downtown Chicago Illinois Athletic Club. However, later in life he purchased a home in Michigan City, Indiana, where he had moved his side business, Arno Adhesive Tapes. This company made all of the plaster and tape for the Dr. Scholl products. In the 1960s, Arno also expanded greatly, and Scholl, by then in his late seventies and eighties, remained just as active in its management.
Scholl died in 1968 and is buried in Pine Lake Cemetery in La Porte, Indiana. His family sold the Scholl’s brand to a large pharmaceutical company in 1979, and it remains successful to this day. So whether we remember him as “Dr.” or Dr. Scholl, he created an empire, changed an industry, and invented new ways to market and advertise. Search Hoosier State Chronicles for many more advertisements.
* The 1900 census gives his birth year as 1884, but all other records including passport applications, WWI draft card, and death records cite 1882 as the correct year.
For further information, especially on the controversy surrounding Scholl’s medical qualifications see:
Robert McClory, “Best Foot Forward,” Chicago Reader, January 13, 1994, accessed ChicagoReader.com
Marriage is complicated enough. Add in opposing political views, routinely confronting systemic racism and sexism, and coping with the hardships of the Great Depression and World War II, and it’s even more challenging. African American attorneys Elizabeth and J. Chester Allen experienced these struggles and, while theirs was not a perfect marriage, through compromise, mutual respect, shared obstacles and goals, and love, they enjoyed 55 years together as man and wife. The South Bend couple dedicated themselves to each other and to uplifting the Black community by crafting legislation, organizing social programs, creating jobs, and demanding educational equality. The opportunities the Allens created for marginalized Hoosiers long outlived them.
On his way to Indianapolis in the late 1920s, J. Chester’s car broke down in South Bend. After staying with a family on Linden Street, he liked the city so much he decided to make it his home. Or so the story goes. Elizabeth Fletcher Allen, whom he met at Boston University and married in 1928, was likely working towards her law degree back in Massachusetts when J. Chester made that fateful trip. She would eventually join her husband in Indiana, but in the meantime J. Chester quickly got to work serving South Bend’s Black community. In 1930, J. Chester was admitted to the bar, and the following year he was appointed County Poor Attorney for St. Joseph County.
His arrival was perhaps serendipitous, as the Great Depression had begun rendering African Americans, who were already disenfranchised, destitute. J. Chester served as management committee chairman of the Hering House, which he described as “‘the clearing house of most of the social activities of the colored people as well as the point of contact between the white and colored groups of South Bend. . . . Its activities in the three fields of spiritual, mental and physical training make it indeed a character building institution.'” Through the organization, J. Chester helped provide 4,678 meals to unemployed African Americans, along with clothes, lodging, and medical aid to others in the Black community in 1931.
In addition to providing basic necessities during those lean years, J. Chester took on various anti-discrimination lawsuits in South Bend. In 1935, he helped prosecute a case against a white restaurant owner who refused to serve Charles H. Wills, Justice of the Peace, in a section designated only for white patrons. That same year, J. Chester served as attorney for the Citizens Committee, formed in protest of the “unwarranted shooting” of Arthur Owens, an 18-year-old Black man, by white police officer Fred Miller. The Indianapolis Recorder, an African American newspaper, noted that eleven eyewitnesses recounted that “the youth was shot by Officer Miller as he stepped from a car with hands raised, after having been commanded by the officer and his companion, Samuel Koco Zrowski, to halt.” The officers had been pursuing the car in the belief it had been stolen.
Elizabeth Allen, likely back in town temporarily, and other Black leaders organized a mass meeting to protest the “wanton, brutal and unwarranted” shooting. Despite boycotts, a benefit ball to raise prosecutorial funds, and protests by the Black community and white communists, a grand jury did not return an indictment against Officer Miller for voluntary and involuntary manslaughter. This, J. Chester said, was due to “blind prejudice on the part of the prosecutor.”
Despite a disheartening outcome, J. Chester continued to lend his legal expertise to combating local discrimination. The following year, he and a team of lawyers challenged Engman Public Natatorium’s ban on African Americans from using the facilities. The team presented a petition, likely prepared by Elizabeth, to the state board of tax commission demanding Engman remove all restrictions. Allen and other NAACP representatives had tried this in 1931, arguing that the natatorium was “supported in whole or in part by taxes paid by residents of the city,” including African Americans. Without access to the pool, they would be relegated to unsafe swimming holes, one of which led to the death of a Black youth the previous summer. While they had no luck in 1931, the 1936 appeal convinced commissioners to provide African American residents access to the pool, but only on the first Monday of every month and on a segregated basis. This was just one victory in the decades-long fight to fully desegregate the natatorium.
While it appears that Elizabeth lent her aid to certain events in South Bend, like protesting the shooting of Owens, it is tough to discern Elizabeth’s activities at this time. This is perhaps due to scant documentation for African Americans, particularly women, during this period. Likely, she was working towards her law degree at Boston University, despite an admissions officer telling her there was “no need to come” and advising that she get married. Proving the officer wrong, Elizabeth not only got married, but gave birth to two children while pursuing her law degree. She attributed this tenacity to the confidence her father instilled in her during childhood and later said, “To be a woman lawyer you have to have the hide of a rhinoceros.”
Her persistence paid off and after joining J. Chester in South Bend, she was admitted to the bar in 1938. Perhaps her presence inspired in him a sense of security and conviction, resulting in a run for the Indiana General Assembly. That year, voters elected J. Chester (D) the first African American to represent St. Joseph County. Rep. Allen introduced and supported bills that would eliminate racial discrimination in sports, the judicial system, and public spaces. The new lawmaker also endorsed bills that would require Indianapolis’s City Hospital to employ Black personnel and that would mandate appointing at least one African American to the State Board of Public Instruction, telling his colleagues “the legislature should see to it that these children had a spokesman of their own racial group to assure their obtaining a measure of equal accommodation and facilities in the segregated public school system” (Indianapolis Recorder, March 11, 1939). Writer L.J. Martin praised Allen’s unwavering commitment to serving Black Hoosiers while in public office, noting in the Indianapolis Recorder,
Hon. J. Chester Allen said he had stayed up late at night reading bills for such ‘racial traps.’ He found them, he eliminated them, one hotel sponsored bill in particular would have been a slap at the race. Mr. Allen astonishes me, in the forcible argument for racial progress.
While J. Chester walked the halls of the statehouse, championing bills that furthered racial equality, Elizabeth was able to make a difference as a lawyer. The couple opened “Allen and Allen” in 1939—the same year she gave birth to their third child. One of the first Black female lawyers in the city, and likely state, Elizabeth quickly forged a reputation as an articulate and ambitious woman. She did not hesitate to express her convictions, not even to First Lady Eleanor Roosevelt. Elizabeth sent her a letter expressing the need to integrate housing and provide African Americans with the same government-funded housing white Americans received. Elizabeth’s son, Dr. Irving Allen, told an interviewer that Roosevelt’s response resulted in his mother’s “angry departure” from the Democratic Party. Allegedly, Roosevelt “sent back this long-winded pretentious letter rationalizing the situation . . . that the races couldn’t live together.” Both idealistic, Dr. Allen recalled that his parents’ political discourse over the dinner table “could blow up at any time.”
Elizabeth’s editorial for the South Bend Tribune, entitled “Negro and 1940,” also provides insight into her views. She lauded the “new Negro,” who:
is fearless and motivated by confidence in his belief that he owes to his race the duty of guiding those members whose minds have not been trained to clear thinking, his knowledge that the able members of his race have always from the beginning of this country contributed to the civic upbuilding and a conviction that it is up to him to keep the gains which have been made.
By this definition, Elizabeth exemplified the “new Negro,” dedicating her life to uplifting South Bend’s Black community through her work with the NAACP’s Legal Redress Committee and by organizing drives to improve housing for minorities. According to her son, Dr. Irving Allen, Elizabeth embodied the Black empowerment she wrote about, challenging oppression and advocating for those “being cheated out of a decent life.” Dr. Allen suspected that his mother also wanted to effect change as a legislator, but sacrificed her political aspirations to support her husband’s career.
Although Elizabeth felt she had to shelve her political aspirations, she complemented her husband’s legislative work, particularly regarding World War II defense employment. The outbreak of war in Europe in 1939 created an immediate need for the manufacture of ordnance. While U.S. government war contracts lifted many Americans out of the poverty wrought by the Depression, many manufacturers refused to hire African Americans. This further disenfranchised them as, according to W. Chester Hibbitt, Chairman of the Citizens’ Defense Council, an estimated 54% of African Americans living in Indiana were on relief by 1941.
And while the federal government complained of a labor shortage, J. Chester contended that “Negro workers, skilled and semi-skilled, by the thousands are walking the streets or working on W. P. A. projects, because they happen to have been endowed with a dark skin by the Creator of all men” (“The Story of House Bill No. 445,” p. 15). He argued that it was the responsibility of lawmakers to prohibit employment discrimination, not only to eliminate poverty, but to safeguard democracy. Echoing the Double V campaign, Rep. Allen stated that “our first line of defense should be the preservation of the belief in the hearts of all men, black and white alike, that Democracy exists for all of us; that we are all entitled to a home, a job and the expectancy of better things to come for our children.” The continued denial of American minorities’ rights undermined the fight for freedom abroad.
Elected to a second term in 1940, J. Chester led the call for anti-discrimination legislation. Months before President Roosevelt issued Executive Order 8802, Rep. Allen and Rep. Evans introduced House Bill No. 445. If enacted, it would make it illegal for Indiana companies benefiting from federal defense contracts “to discriminate against employing any person on account of race, color or creed.” So popular was the bill that after the Indiana Senate passed it, delegations of African Americans and their children filled statehouse corridors and galleries, carrying “placards advocating passage of the bill, describing the measure as the only thing necessary to provide Negroes with jobs” (“The Story of House Bill No. 445,” p. 7).
Despite the bill’s promising prospects, on the last day of session the House kicked it over to the Committee on Military Affairs, where it essentially died. In an article for the Indianapolis Recorder, J. Chester noted that although the bill was defeated,
such state-wide attention had been drawn to the sad economic plight of the Negro workers of Indiana and its attendant dangers that people of both races agreed that the alleviation of the Negro unemployment problem was the number one job of the preparations for war of Indiana and proceeded in forthright home-rule manner to do something about it.
On June 1, 1941, Governor Schricker answered the call to “do something about it,” appointing J. Chester the Coordinator of Negro Affairs to the Indiana State Council of Defense. As part of the Indiana Plan of Bi-Racial Cooperation, Allen traveled throughout the state, appealing to groups like the A.F.L., C.I.O., and the Indiana State Medical, Dental and Pharmaceutical Association, which all formally pledged to employ African Americans. Through intensive groundwork, Allen established bi-racial committees in at least twenty Indiana cities.
Based on the “mutual cooperation between the employer, labor and the Negro,” the Recorder reported that these local committees would “go into action whenever and wherever Negro industrial employment presents a problem.” Although his persuasive skills often convinced employers to hire Black employees, historian Emma Lou Thornbrough noted that “Allen sometimes invoked Order 8802 and threats of federal investigation to persuade management to employ and upgrade black workers.”
Allen and the bi-racial committees also served as a sort of “middlemen” for white employers who wanted to hire African Americans, but were unsure how to recruit those best-suited for the job. Allen and the committees distributed “mimeographed questionnaires,” which provided “more valuable information with respect to Negro labor supplies, skills, etc. This information was then used with great effect in the mobilization and cataloguing of types of dependable Negro workers for local defense industries.”
Under Allen’s leadership, the Indiana Plan proved incredibly successful, providing employment to those, in Allen’s words, “whose record of loyalty and services dates in an unbroken chain back to the year 1620” (“The Indiana Plan of Bi-Racial Cooperation,” p. 5). According to the “Job Opportunities for Negroes” pamphlet, between July 1, 1941 and July 1, 1942, there “was a net increase of 82% Negro employment, most of which was in manufacturing. . . . working conditions also improved” (p. 2). (It should be noted that employers continued to deny African Americans jobs in “skilled capacities.”) In fact, Indiana was awarded the “Citation of Merit” by the National Director of Civilian Defense for “outstanding work in the field of race relations.” So efficiently was the plan organized and implemented that other states used it as a model to bring African Americans into the workforce.
The Bi-Racial Cooperation Plan’s significance endured long after World War II ended. White employers could no longer claim that Black Hoosiers lacked the skills or competence required of the workplace or that it was “unnatural” for white and Black employees to work alongside each other. Reflecting on the program, Allen wrote in 1945, “Time was when a Negro interested in securing better employment opportunities for his people could not even obtain an audience with those able to grant such favors.” But the Bi-Racial Cooperation plan “has accomplished more for the Negro’s permanent economic improvement than had been done in the preceding history of the state.”
While African Americans were often the first to be let go from defense jobs with the conclusion of war, Allen’s work permanently wedged the door open to employment for Black Hoosiers. Allen, perhaps at the encouragement of Elizabeth, emphasized the importance of creating job opportunities for Black women and in his 1945 article noted that thousands of female laborers “have been upgraded from traditional domestic jobs, to which all colored women had previously been assigned irrespective of training or ability, to defense plants as receptionists, power-sewing machine operators, line operators and other better paying positions where their training can be utilized.”
Like her husband, Elizabeth refused to accept that Black Hoosiers would be excluded from the economic boon created by defense jobs. In the early 1940s, she established a nurse’s aide training and placement program for Black women in St. Joseph County. Of her WWII work, Elizabeth’s son said that she opened professional doors for Black women and that she saw herself as helping people who were oppressed. Like J. Chester, Elizabeth helped select local men for placement in defense jobs and, according to an October 11, 1941, Indianapolis Recorder article:
used the utmost care in selecting the men to go into the factory realizing that future opportunities were dependent upon the foundation which these pioneers laid both in building good will among the fellow employes, and proving to the management that colored are reliable, trustworthy, hard-working and capable of advancing.
While J. Chester traveled the state, Elizabeth tended to the needs of the local community, chairing a drive in 1942 at Hering House for “community betterment in housing[,] social and industrial fields.” In the 1940s, Elizabeth organized various meetings to improve local housing for the Black community, emphasizing the link between substandard residences and crime rates, delinquency, and health. Deeply committed to ensuring quality education for African American children, Elizabeth founded Educational Service, Inc. in 1943, which encouraged youth to pursue social and economic advancement, provided financial aid to “worthy” students, offered individual counseling, and fostered good citizens. All of this while caring for three young children and likely manning the couple’s law office, as J. Chester fulfilled his duties with the Indiana State Council of Defense. Fortunately, Elizabeth later told the South Bend Tribune, “I want to keep busy constantly. I have to be about something all the time.”
When the war clouds cleared, the Allens achieved many of their professional and philanthropic goals. But they also experienced immense personal loss that appeared to test their marriage. Their post-war journey is explored in Part II.
Elizabeth F. Allen, “Negro and 1940,” South Bend Tribune, October 1, 1939, 5, accessed Newspapers.com.
The Indiana State Chamber of Commerce, “The Story of House Bill No. 445 . . . A Bill That Failed to Pass,” (Indianapolis, 1941?), Indiana State Library pamphlet.
The Indiana State Defense Council and The Indiana State Chamber of Commerce, “The Indiana Plan of Bi-Racial Cooperation,” Pamphlet No. 3, (April 1942), Indiana State Library pamphlet.
Mary Butler, “Mrs. Elizabeth Allen Lays Down Law to Family,” South Bend Tribune, July 30, 1950, 39, accessed Newspapers.com.
“Adult Award Winner,” South Bend Urban League and Hering House, Annual Report, 1960, p. 5, accessed Michiana Memory.
“Area Women Lawyers Tell It ‘Like It Is,’” South Bend Tribune, March 9, 1975, 69, accessed Newspapers.com.
Marilyn Klimek, “Couple Led in Area Racial Integration,” South Bend Tribune, November 30, 1997, 15, accessed Newspapers.com.
Emma Lou Thornbrough, Indiana Blacks in the Twentieth Century (Bloomington: Indiana University Press, 2000), p. 207.
Oral History Interview with Dr. Irving Allen, conducted by Dr. Les Lamon, IU South Bend Professor Emeritus, David Healey, and John Charles Bryant, Part 1 and Part 2, August 11, 2004, Civil Rights Heritage Center, courtesy of St. Joseph County Public Library, accessed Michiana Memory Digital Collection.
“The Long Distance Telephone is the Modern Thanksgiving Greeting:” this 1929 Indiana Bell Telephone Co. advertisement will certainly resonate with Hoosiers, who are finding alternative ways to spend the holidays during the pandemic. The ad continues—and we relate—”Distances, however, and the press of modern affairs sometimes seek to rob us” of the mouthwatering aromas of Grandma’s kitchen. Fortunately, the #telephone “takes our voices quickly and easily to the home folks wherever they are, and leaves lasting impressions of thoughtfulness and occasion for real Thanksgiving.”
Despite the stock market having just crashed, Americans in 1929 kept traditions alive and counted their blessings. While 2020 celebrations will look different in many Hoosier households, we thought we would look back at some of the recipes shared in the pages of historic Indiana newspapers, especially those published during periods of hardship. But before you get to cooking, be sure to pick up some skillets, pie dishes, and perhaps some nut crackers (to keep greedy fingers at bay) from Vonnegut’s.
Perhaps bespeaking the tension felt in households across the nation during the Great Depression, Jean Allen told the tale of one woman, who was grateful that Thanksgiving came only once a year (Muncie Star Press, November 17, 1934, 8). The woman “gave each of her children a sound spanking, tucked them in bed, and sat down to plan her Christmas dinner.” Mindful of these struggles, Allen crafted menus that would “save you a lot of work, worry, and wear and tear,” with a focus on “goodness” and cost.
If Allen’s recipes aren’t to your liking, check out this 1935 issue of the African American newspaper, the Indianapolis Recorder, which featured all cranberry everything, from tapioca to ice.
Just days before the attack on Pearl Harbor plunged Americans into World War II, the Indianapolis Recorder noted that during a “New Deal Thanksgiving,” it was understandable that “some of us didn’t get right into the spirit of it.” Nonetheless, one could take a decorative page from those who did, bestowing their dinner table with lace and yellow chrysanthemums or perhaps a combination of fruit, apple leaves, and red, gold, and white placards.
The following year, the Recorder noted that there was much to be thankful for “in a world and season of great distress,” as Americans were “confronted presently with obligations and sacrifices to be made in prosecuting the war.” While it was natural to despair, and to worry that next year’s Thanksgiving could require even more sacrifices and rationing, the author wrote “the American people generally have enjoyed an abundance of the comforts or luxuries of life not realized by other peoples of the world. We have taken the needs or desires of our daily life as a matter of course.” Bowed over steaming plates, Hoosiers likely prayed for the safety of their sons, uncles, aunts, brothers, and sisters overseas.
A seasoned procrastinator? The Kokomo Tribune has you covered with some last minute recipes. But before digging in, be mindful of Dr. C.C. Robinson’s suggestions. He advised readers in 1923, via the Muncie Evening Post, to “Remember that cheerfulness is a most necessary asset for enjoying a real meal. If your wife has invited someone who doesn’t agree with your idea on the League of Nations, don’t forget to carry on with a smile just the same. It helps the liver secretions.” Sound advice, in these polarized times. However, we have to disagree with his warning “Don’t think you have to eat everything.” After sampling the fare, be sure to compliment the chef, as it “may make her heart beat a little faster or increase the blood pressure for the time being.”
If you’re looking for a way to use up some of that leftover turkey—once the tryptophan wears off, of course—this issue of the South Bend News-Times serves up several ideas.
While this year’s Turkey Day feels a little different, these articles show that historically Americans have adapted to hardship, while retaining a sense of gratitude. Whether you’re making a meal for those closest to you or daydreaming of next year’s meal, we hope you have enjoyed exploring Thanksgiving recipes from years past. Search for more recipes using Newspapers.com and Hoosier State Chronicles, which provides free access to over 1.1 million pages of newspapers spanning 216 years.
Describing the presidency of Franklin D. Roosevelt for the 2014 Ken Burns documentary The Roosevelts, conservative political writer George F. Will stated:
The presidency is like a soft leather glove, and it takes the shape of the hand that’s put into it. And when a very big hand is put into it and stretches the glove — stretches the office — the glove never quite shrinks back to what it was. So we are all living today with an office enlarged permanently by Franklin Roosevelt. 
Seventy-five years after President Roosevelt’s death, the debate continues over how much power the president should have, especially in regards to taking military action against a foreign power. On January 9, 2020, the U.S. House of Representatives voted to restrict that power, requiring congressional authorization for further action against Iran. The issue now moves to the Senate.
But the arguments over this balance of war powers are not new. In fact, in 1935, Indiana congressman Louis Ludlow forwarded a different solution altogether – an amendment to the U.S. Constitution that would allow a declaration of war only after a national referendum, that is, a direct vote of the American people. Had the Ludlow Amendment passed, the U.S. would only engage militarily with a foreign power if the majority of citizens agreed that the cause was just. Ludlow’s ideas remain interesting today as newspaper articles and op-eds tell us the opinions of our Republican and Democratic representatives regarding the power of the legislative branch versus the executive branch in declaring war or military action. But what do the American people think, especially those who would have to fight? According to Brown University’s Costs of War Project, “The US government is conducting counterterror activities in 80 countries,” and the New York Times reported last year that we now have troops in “nearly every country.” But what does it mean to say “we” have troops in these countries? And does that mean that we are at war? Do the American people support the deployment of troops to Yemen? Somalia? Syria? Niger? Does the average American even know about these conflicts?
Expanding Executive War Power
Many don’t know, partly because the nature of war has changed since WWII. We have a paid professional military as opposed to drafted private citizens, which removes the realities of war from the daily lives of most Americans. Drone strikes make war seem even more obscure compared to boots on the ground, while cyber warfare abstracts the picture further.  But Americans also remain unaware of our military actions because “U.S. leaders have studiously avoided being seen engaging in ‘war,’” according to international news magazine the Diplomat.  In fact, Congress has not officially declared war since World War II.  Instead, today, Congress approves “an authorization of the use of force,” which can be “fuzzy” and “open-ended.”  Despite the passage of the War Powers Act of 1973, which was intended to balance war powers between the president and Congress, presidents have consistently found ways to deploy troops without congressional authorization.  And today, the Authorization for Use of Military Force Joint Resolution, passed in the wake of the September 11 attacks, justified an even greater extension of executive power in deploying armed forces.
“To Give to the People the Right to Decide . . .”
Indiana congressman Louis L. Ludlow (Democrat – U.S. House of Representatives, 1929-1949), believed the American people should have the sole power to declare war through a national referendum.  After all, the American people, not Congress and not the President, are tasked with fighting these wars. Starting in the 1930s, Representative Ludlow worked to amend the Constitution in order to put such direct democracy into action. He nearly succeeded. And as the debate continues today over who has the power to send American troops into combat and what the United States’ role should be in the world, his arguments concerning checks and balances on war powers remain relevant.
Ludlow maintained two defining viewpoints that could be easily misinterpreted, and thus are worth examining up front. First, Ludlow was an isolationist, but not for the same reasons as many of his peers, whose viewpoints were driven by the prevalent xenophobia, racism, and nativism rooted in the 1920s. In fact, Ludlow was a proponent of equal rights for women and African Americans throughout his career.  Ludlow’s isolationism was instead influenced by the results of a post-WWI congressional investigation showing the influence of foreign propaganda and munitions and banking interests in profiting off the conflict. 
Second, Ludlow was not a pacifist. He believed in just wars waged in the name of freedom, citing the American Revolution and the Union cause during the American Civil War.  He supported the draft during WWI and backed the war effort through newspaper articles.  Indeed, he even voted with his party, albeit reluctantly, to enter WWII after the bombing of Pearl Harbor.  He believed a direct attack justified a declaration of war and included this caveat in his original resolution. What he did not believe in was entering war under the influence of corporations or propaganda. He wanted informed citizens, free of administrative or corporate pressure, to decide for themselves if a cause was worth their lives. He wrote, “I am willing to die for my beloved country but I am not willing to die for greedy selfish interests that want to use me as their pawn.” 
So, who was Louis Ludlow and how did he come to advocate for this bold amendment?
“I Must and Would Prove My Hoosier Blood”
Ludlow described himself as a “Hoosier born and bred” in his 1924 memoir of his early career as a newspaper writer.  He was born June 24, 1873 in a log cabin near Connersville, Fayette County, Indiana. His parents encouraged his interests in politics and writing, and after he graduated high school in 1892, he went to Indianapolis “with food prepared by his mother and a strong desire to become a newspaperman.” 
He landed his first job with the Indianapolis Sun upon arrival in the Hoosier capital but quickly realized he needed more formal education. He briefly attended Indiana University before becoming seriously ill and returning to his parents’ home. After he recovered, he spent some time in New York City, but returned to Indianapolis in 1895. He worked for two newspapers, one Democratic (the Sentinel) and one Republican (the Journal), and then for the Indianapolis Press from 1899 to 1901. While he mainly covered political conventions and campaign speeches, he interviewed prominent suffrage worker May Wright Sewall and former President Benjamin Harrison, among other notables. He also became a correspondent for the (New York) World. 
In 1901, the Sentinel sent Ludlow to Washington as a correspondent, beginning a twenty-seven-year career of covering the capital. During this time, he worked long hours, expanded his political contacts, and distributed his stories to more and more newspapers. He covered debates in Congress during World War I and was influenced by arguments that membership in the League of Nations would draw the U.S. further into conflict. By 1927 he was elected president of the National Press Club. He was at the height of his journalistic career and had a good rapport and reputation within the U.S. House of Representatives.
With the backing of Democratic political boss Thomas Taggart, Ludlow began his first congressional campaign at the end of 1927 and announced his candidacy officially on February 23, 1928.  The Greencastle Daily Herald quoted part of Ludlow’s announcement speech, noting that the candidate stated, “some homespun honesty in politics is a pressing necessity in Indiana.”  He won the Democratic primary in May 1928 and then campaigned against Republican Ralph E. Updike, offering Hoosiers “redemption” from the influence of the KKK.  Ludlow “swept to an impressive victory” over Updike in November 1928, as the only Democrat elected from 269 Marion County precincts.  He took his seat as the Seventh District U.S. Representative from Indiana on March 4, 1929. 
The Indianapolis Star noted that while Ludlow was only a freshman congressman, his many years in Washington as a correspondent had made him “familiar with the workings of the congressional machinery” and “well known to all [House] members,” earning him the “confidence and respect of Republicans and Democrats alike.”  The Star claimed: “Perhaps no man ever entering Congress has had the good will of so many members on both sides of the aisle.”  This claim was supported by Ludlow’s colleagues on the other side of that aisle. Republican senator James E. Watson of Indiana stated in 1929, “Everybody has a fondness for Louis Ludlow, and as a congressional colleague, he shall have the co-operation of my office in the advancement of whatever he considers in the interest of his constituency.”  Republican representative John Cable of Ohio agreed stating:
Louis Ludlow has character and ability. He is the sort of a man who commands the respect and confidence of men and women without regard to party lines. He will have the co-operation of his colleagues of Congress, Republican as well as Democrats, and no doubt will render a high class service for his district.
Cable went so far as to recommend Ludlow for the vice-presidential candidate for the 1932 election.
Ludlow achieved some modest early economic successes for his constituents, including bringing a veterans hospital and an air mail route to Indianapolis. By 1930, however, he set his sights on limiting government bureaucracy and became interested in disarmament as a method to reduce government spending. Concurrently, he threw his support behind the London Naval Treaty which limited the arms race, and he became a member of the Indiana World Peace Committee. During the 1930 election, he stressed his accomplishments and appealed to women, African Americans, Jews, veterans, businessmen, and labor unions. He was easily reelected by over 30,000 votes. 
Back at work in the House, he sponsored an amendment to the Constitution in 1932 to give women “equal rights throughout the United States” which would have addressed legal and financial barriers to equality. He was unsuccessful but undaunted. He introduced an equal rights amendment in 1933, 1936, 1939, 1943, and 1945.  [A separate post would be needed to do justice to his work on behalf of women’s rights.] He also worked to make the federal government responsible for investigating lynching, as opposed to the local communities where the injustice occurred. He introduced several bills in 1938 that would have required FBI agents to investigate lynchings as a deterrent to this hate crime, but they were blocked by Southern Democrats. His main focus between 1935 and 1945 was advocating for the passage of legislation to restrict the government’s war powers and end corporate war profiteering.
“To Remove The Profit Incentive to War”
In 1934 the Special Committee on Investigation of the Munitions Industry, known as the Nye Committee after its chairman Senator Gerald Nye (R-ND), began to investigate the undue influence of munitions interests on U.S. entry into WWI. Like many Americans, Ludlow was profoundly disturbed by the committee’s conclusions. As Germany rearmed and Hitler’s power grew during the 1930s, Ludlow worried that the threat of a second world war loomed and that the U.S. government, especially the executive branch, was vulnerable to the influence of profiteers, as highlighted by the Nye Committee reports. He stated:
I am convinced from my familiarity with the testimony of the Nye committee and my study of this question that a mere dozen – half a dozen international financiers and half a dozen munitions kings, with a complaisant President in the White House at Washington – could maneuver this country into war at any time, so great are their resources and so far reaching is their power. I pray to God we may never have a President who will lend himself to such activities, but, after all, Presidents are human, and many Presidents have been devoted to the material aggrandizement of our country to the exclusion of spiritual values . . . 
Although he admired President Franklin D. Roosevelt’s diplomatic abilities, Ludlow thought, as historian Walter R. Griffin asserted, that “it was entirely possible that a future President might very well possess more sordid motives and plan to maneuver the country into war against the wishes of the majority of citizens.”  As a protection against the susceptibility of the legislative and especially the executive branches to financial pressures of the munitions industry, Ludlow introduced a simple two-part resolution [HR-167] before the House of Representatives in January 1935. It would amend the Constitution to require a vote of the people before any declaration of war. He summed up the two sections of his bill in a speech before the House in February 1935: “First. To give the people who have to pay the awful costs of war the right to decide whether there shall be war. Second. To remove the profit incentive to war.”  He believed that the resolution gave to American citizens “the right to a referendum on war, so that when war is declared it will be the solemn, consecrated act of the people themselves, and not the act of conscienceless, selfish interests using the innocent young manhood of the Nation as its pawns.”
More specifically, Section One stated that unless the U.S. was attacked, Congress could not declare war without a majority vote in a national referendum. And Section Two provided that once war was declared, all properties, factories, supplies, workers, etc. necessary to wage war would be taken over by the government. Those companies would then be reimbursed at a rate not exceeding 4% higher than their previous year’s tax values.  This would remove the profit incentive and thus any immoral reasons for a declaration of war.
In an NBC Radio address in March 1935, Ludlow told the public:
The Nye committee has brought out clearly, plainly and so unmistakably that it must hit every thinking person in the face, the fact that unless we write into the constitution of the United States a provision reserving to the people the right to declare war and taking the profits out of war we shall wake up to find ourselves again plunged into the hell of war . . . 
He added that “a declaration of war is the highest act of sovereignty. It is a responsibility of such magnitude that it should rest on the people themselves . . .” 
Ludlow’s resolution, soon known as the Ludlow Amendment, was immediately referred to the House Committee on the Judiciary. During committee hearings in June 1935, no one spoke in opposition to the bill, and yet the committee did not report the resolution to the House before the end of the first session in August, nor when it reconvened in 1936. Ludlow attempted to force its consideration with a discharge petition but couldn’t round up enough congressional signatures. Congress was busy creating a second round of New Deal legislation intended to combat the Great Depression and was less concerned with the war clouds gathering over Europe. Despite Ludlow’s passionate advocacy both in the House and to the public, his bill languished in committee. In February 1937, he made a fresh attempt, dividing Sections One and Two into separate bills. The same obstacles persisted, and despite gathering more congressional support for his discharge petition, these resolutions too remained in committee. 
“What Might Have Been”
During a special session called by Roosevelt in November 1937 (to introduce what has become known as the “court-packing plan”), Ludlow was able to obtain the necessary signatures to release his resolution from committee. While congressional support for the Ludlow Amendment had increased, mainly due to the advocacy of its namesake, opposition had unified as well. Opponents argued that it would so reduce the power of the president that he would lose the respect of foreign powers, ultimately making the U.S. less safe. Others argued that it completely undermined representative government by circumventing Congress and would thus erode U.S. republican democracy. Veterans’ organizations like the American Legion were among its opponents, and National Commander Daniel J. Doherty combined these arguments into a public statement before the January 1939 House vote. He stated that the bill “would seriously impair the functions and utility of our Department of State, the first line of our national defense.” He continued: “The proposed amendment implies lack of confidence on the part of our people in the congressional representatives. This is not in accord with the facts. Other nations would readily interpret it as a sign of weakness.”  The Indianapolis Star compared the debates over the resolution to “dynamite” in the House of Representatives. And while Ludlow had the backing of “1,000 nationally known persons,” who issued statements of support, his opponents had the backing of President Roosevelt, who continued to expand the powers of the executive branch. In a final vote the Ludlow Amendment was defeated 209-188. 
Ludlow continued to be a supporter of Roosevelt and when Japanese forces attacked Pearl Harbor in 1941, the Indiana congressman voted to declare war, albeit reluctantly. He stated:
Japan has determined my vote in the present situation. If the United States had not been attacked I would not vote for a war declaration but we have been attacked . . . American blood has been spilled and American lives have been lost . . . We should do everything that is necessary to defend ourselves and to see that American lives and property are made secure. That is the first duty and obligation of sovereignty. 
After the close of World War II, Louis Ludlow continued his work for peace at an international level, calling on the United Nations to ban the atomic bomb. But he no longer advocated for his bill, stating that with the introduction of the bomb and other advanced war technology it was “now too late for war referendums.”  He told Congress in 1948:
Looking backward, I cannot escape the belief that the death of the resolution was one of the tragedies of all time. The leadership of the greatest and most powerful nation on earth might have deflected the thinking of the world into peaceful channels. Instead, we went ahead with tremendous pace in the invention of destruction . . . I cannot help thinking what might have been. 
Ludlow continued his service as a member of the U.S. House of Representatives until January 1949 after choosing not to seek reelection. Instead of retiring, he returned to the Capitol press gallery where his career had begun some fifty years earlier. And before his death in 1950, he wrote a weekly Washington column for his hometown newspaper, the Indianapolis Star.
“The People . . . Need to Have a Major Voice in the Use of Force . . .”
Ludlow’s eighty-five-year-old argument for giving Americans a greater voice in declaring war gives us food for thought in the current debate over war powers. Today, the conversation has veered away from Ludlow’s call for a direct referendum, but the right of the people to be heard via their elected representatives is being argued over heatedly in Congress. Many writers for conservative-leaning journals such as the National Review agree with their liberal counterparts at magazines like the New Yorker that Congress needs to reassert its constitutional power under Article I to declare war and rein in the powers of the executive branch. This, they argue, is especially important in an era where the “enemy” is not as clearly defined as it had been during the World Wars. Writing for the National Review in 2017, Andrew McCarthy argued:
The further removed the use of force is from an identifiable threat to vital American interests, the more imperative it is that Congress weighs in, endorses or withholds authorization for combat operations . . . to ensure that military force is employed only for political ends that are worth fighting for, and that the public will perceive as worth fighting for. 
Writing for the New Yorker in 2017, Jeffrey Frank agreed, stating:
The Constitution is a remarkable document, and few question a President’s power to respond if the nation is attacked. But the founders could not have imagined a world in which one person, whatever his rank or title, would have the authority to order the preemptive use of nuclear weapons – an action that . . . now seems within the realm of possibility.
And in describing the nonpartisan legal group Protect Democracy’s work to create a “roadmap” for balancing congressional and executive powers, conservative writer David French wrote for the National Review that “requiring congressional military authorizations in all but the most emergency of circumstances will grant the public a greater voice in the most consequential decisions any government can make.” 
So, if many liberals and conservatives agree that Congress should hold the balance of war powers, who is resisting a return to congressional authorization for military conflicts? According to the Law Library of Congress, the answer would be all modern U.S. Presidents. The library’s website explains that “U.S. Presidents have consistently taken the position that War Powers Resolution is an unconstitutional infringement upon the power of the executive branch” and found ways to circumvent its constraints. 
This bloating of executive war power is exactly what Ludlow feared. When his proposed amendment was crushed by the force of the Roosevelt administration, Ludlow held no personal resentment against FDR. He believed that this particular president would always carefully weigh the significance of a cause before risking American lives. Instead, Ludlow feared how expanded executive war powers might be used by some future president. In a January 5, 1936 letter, Ludlow wrote:
No stauncher friend of peace ever occupied the executive office than President Roosevelt, but after all, the period of one President’s service is but a second in the life of a nation, and I shudder to think what might happen to our beloved country sometime in the future if a tyrant of Napoleonic stripe should appear in the White House, grab the war power, and run amuck. 
A bridge between Ludlow’s argument and contemporary calls for Congress to reassert its authority can be found in the words of more recent Hoosier public servants. Former Democratic U.S. Representative Lee Hamilton and Republican Senator Richard Lugar testified before the Senate Committee on Foreign Relations on April 28, 2009 on “War Powers in the 21st Century.” Senator Lugar stated:
Under our Constitution, decisions about the use of force involve the shared responsibilities of the President and the Congress, and our system works best when the two branches work cooperatively in reaching such decisions. While this is an ideal toward which the President and Congress may strive, it has sometimes proved to be very hard to achieve in practice . . . The War Powers Resolution has not proven to be a panacea, and Presidents have not always consulted formally with the Congress before reaching decisions to introduce U.S. force into hostilities . . . 
In 2017, in words that echo Rep. Ludlow’s arguments, Rep. Hamilton reiterated that “the people who have to do the fighting and bear the costs need to have a major voice in the use of force, and the best way to ensure that is with the involvement of Congress.” While the “enemy” may change and while technology further abstracts war, the questions about war powers remain remarkably consistent: Who declares war and does this reflect the will of the people who will fight in those conflicts? By setting aside current political biases and looking to the past, we can sometimes see more clearly into the crux of the issues. Ludlow would likely be surprised that the arguments have changed so little and that we’re still sorting it out.
Kreps writes that this “light footprint warfare,” made possible by technological advancement, creates a “gray zone” in which it’s unclear which actors are responsible for what results, thus fragmenting opposition.
 Garance Franke-Ruta, “All the Previous Declarations of War,” The Atlantic, August 31, 2013; Robert P. George and Michael Stokes Paulsen, “Authorize Force Now,” National Review, February 26, 2014.
Franke-Ruta wrote about congressional use of force in Syria in 2013: “If history is any guide, that’s going to be a rather open-ended commitment, as fuzzy on the back-end as on the front.” Writing for the National Review in 2014, Robert P. George and Michael Stokes Paulsen agreed that in all cases of engaging in armed conflict not in response to direct attack, the president’s power to involve the U.S. in military conflict is “sufficiently doubtful” and “dubious.”
While the purpose of the War Powers Resolution, or War Powers Act, was to ensure balance between the executive and legislative branches in sending U.S. armed forces into hostile situations, “U.S. Presidents have consistently taken the position that War Powers Resolution is an unconstitutional infringement upon the power of the executive branch” and found ways to circumvent its constraints, according to the Law Library of Congress. Examples include President Reagan’s deployment of Marines to Lebanon starting in 1982, President George H. W. Bush’s building of forces for Operation Desert Shield starting in 1990, and President Clinton’s use of airstrikes and peacekeeping forces in Bosnia and Kosovo in the 1990s.
Writer and National Review editor Jim Geraghty wrote in 2013: “There are those who believe the War Powers Act is unconstitutional – such as all recent presidents . . .” Journals as politically diverse as the National Review and its liberal counterpart the New Yorker are rife with articles and opinion pieces debating the legality and constitutionality of the Act. Despite their leanings, they are widely consistent in calling on Congress to reassert its constitutional authority to declare war and rein in the war powers of the executive branch.
According to the Law Library of Congress, in 2001, Congress transferred more war power to President George W. Bush through Public Law 107-40, authorizing him to use “all necessary and appropriate force” against nations, groups, or even individuals who aided the September 11 attacks.
 Louis Ludlow, Hell or Heaven (Boston: The Stratford Company, 1937).
 Walter R. Griffin, “Louis Ludlow and the War Referendum Crusade, 1935-1941,” Indiana Magazine of History 64, no. 4 (December 1968), 270-272, accessed Indiana University Scholarworks. Griffin downplays Ludlow’s early congressional career; however, Ludlow pushed for many Progressive Era reforms. He worked for an equal rights amendment for women, an anti-lynching bill, and the repeal of Prohibition.
Ibid.; United States Congress,“Report of the Special Committee on Investigation of the Munitions Industry (The Nye Report),” Senate, 74th Congress, Second Session, February 24, 1936, 3-13, accessed Mount Holyoke College.
 “Speech of Hon. Louis Ludlow of Indiana, in the U.S. House of Representatives,” February 19, 1935, Congressional Record, 74th Congress, First Session, Pamphlets Collection, Indiana State Library.
 Ernest C. Bolt, Jr., “Reluctant Belligerent: The Career of Louis Ludlow” in Their Infinite Variety: Essays on Indiana Politicians, eds. Robert Barrows and Shirley S. McCord, (Indianapolis: Indiana Historical Bureau, 1981): 363-364.
 Louis Ludlow, Public Letter, March 8, 1935, Ludlow War Referendum Scrapbooks, Lilly Library, Indiana University, cited in Griffin, 273.
 Louis Ludlow, From Cornfield to Press Gallery: Adventures and Reminiscences of a Veteran Washington Correspondent (Washington D.C., 1924), 1. The section title also comes from this source and page. Ludlow was referring to the Hoosier tendency to write books exhibited during the Golden Age of Indiana Literature.
 “G.O.P. Wins in Marion County,” Greencastle Herald, November 7, 1927, 3, accessed Hoosier State Chronicles; “Ludlow Wins Congress Seat,” Indianapolis Star, November 27, 1928, 1, accessed Newspapers.com.
 Everett C. Watkins, “Ludlow Will Leap from Press Gallery to Floor of Congress,” Indianapolis Star, March 3, 1929, 13, accessed Newspapers.com.
 “Discuss Women’s Rights,” Nebraska State Journal, March 24, 1932, 3, accessed Newspapers.com; “Women Argue in Favor of Changes in Nation’s Laws,” Jacksonville (Illinois) Daily Journal, March 24, 1932, 5, accessed Newspapers.com; “Woman’s Party Condemns Trial of Virginia Patricide,” Salt Lake Tribune, December 2, 1925, 1, accessed Newspapers.com; “Equal Rights Demanded,” Ada (Oklahoma) Weekly News, January 5, 1939, 7, accessed Newspapers.com; Bolt, 383.
The National League of Women Voters crafted the language of the original bill which Ludlow then sponsored and introduced. In 1935, the organization passed a resolution that “expressed gratitude . . . to Representative Louis Ludlow of Indiana for championing women’s rights.”
 “Ludlow Asks War Act Now,” Indianapolis Star, March 13, 1935, 11, accessed Newspapers.com.
 “To Amend the Constitution with Respect to the Declaration of War,” Hearing before Subcommittee No. 2 of the Committee on the Judiciary House of Representatives, 74th Congress, First Session, On H. J. Res. 167, accessed HathiTrust; Griffin, 274-275.
 Everett C. Watkins, “Ludlow Bill ‘Dynamite’ in House Today,” Indianapolis Star, January 10, 1938, 1, accessed Newspapers.com.
 Louis Ludlow to William Bigelow, January 5, 1936, in Griffin, 282.
 U.S. Senate Committee on Foreign Relations, War Powers in the 21st Century, April 28, 2009, Hearing before the Committee on Foreign Relations, United States Senate, 111th Congress, First Session, (Washington: U.S. Government Printing Press, 2010), accessed govinfo.gov.
On the corner of Sixth Avenue and Washington Street stands a complex forged out of Indiana limestone. Plants creep through shattered windows, “UR MOM” is spray-painted across a balcony, and the scorched roof opens up into the heavens. The remains of Gary’s City Church represent very different things to onlookers. For some, they symbolize the unfulfilled promise of industrial utopia. For others like Olon Dotson, professor of Architecture and Planning at Ball State University and a Ph.D. candidate in Purdue University’s American Studies Program, “The remains of the structure serve as a monument to racism and segregation.” For most, it is simply the backdrop for a scene in Transformers 3. Few would disagree, however, that City Church embodies the rise and fall of Steel City.
The church’s history is as nuanced as the feelings its remains inspire. The First Methodist Episcopal Church of Gary was established in 1906, the same year the United States Steel Corporation gave birth to the city. The company converted acres of swampland and sand dunes, and soon Gary—named after U.S. Steel founding chairman Elbert Henry Gary—found itself dominated by steel mills. The expanding market for steel shaped the city’s built environment and encouraged population growth there. Between 1906 and 1930, increasing numbers of European immigrants, Black Southerners, Mexicans, and white migrants flocked to the region looking for work in the steel industry.
Historian James B. Lane contended that “Because of U.S. Steel’s limited concept of town planning, two strikingly different Gary’s emerged: one neat and scenic, the other chaotic and squalid.” Businessmen, as well as skilled plant operators and managers, settled north of the Wabash Railroad tracks. They resided in Gary Land Company’s subdivisions among paved streets, quaint homes, and lush rows of trees. Northsiders relaxed in limestone restaurants and club rooms after a long day of work. The cost to live in this area precluded many newcomers, primarily African Americans and immigrants, from settling there. They instead lived on the Southside, often in tarpaper shacks, tents, and barracks that lacked ventilation. Lane noted that because the Gary Land Company largely neglected this area, landlords “took advantage of the housing shortage and absence of health regulations or building codes by charging inflated rents and selling property under fraudulent liens.” This marshy region, deemed the “Patch,” attracted “mosquitos, and the pestilential outhouses, unpaved alleys, damp cellars, and overcrowded dwellings were breeding grounds for typhoid, malaria, and tuberculosis.”
Lane noted that immigrant families on the Southside organized into “shanty” communities, where they “stuck together but adjusted their old-world lifestyles to new circumstances.” Sometimes various ethnic and racial groups socialized, and even learned from one another, as Black residents taught immigrants English and vice versa. Lacking access to the opportunities and amenities of the Northside, rampant crime and vice arose as “laborers entered the omnipresent bars armed and ready to squeeze a few hours of action into their grim lives.” Segregated from its inception, Gary’s social construction ultimately resulted in its implosion.
In the burgeoning metropolis, the aforementioned First Methodist congregation met in local schools, businesses, and an abandoned factory before constructing a church on the corner of Adams Street and Seventh Avenue in 1912. With rapid socioeconomic and demographic change taking place in Gary, the church, under the vision of white pastor William Grant Seaman, initiated plans in 1917 to move into the heart of the city. A native of Wakarusa, Indiana, Seaman earned his B.A. from DePauw University and his Ph.D. from Boston University. After ministering and teaching in various states, the pragmatic pastor relocated to Steel City in 1916 at the request of Chicago Bishop Thomas Nicholson.
Seaman, nicknamed “Sunny Jim” for his disposition, contended that Gary’s Methodist church had an obligation to ease the challenges faced by the:
industrial worker . . . often suffering injustice;
the foreigners within our boundaries . . . They represent some fifty different race and language groups;
our brothers in black, coming from the Southland in a continuous stream;
our own white Americans, who come in large numbers from the village and the farm.
He noted that this ministry was especially important, given that many urban churches had relocated to Gary’s outskirts as the city grew more congested. According to historian James W. Lewis, Reverend Seaman felt “the modern city was plagued by a breakdown of traditional community and social control, resulting in an anonymous, mobile, materialistic, hedonistic population.” He therefore believed that it was the church’s responsibility “to develop programs which would provide some of the support, guidance, and satisfaction characteristic of traditional communities.”
Compassionate and industrious, Seaman felt called to meet the “religious and creature-comfort need[s]” of the laborers and their families who poured “in great human streams through the gates of these mills.” However, his beliefs about the city’s newcomers, particularly the African American population, are problematic by today’s standards. He felt that white church leaders were best qualified to uplift the growing Black population, writing in 1920 that “colored people are very ignorant, and to a surprising degree morally undeveloped, and this fact is true of a very large number of their preachers.” Seaman justified the need for white leadership by citing rumors that Black-led denominations “are cultivating in their people a sense of being wronged.” Like Gary’s Stewart Settlement House (on which he served as a board member), Seaman’s intentions seem two-fold: to implement social control in a diversifying city and to provide humanitarian aid.
Lewis noted of Seaman and other white leaders:
Although their perception of the cause was often flawed and their service of it often mixed with other motives, their actions revealed their conviction that the church should be a prominent force for good, even in the modern city.
While Seaman held a paternalistic view of the Black community, his efforts to combat racism drew the ire of the Ku Klux Klan. Seaman opposed showing the film Birth of a Nation, which reinforced stereotypes about the supposed inherent savagery of African Americans. He also tried unsuccessfully to convince the Methodist Hospital to admit Black patients.
The ambitious pastor quickly got to work, meeting with leaders of the Centenary of Methodist Missions and the U.S. Steel Corporation to drum up support for a downtown church. His lobbying paid off, and both groups donated approximately $350,000 to build an “oasis” that would be open seven days a week. In October 1926, Seaman’s vision was realized when City Church—as the First Methodist Episcopal’s downtown church came to be called—opened to much fanfare. Reporters marveled at the ornate cathedral, which boasted a social-educational unit, gymnasium, rooftop garden, tennis court, and community hall equipped with a “moving picture outfit” and modern stage. It also contained retail stores and a commercial cafeteria, which generated income for church expenses. This was necessary, Seaman said, because the downtown church ministered to groups having fewer resources with which to support the sanctuary.
Although Sunny Jim sought inclusivity, records indicate that the congregation remained white until the church’s closing. Conspicuously absent from photographs of pews lined with worshippers—hair bobbed and suits pressed—were members of color. While Black residents did not bow their heads in prayer beside white congregants (who likely did not welcome their presence), they did utilize City Church’s amenities. According to Lewis, Seaman was fairly successful in promoting the community hall “‘as a religiously neutral ground for artistic and civic events,’” although “there was little mixing of cultures.”
City Church tried, to some degree, to navigate race relations in a polarized city, opening its doors to civic, social, and spiritual gatherings. In 1927, the church hosted a race relations service, in which members and pastors of the African American churches Trinity M. E. and First Baptist shared in services. Reverend Seaman delivered the principal address, stating “We shall make no progress toward race union . . . until we view each other as God views us, children of the same Father and brothers all.” After toiling in factories, Swedes, Mexicans, and Croatians gathered at City Church to study, worship, and play. Romanian children, “Americanized” at schools like Froebel, congregated in the church gym to socialize and shoot hoops.
When Reverend Seaman left in 1929 under unclear circumstances, the church turned inward and ministered less frequently to Gary’s immigrant and Black populations, especially during the demanding years of the Great Depression and World War II. Unfortunately, Gary’s Negro YMCA closed and African Americans were the first to be let go at the mills, making churches and relief organizations more crucial than ever. Resentment built among Gary residents as they competed for government support, resulting in the voluntary and forced repatriation of Mexican workers on relief rolls. The church did offer programs where weary (likely white) residents could momentarily forget their troubles, hosting Gary Civic Theater plays and an opera by a renowned singer.
Church records from the early Atomic Era denote renewed interest in ministering to the church’s diverse neighbors. The degree to which the church took action is unclear, although advertisements for Race Relations Sunday indicate some walking of the talk.* City Church photographs document an immunization clinic, which served both African American and white children, as well as cooking classes for Spanish girls. It is clear, however, that, despite the efforts of some City Church pastors, members of the white congregation largely did not support, and sometimes opposed, integrated Sunday mornings. With Steel City’s influx of African Americans and immigrants in the 1950s and 1960s, Gary’s white population fled to the suburbs, depleting the urban core of tax revenue. City Church members belonged to this exodus. Tellingly, on a 1964 survey, Rev. Allen D. Byrne appears to have checked, only to erase, a box noting that the church ministered to racial groups.
This changed temporarily with the leadership of Reverend S. Walton Cole, who perhaps came closest to fulfilling Reverend Seaman’s mission, with his 1964 appointment. Cole wrote frequently in City Church’s newsletter, Tower Talk, about confronting one’s personal prejudices and the role of the church in integrating minority groups. Unafraid to confront social issues, Cole argued at a Methodist Federation meeting, “We are not socialists and communists when we talk about moral problems in our nation. Wouldn’t Jesus talk about poverty if he walked among us today?” Under Cole’s pastorship, the church hired Aurora Del Pozo to work with Gary’s Spanish-speaking population. Such efforts, Tower Talk reported, went a long way in understanding their Hispanic neighbors, noting “we were introduced to the viewpoints and attitudes held by these Spanish speaking people that were a surprise to most of us.”
Cole, addressing the trend of church members to “shut their ears and eyes” and move out of the city, noted in 1966:
Hate is the strongest of all. We hate the Negroes, the Puerto Ricans, the Mexicans, the Irish, the English, the Germans, the French. We hate the Jews, the Catholics, the Baptists, the Methodists, the Presbyterians, the Republicans, the Democrats, the Socialists. We hate everybody, including ourselves. This is the way of the world, the secular world.
He countered that the Christian way centered around demonstrating love and hope for all. The NAACP awarded Reverend Cole the first Roy Wilkins Award for his work in civil rights. During his pastorship, the church worked to redevelop the downtown area, striving to “maintain a peaceful and developing community by improving race relations.” But in 1968, fugitive James Earl Ray assassinated Martin Luther King Jr. in Memphis, setting off a string of riots across the country. Riots in Gary’s Midtown section, formerly the Patch, that summer resulted in gunfire, looting, and burning. Gary’s first African American mayor, Richard Hatcher, contended “‘slum conditions in the city and inequalities in education and employment have fostered the tenseness'” that led to the riots.
Some of Gary’s African American residents got involved in the Black Power Movement, which arose after decades of educational, political, and housing discrimination. The movement espoused racial pride, social equality, and political representation through artistic expression and social (and sometimes violent) protest. In 1972, Gary hosted the National Black Political Convention, which drew over 10,000 Americans of color. State delegates and attendees—comprised of Black Panthers, Socialists, Democrats, Republicans, and Nationalists—hoped to craft a cohesive political strategy to advance Black civil rights. This event highlighted Gary’s polarization along racial lines, which became so profound that City Church reported in the 1970s: “Evening sessions are difficult without police protection. Most folks are afraid to come downtown.” This schism was perhaps inevitable, given that city planners constructed Gary around the color of residents’ skin. As City Church membership sharply declined, church leaders realized they needed to build meaningful relationships with the local community.
It became apparent they had waited too long. The 1973 Pastor’s Report to the Administrative Board noted:
Most residents in the immediate area will already have found a convenient church where they are welcome . . . Furthermore Blacks are not likely to come to a church which they ‘feel’ has excluded them for several years. The neighborhood may have continued to change from one social class group to another, so that there is an almost unbridgeable gap between the white congregation and the persons living in the community.
A survey of urban church leaders cautioned in 1966 that, regardless of resources or mission, a white church in a Black neighborhood could only carry on for so long, that the “ultimate end is the same. THE CHURCH DIES!” City Church leaders considered merging with a local Black church, but when community interviews revealed that minority groups did not trust the church, leaders decided to close in 1975. Die it DID.
After decades of decomposition, philanthropic organizations and city leaders have turned their attention to redeveloping the building. After all, as Professor Dotson warns, Gary is in jeopardy of “imminent collapse under the weight of its own history.” As of now, the most likely outcome involves stabilizing the building and converting it into a ruins garden. A supporter of the ruins concept, the Knight Foundation’s Lilly Weinberg, seemingly invokes Reverend Seaman with her statement that “Creating spaces for Gary’s residents to meet and connect across backgrounds and income levels is essential to community building.” Some in Gary oppose this plan, arguing that if the city receives funding it should be allocated to existing African American churches that need structural support, rather than one that ultimately abandoned the Black community.
Regardless of City Church’s fate, Ball State Professor Olon Dotson argues it is crucial that Gary’s legacy of segregation is incorporated into its story “for the sake of the young children, attending 21st Century Charter School at Gary, who look out their classroom windows, or wait for their parents every day, in front of the abandoned ruins of a church, in the midst of abandoned Fourth World space.” If the ruins embody Gary’s past, what is done with them now could signify Steel City’s future.
For a list of sources used and historical marker text for City Church, click here.
* Without the digitization of Gary newspapers, and given the lack of documentation of Gary’s Black residents during the period, it is difficult to give voice to those City Church attempted to reach. Pastor Floyd Blake noted in 1973 that the church conducted over 100 interviews with Black, white, and Spanish-speaking residents regarding their perception of City Church. Although we have been unable to uncover them, they could provide great insight. Please contact email@example.com if you are aware of their location.
“Acute Labor Shortage Perils Midwest Farms”
–(Valparaiso) Vidette-Messenger of Porter County
“No Labor Shortage”
– Indianapolis Recorder
So which was it? An acute labor shortage endangering the farms of the corn-belt, and in turn, the country’s war production? Or no labor shortage at all? The answer is surprising and continues to impact policy today.
The Agricultural Front
Just before U. S. entry into the Second World War, large farming and agricultural processing companies—which had become dependent on the cheap labor that was abundant during the Great Depression—warned of an impending labor shortage. They claimed that there was not a sufficient number of workers available to fill the positions left behind by the men enlisting in the armed forces, or by the men and women who left the farm for war-related industrial work.
At the same time, with the introduction of President Roosevelt’s Lend-Lease program (which lent food and supplies to Great Britain and its allies), the U.S. needed to produce more agricultural products than ever before. The battle on the agricultural front would need a larger number of agrarian soldiers. Indiana newspapers worried over how Hoosier farmers would meet production goals as their sons left for the “army camps” and “defense industrial plants.” The Muncie Post-Democrat continued:
Now that the sons are gone, the farm operators find it impossible to compete with industrial labor wages for help. This may result in many acres uncultivated this season . . . This condition rates as serious when food production is important in the defense program.
In spring 1942, Purdue University reported that “anticipated shortages of farm labor, resulting from enlistments in the armed forces and attractive industrial wages, have not developed.” However, as the year went on, Indiana newspapers became more frantic in tone. They reported that farmers were selling acreage and animals because they could not find farm hands to help with the work. The weekly industry newspaper, the Prairie Farmer, surveyed eighty-one midwestern counties and reported that three-fourths of them “were found to be suffering from a shortage of farm hands.”
Indiana Canneries and the “Labor Shortage”
By the fall of 1942, large Indiana agricultural businesses joined the national cry of “labor shortage.” Indiana newspapers gave extensive coverage to the professed concerns of the tomato canning industry. The Muncie Evening Press ran the headline: “Labor Shortage Hits Tomatoes: Cannery Shutdowns and Crop Losses Threaten.”
The article reported that the “acute war-born labor shortage” would close a dozen canneries and that “picked tomatoes awaiting processing [were] lying idle and periled by rotting.” State government officials and the Indiana Farm Bureau spoke on behalf of the canneries and appealed to local men and women to go to work at the plants. Hasil E. Schenck, president of the Indiana Farm Bureau, stated:
Reduced farm production will be no reflection on the patriotism of farmers, for without manpower they can not produce food and fiber any better than industry can produce ships, tanks and guns without steel.
Indiana Governor Henry Schricker issued “an appeal to housewives and all others available to apply for work at the nearest cannery.” The Evening Press reported that the canneries were already employing WPA workers and were calling for women “peelers” and for school children “packers” to volunteer their services.
Yes, volunteer. These industry giants, many of which had profitable government contracts, were asking for women and children to freely donate their labor. A few days after the call for volunteers went out, the Elwood Call-Leader praised the response of school staff and students in the Madison County area while rebuking the “apathetic and uncooperative” attitudes of local women—women who likely had increased workloads at home because of the war effort. According to the article, employment service and local government officials complained that “despite all appeals that have been made throughout the past week, many . . . women still do not realize the seriousness of the situation and are not willing to work, even [though] they are needed only to get through the brief critical period the industry is now facing.”
The Call-Leader added that army officials were “alarmed at the situation” and were “making a check to see whether the army will be able to get the tomatoes it has ordered.” The canneries’ message was clear. Without cheap or free labor, American boys on the front would go without food. Like corporations across the country, Indiana businesses began to demand that the government supply them with an inexpensive source of labor.
African American Newspapers and the “Labor Shortage”
And yet, African American newspapers saw “no labor shortage.” The Indianapolis Recorder reported that the companies need only “hire negroes.” The Recorder continued:
Nobody has yet proved there is a labor shortage in this country. . . There is no need to work a few workers to death while others walk the streets hungry, seeking work. There are still enough qualified workers in this country to allow employers to continue their discrimination against workers because of the race, religion, and nationality of such workers.
Indiana’s African American newspapers reported that thousands of African Americans were looking for work and were willing to travel great distances to take jobs, but employers didn’t want them. For example, in November 1942, the Indianapolis Recorder and the Evansville Argus reprinted a report from Graphic Magazine that 3,000 African American men left “the Deep South” at the request of California farmers for help saving the harvest. When they arrived “there were no jobs for them!”
The Labor Shortage Myth
The observations of the African American newspapers were correct. There was no labor shortage that the federal government could not meet with domestic workers. However, the myth of the labor shortage had its own power.
Over the previous decade, the Great Depression created a large surplus of workers seeking employment. In 1941, the Department of Agriculture and the Department of Labor reported that farmers had “come to consider this over supply as the normal supply, and to consider any reduction in the surplus supply as a shortage.” These departments concluded, however, that all of the shortages, perceived or real, could be met by moving surplus domestic workers into the areas of need. The catch, however, was that balancing the supply of available workers with the demand for their labor required employers to pay a fair wage for agricultural labor.
A remarkably organized joint effort of the Farm Security Administration (FSA) and the U. S. Employment Service (USES) stood ready to deal with any real “pockets of labor scarcity.” The agencies expanded the New Deal migratory camp program, setting up permanent and mobile camps around the country to move American workers to harvests wherever they were needed. However, because employers had to pay more reasonable wages, they still complained of a shortage. In fact, they cited higher wages as evidence of a shortage.
Statistics from the Indiana division of the U.S. Employment Service show that Indiana’s available labor pool reflected the national situation. J. Bradley Haight, the Director of the USES in Indiana, estimated in 1942 that there were “100,000 individuals in the state seeking employment.” He stated, “The job insurance division issued checks to 40,000 persons. This represents a reservoir of labor which is to be tapped.” However, the large growers, dependent on cheap labor, continued to cry shortage even as they were provided with workers by the FSA and USES—workers that they didn’t want to employ because of racial prejudice or unwillingness to pay a fair wage.
So these wealthy, powerful, and organized growers and processors of agricultural commodities demanded that the federal government respond to their manufactured labor shortage by importing foreign workers. The government quickly gave in to their demands. History professor Cindy Hahamovitch, writing for the Center for Immigration Studies, summarized the government’s response to the labor myth:
The officials who created the guestworker program never believed there was a national labor shortage in agriculture. . . They created the importation program, not because it was necessary, but because it was politically expedient to do so, because the nation’s most powerful growers were demanding the preservation of the cheap, plentiful, and complacent labor force to which they had become accustomed over the previous 20 years of agricultural depression.
The federal government complied because the myth was persuasive. A false labor shortage would have the same effect on agricultural production as a real one. No amount of statistics or economic reports could allay the fears of farmers wondering whether sufficient help would be available at harvest time. Farmers anticipating a lack of aid, and picturing their produce rotting in the fields, would plant less, and the country wouldn’t meet its production goals, just as if there were a real labor shortage.
Despite its best efforts to meet real pockets of labor shortage with domestic workers, and despite its distribution of reports on the available domestic labor pool, the federal government still needed to allay small farmers’ growing fear of a massive shortage. By 1942, the Roosevelt administration was cornered into responding to the shortage myth by importing foreign workers. As Congress tore apart the Farm Security Administration and its program of moving workers to areas of need, U. S. Secretary of Agriculture Claude R. Wickard left for Mexico to negotiate a deal that would affect agricultural and immigration policy for decades.
Hoosier Dirt Farmer as U. S. Secretary of Agriculture
Claude R. Wickard was a Hoosier dirt farmer through and through. He was born in 1893 and raised in Carroll County on his family’s farm. His father, a staunch Democrat named for Andrew Jackson, was a strict disciplinarian who raised his son with every expectation that the farm was his present, future, and legacy. The younger Wickard, however, grew ambitious. He saw that the farm could be more productive and efficient with the application of modern methods. Against his father’s wishes, he enrolled in classes at Purdue, where he learned about scientific farming and got hands-on experience with sanitary hog care and breeding. He soon vastly improved the farm and received recognition from farming organizations as a leader in modern farming methods. His influence in local Farm Bureau organizations grew in the 1920s and he advanced to several leadership positions where he took on the challenges of his fellow farmers.
Beginning at Purdue and continuing throughout his career, Wickard remained focused on rural social justice and “the farm problem.” To Wickard, social justice for rural folks meant that farmers should have equal buying power as urban workers. The inextricably related farm problem was what economists called a parity problem, that is, the prices farmers received for their products was not in balance with their expenses. Wickard, like many leaders of the New Deal, spent his early career trying to figure out how the state and federal government could achieve parity for farmers by solving the problem of overproduction.
By 1930, several factors made Wickard a prime political candidate. First and foremost, while most Indiana farmers were Republicans, Wickard was born into a staunchly Democratic family and remained loyal to the party despite the fact that the national party had not prioritized rural concerns through the 1920s. Thus, Wickard was one of the few farmers with influence in the Farm Bureau and other organizations who was also a Democrat. Second, Wickard’s embrace of scientific farming ideas made him open to production control as a method of achieving parity for farmers. Most farmers, who were already barely making ends meet while operating their farms at full production, could not imagine cutting down on output. Wickard, however, could see that farmers needed help from the federal government to make the drastic, nationwide economic shift required to give them the same standard of living as the urban people they fed. This way of thinking aligned with the ideas of the men who would soon take over leadership of the nation. Wickard was poised to join them.
His political career began modestly. A group of county organizers convinced him to run for a state senate seat and he reluctantly agreed. Wickard stated in an interview:
I didn’t like politics . . . [but] like all other things, sometimes you’ve got to make your contributions to your community and to the Democratic Party . . . I had a feeling of responsibility toward my fellow citizen.
Wickard was elected state senator on November 8, 1932, as Democrats swept elections across the country and Franklin Delano Roosevelt won the U. S. presidency.
In May 1933, the Agricultural Adjustment Act took effect and farmers saw that the new administration recognized their plight. The Agricultural Adjustment Administration (AAA or Triple-A), a division of the Department of Agriculture, was tasked with creating parity through taxing companies that used agricultural produce and decreasing production. Wickard was quickly elected chairman of the Corn-Hog Section of the Indiana Triple-A. He soon became the Assistant to the Chief of the National Corn-Hog Division, and in July 1933 Wickard went to Washington.
When he arrived in Washington as second in command of the Corn-Hog Section of the AAA, he was overwhelmed by the job. In his own words, Wickard was “just a farmer” and had to work to understand the complex economic issues the administration faced. And he got frustrated with the pace of bureaucracy. However, he was likeable, earnest, easy to work with, and his ideas about parity aligned with those of Henry Wallace, the Secretary of Agriculture. Most important to Wickard’s rise, however, was that he was known as a loyal Democrat and commanded the respect of midwestern farmers.
When the Department of Agriculture reorganized by region, as opposed to commodity, in 1936, Wickard became Assistant Director of the North Central Division. By this point, Wickard was on Wallace’s radar and the secretary saw potential in the Hoosier dirt farmer. Wallace later noted that Wickard was rare in a department of apolitical technocrats and subject experts in that he was actually a Democrat. Wallace stated: “He was about the only one of the whole crowd in agriculture that had any claim to being a democratic politico.” In the fall of 1936, Wallace brought Wickard with him as he stumped for FDR throughout the Midwest. When FDR won reelection, Wickard continued to make himself useful to Wallace at the USDA and was quite successful and well-liked in his division.
In January 1940, Wallace recommended Wickard to FDR for the position of Undersecretary of Agriculture. After making sure he was not aligned with Roosevelt’s Hoosier adversary Paul McNutt, the president agreed. Wickard was sworn in February 29, 1940. He served less than six months before Wallace resigned as Secretary of Agriculture to run as FDR’s vice president. Wallace recommended Wickard to succeed him, and Wickard was sworn in as the U. S. Secretary of Agriculture in September 1940.
Wickard, The Labor Issue, and The Bracero Program
With much of Europe dependent on U.S. agricultural production, the Secretary of Agriculture’s job was even more important than in peacetime. Meeting war production goals was paramount. Wickard faced many challenges, among them the increasing claims of a labor shortage. In December 1941, Wickard testified before the U.S. House of Representatives Agriculture Committee:
The farm labor shortage is not as serious as generally believed. Farm production has suffered, of course, from the loss of farm hands who have been drafted or got higher pay in defense plants. But the situation is not alarming.
While he downplayed the labor shortage claims, he did make it clear that farmers would “have to pay more for their help” than they had before the war stimulated the economy and reduced the labor surplus. As the earlier examination of newspaper articles has shown, this was not an option many corporations were willing to consider.
Less than a year later, Wickard had changed his approach to the issue. The (Richmond) Palladium-Item reported:
Secretary of Agriculture Wickard warned that the United States would face a food shortage unless it quickly solves the problem of manning the farms. He estimated the armed forces and factories may drain off approximately 2,000,000 farm workers by the end of 1942 in addition to those who have already gone.
By this point, it seemed like Wickard was treating the labor shortage claims as a legitimate threat to production goals. However, this same Palladium article still noted that “the most mentioned causes” of the shortage “were high wages.” Even at the peak of industry claims of a labor shortage, the crux of the issue was still that companies would “have to pay more for their help,” as Wickard told the House in 1941.
While Wickard described his understanding of complex economic issues as limited and his progress in grasping what his statistician colleagues reported as slow and labored, he deeply understood and cared about agricultural issues and maintained a strong moral decision-making process throughout his career. Like most government officials with access to labor statistics, Wickard would have known that, while there was no labor shortage, a fictional labor shortage was just as dangerous to the war effort. It is, however, possible that his tenuous grasp of complex economic issues meant that he thought the shortage was real. (His biographer Dean Albertson implies the latter.) Wickard’s career record shows that he would not have acted to address the labor shortage had he not believed it was the best thing for the American people. There are many instances during his career when a different vote or decision would have furthered his political career, but he did what he believed to be the right thing for American farmers.
Tasked with addressing the issue, Wickard left for the Second Inter-American Conference on Agriculture in Mexico City early in July 1942, to make a deal that would import Mexican workers and ensure the United States met its production goals. Several agencies were involved in creating a plan to import Mexican agricultural workers, but it was Wickard who was responsible for negotiating an agreement between the interests of the Mexican government, the United States government, American farmers, labor organizations, and large farming and processing conglomerates.
Mexican Secretary of Foreign Affairs Ezequiel Padilla Peñaloza was reluctant to agree because of U.S. exploitation of and discrimination against Mexican workers in the past. Padilla insisted that any agreement include a number of guarantees for the rights of braceros. He demanded that Mexican workers receive the same guarantees of wages and working and living conditions as American workers. Wickard agreed to a minimum wage and to work and living standards. However, there were no such guarantees for American workers. Thus, as labor organizations were quick to point out, these workers were guaranteed, at least in theory, more protection by the U. S. government than domestic farm laborers. After ten days of negotiations, Wickard formalized the agreement August 4, 1942. In less than a year’s time, Indiana farms were benefiting from foreign labor. Hoosier response to these guest workers was mixed.
In Part Two of this post, we will look at the stories of these farmers and foreign workers as told through Indiana newspapers.
Albertson, Dean. Roosevelt’s Farmer: Claude R. Wickard in the New Deal. New York: Columbia University Press, 1961.
Bracero History Archive. Roy Rosenzweig Center for History and New Media, George Mason University, Smithsonian National Museum of American History, Brown University, and the Institute of Oral History at the University of Texas El Paso, http://braceroarchive.org/
Collingham, Lizzie. The Taste of War: World War II and the Battle for Food. New York: Penguin Books, 2011.
Craig, Richard B. The Bracero Program: Interest Groups and Foreign Policy. Austin: University of Texas Press, 1971.
Hahamovitch, Cindy. “The Politics of Labor Scarcity: Expediency and the Birth of the Agricultural ‘Guestworkers’ Program.” Report for the Center for Immigration Studies, December 1, 1999, accessed at https://cis.org/Report/Politics-Labor-Scarcity.
Hurt, R. Douglas. American Agriculture: A Brief History. Ames, IA: Iowa State University Press, 1994.
On April 19, 2018, over a chain link fence Hammond resident and former EPA attorney David Dabertin voiced his concerns about the former site of Federated Metals to Governor Eric Holcomb. East Chicago environmental activist Thomas Frank told Mother Jones weeks after the visit “’We’d known for quite some time that there was some contamination there,’” but the Indiana Department of Environmental Management allowed plants at the site to keep polluting. For decades, industry was the region’s bread and butter and often the corporation’s and community’s financial well-being was prioritized over health or environmental concerns. Frank noted that older generations viewed the plants with “a sense of pride as it provided jobs and stability” and do not “‘want to look at what they’re so proud of and see that it’s harming them.'”
The EPA’s 2018 investigation of Hammond’s soil lead levels, a response to the “national criticism of its slow reaction to polluted water in Flint, Mich., and lead-contaminated housing in East Chicago” (Chicago Tribune), inspired us to take a look at Federated Metals’ origins. In 1937, the Chicago-based company announced it would establish a plant in the Whiting-Hammond area. By 1939, hundreds of workers produced non-ferrous metals used in the automobile, housing, and oil drilling industries. Almost immediately after production began, the community voiced complaints about the effects on their health.
In the spring, a citizens committee decried the fumes and smoke being expelled by the new smelting and refining plant—so noxious that students at St. Adalbert Catholic parochial school had to miss class due to illness—and pressed city officials to intervene. That year, resident Frank Rydzewski wrote to the Munster Times that Federated Metals foisted upon the Hammond community a “generous sample of sickening odors which emit from its midget—partially concealed smoke stacks and which have already showed its ill-effects on pupils of a school situated not a block distant.”
Rydzewski’s next sentiment encompassed the conflicting priorities related to Federated Metals from the 1930s until its closing in 1983: “Certainly, the value of health impairment to residents in the vicinity far surpasses any questionable tax-able asset this company can create.” Although he bemoaned the fumes plaguing the city’s residents, he also noted that the plant could “boast of its colored personnel; its predominating out-of-state and outside employe[e]s; its labor policies.” Since the 1930s, Federated Metals has served as both the bane and pride of Hammond and Whiting residents. The plant experienced labor strikes, symbolized livelihood and industrial progress, helped the Allies win World War II, and was the site of accidental loss of life.
In April, the Munster Times reported that hundreds of residents in the area “revolted” against the plant’s operations at the city council meeting. They charged that “harmful gas discharges from the plant damaged roofs of residences, caused coughing and sneezing that punctuated school studies and prayers in the Whiting church and school and made it virtually impossible to open doors or windows of homes in the neighborhood.”
The paper noted that Mrs. Feliz Niziolkeiwicz wept as she addressed plant manager Max Robbins. She told him “You can live in my home for free rent if you think you can stand the smoke nuisance. The home I built for $10,000 is almost wasted because of the acid from the plant.” Her concerns were shared by Hammond Mayor Frank R. Martin, the city council, the city board of public works and safety, and the health department, whose secretary ordered Federated Metals one month prior to “abate the nuisance” within sixty days. In October, the company was tried in a Hammond city court hearing and found not guilty of criminal liability for the fumes, despite city health inspector Robert Prior testifying that Federated Metals “continued to operate and discharge gasses on the Whiting-Robertsdale community after repeated warnings to abate the alleged nuisance.”
By November, Federated Metals had constructed a $50,000 smoke stack much taller than the previous, offending one, so as to diffuse smoke farther above the Robertsdale neighborhood. In March 1940, Prior stated that citizen protests had ceased with the improvement. Following this remediation, the Munster Times published a smattering of articles throughout the 1940s about health complaints related to plant output. In October 1941, the Times published a short, but eyebrow-raising article regarding allegations that Federated Metals tried to pay Whiting residents in the area as a settlement for property damaged by fumes. Councilman Stanley Shebish shouted “When the people of this community suffer bad health and many can’t go to sleep at night because of this smoke and particles of waste, it is time to stop an underhanded thing like this!” Health officials maintained that the sulphur dioxide fumes were “not a menace to health,” but may be “detrimental to flowers and shrubs.” Whiting’s St. Adalbert’s Church filed a similar complaint about the health of students, teachers, and parishioners in 1944.
While citizens lamented pollutants, the plant churned out “vital war materials” for World War II operations. (The Air Force also awarded the company contracts in the 1950s.) In accordance with the national post-war trend, 1946 ushered in labor strikes at the Hammond-Whiting plant. The Times reported that in January CIO United Steelworkers of America closed down the “Calumet Region’s steel and metal plants,” like Inland Steel Co., Pullman-Stan. Car & Mfg. Co., and Federated Metals. On February 17, Federated Metals agreed to increase the wages of its 350 employees to $32 per month. Labor strikes, such as that which “deprived workers of a living and dampened Calumet Region business,” took place at Federated Metals until at least 1978. This last strike lasted nearly five months and required the service of a federal mediator.
On January 5, 1949, one of the grimmest events in the plant’s history took place at the receiving department. While unloading a shipment from National Lead Co., Federated workers were suddenly overcome by arsine gas seeping from rain-sodden drums. The gas, which can also cause paralysis, memory loss, and kidney damage, took the lives of four men and hospitalized eleven. The Times noted that “only the caprice of weather saved scores of Hammond and Whiting residents” from dying while the open freight cars transported the drums from Granite City, Illinois to the Federated Metals plant. The cities’ residents narrowly avoided catastrophe, since rain causes metal dross to generate deadly arsine gas.
Dr. Richard H. Callahan, East Chicago deputy coroner, probed the deaths and placed the blame primarily on the state board of health. He lamented: “It is inconceivable that the chemists in the state board did not know that dross used by Federated Metals would poison workmen with arsine. Federated Metals was in the possession of a dangerous toy.” He noted that safeguards against arsenic poisoning had existed for thirty years, ranging from gas masks to the use of caged birds, who fell ill at lower concentrations of gas than humans. The Times noted that Dr. Callahan’s investigation was expected to “foster national and international safeguards against arsine poisoning.”
A.J. Kott wrote in the paper that Federated workers’ lives could have been saved had British Anti-Lewisite (BAL) been on hand, “a miracle drug, discovered during World War I in University of Chicago laboratories.” Instead, the drug had to be rushed to St. Catherine Hospital to treat affected workers. While Dr. Callahan identified the state board as the responsible party, questions regarding Federated’s culpability lingered, such as whether the company violated the state act requiring employees to wear gas masks and whether it should have had BAL on hand. Following the accident, the company promised to strengthen safety procedures, like employing gas-detecting devices when material arrived.
Nearly twenty years later, Federated Metals found itself in the cross-hairs of the environmental movement, which had produced the first Earth Day and the Environmental Protection Agency. Learn about the U.S. Justice Department’s suit against Federated and the politics of pollution in Part II.
With a staff of over 155 and over 170 volunteers, today’s Fort Wayne Visiting Nurse is a far cry from its humble beginnings. In 1888, a group of Fort Wayne women organized the Ladies Relief Union with a mission to “help the sick poor of Ft Wayne.” Calling themselves the Visiting Nurse Committee, they soon discovered a link between poverty and disease. Dr. Jessie Calvin, a Fort Wayne sanitation and indoor plumbing pioneer, encouraged women’s church groups to raise money for a qualified nurse who could meet community needs.
Prior to the 1860s, nursing was “typically considered a domestic responsibility provided in the home by family members.” Nursing as a profession evolved after the Civil War, when women gained experience caring for wounded soldiers. Historian Clifton J. Philips noted that in the post-war period, women’s religious orders were “especially active in establishing hospitals in an attempt to extend to the general public, and the poor in particular, some of the services formerly rendered to sick and wounded soldiers.” As hospitals materialized, so too did nursing groups and training programs. By 1897, The Fort Wayne Journal-Gazette noted that the “modern nurse” wished to “be given her proper position as a skilled assistant in serious illness.”
The paper added that:
The daily or visiting nurse is a recent development of modern nursing and meets the needs of many people who find it inconvenient to have a nurse stopping in the house and requiring more or less attention from servants perhaps already overtaxed. The visiting nurse comes in for an hour or so every day to perform those services for which her skill is needed.
On March 1, 1900, an organizational meeting was held in Fort Wayne and the Visiting Nurse League became a reality. At a salary of $10.00 a week, Josephine Shatzer was hired as the League’s first nurse. During harsh weather she took a trolley, but normally she could be seen making her rounds on her bicycle. Regardless of transportation, it was clear that she wasted no time. On her first day she saw six patients. Next, she established a baby milk station at First Presbyterian Church, instructing new moms how to prepare formula.
She also volunteered at free immunization clinics, bathed patients, delivered meals, changed bedding, dressed wounds, cared for the elderly and ill in their homes and endeared herself to all she served. At the end of her first year she had made hundreds of calls, utilizing supplies donated by local churches, relief societies, and drug stores. For those patients who could pay, the charge of a one-hour visit was fifteen cents.
The Fort Wayne Daily News praised the program in 1900, noting that the league “found favor with all classes of people” and that visits to the “sick poor” conveyed not only “help and benefit, but hope and good cheer to every member of the family.” In 1913, the Public Health Nursing Association appointed a visiting nurse to serve African American patients at the Flanner Guild. Dr. Calvin continued to guide the League, through the years of World War I and the 1918 flu epidemic, which took the lives of 3,266 Hoosiers.
By 1919, a nurse named Dixon had reached a salary of $100.00 a month. League expenditures in 1920 stood at $1,300 annually and in 1922 the Community Chest offered its support. Insurance companies began to hire nurses from local visiting nurse groups to assess policyholders who were ill, and paid the League seventy-five cents a visit. By 1923, the League reorganized and served in an advisory and instructional health-teaching capacity. The Great Depression wrought poor health conditions: eight nurses made over 29,000 visits to 4,477 patients. One made 3,255 orthopedic visits to 104 crippled children, many of whom were victims of polio. Sisters of Saint Joseph Hospital provided free hospitalization in the pediatric ward for those who could not pay. In the 1940s, World War II increased demands for nursing schools to produce eligible Nurse Corps candidates.
In 1954, the agency changed its name to Visiting Nurse Service, Inc. By 1956, it undertook a program that cared for stroke victims in their homes, which served to collect data on medication, exercise, loss of function, and the need for expanded therapy service. By 1962, the agency’s director Eva Rosser introduced a State Board of Health-funded program that focused on treating the chronically ill in their homes, instead of a hospital or nursing home. Later the group provided care as home health aides, and the agency became certified for Medicare in July 1966.
With certification came additional paperwork. According to History of Visiting Nurse, by 1983, “a record number of visits for a single month occurred (2,200), due to earlier hospital releases and greater technology used in the home.” During this period, the glucometer was used for the first time, registered nurses were trained to perform phlebotomy services, and around-the-clock care was made available for all patients. In 1984, the Medicare Hospice Benefit became available and Visiting Nurse Service merged with Hospice of Fort Wayne, Parkview, and Lutheran Hospices. The agency introduced computerized billing and, by the end of the decade, services for the frail and disabled. By 1990, Hospice service visits totaled 38,177, a forty percent increase over previous months.
History of Visiting Nurse noted that in 1990 the agency moved to the “Moellering Unit of the nearly vacant former Lutheran Hospital.” In 1995, the 1984 merger dissolved and Visiting Nurse Service and Hospice became a free-standing agency. By February 2001, a new Hospice Home facility opened, and in 2006 a building expansion added patient rooms. In 2011, nurse practitioners joined the staff and the “Watchful Passage” program began, in which trained volunteers remained at a patient’s bedside during the last few days of life. In 2018, Visiting Nurse staff and volunteers can proudly stand tall celebrating 130 years of community service.