Before I get back to my discussion with Jim, a bit of history will be helpful.
Many people to this day blame the suburban mall, at least in part, for the demise of America's downtowns. I have found throughout my adult life at cocktail and dinner parties that it is politically correct to be pro-downtown and anti-suburb. Inner cities are alive with culture and sophistication; suburban communities are vacuous and homogenized. West Side Story vs. Leave It to Beaver. The "21" Club vs. McDonald's. In fact, a popular myth holds that regional malls began to spring up in cornfields across America shortly after World War II, creating suburbia: an evil, all-powerful magnet that drew people, housing, and commercial development away from our nation's vulnerable inner cities.
That's a distorted view of history. First of all, we need to recognize that settlers from the beginning of our nation's history established unique patterns of urban development far different from the Old World cities from which they fled. An urban study in the 1890s, a time when commuter railroads had already kickstarted the growth of suburbs, found that the population density of American cities averaged twenty-two people per acre, compared to 157 for cities in Germany.
America's affinity for a suburban lifestyle is certainly not a new aspiration. In fact, archaeologists discovered the following inscription on a clay tablet dating back to 539 BC:
Our property seems to me the most beautiful in the world. It is so close to Babylon that we enjoy all the advantages of the city, and yet when we come home we are away from all the noise and dust.
Nevertheless, we tend to focus on the 1940s and 1950s as the dawn of America's suburbs. Of course, by the 1920s, immigration, advances in mechanized farming, and the flow of returning servicemen from World War I had created an unsustainable wave of urban population growth. Because our cities could not accommodate these numbers, growth migrated to the city's fringe. In response, retailers followed to conveniently serve customers in these new communities.
But as a general rule, the major department store companies (Dayton's in Minneapolis, Hudson's in Detroit, Wanamaker's in Philadelphia, Lazarus in Columbus, Marshall Field's in Chicago) stayed put in their protected downtown locations. For the first half of the twentieth century, established downtown department stores exercised significant control over land use and the political process. Politicians marched in store-sponsored holiday parades, and wings of hospitals were built with generous corporate contributions from retailers. These stores were among the largest employers and most visible local businesses. In 1953, Hudson's flagship store in downtown Detroit employed 12,000 people and maintained a delivery force of 500 drivers operating 300 trucks!
Because of this extraordinary influence and their ability to control the distribution of name brands, department stores were able to forestall competition from new retailers in their markets. America's cities were essentially one-store towns. Before landmark free-trade rulings in the 1960s and 1970s, if you wanted to purchase a Hathaway dress shirt in Detroit, you had to go to Hudson's, and only Hudson's. They, like powerful department stores in other cities, absolutely controlled distribution of the most popular apparel brands.
For decades, dominant department stores made it difficult, if not impossible, for upstart merchants such as Sears, Montgomery Ward, and other variety and specialty stores to secure competitive downtown locations.
Contrary to dinner-party myth, retail follows residential growth. People shop where they live, not necessarily where they work. As a result, upstart stores found locations called "hot spots" two, four, or six miles from downtown. These street-front properties were serviced primarily by