
Runaway menu prices

Restaurant prices are rising during the current inflationary period, but this is scarcely the first time. In fact it’s at least the fourth in little more than a century.

The first was during World War I, particularly after the war ended. In response, many restaurants teamed up for cooperative buying to keep costs under control to a degree. Drugstore soda fountains and other inexpensive eating places gained a thriving lunch business, while first-class restaurants raised prices as they whisked away frills including cloth tablecloths and napkins. The average restaurant operator’s motto became “simpler, cheaper, faster.” In New York, the venerable Mouquin’s hiked steak prices, charging $4.50 for a porterhouse steak with mushrooms that had historically been only $1.00.

The tough business climate combined with Prohibition caused the closure of droves of fancy restaurants such as Delmonico’s, which had been sliding for a while.

Complaints mounted. In 1920 Chicago’s city hall called restaurateurs on the carpet to explain their high charges, as the “Carry-Your-Lunch” movement grew. Boston put a U.S. District Attorney on the job to investigate prices at the city’s popular restaurants, including The Puritan and The Pilgrim.

Restaurant workers wanted raises, but it was a bad climate for strikes. Chicago’s 1,000-seat faux-luxe North American Restaurant sacked its striking waiters and installed a cafeteria line. Its advertising copy assured customers they didn’t need to tip because “There was no one there to tip.” Yet even as the North American’s advertising championed low prices, its ballyhooed bargain-priced “whole baby lobster” shrank to half a baby lobster. Did they think customers wouldn’t notice?

Although World War II also raised restaurant prices, that did not dampen patronage by war workers who enjoyed higher wages than ever. The president of the Society of Restaurateurs reported that from 1941 to 1944 New York City’s 19,000 restaurants went from serving 3 million to 8 million meals a day.

Soon the federal Office of Price Administration tried to control prices at restaurants across the country by freezing them at April 4-10, 1943, levels. Restaurateurs found ways to skirt regulations by reducing portions and substituting “blue plate” specials for what had previously been a regular meal including appetizer and dessert. In addition to reducing food costs, the move also saved a lot of dishwashing. Quality and sanitation went down as patrons mobbed restaurants left severely short-staffed by military recruitment and the lure of defense industry jobs. High prices continued through 1948, as did meat rationing. [Britling advertisement, 1942]

The “stagflation” of the 1970s was still to come, with inflation accompanying a stagnating economy – a situation similar to what some economists see looming today.

In 1970 consumer prices rose steadily, especially for food and restaurant meals. Soon New York maitre d’s became friendlier and even the city’s rich began to complain about costs. A wealthy woman who had never paid attention to prices and customarily ate out six or more times a week became angry at being charged over $4 for a melon wrapped with prosciutto at the Plaza’s Oak Room. A nationwide Gallup survey found that a substantial percentage of restaurant goers had cut back on evening dinners out.

A few years later famous NYC restaurants including the Colony and Le Pavillon failed. At the same time Chinese restaurants were prospering. Across the country, salad bars became popular, as did fast food outlets and restaurants specializing in dishes such as pizza, pasta, and tacos. Books recommending inexpensive restaurants did well. By 1974 three chains – McDonald’s, Colonel Sanders, and Burger King – were furnishing 13% of all food eaten outside the home nationwide. Five years later there were 66,000 franchise outlets in the U.S., nearly double the number in 1973. Elsewhere, doggie bags soared in popularity and some customers began packing away anything edible on the table. A few restaurants went so far as to remove tops from ketchup bottles to discourage patrons from carting off their ketchup. [above: 1970s fast food streetscape]

Printing houses could barely keep up reprinting menus as prices went up, up, up. And still the restaurant industry experienced heavy, some said “booming,” business – even though patrons were eating more hamburgers than steaks. Analysts thought it was due to the number of working wives, along with the fact that the hike in supermarket prices outdid restaurant price increases. The president of the National Restaurant Association reported that the country’s half million restaurants enjoyed rising sales throughout the mid-1970s, with 1975’s take 16% higher than the year before. Nonetheless the industry fought a proposed increase in the federal minimum wage from $2.30 to $3.00 an hour.

Despite continuing challenges, the economy began to improve in 1982, ushering in a period of gastronomic innovation in restaurants.

© Jan Whitaker, 2022


Filed under chain restaurants, food, restaurant prices

Halloween soup

Although the food-page story in a New Orleans newspaper said that this photo showed a jack-o’-lantern just carved by Chef Gunter Preuss for his children, I can’t help feeling a little bit spooked by it. Is it how he’s holding that knife, or his serious gaze?

Never mind, because the story was about the Harvest Cream Soup he makes out of the pumpkin’s insides. (See recipe below.)

At the time of this story, 1976, Gunter Preuss and his wife Evelyn were owner-operators of the Versailles Restaurant in New Orleans. Eight years later they acquired a part interest in Broussard’s, which they ran on their own from 1993 to 2013.

The Versailles received a glowing review in Richard Collin’s “Underground Gourmet” column in 1978 — although it was definitely not a restaurant for the price-conscious diner. Collin declared it “spectacular,” and “about as fine a restaurant as one can imagine.” He singled out many dishes as “platonic,” meaning they could not be more perfect. Among them were Bouillabaisse Marseillaise, Rack of Lamb Persillades, Ris de Veau Grenobloise, and Pears Cardinal. Chef Preuss was also featured on the show Great Chefs of New Orleans.

The recipe for pumpkin soup does not give amounts for every ingredient. It calls for a pumpkin’s interior, seeds removed, to be cubed and washed. Then sauté the cubes with onions and celery until glazed. Add flour and a half quart of chicken stock. Simmer the mixture over medium heat for 45 to 60 minutes, seasoning with salt, white pepper, powdered ginger, and white wine. Then strain the soup and add three egg yolks and a cup of light cream. Simmer on a low flame for five minutes, then pour into cups and serve with a whipped cream topping and a touch of ginger. Serves six.

Enjoy Halloween!


Filed under chefs, food, popular restaurants, proprietors & careers

Underground dining

In the 1960s, with the rumble of social change came a flood of interest in low-priced eating places with character and good food. In this spirit, New Yorkers Milton Glaser and Jerome Snyder began a newspaper and magazine column titled The Underground Gourmet, followed by a guide book in 1966 with the same name.

Their book led to a series. It’s been a little difficult to nail down how many different UGs there were, but here is my list, with initial publication dates: New York (1966), San Francisco (1969), Los Angeles (1970), Washington D.C. (1970), New Orleans (1971), Boston (1972), Honolulu (1972), and Long Island (1973).

Several factors probably contributed to the new mood regarding restaurants. The economy was bad and the public was looking for bargains. Youth culture was blossoming as the baby boomers grew older, many becoming college students. And increased travel abroad was widening the public’s interest in unfamiliar foods and ways of cooking.

The public’s attraction to low-priced independent restaurants could also be seen as a reaction against the ongoing growth of fast food chains, the greater use of frozen food in restaurants, and the blandness of much American food.

What was considered a low price for a meal during these years? The first New York edition specified in 1966, “Great meals . . . for less than $2 and as little as 50¢.” But the third edition (1977) explained that “unending inflation . . . has changed our perception of an inexpensive meal from one that cost $2.00 to one that costs $5.00 or $6.00.” For the second New Orleans edition, in 1973, author Richard Collin promised meals “for less than $3.75 and as little as 50¢.” This was still a lower price than featured by the others, which ran from $1.00 to $3.75 in San Francisco in 1969; $1.00 to $4.00 in D.C. in 1970; and “under $4.00” in Boston in 1972. Dining in Honolulu remained a bargain, with the 1972 UG promising meals as inexpensive as in the first New York edition (50¢ to $2).

Low price was not really what set the best of the recommended restaurants apart from others. Rather it was the quality of the food for the price. Although Mr. Steak in 1970 offered its most expensive meal – Steak & Lobster with salad, toast and potatoes – for $3.99, it didn’t make the cut, though strangely enough a few other chain restaurants did win recommendations, including a McDonald’s in D.C. and a Burger King in New Orleans.

What were some of the most remarkable finds in these books? Richard Collin [above cartoon] discovered a number of dishes that he gave his highest praise, naming them “platonic dishes,” as perfect as that dish could possibly be. His New Orleans list of platonic dishes included Oysters Bienville and Fried Chicken at Chez Helene’s soul food restaurant — which he rated one of the city’s finest restaurants; Creole Gumbo at Dooky Chase; and Fried Potato Poor Boys at the dirt-cheap Martin’s Poor Boy.

The number of restaurants that met the criteria varied from city to city. Boston and D.C. are notably slim books. New York is the fattest volume. San Francisco and New Orleans have about 2/3 the heft of New York. However, with his shorter entries, Richard Collin packed over 250 restaurants into the 1973 revised New Orleans edition, rating everywhere he ate, including some very bad places. Needless to say, this makes for interesting reading.

In his 1969 UG, R. B. Read made a case that the San Francisco area had a unique set of restaurants from all over the world, such as The Tortola, which preserved “hacienda cookery” from the days before gringos settled in the state. He also heaped praise on restaurants that were rare in the U.S. then — from Korea, the West Indies, and Afghanistan. The last of these, Khyber Pass, offered a “fabled” ashak, which he described as “aboriginal ravioli.” In a different category of unusual was The Trident in Sausalito, with jazz and a “debonairly eclectic” menu with a psychedelic design.

Because my copy is the third edition of the New York UG (The All New Underground Gourmet, 1977), I did not get the flavor of the earlier versions, which is a shame. Sadly, Jerome Snyder died during the publication of the book. That and rising prices may have cast a pall over this edition, which strikes me as less interesting than the New Orleans and San Francisco UGs. The original NYC book contained 101 of the best low-cost eating places (out of 16,000!). The third edition has about 130. The three given the highest ratings for “excellent food” were the Italian Caffe da Alfredo, and two Greek restaurants, Alexander the Great and Syntagma Square. Mamma Leone’s showed up in the book even though it met the price criterion only for its Buffet Italiano Luncheon where for $4.25 it spread out 25 feet of salami, mortadella, meatballs, celery, olives, green bean salad, and more.

The UG authors for Boston were Joseph P. and E. J. Kahn, Jr.; Washington D.C.’s were Judith and Milton Viorst. Both books show a lower level of enthusiasm. The Viorsts admitted that Washington “has not been known for its restaurants” and that of the 100 restaurants they visited, “a substantial proportion were so awful that we were unable to include them.” Father and son Kahn began by telling of a long-time resident of Boston and Cambridge who couldn’t imagine that anyone could recommend inexpensive restaurants since the area’s expensive restaurants were “bad enough.” The Kahns then admitted, “It is probably true that the Boston area does not loom large in the world of cuisine.”

Despite their reservations, the authors of both books managed to find some places they liked. The Viorsts singled out five D.C. restaurants as “great finds.” They were: the Calvert Café, an Arabic place “worthy of shahs and empresses”; Don Pedro, Mexican, with a marvelous mushroom appetizer called hongos; the Cuban El Caribe, featuring raw Peruvian-style fish cubes in lemon-onion sauce (95¢); Gaylord, an Indo-Pakistani restaurant with “delicious samosas”; and Warababa, a West-African place run by a Ghanaian couple with “exquisite” dishes such as peanut butter soup and Joloff rice flavored with bits of beef and vegetables.

The Kahns didn’t exactly rave about finds in Boston or Cambridge. But, after encountering “enough blandness while making our rounds to put us to sleep,” they enjoyed spicy lamb stew at Peasant Stock in Cambridge. They included the No Name restaurant on Fish Pier – no name, no sign, no lights, no decor — where a seafood chowder (50¢) served as the house special and was “so incredibly rich and so brimming with hunks of fresh fish that a cupful could be a meal in itself.” But the popular Jack and Marion’s in Brookline, known for its giant menu and huge portions, ranked merely as one of the area’s “better delicatessens.”

Alas, I couldn’t find the books from Honolulu, Los Angeles, or Long Island, but I saw a magazine piece that criticized the Los Angeles UG for its surprising inclusion of 25 restaurants in Palm Springs.

© Jan Whitaker, 2022


Filed under alternative restaurants, ethnic restaurants, restaurant prices

The Mister chains

Sometimes I feel the need to focus on ridiculousness in restaurants, maybe because I run across so many instances of it when I’m meandering through old sources. Lately I’ve been exploring franchising and have encountered numerous silly concepts expressed in the names of chains. Many businesses across the country adopted “Mister” or “Mr.” as part of their names, and this seems to have been particularly true of restaurant chains. [For now, I’m calling all of them Mister.]

There are also scores of restaurants with names such as Mister Mike’s or Mister T’s, but those are usually not part of franchise chains and the letter or nickname refers to an actual person, usually the owner, who may be known by that name in real life. I’m not including those here.

I’m more interested in the Misters that are not named for actual humans. At least I’m hoping that there is no real-life Mister Beef, Bun, Burger, Chicken, Drumstick, Fifteen, Hambone, Hamwich, Hofbrau, Pancake, Quick, Sandwich, Sirloin, Softee, Steak, Swiss, or Taco.

There were also Sir chains, such as Sir Beef, plus Kings and Senors. Were they in their own way an expression of multiculturalism? Being “continental,” Sir Beef was classier than most of the Misters.

For quite a while I believed there could be no Mister Chicken. That seemed obvious to me – who wants to be called a chicken? But then it occurred to me that I should do a little more research. I was proven wrong. Maybe I shouldn’t be surprised. Surrounding the logo shown here were the words: “Home of America’s Best Barbecue Chicken Since 1966!” Although there were restaurants by the same name in Rockford IL and Atlanta GA, I don’t know if they were related.

I find Mister Pancake’s face somehow threatening, but never mind that – he was a hit in his hometown of Indianapolis. He came into the world there in 1959, but I don’t know if he appeared anywhere else.

I especially like the logos that attempt to humanize food, particularly unlikely items such as hambones. Sadly for him and his girlfriend, Mister Hambone International – aka Hammy — really didn’t catch on. Starting out in Virginia in 1969, he opened at least one place in North Carolina, but nothing, I think, internationally.

Mister Softee with his natty bow tie, born in New Jersey, was mainly peddled out of ice cream trucks, but there were also restaurants of the same name that served hamburgers, steaks, hot dogs, fish, etc., along with the creamy guy. In 1967 a mobile franchise cost $2,500 while a restaurant was ten times that, which may account for why there were then 1,600 trucks — even as far off as the French West Indies — but only 5 restaurants. Overall, Mister Softee, like Mister Steak, had a more successful life than most of the Misters.

Mister Drumstick, born in Atlanta, offered the World’s Best Fried Chicken. I can’t help but wonder why he is holding a hamburger rather than a chicken leg. Maybe it was because his franchise was sold in connection with Mister Sirloin, a roast beefery, as well as Mister Hamwich, a ham sandwich purveyor. So far I’ve found four Mister Drumsticks in Atlanta and a few in Illinois, Ohio, and Missouri. Nino’s Mister Drumstick in Sandusky OH looks more athletic than Atlanta’s, but of course he has the advantage of legs. Was he a go-go dancer in an earlier phase of his career?

I like the Drumsticks, but my favorites are Mister Bun and Mister Sandwich (of New York City!). They are so versatile. They can handle anything that goes between two slices of bread. I don’t know what Mister Sandwich looked like but Mister Bun was a strange one, with his extremely short legs, his six-guns, and his 10-gallon hat. I can’t really figure him out. Is he trying to compensate for being nothing but bread?

The three Florida creators of Mister Bun had high hopes in 1968 when they opened their first location in Palm Beach, with plans to add more outlets in Florida as well as a number of other states where investors were interested. They advertised for franchisees by telling them that Mister Bun featured “the eight most popular food items in this nation.” It was true that Mister Bun could hold almost anything, so they settled on roast beef, cold cuts, roast pork, frankfurters and fish, accompanied by french fries and onion rings, and washed down with a range of beverages, including beer. Alas, Mister Bun had a rather unhappy life, experiencing little growth, abandonment by his primary creator, and time in court.

Females seemed to stay out of the game, so there are no Mrs. Buns, Mrs. Beefs, Mrs. Tacos . . . or Miss Steaks. Maybe theirs was the wiser course.

© Jan Whitaker, 2022


Filed under chain restaurants, food, restaurant fads, restaurant names, signs

Eye appeal

Long before the internet, color photography became a factor that restaurants had to take into account. In the 1974 book Focus on . . . adding eye appeal to foods, author Bruce H. Axler noted, “The dramatic four-color, full-spread photos of food appearing in magazines have set visual standards for the restaurateur.” Perhaps he was thinking of Gourmet magazine in particular.

Color photography began to be used for advertisements in magazines in the 1930s, and consequently became identified with commerce rather than art. It was used mostly in women’s magazines, frequently to advertise food products at a time when major brands and ad agencies were hiring home economists to oversee product promotion and photography.

After decades of viewing photos of brightly colored food arranged artistically in attractive settings, the American public, possibly women in particular, expected food to look as good as it tasted. With the increase in restaurant patronage in the 1960s and 1970s, restaurants began to realize they needed to focus more on the appearance of what they served.

Bruce Axler, building on considerable experience in the hospitality industry, set out to assist restaurateurs in dealing with vexing problems such as too much whiteness or brownness, shapeless blobs and piles, flat sandwiches, and the empty-plate look. Perhaps most important, he addressed the issue of commonplace food that didn’t look worth its high price considering how much cheaper it was at the place down the street.

Given patrons’ high expectations regarding visuals, Axler set out a depressingly cynical scenario on page 1: “If it [restaurant food] is any less luscious looking, it suffers by comparison to such photos; especially when the guest has had three ice-cold martinis and cannot really taste the difference between a prickly pear and a mashed rutabaga.” He seemed to suggest that restaurateurs couldn’t even count on taste and texture working for them anymore.

He also observed that some of the old-time fixes could no longer be relied upon. Broken potato chips couldn’t fill a void, he noted. Nor could food displays be enlivened by the old standbys parsley and paprika. “Buffets are loaded with mystery meats and salads similarly garnished with parsley and rouged with paprika like so many ancient chorines.”

He should have counseled against overuse of lettuce garnishes and potato borders too.

Axler’s suggestions included ladling soup from a tureen and serving sandwiches opened up, both to fill the plate and to display their innards. He advised that “Mounds are better than blobs, rolls better than slices, shingled layers better than piles,” and that vegetables should be portioned in odd numbers. To give the impression of increased worth, he recommended anchovy or grated cheese toppings.

At times his suggestions bordered on the desperate, such as “planting sparklers in food items” and floating small lit candles on soup croutons. I, for one, am not among the many customers he believed “would enjoy the visual appeal of a bright red tulip stuffed with chicken salad.”

Nonetheless, there is no doubt that restaurants were eager to adopt ideas such as his. Many have become standard practice, yet by now it has become clear that chefs have many more tricks up their sleeves, especially when it comes to making a dish look deserving of a high price. Some seem to go against the wisdom of the past. Who in the 1970s could have foreseen how powerfully miniature food artfully arranged on a king-size plate could signify a $$$$ restaurant?

© Jan Whitaker, 2019


Filed under food, restaurant customs, women

Crazy for crepes

The crepes craze, which began in the 1960s, became intense in the 1970s. By the late 1980s it had all but disappeared.

But before crepes achieved popularity, they were almost unknown in the U.S. The exception was Crepes Suzette, thin, delicate pancakes with an orange-butter sauce and liqueurs that were often dramatically lit aflame at the diners’ table. Like Cherries Jubilee, Crepes Suzette usually only appeared on high-priced menus, such as the Hotel Astor [1908 quotation].

Before 1960 even fewer restaurants served savory crepes, and those that did would also seem to have been expensive restaurants. In 1948 the Colony in New York City served Crepes Colony with a seafood filling. And in the late 1950s New York’s Quo Vadis offered Crepes Quo Vadis, filled with curried seafood and glazed with a white sauce, as hors d’oeuvres.

Although few Americans had ever eaten Crepes Suzette, it’s likely that the fame of this prized dish helped pave the way for the creperie craze, with restaurants primarily featuring crepes. Crepes were regarded as an exotic luxury dish that, by some miracle, was affordable to the average consumer, sometimes costing as little as 60 or 75 cents apiece around 1970.

Crepes enjoyed a mystique, offering a link to European culture and a break from the meat and potatoes that dominated most restaurant menus in the late 1960s and early 1970s.

At a time when America was seen as the world leader in modern ways of living – including industrially efficient food production — Europe was imagined as a romantically quaint Old World where traditional ways were preserved and many things were still handmade.

American creperies catered to their customers’ wish for a taste of Europe. With country French decor, servers in folk costumes, and names such as Old Brittany French Creperie and Maison des Crepes [pictured at top, Georgetown], diners were imaginatively transported to a delightfully foreign environment quite unlike the brand new shopping malls in which many creperies were located. Another exotic touch employed by quite a few creperies was to use the French circumflex mark in crêpes (which I have not done in this blogpost).

Filled with creamed chicken, ratatouille, or strawberries and whipped cream (etc.), crepes soon became a favorite lunch, dinner, and late-night supper for college students, dating couples, shoppers, and anyone seeking “something different.” Along with crepes, menus typically included a few soups, most likely including French onion soup, a spinach-y salad, and perhaps a carafe of wine.

San Francisco’s Magic Pan Creperie led the trend and, after being acquired by Quaker Oats in 1969, spread to cities across the country, with the chain eventually totaling about 112 locations. The first Magic Pan, a tiny place on Fillmore Street, was opened in 1965 by Paulette and Laszlo Fono, who came to this country in 1956 after the failed anti-Communist uprising in their native Hungary. A few years later they opened another Magic Pan in Ghirardelli Square and Laszlo patented a 10-pan crepe-maker capable of turning out 600 perfectly cooked crepes per hour [pictured here].

As Quaker opened Magic Pans, they invariably received a warm welcome in newspaper food pages. It was as though each chosen city had been “awarded” one of the creperies, usually situated in upscale suburban shopping malls such as St. Louis’s Frontenac Plaza or Hartford’s West Farms Mall. When a Magic Pan opened in Dallas’ North Park shopping center in 1974, it was called “as delightful a restaurant as one is likely to find in Dallas.”

Among Magic Pan amenities (beyond moderate prices), reviewers were pleased by fresh flowers on each table, good service, delicious food, pleasant decor, and late hours. Many of the Magic Pans stayed open as late as midnight – as did many independent crepe restaurants. [Des Moines, 1974]

In hindsight it’s apparent that creperies responded to Americans’ aspirations to broaden their experiences and enjoy what a wider world had to offer. It was a grand adventure for a high school or college French class or club to visit a creperie, watch crepe-making demonstrations, and have lunch. [below: student at the Magic Pan, Tulsa, 1979] But what one Arizona creperie owner called the “highbrow taco” did not appeal to everyone. The operator of a booth selling crepes at Illinois county fairs reported that hardly anyone bought them and that some fairgoers referred to them as creeps or craps.

I would judge that crepes and creperies reached the pinnacle of popularity in 1976, the year that Oster came out with an electric crepe maker for the home. Soon the downward slide began.

Quaker sold the Magic Pans in 1982 after years of declining profits. The new owner declared he would rid the chain of its “old-lady” image, i.e., attract more male customers. Menus were expanded to include heartier meat and pasta dishes.

Even though new creperies continued to open here and there – Baton Rouge got its first one in 1983 – there were signs as early as 1980 that the crepe craze was fading. A visitor to a National Restaurant Association convention that year reported that crepes were “passé” and restaurants were looking instead for new low-cost dishes using minimal amounts of meat or fish. A restaurant reviewer in 1986 dismissed crepes as “forgotten food” served only in conservative restaurant markets. Magic Pans were closing all over, and by the time the 20-year-old Magic Pan on Boston’s Newbury Street folded in 1993, very few, if any, remained.

© Jan Whitaker, 2018


Filed under alternative restaurants, atmosphere, chain restaurants, food, popular restaurants, proprietors & careers, restaurant fads

Dining with a disability

Throughout the 20th century the number of mobility-impaired Americans grew – due to medical advances, lengthening lifespans, polio epidemics, wars, and rising rates of automobile accidents. In the late 1950s and early 1960s the problem of physical barriers confronting those using wheelchairs, braces, canes, and walkers began to get attention, largely as a result of activism by the disabled.

At first the focus was on public buildings, but it soon expanded to include commercial sites such as restaurants. One of the early efforts to ease a path was the publication of a 1961 Detroit guide book that devoted several pages to describing features of two dozen popular restaurants that were at least minimally accessible. For instance The Village Manor in suburban Grosse Pointe had a street-level front entrance and a ramp in back as well as main floor restrooms outfitted with grab bars. But several of the restaurants listed had steps at entrances, narrow doorways, restrooms too small to maneuver a wheelchair, and tables too low for wheelchair seating.

In 1962 the National Society for Crippled Children and Adults (NSCCA, an organization that had added “Adults” to its name during WWII) joined with the President’s Committee on Employment of the Handicapped (established in 1947) to launch a nationwide movement to change architectural standards and building codes so as to remove barriers affecting people with mobility limitations. This marked a new attitude acknowledging that handicapped people wanted to “do more things and go more places” but were blocked by the built environment. It was becoming apparent, reported one newspaper, that those “who were no longer ‘shut-ins’ were ‘shut-outs.’”

In 1963 the NSCCA began sponsoring surveys of public and private buildings, including restaurants. In various cities local volunteers equipped with measuring tapes recorded the width of doorways, the number of steps, the presence of ramps and elevators, and the placement and design of restroom facilities. Meanwhile, in New Jersey the Garden State Parkway altered its restaurants and restrooms for disabled travelers.

Overall, though, there was very little action. The surveys showed that accessibility in the United States – not only in restaurants, but in schools, court houses, hospitals, churches, and all kinds of businesses – was rare. A survey of Oklahoma in 1968 revealed that only 32 of the first 2,144 public facilities checked were fully accessible to anyone operating their own wheelchair, while 60% were entirely inaccessible. In Oklahoma City, the state’s capital, only one of the 20 restaurants surveyed at that point could accommodate a wheelchair user.

Official recognition of the problems presented by architectural barriers came in 1968, with the passage of a federal law decreeing that any building constructed even partly with federal funds had to be barrier-free. Although restaurants remained unaffected by the law, it was significant for demonstrating a growing recognition that accessibility problems arose from the environment as much as from the disabilities of individuals. It would, however, take another 22 years, with passage of the Americans with Disabilities Act in 1990, before serious attention was given to eliminating obstacles in all kinds of public facilities.

Despite a common (and illogical) attitude held by numerous restaurant owners that there was no need to make their restaurants accessible since disabled people did not frequent them, there were a few owners who voluntarily removed barriers before the ADA passed. When the owner of the Kitchen Kettle in Portland OR remodeled in 1974 he built an entrance ramp and a low lunch counter. In Omaha, Grandmother’s Skillet, co-owned by Bob Kerrey who had lost a leg in the Vietnam war (and later became governor of Nebraska and a U.S. senator), had a restaurant designed in 1976 that could be used by anyone in a wheelchair or on crutches. In California, a builder constructed accessible homes as well as fast food restaurants with ramps and restroom grab bars in the mid-1970s.

In the 1980s it became a fairly common practice for restaurant reviewers to note whether an eating place was accessible or, more likely, not. Most of America remained inaccessible. As irony would have it, that included much of Future World at Disney’s Epcot Center. Several fast food cafes there required patrons to get into a line formed by bars that were spaced too narrowly for wheelchairs. Even more depressing were the ugly letters advice columnist Ann Landers received in 1986 after she defended the right of a handicapped woman to patronize restaurants. “Would you believe there are many handicapped people who take great pleasure in flaunting their disability so they can make able-bodied people feel guilty?” wrote one reader.

Passage of the ADA was a big step forward, but it didn’t work miracles. Even in the late 1990s it took enforcement activity from the U.S. Justice Department to get some restaurants to comply. Friendly’s, a family restaurant chain, was fined and compelled to alter entrances, widen vestibules, and lower counters, among other changes. Wendy’s settled out of court and agreed to remove or widen zigzag lanes at their counters.

Although many restaurants have gone to great lengths to guarantee accessibility, problems remain. Even when a restaurant is in compliance, there’s a good chance that disabled patrons will have an uncomfortable experience. This was detailed beautifully in a 2007 NYT story by Frank Bruni titled “When Accessibility Isn’t Hospitality.” His dining companion Jill Abramson, then managing editor of the paper and using a wheelchair following an accident, found that even luxury restaurants could present dismal challenges to patrons with mobility limitations.

© Jan Whitaker, 2017


Filed under patrons, restaurant decor

All the salad you can eat

The salad bar most likely developed from the Americanized version of the smorgasbord which, by the 1950s, had shed its Swedish overtones and turned into an all-you-can-eat buffet. The smorg concept lingered on for a while in the form of salad “tables” holding appetizers and a half dozen or so complete salads typically anchored by three-bean, macaroni, and gelatin. Eventually someone came up with the idea of simply providing components in accordance with the classic three-part American salad which structurally resembles the ice cream sundae: (1) a base, smothered with (2) a generous pouring of sauce, and finished with (3) abundant garnishes. Or, as a restaurant reviewer summarized it in the 1980s, “herbage, lubricant and crunchies.”

Whatever its origins, the salad bar as we know it – with its hallmark cherry tomatoes, bacon bits, and crocks full of raspberry and ranch dressings — became a restaurant fixture in the 1970s. Introduced as a novelty to convey hospitable “horn-of-plenty” abundance and to mollify guests waiting for their meat, it became so commonplace that the real novelty was a restaurant without one. Though strongly associated with steakhouses, particularly inexpensive chains, salad bars infiltrated restaurants of all sorts except, perhaps, for those at the pinnacle of fine dining. Salad bars were positively unstoppable at the Joshua Trees, the Beef ’n Barrels, and the Victoria Stations, some of which cunningly staged their salad fixings on vintage baggage carts, barrels, and the like.

Although industry consultants advised that a salad bar using pre-prepared items could increase sales while eliminating a pantry worker, restaurant managers often found that maintaining a salad setup was actually a full-time task. Tomatoes and garbanzos had a tendency to roll across the floor, dressings splashed onto clear plastic sneeze-guards, and croutons inevitably fell into the olde-tyme soup kettle.

The hygienic sneeze-guard came into use after World War II, first in school and hospital cafeterias. Although a version of it had appeared in commercial restaurants in the early 20th century with the growth of cafeterias, many restaurants served food buffet style into the 1950s and 1960s without any kind of barrier. In 1952 the Minneapolis Board of Health required uncovered smorgasbords either to install sneeze-guards or to close down, but their use does not seem to have become commonplace nationwide until the 1970s. Eklund’s Sweden House in Rockford IL thought them novel enough to mention specifically in a 1967 advertisement. Massachusetts ordered their use in restaurants with buffets or salad bars in 1975.

On the whole salad bars went over well with the public – and still do — but by the late 1970s professional restaurant critics were finding it hard to hide their disdain. Judging them mediocre, some blamed customers who were gullible enough to believe they were getting a bargain. Others were wistful, such as the forbearing reviewer in Columbia, Missouri, who confessed, “It would be a nice change to get something besides a tossed make-it-yourself salad, and to have it brought to the table.” The trend at the Missouri college town’s restaurants, however, was in the opposite direction. In the 1980s Faddenhappi’s and Katy Station ramped up competition by offering premium salad makings such as almonds and broccoli while Western Sizzlin’ Steaks pioneered a potato bar.

© Jan Whitaker, 2017


Filed under chain restaurants, food, restaurant customs, sanitation

Chocolate on the menu

Chocolate concoctions have always been found in the dessert section of restaurant menus. Right? You’ve already figured out that I’m going to say no. But, naturally, it’s a bit more complicated than that.

Until the later 19th century the main form in which Americans consumed chocolate in public eating places was not as a dessert but as a hot beverage.

Confusion arises over the meaning of dessert, which is used in various ways on American menus. In the 19th century, dessert often was the very last course, coming after “Pastry,” which included pies, cakes, puddings, and ice cream. In this case dessert meant fruit and nuts. But sometimes ice cream was listed under dessert. For example, the Hancock House hotel in Quincy MA displayed the following on a menu in June of 1853:

Puddings & Pastry
Sago Custards, Apple Pies, Mince Pies, Rhubarb Pies, Custards, Tarts
Dessert
Blanc Mange, Oranges, Almonds, Raisins, Strawberries, Ice Cream

In cheaper eating places, there was no fruit or nuts, and dessert came closer to what we mean today – which is how I will use the term for the rest of this post: sweet dishes served toward the end of the meal, rarely nuts and usually something other than simple fruit.

The absence of anything chocolate on the Hancock House menu was not unusual for that time. I looked at quite a lot of menus and the first instance of chocolate other than as a beverage that I found was chocolate ice cream in the 1860s. It was not too unusual to find chocolate eclairs on a menu in the later 19th century, and chocolate cake turned up in the 1890s. According to an entry in The Oxford Companion to Food and Drink, however, chocolate cake in the late 1800s could refer to yellow cake with chocolate frosting.

By the early 20th century chocolate appeared on menus in various forms: as pudding, layer cake, devil’s food cake, ice cream, eclairs, and ice cream sodas and sundaes. In the 1920s chocolate shops appeared that were similar to tea shops. They offered light meals, chocolate as a drink or as candy, and other desserts. They were popular with women, as were department store tea rooms, another type of eating place heavy on sweet things. At Shillito’s department store in Cincinnati, a 1947 menu offered quite a few chocolate treats.

Toasted Pecan Ice Cream Ball with Hot Fudge Sauce 35
Apple Pie 20
Black Raspberry Pie 20
Banana Cream Pie 20
Pineapple Layer Cake 20
Shillito’s Special Fudge Cake 20
Chocolate Doublette with Mint Ice Cream and Fudge Sauce 35
Chocolate Luxurro 35
Hot Fudge Sundae 25
Vanilla Ice Cream with Nesselrode Sauce 25
Fresh Peach Parfait 30
Pineapple or Orange Sherbet 15
Vanilla, Fresh Peach, Chocolate or Mint Ice Cream 20

A chocolate frenzy began in the 1970s, reached a high point in the 1980s, and continues today. With the help of restaurant marketing, millions of Americans discovered they were “chocoholics.”

If you stepped into San Francisco’s Pot of Fondue in 1970 you could do Cheese Fondue for an appetizer, Beef Bourguignonne Fondue as a main dish, and Chocolate Fondue for dessert. But the Aware Inn in Los Angeles pointed more forcefully at dessert trends to come with its 1970s “dangerous Chocolate Cream Supreme” costing $2 and described as “somewhere between chocolate mousse and fudge.”

Adjectives such as “dangerous” continued the sinful metaphor conveyed earlier by “devil’s food.” Soon “special” chocolate desserts were named for immoral inclinations (“decadence”) or perhaps fatal pleasures (“death by chocolate,” “killer cake”). All this led at least one journalist to protest the unsubtle marketing of chocolate desserts in the 1980s. She pleaded with servers: “Do not expect me to swoon when you roll back your eyes in ecstasy as you recite a dessert list that offers nothing but chocolate, via cheesecake, chip cake, profiteroles, madeleine, mousse, bombe, eclair, napoleon, torte, tart or brownie.”

Restaurant reviews from the 1980s show that most reviewers jumped on the chocolate bandwagon, with descriptions along the lines of “scrumptious” chocolate desserts “to die for.” But quite a few were critical, especially of chocolate mousse, which was readily available to restaurants powdered or wet, even “pipeable.” After a 1978 visit to a restaurant expo overflowing with convenience food products, the Washington Post’s restaurant reviewer Phyllis Richman observed, “The final insult of your dinner these days could be chocolate mousse made from a mix, but that is only another in the long line of desecrations in the name of chocolate mousse.” Critical reviewers often deplored chocolate mousse that tasted as if made of instant pudding mix combined with a non-dairy topping product, which very likely it was.

“Chocolate Decadence” cake took a beating in a review by Mimi Sheraton who in 1983 no doubt irritated many chocolate lovers when she referred to the prevalence of “dark, wet chocolate cake that seems greasy and unbaked, the cloying quality of such a sticky mass being synonymous with richness to immature palates.” More recently, what I call a “fantasy escape” restaurant in upstate New York was cited unfavorably for serving a boxed cake provided by a national food service that it merely defrosted, sprinkled with fresh raspberries, grandly named “Towering Chocolate Cake,” and placed on the menu for a goodly price.

Let the buyer beware – but no doubt many restaurant patrons do in fact realize that they are willing co-conspirators in fantasy meals. Along these lines, nothing can be too chocolate-y, triple obviously outdoing double. Decorations of some sort are de rigueur. Along with whipped cream, ultra-chocolate desserts might be adorned with orange rind slivers, raspberry sauce, or dripping frosting. In 1985 the Bennigan’s chain brought its “Death by Chocolate” into the world: two kinds of chocolate ice cream and chopped-up chocolate candy bars on a chocolate cracker crust, the whole thing dipped in chocolate and served with chocolate syrup on the side.

One theory about what brought about restaurants’ chocolate dessert blitz relates it to declining sales of mixed drinks in the 1980s as patrons became aware of the dangers of drinking and driving. Then, according to a 1985 Wall Street Journal story, elaborate, expensive desserts offered a way to make up for lost cocktail sales. Fancy desserts are undoubtedly higher-profit items than many entrees, but I suspect that another major factor favoring the rise of ultra-chocolate desserts was the culture of consumer indulgence that increased restaurant patronage in the 1970s, 1980s, and beyond.

© Jan Whitaker, 2017


Filed under food, guides & reviews, menus

Why the parsley garnish?

[photo: parsley-garnished chicken]

Nothing decorated more restaurant plates in the 20th century than parsley, most of it by all accounts uneaten.

Why use so much of what nobody wanted? The best answer I can come up with is that parsley sprigs were there to fill empty spaces on the plate and to add color to dull looking food.

Parsley was not the only garnish around, but it has probably been the most heavily used over time. It has shared the role of plate greenery with lettuce, especially after WWII when lettuce became readily available, and to a lesser extent with watercress.

Parsley has long been a favorite in butcher shops where it is tucked around steaks and roasts. As early as 1886 restaurants were advised to emulate butchers and decorate food in their show windows with “a big, red porterhouse steak, with an edge of snow-white fat, laid in the center of a wreath of green parsley.” By the early 20th century, almost the entire U.S. parsley crop, more than half of which was grown in Louisiana and New York, went to restaurants and butchers. By 1915 parsley sprigs were a ubiquitous restaurant garnish that many regarded as a nuisance. Diners sometimes suspected that the parsley on their plate had been recycled from a previous customer.

While European chefs use garnishes as edible complements to the main dish, Americans have focused primarily on their visual properties.

Around 1970, when convenience foods invaded restaurant kitchens, garnishes took on heightened significance in jazzing up lackluster, monochromatic frozen entrees. In the words of the Convenience and Fast Food Handbook (1973): “The emergence of pre-prepared frozen entrees on a broad scale has revived the importance of garnishing and in addition, has led to innovative methods of food handling, preparation and plating. If an organization is to achieve sustained success in this field, emphasis must be placed on garnishing and plating. These are the two essentials that provide the customer with excitement and satisfaction.” [partial book cover shown above, 1969]

Excitement?

The head of the Southern California Restaurant Association admitted in 1978 that he hated to see all the food used as garnishes go to waste in his restaurant, including “tons” of lettuce. But this was necessary for merchandising, he said: “We have to make food attractive. It’s part of the cost of putting an item on the table.” It was – and is – probably true that an ungarnished plate such as shown here looked unattractive to most Americans.

[photo: filet mignon garnished with a radish rose]

So many garnishes decorated food in American restaurants in the 1970s that food maestro James Beard got very grumpy about it, calling it stupid and gauche. He could allow watercress with lamb chops or raw onion rings on a salad, but put a strawberry in the center of his grapefruit half and he was outraged. Next to orange slices and twists, his most detested “tricky” garnishes were tomato roses and flowers. Funny that he didn’t mention radish roses such as the one shown above.

© Jan Whitaker, 2008, revised 2015


Filed under food, restaurant customs