Monday, October 24, 2011

WHY WE QUIT GETTING PLASTERED

Perhaps the most singular trait of American homes is the hollow, cardboardy thud of our gypsum-board walls.  No one else has anything quite like them.  Mind you, if it weren’t for World War II, our walls might not sound quite so hollow.  

Before the war, American homes were routinely plastered inside--a painstaking process that first required nailing thousands of feet of wooden stripping called lath to the ceiling and walls of every room. The lath was covered with a coarse layer of plaster known as a “scratch coat”.  The wet plaster squeezed through the gaps in the lath, locking it to the walls and ceiling. 

Days later, when the scratch coat was dry, a second “brown coat” was applied to make the surfaces roughly flat.  This, too, had to dry for several days.  Last came the “skim coat”, a thin layer of pure white plaster that produced a smooth finished surface, much as a cream cheese topping finishes a cheesecake.  

Depending on the weather, this process could take days or weeks, during which no other trade could work inside the house.  This was how plasterwork had been done for centuries, and there seemed no reason to change.  

Then came World War II, and with it an urgent need for military structures ranging from barracks to whole bases.  Faced with shortages of both labor and material, Uncle Sam was desperate to find faster and cheaper ways to build.  And since beauty was not much of an issue, eliminating plaster was an obvious starting point.

Enter the United States Gypsum Company, which way back in 1916 had invented a building board made of gypsum sandwiched between sheets of tough paper.  After more than two decades, the product they called Sheetrock still hadn’t really caught on.  Even its successful use in most of the buildings at the Chicago World’s Fair of 1933-34 didn’t do much for sales.  But the urgencies of wartime construction changed all that.  

As the government soon came to appreciate, Sheetrock did away with the need for wood lath, multiple plaster coats, and days and days of drying time (hence its generic name, “drywall”).  Installation was simple:  After the 4x8 sheets were nailed up, the nail holes were filled, paper tape was used to cover the joints, and a textured coating was troweled on to help disguise the defects.

Of course, all this was only meant as a stopgap replacement for plaster, but as you’ve probably guessed, it didn’t turn out that way.  By the war’s end, many builders who’d gotten used to slapping up drywall were suddenly reluctant to go back to the trouble and expense of plastering. 

What’s more, Sheetrock’s arrival coincided with the rise of modern architecture, which preferred plain, flat surfaces to the fussy moldings and reveals of prewar styles.  To Modernist tastes, the fact that Sheetrock couldn’t be molded the way wet plaster could was hardly a drawback.  People seemed more dismayed by the flimsy cardboardish sound of the walls in their postwar homes, but they soon got used to it.

Flimsy or not, there’s no doubt that Sheetrock proved a huge boon to the postwar housing industry.  Prior to the war, the typical American developer built about four houses a year.  By the late Forties, a developer like the legendary Bill Levitt was able to churn out 17,000 tract homes at Long Island’s Levittown, sell them for $7,990, and still make a thousand dollars profit on each.  Mass production was the key to the postwar housing boom, and Sheetrock helped make it happen.  

Just something to bear in mind next time your kids smash a doorknob through the bedroom wall.

Tuesday, October 11, 2011

MAKING LIGHT

Next time you head for the bathroom in the middle of the night, consider what the casual act of lighting your way would’ve entailed just over a century ago: If you were lucky enough to have a house with piped-in gas, you could strike a match to the nearest gas mantle to get a blinding white flame. Otherwise, you’d have to stumble your way to the john by the light of a guttering candle. No wonder so many Victorian houses burned to the ground.

Although nowadays it’s hard to imagine a world without electric lighting, it's been with us for a relative wink of an eye. Thomas Edison perfected his incandescent bulb in 1879, after trying out hundreds of filament materials ranging from hair to paper to bamboo (he finally settled on carbonized bamboo; tungsten filaments came decades later). Not so well known is that Edison also had to invent a way to evacuate the air from the bulbs--no mean task using Victorian technology.  

Even so, it took another twenty years or so before electric lights had largely replaced gas mantles in American homes. As late as the early 1900s, older houses with gaslight were still being retrofitted for electricity. These transitional houses are easy to spot: the wires leading to the electric fixtures were often run inside the old gas pipes. 

In the early days of electric lighting, fixtures flaunted their naked bulbs so that no one could possibly mistake them for gas.  It was a way for people to advertise their modernity, much as hipsters of the 1990s sported conspicuous cell phone antennas on their cars.

Since that time, there have been surprisingly few fundamental changes in residential lighting.  Switches and wiring were eventually hidden inside walls instead of being mounted on their surfaces, but other than that, most houses continued to have lighting fixtures in the center of ceilings, much as they had in the days of gaslight. The Revivalist home styles of the 1920s brought a craze for wall sconces--another gaslight derivative--but the fashion had largely died out by the end of that decade.

The first really new development in lighting since Edison’s light bulb was neon tubing. It made its American debut in a sign for a Packard showroom, and by the early 1930s it was all the rage as signage for movie theaters and other commercial buildings. However, with its otherworldly glow, it found little use in residential design.  

Fluorescent lighting (not to be confused with neon) was introduced not long afterward.  Being diffuse and hence glare-free, and also producing much more light for a given amount of power, it quickly became the standard for commercial buildings.  Still, no matter how hard architects tried to push its use in luminous ceilings and other Modernist lighting concepts, the sickly blue-green quality of its light did not endear it to homeowners. It took another forty years of improvement, as well as laws mandating its use, before fluorescent lighting was grudgingly accepted into American homes.

In the interim, a number of other high-efficiency lighting types have been developed, including mercury vapor, sodium vapor, and metal halide, but the unnatural spectrum of light they produce has also precluded their use in domestic work. 

By contrast, halogen residential lighting, introduced in the 1980s, was an instant hit with the public. Why? Halogen’s warm, yellow-white light is very close to the spectrum of sunlight. Accordingly, engineers are currently working hard to make the next big development in high-efficiency lighting--light-emitting diodes, or LEDs--as warm and friendly as incandescent and halogen lamps.

Because the sun, after all, is still everyone’s favorite lighting fixture.