Wednesday, September 23, 2020


Author's Note: Please excuse the godawful layout of this blog. Google, in its infinite wisdom, has recently seen fit to introduce an "improved" interface that, to put it simply, just doesn't work. As far as I've been able to determine, gobs of functionality and simplicity have been removed, and nothing of particular value has been added. In the last few months I've managed to dodge having to use this abominable "upgrade" by reverting to the legacy interface. Alas, Google has now pulled the plug on that option as well, and you see the pitiful result below. Even at that, getting it to this level required hours upon hours of frustration.  —Arrol Gellner

If you’ve ever seen one of the old Buck Rogers movie serials, with their packing-crate robots and Art Deco rockets shooting sparks, you can appreciate how quaint another era’s vision of the future can be—and how difficult it is to get it right. Yet speculating on things to come, whether in writing, in images, or in three dimensions, is something humans find irresistible. 

Architects are no exception. The Futurist movement of the early 20th century, for instance, saw technology as man’s savior, and its proponents liked to wax poetic over things like turbines and high-voltage towers. Yet to many modern eyes, their stark, mechanistic cities of tomorrow are not so much redemptive as sinister.

"Cityscape", a 1939 work by the Italian Futurist
Tullio Crali

During the 1920s, the Russian Constructivists saw architecture in equally edgy terms. Thanks to Stalin’s growing distaste for their work, their most ambitious ideas, like those of the Futurists, were never built. This fact has ironically worked in their favor, since speculating on the future is a good deal safer than actually trying to build it in three dimensions. Paper predictions remain snugly encased in the context of their own time, while real structures must actually occupy—however uncomfortably—the future they were meant to predict. 

The "House of the Future" at Disneyland
in Anaheim, California, circa 1957.
Its predictions turned out to be wildly off the mark.

Disneyland’s 1957 House of the Future, an all-plastic home designed by MIT and sponsored by the chemical giant Monsanto, is a classic example of this phenomenon.  With its plastic furniture, plastic dishes, and molded plastic walls, it turned out to be an almost comically inept predictor of housing’s future. While plastics did find limited acceptance in many kinds of building materials, from drain pipes to windows, the plastics revolution augured by the House of the Future never materialized. Indeed, the actual building trends of the early twenty-first century have shown a steady retreat from man-made polymers and controlled environments, back toward organic materials and more environmentally-sensitive design.

An unusual view of the 1939 New York World's Fair's
theme structures, the Trylon and Perisphere.
(Architects: Wallace Harrison and
J. Andre Fouilhoux)

Theme parks and expositions in general have been a steady source of futuristic centerpieces, from the Trylon and Perisphere of the 1939 New York World’s Fair, to the globe-like, 140-foot-tall Unisphere at the 1964 fair held on the same site, to the more recent Spaceship Earth, the Florida EPCOT Center’s eighteen-story geodesic sphere of 1982.

Overshadowing all of these is the 605-foot-tall Space Needle, centerpiece of the 1962 Seattle World’s Fair. With its concave pylons and flying-saucer superstructure, the Space Needle evoked the sort of future in which people would have robot housekeepers and fly around in jet-powered backpacks—that is, when they weren’t out driving their atomic cars. This space-age optimism even permeates the color names used in the tower’s paint scheme:  Astronaut White, Orbital Olive, Re-entry Red, and Galaxy Gold.

The observation platform of Seattle's famed Space Needle (1962)
In an era of boundless optimism, the future looked like this.
(Architects: Edward E. Carlson and John Graham Jr.)

As a now charmingly-retro hallmark for Seattle, the Space Needle has been an unqualified success—even today, it remains the city’s biggest tourist attraction. As a predictor of future architectural trends, though, the Needle missed the mark.  

The fact that the Space Needle and its futuristic brethren already seemed quaintly outdated within a decade of their completion shows just how risky building a vision of the future can be. It’s a sure bet that our own “House of Tomorrow” predictions about computer-orchestrated homes—the sort of scenario in which your toaster automatically goes online to buy more Eggos—are just as likely to come to naught.

Still, architects will no doubt keep offering you their ideas of what’s to come. Whether our predictions pan out or not—well, notwithstanding Covid-19, a worrisome election, and any other nasty surprises 2020 may have in store for us—the future will be here soon enough.


Monday, September 14, 2020

INDIRECT LIGHTING: From The Stage To Your Living Room

Early predecessor of indirect lighting:
Limelight spotlight, used to illuminate
the front stage area of theaters
until the end of the nineteenth century.

What do movie palaces have to do with how you light your home?

Plenty. After electric lighting replaced gaslight at the end of the 19th century, most electric lighting was “specular”, a fancy way of saying it came from a point source like the white-hot filament of a standard light bulb. That situation changed during the 1920s with the arrival of indirect lighting (“indirect” meaning that the light source is hidden).   

Indirect lighting took a while to catch on because, at first, electric fixtures were used just like gas mantles. No one thought of hiding them, since doing so would have been foolhardy with gas.  Moreover, exposed light bulbs were initially seen as an emblem of modernity.  

If you’ve ever tried to read by the light of an unshaded light bulb, though, you know that the glare they produce can be a real problem. Indirect lighting provided a dramatic solution: by concealing the light source, it diffused the light and, unlike an ordinary shade, completely eliminated specular glare.  

Spectacular use of soffit lighting in the
auditorium of the Wiltern Theater,
Los Angeles, c. 1931 (Architects: Stiles O. 
Clements and G. Albert Landsburgh)

Movie theaters were among the first to adopt indirect lighting. Auditoriums needed subdued lighting for safety even during the show, and of course having a lot of glary specular lamps wouldn’t do. Since live theaters had long used concealed footlights along the front edge of the stage—the famous “limelight”—it wasn’t much of a stretch to use indirect lighting in other parts of the building.

Perhaps the most dramatic new form of indirect lighting in theaters was soffit lighting.  Typically, it consisted of a ceiling that stepped up from a low level at the perimeter (the “soffit”) to a higher one in the center.  Lighting fixtures were hidden in a continuous horizontal recess separating the two levels, so that a diffuse, glare-free light would bounce off of the upper ceiling into the space below.   

Indirect under cabinet lighting
provides the most even and
glare-free lighting for
kitchen work surfaces.

The futuristic hovering effect this technique produced soon became a favorite with Art Deco commercial architects, who used it in countless clever ways.  Naturally, it wasn’t long before these ideas were showing up in the latest homes as well.

But don’t think indirect lighting is all just theatrical razzle dazzle. It can be practical as well. For example, if you mount miniature fixtures under your kitchen’s wall cabinets and conceal them with a shallow skirt or “valance”, they’ll light the countertop beautifully, but won’t shine in your eyes. 

What’s more, indirect lighting can be remarkably cheap. Since you don’t see the light source, you can use ordinary fixtures costing a few dollars—instead of overpriced boutique fixtures costing hundreds—and still get very sophisticated results. Today, LEDs have vastly expanded the opportunities for indirect lighting. LED lighting strip is available as narrow as 3/8" wide, allowing it to be hidden practically anywhere. 

LED lighting strip has made it possible
to install indirect lighting in places it
couldn't go before.

However, indirect lighting can be low-tech as well; depending on the space available, ordinary porcelain sockets, light ropes, or even strands of miniature Christmas lights will do the job. Nor does the structure that conceals the lamps have to be expensive: most soffit lighting, for example, consists of little more than an ordinary lumber framework finished with drywall.

Regardless of how you design your indirect lighting, though, remember that the lamps—yes, even LEDs—will need replacement now and then. Make sure that you have reasonable access, especially in tight locations like ceiling coves. And for heaven’s sake, turn off the juice first.

Tuesday, September 8, 2020


Heaven help the pedestrian in shopping centers like this one—
which unfortunately are typical across the nation.

A few blocks from my office, there’s a dreary, ten-year-old strip mall fronted by literally acres of unrelieved parking lot.  Though it has no fewer than five separate entrances for cars, God help anyone who dares to approach the place on foot. To reach its quarter-mile-long phalanx of storefronts, you can either negotiate the single paltry thread of sidewalk the developers saw fit to provide, or else try to cross a vast sea of dirty asphalt on foot, with cars flashing carelessly past on all sides and bearing down behind you unseen.

One of the many exasperating tenets of postwar planning was the assumption that nobody would ever want to walk anywhere, anytime. Shopping centers, not to speak of downtown streets, were laid out mainly to suit automobiles and not people. Seemingly, the only time a human was expected to walk outdoors was enroute to the driver’s seat.  

In an environment designed for
and dominated by cars,
pedestrians are just in the way.

Yet many people do walk, and hopefully many more will do so in coming years. What with traffic snarls, interminable waits at signals, and the inevitable battle for parking, it’s often quite literally faster to walk three or four blocks than it is to drive that far. And mind you, I say this as a lifelong motorhead.

Given all the bad things we’ve found out about designing cities around cars instead of people, modern planners are doing their best to bring pedestrians into this creaky old equation. It’s a fine idea in theory, but in practice, wherever cars and pedestrians mix, the cars invariably win out. The reason is obvious:  Since a car weighs twenty to thirty times what a person does, any contest between the two will not end up in the pedestrian’s favor. Hence, we’re psychologically conditioned from childhood to subordinate ourselves to those big bad cars.  

Self-driving cars are not going to change scenes like this—
they may even make them worse. The problem is
in the cars, not in who's driving them.
(Image: Bill O'Leary, The Washington Post)

Less obvious, but just as problematic, a car also takes up about thirty times as much space as a person on foot, resulting in vast areas of our cities that have no function whatever but to store our four-wheeled friends. All told, we pave over about forty percent of our cities solely to accommodate motor vehicles (in Los Angeles, the figure is said to be closer to sixty percent).  This autocentric environment extends right into our own homes, one-quarter of which we happily devote to garage space. 

For decades, the rhetoric of New Urbanist planning has promised to reverse these twisted priorities. More recently we've heard utopian predictions about the benefits of self-driving cars, but these, too, will not address the root problem—driverless or not, they are still cars, and will still dominate public roads at the expense of those who'd rather walk. 

In my town, you'll find this lovely shopping street—
but rather than making it a pedestrian mall, traffic engineers
decided to let cars go barreling down the middle of it.

Much has been predicted, but little has actually changed on the ground. I recently stopped in at yet another shopping complex not far from my office, this one barely two years old. Unlike the stupefying strip mall mentioned earlier, this “retail village” employs many of the latest New Urbanist planning ideas—varied building facades, happy little plazas, pretty paving, and the like.

For hapless shoppers, alas, these potentially lovely surroundings are completely co-opted by the constant stream of cars that go barreling right through the heart of the place. That’s right:  For some inexplicable reason, automobiles weren’t barred from what might have been a charming little shopping lane.

So far, neither New Urbanist rules nor Silicon Valley tech have been enough to change the game.  Those big bad cars are still winning it.  

Tuesday, September 1, 2020


The Great Pyramid of Giza, built 2580 to 2560 BC.
Architect/master builder: Hemiunu

In the past, an architect was just what the Latin word suggested—a “master builder”.  Practical experience was the most important schooling such a person could have, and architects thus trained gave us the Great Pyramid of Giza, the Parthenon, and all the cathedrals of the Middle Ages.

Only during the past hundred years or so has the right to use the title “architect” been determined by academic degrees and testing rather than by practice. In 1897, Illinois became the first state to require that architects be licensed. California followed suit in the early years of the new century.  

The National Council of Architectural Registration Boards was founded in 1919 and held its first annual meeting two years later. Given the ever-increasing complexity of building technology, the remaining states instituted requirements for licensure over the next thirty years, with the last two holdouts, Vermont and Wyoming, doing so only in 1951.

Today, no one may use the title “architect” in the United States without fulfilling a seven-and-a-half-year-long course of education and office internship, including an exhaustive series of examinations. Despite the rigors of this procedure, mere possession of an architectural license has never been a guarantee of talent. Or, as my old boss used to put it, “You can have a fishing license, but it doesn’t mean you’re gonna catch any fish.”

The boardroom at Frank Lloyd Wright's Taliesin West, one of the
handful of schools that still emphasize hands-on training.
Most of the facility was built by its students.

Conversely, a lack of formal education and licensure hasn’t always ruled out extraordinary ability. The last two installments in this series recounted the careers of six non-architects—Frank Lloyd Wright, Addison Mizner, Cliff May, Carr Jones, Buckminster Fuller, and Craig Ellwood—who changed the course of architecture and, just as important, made the world a more interesting and beautiful place.

None of the six had formal training or licenses (in Wright’s case, his practice predated licensure requirements). Wright and Mizner gained their entire architectural educations through apprenticeship—Wright with Louis Sullivan, and Mizner with Willis Polk. May, Jones, Fuller, and Ellwood had no formal architectural training whatever.  

None of this is meant to suggest that no schooling is better than bad schooling, or that licensure is unimportant. But it does suggest that there are alternatives to the usual way we teach architecture and building, and how we judge architectural skill.  

Buckminster Fuller, non-architect, but one of the
most creative thinkers and builders of the
twentieth century.

It’s no accident that each of the gifted non-architects cited above learned his craft mainly through practical experience, not through academics.  Today, a handful of schools still struggle to include such hands-on training—Wright’s Taliesin and Paolo Soleri’s Arcosanti among them. Yet for the most part, the architectural establishment remains firmly entrenched in the belief that formal schooling and office internship are the only legitimate basis for competence and licensure.

Today, few would deny the contributions of geniuses like Wright and Fuller, romantics like Jones, Mizner and May, and even consummate front men like Ellwood. Yet the current process of education and licensure, overwhelmingly weighted as it is toward academic and office training, holds little room for such mavericks in the future. That’s a pity, because in many ways, the practically-trained architect follows most closely in the footsteps of the “master builder”.

Monday, August 24, 2020


Architect Carr Jones managed to conjure lyrical homes out of
castoff materials—practicing green architecture long before
that term was invented.

Last time, we looked at the careers of Frank Lloyd Wright, Addison Mizner, and Cliff May, all renowned architects who were never formally trained or licensed. Today we’ll touch on a few more architects who made an undeniable contribution to the profession, despite their lack of formal credentials.

Carr Jones, a designer-builder who practiced in the San Francisco Bay Area for almost half a century beginning in the late teens, was a pioneer in green architecture if ever there was one. Jones fashioned lyrically beautiful homes out of used brick, salvaged timber, and castoff pieces of tile, slate, and iron, often wrapping his dramatically-vaulted rooms around a landscaped central court. 

Some of Jones's interiors are startling in their modernity;
this living room of a Carr Jones home in Piedmont,
California dates from 1932.

Perhaps because he was trained as a mechanical engineer and never traveled abroad, Jones was all but innocent of architectural pretension. Instead, he built on unvarying principles of comfort, conservation, and craftsmanship. And unlike many trained architects whose style changes with every faddish breeze that blows, Jones’s convictions remained uncompromised right down to his death in 1966.

R. Buckminster Fuller had no architectural training either, and indeed was expelled from Harvard during his freshman year for "irresponsibility and lack of interest". His first job was working as an apprentice machine fitter. Yet over the course of his long and wide-ranging career, Fuller’s architectural innovations included not only the geodesic dome—his best-known invention—but also the gleaming, steel-sheathed Dymaxion House, a dwelling meant to be mass produced in a factory and installed on the site as you might bolt down a lamppost.    

Buckminster Fuller posing with an early model of his
Dymaxion house, circa 1927. 

In the context of today’s fussy, retrograde home designs, Fuller’s visionary proposals for the geodesic dome and the futuristic Dymaxion House may draw smiles, but this reflects more on the glacial pace of architectural progress than any flaw in Fuller’s thinking.

A later Dymaxion house in Rose Hill, Kansas, designed
to be built using postwar-idled aircraft plants,
and built between 1948 and 1958.

Not surprisingly, Fuller dismissed conventional architects, saying: “They work under a system that hasn't changed since the Pharaohs.” During his lifetime, the onetime Harvard dropout received exactly 47 honorary doctorates from universities the world over, and today is deservedly included in practically any general survey of twentieth-century architecture.

One highly influential non-architect had creative skills of another kind. Craig Ellwood was the celebrated Southern California modernist whom one critic called “the very best young architect to emerge from the West Coast in the years following World War II.”  A brilliant self-promoter, Ellwood (who was born Johnny Burke and took his tonier surname from a local liquor store) parlayed some minor development experience into a career that reached the highest echelon of modern architecture. So skilled was Ellwood at presenting himself that despite being barely educated—his entire formal training consisted of night classes at UCLA—he was twice considered for the deanship at Mies van der Rohe’s Illinois Institute of Technology.  

A typically elegant Craig Ellwood design in the Brentwood
area of Los Angeles, circa 1958.

Understandably, Ellwood took pains to hide from his elite clientele the fact that he was unlicensed, and he relied heavily on a gifted staff to carry out his basic concepts. That he was able to enrapture critics, editors, and clients alike despite his lack of education can only increase one’s admiration for his skill. And in the final analysis, nothing can detract from the breathtakingly elegant steel-and-glass creations that are the legacy of the Ellwood office.

Next week:  The common thread among great architects and great non-architects alike.

Monday, August 17, 2020


One of Frank Lloyd Wright's earliest and
least-known "commissions"—this curious
windmill tower built for his family's
Spring Green, Wisconsin, farm in 1897.

Though some of my colleagues might cringe to hear it, non-architects—those who lacked either the formal schooling or the license to legally use the title “architect”—have had a huge impact on American architecture over the past century.  If they weren’t architects in the legal sense, they more than lived up to the title’s original meaning of “master builder”.

Why not start at the top? Frank Lloyd Wright’s only formal training consisted of a year of engineering classes at the University of Wisconsin. Thoroughly bored, he dropped out in 1888 and headed for Chicago to find a job.  He quickly found one, first apprenticing with the Chicago architect J. Lyman Silsbee, and later and more famously with his “lieber Meister”, Louis Sullivan.

Addison Mizner rose to become one of the "must-have"
society architects of Palm Beach, despite his lack of
formal credentials. Among his most enchanting works
is this Palm Beach shopping court, now named for him.

In 1893, after a falling out with Sullivan over taking outside work, Wright left the firm and opened his own office, where he was able to use the title “architect” only because his practice predated the Illinois licensure requirements by four years. Wright nurtured a lifelong disdain for traditional architectural training, which eventually led him to found the Taliesin Fellowship, a unique school in which apprentice architects learned largely by doing.

Addison Mizner, whose work
was seldom taken seriously
by the architectural profession
despite his great success.

But Wright is only the best-known example of brilliant architects with unconventional or even nonexistent educations. In another vein entirely is Addison Mizner, the California-born, Guatemala-raised, Florida-polished raconteur who improbably rose to become the top society architect of Palm Beach during the Roaring Twenties. Mizner despised school, and accordingly his only architectural training was a three-year apprenticeship with the San Francisco architect Willis Polk. The happy result was a personal style that drew more from his childhood knowledge of Spanish Colonial Guatemala than from the copybooks so beloved by his contemporaries.

One of Cliff May's early Spanish Revival homes in
San Diego's Talmadge Park, designed in 1932, when
May was just 23 years old.
(Photo: Sande Lollis, San Diego Union-Tribune)

Nevertheless, Mizner’s romantic antiquarian villas were considered vulgar setpieces by his academically-trained colleagues. It probably didn’t help that he also ran a business manufacturing mock-antique furniture and building materials, which he used liberally in his own work. Mizner’s career was spectacular but brief; he died in 1933. Today, his surviving Palm Beach work ranks among the finest Spanish Revival architecture in the nation.

On the opposite coast, Cliff May, the San Diego architect widely considered the father of the California Rancher, started his career building Monterey-style furniture. When he began designing Spanish Colonial-style houses for speculative builders in the early 1930s, academic architects dismissed him as a purveyor of kitsch. Yet over time, May’s rambling, site-sensitive designs metamorphosed into the rustic and low-slung homes that Americans came to love so well. All told, May built his Ranchers in forty U.S. states, and their spiritual heirs went on to become the dominant style of the postwar era. Genuine May-designed Ranchers, not to mention his earlier Spanish Revival designs, are now celebrated and studied by architectural connoisseurs.  

May's later designs hewed to a more Mid-Century vibe,
such as this Long Beach "pool house" of 1953.
Cliff May: Despite his skill,
"real" architects didn't want him
in the club.

Despite these formidable accomplishments, May received only late and grudging acceptance from his licensed colleagues—or as he rather poignantly put it, “It took real architects a long time to let me into the club.”

Next time, we’ll look at a few more outsiders who changed the course of architecture, and see what they all had in common.

Tuesday, August 11, 2020


The iPhone screen, with its idiot-proof icons,
builds on the long evolution of Apple's
graphical user interface.

For close to a decade now, every time I’ve had to use yet another badly-designed appliance, or had to sit idling at yet another ineptly-timed traffic light, or had to decipher yet another garbled set of instructions, I’ve thought of one man: Steven Jobs. And I wish he were still with us, or barring that, that at least there could’ve been a hundred more like him.

There’s no doubt that, with Jobs’s passing, the world lost one of the most important visionaries of the last hundred years. But for me, the loss has less to do with his putting a computer for the rest of us on a million desktops, or with his uncanny knack for creating things that people didn’t even know they needed. Granted, these accomplishments are vastly important to Jobs’s legacy. But to my mind, his ultimate triumph was his singular skill at persuading a largely indifferent public that excellent design really matters. He wanted us all to be as passionate about beauty and simplicity as he himself was. And to the extent that Apple’s famously intuitive and user-friendly products are now more popular than ever, he seems finally to have succeeded.

A young Jobs poses with the original Macintosh,
circa 1984.

The fact is that the average American consumer has been amazingly tolerant of third-rate product design. Consequently—and understandably—any company that knows it can make perfectly good money selling clumsy, overcomplicated, or unintuitive products has no incentive whatever to improve them. And so most don’t. 

Apple's logo circa the 1980s.

In Jobs, however, we had the unique case of a businessman on a near-religious crusade to educate his own market, relentlessly challenging us to demand more than the run-of-the-mill crap we’re typically offered.

It’s interesting to note that the Apple cofounder, despite being a pioneer in one of the most technically complex fields yet known to man, was not an engineer but rather a laid-back college dropout with a mystical streak. To add yet another layer of paradox to this singular mind, he was notoriously—some would say tyrannically—demanding of the people who worked for him. But if this is what it took to engender the phenomenally beautiful and beautifully functional objects Apple has created over the years, then it was all worth it.

The iPhone 11: Would Jobs have approved? Hmm....

As you’ve probably guessed, I write on a Macintosh, and have done since I bought the very first model through an Apple engineer pal back in 1984. So yes, kids—I’ve been a true believer since long before the iPod, iPad, or iPhone even existed. In fact, I was a believer back when Steve Jobs still had a full head of hair. And for many of those years, I tried in vain to convince doubters why there was nothing like using a Mac—in short, why good design really mattered. Thankfully, with the wild success of those assorted i-Things, Jobs was finally able to make that case beyond any doubt.

Whether Apple has been able to maintain its "insanely great" design in the near ten-year absence of Steve Jobs is debatable, as one look at the plug-ugly iPhone 11 makes clear. It's almost certain that, with all due acknowledgment of its technical brilliance, Jobs would not have tolerated the inelegance of its design.

Jobs had already revolutionized the fields of computing, film, music, and telephonics. I wish he’d been given the time for even more far-flung conquests. The world could have used a hundred more like him, but alas, there was only one.