How Emerging Technologies Allow Anyone to Create His Own Culture

10 Mins | March 24, 2011

Through television, newspapers, radio, and advertising, the mass culture of the twentieth century created easily understandable points of reference for virtually everyone. Often, these were low and crude and coarse. But everyone knew who Ralph Kramden was. Who Batman was. Who Vince Lombardi was. You might not have known who Gene Roddenberry was, but you knew that NBC had a show starring a guy with pointed ears.

Today, however, we’re looking at that shared culture in the rearview mirror, and with mixed emotions. In fact, we’re witnessing the death throes of mass culture. It’s being replaced, not by the elder President Bush’s “thousand points of light,” but by a thousand fractured micro-cultures, each of which knows only a little bit about what’s going on in the next micro-culture thriving on the website next door.

As James Lileks of Lileks.com and the Minneapolis Star-Tribune’s Buzz.mn told me a couple of years ago: “Take a basically divided populace—the old red and blue paradigm—and then shove that through a prism which splinters it into millions of different individual demographics, each of which have their own music channel, their own website, their own Blogosphere, their own porn preferences delivered daily by email solicitations. I mean, it’s hard to say whether or not there will eventually be a common culture for which we can have sport, other than making fun of the fact that we really lack a common culture.”

This trend has both good and bad aspects. But before we turn our attention to that—and what it may bode for our future—it might be useful first to review how we got here.

THE RISE OF MASS CULTURE. . .

In many respects, the mass culture of the twentieth century was a temporary anomaly, an offshoot of the tools that drove the economy of the era.

Prior to the period of industrialization, most goods and services were produced locally by skilled craftsmen and artisans. The time and cost of transportation created local economies. News was disseminated by pamphleteers who printed small-scale, limited-distribution newspapers for a wide variety of social and political groups.

But provincialism wasn’t destined to last. The nineteenth century saw the spread of the machine, which basically functioned in a handful of modes: off, on, fast, or slow. The railroads of the era encouraged a single flow of people and goods in and out of transit hubs, which helped fuel the growth of cities. Concurrently, while the Progressive political movement emphasized a type of soft socialism, it also had some positive effects. For one thing, it pushed newspapers that had been openly biased to adopt a cooler, more literary, and less partisan stance.

By the start of the twentieth century, the technology of the time, dominated by the railroad and the assembly line, led to an emphasis on big and simple. The machine-powered assembly lines in Detroit made it expensive and difficult to customize individual cars—hence the phrase, “Detroit is retooling for next year.” (The darker side of all of this, of course, was the simultaneous rise of mechanized mass warfare.)

Mass production also would have a surprising impact on the culture. Hollywood’s studios adopted the assembly-line model as an economical way to produce a steady stream of movies. Similarly, it was much cheaper for printing presses to run off a million copies of the same book, magazine, or newspaper than to continue smaller runs of custom-tailored niche publications. Like the building of the transcontinental railroad network in the prior century, stringing together radio and later television networks to handle a mere handful of national channels was an enormous accomplishment.

The confluence of these factors resulted in the creation of a mass culture—a remarkably homogeneous experience of values shared by people coast to coast. Nor was culture pitched to the lowest common denominator. By the 1950s, there was a surprisingly high-quality bent to mass entertainment, albeit with a middlebrow slant. In 1956, MGM’s Lust for Life employed rugged action stars Kirk Douglas and Anthony Quinn as Vincent van Gogh and Paul Gauguin, respectively. David Lean’s epic films of the time—The Bridge on the River Kwai, Lawrence of Arabia, and Doctor Zhivago—were not only hugely successful, they were astonishingly literate, especially when compared with today’s mindless blockbusters. Even a television game show such as What’s My Line could boast, on its panel, writers, the occasional poet, and Random House publisher Bennett Cerf, who spearheaded the publication of Ayn Rand’s Atlas Shrugged. Talk-show hosts of the time, such as Steve Allen and Jack Paar, invited thoughtful guests and musicians of the caliber of John Coltrane and the Miles Davis Quintet. Ed Sullivan, the ringmaster of that period’s pop culture, would showcase the occasional opera star. And conductor Leonard Bernstein starred in his Young People's Concerts series—live TV designed to extol the virtues of classical music to young viewers, featuring guest appearances from other legendary musicians, including Aaron Copland and opera stars Christa Ludwig and Walter Berry. (Editor’s note: See the review of Bernstein’s series elsewhere in this issue.)

. . . AND ITS FALL

However, by the 1960s, the cultural winds were changing direction. In Future Shock, Alvin Toffler noted that Ford was promoting its Mustang as the car “you design yourself,” offering innumerable variations and options designed to fit wide ranges of wallets and needs. Arguably, the Johnson administration’s creation of PBS gave the major TV networks an excuse to reduce the sophistication of their programming. Meanwhile, the owners of America’s cable television systems, initially launched in the 1950s to bring TV to areas of poor broadcast reception, were already thinking of ways to offer narrower niche channels to viewers. They did this to fill up their pipeline and to spotlight programming that the Big Three networks increasingly ignored as they competed for the widest audience possible.

Also, in 1969 the Internet was launched. Originally conceived by the Defense Department as a decentralized computing network designed to remain operational even if computers in particular cities were destroyed in a nuclear attack, the Internet (then called the “ARPANET”) quickly became a means of linking computers at U.S. universities in order to exchange information and experiment.

By then, the elements were in place to generate what Toffler called a “demassified” economy and media—one that would create personalized services and information—rather than the one-size-fits-all world of the first half of the century. And sure enough, by the late 1960s, middlebrow culture was very much in decline.

In Hollywood, the Hays Office, which reviewed the scripts and finished product of all Hollywood movies to ensure a uniform code of decency, was dissolved and replaced by a ratings system that, in theory, allowed audiences to decide what level of mayhem and vulgarity they preferred. In practice, the overall cultural level of Hollywood’s product dropped like a stone. Out were such general-audience movies as North by Northwest and Dr. Strangelove; in were counterculture films such as Easy Rider and M*A*S*H (which, as funny as its satire may have been, was the first Hollywood movie to drop the F-bomb on audiences).

The counterculture would quickly become the dominant overculture within a few years. By the early 1980s, cable television viewership was taking off, thanks to early channels such as HBO, CNN, and MTV. During this period, cable was also a temporary reprieve for the now rapidly shrinking middlebrow culture, through niche channels such as Bravo and A&E. The latter’s early programming was surprisingly sophisticated, including Woody Allen’s best films, little-seen Robert Altman movies such as 3 Women, and shows on architect Mies van der Rohe. But in order to save funds, by the mid-1990s even A&E was relying on ancient network reruns to flesh out its programming schedule.

Television was about to become what the computer industry calls “a legacy medium.”

MAKE YOUR OWN CULTURE

Futurists and science-fiction writers had long predicted a global computer network. In 1993, Mosaic, the first widely adopted graphical browser (its developers went on to found Netscape), opened the ’Net to the public. A few years earlier, Tim Berners-Lee had invented a language that allowed graphical images, hyperlinks, and other computer data to run as a user-friendly “Web” on top of the existing Internet backbone.

Unique among media outlets, the Web now allows you, me, or anyone to create and shape our own culture. Of course, that culture will be a reflection of what you bring to it. Unlike the overculture of the 1950s, there isn’t an all-knowing network programmer to aim thoughtful, high-quality material at you. If you want to spend your leisure time staring at material that would make Michael Vick squeamish, go right ahead—nobody’s going to know about it but you. (At least until you run for Congress.)

Since 2001, the easiest and most obvious method has been the Weblog, or “blog” (see my article in the September issue of TNI). But that particular Internet innovation (which took off about thirty seconds after “experts” declared the post-dot-com Internet a wasteland) has begotten numerous others. Podcasting, which earned its name from Apple’s ubiquitous iPod player, allows anyone to become his own recording studio, producing anything from a three-minute pop song to an hour-long interview, and it has spawned social networks that offer an easier, more community-oriented form of expression than weblogs do. Recording software, such as Cakewalk’s Sonar program for the Windows PC, puts into the hands of virtually anyone technology that George Martin would have killed for when he was recording the Beatles. Software synthesizers, such as Propellerhead’s Reason, provide sounds that Wendy Carlos couldn’t have imagined in the early 1970s.

While Apple has ensured that the form of the podcast is near-universal, what is poured into the structure varies widely (as with all content on the Web), from intelligent author interviews and topical debate to the lowest common denominator.
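
Part of why the form is so uniform is that, under the hood, a podcast is simply an RSS feed whose entries point to audio files; iTunes and the other directories merely catalog those feeds. The short Python sketch below illustrates that shared structure; the feed address is a hypothetical placeholder, not a real podcast.

```python
# Minimal sketch: a podcast is an RSS feed whose <item> entries carry audio
# "enclosures." The feed URL below is a hypothetical placeholder.
import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "https://example.com/podcast.rss"  # placeholder, not a real feed

with urllib.request.urlopen(FEED_URL) as response:
    feed = ET.parse(response)

# Every conforming feed exposes the same shape: channel -> item -> enclosure.
for item in feed.getroot().iter("item"):
    title = item.findtext("title", default="(untitled)")
    enclosure = item.find("enclosure")
    audio_url = enclosure.get("url") if enclosure is not None else "(no audio)"
    print(title, "->", audio_url)
```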

EVERY MAN HIS OWN TV STATION

Apple’s iTunes helped create a central clearinghouse for audio podcasts in 2005. YouTube did much the same for Internet video. Video on the ’Net used to be a decidedly hit-or-miss affair that often took a considerable amount of time to download before playing. With YouTube, anyone with an (increasingly ubiquitous) broadband connection can click on a video and watch it stream immediately.

As exciting as the best Internet video is, the real fun is “rolling your own.” YouTube has turned singers in backwater towns into hot commodities pursued with recording contracts, and has elevated camcorder-armed students to the status of cult heroes. While millions have uploaded their own clips to YouTube and other video-aggregation sites, there are two video-oriented websites worth exploring for their efforts in pushing this technology much further.

Michelle Malkin’s Hot Air (www.hotair.com) is built around her five-minute video segments shot by producer Bryan Preston. Produced with a few thousand dollars’ worth of video and computer equipment, the best of Malkin’s clips (frequently shot in front of a green screen, with a slick digital background added afterwards) could be inserted seamlessly into a nightly news broadcast on network TV with no loss of quality.

England’s 18 Doughty Street (www.18doughtystreet.com) is designed to be the Tory answer to the liberal BBC and demonstrates a different approach to Internet TV, with its commitment to hours of live, long-form programming every day. This content—shot on multiple soundstages in a converted five-story home in the Bloomsbury district of London—is viewable on Windows Media Player and archived for later viewing. The overall look and content quality are reminiscent of America’s C-SPAN channel, except that the programming is created by a small staff for Internet consumption.

Setting aside their politics, from a strictly technological standpoint these sites are models for what can be done with video on the Internet. They could be adopted or adapted by anyone wishing to launch his own Internet TV channel, in virtually any genre. They are filling the gap between large-scale corporate media and idiosyncratic personal media, creating small-scale but potentially viable businesses. All of which is very much a return to the pamphleteer model that, for a time, was rendered obsolete during the temporary age of mass media.

WHERE FROM HERE?

Efforts to produce television programming on the ’Net are still in their infancy. And yet, in the not-very-distant future, they could also help to bring traditional television full circle.

Currently, telephone companies such as AT&T are rolling out fiber-optic networks, allowing them to compete with cable TV companies as a source of residential television. They transmit television signals using the Internet’s own data protocols, hence the term IPTV (Internet Protocol television). At the moment, the phone companies are selling IPTV largely on the basis of its being cheaper than cable or satellite TV. But the architecture of IPTV is designed to allow much more interactivity and personalization, especially when compared with passive, one-way broadcast television. IPTV already allows consumers to view video-on-demand, ranging from first-run movies to quirky old television reruns.

The IPTV platform also will allow narrowcasting of television shows to a much greater degree than even the 500-plus channels that satellite TV currently offers. The same technology that powers Web-based video producers such as Hot Air and 18 Doughty Street will be used eventually to create proprietary, magazine-style programming for those viewing television on IPTV networks, and probably sooner than you think. (Those not on IPTV probably will get most of the same programming via the Internet.)

All of which is a good thing. Political Correctness has shot middlebrow mass culture near-dead in its tracks. When, as the Wall Street Journal reported in September, an Amherst, Massachusetts high school teacher can cancel a school production of West Side Story because his students found it to be too “racist,” what’s left of mass culture is clearly reaching a low point—and no doubt, it will only continue to decline.

Fortunately, the tools are there for those smart enough—and ambitious enough—to create a culture of their own.

THE TRADE-OFF

For all the advantages of personalized culture, however, there is an undeniable downside in the disintegration of a common culture.

“In one respect, I like this,” James Lileks says. “I like the fact that there are so many cultural opportunities out there, that the monoculture no longer charges the whole show. But the death of the monoculture means that there is less of a sense of common identity, and how that plays out is something that we are going to learn in the next ten to fifteen years.”

Every four years, an increasingly fractured nation battles it out hammer and tongs at the ballot box. The anger and rage that fuel many voters are partly the result of a political system that no longer serves an increasingly fractured culture well.

More broadly, this loss of social cohesion has contributed significantly to what is known as “the culture war”—a result not only of ideology but also of the democratization of technology.

One area where this culture war really plays out is at the movies, which are running a Red Queen’s race between endlessly rising zillion-dollar budgets and a shrinking audience that’s increasingly turned off, both by plots that figuratively stink and by theaters that all too often literally do.

In order to pre-sell a movie with a production budget of one hundred million dollars—plus the costs of advertising and distribution—Hollywood tries to hedge its bets. Hence all of the movies devoted to themes that date back to the glory days of our shared culture—including far too many ironic reworkings of hit TV shows from the 1960s and ’70s, and the interminable Star Wars and Star Trek franchises, with their roots in the same period.

So it begins to make sense why ultra-liberal Hollywood, a bastion of Political Correctness, would adapt novels that were written by dead white European Christian males. After famously making hash of best-selling novels such as Bonfire of the Vanities and of historical events such as Pearl Harbor, Hollywood appears to have gotten the message that PC tampering with bestsellers kills their box office. By contrast, when it remains true to the source material, audiences flock to adaptations of such beloved classics as J.R.R. Tolkien’s Lord of the Rings, the “Harry Potter” series, and C.S. Lewis’s Narnia books. Even religious epics such as The Passion of the Christ and the surprisingly patriotic 300 are being made—and making money. All of these draw upon the once-shared values of the past.

Technology can’t remedy social divisions based on conflicting values; it will only reflect, and even augment, those divisions. Healing those wounds is not the task of inventors, but of philosophers.

In the meantime, however, like it or not, we’ve become a nation of niche markets, narrowcasting, and competing micro-cultures. We’d better get used to it. And, as individuals, we’d be wise to seize all the many opportunities that emerging technologies provide us, in order to create personal cultures that are rewarding, fulfilling, and liberating.
