2011 Issues of The Futurist

November-December 2011, Vol. 45, No. 6

  • Lost and Found in Japan
  • Updating the Global Scorecard: The 2011 State of the Future
  • Outlook 2012
  • Reconnecting to Nature in the Age of Technology
  • Investigating the Future: Lessons from the “Scene of the Crime”
  • The Search for Global Solutions: Moving from Vision to Action

Lost and Found in Japan

By Patrick Tucker

While the world turned its attention to the frightening prospects of a nuclear catastrophe in post-tsunami Japan, another crisis was being dealt with, quietly, humbly, and with pragmatic determination.

The date is April 8, 2011. I am on a bus headed into the Japanese city of Ishinomaki, a place that was home to 162,882 souls before the March 11 tsunami struck. On the day of my journey, 2,283 of the city’s citizens are feared dead, 2,643 are missing, and some 18,000 are in shelters. Because Japan is, perhaps, the most technologically advanced nation on earth, the successes and failures of its attempts to cope with the aftermath of this disaster will doubtless be instructive to planners and governments around the world. I am here to learn whatever I can.

I’ve also come to see a miracle.

In the weeks following the Tohoku earthquake, in the midst of the Kan administration’s various failed efforts to contain the situation at the Fukushima Daiichi nuclear power plant, something remarkable took place. More than 1,500 people showed up at the Tokyo offices of Peace Boat, a small nonprofit that quickly became one of the first organizations to actively solicit volunteers. They had come to travel north, through Fukushima prefecture, into the tsunami-affected areas.

The mission turned out to be surprisingly dangerous. Two nights earlier, a 7.0 aftershock hit the area, causing a power disruption at the nuclear power plant in Miyagi prefecture as well as an overflow of radioactive material. A tsunami warning was issued and then called off. While the situation was contained within a few hours, it served as a vivid reminder that the safety situation in Ishinomaki remains precarious. The buildings still standing are severely compromised.

Yet, the volunteer rolls are only growing. The first Peace Boat dispatch consisted of 50 individuals; the next was 100. The group is now preparing to bring up 250 next week and as many as 500 the week after that.

“We have lots of university students,” says Satoshi Nakazawa, a relief worker at Peace Boat who has also volunteered to be my interpreter during my brief stay in the north. “Lots” is an understatement. As I look at the crowd, it seems that about 90% of the volunteers who have shown up are people in their 20s or younger, and most are either students or unemployed.

The Mayor of Ishinomaki

TAKASHI "JUNIOR" YAMAMOTO. CREDIT: PATRICK TUCKER

Upon arriving at Peace Boat’s camp, I make arrangements to meet Takashi Yamamoto, project leader for this operation. He was among the first relief workers to put his boots on the ground in downtown Ishinomaki at a time when even the army (referred to in Japan as the Self-Defense Force) was limiting its activities in the area to mostly helicopter flybys. I meet Yamamoto at the makeshift headquarters the group is sharing with the other relief organizations here. In very un-Japanese fashion, he arrives 30 minutes late, reaches out, and gives a big, two-handed shake. “Call me Junior,” he says. He bids us sit on the floor so he can tell us what he’s been doing the past month.

On March 17, after a journey over stricken roads and a difficult night camping in the cold, he and the other members of the Peace Boat advance team woke and walked downtown. The devastation was Carthaginian.

“I couldn’t believe this was Japan,” he says. He likens the scene to the Tokyo firebombings: glass, smoke, ruin, a smell of dead fish, a world on its side with its contents bleeding out.

Junior happened to have a contact on the Ishinomaki Social Welfare Committee (SWC). These committees are the primary authority on what happens in any given city; without local SWC approval, there could be no Peace Boat relief operation in the area. The Ishinomaki SWC was functioning at one-third of capacity at the time, meaning two-thirds of the committee’s guiding leadership were missing and presumed dead.

The committee was reluctant at first to allow volunteers into the city. Who would coordinate them? What if they got hurt? What if they were criminals? Junior consulted with an architect who calculated that 150 volunteers, working eight hours a day, seven days a week, would have all the mud cleared out of Ishinomaki in approximately 4,000 days.

“Take every volunteer you can get,” he told them.
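The arithmetic behind that advice is easy to reconstruct. Here is a minimal back-of-the-envelope sketch, assuming (as the architect’s estimate implies) a fixed budget of volunteer-days and work that scales linearly with crew size, both obvious simplifications:

```python
# Back-of-the-envelope reading of the architect's estimate (illustrative only).
BASELINE_VOLUNTEERS = 150
BASELINE_DAYS = 4_000  # the architect's figure for a 150-person crew

# Total work implied by the estimate: 600,000 volunteer-days of shoveling.
TOTAL_VOLUNTEER_DAYS = BASELINE_VOLUNTEERS * BASELINE_DAYS

def days_to_clear(volunteers: int) -> float:
    """Days needed to clear the mud with a steady crew of the given size."""
    return TOTAL_VOLUNTEER_DAYS / volunteers

for crew in (150, 500, 1_500, 5_000):
    print(f"{crew:>5} volunteers -> {days_to_clear(crew):,.0f} days")
# 150 -> 4,000 days (about 11 years); 1,500 -> 400 days; 5,000 -> 120 days.
```

By this crude model, 150 volunteers means more than a decade of shoveling; only with thousands of hands does the timeline shrink to something a city can survive. Hence: take every volunteer you can get.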

Junior has been in disaster situations before; he was with one of the first relief teams to show up after the Kobe quake in 1995. He was the project leader for Peace Boat’s response in Sri Lanka to the 2004 Indian Ocean tsunami. But he’s never undertaken anything like this.

The volunteer camp is a tent city outside of Ishinomaki University, which, Junior acknowledges, will not suffice as a durable solution. He wants to build a permanent housing facility for the kids who keep showing up. “You can’t have your people sleeping here in tents in November,” he says. He’s also trying to get money into the hands of the downtown area residents. He wears his new, unofficial role of “mayor of Ishinomaki” well. The life he led before March 11 is becoming a distant memory.

ISHINOMAKI, APRIL 10, 2011. CREDIT: PATRICK TUCKER

The Peace Boat volunteers are divided into 30 teams of five members each, and each team sticks to one mission. For some, this means a full week dealing with people in the areas hardest-hit by the tsunami—people who easily meet the clinical definition of the term traumatized. “When talking to victims, give no information that is not certain. You will start rumors,” the volunteers are told. “This will be very hard work. Be sure to keep your energy level up.”

For the others, it’s seven days of hefting boxes in a warehouse. All the jobs are vital, says Peace Boat, but for the kids who have come here searching for something—some formative experience related to the most significant event in Japan’s history since World War II—the warehouse assignment must be a bit of a disappointment.

The Earthquake Generation

The young Peace Boat volunteers who felt the immediate need to help their fellow Japanese offer an unexpected view of the country’s social reality—and its future.

PEACE BOAT VOLUNTEER MAIKO SUGANO. CREDIT: PATRICK TUCKER

Maiko Sugano, age 27, Googled volunteer opportunities and contacted several organizations. Peace Boat was the only one to write back. “They seem to take everyone. No experience necessary,” says Sugano. She’s unemployed right now, which, in contemporary Japan, carries a certain degree of shame. She’s clearly bright. Her English is flawless. Her 10-year goal is a simple one: She wants to feel more capable. She was worried about the radiation from Fukushima, but not enough to let it stop her. She wants nothing but to hold on to this experience, to absorb it into her. “What happened here will be forgotten so easily. People will stop donating. Next month, who knows, something else might happen. If I see it with my eyes, I will take it seriously at least. I will remember it.”

Sugano and many of the young and underemployed volunteers might be referred to as a “lost generation.” Originally an expression for the men and women who came of age in the United States during World War I, the term first came into use in Japan after the bursting of the real-estate bubble in the 1990s, and the moniker “lost generation” has attached itself to successive graduating classes ever since.

For 20 years now, the story has been the same: The biggest and most stable companies—the ones still offering a clear path to reliable middle-class income—only recruit fresh out of university and only pick the top students. The young people who aren’t snapped up, who willingly diverge from the white-collar career course or don’t seem to match the corporate ideal because they are socially awkward, different, or just of the wrong gender, often spend decades bouncing from start-up to start-up, from one small company job to the next.

“Those hired as contract workers usually have no hope of full employee status in the Japanese corporate world,” says Michael Dziesinski, a sociology fellow at the University of Tokyo. “The employment issue for Japanese youth is a broken postwar school-to-work system for young adults, and as a result, some less-resilient youth fall through the cracks.” The result: Nonstandard employment—part-time, freelance, or just dead-end work—has doubled since the 1980s and today accounts for one-third of the Japanese labor force.

After World War II, Japan forged a reputation for social cohesiveness, egalitarianism, and strong middle-class job growth. But as Japan’s ties to the United States strengthened through the 1990s, the Japanese economy came more and more to resemble that of the United States in its most unenviable aspects. Japan’s income inequality is now higher than that of many other wealthy countries, such as Norway and Sweden, and even higher than India’s. The 2008 recession only exacerbated this trend, as many thousands of temporary and contract workers lost employment, bringing the poverty rate up to 15%. A few years ago, this disparity inspired the coinage of the term kakusa shakai, which might be interpreted to mean “disparate society,” or “society without evenness.” Another new expression to describe economic stratification is kachigumi soshite makegumi: a society of winners and losers.

“The attainable Japanese dream began to disappear 30 years ago, in the eighties. We don’t know where the next Japanese dream lies,” says Tokyo University demographics expert Yuji Genda.

PEACE BOAT VOLUNTEER ISSEY TAMAKU. CREDIT: PATRICK TUCKER

Peace Boat volunteer Issey Tamaku, age 20, is a politics student at Keio University. He lost his aunt and uncle to the tsunami. When he learned that his school had canceled classes because of the earthquake, he, too, Googled volunteer opportunities and found Peace Boat. He went to high school in South Korea and credits this for his perfect English. He says that, compared to Korea, Japan “doesn’t get out enough. We’re too content to stay here. We need better English instruction. These kids are learning English but they can’t speak it.” Still, he’s optimistic about the future of Japan. “I have to be,” he says.

Kenji Yasuda, a student from Yokohama, age 22, is wonderfully frank about his motivation. He was captivated by the scenes on his television and now he wants to know how existence here compares to his comfortable life back home. He says he needed to contribute something and so he will be shoveling mud for the week. “People in Tokyo are getting back to ordinary life,” he says. “Already, pachinko parlors are full. They’re losing memory.”

Koike Shinya, age 20, works as a house painter. He doesn’t know what he wants to do in life except, one day, go to Boston. He’s volunteering now because he wanted to play a role in the most significant event to take place in Japan in the last 50 years. “We are a country of very nice people, but some of that is only on the surface. When a crisis like this happens, you can see people for what they really are,” he says.

Another 20-year-old, Takumi Thomas, is a university student in politics and media, with aspirations of becoming an announcer. He was motivated by a mixture of curiosity and its separate, murkier, altruistic cousin, “a desire to help.” Like almost every volunteer here, he began searching online for volunteer opportunities immediately after the disaster. Peace Boat was the first to write back and accept his offer.

Tsubasa Shinoda, age 20, is from Yokohama, in Kanagawa prefecture. He’s a law student and works an unpaid internship at an advertising agency. Like many of the volunteers in the tent city, he says he got on the bus because he was “afraid of being indifferent.” It seems he’s struggling to do the right thing, groping for the proper response to an event far larger than anything he’s experienced in his lifetime, an event to which he feels intimately bound.

Besides Peace Boat, several other nongovernmental organizations are operating in the area. A group called AP Bank sent up 100 volunteers for the weekend. The Red Cross is running a hospital. But Peace Boat appears to be winning the contest to send as many volunteers as possible, which enables it to cover the gaps left by other, better-funded relief groups. Herein lies the first lesson of the tsunami: Expect a flood of volunteers, and respond rapidly to marshal their energy.

The natural human response to a terrible news event like the Tohoku tragedy is complex. Groups like the Red Cross work to convert that reaction into a financial contribution as quickly as possible through televised appeals and banner ads.

Peace Boat put out a solicitation within weeks of the disaster, when the interest level was still high. It campaigned through its own network, through Facebook, mixi, and even the Tokyo blogger community. The message went viral because it connected with what the broader public actually wanted to do in response to the scene playing out on their televisions: shovel, repair, comfort, change the situation in a visible and tangible way—in a word, act. Clicking a banner ad does not have the same effect and never will.

On April 9, no other private organization in the affected area was taking as big a risk, either financially or in terms of safety, as Peace Boat. Even the Japanese army began the relief process by carefully assessing the situation and writing a manual before distributing food and supplies. Peace Boat did the reverse: It started sending volunteers first and then wrote its safety manual based on the feedback it received.

Peace Boat was also spending far more than it was taking in. In normal years, it’s an educational tour outfit, ferrying kids around the world for high-priced educational excursions on chartered boats (Peace Boats). After the 1995 Kobe earthquake and the 2004 Indian Ocean tsunami, the organization raised money and collected supplies, but it has never attempted an operation of this size or scope. Financially, the organization may not survive this, its grandest moment.

This character of impulsive selflessness reflects the attitudes of the young volunteers who have signed up for this excursion. I found it repeated in the survivors.

Resilience by Necessity

YOSHIE HAGA (LEFT) AND MITSUKO HAGA. CREDIT: PATRICK TUCKER

Yoshie Haga, age 66, and her daughter Mitsuko, age 40, ran a beauty parlor on the corner of what was one of the busier streets in downtown Ishinomaki before the quake. They had two houses in a family compound; one was insured, the other was not. They are in a good mood when I meet them and are eager to tell me their earthquake story.

The tsunami warning sounded and they attempted to drive to higher ground. They hit traffic and their car was swept up in the wave. They broke out of the car and swam to a nearby rooftop, then went from building to building, all while Haga the elder carried her small dog in the front of her blouse. Finally they found a roof that seemed out of the flood’s way and stayed the night there. In the morning, they hiked through knee-deep water to the local evacuation center.

They’re animated as they recite this tale. The part about the dog seems embellished, but I’m disinclined to press them on this. Of course it’s natural and fitting that they should want to make this story shine a bit after what they went through.

They say that their clients have been asking them when they will reopen their beauty shop. They are hoping to get the electricity back on by the end of April, and if they can do that, they aren’t going to charge for haircuts for the first couple of weeks.

This willingness to plan ahead for a brighter tomorrow is encouraging, but rare. For many Japanese, the future has become yet another touchy subject. In a poll conducted by Japan’s largest labor organization before the earthquake, 93% of respondents said they were worried about what lay ahead for the nation and for themselves. Even after March 11 pushed the country back into recession, people like Yoshie and Mitsuko Haga defy this fatalism.

I ask them how they’re able to remain so optimistic in spite of everything they’ve lost. “Women are stronger in these situations,” they tell me.

Since the quake, the Hagas have become devoted stewards of the community. They spend their days moving among their neighbors’ houses, checking up on the elderly. One of the roles of the Peace Boat volunteers is to find people stuck or squatting in uninhabitable houses; reports put their number at 30,000 as of April 10. But community members like the Hagas are critical to the effort, because they are much better at finding their neighbors than teams of unfamiliar volunteers would be.

CREDIT: PATRICK TUCKER

A few minutes later, I am standing in a shell of a building in downtown Ishinomaki. A single security camera dangles from the ceiling on a loose wire. The south wall of the place is gone, knocked out during the flood by a runaway Toyota station wagon, which now sits outside in the mud.

This is the residence and former convenience store of Sho Nitta, age 74. When the tsunami hit, he and his wife barricaded themselves upstairs and watched helplessly as people tried to break free from their cars. They saw a woman struggling nearby in the current, so they thrust out a pole, caught her, and pulled her inside their house. The Nittas don’t know her first name, but her family name was Takahashi. They haven’t seen her since that night.

They continue to live upstairs in a gutted apartment. Like almost 85% of Japanese people, they have no earthquake insurance and aren’t covered for the damages they’ve suffered. Sho says he wants to rebuild, but I can’t imagine him or his wife hauling the lumber and drywall they will need to fix their home and store. His wife wants to move in with their son in the south. The aftershocks rattle her.

Now, Sho Nitta helps organize neighborhood association meetings every day at 8 a.m. About 50 people show up regularly to receive relief items and to strategize. He, too, wants to get the electricity back in his place, but he needs his neighbor’s permission to run a new line through a shared wall. This neighbor was a music teacher and left at the first opportunity. Now, he’s in Sendai. All that is left of him is his broken piano keyboard covered in mud, which sits outside in a trash heap.

(Now former) Prime Minister Naoto Kan is touring the city of Ishinomaki today, his first visit since the earthquake. I ask Sho Nitta what he would ask Japan’s prime minister if given the chance. Nitta says his concern is the long-term future. He doesn’t believe that Ishinomaki will ever recover economically. “The shops will try to rebuild,” he says, “but the customers won’t come.” He has food and water, for now, but what happens in a year or two? Will the government be able to support him if he and his wife choose to stay? How will they rebuild?

CREDIT: PATRICK TUCKER

Five Peace Boat volunteers spend the day pulling mud out of the Nittas’ backyard. After several hours of hard work, they are able to leave the couple with a few square feet to erect scaffolding to repair their back wall. Nitta’s wife says she probably won’t replant what was in the garden, but she’s grateful. Extremely grateful. An orange crocus has sprung up beneath the Toyota that came through her wall. She picks the flower and holds it up so all the volunteers can see. We all make too much of it.

We have to.

The Reinvention of Community

The Nittas have been lucky, you might say. They haven’t lost anyone and aren’t technically homeless. They also exemplify the challenges Japan will face as the country tries to put this place back together. The nation’s population is the second oldest in the world. In the tsunami-affected prefectures of Iwate, Fukushima, and Miyagi, an average of one in four people is over the age of 65.

This fact becomes very apparent at Ishinomaki’s relief centers. Residents are allocated to rooms according to neighborhood, not name. In the initial days after the disaster, American television reporters made a point to mention how “orderly” the refugees were keeping the relief quarters. Many journalists were quick to credit the inherent goodness of the Japanese people, as though the inhabitants of this island nation possess a rare dignity gene absent from the common DNA. While flattering, these explanations also traffic in cultural stereotypes of the Japanese as rigid and obsessed with discipline—caricatures that have not always served the Japanese well.

The simple decision to house evacuees alongside their most immediate neighbors—recreating little villages block by block—likely contributed to the safe and calm atmosphere in the relief centers. Members of a community are the most likely to know who lives where, who might be suffering from diabetes or Parkinson’s, and how to reach them.

Almost all of the tsunami survivors I encountered felt personally responsible for reconstruction. The job of fixing damaged structures will fall upon the local community and the social welfare councils. They will appeal to the government for financial support, but all the important decisions will be made at the local level. This, in part, explains why so many residents chose to stay in damaged housing despite the lack of water, heat, or electricity. When the community is broken up and people are shipped to emergency housing situations miles away, reconstruction is impeded for everyone.

This fact seems obvious. Yet, authorities rarely consider community cohesion a priority when determining how to house disaster victims, as evinced by the U.S. government’s relocation of New Orleans residents, first to FEMA trailers and then across the country, in the aftermath of Hurricane Katrina in 2005.

I journey to Minato Elementary School, one of Ishinomaki’s relief centers. Exactly one month before my arrival, the tsunami’s wave—here reaching 16 feet high and thick with flotsam—tore through the school’s first floor. When I climb the stairs, I see that several cars still litter the temple cemetery behind the school, an indication of how high, forceful, and dangerous was the wave that crashed through here.

The refugees housed in the school’s upper stories have been separated into rooms on the basis of neighborhood. They have daily meetings, also at 8 a.m., to distribute food items and discuss the whereabouts of friends and neighbors.

A board displays requests for information about people who have not been found, and application forms for government housing assistance sit beneath an open window. These are necessary to score a spot on the waiting list for a government-subsidized hotel room or a temporary house, of which the Ishinomaki authorities have plans to build 150. Some 8,000 families have applied for temporary housing, a number expected to reach 10,000.

SACHIE TOMINAGA. CREDIT: PATRICK TUCKER

Sachie Tominaga is one such applicant. She was at a friend’s place when the tsunami warning sounded. She sprinted home, found her mother and her son, got them into a nearby cab, and rushed them to the elementary school. Tominaga’s son is now sitting against a wall, staring at his feet. He appears to be about 20. He is becoming visibly disturbed by our presence. His breathing is accelerating, and he is clenching his fists. Tominaga describes him as easily agitated. After she dropped him off at Minato on the day of the earthquake, she took the cab back home, turned off the gas, grabbed a few possessions, got back in the cab, and headed up the hill to the elementary school. A moment later, she and the driver found themselves stuck in traffic.

In the 30 minutes between the initial quake and the tsunami, tens of thousands of people in low-lying areas struck out to find higher ground. The traffic jam that resulted from too many people trying to take too few roads at once was enormous. Tominaga saw the choice in front of her clearly; she could stay in the cab and hope the jam cleared or make an attempt to leave on foot. She chose the latter. The cab driver, a man who arguably saved her life and the life of her son and mother, chose the former. She hasn’t seen or heard from him since.

I want to ask her about what her life has been like and what she expects next, but these sorts of questions aren’t likely to yield anything more candid than “Muzukashii desu”: It is difficult. The people of Minato do not indulge in complaint or expressions of unhappiness in front of me or the other reporter who is with me today. This is for our benefit. We are guests here, and there is a right and a wrong way to extend hospitality. And then there is the matter of pride. Sadness, like nakedness, is not for the eyes of the world. I ask her instead what life she would like to be living 10 years from now.

“Just a normal life,” she says. “Nothing elaborate.”

It is not the scope of Sachie Tominaga’s hardship that compels sympathy, for the world is populated by the poor and the homeless. Rather, it is the abruptness of her loss. In her quiet, respectful humility, she is a living testament to the fact that the destitute do not usually earn their misery through lack of discipline and poor exercise of choice.

Tomorrow, classes at Minato Elementary are scheduled to resume. Four of this room’s new residents have arrived. A group of boys, ages 7 to 10 or so, stand by the door beside their parents. They are shyly staring at a bank of cubby holes.

Tominaga and her neighbors will have to leave this room to make way for incoming students. She’s not sure where she’ll be sent, and she still has to put her things in order. “I have to go,” she says. She bows low and apologizes. We bow low in return and thank her. She leaves to comfort her son, pack away her few possessions, and prepare herself for another cab ride to a place that is not home.

Beyond Survival

Events like the March 11 earthquake and tsunami in Japan illustrate just how little control we have over the future, despite our actions. Contrary to common hubris, you cannot plan for the unthinkable. You can only pay attention, listen, and learn in order to build stronger, react smarter, survive better when the unforeseeable occurs. The tsunami is already helping researchers, inventors, and designers to do just that.

Whenever the next tsunami hits a populated nation, it will again bring with it death, destruction, and despair. But each of these can be lessened through the intelligent application of technologies already in existence and readily deployable.

Think back to the Hagas on the afternoon of the earthquake. The tsunami warning has just sounded. Like thousands of others in Ishinomaki, they head out by car only to meet traffic, the inevitable result of too many people seeking to use the same outlet at once. They’re swept up by a wave and barely survive. According to anecdotal accounts, fatalities on March 11 were particularly heavy among people stuck in motor vehicles.

Gordon Jones, CEO of Guardian Watch, knows that, while a warning bell does give enough information to spur action, it doesn’t provide enough data to make a real decision. He’s developed a mobile app that allows anyone with a smartphone or video streaming device to get a visual read on a disaster playing out in their area in real time.

The app makes use of the fact that people rely on social networks even—and perhaps especially—during disasters, when the speed of Twitter makes mainstream news look glacial in comparison. There are already more than 200 million cell phones with either photo or movie capability. It’s a function we use for leisure, shooting video of our pets or our friends’ stupid skateboard tricks. But, in a disaster, combined with the right social network and pointed in the right direction, this enormous global web of cameras takes on considerable value. Such an app would have allowed the Hagas to pinpoint the location of the wave behind them and the traffic in front of them before they got into their car.
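The article gives no technical detail about how Guardian Watch works under the hood, but the heart of any such app is a simple geospatial query: given a viewer’s position, surface the freshest nearby reports. A minimal sketch follows; every class, field, and threshold in it is hypothetical, not Guardian Watch’s actual design.

```python
# Hypothetical sketch of the geo-query at the core of a disaster-video app:
# given a user's position, return recent nearby reports, nearest first.
import math
import time
from dataclasses import dataclass

@dataclass
class VideoReport:
    lat: float
    lon: float
    timestamp: float   # Unix seconds when the clip was posted
    stream_url: str

def distance_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance between two points, via the haversine formula."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearby_reports(reports, lat, lon, radius_km=5.0, max_age_s=600):
    """Reports posted within max_age_s seconds and radius_km km, nearest first."""
    now = time.time()
    hits = [r for r in reports
            if now - r.timestamp <= max_age_s
            and distance_km(lat, lon, r.lat, r.lon) <= radius_km]
    return sorted(hits, key=lambda r: distance_km(lat, lon, r.lat, r.lon))
```

Fed with georeferenced clips from Ishinomaki’s coastal roads, a query like this is what could have told the Hagas where the wave was behind them and where the traffic was ahead.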

Combine that small breakthrough with a recent finding from the University of Illinois: Researcher Jonathan Makela used the March 11 tsunami to show that huge wave events create color patterns, detectable at high altitude using special lenses. These patterns can forecast the direction and scope of the tsunami wave. The finding could give emergency workers in tsunami-vulnerable areas an extra hour to prepare.

Perhaps the most important lesson of the March 11 disaster is that we need to change the way we respond to disaster victims immediately following destructive events. Too often, the initial response of those in government charged with managing the suddenly displaced population is to relocate them many miles away.

The short-term need to take citizens out of harm’s way undermines the long-term goal of restoring their lives and communities. The thousands of displaced Ishinomaki residents needed to be physically close to their neighborhoods, and to one another, in order to rebuild.

Now meet David Lopez, a Baltimore architect who’s pushing a new approach to emergency housing. His focus: shelter solutions that allow communities to stay together, as close to their original dwellings as possible, after disasters. It’s a mission he pursued in Haiti following the 2010 earthquake.

MARYLAND INSTITUTE COLLEGE OF ART STUDENTS DISPLAY A NEXT-GEN EMERGENCY HOUSING DESIGN IN MAY. CREDIT: PATRICK TUCKER

Lopez teaches a class on emergency housing at the Maryland Institute College of Art. Last May, part of the course work for his students was to design a transitional housing response to an earthquake. The winning project transformed debris from fallen structures into a cluster of houses where the old ones once stood, thus solving simultaneously (though not entirely) the twin problems of handling the debris and quickly acquiring cheap materials. At less than $3,100 per house, the winning scheme would also undercut what the Japanese government was spending to build emergency housing units offsite.

This small improvement over the status quo could make a dramatic difference in the lives of the people of Ishinomaki. Given the choice between abandoning their neighborhood and staying—perhaps uncomfortably—in a broken house with no water or heat, most of the men and women I came across chose the latter. If there is anything to be learned from the events that played out in Japan after the tsunami, it is that our public response to disaster must accommodate and encourage this vital urge to keep community physically intact.

Guardians of the Now

I become viscerally aware of this need for connectedness on the day I journey with other Peace Boat volunteers to Ogatsu on the outskirts of Ishinomaki.

Ogatsu was once a town: a collection of homes, offices, and stores laid out on a navigable grid; a place where people rode bicycles to the market, where children walked to school playing handheld video games, and where old women swept the dust from their front steps. These are the typical characteristics of a Japanese community, but they do not describe this place. Not anymore.

OGATSU, JAPAN. CREDIT: PATRICK TUCKER

Ogatsu, as I encounter it, has become a white Shinto wedding dress webbed across tree branches. It is a house with its interior—couch, chair, wallpaper—exposed like a diorama. Ogatsu is splinters and metal and cotton and silk chaotically meeting and diverging in a manner that is almost beautiful but that cannot serve a single human need. The town of Ogatsu is field upon field strewn with bits and pieces of its inhabitants’ former lives.

The town of Ogatsu is no more.

On March 11, the tsunami here was at its mightiest, more than 100 feet high. It descended on this place and chewed through everything in its path. The volunteers with me are encountering Ogatsu for the first time, and they are silent. The van driver, a tough-looking fellow with long hair done up in a ponytail, is trying in vain to hide the fact that he is weeping. We pass an upside-down roof stuck on a sandbar, like an overturned turtle, and a bus parked where city hall once stood.

Among the few structures still standing is the three-story hospital. Every window is broken. It looks like a casualty of economic depression, a factory abandoned 50 years ago, not a first-rate medical facility that was housing patients just a month earlier. The remains of once neighboring houses are piled up against its walls.

We have come to serve miso soup, boiled vegetables, and rice to the handful of Ogatsu employees who have elected to stay here and clean debris. Many have been sleeping in improvised houses or beneath the tin roof of the local recycling center, which we use as our kitchen. The place is little more than a truck hangar that has been transformed into a living room. Mismatched bits of office and home furniture stand around coffee tables. Everything is damp with mildew and rain.

OGATSU SOUP KITCHEN. CREDIT: PATRICK TUCKER

Forty town employees lived in Ogatsu before the earthquake. I am told that two-thirds of them have vanished and are presumed dead. We prepare soup for 60, not knowing who else is in the area and might show up.

One of the survivors is Hiroshi Yamashita. In the minutes after the earthquake, he went to help evacuate the hospital, but then fled to the roof when the waters rose up through the first, then the second, then the third floors. He stayed there for three days, waiting for the ocean to recede. His only company was the sound of the waves lapping against the sides of the building. Night brought with it a darkness he had never before seen and the certain knowledge that many people in the hospital beneath him had perished. Finally, on the third night, the sound of moving water softened and disappeared. He was able to climb down the next morning, find construction equipment, and set to work cleaning the street.

He lost several friends that day, but his family—two daughters, his wife, and his mother—survived and are staying with relatives. He has been living in a windowless cargo truck so he can better assist in the clean-up and management of relief items.

With some cajoling, Yamashita admits that the government seemed slow in its response to the disaster, particularly in its handling of food. The Self-Defense Force didn’t begin distributing rice and bread in Ishinomaki until the first week of April, nearly three weeks after the tsunami.

Yamashita is reluctant to offer a more critical assessment of the Kan administration’s response to the event, or the government’s focus on the nuclear power plant. In situations like these, he says, the burden of both relief and repair lies first “with the town leadership, then the prefecture government, then the national government.”

It’s this self-imposed role of guardian that has kept him in Ogatsu, attached to a town that isn’t, cleaning away the remnants of what had been. I ask him what he would like to see this place become in 10 years. This is a softball question that I pitch to a lot of people—an open invitation to be optimistic, to recreate Ogatsu from whole cloth. He looks to the tin roof above his head.

“One thing is for certain,” he says. “I will still be here.”❑

About the Author

Patrick Tucker is the deputy editor of THE FUTURIST magazine and director of communications for the World Future Society. He spent five months in Japan researching trends and reporting for THE FUTURIST (“Solar Power from the Moon,” May-June 2011; “My First Meltdown: Lessons from Fukushima,” July-August 2011; and “Thank You Very Much, Mr. Roboto,” September-October 2011). Email ptucker@wfs.org.

Investigating the Future: Lessons from the "Scene of the Crime"

By Charles Brass

Futurists investigate clues and evidence to attempt to answer difficult questions, much like crime-scene investigators. But while CSIs try to determine things that have already happened, futurists look to what may yet happen, and what we can do now to influence it.

Crime-Scene Futurists: Six Rules from CSI

  1. Explicitly describe the boundary marking the edges of the space in which you are interested. There often will be physical, temporal, and/or organizational dimensions of this boundary, and all need to be identified.
  2. Ensure that all the people who normally inhabit this space, or are likely to enter the space during the project, are aware of the project and its aims.
  3. Document the current contents of the space in as much detail as time and resources permit.
  4. Investigate the provenance of the space with as much diligence as you can.
  5. Notice how, and why, the space changes during the project. Look for both the internal and external forces that might explain these changes.
  6. Use appropriate tools from your futurist toolkit to begin to tease out the future for the space.

—Charles Brass

As practitioners of a relatively young profession, futurists are frequently asked to explain what they do, and the question often carries some skepticism. I personally have lost track of the number of times people have asked to see my crystal ball or my time machine when I have shown them my business card.

Many people seem unable to get their heads around the idea that it is possible to learn something useful about events or situations that have not yet happened. Yet, when archaeologists report on what they have learned, no one doubts their professionalism, despite the fact that they were not present at the times and places they describe.

This is why, when I am asked to explain what a futurist does, I use the analogy of an archaeologist or, for younger audiences, a crime-scene investigator. Most practicing futurists are at least as interested in the past as they are in the future, but my use of this analogy goes far beyond simply acknowledging that how we arrived at the present has a powerful impact on what will happen in the future.

Both crime-scene investigators and futurists are interested in learning more about a time and place remote from themselves, and both use increasingly sophisticated sets of tools and techniques to help them expand their knowledge. Before they begin to use any of these tools, however, they follow a series of protocols that are designed to ensure that they do their job rigorously and that others can validate and replicate their work. This article looks at some of the rules that crime-scene investigators (CSIs) follow. These rules have direct parallels in helping to shape not only good crime-scene analysis, but good futures practice, as well.

Determining the Investigation’s Boundaries

The first thing that CSIs do is define the physical space in which they are interested and then cordon this area off. This is no trivial exercise. The CSIs expect to invest considerable time and energy in examining the interior of that quarantined space, recognizing all the while that drawing too wide a boundary may yield only marginally more knowledge for the extra effort. Conversely, drawing too narrow a boundary will increase the likelihood that important information is overlooked. In any case, no boundary can possibly capture everything or everybody of interest.

Futurists, too, have to delineate boundaries around the themes in which they and their clients are interested. As good systems thinkers, futurists are acutely aware of the extent to which everything is interconnected, and they are always concerned that important information may lie outside the immediate area of their focus.

They also know (and if they don’t, their clients always remind them) that they don’t have an infinite amount of time within which to explore the future. Futures work is designed to enhance the quality of decisions made in the present, and clients most often want to make decisions quickly. For instance, those responsible for public-school systems must anticipate numbers of incoming kindergarteners some years in advance, but this is difficult in the absence of detailed information about such things as decisions to open or close local factories, or planned changes in zoning regulations.

The CSI has an advantage over the futurist in that the boundary of an official crime scene is marked with very visible tape that everybody understands and most people respect. Even if futurists are meticulous and explicit about defining the boundaries of a particular assignment, the nature of their work and the people they work with mean these boundaries regularly get challenged or ignored. Nonetheless, most futurists find it very helpful in their consulting work to take time early in the process to discuss, and hopefully agree on, the boundaries within which any particular assignment will take place.

Of course, good CSIs know that a new discovery might at any time cause an expansion of the taped-off area. Similarly, futures work is made easier if the futurist and the client can explicitly acknowledge that some proposed new action is taking the assignment beyond the previously agreed boundaries. In the school system example, chronic flooding in the region may also impact families’ relocation decisions, so the futurist’s boundaries might need to expand to include environmental factors.

There is more to the tape around a crime scene, however, than simply defining where the CSI will focus attention. The tape reminds others that the space inside is a special place and needs to be treated carefully.

This is another way in which the CSI has an advantage over the futurist. CSIs can pretty well ensure that no one will enter their area of interest unless they have been invited, and even then they will follow the CSI’s rules of conduct. In effect, the CSIs attempt to freeze the crime scene until they complete their investigation.

Futurists’ areas of interest can rarely be so conveniently frozen while the analysis takes place. Nonetheless, if the people who continue to move around inside the demarcated area are aware that, for the moment, this is a special space, they are more likely to think carefully about the actions they take. Perhaps the members of the school board might need to be reminded to factor their yet-to-be-completed futures scanning into their current budget cycle.

For futurists, marking out the territory of interest in a particular investigation includes identifying the people who habitually occupy that territory. Letting all these people know that an investigation is taking place can often reduce the accidental damage done by those who aren’t aware of the significance of the space.

Of course, not everyone’s motives are pure and wholesome. Both CSIs and futurists need to be aware that some people will deliberately try to mislead or taint the crime scene or the future space.

Analyzing Evidence Objectively

Having drawn a boundary around their area of interest, CSIs then get down to work. They know that their primary role is to carefully notice and document as much as possible. In addition to their five human senses, they bring their experience and a variety of technological tools to help them in this work.

They are acutely aware that their mere presence on the scene changes things, and that their human prejudices and biases color what they notice and how they report on it. They are aware, too, that some of their work is unpleasant, and that it is a natural human reaction to try to cover up some of this unpleasantness.

Futurists, too, are most often outsiders whom other people bring into a situation to help make sense of it. Like any other human beings, futurists are prone to bring biases and prejudices to everything they do. Just as the fingerprints of all CSIs and police officers are recorded so they can be eliminated from the investigation, so futurists need to be careful to eliminate as much of their own influence on the scene as they can.

Futurists also should know that, whatever specialist expertise they claim to bring, many others on the scene will nonetheless seek to bring their perspectives to the situation. In particular, futurists need to be aware of the natural human tendency to avoid unpleasantness. The best futurists are skilled at presenting the results of their work in such a way that all relevant aspects are given their appropriate weight.

Placing a tape around a crime scene gives the impression that the moment of the crime has been frozen for analysis by the CSI. The skilled investigator, whether CSI or futurist, knows that everything changes, even during an investigation, so the more they know about how things change, the more useful they will be.

In this regard, the training that futurists receive might give them an advantage over the CSIs. Learning to appreciate all the dimensions within which change takes place is an integral part of futurist training, and good futurists are aware that only dead things change in regularly predictable ways.

The CSIs are almost always examining purely physical, geographic space. Futurists, on the other hand, explore landscapes that are shaped and populated by human beings for whom change is an unpredictable inevitability.

CSIs’ specialist expertise is most often accepted by all those involved. They can often rely on the legal system both to support their efforts and to compel the participation of all those in whom they are interested.

Alas, futurists have no such legal mandate. Where the CSI can usually assume that those who commission their work are genuinely interested in their professional analysis—such as identifying a cause of death or indicating a probable perpetrator—futurists often confront reluctant participants, or even clients unwilling to listen to what has been learned.

CSIs are provided with an ever-expanding toolkit, much of which is the result of developments in science and technology. In particular, they have access to many tools that enhance or extend human senses and give precise quantitative data.

Futurists, too, have access to an expanding toolkit. Like the CSIs’, much of the futurists’ equipment is designed to supplement individual human senses, often by aggregating information across larger populations. Some of the futurist toolkit is also designed to tap into underutilized areas of the human experience, such as myth, metaphor, and worldview. Often, the futurists seek to sharpen human senses by focusing them in a variety of ways. Modern technology enhances the futurist toolkit by allowing the collection, analysis, and interpretation of quantities of data that would otherwise stretch human capability.

Whatever tools are used, both the CSIs and the futurists need to be aware of the limitations of human ability to understand and interpret the information before them. They also need to be aware that the data can be tainted, whether inadvertently by the careless or deliberately by those with malicious intent.

Studying the Past and Studying the Future

CSIs and futurists are both part of our modern world because human beings are relentlessly interested in the world around them. Since none of us can be everywhere at all times, we are collectively prepared to invest in developing the skills of that special subset of people who can help us make sense of a world we did not, or could not, experience: the past and the future.

Good CSIs know that the past is not a space that anyone can completely understand. No matter how many resources we bring to bear on studying it, our comprehension of the past—even of very recent events—will always be imperfect. What CSIs expect to do is to work diligently to reduce this imperfection as much as they can.

Futurists can relate to this: The future is also inherently uncertain. They strive to reduce the uncertainties as much as possible by applying systemic and systematic approaches to understanding the future.

There is a final, crucial difference between CSIs and futurists, however. CSIs primarily exist to help others understand what has happened. Futurists are interested in what may happen and are even more interested in what we would like to happen. Futures work is about both understanding the future and creating it.

In The Clock of the Long Now, futurist Stewart Brand wrote: “Our experience of time is asymmetric. We can see the past, but not influence it. We can influence the future, but not see it.” He may have been wrong on both counts. Many people behave as though they could influence the past, and we all strive to see the future. What both CSIs and futurists remind us is that all of these things are done better when they are done systematically and rigorously.

About the Author

Charles Brass is chair of Australia’s premier futures organization, the Futures Foundation, which incorporates the professional association for futurists in Australia. E-mail cab@fowf.com.au; Web site www.futuresfoundation.org.au.

Updating the Global Scorecard: The 2011 State of the Future

By Jerome C. Glenn

The world could be better off in ten years than it is today, but only if decision makers can work together to meet global challenges, according to The Millennium Project.

The global population in general is richer, healthier, better educated, more peaceful, and better connected than ever before, yet half the world is potentially unstable. Food prices are rising, water tables are falling, corruption and organized crime are increasing, debt and economic insecurity are growing, climate change is accelerating, and the gap between the rich and poor continues to widen dangerously.

There are potentials for many serious nightmares, but also a range of solutions for each. If current trends in population growth, resource depletion, climate change, terrorism, organized crime, and disease continue and converge over the next 50–100 years, it is easy to imagine catastrophic results and an unstable world. But, if current trends in self-organization via future Internets, transnational cooperation, materials science, alternative energy, cognitive science, interreligious dialogues, synthetic biology, and nanotechnology continue and converge over the next 50–100 years, it is easy to imagine a world that works for all.

The coming biological revolution may change civilization more profoundly than did the industrial or information revolutions. The world has not come to grips with the implications of writing genetic code to create new life-forms. Yet, within the next two decades, the concept of being dependent on synthetic life-forms for medicine, food, water, and energy could be quite normal.

After 15 years of The Millennium Project’s global futures research, it is increasingly clear that the world has the resources to address its challenges. What is not clear is whether we will make good decisions fast enough and on a large enough scale to really address these challenges. Hence, we are in a race between implementing ever-increasing ways to improve the human condition and the seemingly ever-increasing complexity and scale of global problems.

So, how is the world doing in this race? What’s the score so far? In order to calculate that, an international Delphi panel selected more than a hundred indicators of progress or regress. Indicators were then chosen that had at least 20 years of reliable historical data and later, where possible, were matched with variables used in the International Futures model. The resulting 28 variables were integrated into the State of the Future Index with a 10-year projection. A review of the trends of the 28 variables used in The Millennium Project’s State of the Future Index provides a record of humanity’s performance in addressing the most important challenges.
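The Millennium Project does not reduce SOFI to a formula in this article, but the basic mechanics of any such composite index are easy to illustrate: normalize each variable against plausible best and worst values (inverting those where more is worse), then aggregate. The sketch below is purely illustrative; the four variables, their bounds, and the equal weighting are hypothetical stand-ins for the 28 real variables and the International Futures model.

```python
# Illustrative computation of a SOFI-like composite index (hypothetical data).
INDICATORS = {
    # name: (current value, worst plausible bound, best plausible bound)
    "adult_literacy_pct":    (84.0,    0.0, 100.0),
    "co2_emissions_gt":      (33.0,   50.0,   0.0),  # lower is better
    "life_expectancy_years": (69.0,   40.0,  85.0),
    "refugees_per_100k":     (213.0, 500.0,   0.0),  # lower is better
}

def normalize(value: float, worst: float, best: float) -> float:
    """Map a raw value onto [0, 1], where 0 = worst bound and 1 = best bound."""
    score = (value - worst) / (best - worst)
    return max(0.0, min(1.0, score))  # clamp values outside the bounds

def sofi_like_index(indicators: dict) -> float:
    """Unweighted mean of normalized scores; higher means a better state."""
    scores = [normalize(v, worst, best) for v, worst, best in indicators.values()]
    return sum(scores) / len(scores)

print(f"Index: {sofi_like_index(INDICATORS):.3f}")  # about 0.600 here
```

Tracking such an index year over year, with projections from a model like International Futures, is what lets The Millennium Project say whether humanity is winning or losing the race.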

The Current Outlook for 2020: The Millennium Project’s Scorecard

Here is a summary of where The Millennium Project’s participants see improvements, where they see backsliding, and where the trends may be ambiguous.

Where We Are Winning

  • The percentage of people with access to clean water.
  • The adult literacy rate.
  • The percentage of people enrolled in secondary school.
  • Poverty measured as percentages of the population in low- and mid-income countries living on $1.25 a day (purchasing power parity).
  • Overall global population growth.
  • GDP per capita.
  • Physicians and health-care workers per 1,000 people.
  • Internet users.
  • Infant mortality rates.
  • Life expectancy at birth.
  • Overall percentage of women in parliamentary governments.
  • GDP per unit of energy use.
  • Number of major armed conflicts with more than 1,000 deaths per year.
  • Undernourishment.
  • HIV prevalence among 15- to 49-year-olds.
  • Number of countries that have or are strongly suspected to have plans for nuclear weapons.
  • Total debt service in low- and mid-income countries.
  • Research and development expenditures as a percentage of national budgets.

Where We Are Losing

  • Carbon-dioxide emissions.
  • Global surface temperature anomalies.
  • Percentage of people voting in elections.
  • Levels of corruption in the 15 largest countries.
  • Number of people killed or injured in terrorist attacks.
  • Number of refugees per 100,000 total population.

Areas of Uncertainty

It is not clear at the moment where these trends are heading:

  • Unemployment rate.
  • Transition to clean energy and renewables.
  • Percentage of global population living in democratic countries.
  • Percentage of land area covered by forest.

In comparison with recent years, the global forecast for the next decade looks better than ever. However, the future may not improve as much in the next 10 years as it has over the past 20 years. In many of the areas where improvements are being made (such as reductions in HIV, malnutrition, and developing country debt), they are not being made fast enough. There are also areas of uncertainty that represent serious problems: unemployment, fossil fuel consumption, political freedom, and forest cover. Some problems could have quite serious impacts, such as corruption, climate change, organized crime, and terrorism. Nevertheless, this selection of data indicates that the world 10 years from now, on balance, will be better than today.

Factors to Consider in Assessing the Future

Climate Change and Earth’s Resources: Each decade since 1970 has been warmer than the preceding one, and 2010 tied 2005 as the warmest year on record. Atmospheric CO2 is at 394.35 parts per million as of May 2011, the highest in at least 2 million years.

According to the UN Food and Agriculture Organization’s report Livestock’s Long Shadow, the meat industry contributes 18% of human-related greenhouse gases (measured in CO2 equivalent), which is higher than the transportation industry. A large reinsurance company found that 90% of 950 natural disasters in 2010 were weather-related and fit climate change models; these disasters killed 295,000 people and cost approximately $130 billion.

To save the ecosystem, nothing less than cutting CO2 by 80% by 2020, keeping population to no more than 8 billion by 2050, restoring natural ecosystems, and eradicating poverty will be required, argues Earth Policy Institute President Lester Brown in his book Plan B 4.0: Mobilizing to Save Civilization (Norton, 2009).

Humanity’s material extraction increased eightfold during the twentieth century. We currently consume 30% more renewable natural resources than natural systems can regenerate. In just 39 years, humanity may add 2.3 billion people to the world population. There were 1 billion humans in 1804, 2 billion in 1927, 6 billion in 1999, and 7 billion today.

Investment in alternative energy is rapidly accelerating to meet the projected 40%–50% increase in demand by 2035. China has become the largest investor in “low-carbon energy,” with a 2010 budget of $51 billion. Yet, without major technological breakthroughs and large-scale behavioral changes, the majority of the world’s energy in 2050 will still come from fossil fuels. Therefore, large-scale carbon capture and reuse has to become a top priority to reduce climate change.

Meanwhile, automakers around the world are in a race to make lower-cost plug-in hybrid and all-electric cars. Engineering companies are exploring how to take CO2 emissions from coal power plants to make carbonates for cement and to grow algae for biofuels and fish food. China is exploring telework programs to reduce long commutes, energy costs, and traffic congestion.

Falling water tables worldwide and the depletion of water supplies faster than they can be replenished have led to the concept of “peak water,” similar to peak oil. Since 1990, an additional 1.3 billion people gained access to improved drinking water and 500 million got better sanitation. Yet 884 million people still lack access to clean water today (down from 900 million in 2009), and 2.6 billion people still lack access to safe sanitation. Half of all hospital patients in the developing world suffer from water-related diseases.

Food prices are at their highest point in history and are likely to continue increasing over the long term if there are no major innovations in production and changes in consumption. New approaches like saltwater agriculture and meat grown from stem cells or tissue culture could help alleviate this.

Environmental security is increasingly dominating national and international agendas and shifting defense and geopolitical paradigms, because policy leaders increasingly understand that conflict and environmental degradation exacerbate each other. The traditional nation-centered security focus is expanding to a more global one due to geopolitical shifts, the effects of climate change, environmental and energy security, and growing global interdependencies. The Millennium Project defines environmental security as the viability of an environment to support life. This concept embraces the goals of preventing or repairing military damage to the environment, preventing or responding to environmentally caused conflicts, and protecting the environment due to its inherent moral value.

Proceeding along the “business-as-usual” path is a threat to environmental security. People and organizations who got away with wreaking environmental damage in the past are less likely to escape exposure and punishment in the future.

Social Change: Nearly 30% of the population in Muslim-majority countries is between 15 and 29 years old. Many of these young people, tired of older hierarchies and high unemployment, feeling left behind, and eager to join the modern world, brought change across North Africa and the Middle East in 2011. This demographic pattern is expected to continue for another generation, with potential for improvement and innovation as well as continued social unrest and migration.

The social media that helped facilitate the Arab Spring is in no small part driving a historic transition from a world comprising many pockets of civilizations barely aware of each other’s existence to a digitally interconnected world.

Technology and the Economy: More data went through the Internet in 2010 than in all the previous years combined, and Amazon.com sold more electronic than paper books for the first time that year as well. Humanity, the built environment, and ubiquitous computing are forming an augmented continuum of consciousness and technology that reflects the full range of human behavior, from individual philanthropy to organized crime. New forms of civilization will emerge from this convergence of minds, information, and technology worldwide.

Computing power continues to accelerate. China currently holds the record for the fastest computer with Tianhe-1. Mira, the 10-petaflop supercomputer that IBM claims will be operational in 2012, would be four times faster. Just as the autonomic nervous system runs most biological decision making, computer systems are increasingly making more (and more significant and complex) day-to-day decisions.

Ethical decision making is struggling to keep up with the rate of technological change. Despite the extraordinary achievements of science and technology, future risks from continued acceleration and globalization need to be better forecast and assessed. At the same time, new technologies also make it easier for more people to do more good at a faster pace than ever before. Ordinary citizens initiate groups on the Internet, organizing actions worldwide around specific ethical issues. News media, blogs, mobile phone cameras, ethics commissions, and NGOs are increasingly exposing unethical decisions and corrupt practices, creating an embryonic global conscience.

Poverty and Wealth: Poverty is on a downward trend globally. The world economy grew 4.9% in 2010 while the population grew 1.2%, yielding world GDP per capita growth of 3.7%. Nearly half a billion people rose out of extreme poverty (living on less than $1.25 a day) between 2005 and 2010, but 900 million (13% of the global population) remain in such dire conditions. The number of countries classified as low-income has fallen from 66 to 40, but the gap between rich and poor—both within and among countries—continues to widen. Brazil, Russia, India, and China produced 108 of the 214 new billionaires in 2011, according to Forbes.
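
As a quick check on the arithmetic above: per capita growth is the ratio of the aggregate growth rates, not their simple difference (though at rates this small the two nearly coincide). A minimal sketch of that calculation; the function name is ours, for illustration only:

```python
def per_capita_growth(gdp_growth: float, pop_growth: float) -> float:
    """GDP-per-capita growth implied by aggregate GDP and population growth."""
    return (1 + gdp_growth) / (1 + pop_growth) - 1

# 2010 figures cited above: world economy +4.9%, population +1.2%.
print(f"{per_capita_growth(0.049, 0.012):.1%}")  # -> 3.7%
```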

China surpassed Japan to become the world’s second-largest economy in 2010. There are more Internet users in China (485 million) than people in the United States (307 million). India is expected to pass China as the most populous country in the world by 2030. Together, China and India account for nearly 40% of humanity and are increasingly becoming the driving force for world economic growth.

Health, Medicine, and Well-Being: Many populations are aging, due to falling fertility rates and increasing longevity. The ability to meet the financial requirements of the elderly will diminish as the support ratio of workers to retirees continues to shrink. Policy makers will need to rethink the concept of retirement, and social structures will have to change to avoid intergenerational conflicts.

Another byproduct of longer lives is that there could be as many as 150 million people with age-related dementia by 2050. Advances in brain research and applications to improve brain functioning and maintenance could lead to a healthy long life (as opposed to an infirm long life).

World health is improving, the incidence of diseases is falling, and people are living longer, yet many past challenges remain and many future threats are becoming more serious. During 2011, there were six potential epidemics. The most dangerous is probably the NDM-1 enzyme that can make a variety of bacteria resistant to most drugs. On the plus side, new HIV infections declined 19% over the past decade; the median cost of antiretroviral medicine per person in low-income countries has dropped to $137 per year; and 45% of the estimated 9.7 million people in need of antiretroviral therapy received it by the end of 2010. Yet two new HIV infections occur for every person starting treatment.

Infant mortality is on the decline: more than 30% fewer children under age 5 died in 2010 than in 1990. Infectious diseases’ share of total mortality fell from 25% in 1998 to less than 16% in 2010. On the other hand, health-care costs are increasing, and the shortage of health workers is growing, making telemedicine and self-diagnosis via biochip sensors and online expert systems increasingly necessary.

Conflict and Crime: There is, of course, a darker side to technological development. Advances in synthetic biology, DNA research, and future desktop molecular and pharmaceutical manufacturing could one day give individuals the ability to make and deploy biological weapons of mass destruction. To counter this, we will need more-sophisticated sensors to detect molecular changes in public spaces, along with advances in human development (ranging from improved education to more widespread mental health care) and social engagement to reduce the number of people who might be inclined to use these technologies for mass murder.

Another emerging problem is information warfare and cyberwar. Governments and military contractors are engaged in an intellectual arms race to defend themselves from cyberattacks from other governments and their surrogates. Because society’s vital systems now depend on the Internet, cyberweapons to bring it down can be thought of as weapons of mass destruction. Information warfare’s manipulation of the media can lead to increasing mistrust.

Meanwhile, the number of traditional military wars has declined over the past two decades, cross-cultural dialogues are flourishing, and intra-state conflicts are increasingly being settled by international interventions. As of this writing, there are 10 major armed conflicts with at least 1,000 deaths per year, down from 14 in 2010: Afghanistan, Iraq, Somalia, Yemen, northwestern Pakistan, Naxalites in India, Mexican cartels, Sudan, Libya, and one classified as international extremism.

The United States and Russia continue to reduce their nuclear stockpiles, but China, India, and Pakistan are increasing theirs. According to the Federation of American Scientists, by February 2011 there were 22,000 nuclear warheads in the world, 2,000 of them held ready for use by the United States and Russia. And while the number and area of nuclear-free zones are growing, so is the number of unstable states (from 28 to 37 between 2006 and 2011).

Although the world is waking up to the enormity of the threat of transnational organized crime, the problem continues to grow, and leaders have yet to adopt a global strategy to address this threat. World illicit trade is estimated at $1.6 trillion for 2011 (up $500 billion from 2010), with counterfeiting and intellectual property piracy accounting for $300 billion to $1 trillion, the global drug trade at $404 billion, trade in environmental goods at $63 billion, human trafficking and prostitution at $220 billion, smuggling at $94 billion, weapons trade at $12 billion, and cybercrime costing billions annually in lost revenue. These figures do not include extortion or organized crime’s part of the $1 trillion in bribes that the World Bank estimates are paid annually, or its part of the estimated $1.5–6.5 trillion in laundered money. Hence, the total illicit income could be $2–3 trillion—about twice as big as all the military budgets in the world.
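
As a back-of-the-envelope check (our grouping, not the Millennium Project’s), the cited components can be summed to see whether they bracket the $1.6 trillion headline figure. Counterfeiting is entered as its full range, and cybercrime (given only as “billions”) is omitted:

```python
# Component estimates cited above, in billions of dollars per year.
components = {
    "counterfeiting & IP piracy": (300, 1_000),  # given as a range
    "drug trade": (404, 404),
    "environmental goods": (63, 63),
    "human trafficking & prostitution": (220, 220),
    "smuggling": (94, 94),
    "weapons trade": (12, 12),
}

low = sum(lo for lo, _ in components.values())
high = sum(hi for _, hi in components.values())
print(f"${low / 1_000:.1f}-${high / 1_000:.1f} trillion")  # -> $1.1-$1.8 trillion
```

The $1.6 trillion estimate sits comfortably inside that range.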

Addressing Humanity’s Challenges

The global challenges facing humanity are transnational in nature, demanding transinstitutional solutions. No government, international organization, or other form of institution acting alone can solve these problems. The world may have to move from governance by a mosaic of sometimes conflicting national government policies to governance by coordinated and mutually supporting global policies that are implemented at national and local levels.

The global financial crisis and the efforts to resolve it have clearly demonstrated the need for global systems of analysis, policy formation, and policy implementation. Nation-state decision making worked well during slower and less interdependent times. However, the future is expected to be far more interdependent than today, with even less leeway between problem recognition and solution. Hence, it will require improved global governance.

Economic growth and technological innovation have led to better health and living conditions than ever before for more than half the people in the world, but unless our financial, economic, environmental, and social behaviors are improved along with our industrial technologies, the long-term future is in jeopardy.

Governments should create systems of resilience and collective intelligence and should use national “State of the Future Indexes” for their budget and policy processes. Potential decision makers should have a keen grasp of foresight methods. They should be hardheaded idealists who can look into the worst and best of humanity to create and implement strategies of success.

The world can be a far better place—but only if individuals, groups, nations, and institutions make the right decisions. We need a multifaceted, compellingly positive view of the future toward which humanity can work.

About the Author

Jerome C. Glenn is the executive director of The Millennium Project and the primary author of the organization’s annual State of the Future reports over the past 15 years.

This article draws from the most recent report, which may be ordered from The Millennium Project at www.millennium-project.org. Readers are also invited to share their own conclusions about these trends, as well as read and comment on the short online summaries of the 15 Global Challenges.

Reconnecting to Nature in the Age of Technology

By Richard Louv

A best-selling author argues that our relationship with our natural environment is in jeopardy, imperiling our future well-being. But the growing trend of social networking may in fact inspire new tools to help us restore nature to our lives.

From The Nature Principle, © 2011 by Richard Louv. Reprinted by permission of Algonquin Books of Chapel Hill. All rights reserved.

Every day, our relationship with nature, or the lack of it, influences our lives. This has always been true. But in the twenty-first century, our survival—or thrival—will require a transformative framework for that relationship, a reunion of humans with the rest of nature. In 2005, in Last Child in the Woods, I introduced the term nature-deficit disorder, not as a medical diagnosis, but as a way to describe the growing gap between children and nature. After the book’s publication, I heard many adults speak with heartfelt emotion, even anger, about this separation, but also about their own sense of loss.

In my most recent book, The Nature Principle, I describe a future shaped by an amalgam of converging theories and trends as well as a reconciliation with old truths. This amalgam, the Nature Principle, holds that a reconnection to the natural world is fundamental to human health, well-being, spirit, and survival.

Primarily a statement of philosophy, the Nature Principle is supported by a growing body of theoretical, anecdotal, and empirical research that describes the restorative power of nature—its impact on our senses and intelligence; on our physical, psychological, and spiritual health; and on the bonds of family, friendship, and the multispecies community. Illuminated by ideas and stories from good people I have met, the book asks: What would our lives be like if our days and nights were as immersed in nature as they are in technology? How can each of us help create that life-enhancing world, not only in a hypothetical future, but right now, for our families and for ourselves?

Our sense of urgency grows. In 2008, for the first time in history, more than half the world’s population lived in towns and cities. The traditional ways that humans have experienced nature are vanishing, along with biodiversity.

At the same time, our culture’s faith in technological immersion seems to have no limits, and we drift ever deeper into a sea of circuitry. We consume breathtaking media accounts of the creation of synthetic life, combining bacteria with human DNA; of microscopic machines designed to enter our bodies to fight biological invaders or to move deadly clouds across the battlefields of war; of computer-augmented reality; of futuristic houses in which we are surrounded by simulated reality transmitted from every wall. Inventors and futurists like Ray Kurzweil describe a coming “transhuman” or “posthuman” era in which people are optimally enhanced by technology. NASA’s Steven Dick describes a “postbiological universe” where “the majority of intelligent life has evolved beyond flesh and blood intelligence.”

I am not arguing against these concepts or their proponents—at least not the ones who are devoted to the ethical use of technology to expand human capacities. But I do suggest that we’re getting ahead of ourselves. We have yet to fully realize, or even adequately study, the enhancement of human capacities through the power of nature. In a report praising higher-tech classrooms, one educator quotes Abraham Lincoln: “The dogmas of the quiet past are inadequate to the stormy present. The occasion is piled high with difficulties, and we must rise with the occasion. As our case is new, so we must think anew and act anew.” That we should; but in the twenty-first century, ironically, an outsized faith in technology—a turning away from nature—may well be the outdated dogma of our time.

In contrast, the Nature Principle suggests that, in an age of rapid environmental, economic, and social transformation, the future will belong to the nature-smart—those individuals, families, businesses, and political leaders who develop a deeper understanding of nature, and who balance the virtual with the real.

In fact, because of the environmental challenges we face today, we may be—we had better be—entering the most creative period in human history. This is a time defined by a goal to extend the past century of environmentalism, and to go beyond sustainability to the renaturing of everyday life.

The Connection between Nature and Health

In 2007, naturalist Robby Astrove and I were driving through West Palm Beach, Florida, on our way to an event promoting the preservation of the Everglades. He told me: “As a kid, I was always glued to the car window, taking notice. I still do this and must sit in a window seat when flying. Looking back, it’s no wonder I’m a naturalist, having trained my senses to detail, images, sounds, and feelings.”

In fifth grade, a school field trip to the Everglades led to his career choice. After college, he surveyed hundreds of miles of the Everglades, to learn about the great river of grass, the threats to it, and its recovery. When he was 15, Astrove was diagnosed with HIV and hepatitis C, which he contracted from three life-saving blood transfusions for a staph infection that had spread from a blister on his thumb. Following the blood test that revealed HIV, he was called into the doctor’s office. He found his parents in tears. “The doc sat me down and shared the news. My first words were, ‘What are we going to do now?’”

During the ensuing years, he found himself drawn, more and more, to the river of grass. “It’s hard to explain, but acknowledging the cycles, patterns, and interconnectedness of the world has provided healing to me,” he said. “Sometimes, I awake in the middle of the night and find myself putting on boots, grabbing a raincoat and collection containers. I don’t question actions like that. I’m excited to hike in the dark not knowing what I’ll find. It might not be until I hear the call of a barred owl that I realize why I came. Or seeing a familiar tree that I’ve studied a million times during the day that reveals something new at night. I go because I trust my instincts, have patience, and allow for things to happen. Well, there’s luck, too. But the same trust and instinct is required to manage a disease. When I haven’t gotten enough nature time, my body tells me. I listen.”

Astrove, who is studying international public health at Emory University, finds HIV biologically fascinating. “It’s able to reproduce rapidly and can mutate, always creating the demand for new medicines. In a weird way, HIV is elegant, beautiful. I understand what this monster is capable of, so I establish limits. Not staying out too late, eating healthy, not ever smoking.” Keeping to these limits as a teenager was difficult for him, but respect for the virus trumped peer pressure. “Nature is always making adaptations, so why can’t I do the same? I listen. When I hear ‘rest,’ I rest. When I see macroinvertebrates in a stream indicating clean water, that reminds me to pay attention to indicators for my own health. Stumbling upon a rare plant reminds me of the uniqueness of my situation. No two people are the same in their response to a virus.”

In his role as an educator, Astrove teaches his students that wetlands serve as “nature’s liver.” He relates to systems personally. “The wetlands purify water and trap pollutants.” He explains that the rain forests and other natural places are the source of many of our medicines, that spending time in that world reduces stress. “We feel good from the endorphin release it stimulates, and it inspires us. Inspiration is another giver of health. I go to the woods knowing I will receive healing. And the benefits come in the form of physical, psychological, and spiritual gains. It’s a natural high sometimes when I get the feeling of light, energy, and awe.” He looked out the truck window at the passing landscape as he drove. “Now that I’ve been taking meds for some time, sensitive blood tests can’t find the virus; I test ‘undetectable.’”

Does research give weight to Astrove’s experience? Possibly. A study of 260 people at 24 sites across Japan found that, among people who gazed on forest scenery for 20 minutes, the average concentration of salivary cortisol, a stress hormone, was 13.4% lower than that of people in urban settings.

“Humans lived in nature for 5 million years. We were made to fit a natural environment…. When we are exposed to nature, our bodies go back to how they should be,” explained Yoshifumi Miyazaki, who conducted the study. Miyazaki is director of the Center for Environment, Health and Field Sciences at Chiba University; he is Japan’s leading scholar on “forest medicine,” an accepted health-care concept in Japan, where it is sometimes called “forest bathing.”

In other research, Li Qing, a senior assistant professor of forest medicine at Nippon Medical School in Tokyo, found that green exercise—physical movement in a natural setting—can increase the activity of natural killer (NK) cells, an effect that can be maintained for as long as 30 days.

“When NK activity increases, immune strength is enhanced, which boosts resistance against stress,” according to Li, who attributes the increase in NK activity partly to inhaling air containing phytoncides, antimicrobial essential wood oils given off by plants. Studies of this sort deserve closer scrutiny. For example, in the study of natural killer cells, there was no control group, so it is hard to say if the change was due to time off work, exercise, nature contact, or some combination of influences.

Nonetheless, for Astrove, wilderness has helped create a context for healing. It may have strengthened his immune system and offered protective properties that he, and the rest of us, do not yet fully understand.

The Third Ring

Remember those cardboard kaleidoscopes we had when we were kids—how, when you twisted the cylinders, the pieces of colored plastic would snap into a vivid pattern? Sometimes the future comes into focus just like that. For me, one such moment occurred at a conference held in New Hampshire in 2007. On that day, more than a thousand people from across the state traveled to chart the course of the statewide effort to connect families with nature.

As hours of productive meetings came to an end, a father stood up, complimented the attendees’ creativity, and then cut to the chase. “We’ve been talking a lot about programs today,” he said. “Yes, we need to support the programs that connect people to nature, and yes, we need more programs. But the truth is,” he added, “we’ve always had programs to get people outside and kids still aren’t going outside in their own neighborhoods.” Neither, for that matter, are that many adults. He described his own experience. “A creek runs through my neighborhood, and I would love it if my girls could go down and play along that creek,” he said. “But here’s the deal. My neighbors’ yards back up to the creek, and I have yet to go to my neighbors and ask them to give permission to my kids to play along the creek. So here’s my question. What will it take for me to go to my neighbors and ask them for that permission?”

The New Hampshire dad was raising a fundamental question for people of all ages.

What will it take?

The goal is deep, self-replicating cultural change, a leap forward in what a society considers normal and expected. But how do we get there from here? Let me offer my Three Ring theory. The First Ring comprises traditionally funded, direct-service programs (nonprofits, community organizing groups, conservation organizations, schools, park services, nature centers, and so on) that do the heavy institutional lifting of connecting people to nature.

The Second Ring is made up of individual docents and other volunteers, the traditional glue that holds together so much of society. These two Rings are vital, but each has limitations. A direct-service program can only extend as far as its funding will allow. Volunteers are constrained by the resources available for recruitment, training, management, and fund-raising. Many good programs are competing for the same dollars from the same funding sources, a process with its own price. Particularly during difficult economic times, the leaders of direct-service programs often come to view other groups doing similar work as competitors. Good ideas become proprietary; vision is reduced. This response is understandable.

The best programs and volunteer organizations transcend these limitations, but doing so is always a struggle.

Now for the Third Ring: a potentially vast orbit of networked associations, individuals, and families. This Ring is based on peer-to-peer contagion, people helping people create change in their own lives and in their own communities, without waiting for funding. This may sound like traditional volunteerism, but it’s more than that. In the Third Ring, individuals, families, associations, and communities use the sophisticated tools of social networking, both personal and technological, to connect to nature and one another.

Family nature clubs offer one on-the-ground example. Using blog pages, social networking sites, and the old-fashioned instrument called the telephone (or smartphone), families are reaching out to other families to create virtual clubs that arrange multifamily hikes and other nature activities. An array of free organizing and activity tools is now available on the Internet for these clubs. They’re not waiting for funding or permission; they’re doing it themselves, doing it now.

The California-based organization Hooked on Nature networks people who form “nature circles” to explore their own bioregions. In the San Francisco Bay Area, Exploring a Sense of Place organizes groups of adults who meet on weekends to go on hikes with botanists, biologists, geologists, and other experts on their regions’ natural world. Similarly, the Sierra Club has networked hikers for years.

New Third Ring networks could connect people who are rewilding their homes, yards, gardens, and neighborhoods; neighbors creating their own small, do-it-yourself “button” parks; businesspeople and professionals, including developers, hoping to apply biophilic principles. These networks, unlimited in their ability to grow, could transform future policies of more traditional professional societies. For example, the Green Building Certification Institute’s influential LEED certification is today almost exclusively focused on energy efficiency and low-environmental-impact design. It’s overdue for an update that would go beyond energy conservation to include the benefits of more natural environments to human health and well-being. For the proponents of that change, going the conventional route to achieve such a policy change could take years. But an expanding network of individual professionals could accelerate that change—and as you read this, that may have happened already.

Similarly, networks of health care and wellness professionals already committed to the nature prescription could change elements of their professions without waiting for top-down pronouncements. Through peer-to-peer networks, they could change minds, hearts, and eventually official protocol, and they could, through this process, build a funding base for direct-service programs.

When I mentioned this Third Ring notion to the director of the Maricopa County (Arizona) Parks and Recreation Department, the largest urban park district in the United States, he grew excited—not only about family nature clubs but about the broader context of the Third Ring.

“I have programs right now in my park system for families, but they’re under-enrolled. This could be a way to change that,” he said. Moreover, he faces new budget challenges. By encouraging families to create self-sustaining, self-organizing nature networks, he would be expanding the number of people who use his parks. Just as important, the growth of a Third Ring could translate into future political support for park funding.

Similarly, as large land-trust organizations and governments help neighborhoods create their own nearby-nature trusts, overhead would remain small, but their reach would grow. So would the public’s understanding of the importance of the land-trust concept. College students who hope to pursue careers connecting people to nature could be similarly networked.

The Third Ring could be especially effective in changing the closed system of public education. At this writing, efforts are afoot to gather “natural teachers” into a national network. These educators, in primary and secondary schools, colleges and universities, are not necessarily environmental education teachers. They’re the teachers who intuitively or experientially understand the role that nature experience can play in education. They’re the art teachers, English teachers, science teachers, and many others who insist on taking their students outside to learn—to write poetry or paint or learn science under the trees. I meet these teachers all over the country. Every school has one or two. And they feel alone.

What if thousands of these natural teachers were networked and, through this network, gained power and identity? Once connected, these educators could push for change within their own schools, colleges, and communities. Connected and honored, natural teachers could inspire other teachers; they could become a galvanizing force within their own schools. In the process, they would contribute to their own psychological, physical, and spiritual health.

Third Ring networks can reach well beyond the immediate members. In Austin, Texas, a grade-school principal told me that he would love to include more nature experience in his school. “But you can’t imagine the pressure I’m under now with the testing,” he said. “We can’t do everything.” When I described the family nature club phenomenon, the principal was enthusiastic. I asked if he could provide toolkits—packed with educational material, guides to local parks, and so forth—and encouragement to children and parents to start their own nature clubs. “I could do that,” he said, and he meant it. He immediately began to think of how the educational elements of these clubs might augment his curriculum.

Earlier that day, in a meeting of leaders from central Texas, a PTA president spoke movingly. “Listen, I’m really tired of going into a roomful of parents and telling them not to give their kids candy, because of obesity,” she said. “Recently, I’ve started talking to them about getting their children, and themselves, outside in nature more often. You can’t believe the different feeling in the room. In the room where I’m preaching about candy, the mood is rather unpleasant, but when I’m in a room with parents and we’re talking about getting outside, then the mood is happy, even serene. Parents immediately relax when we talk about that.” During our meeting, she began to make plans for her PTA to start encouraging family nature clubs.

Social networking, online and in person, has transformed the political world. Online tools are used to raise funds, to organize face-to-face house parties, and to turn out voters. A nature-focused Third Ring using those same tools, and ones not yet imagined, could create a growing constituency for needed policy changes and business practices. It could, in fact, help create a renatured culture.

What if family nature clubs really caught on, like book clubs did in recent years? What if there were 10,000 family nature clubs in the United States, created by families for families, in the next few years? What if the same process in other spheres of influence moved nature to the center of human experience? In such a culture, that father in New Hampshire would be more likely to knock on his neighbor’s door. Or, better yet, one of his neighbors would show up at his door, asking his family to join a new network of neighbors devoted to enjoying nature in their own neighborhood. Their first expedition: to explore the creek that runs through it.

Your Right to Nature

To be clear, I don’t believe that permanent cultural change will take root without major institutional and legislative commitments to protect, restore, and create natural habitat on a global basis.

Generous future historians may someday write that our generation finally met the environmental challenges of our time—not only climate change, but also the change of climate in the human heart, our society’s nature-deficit disorder—and that, because of these challenges, we purposefully entered one of the most creative periods in human history; that we did more than survive or sustain, that we laid the foundation for a new civilization; and that nature came to our workplaces, our neighborhoods, our homes, and our families.

Such a transformation, both cultural and political, will come only with a new consideration of human rights. Recently I began asking friends this question: Do we have a right to a walk in the woods? Several people responded with puzzled ambivalence. Look at what our species is doing to the planet, they said. Based on that evidence alone, isn’t the relationship between human beings and nature inherently oppositional? That point of view is understandable, given the destructiveness of human beings to nature. But consider the echo from folks who reside at another point on the political/cultural spectrum, where nature is seen as an object under human dominion or as a distraction on the way to Paradise. In practice, these two views of nature are radically different. Yet there is also a striking similarity: nature remains the “other”; humans are in it, but not of it.

My mention of the basic concept of rights made some of the people I talked to uncomfortable. One friend said: “In a world in which millions of children are brutalized every day, can we spare time to forward a child’s right to experience nature?” Good question. Others pointed out that we live in an era of litigation inflation and rights deflation; too many people believe that they have a “right” to a parking spot, a “right” to cable TV, even a “right” to live in a neighborhood that bans children. As a consequence, the idea of rights is deflated. Do we really need to add more rights to our catalogs of entitlements?

The answer to these questions is Yes, if we can agree that the right at issue is fundamental to our humanity.

About the Author

Richard Louv is a journalist and the author of eight books about the connections among family, nature, and community. His previous work includes Last Child in the Woods: Saving Our Children from Nature-Deficit Disorder (Algonquin Books, 2005).

This article was excerpted from his most recent book, The Nature Principle: Human Restoration and the End of Nature-Deficit Disorder (Algonquin Books of Chapel Hill, 2011).

Outlook 2012

INTRODUCTION

The U.S. space shuttle program may have ended in 2011, but space travel, exploration, and commercialization will continue well into the future, thanks to private initiatives. Growing environmental threats such as the emergence of new “dust bowls” to rival those of the 1930s will spawn the drive to make this planet more livable; look for advances in fuel cells to enable us to live deep under the sea, for instance. These are a few of the forecasts found in THE FUTURIST magazine in the past year, offering glimpses of possibilities and suggestions for solutions.

The forecasts collected in the World Future Society’s annual Outlook reports are not intended to predict the future, but rather to provoke thought on how we may begin to shape our own tomorrows today.

The opinions and ideas expressed are those of their authors or sources cited and do not necessarily represent the views of the World Future Society. For more information, please refer to the original articles cited. Back issues of THE FUTURIST may be purchased at www.wfs.org/backissues.

Continue the dialogue! Your feedback is welcome.

—THE EDITORS

BUSINESS AND ECONOMICS

New metrics will supplement GDP and other economic measures to provide better indicators of quality of life. According to a study by Ethical Markets Media and GlobeScan, many people believe that such economic indicators are limited gauges of a nation’s total economic activity, much less the overall standard of living. Critics advocate for a new metric that accounts for environmental and public-health factors, social welfare, infrastructure, and other quality-of-life factors. The United Nations’ Human Development Index is perhaps the best-known and most widely cited alternative. —World Trends & Forecasts, May-June 2011, pp. 11-12

The U.S. rich–poor gap is another disaster waiting to happen—probably around 2020. If the economic situation looks bad now, just wait until the end of the decade. Present-day concentration of wealth in the hands of too few Americans, and the related problem of out-of-control consumer debt, will lead to economic stagnation and political upheaval with impacts felt across the world. —Robert B. Reich, author of Aftershock, reviewed by Patrick Tucker, Mar-Apr 2011, p. 52

China’s economy will stop growing and start shrinking later this century. So forecasts economist Daniel Altman, who notes that China is an economic powerhouse now, but structural weaknesses threaten to cause major problems in the long term. Meanwhile, prosperity will resume in the United States and a few other nations that are now lagging. —Books in Brief [review of Outrageous Fortunes by Daniel Altman], Jan-Feb 2011, p. 48

Environmental sustainability will receive growing attention from economists. According to Ethical Markets Media’s Green Transition Scoreboard, which tracks global private investments in sustainable businesses, the “green economy” continues to grow each year. The Scoreboard projects that annual investment in green businesses could soon reach $1 trillion. —World Trends & Forecasts, May-June 2011, p. 12

The United States could transition to a cashless society. Cash is on the way out in the United States, even if policy makers do not actively work to facilitate this transition. However, leaving everything to chance may result in trillions of wasted dollars. Possible measures that could help “nudge” cash out of circulation include imposing a federal tax surcharge on ATM withdrawals and transforming cash into an electronic currency. —David R. Warwick, “The Case Against Cash,” July-Aug 2011, p. 47

Commercial space tourism will grow significantly during the coming decade. The Futron/Zogby firm estimates that, by 2021, there will be 13,000 suborbital passengers annually, resulting in $650 million in revenue. Many companies are currently working to make commercial space flight a viable industry, Melchor Antuñano, director of the FAA Civil Aerospace Medical Institute, told attendees of WorldFuture 2010. —Richard Yonck, “Challenges and Opportunities in Space Medicine,” Nov-Dec 2010, p. 50
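
Dividing the projected revenue by the projected passenger count gives the average ticket price implied by the Futron/Zogby estimate, roughly $50,000 per seat. A one-line sanity check (variable names are ours):

```python
passengers_per_year = 13_000      # Futron/Zogby projection for 2021
annual_revenue = 650_000_000      # projected dollars per year

print(f"${annual_revenue / passengers_per_year:,.0f} per suborbital seat")  # -> $50,000
```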

The “fast fashion” fad may fade. Two competing values drive trends in fashion: the desire for clothes that are fashion-forward and inexpensive, and the desire for clothes that are higher quality and don’t quickly go out of style. The future may favor “slow fashion” as consumers look beyond price tags for merchandise that is well made, long lasting, and free of sweatshop labor. —World Trends & Forecasts, Sep-Oct 2011, p. 12

COMPUTERS AND AUTOMATION

Computers will manage our money for us. Electronically enhanced market management could ward off a lot of would-be recessions and market crashes. Economists might use increasingly sophisticated computer simulation models to identify fault lines and predict trouble before it starts. Even better, computers could perform automated trading for human investors, and in so doing mitigate market risk and unnecessary trades. —Rutger van Santen, Djan Khoe, and Bram Vermeer, authors of 2030, reviewed by Rick Docksai, Mar-Apr 2011, p. 56

The Internet will automatically search itself so you don’t have to. The information you provide Google when you search for something is teaching the search engine more about you and your interests. One day, Google will become so savvy about you that you won’t have to search at all: Your smartphone will pick up information from your environment, anticipate what you’ll want to know, and deliver it automatically. At least, that is the hope of Google developers. Privacy advocates such as Eli Pariser, author of The Filter Bubble, warn of abuses by companies that could profit from such private information. —Eli Pariser, “The Troubling Future of Internet Search,” World Trends & Forecasts, Sep-Oct 2011, p. 6

A computer program that can measure callers’ stress levels over the phone could help crisis centers respond more effectively during emergencies. Rapid talking, variations in pitch, and changes in breathing rates are among the vocal cues that enable the program to gauge urgency and alert responders who may already be overwhelmed with calls during a major crisis. The system may also prove beneficial in military situations. —Tomorrow in Brief, July-Aug 2011, p. 2

Robots may learn human emotions by interacting with people. Researcher Lola Cañamero of the University of Hertfordshire, England, says that the more interaction with (and feedback from) a human caregiver that a robot has, the stronger the bond becomes and the more emotional expressions it learns. —Tomorrow in Brief, Nov-Dec 2010, p. 2

Soccer-trained robots will gain enough intelligence and mobility to conduct rescue missions. Engineers are trying to tune robots’ intelligence and motor skills to the point where a team of humanoid droids could play a whole soccer game as a team effectively enough to beat even the best human contenders. The endeavor isn’t just fun and games. It holds practical applications, too: Robots this nimble will be optimally suited for urban search-and-rescue operations and for working as household helpers. —World Trends & Forecasts, Jan-Feb 2011, p. 9

Artificially intelligent entities will evolve faster and farther than humans. While natural human evolution has slowed, technological evolution is accelerating. Humans may increasingly adapt themselves with technological enhancements in order to keep up the pace. —Steven M. Shaker, “The Coming Robot Evolution Race,” Sep-Oct 2011, p. 20

Humans will eventually “lose” the race with robots. Even with every technological enhancement available to them, future human beings will not be able to keep up with the evolutionary pace of robotic humanoids with artificial intelligence. The reason: Robots will be unimpeded by insurmountable biological limitations. The best we can do is to learn from and make friends with our robotic competitors. —Steven M. Shaker, “The Coming Robot Evolution Race,” Sep-Oct 2011, p. 23

ENERGY

A diverse portfolio of energy technologies will replace our reliance on fossil fuels. Scientists are exploring not just wind and solar energies, but also such esoteric technologies as artificial photosynthesis, traveling wave reactors, and mini black holes. —David J. LePoire, “Exploring New Energy Alternatives,” Sep-Oct 2011, pp. 34-38

Lunar-based solar power production may be the best way to meet future energy demands. Solar power can be more dependably and inexpensively gathered on the Moon than on Earth. This clean energy source is capable of delivering the 20 trillion watts of power that the Earth’s 10 billion people will require by mid-century. A lunar solar power system such as the LUNA RING (an alternative energy plan from the Japanese company Shimizu) would be the largest public infrastructure project in human history, but it would pay for itself after only 15 years. —David R. Criswell, “Why We Need the Moon for Solar Power on Earth,” May-June 2011, p. 37; Patrick Tucker, “Solar Power from the Moon,” May-June 2011, p. 34
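
For a sense of scale, 20 terawatts spread across 10 billion people works out to about 2 kilowatts of continuous power per person, broadly comparable to today’s global average. A quick sketch of that arithmetic (ours, not Criswell’s):

```python
total_power_watts = 20e12   # 20 trillion watts (20 TW), as cited
population = 10e9           # 10 billion people by mid-century

per_person_kw = total_power_watts / population / 1_000
print(f"{per_person_kw:.0f} kW of continuous power per person")  # -> 2 kW
```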

Ammonia may be worth its weight in oil. Hydrogen is too light to be a practical fuel source in its own right, but it works well when combined with nitrogen to form ammonia. If we build enough renewable-energy generation and distribution infrastructure, ammonia might become the world’s fuel of choice for household and transportation use. —Carl E. Schoder, “A Convenient Truth About Clean Energy,” Jan-Feb 2011, p. 25

Dig very deep, and you will find enough geothermal energy to power the world. Geothermal energy plants today generate fairly limited energy, but that may be because they only channel heat from around 200 meters underground. The Earth is much hotter farther down, according to several Norwegian companies and ExxonMobil, which are all planning drilling installations that will tap 5,500 meters to 10,000 meters or more underground. Norwegian-based SINTEF says that just a fraction of the heat energy encased at those depths would suffice to power the entire world. —World Trends & Forecasts, Jan-Feb 2011, p. 8

ENVIRONMENT AND RESOURCES

Urbanization will increase global warming. As the National Center for Atmospheric Research projects, the influx of rural populations into cities, particularly in developing countries, could raise greenhouse-gas emissions by another 25% by mid-century, irrespective of how high total population climbs. On the other hand, aging populations leaving the workforce in industrialized countries may help to reduce emissions and, hence, slow down climate change. —World Trends & Forecasts, Mar-Apr 2011, p. 12

Robotic earthworms will gobble up our garbage. Much of what we throw away still has value. Metals, petroleum, and other components could get additional use if we extracted them, and robotic earthworms could do that for us. Human “earthworm drivers” will direct them to mine landfills, extract anything of value, and digest the remaining heaps into quality topsoil. —Thomas Frey, “More Jobs for Tomorrow,” Jan-Feb 2011, p. 36

The dust bowls of the twenty-first century will dwarf those seen in the twentieth. Two giant dust bowls are currently forming—one in Asia and one in Africa. These clear indicators of soil erosion and desertification are caused, in varying degrees, by overgrazing, overplowing, and deforestation. Desertification currently affects 25% of the planet’s land area, threatening the livelihoods of more than 1 billion people in approximately 100 countries. —Lester R. Brown, “Eroding Futures: Why Healthy Soil Matters to Civilization” and “Dust Bowl Redux,” July-Aug 2011, pp. 23-30

We will use our water more wisely—or else. Water shortages are a problem now and will get worse in years ahead unless we learn to make more efficient use of existing water supplies. Among other things, we should grow more drought-resistant crops, improve our irrigation methods, and expand neighborhood and household use of water-purification and desalination systems. —Rutger van Santen, Djan Khoe, and Bram Vermeer, authors of 2030, reviewed by Rick Docksai, Mar-Apr 2011, p. 55

The United Nations estimates that 2.8 billion people will live in water-stressed environments by 2025. According to the Japanese government, safe water reclamation and recycling will be a $1 trillion market by 2025; the government considers it a key export area for the future. —World Trends & Forecasts, July-Aug 2011, p. 12

Nanotech-driven water purification filters could provide fresh potable water to those in water-stressed areas. Japanese manufacturer Nitto Denko’s reverse-osmosis nano-membrane filter desalinates and purifies water more effectively than any other water filter in existence, but at the moment the process is too energy-intensive and expensive for most developing countries. A less energy-intensive alternative being developed at Stanford University uses a silver nanowire filtration system. —World Trends & Forecasts, July-Aug 2011, pp. 11-12

HABITATS

Advances in fuel cells will enable deep-sea habitation. These fuel cells, which will produce electricity directly, with no toxic fumes, are currently being developed for automobiles. They will eventually allow for the exploration and colonization of the undersea world via extended submarine journeys. This could lead to human colonization of the continental shelves and the shallow oceans as well as the development of extensive deep-sea business sectors. —James H. Irvine and Sandra Schwarzbach, “The Top 20 (Plus 5) Technologies for the World Ahead,” May-June 2011, p. 18

Livable, economically viable manufacturing sites could be built on the Moon. It is feasible to create them within a decade. These sites, or colonies, could process materials on the Moon to create new products. For example, satellites could be fabricated there and lowered to desired Earth orbits. This process would cost much less than building satellites on Earth and then rocketing them up into space. Such sites could turn a profit within 20–30 years and offer huge long-term economic returns. —Joseph N. Pelton, “Finding Eden on the Moon,” May-June 2011, pp. 40-42

Future buildings may be more responsive to weather fluctuations. “Protocell cladding” that utilizes bioluminescent bacteria or other materials would be applied on building facades to collect water and sunlight, helping to cool the interiors and produce biofuels. The protocells are made from oil droplets in water, which allow soluble chemicals to be exchanged between the drops and their surroundings. —Tomorrow in Brief, May-June 2011, p. 2

Cities will use geographic information systems to collect real-time data from citizens to improve services. One such program already in use in the United Kingdom is Voice Your View. The program allows pedestrians to record their opinions about their surroundings into a database via their mobile phones or strategically situated kiosks. The data is then shared with both city planners and the public via Web sites and at the public spaces themselves. —World Trends & Forecasts, Nov-Dec 2010, p. 9

HEALTH AND MEDICINE

More than half of all baby boomers will live healthy lives beyond 100. So forecasts antiaging physician Ron Klatz. Research suggests that it may be possible to prevent the shortening of telomeres or possibly rejuvenate them. (A telomere is a region of the chromosome that protects it from deterioration.) If successful, this technique might increase life spans. —Verne Wheelwright, “Strategies for Living a Very Long Life,” Nov-Dec 2010, p. 13

Robotic surgical machines will build new organ tissue right in hospital wards. Several research centers are developing computerized instruments that will build living tissue layer by layer and implant it directly into human patients. The process, called bioprinting, could use the patient’s own cells as raw material, and thereby not only help alleviate demands for new organ donations, but also reduce the risk that patients’ bodies will reject the transplanted organs. —Vladimir Mironov, “The Future of Medicine: Are Custom-Printed Organs on the Horizon?” Jan-Feb 2011, p. 21

More people than ever will need medical treatment for hearing loss. Society is noisier than ever, and ears everywhere are at risk of damage, warns author and journalist George Prochnik. In his latest book, The Pursuit of Silence, he notes that the ubiquity of background noise—traffic, portable music players, sound systems blaring music in restaurants and shopping malls—is contributing not only to damaged hearing, but also to memory loss, reading skills deficiencies, anxiety, insomnia, increased blood pressure, and cardiovascular disorders. Prochnik encourages listeners to adhere to the 60-60 rule: Turn the music down to 60% of the full volume or less, and listen for no more than 60 minutes a day. —World Trends & Forecasts, Nov-Dec 2010, p. 7

A future “Internet of bodies” will enable doctors to monitor patients remotely. As sensors and transmitters shrink in size and are embedded in our bodies, public health officials will be able to collect information and predict problems, so frail elderly and disabled individuals will be able to live more independently. —Tomorrow in Brief, Sep-Oct 2011, p. 2

It’s a boom market for medical tourism. As health-care costs continue to rise in the developed nations, many of their citizens are seeking cheaper care in developing countries’ hospitals. By 2017, 23 million Americans could be spending a combined total of $79 billion annually for care overseas. Developed nations’ health-care leaders worry, however: The trend could cost them heavily in revenue and make it harder for them to recruit new doctors. —Prema Nakra, “Could Medical Tourism Aid Health-Care Delivery?” Mar-Apr 2011, p. 23

Emotion sensors in our surroundings may help reduce our stress. Built-in stress-sensing electronics and electromagnets in things we handle daily, such as pens and steering wheels, would provide a counterforce to fidgety movements and help nervous people to calm down. —Tomorrow in Brief, May-June 2011, p. 2

Nanotechnology and biomimicry offer hope for restoring sight. Flower-shaped electrodes topped with photodiodes to collect light may one day be implanted in blind patients’ eyes to restore their sight. The “nanoflowers” mimic the geometry of neurons, making them a better medium than traditional computer chips for carrying photodiodes and transmitting the collected light signals to the brain. —World Trends & Forecasts, Sep-Oct 2011, p. 18

Epilepsy sufferers could obtain relief via a computer. People with epilepsy will wear compact monitors that will continuously read their brain waves to spot signs of oncoming seizures. When it detects a seizure, the monitor will interface with the patient’s brain to avert it. —Rutger van Santen, Djan Khoe, and Bram Vermeer, authors of 2030, reviewed by Rick Docksai, Mar-Apr 2011, p. 55

Music therapy may play a key part in low-cost interventions. Studies show that music may change people’s cellular environment, boosting immunity and suppressing the expression of genes that are associated with heart disease and other conditions. —World Trends & Forecasts, Sep-Oct 2011, p. 13

INFORMATION SOCIETY

The next generation of dating sites will enable people to go on virtual “dates” in cyberspace. Likewise, breakups will happen more often by electronic communications than by in-person discussions. —Arnold Brown, “Relationships, Community, and Identity in the New Virtual Society,” Mar-Apr 2011, p. 30

The end of identity as we know it: It will be easier than ever to create a new identity or identities for ourselves. All we will have to do is create new avatars in virtual reality. Those avatars will act on our behalf in real life to conduct such high-level tasks as performing intensive research, posting blog entries and Facebook updates, and managing businesses. The lines between ourselves and our virtual other selves will blur, to the point where most of us will, in essence, have multiple personalities. —Arnold Brown, “Relationships, Community, and Identity in the New Virtual Society,” Mar-Apr 2011, p. 34

Learning will become more social and game-based, and online social gaming may soon replace textbooks in schools. The idea that students learn more when they are engaged—as they are when playing games—is helping educators embrace new technologies in the classroom. In addition to encouraging collaborations, games also allow students to learn from their mistakes through trial and error. —World Trends & Forecasts, Sep-Oct 2011, p. 16

Future libraries will be valued more for services than for book collections. Libraries will become more participatory, and librarians will serve as information facilitators. As learning and knowledge creation become more collaborative and dynamic, library spaces will be used more for community services and less as a place to store books. Readers will share recommendations and feedback, enhancing the knowledge contained in texts. —Books in Brief [review of The Atlas of New Librarianship by R. David Lankes], Sep-Oct 2011, p. 52

Transitioning to a mostly cashless society could reduce crime. Specifically, it would go a long way toward eliminating illegal underground economies and reducing criminal activity. Based on 2009 FBI statistics, eliminating cash robberies would save the United States around $144 billion per year. In addition, identity theft and wire fraud would likely decline, since fraudulently wired funds are most often redeemed in cash in order to break audit trails. —David R. Warwick, “The Case Against Cash,” July-Aug 2011, pp. 46-47

LIFESTYLES AND VALUES

We will increasingly treat free time as a general social asset. This free time, or “cognitive surplus” of creativity, insight, and knowledge, could be harnessed for large, communally created projects, thanks to the spread of information technology. We’ve gone from a world with two models of media—public broadcasts by professionals and private conversations between pairs of people—to a world where public and private media blend together and where voluntary public participation has moved from nonexistent to fundamental. —Clay Shirky, “Tapping the Cognitive Surplus,” Nov-Dec 2010, p. 21

Accelerating change may accelerate resistance to change. The uncertainties and discomfort that accompany rapid changes (such as in new technologies and social structures) often provoke individuals to retreat into rigid belief systems and even aggressive, dysfunctional behavior. People may become more apathetic about the future at a time when they need to be more aware and engaged, warn the authors of The Techno-Human Condition. —Braden R. Allenby and Daniel Sarewitz, “The Accelerating Techno-Human Future,” Sep-Oct 2011, p. 32

New data on the neuroscience of human attraction and bonding will change the way people partner and fall in love. The feeling of romantic love is associated with the brain’s dopamine system for wanting. One company has begun to bottle a perfume that contains oxytocin, the natural brain chemical that, when sniffed, triggers feelings of trust and attachment. —Helen Fisher, “The New Monogamy, Forward to the Past,” Nov-Dec 2010, p. 28

Human relationships won’t die, but they will change shape. As more people conduct more social interaction in virtual space, their relations to each other in physical space will change profoundly. “Nuclear” families will morph into other arrangements. Communities could see more construction of single-person housing units due to more homeowners having virtual partners instead of live, in-person partners. Virtual marriages might become normal, and the spouses will claim real benefits and legal ties. —Arnold Brown, “Relationships, Community, and Identity in the New Virtual Society,” Mar-Apr 2011, p. 31

Look for a rise in “lessmeatarianism” as the public grows increasingly aware of the beef industry’s impacts on the climate. Less meat and dairy in our diets could reduce agricultural greenhouse-gas emissions by as much as 80% by 2055, according to the Potsdam Institute for Climate Impact Research. —World Trends & Forecasts, Nov-Dec 2010, p. 9

The future is full of bicycles. As the world keeps urbanizing, people’s health will increasingly suffer from environmental pollution and from sedentary lifestyles that do not allow for enough physical activity. Meanwhile, resource depletion will accelerate. Local transportation systems that encourage biking and walking could be a powerful antidote to these harmful trends, however. There are encouraging signs of more bike use already, including the creation of bike trails, rising popularity of bike tours, and more doctors encouraging elderly patients to bike more often. —Kenneth Harris, “Bike to the Future,” Mar-Apr 2011, pp. 25-28

Gaming will help improve our ability to make decisions. Researchers observe that overconfidence can lead to poor decision making. Now, a Web-based game called World of Uncertainty gauges how confident people are when making decisions, so they can become better aware of their own biases, according to David Newman of Queen’s University Belfast, one of the game creators. —World Trends & Forecasts, July-Aug 2011, pp. 10-11

Future human societies may be divided between augmented and nonaugmented breeds. Those who can afford technological enhancements, including changes to their DNA, may become so significantly altered that they will no longer be able to breed with nonenhanced humans. —Steven M. Shaker, “The Coming Robot Evolution Race,” Sep-Oct 2011, p. 22

SCIENCE AND TECHNOLOGY

Machine vision will become available in the next 5 to 15 years and grow more sophisticated over time. Its range will ultimately exceed that of the human eye. This technology will greatly enhance robotic systems’ capabilities. —James H. Irvine and Sandra Schwarzbach, “The Top 20 (Plus 5) Technologies for the World Ahead,” May-June 2011, pp. 17-18

By 2020, the world’s digital output may reach 35 zettabytes (35 trillion billion bytes). That’s enough DVDs to reach halfway to Mars. In the near future, high-speed wireless technologies will enable us to access information from almost any location at speeds approaching those of wired networks. Simultaneously, embedded networked processors and smart dust—sensor networks made up of billions, even trillions, of nodes—will be everywhere, providing real-time data streams about everything, all the time. —Richard Yonck, “Treading in the Sea of Data,” July-Aug 2011, p. 33

We’ll ward away mosquitoes safely by adopting the smell of their predators. A multidisciplinary team of researchers at the University of Haifa in Israel has identified key compounds released by mosquitoes’ predators. Synthesizing these natural chemicals and releasing them in breeding areas could offer an inexpensive, nontoxic alternative to pesticides. —Tomorrow in Brief, Nov-Dec 2010, p. 2

We will design more devices to gradually degrade back into the parts stream. In his book Shaping Things, Bruce Sterling proposed that, with the right regulatory framework and technology, it might be possible to start readdressing design decisions so that products like cell phones can decompose back into components that can be reused in next-generation devices. —Cory Doctorow, quoted in “Cory Doctorow Meets the Public,” Nov-Dec 2010, p. 22

Large digital touch-screen displays will take microscopy to the next level. Forty-six-inch or larger multitouch screens will make the act of looking at a sample through a microscope similar to the experience of using Google Maps. —Tomorrow in Brief, July-August 2011, p. 2

A space elevator could lift people (and materials) from Earth’s surface into orbit. Such an elevator would prove especially useful if a lunar or space colony is built. Once in orbit, travelers would experience a gravitational pull 560 times weaker than at Earth’s surface. People could exit the elevator and fly to the Moon, Mars, or other destinations via very-low-thrust, high-efficiency propulsion systems. —Joseph N. Pelton, “Finding Eden on the Moon,” May-June 2011, pp. 40-42

No water? No power? No problem. Cheap electricity and clean water may soon be possible for remote villages, military operations, and other places without access to these vital resources. A device using a new aluminum alloy developed by Purdue University researchers can split salty or polluted water into hydrogen and oxygen. The hydrogen feeds a portable fuel cell to supply electricity, and the steam byproduct is recaptured as pure water. —Tomorrow in Brief, Sep-Oct 2011, p. 2

WORK AND CAREERS

Journalism may soon be taken over by nonjournalists. Professionals from just about any field—law, neurology, astrophysics, investing, etc.—could be valued news writers if they complete some cross-training in journalism. As traditional news reporting jobs disappear, these cross-trained professionals will fill in the gaps and produce news and commentary on their respective fields of work. Readers will flock to them because the writers not only know how to write, but also know their subjects inside and out. —Cynthia G. Wagner, “Emerging Careers and How to Create Them,” Jan-Feb 2011, p. 32

People could become professional data collectors. Terabyters—people who produce a terabyte or more of digital data a day—would be paid generously to don high-tech data-collection gear and explore neighborhoods, shopping districts, and city centers. Their sensors would record and process all visual and sensory data about their surroundings, and companies like Google and Microsoft might pay lucrative sums for the resulting data streams to use in marketing. —Thomas Frey, “The Coming of the Terabyters: Lifelogging for a Living,” Jan-Feb 2011, p. 35

With more work done by freelancers, organizations will need full-time professionals to supervise them. Employers large and small will trim overhead to the bare minimum by keeping small cores of staff for only the most essential operations. Meanwhile, most of the nonessential work will be outsourced to freelance help. As projects come up, organizations will contact professional “talent aggregators,” who keep databases of registered work seekers whom they can call up whenever needed. —Jim Ware, “Careers for a More Personal Corporation,” Jan-Feb 2011, p. 37

WORLD AFFAIRS

Networks will increasingly become the key to positive political change. The ability to elect a lawmaker or lobby for a cause is built around our capacity to network with one another online, according to science-fiction author Cory Doctorow. This is why the issue of Internet access, and how it is controlled or restricted, is the most important free speech issue of our time. —Cory Doctorow, quoted in “Cory Doctorow Meets the Public,” Nov-Dec 2010, p. 24

Climate change threatens to displace as many as 70 million Bangladeshis. Much of Bangladesh is at or near sea level, so if the Intergovernmental Panel on Climate Change’s forecast of a seven-meter sea-level rise this century comes true, as much as 17% of the country could be submerged. That would render 60 million to 70 million Bangladeshis homeless and destroy the livelihoods of countless more. Bangladesh is investing heavily in flood and storm preparations now, but India’s diversion of major rivers shared by the two countries could still spell major trouble. —World Trends & Forecasts, Jan-Feb 2011, p. 9

The Arctic regions will be hotspots for industrial and demographic growth. Iceland, Canada, Russia, and other far-north locales could see more population growth and commercial activity than even Brazil or China. A number of factors are behind this: Surging populations of job seekers in the developing world; falling populations in the northern countries; growing global demand for oil and other resources; and melting of Arctic permafrost, which would likely hasten human immigration into, and commerce throughout, the region. —Laurence C. Smith, author of The World in 2050, reviewed by Rick Docksai, Jan-Feb 2011, p. 47

Watch out, St. Louis! The most-endangered place in America is not on the Gulf Coast or in California. St. Louis, Missouri, faces a wide variety of potential disasters, according to Forecasting International’s Owen Davies. Because the city sits on the New Madrid fault and the banks of the Mississippi River, both earthquakes and floods loom large in its future. Other threats include massive environmental pollution and the highest crime rate in the United States. —Futurists and Their Ideas, Sep-Oct 2011, p. 44

Look for surprising strategic alliances across the globe. Germany and Russia will forge stronger economic ties, while Turkey and the Arab states eye Iran more closely as a competitor. Europe’s internal economic struggles will contribute to the continent’s fading as a global power, while Brazil will exert formidable economic and military influence in Africa. —Books in Brief [review of The Next Decade by George Friedman], Sep-Oct 2011, p. 54

The Search for Global Solutions: Moving from Vision to Action

By Cynthia G. Wagner

Photos by Aaron M. Cohen

What does it take to get an idea launched or a problem solved? At the World Future Society’s 2011 conference, the answer was inspiration, collaboration, and the energy of forward-thinking people.

The Living City Challenge: Buildings That Make a Positive Impact

Green building techniques must continue to improve, evolving beyond meeting LEED certification standards, said Cascadia Green Building Council CEO Jason McLennan. (Currently, LEED Platinum represents the highest standard of environmental certification.) After all, even LEED Platinum-certified buildings have a negative impact on the environment, however greatly reduced. Buildings simply built to code represent the baseline—they are “the worst allowable by law,” he asserted, before asking, “What does ‘good’ look like? How do we move to a place that’s truly regenerative and restorative?”

McLennan, also a board member for the International Living Building Institute, then described the institute’s Living Building Challenge. In general, to meet the challenge, the project should not damage the natural environment—in fact, it should have a positive impact on the environment. For example, “living buildings” should generate a surplus of clean energy. He emphasized that energy efficiency does not mean sacrificing comfort, and he reported that there are three living building projects currently under construction in Vancouver.

McLennan then described a novel sewage-treatment plant that has met the challenge. It is actually intended as a mixed-use facility: Yoga classes are held there, where teachers “encourage people to breathe deeper.”

The “living building” represents the next phase of sustainable buildings, said McLennan’s co-presenter, architect Cindy Frewen-Wuellner. Their hope is that this transformation will happen in the next 30 to 40 years. Both seemed optimistic that the era of suburban sprawl is coming to an end.

—Aaron M. Cohen

Brain Mapping, Intelligence Augmentation, And Virtual Reality

“We will never in our lifetimes completely map the human brain,” said Edie Weiner at the start of her much buzzed-about session with Arnold Brown. (Weiner and Brown are president and chairman, respectively, of Weiner, Edrich, Brown, Inc.)

Weiner elaborated: This has less to do with science’s inability to understand the brain’s complex physiology and more to do with the difference between the brain and the concept of a mind or soul—in other words, the distinction between the physical, neurological, and chemical interplay and “what escapes that” (provided you believe that there is something immutable underlying those processes). However, research will enable us to understand the human brain a great deal more and thus gain greater insight into the human condition. For example, researchers are learning a lot about how the process of memory works and finding ways to reshape and enhance it.

Brain research and brain mapping will likely lead to improvements in the education system and the creation of new learning environments, Weiner told the audience. She believes that virtual reality and interactive gaming will become more commonplace in the years to come. In addition, overcrowded classrooms will be replaced by one-on-one mentorships conducted mostly online. “We will need guides, not teachers,” she said.

Weiner and Brown also discussed the “human–machine interface.” Brown questioned “whether the human brain is capable of dealing with a world that is becoming more complex by the day.” Intelligence augmentation (technologically enhancing the human mind) will become increasingly necessary if humans are to keep up with artificial intelligence. The looming question is: How can we augment or create intelligence if we can’t fully understand it?

—Aaron M. Cohen

Health Maintenance for Extended Life Spans

portrait of Aubrey de Grey

Metabolism causes damage on an ongoing basis, and this damage eventually causes pathology, Aubrey de Grey told attendees. It is “a side effect of being alive in the first place.”

But gerontologists aim to intervene in this complex process, focusing on lifelong maintenance. De Grey argued that repairing damage early enough to prevent the pathology that causes aging could help humans achieve a big extension of healthy life spans.

Arctic Wild Cards

Arctic sea ice may now be at its smallest extent in recorded history, according to Lawson Brigham, and this phenomenon could yield wild cards. For example, with greater opportunity for oil and gas development, Greenland may declare independence from Denmark.

Another wild card could be critical safety issues as tourism increases in tiny villages that have no infrastructure to service cruise ships.

How does an idea transform into a goal, and how does a plan inspire people to implement it? What does it take to give a movement its momentum? These were the underlying questions for the 750 futurists who met in Vancouver this past July to consider how to take the great leap of faith required for “Moving from Vision to Action.”

The future absolutely requires courage, said leadership expert Lance Secretan, author of The Spark, the Flame, and the Torch (The Secretan Center, 2010). Just as skiing down a steep slope for the first time requires faith in one’s abilities, effecting change and inspiring others to do so requires courage, whose rewards are fulfillment and accomplishment.

“It’s a myth that we can’t make change quickly,” said Secretan, “but it takes courage to let go of what’s holding us back.”

Is there danger in rushing down the unfamiliar slope of change? Of course there is. Studying the future helps us see where we’re heading. As business consultant Owen Greaves pointed out, many of our cool new technologies, like smartphones, brought risks we didn’t necessarily anticipate, such as geolocation tracking chips that could potentially reveal our whereabouts to others.

Because their impacts may be enormous, the assessment of emerging technologies is one of the key tasks of futurists—and a new mission of the U.S. Government Accountability Office (GAO). Picking up where the former Office of Technology Assessment left off, GAO has now created a permanent Center for Science, Technology, and Engineering (CSTE), reported chief scientist Timothy M. Persons.

The center has just completed a technology assessment of climate engineering, which includes such proposed projects as brightening clouds at sea, pumping liquid CO2 into rocks or aerosols into the stratosphere, and afforestation of deserts. Persons pointed out that the center’s assessments include conversations with the public to get the potential responses of those affected by such technologies. “You don’t want to leave this to the experts alone, because you would lose public trust,” he said.

Communicating the Future

An important aspect of CSTE’s work is to improve communication with the public, including the congressional leaders who, though not scientists themselves, must make decisions about these scientific and technological developments. Thus, the design of interactive animations became an important aspect of the technology assessment and communication process, Persons reported.

Why interactive animations instead of, for example, a printed report? Literacy expert Lawrence Baines of the University of Oklahoma–Norman explained that there is “evolutionary pressure to condense information and communication.” Images, he noted, are briefer than text, which is too complex for the small devices that people are increasingly using as their principal mode of communication.

Baines also observed that television viewing is increasing, along with consumption of media on cell phones. Thanks to multitasking, young people can pack 10 hours of media consumption into just seven and a half hours a day. This consumption is also interactive and social, whereas reading a book requires solitude—an activity that may seem antisocial to today’s youth.

Another approach to communicating technological developments to a new audience is exemplified in the work of Booz Allen Hamilton. “My job is to find stuff, and tell everyone about it,” said senior associate William P. Barnett Jr. But, he admitted, not everybody wants to know about it.

The challenge is to help innovators, who may only speak in the language of physics, to describe to potential investors the problems that their work may help solve. Barnett said one way to do this is to create “future environments,” or simulations of the environments that the clients will be working in, showing how the innovations will be able to fill future gaps. Such a simulation is a “great place to dream,” and these future environments are intended to “make complicated information and ideas more visual and easier to understand,” he said.

Changes and Impacts

As Baines pointed out in his session on the future of language, demographic and technological shifts are impacting each other in sometimes unsettling ways, as when a 7-year-old can’t seem to put down her multifunctional cell phone. Baines warned that the move away from the complexity of communication found in books could impair critical thought among younger generations.

On the other hand, young people are growing up in an ever-changing social and technological environment, and they are using new technologies and tools to rebuild the world as they go. The panel on Cultural Shifts among Global Youths gave a mind-boggling overview of these changes:

The “aging” and “younging” of global populations are altering the workplace, careers, and even the traditional life path from school to work to retirement, said Erica Orange, vice president of Weiner, Edrich, Brown, Inc.

For young people, multitasking with multimedia has fundamentally changed their brains and the way they process information. They are easily bored, and this boredom will increasingly make shorter-term jobs, contract work, and temping more entrenched, Orange said.

For older workers, the need to acquire new skills to remain competitive may inspire them to try new careers from scratch; internships will no longer be just for the young, according to Orange.

Globally, rapid economic development means that the concept of the “Third World” is becoming obsolete, according to Jared Weiner, a vice president of Weiner, Edrich, Brown, Inc., and a member of the World Future Society’s board of directors.

“We argue that BRIC [Brazil, Russia, India, China] is outdated. Think about Turkey and Singapore,” he advised. The challenge for all economies—and companies—is finding where the talented youth of tomorrow are and where they will want to go.

Global youth’s changing relationship with the virtual world is also driving trends toward using real names online, putting attention on reputations, and regulating more online activities, said Lisa Donchak, an enterprise sales associate for Google Enterprise. “We’re toward the end of the Wild West age of anonymity,” she observed. “Maybe the opportunity to be anonymous was a growth stage. More sites [such as Google+] are asking for your real name.”

Along with this authenticity comes “the right to be forgotten,” to erase your data footprint, Donchak noted. The European Union has been leading the way, with a “do not track” policy on cookies (data files placed on your computer by the Web sites you visit).

How Action Builds on Resilience

Dwight D. Eisenhower once said, “In preparing for battle I have always found that plans are useless, but planning is indispensable.” Donald Byrne, president and CEO of Metrix411, drew on Eisenhower’s wisdom to help illustrate a fundamental principle of futuring: the need to take action.

This point is crucially important in emergency situations, such as when deadly tornadoes struck many parts of the United States earlier this year. Byrne credited the resiliency of the community of Greensburg, Kansas, for its reaction to the 2007 tornado that destroyed the city. “In most communities, there is no organization for what really needs to be done; everybody wants to send water or ready-to-eat food,” he said. But the Lions Club did one thing that was immediately needed: It paid for funerals.

The community’s resiliency, its ability to respond, is one reason people stayed in Greensburg to rebuild rather than move on, Byrne argued. This power of community resiliency was seen again in Japan after the earthquake and tsunami of March 11, 2011.

Thanks to a cadre of young volunteers flocking to Peace Boat, a relief organization, help quickly came to the communities ravaged by the tsunami. “Kids were distributing food before even the army or Red Cross could get there,” reported Patrick Tucker, deputy editor of THE FUTURIST.

Tucker spent five months in Japan and was in Kyoto when the earthquake hit; like most other foreigners, he left the country in the following week, but he returned when he learned of Peace Boat’s relief work. “If you can keep the community together, you can rebuild faster,” he said, noting that neighbors are essential for keeping track of each other’s whereabouts. [See Tucker’s full report, “Lost and Found in Japan,” in this issue.]

“Actions” in Action

One of the best aspects of World Future Society conferences is the opportunity for futurists to share their work, providing case studies of effective actions as well as models for applying futuring principles.

Two of the world’s leading futurist training grounds again sent teams of students to the conference to present their work. Describing the Singularity University experience were teaching fellow José Luis Cordeiro and alumni Sasha Grujicic, Matthew Kern, Vjai Anma, and Alison Lewis, who presented projects ranging from sustainable clean water to automobile sharing.

Representing work done at the University of Houston, and introduced by Studies of the Future graduate program chair Peter Bishop, were Sara Robinson, who analyzed the Future of the Progressive Movement in the United States; Heather Schlegel, on the Future of Transactions and Alternate Currencies; and Emily Empel, on the Future of the Sex Industry.

One especially inspiring approach to stimulating action is “the power of the prize,” said Thomas Frey, executive director of the DaVinci Institute. Most prizes award past accomplishments, but increasingly prizes are offered as a way to stimulate innovative solutions.

“What if we could solve the world’s biggest problems through prize challenges?” Frey asked. He announced the DaVinci Institute’s Eight Grand Challenges program, in which countries would enter teams to compete for medals, as in the Olympics. The pursuit of these grand challenges would result in enormous benefits to humanity, Frey said. [Editor’s note: More on the DaVinci Institute’s grand challenges will appear in the January-February 2012 issue of THE FUTURIST.]

DIY advocate Dale Dougherty, editor of Make magazine and organizer of the Maker Faire events, led a lively session showcasing the spirit of hands-on innovation. Maker Faires and the “Maker” movement began in the mid-2000s as a way to inspire those who feel compelled to manipulate things with their own hands, who want to understand how things work—and make things work themselves.

But unlike the image of the lone “tinkerer” working in the solitude of his or her own basement, the Maker movement is about “social tinkering. … It’s physical, connecting to the digital,” Dougherty explained. “It’s about personal expression, creating, and interacting.”

Because makers tap their childlike curiosity to play with technologies, recombining them to create new innovations, the Maker movement could provide a model for education. “Give children the gift of time and space to play,” Dougherty advised. “Immersion in an activity is valuable. Why isn’t school like this? … My goal is that students would become producers of a personalized education that they invent for themselves rather than a standardized education that they consume—to consider themselves as producers, not consumers.”

When people are having fun, they are engaged, Dougherty concluded. And this engagement may be the very key to moving from vision to action.

About the Author

Cynthia G. Wagner is editor of THE FUTURIST and of the 2011 conference volume, Moving from Vision to Action, which is available from www.wfs.org/wfsbooks. Email cwagner@wfs.org.

For links to download the WorldFuture 2011 conference program (PDF) or to order audio recordings or the conference volume, please visit www.wfs.org/content/worldfuture-2011.

How the Recession Has Changed the Middle Class

By Patrick Tucker

The 2008 recession was hard on everyone, but it did not distribute its woes evenly.

Pinched: How the Great Recession Narrowed Our Futures and What We Can Do About It by Don Peck. Crown. 2011. 224 pages. $22.

In Pinched, journalist Don Peck paints a portrait of the middle class as jilted lover, nursing feelings of despair and betrayal. After doing everything right, the question this poor sop finds himself asking, over and over, like a funerary wail, is not “Why aren’t I good enough,” but the far more terrifying “Why aren’t I good enough anymore?” There is no easy rejoinder. The American Dream has simply moved on and taken a new name. Our hero is left with only the awareness that his best days have passed him by.

The 2008 recession permanently altered the lives of millions of Americans, neighborhoods, and even entire regions of the United States. Peck shows that many middle-class, middle-skill jobs that existed prior to 2008 will never return; opportunities that seemed perennial just a few years ago have permanently vanished. Labor experts such as John Challenger, writing in this magazine, have encouraged job seekers in low-growth areas to strike out for more-fertile ground. In fact, much of the advice given to the nation’s unemployed and underemployed has amounted to: Be adaptable, seek training, and move. These admonishments, while sound, are also callous. People forced by market conditions to make dramatic life adjustments are rarely thankful for the opportunity to do so.

In many respects, this current state of woe represents a culmination of trends that have been building for some time. Throughout the last 10 years, however, policy makers and financiers were able to postpone their full impact. The rapid appreciation in the housing market between 2002 and 2008 created an illusory sense of prosperity in the absence of real salary growth; salaries have budged little since the 1970s. Since the largest asset owned by most Americans is their primary residence, many people experienced an enormous, and artificial, expansion in net worth over the last decade. The losses resulting from the housing collapse will linger for a long time, affecting consumption and investment habits for years.

“Many Americans, even those who didn’t lose their jobs, lost a decade’s sense of progress. Long deferred, a decade’s disappointment has been concentrated in the past three years,” notes Peck.

Stagnant wages and vanishing jobs, compounded by the intractable housing crisis, have metastasized into a very literal paralysis. Nearly one in four Americans owes more on his or her house than that house is worth. Peck points out that, in Arizona and Florida, the number is one in two, and in Nevada, two in three. Many individuals who are underwater on their home loans simply can’t move to a better economic environment, even if they want to. They’re caught between the proverbial rock and a hard place: the mountain of debt they owe and the cold truth of their home’s actual value.

All of this has fundamentally changed the demographic makeup of America’s white-picket-fence suburbs, which now house more poor people than do the nation’s urban centers. It’s an ironic reversal. In the 1950s, suburban developments were sold as a means to escape city squalor, which was understood as a thinly veiled allusion to non-Caucasian neighbors. Half a century later, actual squalor in these neighborhoods pits frustrated homeowners against equally desperate renters.

“This isn’t the neighborhood that I moved into,” one frayed suburbanite complained to Peck. “It’s never going to recover to what it was.”

Contrast this predicament with the plight, or more accurately flight, of the nation’s moneyed elite. While the American poor are stuck in place, the country’s rich are increasingly transient, pursuing the opportunities of an interconnected world and less concerned than ever with the future of the republic. A growing number of America’s rich are entrepreneurs, as opposed to inheritors of wealth. Their business aspirations are global in scope; they hire labor in Thailand to market products to consumers in China, or vice versa. Not surprisingly, the American elite have more in common with their fellow entrepreneurs from Asia or Europe than they do with their compatriots back home.

But, Peck cautions, don’t assume that today’s wealthy are leading lives of leisure. They’re more likely to be attached to a BlackBerry than a polo mallet. Because they work so hard, many are resistant to the notion that fortune may have played the determining role in their success. They may well be more philanthropic than predecessors such as the Rockefellers and Carnegies, but they’re also more aware of the depths of human need in places like Ghana, Bangladesh, and Papua New Guinea (locations where the Gates Foundation has significant investments). The struggles of the shrinking American middle class continue to look paltry in comparison to the circumstances of the majority of the world’s inhabitants.

“If the transformation of the world economy lifts four people in China and India out of poverty and into the middle class, and meanwhile means one American drops out of the middle class, that’s not such a bad trade,” Peck quotes one CEO as saying.

Is the American middle class salvageable? Peck offers a set of balanced recommendations toward that end. First, he argues for a return to the tax rates of a few decades ago, when the wealthy contributed closer to 50% of their income to government coffers, as opposed to the 35% they pay today. Peck dismisses the argument that increasing the tax burden on the rich would hurt the current recovery. Trickle-down economics is patently unviable in an environment where the wealthy are few and do a greater portion of their investing and consuming abroad.

Lawmakers may have overreached in their regulatory response to the 2008 market collapse, says Peck, so lessening regulations might help spur business. He also advocates a reconsideration of the nation’s current entitlement commitments, which, while popular among baby boomers, are unsustainable. Above all, only real government investment in research and development will put the country back on the road to prosperity, he argues.

Peck currently serves as a features editor for The Atlantic, and readers who have followed that magazine’s coverage of the recession over the past two years and seen his cover story will find some aspects of this book familiar. But Pinched provides much original insight and should be considered a natural heir to Riesman, Glazer, and Denney’s The Lonely Crowd and Thorstein Veblen’s The Theory of the Leisure Class. Pinched is an excellent chronicle of the Great Recession’s hidden and long-term effects on the American psyche. In its wide scope and clear focus, it may go on to be the seminal book on this period in the country’s history.

About the Reviewer

Patrick Tucker is the deputy editor of THE FUTURIST magazine and director of communications of the World Future Society.

Tomorrow in Brief

Metal Theft on the Rise?

As the value of metals increases, so does the likelihood of theft. But it isn’t just the local thugs ripping gold chains off our necks that we’ll have to worry about.

Metal theft may become one of the biggest criminal activities of the twenty-first century, warns University of Indianapolis criminologist Kevin Whiteacre. Targets may include construction sites, vehicle parts, plumbing and electrical equipment, and public infrastructure, where thieves see value not just in the manufactured goods themselves but also in their component metals.

“This has redefined theft to me,” says Whiteacre. “You’re no longer stealing a specific item for its value as an item. You’re stealing it for its constituent parts.” Whiteacre has created a Web site, Metaltheft.net, as a repository of news and research on the phenomenon.

Source: University of Indianapolis, www.uindy.edu.

Virtual Lab Rats

The use of laboratory animals has long helped researchers study complex systems, such as the interplay of genetics and environmental factors in disease formation. But these animals need to be fed and housed.

Now, researchers may use computer models with integrated data sets to simulate animal physiology. A project to create a “virtual physiological rat” is under way at the Medical College of Wisconsin in Milwaukee. The project will allow computational biologist Daniel Beard and his team to predict the interaction of a variety of factors within an entire physiological system.

While it won’t eliminate the need for laboratory animals entirely, the project aims to make more efficient use of animal research, to improve understanding of disease, and to advance the goal of creating a virtual physiological human.

Source: National Institute of General Medical Sciences, National Institutes of Health, www.nigms.nih.gov.

Solar Ivy for Walls

The ivy-covered walls adorning university buildings may soon be powering those buildings as well.

Solar Ivy, developed by Sustainably Minded Interactive Technology in New York, is made of small photovoltaic panels that can be created in different shapes and colors to suit the architecture.

Pioneering the application of Solar Ivy is the University of Utah, which used funds raised by students to install the panels in late 2011. The goal is to generate enough electricity for the ivy-covered building to offset the amount of power it buys from the utility company.

Sources: University of Utah, www.utah.edu.

Solar Ivy, www.solarivy.com.

Robotic Caregivers

Need a lift up from bed to chair? The task is awkward and difficult for most humans, and sometimes results in caregivers wrenching their backs. Not so for robots.

As the population of older people needing nursing care begins to soar in Japan and other graying societies, robots are being developed to provide more of the necessary physical support. A single patient may need as many as 40 lifts a day.

Japan’s latest RIBA II (Robot for Interactive Body Assistance), developed by researchers at RIKEN and Tokai Rubber Industries, has improved functionality, more power, and greater sensitivity. Sheets of sensors lining the robot’s arms and chest allow it to detect a patient’s weight accurately, and thus provide gentler and safer lifts.

Source: RIKEN, www.riken.jp.

Aquariums as Farms

Future homeowners, college campuses, and other nontraditional “farmers” may soon be growing their own fish and vegetables while recycling waste.

An experimental food production system is being tested by SUNY ecological engineering graduate student Michael Amadori. The system is a variation on aquaponics (combining traditional aquaculture and hydroponic farming) that incorporates the use of post-consumer food waste.

Instead of being composted (or thrown out), the wasted food is fed to the fish. Then, the fish waste is used for growing vegetables. The goal is to reduce the amount of food waste and lower the cost of raising fish.

Source: State University of New York College of Environmental Science and Forestry, www.esf.edu.

Future Active

  • Observing the Next 30 Years of Climate Change
  • Commercializing Research from the Public Sector

Observing the Next 30 Years of Climate Change

Construction is scheduled to begin in late 2011 on the first long-term continental-scale ecological monitoring system in the United States. The National Ecological Observatory Network (NEON), consisting of 62 sites across the country, aims to help researchers better understand and forecast climate change’s effects and patterns over the next 30 years.

In addition to monitoring environmental change, NEON will also gauge land use and the effects of invasive species on different regions by gathering and analyzing data on soil, water, and the atmosphere. Many of the towers and platforms used to gather data will be mobile and transportable, and satellites will help collect information. The National Science Foundation (NSF) is funding the project, which is estimated to cost $434 million.

“NEON’s early observations will provide the continental baseline we need to understand and forecast the likely environmental changes we could see over the coming decades,” says NEON chief science officer Dave Schimel. This long-term climate-research project is intended to help current—and future—researchers spot emerging ecological trends. Such research could enable more successful planning and response, as well as better-informed policy making.

The NSF emphasizes that NEON’s networked infrastructure will employ existing state-of-the-art technologies; no new technology is being developed specifically for the project.

NEON is scheduled to begin operating as soon as 2012 and to be fully functioning by 2017. Its data will be made available online, in something approximating real time, to anyone interested.

Sources: National Science Foundation, www.nsf.gov.
NEON Inc., www.neoninc.org.

Commercializing Research from the Public Sector

Researchers in the public sector who develop innovative technology are often less effective at commercializing it. Yet this step is important: Transferring technological innovation from the public to the private sector can provide additional benefit to a society by boosting its economy.

A study conducted by the Institute for Defense Analyses Science and Technology Policy Institute and sponsored by the U.S. Department of Commerce examines the obstacles holding back commercialization and searches for more effective strategies to move innovative products and processes from the government lab to the marketplace. The report, titled “Technology Transfer and Commercialization Landscape of the Federal Laboratories,” relies primarily on interviews conducted with individuals involved in technology transfer at federal research labs and agencies.

The interviews revealed nine “mutually influential factors” that “affect the speed and dissemination of technologies from the laboratories,” according to the report. These include government regulations that can delay the process or otherwise “make it difficult for federal laboratories and industry to interact,” too much federal or congressional oversight that “can have the unintended consequence of encouraging a risk-averse culture towards technology transfer,” and lab directors who do not strongly prioritize the commercialization process.

There may also be a lack of knowledge and skills to carry out that step of the process. Without proper incentives in place, it is unlikely that lab directors and others in similar positions will be motivated to develop such abilities. Recommendations for improving incentives include creating awards for excellence in that area and increasing royalty amounts.

The report further points out that efforts are often not as organized or coordinated as they need to be. Clearly defined missions, goals, and strategies for commercialization are necessary in order to improve the process.

Source: National Institute of Standards and Technology, www.nist.gov.

As Tweeted: WordBuzz: Jobsolescence

By Cynthia G. Wagner

As the U.S. space shuttle program ended, Twitterers pondered the future for a special class of professionals.

We recently sent out a call on Twitter for “WordBuzz” suggestions and received a number of interesting neologisms. We selected jobsolescence, one of several idea-forward terms submitted by foresight analyst Richard Yonck (@ryonck) of Seattle, Washington.

Joining us in the conversation was Caroline Halton (@GloHalton), a communications strategist and trainer based in Johannesburg, South Africa.

@ryonck: @WorldFutureSoc “Jobsolescence”: The state of being for functions, positions or fields that have disappeared. #wordbuzz

@GloHalton: any detail on specifically which functions, positions and fields have actually succumbed to ‘Jobsolescence’?

@ryonck: I’d say any job massively undermined by new tech, e.g., elevator operator. May be a few left but not many

@WorldFutureSoc: Much tech-driven “jobsolescence” involves work we don’t want to do, but not everyone is creative enough to find better

@ryonck: True, but there are many jobs people don’t want to do, but have to. Times of transition can cause hardship

@GloHalton: And ‘jobsolescence’ as regards astronauts in the post shuttle era? Taking jobs as tour operators with Virgin?

@ryonck: On the contrary, if private space industry grows, astrojobs may be preparing for lift-off. ;-)

@WorldFutureSoc: I’d love to see more suggestions: Jobs for astronauts in post-shuttle era (cc @NASA, @neiltyson) #jobsolescence

@GloHalton: Houston shedding flight controllers, pad technicians, shuttle parts workers so ‘jobless’ astronauts in good company.

@WorldFutureSoc: Ideas needed! 101 uses for a used - er, unemployed - astronaut. #jobsolescence

We would like to credit Yonck for coining the term, but upon further research (i.e., Googling), we discovered a prior claim on jobsolescence.

Proving that it’s as much fun to do research as it is to just make things up, we watched a video episode about “Jobsolescence” by Double Espresso, independent filmmakers and word-players who credit producer Norma Vega for the concept.

In this short comedy video, two guys, Emilio and Vincenzo, assert that poverty translates into stupidity. They cite the (fictitious) book Masters and Slaves of the New World Economy by Guillermo Pinkerton, which outlines a hidden order and the phenomenon of jobsolescence—the “obsoletion of employment.”

The guys discuss the success of that Facebook kid (i.e., Mark Zuckerberg), whose success story offers lessons for surviving jobsolescence in this crisis of “the Repression.” He became rich by creating a service for others, and you can do that, too, “by creating a job that has long eluded you, so you too can become a punk genius billionaire.”

Source: Espressode 6, “Jobsolescence,” by Double Espresso. Written by Michael Arturo, produced by Norma Vega, starring Manuel Bermúdez and Michael Arturo. View online at www.clicker.com/web/double-espresso-web-series/.

Follow the World Future Society on Twitter at http://twitter.com/WorldFutureSoc. Also see THE FUTURIST magazine’s official Twitter page, @TheYear2030.

Virtual Games Bring Currency to Real Life

Young entrepreneur Brian Wong sees mobile games invading real life.

Mobile games playable on smartphones, tablet PCs, and other Internet-connected devices are projected to surpass $11 billion in annual revenue by 2014, up from $8 billion in 2011, according to a report by Juniper Research. Twenty-year-old software guru Brian Wong says that the mobile game space will advance faster than many are predicting.

“There are still a few billion people on the planet that have not touched a mobile device or game. Imagine what happens when they come online,” Wong said at WorldFuture 2011, the annual conference of the World Future Society. Wong was on hand to discuss his new company, Kiip, which gives players real rewards for their mobile video-game achievements. When players win a level or reach a particularly high score, they can access real-world rewards, everything from coupons for cappuccinos to discounts on clothes and even cruises.
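
The article doesn’t detail Kiip’s integration, but the pattern it describes is simple: the game reports an achievement, and a rewards service decides whether to attach a real-world offer. Below is a minimal, purely hypothetical sketch in Python; the function, catalog names, and reward tiers are invented for illustration and are not Kiip’s actual API.

    # Hypothetical sketch of an achievement-to-reward flow.
    # None of these names come from Kiip; they are invented for illustration.
    import random

    REWARD_CATALOG = [
        {"offer": "free cappuccino coupon", "min_score": 1000},
        {"offer": "10% off clothing", "min_score": 5000},
        {"offer": "cruise discount", "min_score": 50000},
    ]

    def on_achievement(player_id, score):
        """Called when a player clears a level or posts a high score.
        Returns a real-world offer the player qualifies for, or None."""
        eligible = [r for r in REWARD_CATALOG if score >= r["min_score"]]
        return random.choice(eligible) if eligible else None

    reward = on_achievement("player-42", 7200)
    if reward:
        print("Achievement unlocked:", reward["offer"])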

“We’re trying to leverage the mass amount of people who are engaging [in these games] to tie with marketing and advertising and make the game emotionally relevant,” said Wong.

One potential catalyst for runaway growth in mobile-game revenue is the advent of a mobile-payment system, which would allow people to make purchases directly from their phones while immersed in video-game play.

“Right now we’re exclusively paying through plastic, bank accounts, cash, and that’s about it,” Wong said. “But soon we’ll get the ability to use our phones to make payments.… Once that happens, [users] can pay without feeling like [they’re] paying.”

Wong also sees an enormous rise in the value of the virtual economy (VE), which roughly refers to exchanges of virtual goods, links, and digital labor such as tweeting. An April 2011 report commissioned by the World Bank valued the virtual economy at $3 billion at the end of 2009. Wong predicts the VE could grow to $300 billion in the next 10 years.
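
For scale, growing from $3 billion to $300 billion over 10 years implies a compound annual growth rate of roughly 58%. A quick back-of-envelope check, using only the figures quoted above:

    # Implied compound annual growth rate for Wong's virtual-economy forecast:
    # $3 billion (end of 2009, per the World Bank report) to $300 billion in 10 years.
    start, end, years = 3e9, 300e9, 10
    cagr = (end / start) ** (1 / years) - 1
    print("Implied growth rate: {:.1%} per year".format(cagr))  # ~58.5% per year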

“These [virtual] goods can be reproduced into infinity with no physical barrier,” he told THE FUTURIST. “The challenge is to use marketing, scarcity, and exclusivity to make the goods meaningful and valuable.” Of Kiip, he said, “We’re building that.”

Wong is particularly sanguine about the potential of new currency systems built around social network platforms. One example is Facebook credits, a system that allows users to buy virtual goods on all the games across Facebook, such as the popular Mafia Wars, Evony, and Farmville.

“The World Bank report leaves out the companies that have the power to consolidate virtual value, like Facebook,” said Wong. “The dollar and euro are yesterday’s news. What happens with Facebook credits or other credit systems? That’s the fascinating question.”—Patrick Tucker

Sources: Brian Wong (interview at WorldFuture 2011), http://kiip.me.

Knowledge Map of the Virtual Economy by Vili Lehdonvirta and Mirko Ernkvist (World Bank publications, 2011), Information for Development Program, www.infodev.org.

The Smell of Future Video

Scent transmission could add another layer to digital and streaming broadcasts.

Virtual-reality enthusiasts have long argued that a truly immersive, multisensory entertainment experience needs to fully engage the senses: sight, sound, and smell.

Researchers at the University of California, San Diego (UCSD), in collaboration with the Samsung Advanced Institute of Technology in Korea, are working to move such experiences a step in that direction by adding smells to movies, television shows, advertisements, video games, and more. They have developed a new incarnation of an idea that arguably dates back to the earliest days of the silent film era and was showcased at the 1939-40 New York World’s Fair as Smell-O-Vision—though perhaps a more accurate term for the concept would be telesmell.

An artificial scent-delivery process raises all kinds of questions. For instance, would telesmell add another dimension to works being presented or would it be a superficial distraction that runs counter to the creators’ intentions? Would it irreversibly alter the entertainment landscape in a positive or negative way? Would it become the “next big thing” or a short-lived fad? And can it be done in a way that’s inexpensive and safe?

Up until this point, the answer to that last question has been up in the air, so to speak. Previous systems were bulky, slow, and crude. They ran out of scents quickly or were otherwise unable to reproduce them consistently, allowed very little control over the amount and intensity of the scents released, and left lingering odors that intermingled like cheap perfumes in a crowded subway car.

Home entertainment centers are also getting flatter and thinner, and watching videos on portable electronic devices is becoming more commonplace. Therefore, a viable telesmell system would have to be a small, compact, nonmechanical device compatible with a wide range of hardware, from gaming consoles to smartphones, according to the researchers at UCSD and Samsung. Thousands of scents can potentially be generated on command in such a device, and it would be relatively inexpensive, they believe.

The change in the viewing experience would be pronounced. “Instantaneously generated fragrances or odors would match the scene shown on a TV or cell phone,” says Sungho Jin, UCSD Jacobs School of Engineering professor of materials science. He gives the example of characters onscreen eating a pizza. “The viewer smells pizza coming from a TV or cell phone.”

Jin tells THE FUTURIST, “The image on the screen should be synchronized with electrically triggered odor release from the chamber array attached to the TV or cell phone.” He compares this process to the way in which sound was added to movies at the end of the silent movie era.

In the proposed system, which the researchers refer to as the X–Y matrix odor-releasing system, the scents are stored in a container made from a silicone-based polymer called polydimethylsiloxane (PDMS), which has been used to time drug release in patients and has the capacity to act as an on/off switch. A rubberlike substance, PDMS is optically clear, nontoxic, nonflammable, and found in everything from contact lenses to processed foods. It is stable over a wide range of temperatures, and it protects its contents from contamination or from leaking out.

An electrical current heats the liquid odor solution inside the container, creating enough pressure to push open a tiny hole at the top, just long enough to release the built-up gas. It seals back elastically and stores the solution until the next cycle. The polymer container is long-lasting, and when scent solutions start running low, they could be replaced.

The system features 10,000 odors preserved in aqueous solutions, each in tiny PDMS containers. The liquids are heated electrically via thin metal wires laid out on a 100x100 cell matrix, which greatly reduces the system’s bulkiness. Currently, the smells can be sensed only up to 30 centimeters (about 1 foot) away.
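
The wiring economy of such a matrix is worth spelling out: addressing each of the 10,000 chambers individually would require 10,000 driver lines, whereas a 100x100 grid needs only 100 row wires plus 100 column wires, with heat building only in the chamber where the two energized wires cross. Here is a minimal sketch of that addressing arithmetic, assuming (purely for illustration, since the article doesn’t specify the layout) simple row-major numbering of the chambers:

    # Sketch of X-Y matrix addressing for a 100x100 odor-chamber grid.
    # Row-major chamber numbering is an assumption made for illustration;
    # the article does not specify the actual layout.
    ROWS, COLS = 100, 100

    def select_chamber(odor_index):
        """Return the (row wire, column wire) to energize for one chamber.
        Heat builds only where the two driven wires intersect."""
        if not 0 <= odor_index < ROWS * COLS:
            raise ValueError("odor index out of range")
        return divmod(odor_index, COLS)

    print(select_chamber(4237))                                 # -> (42, 37)
    print(ROWS + COLS, "driver lines instead of", ROWS * COLS)  # 200 vs 10000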

The researchers are also looking at other ways to improve their prototype. “Such a system should not be too expensive. The price would depend on how many devices are made and sold,” according to Jin. “It may take several years before such a system becomes commercialized.”

There are other potential applications for this technology, aside from entertainment and advertising. One possible field that might benefit is telemedicine.

“One can imagine vapor-based therapeutic drugs in the future (rather than the current solid or liquid-based drugs), which physicians can remotely release,” Jin says. Patients would breathe in the medicine through their nose, absorbing it into their system.

Jin highlights a possible security application, as well: alarm systems. “For example, if you want to repel a burglar at home or at a … government lab, you could design the alarm system in such a way that … you could trigger a severe skunk smell that the typical person could not withstand.”

This brings to mind potential security threats. For example, terrorists could create panic in a public space by transmitting a scent that replicates mustard gas or another dangerous chemical. However, Jin believes such an incident is unlikely.

“Odors or poison gases cannot be transmitted through phone lines or Internet lines,” he says. “But if a terrorist plants an array of different poison gases [using a similar technological method] in a subway station, the selected poison could be released and stopped in a controlled way by remote electronic signals at selected time intervals. [This] would be an advanced version compared to what the terrorist might be able to do today—in other words, just activate a switch and release one type of gas uncontrollably.” While such a system would be more sophisticated, there are currently easier ways for criminals to create panic in the streets.

When and if telesmell becomes commonplace, media consumption will likely be the arena most profoundly affected—and new uses for the technology will probably continue to be discovered as well.—Aaron M. Cohen

Sources: University of California, San Diego, Jacobs School of Engineering, www.jacobsschool.ucsd.edu.

Sungho Jin, UCSD Jacobs School of Engineering professor of materials science (e-mail interview).

Unwasted Energy

Physicists seek ways to harvest “junk” energy in the environment.

From the vibrations filling the air when jets take off to the waves generated by radio and television transmitters, our environment is full of largely wasted energy. Now, researchers are seeking ways to capture that energy and turn it into useful sources of electricity.

One of the challenges is that communications devices transmit energy at different frequency ranges, so any device used to harvest this energy needs to home in on the right band in order to capture it. (Currently, scavenging devices can work in ranges from FM radio to radar.) The energy then needs to be converted from AC to DC and stored in capacitors and batteries.
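
To see why band selection and distance matter so much, consider the standard Friis transmission equation, which estimates the power available at a receiving antenna. The sketch below is a back-of-envelope calculation only; the transmitter power, antenna gains, frequency, and distance are illustrative assumptions, not figures from the Georgia Tech work.

    # Back-of-envelope estimate of power available from an ambient transmitter,
    # using the Friis transmission equation. All numbers are illustrative
    # assumptions, not figures from the research described here.
    import math

    def friis_received_power(p_tx_w, gain_tx, gain_rx, freq_hz, dist_m):
        """P_r = P_t * G_t * G_r * (wavelength / (4 * pi * d))**2, in watts."""
        wavelength = 3e8 / freq_hz  # speed of light / frequency
        return p_tx_w * gain_tx * gain_rx * (wavelength / (4 * math.pi * dist_m)) ** 2

    # Example: a 500 kW UHF TV transmitter at 600 MHz, 5 km away, unity-gain antennas.
    p_rx = friis_received_power(500e3, 1.0, 1.0, 600e6, 5000)
    print("{:.1f} microwatts, before rectification losses".format(p_rx * 1e6))

Microwatt-level results like this are why harvested ambient power suits low-duty-cycle loads such as sensors and RFID tags rather than larger devices.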

At Georgia Tech, a team led by electrical and computer engineering professor Manos Tentzeris has developed a rectifying antenna that converts ambient microwave energy to DC power. The gathered power could be used for wireless sensors, RFID tags, and other monitoring tasks.

“There is a large amount of electromagnetic energy all around us, but nobody has been able to tap into it,” says Tentzeris. “We are using an ultra-wideband antenna that lets us exploit a variety of signals in different frequency ranges, giving us greatly increased power-gathering capability.”

Tentzeris’s team is also taking advantage of new 3-D inkjet printing technology to build sensors, antennas, and energy-scavenging devices on paper or flexible polymers.

Another promising source of “junk” energy is the vibrations produced on roads and airport runways.

At the University at Buffalo, physicist Surajit Sen and his colleagues have taken a mathematical approach to studying energy exchange between particles. They discovered that altering the surface area of adjacent particles can change the way energy moves, making it possible to control how the energy is channeled.

“We could have chips that take energy from road vibrations, runway noise from airports—energy that we are not able to make use of very well—and convert it into pulses, packets of electrical energy, that become useful power,” says Sen. “You give me noise, I give you organized bundles.” —Cynthia G. Wagner

Sources: Georgia Tech, http://gtresearchnews.gatech.edu. University at Buffalo, www.buffalo.edu.

Connecting People to Their Governments

Mobile phones may become valuable tools for empowering the disadvantaged.

The world’s less-affluent populations cannot all afford personal computers, but mobile phones are much more within their financial reach. That’s why Nicol Turner-Lee, vice president and director of the Media and Technology Institute at the Joint Center for Political and Economic Studies, looks to mobile phones and other handheld devices for Internet access as a great opportunity for empowering disadvantaged communities and ultimately enhancing democracy.

“Cell phones present a lower barrier to entry to underrepresented groups like low-income minority or elderly who need that constant contact. It’s an easier modus operandi for these communities,” says Turner-Lee.

Low-income adults and young people are increasingly using mobile devices to conduct banking, find jobs, and access medical help, Turner-Lee notes. She wants government to ensure that mobile transactions are secure, and she wants government agencies to make more information about their own operations Web-accessible.

“I think it’s a great opportunity with devices to ensure transparency and ensure the ability of citizens to access information in real time,” she says.

In February 2011, Turner-Lee co-authored a report on the future of “e-governance” with Jon Gant, visiting resident fellow of the Media and Technology Institute at the Joint Center for Political and Economic Studies. The report calls out the proliferation of mobile devices as a prime medium for a government to communicate easily and in real time with any of its citizens, whether they own computers or not.

“With the proliferation of mobile devices, especially cell and smartphones, governments can gain easy and immediate access to consumers, especially those that do not own a computer, and widen their distribution of significant data,” the authors write.

However, the report also expresses concern that not enough low-income and undereducated Americans have gained Internet access. The authors recommend education campaigns to encourage more disadvantaged adults to obtain Internet access, such as by showing how using a computer can make it easier for a citizen to interact with Medicare or to navigate Medicaid.

“The outcomes of open government will be the most relevant when they not only reduce the digital disparities that maintain a degraded quality of life for many Americans, but also offer a road to opportunity for these vulnerable groups,” the report states. “In the end, cities can begin to see healthier, safer, and more viable communities as a result of deeper engagement from all citizens.”

World Bank analysts also see much poverty-relieving potential in cell-phone usage. Laurent Besancon, senior regulatory specialist in the World Bank’s Information and Communication Technologies Division, says that India gained its first 3G (Web-accessible) phone services in 2009 and that, in that short time span, the number of 3G connections in India has already surpassed the number of fixed broadband connections.

“When we look at the future and we see the cost of platforms decreasing, we see an enormous take-up with smartphones,” says Besancon, adding that he expects Afghanistan to get its first 3G phone services sometime in 2012.

Afghans today are already enhancing their interactions with public services through conventional, non-Web mobile phones, according to Siddhartha Raja, a World Bank information and communications technologies specialist. For example, farmers in remote areas can call in and find market prices in major markets across the country. In the near future, residents of rural areas where doctors are sparse will be able to search through their phone’s on-screen directories to find doctors living in cities. Raja expects this to assist in cutting maternal mortality rates.

“The mobile phone might be the first regular interaction with the state that they would ever have,” says Raja.

Meanwhile, in India’s Kerala state, government agencies have made many transactions available over the phone, such as license and registration renewals, voter-identification checks, and locating the correct polling station to cast a ballot. Raja anticipates more citizen-to-government transactions taking place via mobile phones as phone services and Internet services continue to expand.

“This is really something that is allowing anyone with a mobile phone—whether they are poor or rich—to get access to those services that were previously difficult to reach,” says Raja. “In a country like India, the question is how to expand and improve the connections to government, whereas in Afghanistan, the question is how to get citizens connected to their government in the first place.”—Rick Docksai

Sources: Nicol Turner-Lee (interview), Media and Technology Institute, www.jointcenter.org/institutes/media-and-technology. See also “Government Transparency: Six Strategies for More Open and Participatory Government” by Jon Gant and Nicol Turner-Lee, Aspen Institute white paper, www.aspeninstitute.org.

Laurent Besancon (interview), World Bank, www.worldbank.org.

Siddhartha Raja (interview), www.worldbank.org.

Fighting AIDS through Genome Editing

A new treatment might genetically adapt us to resist HIV.

The human immunodeficiency virus that causes AIDS keeps evolving in the face of new drugs. But a new “genome editing” treatment might enable humans to evolve to resist HIV. The treatment uses enzymes called zinc-finger nucleases (ZFNs) to remove problematic genes, such as the one that makes a person susceptible to HIV.

“It’s about giving patients the tools to suppress the HIV virus, to keep the virus count to a low level where it won’t do them any harm,” says Paula Cannon, a UCLA microbiologist and immunologist who is developing a ZFN therapy.

HIV destroys T cells, the blood cells that combat viruses. According to Cannon, the T cells’ weak link is a gene called CCR5, which produces a surface protein that HIV uses to enter the cell. If a T cell lacks CCR5, HIV cannot harm it.

Cannon and her colleagues applied ZFN to human bone-marrow cells. Bone marrow is where all blood cells, including T cells, are manufactured. The ZFNs latched onto the cells’ genomes and removed the coding for CCR5.

Then the researchers injected these modified stem cells into baby mice. The stem cells engrafted into the mice’s bone marrow and began producing blood cells.

When the mice reached adulthood, the researchers infected them with HIV. At first, blood samples from the mice exhibited high viral counts. About 12 weeks later, Cannon and her team drew blood again and could no longer detect any viruses. The mice’s marrow cells were making CCR5-negative T cells that were withstanding the virus.

“I think of this as a therapy that will not necessarily completely remove the HIV from their body, but it gives them an HIV-proof immune system so that HIV won’t cause the harm that it normally does,” says Cannon.

She is now trying her ZFN therapy at City of Hope, a clinical research center in the Los Angeles area, on patients who are HIV-positive and have lymphoma. She chose them because they typically undergo chemotherapy for the lymphoma, and prior to chemo, doctors remove some of their bone-marrow cells to protect them from the chemicals. They reinsert the cells once chemotherapy is complete. Cannon will apply ZFN to the cells before reinsertion.

“Because the AIDS lymphoma patients already have these cells taken out and put back in them, it seems like a good place to start. We’re piggybacking on this procedure,” she says.

She would not be the first to try ZFN therapy on people. Sangamo Biosciences, a pharmaceutical company, conducted human trials in 2011 on a ZFN therapy that isolates, treats, and reinserts T cells. Philip Gregory, Sangamo’s chief scientific officer, says that most patients exhibited higher numbers of T cells six weeks post-treatment. The modified T cells were replicating.

“We actually expand the number of T cells from what we take out of the body,” says Gregory. “They survive, whereas the cells that express CCR5 are continually killed by the HIV infection.”

According to Gregory, the increased T cell concentrations are significant because, while antiretroviral drugs suppress the virus, they cannot restore an immune system. ZFN treatments give patients their immune systems back and might even enable them to wean off their antiretroviral medications.

“If you can protect these cells from infection, you can halt the infection—and potentially arm the patient with cells that can suppress the infection without drugs,” says Gregory. “It’s the first step to controlling the virus in the absence of medication.”

Either ZFN method may be more reliable than a hypothetical AIDS vaccine or antiviral drug, according to Carl June, a University of Pennsylvania pathologist who is working with Sangamo researchers. June says that HIV mutates repeatedly, so a drug that attacks the virus directly will not work for very long. Changes to the patient’s cells, however, could block even mutated HIV pathogens.

“If you can target a patient’s cellular protein rather than a virus, you’re much better off on a long-term factor. It would take a very big change in the virus to overcome it,” says June.

ZFN-based genetic treatments could also work against other genetically inherited diseases, June adds, such as sickle-cell anemia and immunodeficiency diseases. Doctors now treat those conditions with bone-marrow transplants, but patients have to take medications to stop their bodies from attacking the transplanted tissue. ZFN treatment would require no such medications.

“ZFN would be much lower toxicity, since you’re using the patient’s own cells,” June explains.—Rick Docksai

Sources: Paula Cannon, UCLA, www.ucla.edu.

Philip Gregory, Sangamo Biosciences, www.sangamo.com.

Carl June, University of Pennsylvania, www.upenn.edu.

September-October 2011, Vol. 45, No. 5

  • The Coming Robot Evolution Race
  • Thank You Very Much, Mr. Roboto
  • The Accelerating Techno-Human Future
  • Exploring New Energy Alternatives
  • Five Principles of Futuring as Applied History

The Coming Robot Evolution Race

By Steven M. Shaker

Homo sapiens may have “won” the evolutionary race to perfect humankind, but artificial intelligence and robotics will evolve faster and farther. Rather than compete with them, we may do well to make them our allies and co-evolve, suggests a technology trend analyst.

Some people believe that humanity’s evolutionary advance into the future is driven by how our genetic pool responds and adapts to climate change and to cultural and societal dynamics. These external factors contributed to how we evolved in the past and became human. Extending that same evolutionary view forward a few hundred million years, we arrive at a comic vision of our collective future: creatures with huge foreheads to house expanded cranial capacity and small bodies shrunken by the lack of manual labor.

Most futurists, however, realize that we now have the means to shape and influence our own evolution and cause substantial change within periods spanning only hundreds or thousands of years. The interplay between our ability to map and manipulate our own DNA and our ability to integrate mechanical mechanisms into our own physiology is driving this evolutionary adaptation. We will adapt our DNA to more readily accept enhancements from nanotechnology and other bionic devices, and we’ll engineer those devices to sync up with our DNA improvisations. As a result, humanity’s evolutionary momentum will spiral faster and faster. Fashion, self-image, and social bonding will influence the “look and feel” of these enhancements as much as utility will. So, hopefully, humans won’t come to resemble Star Trek’s Borg, except for those of us who make an aesthetic choice to do so.

Writers such as Joel Garreau, author of Radical Evolution (Doubleday, 2005), have suggested that accelerating technology could lead to an evolutionary bifurcation between the haves and have-nots. Economic, religious, philosophical, and cultural views may prevent some geographic or demographic groups from participating in actions advancing their self-evolution.

The masses of humanity may not be able to afford such enhancements for themselves or their offspring. Those who can obtain genetic and artificial organ replacements may live longer and healthier lives, and thus will be more likely to survive and reproduce. It is possible that, over time (that is, over periods far shorter than natural evolution requires), humans who augment and alter their genetic code will come to differ genetically from those who do not by enough to prevent interbreeding. That divergence would amount to the creation of a separate new species.

Now, a new competitor is also emerging on the scene. This one is all artificial, with no flesh or DNA. The arrival and evolution of humanoid robots competing against cyborgs and those humans who have resisted change may be reminiscent of the competition between Homo sapiens, Neanderthals, Homo erectus, and the “hobbit” people of the Indonesian island of Flores.

Competition in Robotic Evolution

Homo sapiens chauvinists like to think we were the fittest for survival and outcompeted the other hominids. We did have some fine competitive traits, but our success also owed something to luck.

There were two points when Homo sapiens almost went extinct. Between 195,000 and 123,000 years ago, Earth was in the middle of a glacial phase, and the Homo sapiens population is estimated to have fallen from about 10,000 individuals to as few as 600. Approximately 70,000 years ago, drought may have shrunk the human population to just 2,000. That trough, however, was soon followed by the “flight out of Africa,” which led to a rapid expansion of humankind in both geography and numbers. What an exciting and competitive ancient world Homo sapiens inhabited! Machine evolution will be both more exciting and far more rapid.

Certainly, machinery endowed with artificial intelligence does not have to be robotic; it may, like HAL in 2001: A Space Odyssey, reside within a computer’s memory core or be part of a networked set of computers. Robots do not need to be humanoid like the Asimo robot developed by Honda. They can be wheeled or tracked unmanned vehicles like Stanley, the self-driving car that completed the 2005 DARPA Grand Challenge race. They could have multiple legs, like Boston Dynamics’ famous BigDog robot.

There are far better forms for robots than “human,” depending on what the robot is designed to do. But robots that are designed to perform multiple chores previously done by humans—from throwing out the garbage to walking the dog to repairing a satellite—will likely be humanoid in nature. These humanoids would be our most immediate competitors.

Accelerating Robotic Evolution

Some scientists and science commentators have expressed skepticism that sentience could ever be created in a machine. They note with impatience that humanlike AI has not yet been achieved, even though researchers have been aggressively pursuing artificial intelligence for decades.

Others disagree. Hans Moravec, the renowned roboticist at Carnegie Mellon University and author of Mind Children (Harvard, 1990), predicts that robots will surpass human intelligence by 2030, will develop humanlike consciousness, will be aware of the world and social interactions, and will gain the ability to replicate themselves and pace their own evolution. Physicist Michio Kaku, author of Physics of the Future (Doubleday, 2011), predicts that helpful robots performing the role of butlers and maids will be available by the year 2100. He is unsure how intelligent they will be, but they will have the capacity to mimic all sorts of human behavior.

Whether Moravec or Kaku is off by a decade or two, or even by several hundred years, hardly matters when compared with the glacial pace of natural evolution. In his 2000 paper “Robots, Re-Evolving Mind,” Moravec compares the evolution of intelligence in the natural world with the progress occurring in the field of information technology.

The evolution of natural intelligence starts with wormlike animals, possessing a few hundred neurons, more than 570 million years ago. Very primitive fish that appeared 470 million years ago had about 100,000 neurons. One hundred million years later, amphibians with a few million neurons emerged from the swamps. One hundred fifty million years after that, the first small mammals appeared, with brain capacities of several hundred million neurons. Their bigger co-inhabitants at the time, the dinosaurs, had brains with several billion neurons.

After the extinction of the dinosaurs 65 million years ago, mammalian brains also reached sizes of several billion neurons. The first hominids of about 30 million years ago had brains of 20 billion neurons. You and I, and our contemporary human colleagues, have brains operating with approximately 100 billion neurons.

Compare this to the artificial intelligence evolutionary track beginning with the first electromechanical computers built around 1940, which had a few hundred bits of telephone relay storage. By 1955, computers had acquired 100,000 bits of rotating magnetic memory. Ten years later, computers had millions of bits of magnetic core memory. By 1975, many computer core memories had exceeded 10 million bits, and by 1985, 100 million bits. By 1995, larger computer systems had reached several billion bits. By the year 2000, a few personal computer owners had configured their PCs with tens of billions of bits of RAM.

If one accepts Moravec’s comparison of computer bits to neurons, then computer evolution accomplished each decade what it took Mother Nature roughly a hundred million years to achieve. Moravec calculates that human engineering of artificial intelligence is proceeding at 10 million times the speed of natural evolution.
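The arithmetic behind that figure is easy to verify. The sketch below is a sanity check using only the round numbers quoted above, not Moravec’s published calculation.

```python
# Sanity check of Moravec's "10 million times" figure, using the
# round numbers quoted above (not his published calculation).
natural_years_per_tenfold = 100_000_000  # ~10^8 years per 10x jump in neurons
engineered_years_per_tenfold = 10        # ~one decade per 10x jump in bits

speedup = natural_years_per_tenfold / engineered_years_per_tenfold
print(f"Engineering outpaces natural evolution by ~{speedup:,.0f}x")
# -> ~10,000,000x, i.e., 10 million times faster
```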

An approach to AI called embodiment, or embodied embedded cognition, maintains that intelligent behavior occurs out of the interplay among the brain, the body, and the world. Some philosophers, cognitive scientists, and AI researchers believe that the type of thinking done by the human brain is determined by certain aspects of the human body. Ideas, thoughts, concepts, and reasoning are shaped by our perceptual system—our ability to perceive, move, and interact with our world. Roboticists such as Moravec and Rodney Brooks (founder of iRobot Corp. and Heartland Robotics Inc.) maintain that, in order to achieve human-level intelligence, any AI-endowed system would have to deal with humanlike artifacts, and thus a humanoid would be the optimal robot to achieve this.

The new field of evolutionary robotics, like its namesake of evolutionary biology, relies on the Darwinian principle of the reproduction of the fittest. This view posits that autonomous robots will develop and evolve from interaction with the environment. The fittest robots will reproduce by observing their interactions with the environment and incorporating mutations that increase their survivability.

Humans will be unable to match the rapid evolutionary jumps afforded to completely artificial beings, even with advances in cybernetics and genetic engineering. Robotic humanoids will be limited only by the laws of physics, not by those of biology, which even genetic engineering can’t alter. Hopefully, the sort of destructive competition that eliminated the rivals to Homo sapiens in the past—including such competitors as Homo erectus and the Neanderthals—will not be repeated in the next evolutionary stage.

In the best possible future, non-altered humans, humans with cybernetic implants, and robotic humanoids will learn from each other, borrow and share technology, and engage in friendly collaboration, cooperation, and competition to benefit all. In considering which robotic designs to support or, on the national level, to fund, that seems a good ideal to aim for.

About the Author

Steven M. Shaker is an executive in a market research and training firm. He is an authority on technology assessments, forecasting, and competitive intelligence. He is co-author, with Alan Wise, of War Without Men: Robots on the Future Battlefield (Pergamon-Brassey’s, 1988) and, with Mark Gembicki, of The WarRoom Guide to Competitive Intelligence (McGraw-Hill, 1998). E-mail steve.shaker@cox.net.

Thank You Very Much, Mr. Roboto

By Patrick Tucker

Japan’s unique research and development environment for robotics telegraphs how robots and humans will co-evolve.

I’m in a strangely lit subterranean room in Kyoto, Japan, and for the sake of the experiment in which I am participating, I’m pretending to be lost. A large “Mall Map” is mounted on a wall in front of me. I move toward it at a leisurely pace, in the manner of a man trying hard not to draw attention to himself. When I reach the map, I stop. A whirring sound of gears moving in a motor rises up behind me. I hear the rush of wheels passing quickly over carpet. Something mechanical this way comes.

“Konnichiwa,” says a cheery voice that sounds as if it has emerged from a Casio keyboard. I recognize the greeting: “Good afternoon.” I turn and see two enormous black eyes staring up at me. “My name is Robovie,” says the robot in Japanese. “I am here to help. Are you lost?”

“Yes,” I answer.

Robovie swivels on his omni-directional wheelbase, extends his motorized arm to the left corner of the room, and points to a large sheet of paper displaying the word “Shoes.”

“May I recommend the shoe store?” Robovie asks. “It’s right over there.”

“Dōmo arigatō,” I tell the robot. (“Thank you very much.”) It bows and wheels slowly backward. The experiment concludes successfully.

Welcome to Japan, which has been one of the world’s most important centers for robotics research since Ichiro Kato unveiled his WAP-1, a bipedal walking robot, at Waseda University in 1969. Fifteen years later, Kato presented a robot that could read sheet music and play an electric organ.

Robovie may seem like a step backward compared with an assemblage of metal and wire that can sit down and coax Kitaro’s “Silk Road” from a keyboard without missing a note. But Robovie, in fact, is far more human than his most impressive predecessors. In ways that are subtle but nonetheless significant, he represents an important turning point in the field. He’s also a moving, talking poster boy for all that is wonderful about Japanese robotic research. The future of human–machine interaction can be found in Robovie’s dark, watchful eyes.

Japan: Robot Central

MIT-trained robotics engineer Dylan Glas is one of Robovie’s many chaperones. He has lived in Japan for eight years now, which has given him a uniquely international perspective on robotics culture. He is also part of a reverse brain drain: He holds multiple degrees from the most prestigious technical school in the United States, but he left his country of birth to pursue better research opportunities abroad. Glas says that the allure of Japan wasn’t financial; he had plenty of offers to design robots in the United States. The problem, as he explains it, was that he didn’t want to build war machines.

A participant in MIT’s Middle East Education Through Technology program, Glas taught Java programming to Israeli and Palestinian high-school students in Jerusalem in 2004. The experience was instructive.

“I saw how people who are parts of larger warring groups can form friendships,” says Glas. “So I came straight from trying to make peace to looking at building things that killed people.” He explored other research opportunities online and found a picture of Robovie (an earlier iteration) hugging a little girl. He knew at that moment he was moving to Japan.

The United States and Japan lead the world in robotics research. But the two countries are dramatically apart on what they’re building these bots to do. The United States, which spends more on its military than do the next 45 spenders combined, has devoted most of its robotics research funding, on a national scale, to putting machines in dangerous battlefield situations, deep behind enemy lines, over the mountains of Afghanistan and Pakistan, in the place of humans. The goal is not so much to replace the human soldier as to automate the deadliest parts of the job, so the soldier becomes more technician, less target. The iRobot Corporation, the most successful private robot manufacturer in the United States, didn’t get its start building Roomba vacuum cleaners but designing military machines like the PackBot.

Japan is looking to fill a very different need. Demographically, it’s the oldest country in the world; nearly 20% of the population is older than 65. In the rural countryside, the proportion is closer to 25%. Japan is also shrinking. The number of children under age 15 reached a record low of 16.9 million in 2010. Many of Japan’s best-known robotics research projects, such as Asimo, indirectly address the rising population of seniors and growing dearth of able-bodied workers.

Meeting Social Challenges

Many Japanese argue that the country could address its demographic challenges through policy, such as allowing more willing immigrants into the country. There’s evidence to suggest that a more relaxed immigration policy would benefit Japan economically. But immigrants here face social and even linguistic barriers to real integration. Japanese is a tough language to learn; rules and usage can vary tremendously from prefecture to prefecture, between superiors and subordinates, between waiters and restaurant goers, and even between men and women. Linguistic and social customs can be very important to older Japanese, even if hip and media-savvy kids in Tokyo don’t think much of these cultural norms.

Formality and routine are particularly important in work settings, as anyone who has lived in Japan can testify. The degree of professionalism, focus, and seriousness that people bring to even menial jobs is impressive. This is not a country where you encounter baristas texting while they’re making lattes. That emphasis on completing tasks in a very specific “right” way contributes to greater acceptance of automation, says Glas.

“At work, there is no deviation from the established best practice,” he notes. “When I go to the supermarket, they always say exactly the same thing and deliver customer service exactly the same way. So I think the idea of robots doing that sort of work is very natural.”

All of these factors—aging and decreasing population, lack of immigrant labor, electronics expertise, available research funding, and cultural openness to automation—make Japan the key destination for humanoid robotic research, the study of how humans and robots interact in casual, civilian settings.

The Intelligent Robotics and Communication Laboratories, where Glas works, puts people and robots together in interesting settings. Its field tests offer a snapshot of a future where humans and machines work and play side by side. One of Glas’s favorite experiments involved a teleoperated robot who served as a tour guide in the town of Nara, Japan’s imperial capital some 1,300 years ago and home to some of the most important Buddhist temples in the world.

Touring Nara is more fun with the right guide, someone who has spent awhile learning the history and who knows a secret or two about the place (such as where to find the sacred herd of deer that eat directly from your hand). But the average age of a tour guide here is 74. Therein lies the problem. The walk from the train station to the best sites, like the famous giant bronze Buddha, can be challenging for young bodies, let alone someone in her 70s. Glas and his colleagues saw an opportunity to put a remote-controlled robot (as opposed to a fully autonomous one) in a unique setting to serve as the voice, eyes, and ears of a real person.

“Having this robot there helps [the guides] be there from home, so they can still talk and share their enthusiasm for Nara and the history of Nara,” Glas notes. “When I tell people this, a lot of Americans say, with a blasé shrug, ‘interesting.’ Japanese people light up and say, ‘Oh, we really need that!’ The perception of necessity is very different. That’s a cultural difference that guides the way people perceive how robots should be in society.”

My conversation with Robovie reenacts another field test, one that took place in an Osaka shopping mall in 2008 and 2009. The goal in that situation was not so much to empower people through telerobotics as to teach robots how to interact with humans. The setting was a strip of stores by the entrance to Universal Studios Japan. Robovie had a 20-meter stretch of turf, sandwiched between a collection of clothing and accessory boutiques and a balcony. The first challenge was learning to distinguish people who were passing through the area in a hurry from those who were just window-shopping or who were lost. The second group might be open to a little conversation; the first group represented a hazard.

The mall test is a classic example of the sort of pattern-recognition task that humans are great at, but robots just don’t do; there are too many open questions. How do you explain human states like “in a hurry,” “window-shopping,” and “lost” in binary code, a language that the robot can understand?

The researchers outfitted the area with six sensors (SICK LMS-200 laser range finders) and, over the course of one week, collected 11,063 samples of people walking, running, and window-shopping. They analyzed the data in terms of each mall goer’s velocity and direction, and isolated four distinct behaviors: fast walkers, idle walkers, wanderers, and people who were stopped or stuck looking at a map. These classifications helped Robovie learn how to recognize different types of people based on their behavior.
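For readers curious what such a classifier might look like in code, here is a minimal sketch. It is not the laboratory’s actual system; the sampling interval, speed thresholds, and straightness cutoff are invented for illustration.

```python
# Minimal sketch of sorting mall goers into the four observed
# behaviors from a few seconds of tracked (x, y) positions.
# NOT the lab's classifier: thresholds are invented for illustration.
import math

def classify(track, dt=0.5, fast=1.2, stopped=0.1, meander=0.7):
    """track: list of (x, y) positions in meters, one every dt seconds."""
    steps = [math.dist(a, b) for a, b in zip(track, track[1:])]
    path_length = sum(steps)
    speed = path_length / (dt * len(steps))        # mean speed, m/s
    net_displacement = math.dist(track[0], track[-1])
    straightness = net_displacement / path_length if path_length else 0.0

    if speed < stopped:
        return "stopped (possibly lost at the map)"
    if straightness < meander:
        return "wanderer (window-shopper?)"
    if speed >= fast:
        return "fast walker -- do not approach"
    return "idle walker -- open to conversation"

print(classify([(0, 0), (0.8, 0.0), (1.6, 0.1), (2.4, 0.1)]))  # fast walker
```

The design mirrors the study’s two key signals: average speed separates fast walkers from idle ones, while the ratio of net displacement to path length separates purposeful walkers from wanderers.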

Next, Robovie had to say something to the folks he chose to converse with, and the back-and-forth had to seem fluid and natural to the human participant. You would assume that teaching a robot to make chitchat would be a snap after all the time humans have spent over the last decade talking to computerized agents over the phone. But in a real-world setting, the interaction is a lot harder for the machine to handle gracefully. “People think [computerized] speech recognition is so great,” says Glas. “It is, if you have a mic next to your mouth. But if you have a mic that’s far away or on a noisy robot, or there’s music in the background and a group of three people is trying to talk to the robot at once, it’s not feasible.”

Robots have the same hearing problem that afflicts people with King–Kopetzky syndrome, also called auditory processing disorder. Picking up sound isn’t the issue. It’s distinguishing words and meaning from all the other background noises. The problem lies in the brain, which is where most of what we call hearing actually occurs.

To compensate for these deficiencies, the researchers made sure Robovie could call for backup. A human operator would monitor the exchanges from an isolated location, and if Robovie heard a word he didn’t recognize (or got himself lost in some corner of the mall), the operator could chime in and help the robot with the recognition problem. This cut down on the time it took the robot to respond to questions. Human–robotic interaction will likely proceed along these lines—partially human, partially robot—for the foreseeable future, says Glas.

“Even in big automated factories, you need a human. You always will,” Glas avers. “My goal is to increase the automation level, decrease the role of the operator, and work towards more automation. So instead of one person fully operating a telerobot, you have one person monitoring 400 robots; when one runs into a novel operation scenario, he calls the operator.”

The lab’s field tests have yielded a plethora of interesting and counterintuitive findings. For one thing, people trust robots that look like they just came out of the garage, with bolts and hinges exposed, more than they do bots concealed in slick plastic frames. Also, kids and adults interact with robots in very different ways. Adults kept asking Robovie for directions and treated him like a servant, while kids asked the robot’s age and favorite color.

These experiences are why Glas loves his job. They also reveal how the study of humanistic robots involves much more than sensors and hardware. It draws from psychology, anthropology, and a host of other so-called soft sciences. It makes use of intuition and observation in a way that formal robotics research under a military grant doesn’t. This, in part, explains why one of the most important figures in human–robotic interaction research is himself an artist.

The Oil Painter

In the myth of Pygmalion, a sculptor creates a female statue so convincing that the gods make her real. Japanese roboticist Hiroshi Ishiguro has never heard of Pygmalion, but shares the tragic hero’s obsession: creating a work of art so lifelike that—in the imaginations of those who behold her—she becomes real. Ishiguro is known internationally for his very particular robotic creations modeled after real people, including himself, his daughter, and various beautiful Japanese women. He’s also one of the senior fellows at the Intelligent Robotics and Communication Labs.

On a warm November day, I get to meet him at his office at Osaka University. My friend and I are shown into a large space decorated with modern furniture of plastic and glass. At the far end of the room, a man draped entirely in black and leather, and sporting an Elvis-like pompadour that extends from his forehead, is watching two different television monitors and smoking with a feverish intensity. He looks not so much like one of the most important figures in modern robotics as like a Los Angeles record producer circa 1985.

Ishiguro began his university studies with a dream of becoming a visual artist. Computer science was a backup. Eventually, he was forced to give up on oil painting as a career. “I couldn’t get the right shade of green,” he says. Ishiguro has put his arts training to good use. It’s his artistic sensibility that informs his unique approach to robotic design. “Oil painting is the same thing [as building robots]; the meaning of oil painting is to re-create the self on canvas.”

Ishiguro believes that some understanding of humanity (and some formal humanities training) is essential to success as an engineer. “We need to start with the question, What is human?” he says, “and then design and build from there. Our brain function is for recognizing humans, but in engineering robots, we don’t study the importance of the human appearance.… We need to establish a new type of engineering. Appearance is key, then movement, then perception and conversation.”

He takes us across the hall to show us his lab. A mannequin-like figure is sitting erect on a metal stool. I ask if I can investigate, and he nods. I step hesitantly forward and poke the android in the cheek. Its eyes open wide, and it turns to stare in my direction. The mouth hangs slightly open in an expression of surprise.

The demonstration is simultaneously amazing and unnerving. Ishiguro admits that his creations have secured a reputation for creepiness. When his daughter met her android doppelgänger, she burst into tears. Like a true artist, Ishiguro says he’s thankful for every honest response he gets.

“People are surprisingly sensitive of the humanlike appearance. The older people accept this type of robot. I exhibited one at the World Expo. Elderly people came up and asked ‘Where is the robot?’ Young people knew right away.”

Ishiguro has recently turned his attention to the theater. Two years ago, he and Japanese playwright Oriza Hirata began the Robot-Human Theater project, an ongoing collaboration placing flesh-and-blood actors next to tele-operated androids and other robots. Last year, on a trip to Tokyo, I caught a performance of Hataraku Watashi (I, Worker), a staid, 30-minute piece exploring themes of mortality, autonomy, and what it means to be human. Hataraku Watashi also serves as a live experiment in human–robotic interaction. Takeo and Momoko, the play’s robotic stars, faced the same challenges as their human co-stars: line delivery, timing, blocking, and conveying emotion and meaning.

Ishiguro and Hirata’s most recent piece, Sayonara, starred actress Bryerly Long and Geminoid F, a female tele-operated android. Following the play’s debut last November, Long told Reuters that she felt “a bit of a distance” between herself and her co-star. “It’s not some kind of human presence,” she said.

Ishiguro expressed confidence that future performances will get better. “We think we need humans for theater, but this is not so,” he told me. “The android can change the theater itself.”

This assertion raises a question that is either philosophical or semantic, depending on the answer: Can robots act?

The late Russian literary critic M. M. Bakhtin might say yes, insisting that the success of any piece of art, including a theatrical performance, rests entirely on the reaction it creates in the audience. By this view, Takeo, Momoko, and Geminoid F are already accomplished actors.

The late Lee Strasberg, founder of method acting, would argue otherwise. The robot has no internal memories, no painful or elating life episodes with which to breathe credibility into the performance. “The human being who acts is the human being who lives,” he said. The presence of life, ergo, is a necessary precondition to “acting.” The robot is a prop, or, in the case of a remotely controlled android, just a puppet. It’s an interesting gimmick but not a thespian. Real-life experience is a precursor to genuine acting, and a robot will never be able to experience life in the way that humans do.

Or will it?

The Pattern-Recognition Machine

Turn your attention back to Robovie for a moment. Picture him standing alone in his stretch of mall. A potential conversational partner enters his area of operation. Robovie has to make a decision: Is it safe to approach or isn’t it? The window for that decision is closing.

“You don’t want [the robot] to go super fast in crowded commercial spaces,” says Glas. “But people walk quickly. If you’re walking through that door, and Robovie wants to talk to you, he has to start early. We have to predict people’s behavior in the future, predict not only where they are, but what they’re going to be doing and where they’re going. The system gets a couple of seconds of data.”
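A toy version of that idea shows why even a couple of seconds of data is useful: fit a velocity to the most recent samples and extrapolate. This sketch is offered only as an illustration, not as the lab’s tracking system, which would use richer models.

```python
# Toy constant-velocity predictor: from two seconds of samples,
# extrapolate where the person will be a few seconds from now.
# An illustration of the idea, not the lab's actual system.
def predict(track, dt, horizon):
    """Extrapolate from the last two (x, y) samples, dt seconds apart."""
    (x0, y0), (x1, y1) = track[-2], track[-1]
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
    return (x1 + vx * horizon, y1 + vy * horizon)

# Positions sampled every 0.5 s as someone heads across Robovie's turf
track = [(0.0, 0.0), (0.6, 0.1), (1.2, 0.2), (1.8, 0.3), (2.4, 0.4)]
print(predict(track, dt=0.5, horizon=3.0))  # -> (6.0, 1.0), 3 s ahead
```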

Herein is the reason Robovie represents a great leap forward in artificial intelligence. He’ll never play chess as well as IBM’s Deep Blue played against Garry Kasparov. He won’t win on Jeopardy and can’t vacuum better than Roomba. What Robovie does is learn about the people in his environment. He takes in information about his setting and the live actors in that setting and responds on the basis of a perceived pattern, moving toward reward and away from threat. This is incredibly human.

In his 2004 book On Intelligence: How a New Understanding of the Brain Will Lead to the Creation of Truly Intelligent Machines, neuroscientist and Palm Computing founder Jeff Hawkins argues that the neocortex evolved expressly for the purpose of turning sensory data, in the form of lived experiences, into predictions.

“The brain uses vast amounts of memory to create a model of the world,” Hawkins writes. “Everything you know and have learned is stored in this model. The brain uses this memory-based model to make continuous predictions of future events. It is the ability to make predictions about the future that is the crux of intelligence.”

This ability to anticipate is a function of the neocortex, which can be traced back to humankind’s reptilian ancestors of the Carboniferous Period. Over the course of millions of years, the neocortex increased in complexity and size and emerged as the central component in human cognitive intelligence—not because of what it is, which has changed materially, but because of what it does.

The process of prediction forms the very basis of what makes us human. We see, we gather data from our senses, we predict, therefore we are. By that metric, Robovie’s every interaction, his every encounter, every question he asks, and every response he picks up brings him a little bit closer to humanity.❑



About the Author

Patrick Tucker is the deputy editor of THE FUTURIST magazine and director of communications for the World Future Society. In 2010-2011, he spent five months in Japan researching and writing about the future. His previous article, “My First Meltdown: Lessons from Fukushima,” appeared in the July-August 2011 FUTURIST. E-mail ptucker@wfs.org.

Exploring New Energy Alternatives

By David J. LePoire

What is most likely to satisfy our energy needs in the future—wind farms and photovoltaic arrays, or something yet to be invented? Options for the world’s energy future may include surprises, thanks to innovative research under way around the world.

Much discussion about going beyond petroleum includes the development of wind farms, solar thermal concentrators, solar cells, and geothermal energy production. But will these satisfy our energy needs in the future? We hope that renewable sources will provide enough energy to supply the world’s future needs, but there are still many uncertainties.

How much will low-intensity sources of energy cost over their life spans, and what will their environmental impacts be? The answers depend on research and on the operational experience gained in deploying these technologies and their associated storage, transmission, and conversion systems.

Another area of uncertainty is the growth in world demand for energy. If everyone in the world used energy as the United States does, the rate of energy production would have to increase by a factor of four. In addition, the energy use per person in the developed world might not be stagnant; it might increase. Could renewable sources keep up with this demand?
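The factor of four follows from simple per-capita arithmetic. The figures below are approximate circa-2010 values supplied here for illustration, not the author’s data.

```python
# Rough check of the factor-of-four claim. The per-capita figures
# are approximate circa-2010 assumptions, not the author's data.
us_per_capita_kw = 10.0     # ~average continuous power use per American
world_per_capita_kw = 2.4   # ~world average

print(f"Scale-up: ~{us_per_capita_kw / world_per_capita_kw:.1f}x")
# -> ~4x: world energy production would roughly quadruple at U.S. rates
```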

The following is an overview of a few conventional renewable energy sources that may be expanded in the near future, as well as some more speculative potential “surprises.” As the time horizon increases, the uncertainties associated with the technologies, economics, and political scenarios increase.

Energy Today

Fossil fuels currently account for 83% of the U.S. energy supply and slightly less (80%) of the world’s energy supply, but energy conservation and efficiency since the oil crises of the 1970s have suppressed growth of energy demand. If energy use had grown as fast as the economy, the United States would be using an estimated 60% more energy than it does now. We’ve improved energy use in buildings, electrical appliances, cars, and industrial processes. These improvements are often motivated by cost savings.

The attainment of energy efficiency through conservation or improved technology allows us to extract more applied energy from a comparable amount of fuel. As a result, the economy has grown more quickly than energy use has.

Current nuclear power plants extract the remnant energy of ancient supernova explosions, stored in heavy elements such as uranium. Since these stellar explosions occurred billions of years ago, before the solar system formed, nuclear power is not renewable. However, far more energy remains stored in the heavy elements than is currently utilized. Techniques are being explored to expand the possible fuel materials to include other isotopes of uranium, as well as thorium.

Hydroelectric power is renewable but has its limits: Though inexpensive, electricity generated from hydropower (for example, along the Tennessee, Colorado, and Columbia rivers) affects large tracts of land, and generation is generally limited to a few select spots where the topography supports a good reservoir location. Global growth is limited because the prime locations have already been developed.

Direct solar-energy technologies such as solar photovoltaic cells are being rapidly developed and deployed, and other technologies are also advancing our ability to efficiently convert wind, waves, ocean currents, and biofuels into usable energy.

Beyond Conventional Renewable Energy Sources

To hedge our energy bets and reduce future uncertainty, researchers are exploring new options for future energy sources, including ways to improve older ideas, such as fusion energy, space-based solar power satellites, Moon bases, and advanced nuclear fission options.

The strategy of maintaining a variety of energy options could be likened to the strategy of reducing risk in an investment portfolio. For example, our current energy technologies have costs, environmental impacts, and maturity levels that are relatively well known. Researchers are now testing newer renewable technologies, with the aim of cutting production expenses, minimizing negative environmental impacts, and enhancing scalability.

The hypothetical space-based, fusion, and advanced fission energy production systems introduce an extra level of uncertainty, because some technical aspects are not solved and because their relative costs depend on the construction of new infrastructure to support them.

Infrared Solar Technology

Nanotechnology offers a tool that could help create designs that convert energy more efficiently. For example, nano-scale antennas could be built to capture infrared light from the Sun—light that we cannot see directly but do experience as heat. A solar cell that could extract this infrared energy would be able to provide energy both day and night (although not as much at night).

An antenna captures energy more efficiently than a conventional cell, absorbs light over a wider range of angles, and requires no exotic materials to make. However, an antenna has to be about the same size as the wavelength of the light it captures. For radio, that is about 1 meter. For cell phones, it is a few inches. For infrared light, the wavelength is about one twenty-fifth the width of a human hair. A single antenna that small would not only be difficult to make; it would also produce very little energy.
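The sizing argument is simply the wavelength formula, wavelength = c / frequency. Here is a quick check with representative frequencies (the band choices are illustrative; the physics is standard).

```python
# The antenna-sizing arithmetic is just wavelength = c / frequency.
# Band choices below are illustrative; the physics is standard.
C = 3.0e8  # speed of light, m/s

for name, freq_hz in [("Radio (300 MHz)", 3.0e8),          # ~1 m
                      ("Cell phone (2 GHz)", 2.0e9),        # ~15 cm
                      ("Mid-infrared (75 THz)", 7.5e13)]:   # ~4 micrometers
    print(f"{name}: wavelength = {C / freq_hz:.2e} m")
# 4 micrometers is ~1/25 the width of a ~100-micrometer human hair
```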

The challenge of mass-producing millions of these small antennas was successfully met at Idaho National Laboratory (a U.S. Department of Energy laboratory), along with other laboratories, in work that received a 2007 Nano50 award. The laboratories were able to “print” 250 million metal antennas on a piece of plastic about the size of a standard sheet of paper. However, the problem remains of converting the absorbed energy, which oscillates at infrared frequencies of tens of terahertz, into useful 60 Hz electricity.

Nuclear Fission

Nanotech could also improve the energy-conversion efficiency of fission technology by allowing the charged particles emitted by uranium atoms to be converted to electricity before they collide with other matter and generate heat. This might be achieved by integrating the fuel and the electricity-extraction zones at the nano scale. When charged particles hit gas in the small pores, they strip the gas of electrons; the resulting separation of charges generates a voltage difference. This work is being pursued by a former Los Alamos National Laboratory scientist.

In a traveling wave reactor, only a small slice of a cylindrical core undergoes intense nuclear reactions with fast neutrons at any one time. The reactor needs an initial ignition with enriched uranium, but then it burns much like a candle. Its advantages include the ability to use unenriched fuel such as natural uranium, waste (depleted) uranium, thorium (much more plentiful than uranium), and spent nuclear fuel (considered a waste product of current nuclear power electricity generation).

This design was originally proposed in the 1950s, but no actual reactor has been built. TerraPower has now developed designs for such a reactor, which were publicized in a 2010 Technology, Entertainment, Design (TED) presentation by Bill Gates. These reactors would use fuel more efficiently, drawing on more of the available uranium and thorium, and would operate at higher temperatures, allowing higher thermal efficiency. The core would also be sealed, with fuel lasting for 60 years, and would generate much less waste because more of the material would be burned.

Nuclear Fusion

Fusion is the process of merging two small atomic nuclei into a larger one. If the resulting nucleus is lighter than iron, the reaction also releases energy. The difficulty lies in getting two electrically charged nuclei close enough for the merger, or fusion, to occur. For energy production, the nuclei need to be pushed together in a controllable, energy-efficient, and economical way.
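To see where the energy comes from, consider the best-studied reaction, deuterium-tritium fusion. Using standard textbook atomic masses (figures supplied here, not taken from the article), the mass lost in the merger converts to energy via E = mc².

```python
# Worked example: deuterium + tritium -> helium-4 + neutron.
# Standard textbook atomic masses (supplied here, not from the
# article); the mass lost in the merger appears as energy, E = m*c^2.
U_TO_MEV = 931.494  # energy equivalent of 1 atomic mass unit, in MeV

m_deuterium, m_tritium = 2.014102, 3.016049  # atomic mass units
m_helium4, m_neutron = 4.002602, 1.008665

mass_defect = (m_deuterium + m_tritium) - (m_helium4 + m_neutron)
print(f"Energy released per fusion: {mass_defect * U_TO_MEV:.1f} MeV")
# -> ~17.6 MeV, millions of times more than a typical chemical reaction
```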

In nature, there is one system—stars—that controllably and efficiently generates fusion energy. However, it is impossible to replicate the confinement mechanism that stars use, since it requires the gravitational attraction of a mass like the Sun’s. Confining the plasma is the key to generating controllable, energy-efficient, and economical fusion energy. Although the concept of nuclear fusion for energy generation was identified soon after World War II, its implementation has been frustrated by the various ways the plasma finds to escape confinement. It seems that fusion has been “about 30 years away” for the past 50 years!

One way to confine the plasma is through the inertial forces of an implosion. This is the technique used by the large facility at Lawrence Livermore National Laboratory—the National Ignition Facility—whose construction was completed in 2010 and which is now scheduled to begin experiments.

Another technique is to compress hydrogen magnetically, long enough for fusion to take place, by running a large current through wires. The current vaporizes the wires into a plasma while simultaneously creating a large magnetic field that compresses the plasma and the hydrogen. The Z machine at Sandia National Laboratories has been experimenting with this concept for many years.

Artificial Photosynthesis

A 25-year quest for scalable solar energy solutions has drawn on biomimicry for inspiration. In its search for artificial photosynthesis, an MIT team led by Dan Nocera recently identified two natural biological techniques that had previously remained hidden. Nocera noticed that some life-forms use cobalt in photosynthesis. He then developed a long-lived cobalt-based catalyst that uses sunlight to convert water into oxygen and hydrogen gas.

This work supports Nocera’s goal of finding a chemical process that is distributed (e.g., installed on individual houses) and robust (i.e., resistant to decay) for converting sunlight into liquid chemicals (e.g., alcohols) that store the energy for later use, either in transportation as a gasoline substitute or as electricity via a fuel cell.

The MIT team’s recent discoveries have led to a startup company, Sun Catalytix, partially funded by the U.S. government’s Advanced Research Projects Agency-Energy (ARPA-E) program, which funds selected promising energy-related innovations. In the lab, the catalyst appears to work even in impure water, which could allow it to be used not only to generate and store solar energy but also to purify water.

Nate Lewis at Caltech is pursuing artificial photosynthesis in a different way, using nanotubes along with a membrane to generate hydrogen from light.

Space-Based Solar Technology

The idea of space-based solar energy extraction has been around for decades. Obstacles include the high price of sending reliable equipment into space and maintaining it there, as well as the uncertainties associated with transmitting the energy back to Earth.

Two locations are currently being explored: geosynchronous orbit and the surface of the Moon. The latter offers the advantages of using existing materials and providing a more conventional work environment.

A Japanese company, Shimizu, is exploring the use of semi-autonomous robots to do the primary conversion of materials and build the solar energy collection system. The idea is to create a continuous strip of land, perhaps going all around the Moon’s equator, of solar cell collectors built with lunar materials.

The resulting LUNA RING, a complete equatorial ring, would allow continuous energy collection: Any single point on the Moon is in sunlight only half the time, but some portion of a full equatorial ring is always illuminated. And because the same side of the Moon always faces the Earth, only a limited number of transmitters would be needed. [Editor’s note: For more on the LUNA RING concept, see “Solar Power from the Moon” by Patrick Tucker, THE FUTURIST, May-June 2011.]

To make this plan feasible, space travel and the movement of materials need to become more economical. There have been several attempts to improve on the space elevator concept, first proposed by Russian scientist Konstantin Tsiolkovsky in 1895. A major obstacle is the strength of the material needed for the spine of the elevator, which must reach more than 24,000 miles from geosynchronous orbit down to near Earth’s surface. Recently, NASA and physicist Brad Edwards have been updating the design on the premise that carbon nanotubes, which have the necessary strength, can be scaled up to provide enough material and consistency for the long cable.

Speculative Physics: Dark Energy, Muons, and Mini Black Holes

Still further in the future, and associated with far greater uncertainty, are speculations about using new potential physics discoveries. Although a surprise might arise from this area, the probability of any one technique being successful is small, and it would take a large amount of effort to develop it into an integrated energy production system.

History has shown that surprises can revolutionize energy generation. In the mid-twentieth century, nuclear fission power was able to go from the lab to the power station in about 40 years. There are still many natural mysteries that might point the way to new energy technologies.

Among these mysteries are dark matter and dark energy, which account for about 95% of the energy in the universe. Accelerators such as the CERN Large Hadron Collider might discover new particles, as predicted by a variety of competing theories. Or they might produce mini black holes, whose physics would be interesting to explore. Physicists have begun speculating about potential theories and about how new forms of matter and energy might be exploited to generate useful energy.

For example, heavy, negatively charged particles can catalyze fusion. This is seen when muons enter water. Muons, heavy relatives of the electron, are produced by natural cosmic rays and by accelerators, and their catalytic effect has been known since the 1950s. Hydrogen nuclei are attracted to a heavy, negatively charged muon and form an atom in which the nuclei orbit the muon at extremely close range. This serves as a form of containment for the hydrogen nuclei. Eventually, the nuclei fuse and release energy; no enormous temperatures or containment facility are necessary. The muon is then released to catalyze more reactions.
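The scale of that containment can be checked with textbook arithmetic (constants supplied here for illustration): the radius of a hydrogen-like orbit shrinks in proportion to the mass of the orbiting particle, and the muon is roughly 207 times heavier than the electron.

```python
# Why the muon acts as "containment" (standard physics; constants
# supplied for illustration): a hydrogen-like orbit radius scales
# as 1/mass of the orbiting particle.
BOHR_RADIUS_M = 5.29e-11          # ordinary hydrogen (electron orbit)
MUON_TO_ELECTRON_MASS = 207       # approximate mass ratio

muonic_radius = BOHR_RADIUS_M / MUON_TO_ELECTRON_MASS
print(f"Muonic hydrogen radius: ~{muonic_radius:.1e} m")
# -> ~2.6e-13 m, close enough for the nuclei to tunnel together and fuse
```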

However, muons are unstable and eventually decay. At present, the energy necessary to produce the muons exceeds the energy generated by the limited number of fusions they catalyze. If a new, more stable, negatively charged particle is found, economical catalyzed fusion might become possible.

Another possibility is that mini black holes might one day be produced and controlled to extract energy from the material fed into them. As the material entered, some of the energy would radiate out. Very small theoretical black holes would be too unstable and radiate before control was established, but there might be a “sweet spot” of black-hole size that would radiate at a beneficial rate. Mini black holes have been proposed as an energy source for a spaceship in the far future.

Finally, there are aspects of quantum physics that are still very puzzling. Researchers are exploring the connection between quantum physics and gravity, as well as the fundamental aspects of quantum physics behavior, such as the way in which spin influences collective behavior. Another possibility is finding a way to extract energy from vacuum energy (zero point energy).

Diversity for the Energy Portfolio

These examples of potential new energy sources highlight an essential ingredient in the future of energy: the diversity of the organizations involved in developing it.

Some projects are government-based, such as those sponsored by DOE. Others are collaborations between a government and an industry, such as the Japanese Artemis group. Some projects are sponsored by individual philanthropists and investors, such as Bill Gates and Vinod Khosla. And some, such as the ITER fusion reactor, require international collaboration. The space elevator, for example, would probably require similar international agreements and cooperation.

Besides direct research funding, other ways to foster innovation include contests in which many different types of organization can participate. Successful contests include the X Prize for space travel and the Defense Advanced Research Projects Agency (DARPA) Grand Challenge for autonomous vehicle navigation.

Energy is a major determinant of economic development, not only with regard to heating, transportation, and entertainment, but also with regard to staples such as food, shelter, and health. The dominant fuel types have changed periodically over the last 200 years, and our current dependence on fossil fuels may soon be at an end.

We have been applying energy-efficiency methods to curb energy demand, and we have been developing renewable energy sources, such as solar and wind power, to increase supply. However, these energy sources might not be able to meet all future energy needs because of their economics or environmental impacts.

Searching for more potential future sources of energy to prepare for the challenges ahead requires research. New tools that employ nanotechnology, supercomputers, and space technology enable such exploration. A balanced portfolio of energy options and organizational support can reduce uncertainty and minimize the potential for surprises.

About the Author

David J. LePoire is an environmental analyst at Argonne National Laboratory. E-mail dlepoire@anl.gov. This work was supported by the U.S. Department of Energy under Contract No. DE-AC02-06CH11357.

This article draws from his essay, “Beyond (Conventional) Renewable Energy?” in the World Future Society’s 2011 conference volume, Moving from Vision to Action, which may be ordered from www.wfs.org/wfsbooks.

The Accelerating Techno-Human Future

By Braden R. Allenby and Daniel Sarewitz

Technology and humanity are co-evolving in ways that past generations had never imagined possible, according to the authors of The Techno-Human Condition. This is not necessarily a good thing, they warn. With unprecedented levels of innovation come new societal tensions and cultural clashes. People everywhere are challenged to adapt to accelerating change.

We’re all used to thinking of cognition as something that happens within individuals. Now we are seeing augmented cognition built into weapons systems in Iraq, automated cognition built into our automobiles, and human memory collected in Google. We’re diffusing cognition across integrated human–technology networks. What you have is not just humans networking with other humans, but humans in integrated human–technology networks functioning differently than they did before. Cognition may be shifting away from the individual and toward a techno-human network function. This represents a profound change in our cognitive patterns. In the past, we had to carry more in our brains.

That is not to say that we’re getting more intelligent as a culture. We have different kinds of intelligence. Acting in the face of uncertainty and disagreement seems to be as difficult as ever. Part of the problem is, on the one hand, excessive optimism that we can solve big challenges like climate change or terrorism with more analysis and more information, and, on the other hand, the belief that we’re completely screwed. The technological optimists are as much a problem as the Luddites, because they have a one-sided view that we can solve all our problems if we just apply enough technology and reason. This kind of black-and-white dichotomy is not helpful given the actual complexity of socio-technological evolution.

Too Much Information Running Through My (Distributed) Brain

Everyone is awash in information. There is this weird cognitive dissonance: You can participate in everything and are totally connected—comment on blogs, visit chat forums, etc.—but also there seem to be forces beyond anybody’s control, and an increasing inability to filter out what’s important and reliable.

We have the capacity to generate and process much more information, but there’s a lot more information that doesn’t get dealt with. It just sits out there: dark information, like the dark matter that dominates the universe. It’s pretty clear that there is a broadening and a networking of cognitive capacity. For example, we might use our e-mail as outsourced memory for names because we can’t remember them all. But this is countered by the fact that we’re overwhelmed and can’t remember all our e-mails.

Moreover, we’re seeing technological change across the entire frontier of technology. If you look at past clusters—steam, railroad, automobiles—those clusters tended to have one dominant technology. Other technologies also changed in response, but you could always identify the dominant technology of the era.

Now, you really can’t pinpoint a single dominant technology. Change is occurring across all frontiers of technology: biotech, nanotech, information, cognitive science, and so on. That means adaptation is that much harder, because there are no stable domains, no safe harbors. Moreover, the rate of change is more rapid than ever before and is still accelerating. The pure rate of technological evolution is unprecedented.

For those two reasons—the fact that change is across the entire technological frontier, and the fact that the rate of change is unprecedented and accelerating—our situation is different, and the issues we face are different. And just when we need more institutional agility, we are becoming more rigid.

Because it has to innovate, the private sector will generally adapt to change more rapidly. Companies are not happy about it, because it’s difficult. But it’s also an opportunity, and they have no choice. At the level of the firm, confusion and ossification are punished fairly severely in the market, so there’s some selective pressure for adaptability. If you don’t innovate as rapidly as possible, you’re going to lose market share to competitors. You have to innovate or you’ll fall off the rapidly changing edge of the technology. Even sectors that are slower to change have to innovate or risk being replaced. In construction, for instance, people are talking about machines that enable you to print out a house.

The same phenomenon is true for the military, only the stakes are even higher—not just economic survival, but survival, period. This is why so many new technological frontiers start out with military applications.

Individuals are having a hard time understanding, let alone managing, the rate of change, and it’s fair to say that society as a whole is having a hard time. Many people meet the uncertainty and discomfort that characterize rapid and accelerating change by retreating into relatively rigid belief systems or, less aggressively, apathy. In general, therefore, radical change has the paradoxical effect of driving dysfunctional behavior. This is a very dangerous trend, because what’s needed is more awareness and engagement, not less.

At a social level, part of a predictable response to the accelerating, unpredictable, and uncomfortable change in many domains is rejection of flexibility. So in many ways, we’re moving away from developing appropriate coping mechanisms. That appears to be the case in politics in many countries. We’re getting more and more entrenched politics on either side, instead of the kinds of discussions we need that can let us understand and adjust to continual socio-technological change.

The more that people feel knocked back on their heels by this change, the more they seem to be retreating into the worldviews that make them less able to understand and respond. Meanwhile, the private sector—which is the source of most of the innovation—gets better at accelerating the change, often by leveraging off of military efforts. It would be easy to offer pop-sociological interpretations of this phenomenon, but part of the problem is the expectation of control that comes from the Enlightenment commitment to rational action. That control impulse is now wired into our culture, if not our genes.

More Innovation and Inequality

It is reasonable to expect that social tensions have risen, and will continue to rise, as the rate of change accelerates. A certain level of conflict seems to correlate with (if not contribute to) cultural, economic, and technological development. Life expectancy in many developed countries hovers around 80 years; in developing countries in Africa, it can be in the low 40s.

Factors contributing to this gap include existing human-enhancement technologies, such as vaccines (which engineer the immune system to contribute to a much longer life than would occur naturally). Combine this with the ability of a developed society to provide for basic needs so as to facilitate knowledge workers—as opposed to requiring hard and continuing labor simply to exist—and we see huge competitive advantages for developed economies.

This gap is augmented by the fact that technology as a domain is self-perpetuating, especially as technological innovation integrates across categories (robotics and neuroscience, for instance). This momentum will favor societies with an integrated capability across the technological frontier as a whole.

Now, what if states aggressively pursue particular technologies? For instance, what if China goes from its “one child” policy to an “enhanced child” policy, whereby all schoolkids are required by the state to take some sort of cognitive enhancement? The outcomes of such a policy are difficult to anticipate, but may be quite disruptive. The outcome of the “one child” policy (combined with technologies that can promote gender selection) has included a gender imbalance that is potentially destabilizing to Chinese society. An “enhanced child” policy would surely have similarly unpredicted consequences.

Plenty of room exists for social tensions, though not necessarily due to old triggers such as immigration. The innovation hothouses of Silicon Valley and the Route 128 corridor wouldn’t exist except for Indian, Chinese, and other immigrants. Russian immigrants to the United Kingdom easily assimilate to their new society, and, indeed, are characteristic of a globalizing elite that migrates fairly effectively across national borders. And certainly much of the backlash against modernity in the United States is not from immigrants, but from domestic groups that find social and technological evolution challenging and distasteful.

The real question is whether we will overshoot productive conflict and get into destructive conflict, as some argue that American politics may have done. However, the older conflicts, such as natives versus migrants, may become obsolete with the globalization of economic, institutional, and technological systems.

Unless policies are enacted to reduce inequities, human-enhancement technology will probably magnify existing gaps between the capabilities and assets of different groups. We may have been seeing an early version of this in the United States over the past 40 years, where a commitment to an elite-driven education system, and the absence of a social or political commitment to redistribution policies, leaves us pretty much unable to offset the consequences of Schumpeterian “creative destruction.” The result has been a de-skilling of the workforce and the elimination of the conventional manufacturing sector, among other problems.

It is hard to overstate how stressful life is becoming for many in developed countries as a result of technological evolution. For a digital immigrant, for example, trying to stay networked and multitasking can be very stressful, while, for a digital native, that level of information flow may be not only comfortable, but necessary to feel psychologically connected to others. The pressure to be more productive is indeed pervasive, though perhaps advanced information technologies themselves may not be the source of stress: It still takes decisions by people, and their institutions, to damage other people this way. Yet, individuals may have little choice but to conform to evolving technological demands if they want to remain enfranchised in society.

Technology and Generational Divides

We should not forget that stress is part of the human condition: Staying alive as a poor farmer in India, or surviving drought in Africa or Asia, is also stressful. Moreover, for many people across cultures, change can be very stressful, regardless of productivity demands, which adds a complicating factor to any analysis. In the longer run, human psychology may be increasingly subject to deliberate engineering, raising the interesting question of whether stress—in appropriate amounts depending on personality and genetics—is a necessary component of being human as we know it, and can be designed to more preferable levels. But we reemphasize that any such efforts are likely to lead to unforeseen consequences that may overwhelm the original intent of the actions.

Many middle-aged adult professionals may feel that the connectedness of the wired world adds enormously to their stress levels, but it is doubtful that their kids would look at it that way. This is not meant to dismiss the concern; adjusting successfully to the information flow—which to you may be unmanageable but to your kid is normal—will undoubtedly bring other problems. We might never be able to finely tune such attributes as stress levels for an entire population, and this is probably just as well: Psychological diversity is clearly an asset for the species. Yet, that may not keep us from trying.

It is reasonable to expect that a radical increase in longevity will exacerbate rather than mitigate generational conflict. After all, think about how mature human-resources workers view the informal social-networking photographs and materials that young people post, compared with how the posters themselves view them: Older folks tend to judge such material by their own experience, from a time when much potentially questionable behavior simply never got recorded. Today, all your friends have cell phone cameras, and it may amuse them to post embarrassing pictures of you. It isn’t the behavior that has changed; it’s what technology allows to be classified as personal versus public.

Such technologically mediated misunderstandings tend to get worse as the pace of technological evolution accelerates. Moreover, it is also probable that accelerating technological evolution will result in increased social, institutional, and financial upheaval. Another consequence of life extension will be an older workforce that may resist giving way to younger job seekers.

In our book, we include a brief quote from Gulliver’s Travels on the issue of intergenerational conflict. Swift describes an island of people who live forever, and he foresees them as an alienated, isolated generation of sufferers. This of course is a satirical device, but his questioning, nearly 300 years ago, of the notion that longevity is automatically a boon to either the person or the society is still apt. Totally cut off from the younger generation, these aged beings are bitter and alone.

As we pursue radical life extension through technological means, Swift’s challenge may become ours. Why assume that the cumulative effect of lots of people living longer is going to be a boon to society? We’re definitely going to pursue it, but the objective ought to be more years of healthy life, not just more years. Unfortunately, when we see more years of life, we also see more years of unhealthy life. Anyone who is middle aged is dealing with unhealthy parents and the slow decline that we all eventually go through. The flipside of living a lot longer is that dying takes a lot longer, too. Dealing with the continual challenges of demographic change is thus another aspect of the techno-human condition.

One must always remember how unpredictable the future is. It might well be that older, less-innovative brains perform important functions in integrated techno-human cognitive networks, and thus act as a source of balance and stability without impeding cognition. And there is a lot of work going on now with augmented-cognition technology (for example, with automobiles), which may well lend itself to developing cognitive tools that can compensate for aging adults. For example, we may see AI avatars with the experience and caution of adults and the playfulness and experimentalism of youth.

To the extent that human judgment will never be fully replaced by artificial intelligence, one might hope that a population of wise elders could indeed be a resource for society. In the case of science, there are some fields where the biggest contributions are made by kids, such as math and physics, and others where experience really does count, such as geology and engineering.

So we’ll see problems and benefits as technological development accelerates and as cognition becomes increasingly networked. We are very likely to see a set of problems that we haven’t figured out how to deal with, and that are historically unique. Technology affects how culture evolves, and culture affects how technology evolves. They’re not separate categories. You can’t understand one without understanding the other.

About the Authors

Braden R. Allenby is a professor of civil and environmental engineering at Arizona State University. E-mail braden.allenby@asu.edu.

Daniel Sarewitz is a professor of science and society at Arizona State University. E-mail daniel.sarewitz@asu.edu.

They are co-authors of The Techno-Human Condition (MIT Press, 2011).

This article was based on interviews with the authors by Rick Docksai.

Five Principles of Futuring as Applied History

By Stephen M. Millett

A historian and futurist offers a theoretical framework for developing more credible and useful forecasts. The goal is to help individuals and organizations improve long-term foresight and decision making.

When I was working on my doctorate in history, people would quip: “Why study history? There’s no future in it!”

On the contrary, there may be a great deal of history in the future. Throughout my four-decade career as a historian engaged in futuring, I have used the past to explore the future. Like the study of history, futuring is heavily based on facts, evidence, solid research, and sound logic—more science, less science fiction.

Futuring is an example of what I call “applied history,” or the use of historical knowledge and methods to solve problems in the present. It addresses the question “What happened and why?” in order to help answer the question “How might things be in the future and what are the potential implications?” Futuring, at least in a management context, combines applied history with other methods adapted from science, mathematics, and systems analysis to frame well-considered expectations for the future. This process will help us to make decisions in the present that will have positive long-term consequences. In the language of business, futuring is an aspect of due diligence and risk management.

History provides indications of the future. Identifying historical trends helps us see patterns and long-term consistencies in cultural behavior. History may not repeat itself, but certain behaviors within cultures do. We can spot patterns in persistent traditions, customs, laws, memes, and mores. Debating whether a historical event is unique or a manifestation of a long path of behavior is like arguing whether light is a particle or a wave—the answer depends entirely upon your perspective.

The past provides precedents for future behavior. When you understand how things happened in the past, you gain much foresight into the things that might happen in the future—not as literal reenactments, but rather as analogous repetitions of long-term behavior that vary in their details according to historical conditions.

Let me hasten to qualify my view of history by saying that I see no immutable forces in the flow of history, no invisible hands of predestination, fate, or economic determinism. Time may be like an arrow, in the words of Sir Arthur Eddington, but I very seriously doubt that it has a prescribed target. I am also skeptical of the concept of political or economic cycles recurring with regular periodicity. If there is any determinism or predictability whatsoever in human behavior, it lies in our evolutionary biology and cultures. Luck, randomness, and the idiosyncrasies of free will play important roles in determining the future as well.

While the study of history has been rich in philosophy, it has lacked theories such as those found in the natural and social sciences. Most historians have not pursued such theories, because they see each period of history as being unique and as having little or no practical application for problem solving today. Futuring as applied history, however, needs basic principles upon which to build forecasts that can be used for long-term decision making.

A Framework for Understanding the Future

The study of the future is very sparse in both philosophy and theory. Theories (which may also be seen as mental or analytical models) provide a framework for forecasts and give them a credibility that increases managers’ willingness to take calculated risks. In addition, they can help us utilize our knowledge of demonstrated trends, interactions, and causes to better anticipate the future. The theories do not have to be rigid, but they do need to provide an explicit framework that can be modified, expanded, and even rejected by experience.

To that end, I have been working on a set of theoretical principles for futuring from the point of view of an applied historian. I offer them now as working guidelines until others can offer better.

Futuring Principle 1>> The future will be some unknown combination of continuity and change.

After an event occurs, you can always find some evidence of the path that led up to it. Sometimes when viewed in hindsight, the path looks so linear that it is tempting to conclude that the outcome was inevitable all along. In reality, it is the historical research that is deterministic, not the events themselves.

No historical event has ever occurred without antecedents and long chains of cause-and-effect relationships. Nor was there ever a time when decision makers did not have choices, including the simple option to do nothing. Yet, in the present moment, one can never be certain which chains of events will play out. While there are continuities in the past and the present, there are also changes, many of which cannot be anticipated. Sometimes these changes are extreme, resulting from high-impact, low-probability events known as “wild cards.”

Thus, the future always has been and most likely always will be an unknown combination of both trend continuities and discontinuities. Figuring out the precise combination is extremely difficult. Therefore, we must study the trends but not blindly project them into the future—we have to consider historical trends, present conditions, and imagined changes, both great and small, over time. You might say that trend analysis is “necessary but not sufficient” for futuring; the same goes for imagined changes, too.

Futuring Principle 2>> Although the future cannot be predicted with precision, it can be anticipated with varying degrees of uncertainty, depending upon conditions. Forecasts and plans are expectations for the future, and they are always conditional.

As twentieth-century physicist Niels Bohr famously said, it is very hard to make predictions, especially about the future. Yet, we can and do form expectations about the future ranging from ridiculous to prescient. David Hume, Werner Heisenberg, and Karl Popper cautioned us to be wary of drawing inductive inferences about the unknown from the known. This caution applies as much to futuring as it does to science.

All events occur in the context of historical conditions; likewise, all events in the future will almost certainly occur within a set of conditions. Therefore, all forecasts are conditional.

We may not be able to anticipate specific events in the future, but we can form well-considered expectations of future outcomes by looking at specific conditions and scenarios. For example, “When will the United States experience again an annual GDP growth rate of 7% or higher?” is a much more elusive question to address than “Under what likely conditions would it be reasonable to expect the United States’ annual GDP growth rate to be 7% or higher in the future?”

Futuring Principle 3>> Futuring and visioning offer different perspectives of the future—and these perspectives must complement one another.

This principle draws a distinction between futuring and visioning. Futuring looks at what is most plausibly, even likely, to unfold, given trends, evolving conditions, and potentially disruptive changes. It emphasizes conditions that are partially if not largely out of your own control.

Visioning, on the other hand, involves formulating aspirational views of the future based on what you want to see happen—in other words, how you would like events to play out. Of course, just because you want a certain future to happen does not guarantee that it will.

Strategic planning is a manifestation of visioning. If an organization does not engage in forecasting with all the rigor of historical criticism and good science, strategic planning can be just so much wishful thinking. I find that wishful thinking is alive and well in many corporations and institutions. Both futuring and visioning are necessary and they go hand-in-hand—just be careful to correctly identify which you are doing and why.

Futuring Principle 4>> All forecasts and plans should be well-considered expectations for the future, grounded in rigorous analysis.

Futuring methods fall into three broad, fundamental categories: trend analysis, expert judgment, and scenarios (also known as multi-optional or alternative futures). Historical research methods and criticism play well in all three categories.

As a futurist, I have no data from the future to work with. I cannot know in the present whether a forecast of mine will turn out to be “right,” or “accurate,” or even “prescient,” but I know what I can and cannot convincingly defend as being well-considered expectations for the future.

In this regard, the soundness of philosophical premises and theories, along with familiarity with best research practices, will add much to your foresight credibility and to the usefulness of your futuring activities.

Futuring Principle 5>> There is no such thing as an immutable forecast or a plan for a future that is set in stone.

Forecasts and plans must be continuously monitored, evaluated, and revised according to new data and changing conditions in order to improve real-time frameworks for making long-term decisions and strategies.

A forecast is a well-considered expectation for the future; it is an informed speculation or a working hypothesis, and as such is always a work-in-progress. Forecasts, like historical research, can never be completed. There is always more to be said on the subject as time passes. We must continuously use new and better information to evaluate and modify our expectations for the future.

* * * *

Futurists, like historians, must examine events in a large and complex context. My challenge to futurists, forecasters, strategic planners, and decision makers is to apply a historian’s rigor to their futuring endeavors. Think through a foundational philosophy of the future and theories concerning why some futuring methods are more trustworthy than others.

When generating forecasts, rely upon well-tested theories and best practices to justify your methods and conclusions. Use the five futuring principles offered above to guide your formulation of forecasts as well-considered expectations for the future.

About the Author

Stephen M. Millett is the president of Futuring Associates LLC, a research and consulting company, and teaches in the MBA program at Franklin University. He received his doctorate in history from The Ohio State University. His career at the Battelle Memorial Institute spanned 27 years. He is a past contributor to THE FUTURIST and World Future Review and he was a keynote presenter at WorldFuture 2003 in San Francisco. He may be reached at smillett@futuringassociates.com.

A more thorough discussion of these principles and supporting case histories appear in his forthcoming book, Managing the Future: A Guide to Forecasting and Strategic Planning in the 21st Century, to be published by Triarchy Press, www.triarchypress.com/managing_the_future.

Thomas Bayes and Bruno de Finetti: On Forming Well-Considered Expectations of the Future

The theories of subjective probabilities advocated by eighteenth-century English mathematician and theologian Thomas Bayes and by twentieth-century Italian statistician Bruno de Finetti are very applicable today when we assign likelihood to any future conditions or outcomes.

Bayes (circa 1702-1761) used prior knowledge as a starting point for calculating the probabilities of events. “Prior knowledge” may mean a hunch or an educated guess in lieu of nonexistent facts. To illustrate this concept, Bayes’s one paper on the topic described how an initial estimate of the position of a ball on a billiard table may be refined, as reports about later throws come in, into an increasingly accurate calculation of where it lies. With increasing information, one may see patterns that both explain unknown causes and anticipate the future.
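
A minimal sketch of the billiard-table logic (an illustration in modern notation, not Bayes’s original mathematics): with a uniform prior over a ball’s unknown position, each report that a later ball landed to its left or right updates the estimate through a Beta posterior.

```python
# Bayesian updating in the spirit of Bayes's billiard-table example.
# Unknown: the position p (between 0 and 1) of a first ball on the table.
# Data: whether each subsequent ball lands to its left or to its right.
# With a uniform prior, the posterior after l "left" and r "right" results
# is Beta(l + 1, r + 1), whose mean is (l + 1) / (l + r + 2).

def posterior_mean(lefts: int, rights: int) -> float:
    return (lefts + 1) / (lefts + rights + 2)

observations = ["L", "L", "R", "L", "R", "L", "L"]
lefts = rights = 0
for obs in observations:
    if obs == "L":
        lefts += 1
    else:
        rights += 1
    n = lefts + rights
    print(f"after {n} balls: estimated position = {posterior_mean(lefts, rights):.3f}")
```

Each new observation nudges the estimate, which is precisely the point: expectations are provisional and must be revised as information arrives.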

Bayes’s approach underlies the insight that expectations for the future must always be modified by new information. Yet, some critics contended that probability should rest on data-based statistics rather than on subjective judgment.

About two centuries later, de Finetti (1906-1985) provided a proof that all probabilities, particularly those concerning the future, are subjective. He concluded that it is better to admit your subjective judgment than to hide it in apparent objectivity. One way to do this is to assign a future event an a priori probability: While it may or may not be prescient, it can reveal how likely you think a future event may be according to your own biases—information that can give you a sense of how objective your forecasting actually is.

Stephen M. Millett

Tomorrow in Brief

Virtual Therapy for Phobias

Simulating an environment or situation that evokes fear is one way that psychologists help treat patients with severe phobias. Now, therapists can deploy a range of virtual world simulations to help their patients.

In a virtual café or pub, for instance, individuals with social phobias can learn to deal with fears associated with being in public, such as being looked at or talked about, according to Delft University of Technology researcher Willem-Paul Brinkman.

While their patients are engaged in the simulation, therapists will be able to observe and record physical reactions such as heart rate and perspiration, then encourage patients to test alternative behaviors in the simulation.

Source: Delft University of Technology, http://www.tudelft.nl.

Mobile Water and Power

Places without access to clean water and convenient power may soon have a solution to both problems.

Developed by Purdue University researchers, a new alloy of aluminum, gallium, indium, and tin can split salty or polluted water into hydrogen and oxygen. The hydrogen feeds a fuel cell housed in a relatively lightweight (under 100 pounds) portable unit to produce electricity, with steam as a byproduct. The steam, once condensed, yields purified water.
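
The underlying chemistry here is presumably the familiar aluminum-water reaction, with the gallium serving to keep a passivating aluminum-oxide skin from forming on the metal so that the reaction can continue:

2Al + 3H2O → Al2O3 + 3H2 (plus heat)

By this stoichiometry, each kilogram of aluminum consumed liberates roughly 0.11 kilograms of hydrogen.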

The technology may be used not only for poor, remote villages, but also for military operations, according to Jerry Woodall, distinguished professor of electrical and computer engineering.

Source: Purdue University, www.purdue.edu.

Space Junk Detector

A new European space surveillance system is being developed in the hope of keeping outer space tidy—and space traffic flowing smoothly.

Futurists have long warned that increased human activity in space would have one inescapable byproduct: increased orbiting junk. Space junk haulers, reclaimers, and recyclers were even listed among THE FUTURIST’s “70 Jobs for 2030” (January-February 2011).

Now, Fraunhofer Institute researchers are working with the European Space Agency to develop radar systems with sufficiently high resolution to track the estimated 20,000 orbiting pieces of debris that threaten to damage or destroy any satellite or vehicle they may encounter.

Source: Fraunhofer-Gesellschaft, www.fraunhofer.de.

The Internet of Bodies

As sensors and transmission technologies continue to shrink in size, they will enable us to monitor and manage our own bodies—and connect with others.

As with the so-called Internet of things, an Internet of bodies may soon be built, thanks to work under way in research labs such as the Department of Informatics at the University of Oslo.

Such a “bodnet” could allow frail elderly individuals to live independently at home, as well as improve public health monitoring and prediction systems, as data can be collected from widely distributed populations.

Source: University of Oslo, www.uio.no.

WordBuzz: Protopia

A proposed destination for a desirable future. Protopia, as defined by Wired senior maverick Kevin Kelly, would be a future that is better than today but would not attempt to be a utopia in the sense of a problem-free world.

Technology futurist Heather Schlegel would like to take the concept a bit further. Protopia, she argues, should represent a positive portrayal of the future. Protopians would actively tackle big problems and develop new tools, mind-sets, and paradigms for doing so.

Sources: Kevin Kelly’s blog The Technium, www.kk.org/thetechnium.

Heather Schlegel’s blog Heathervescent, www.heathervescent.com.

Future Active

  • Pros and Cons of the African Brain Drain
  • Envisioning the Museum of Tomorrow
  • Futuring Goes to Town

Pros and Cons of the African Brain Drain

Africans are investing in higher education, but the lack of job opportunities for graduates is helping to drive a brain drain, argued University of California, Davis, economist Philip Martin in a recent online discussion hosted by the Population Reference Bureau. Martin is the author of the new PRB report, “Remittances and the Recession’s Effects on International Migration.”

In his PRB Discuss Online appearance on May 26, Martin pointed out that a lack of opportunities for university graduates with advanced degrees in their home countries gives them little choice but to seek employment elsewhere.

“Many African countries spend relatively more on higher education than on K-12 schooling, which leads to ‘too many’ university graduates who cannot find jobs, prompting them to emigrate,” he said. Martin projects that international migration of both educated and non-educated African workers will continue to increase.

The remittances that these workers send back to family members and loved ones provide a bit of a boost to their home countries’ overall economies, Martin observes in his report. These remittances can help create jobs and fund startup costs for small businesses in the migrant workers’ home countries. However, “sending workers abroad and receiving remittances cannot alone generate development,” Martin warns.

Although these monetary gifts may not counterbalance the loss of skilled (as well as so-called “unskilled”) workers, they make a significant impact. During the online discussion, Martin cited World Bank statistics: In 2010, remittances sent by workers from developing countries back home totaled around $325 billion. Projections for 2011 are even higher, and, according to the World Bank, that figure should increase by $50 billion in 2012. This is triple the amount of international aid money received.

There are other benefits and drawbacks to the brain drain, as well. “Migration can set in motion virtuous circles, as when sending Indian IT workers abroad leads to new industries and jobs in India, or set in motion vicious circles, as when the exit of professionals from Africa leads to less health care and too few managers to operate factories,” Martin explained during the Q&A session.

Martin recommends that policy makers in countries to which workers are migrating create legislation that protects them rather than trying to limit migration or restrict migrants’ rights.

Source: Population Reference Bureau, www.prb.org.

Envisioning the Museum of Tomorrow

A daylong workshop on futures thinking, forecasting methods, and strategic planning was offered to museum professionals attending the American Association of Museums’ annual meeting and expo in Houston in May 2011.

“Forecasting the Future of Museums: A How-to Workshop” was organized by the association’s Center for the Future of Museums (CFM). The workshop tied in nicely with the overall theme of the conference, “The Museum of Tomorrow.” Both the forecasting workshop and the general conference focused on ways that museums can evolve and adapt to the various shifts—political, economic, environmental, technological, and cultural—now taking place.

The workshop was led by CFM founding director Elizabeth Merritt; Peter Bishop, director of the Future Studies program at the University of Houston; and Garry Golden, lead futurist at the management consultancy futurethink. The workshop covered both the principles of foresight and museum futures specifically.

“We reviewed the basics of futures studies in the morning, explaining how trends and events can disrupt our path to the ‘expected’ future,” Merritt explains. The leaders also conducted an exercise: “Participants created cards for a forecasting deck in the course of these exercises, which we then used in the afternoon as they learned how to create scenarios to explore potential futures.”

Most of the afternoon was devoted to creating and exploring scenarios. Several wild cards were considered, including the possibility that museums could lose federal tax-exempt status and the occurrence of an event such as a pandemic or terrorist act that “might radically restrict travel or people’s willingness to congregate in public places,” Merritt says.

The workshop closed by looking at ways that museum directors can incorporate forecasting methods such as trend analysis, visioning, and scenario building into their strategic planning. According to the CFM, strategic long-term planning is essential for museum professionals, but short-term planning is currently more prevalent.

Those who couldn’t be at the conference in person had the opportunity to “attend” a virtual component taking place simultaneously. During the CFM’s online presentation, “Practical Futurism: Harnessing the Power of Forecasting for Your Institutional Planning,” several museum directors addressed the need to identify what Merritt describes as “the trends that challenge their local communities … and their museums’ own sustainability”—and to respond to them accordingly.

“The two other activities CFM specifically orchestrated were an ‘Ask a Futurist’ booth, staffed by faculty and students from the University of Houston, and an installation on the future of natural history museums by artist Tracy Hicks,” says Merritt. The art installation, titled Helix: Scaffolding #21211, also explored natural history museums’ projected influence on the Earth’s ecology.

Such events provide clear indication that museums—sometimes considered mere repositories of history—are orienting toward the future as well.

Sources: American Association of Museums, www.aam-us.org.

Center for the Future of Museums, www.futureofmuseums.org.

Futuring Goes to Town

From smart growth to traffic control measures, citizens of the Township of Delta in Michigan recently had the opportunity to voice their preferences on issues affecting their future.

In May 2011, the township’s community development department held a futuring session to gather information on issues surrounding the township’s growth and development. The meeting was part of an effort to review and update the township’s parks and recreation plan, non-motorized transportation plan, and comprehensive land use and infrastructure plan. Around 70 participants offered input to help community developers set objectives and goals for the future.

According to planning director Mark Graham, “participants were asked numerous questions pertaining to the future of the township in relation to urban sprawl, public transit, environmental protection, placemaking, recreational amenities, and the provision of public services.”

Those in attendance voted anonymously, via hand-held electronic devices, on 21 multiple-choice questions such as, “Which one of the following environmental issues do you feel will present the biggest challenge to the quality of life for township residents in the future?”

Afterwards, the results of the poll were tallied. Citizens participating in the exercise clearly saw “loss of open space” as a detriment to Delta Township’s future, followed by “high fuel prices [that] make suburban commuting less desirable.”

“The survey results from the futuring session will be one of the data sources used in compiling goals and policies for the updating of the township’s comprehensive plan,” Graham says.

An online version of the survey augmented the futuring session’s results and enabled those who could not attend to have a voice. The next step will be to schedule a public hearing to gain crowd feedback on a proposed draft of future plans.

Source: Delta Township Community Development Department, www.deltami.gov.

Future Scope

  • Accelerated Carbon Emission Rates
  • Broadening the Definition of Arts Participation
  • U.S. Hispanic Population Is Booming
  • TV Is Going Off the Air
  • Agencies Are Unprepared for Climate Change

Accelerated Carbon Emission Rates

Carbon is now being released into the atmosphere nearly 10 times as fast as it was during a similar period of climate change about 56 million years ago, according to a team of scientists led by Lee R. Kump of Penn State.

The researchers examined rock cores from the Paleocene-Eocene Thermal Maximum (PETM) event that were collected in Spitsbergen, Norway. The samples contained a large amount of sediment, enabling the researchers to infer the amount of greenhouse gases released and the temperature change that would have resulted.

“We looked at the PETM because it is thought to be the best ancient analog for future climate change caused by fossil fuel burning,” Kump explains. The researchers believe that the Earth experienced a warming of 9°F to 16°F during the PETM, accompanied by an acidification event in the oceans.

During the PETM, ecosystems had 20,000 years to adapt to carbon release, but at current rates of emission, “it is possible that this is faster than ecosystems can adapt,” warns Kump.

Source: Penn State University, www.psu.edu.

Broadening the Definition of Arts Participation

Attendance at classical music concerts and art galleries has declined in the United States, but this is not a complete picture of people’s interest or participation in arts activities. The National Endowment for the Arts is broadening its traditional benchmarks for measuring arts participation.

Now, the NEA is looking at a wider variety of artistic genres and including people’s arts participation via electronic media, as well as involvement in personal arts creation. By this measure, some 75% of Americans are active arts participants.

Studying these trends will help promoters, managers, and curators become more engaged with their prospective audiences, such as through innovative arts education programs.

Source: National Endowment for the Arts, www.arts.endow.gov.

U.S. Hispanic Population Is Booming

The Hispanic population in the United States is growing at four times the rate of the total U.S. population. Its numbers increased by 43% (15.2 million) between 2000 and 2010, while the nation as a whole grew by 9.7% (27.3 million) during that time, according to the U.S. Census Bureau.

The fastest growth occurred among the largest subgroup: Hispanics of Mexican origin, who represented 63% of the total U.S. Hispanic population—up from 58% in 2000.

Source: U.S. Census Bureau, www.census.gov.

TV Is Going Off the Air

Free, over-the-air TV viewing has been declining steadily since 2005, according to the Consumer Electronics Association.

Sales of “rabbit ears” and rooftop antennas are thus falling, as viewers seem reconciled to paying for television content. Those who do “cut the cable” and drop pay TV are switching to the Internet rather than to the free airwaves.

The digital broadcast transition has offered consumers many new viewing options, including Internet streaming services such as Hulu and Netflix. These options also allow for mobile TV viewing when delivered to smartphones and other devices, including video monitors in cars.

Source: Consumer Electronics Association, www.ce.org.

Agencies Are Unprepared for Climate Change

Floods, fires, tornadoes, and other catastrophes associated with climate change may increasingly result in water shortages, epidemic diseases, skyrocketing insurance costs, and other disruptions.

Federal agencies in the United States are ill-prepared to handle such threats, warns a report by Resources for the Future that draws from experts in economics, ecosystems, insurance markets, and risk management.

Whether or not climate change can be mitigated or reversed, agencies need to be flexible and informed, allowing local actors to respond to crises quickly. The report recommends policies that emphasize innovation in adaptation, and it suggests crafting legislation that creates synergies across multiple policy areas.

Source: “Reforming Institutions and Managing Extremes: U.S. Policy Approaches for Adapting to a Changing Climate,” Resources for the Future, www.rff.org/adaptation.

The Troubling Future of Internet Search

Data customization is giving rise to a private information universe at the expense of a free and fair flow of information, says the former executive director of MoveOn.org.

By Eli Pariser

Someday soon, Google hopes to make the search box obsolete. Searching will happen automatically.

“When I walk down the street, I want my smartphone to be doing searches constantly—‘did you know?’ ‘did you know?’ ‘did you know?’ ‘did you know?’ In other words, your phone should figure out what you would like to be searching for before you do,” says Google CEO Eric Schmidt.

This vision is well on the way to being realized. In 2009, Google began customizing its search results for all users. If you tend to use Google from a home or work computer or a smartphone—i.e., an IP address that can be traced back to a single user (you)—the search results you see incorporate what the system has learned about you and your preferences. The Google algorithm of 2011 not only answers questions; it also seeks to divine your intent in asking and to give results based, in part, on how it perceives you.

This shift speaks to a broader phenomenon. Increasingly, the Internet is the portal through which we view and gather information about the larger world. Every time we seek out some new bit of information, we leave a digital trail that reveals a lot about us: our interests, our politics, our level of education, our dietary preferences, our movie likes and dislikes, and even our dating interests or history. That data can help companies like Google deliver search-engine results in line with what they know about you.

Other companies can use this data to design Web advertisements with special appeal. That customization changes the way we experience and search the Web. It alters the answers we receive when we ask questions. I call this the “filter bubble” and argue that it’s more dangerous than most of us realize.
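
To make the mechanism concrete, here is a toy sketch of personalized re-ranking. It is not Google’s actual algorithm, just the generic idea a filter bubble rests on: blend each result’s query relevance with its similarity to a profile inferred from the user’s past behavior.

```python
# Toy personalized ranking (not any real search engine's algorithm):
# results are re-ordered by blending query relevance with similarity
# to a user-interest profile inferred from past clicks.

def score(doc_topics, query_relevance, user_profile, weight=0.5):
    """Blend base relevance with user-profile affinity."""
    affinity = sum(user_profile.get(topic, 0.0) for topic in doc_topics)
    return (1 - weight) * query_relevance + weight * affinity

# Hypothetical results for the query "egypt"
results = [
    ("protest coverage", {"politics", "world-news"}, 0.90),
    ("vacation packages", {"travel", "shopping"}, 0.85),
]

# A profile inferred from one user's clickstream
user_profile = {"travel": 0.9, "shopping": 0.7, "politics": 0.1}

ranked = sorted(results,
                key=lambda r: score(r[1], r[2], user_profile),
                reverse=True)
for title, _, _ in ranked:
    print(title)  # for this user, the travel result outranks the news story
```

Two users typing the same query can thus see entirely different top results, and each bubble reinforces itself as new clicks feed back into the profile.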

In some cases, letting algorithms make decisions about what we see and what opportunities we’re offered gives us fairer results. A computer can be made blind to race and gender in ways that humans usually can’t. But that’s only if the relevant algorithms are designed with care and acuteness. Otherwise, they’re likely to simply reflect the social mores of the culture they’re processing—a regression to the social norm.

The use of personal data to provide a customized search experience empowers the holders of data, particularly personal data, but not necessarily the seekers of it. Marketers are already exploring the gray area between what can be predicted and what predictions are fair. According to Charlie Stryker, a financial services executive who’s an old hand in the behavioral targeting industry, the U.S. Army has had terrific success using social-graph data to recruit for the military—after all, if six of your Facebook buddies have enlisted, it’s likely that you would consider doing so, too. Drawing inferences based on people like you or people linked to you is pretty good business.

And it’s not just the Army. Banks, too, are beginning to use social data to decide to whom to offer loans. If your friends don’t pay on time, it’s likely that you’ll be a deadbeat, too. “A decision is going to be made on creditworthiness based on the creditworthiness of your friends,” says Stryker.

If it seems unfair for banks to discriminate against you because your high-school buddy is bad at paying his bills or because you like something that a lot of loan defaulters like, well, it is. And it points to a basic problem with induction, the logical method by which algorithms use data to make predictions. When you model the weather and predict there’s a 70% chance of rain, it doesn’t affect the rain clouds. It either rains or it doesn’t. But when you predict that, because my friends are untrustworthy, there’s a 70% chance that I’ll default on my loan, there are consequences if you get me wrong. You’re discriminating.
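
The induction problem is easy to state in code. A toy version of the social-graph scoring Stryker describes (purely illustrative, not any lender’s model) predicts an applicant’s default risk from friends’ repayment records alone:

```python
# Toy social-graph credit scoring: the kind of naive induction the
# text criticizes. Purely illustrative; not any real lender's model.

def predicted_default_risk(friend_defaults, prior=0.1, prior_weight=5):
    """Smoothed average of friends' default outcomes (1 = defaulted)."""
    n = len(friend_defaults)
    return (sum(friend_defaults) + prior * prior_weight) / (n + prior_weight)

# You pay your own bills on time, but several friends have defaulted:
friends = [1, 1, 0, 1, 0, 1, 0, 1, 1, 0]
print(f"Predicted default risk: {predicted_default_risk(friends):.0%}")
# ~43%, no matter how spotless your own record is
```

The model never consults the applicant’s own history, which is exactly the discrimination the passage objects to.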

One of the best critiques of algorithmic prediction comes, remarkably, from the late nineteenth-century Russian novelist Fyodor Dostoevsky, whose Notes from Underground was a passionate critique of the utopian scientific rationalism of the day. Dostoevsky looked at the regimented, ordered human life that science promised and predicted a banal future. “All human actions,” the novel’s unnamed narrator grumbles, “will then, of course, be tabulated according to these laws, mathematically, like tables of logarithms up to 108,000, and entered in an index … in which everything will be so clearly calculated and explained that there will be no more incidents or adventures in the world.”

The world often follows predictable rules and falls into predictable patterns: Tides rise and fall, eclipses approach and pass; even the weather is more and more predictable. But when this way of thinking is applied to human behavior, it can be dangerous, for the simple reason that our best moments are often the most unpredictable ones. An entirely predictable life isn’t worth living. But algorithmic induction can lead to a kind of information determinism, in which our past clickstreams entirely decide our future. If we don’t erase our Web histories, in other words, we may be doomed to repeat them.

Exploding the Bubble

Eric Schmidt’s idea, a search engine that knows what we’re going to ask before we do, sounds great at first. We want the act of searching to get better and more efficient. But we don’t want to be taken advantage of, to be pigeon-holed, stereotyped, or discriminated against based on the way a computer program views us at any particular moment. The question becomes, how do you strike the right balance?

In 1973, the Department of Health, Education, and Welfare under Nixon recommended that regulation center on what it called Fair Information Practices:

  • You should know who has your personal data, what data they have, and how it’s used.
  • You should be able to prevent information collected about you for one purpose from being used for others.
  • You should be able to correct inaccurate information about you.
  • Your data should be secure.

Nearly forty years later, the principles are still basically right, and we’re still waiting for them to be enforced. We can’t wait much longer: In a society with an increasing number of knowledge workers, our personal data and “personal brand” are worth more than they ever have been. A bigger step would be putting in place an agency to oversee the use of personal information. The European Union and most other industrial nations have this kind of oversight, but the United States has lagged behind, scattering responsibilities for protecting personal information among the Federal Trade Commission, the Commerce Department, and other agencies. As we enter the second decade of the twenty-first century, it’s past time to take this concern seriously.

None of this is easy: Private data is a moving target, and the process of balancing consumers’ and citizens’ interests against those of the companies that hold the data will take a lot of fine-tuning. At worst, new laws could be more onerous than the practices they seek to prevent. But that’s an argument for doing this right and doing it soon, before the companies that profit from private information have even greater incentives to try to block such laws from passing.

Eli Pariser is the board president and former executive director of the 5-million-member organization MoveOn.org. This essay is excerpted from his book, The Filter Bubble: What the Internet Is Hiding From You. Reprinted by arrangement with The Penguin Press, a member of Penguin Group (USA), Inc. Copyright © 2011 by Eli Pariser.

Finding Connection And Meaning in Africa

A doctor discovers meaningfulness in a simpler, survival-oriented culture.

By Tom Murphy

As a radiologist, I went to Moshi, Tanzania, in June of 2007 to spend three and a half weeks teaching and working with radiology doctors in training at Kilimanjaro Christian Medical Center (KCMC) Hospital at the base of Mt. Kilimanjaro. As I said in my e-mails home, “Every minute was an adventure and every day a safari.”

The medical milieu was one in which we dealt with basic human existence. We encountered a spectrum of problems, from extreme and untreatable infectious disease to the new plague of “Western disease” (diabetes, early obesity, heart attack, and stroke), and the growing presence of cancer. There were also wild cards, such as the curse of inexpensive but toxic Chinese drugs, as well as infants with congenital and rheumatic heart disease who were on waiting lists for surgical repair in India. Disease crossed all ages, from babies to teens, to a 26-year-old man with terminal parasitic Echinococcus filling his lungs, to old men with testicular tumors the size of grapefruits.

The hospital was an open-air, 500-bed Christian facility and a major training center for nursing, anesthesia, radiology, dermatology, and other specialties. It was also a research center for Duke University (AIDS, dermatology, and medical students).

But I went there for another purpose. I had been working with the Millennium Project, a futures think tank in Washington, D.C., for which I had been studying global issues for seven years. Africa, and particularly sub-Saharan Africa, has been at the forefront of many issues—AIDS, poverty, corruption, and so on. I had heard so much about Africa that I was more intrigued by what I could learn than what I could teach.

What do the African people have to teach the rest of us about the future? Besides thousands of years of history and culture, there is the magical attraction of Africa, which is a palpable sense of connection—connection to the past, connection to the earth, and connection to each other. It is simply people expressing themselves honestly while living in a world where meeting the basic needs for food, shelter, clothing, and human kindness fill up the day.

The human kindness is broad. It encompasses the solidarity of survival of everyone and the spirit of the individual. These are a proud people in their demeanor, their voice, their language, and their respect for each other. They are self-confident enough for the young to say Shikamoo (“I place myself at your feet”) to the elders and mean it, and to welcome all into their homes to get to know people and their personalities. When I asked Korakola, one of the radiology residents, to review a talk I was going to give to the staff, she said, “Say whatever you wish and we will decide what to take from it.”

The hospital itself offered me another perspective: Health is not taken for granted here. There is a sense that anyone here could become seriously ill, that death is nearby. And I saw an unspoken acceptance of this reality in the pediatric wards on the faces of women with their ill children. Their thankfulness for any help is profound.

This presence of illness and death is a part of the culture. The AIDS prevalence rate in Tanzania is 8%, and, as in much of sub-Saharan Africa, waterborne infectious disease remains the major cause of death. The cultural side of this is a gratefulness for life. Many say they do not think of the future. Is this hopelessness, acceptance, or contentment?

Along with connection to each other is connection to the earth. There are billboards that say, “In Tanzania, there is no such thing as waste.” Small croplands of maize, beans, and bananas are interspersed with the prairie landscape, and grass cuttings are bicycled to the farmers and bartered for milk. Homes have few appliances; clothes are washed in whatever cold water source is available (our maid used the shower). Water is treasured, and safe water usually has to be made rather than found. Food production is local, unpackaged, unprocessed, bartered, shared, and completely consumed. Clothing is simple, beautiful, locally stitched and repaired. And all of this is a point of pride for them; they joked with me about the “fat” people of the West.

So what does this have to do with the future? This is a people tempered by thousands of years of culture, still living in contact with their own human needs and their earth. They nurture grateful, kind, affectionate children with fully developed personalities, and live in community—simply but not simplistically. They live in a time frame that allows for human interaction without rancor. There is simple joy here without the false anxieties of excess.

As I contemplate the future, I wonder if this is not what we are all looking for. Is there an endpoint to future studies? Is there a goal? Could this be it?

As I have written this, I have avoided a few topics: violence (which I did not see here, but it is hard to ignore the slaughters of nearby Darfur), tribalism, corruption, lack of infrastructure, polluted and infected water (and its medical consequences), and the overall environmental degradation that plagues the southern African countries. Anyone would know that these must be dealt with.

And there are other puzzlements. Would these gentle people, if given the opportunity, forsake this beauty of culture for the wealth and “ease” of Western countries? I know that they crave education for their children, and perhaps that is a long-term answer. It is up to the Tanzanians to choose how they will develop.

There is also a global issue here. If the world chooses to develop an ecologically and humanly healthy global footprint, are those of us in wealthy countries willing to do it consciously? In other words, would we choose to not consume, even though we have the money, in order to have a resource-secure Earth?

Perhaps we can solve it all with technology, but there was something grounded about the sense of living that I found in Africa. There, the real, everyday human needs of food, shelter, and community were profound: Everything had meaning.

Tom Murphy, MD, is a physician in Des Moines, Iowa, and has been an associate of the Millennium Project for 10 years, focusing on global health and epidemics (challenge number 8 of the 15 Challenges of the Future). E-mail temmmmm@aol.com.

The Sounds of Wellness

Music may have charms to suppress the savage gene.

One ancient therapy has been gaining increased currency among health practitioners in multiple fields of medicine: music. Doctors and nurses increasingly credit music with demonstrable healing powers and anticipate that it can play a major role in treating or preventing many health conditions.

“Sound was really overlooked as a healing modality for a long time. But more recently because of the amount of studies—and because it’s a low-cost intervention—we’re seeing it being used more in medical centers,” says Brenda Stockdale, director of mind–body medicine at RC Cancer Centers.

Stockdale’s cancer center incorporates music into a six-week program for patients who are recuperating from—or trying to prevent—heart disease, autoimmune conditions, cancer, diabetes, and other illnesses. The program dedicates one full week to therapies involving sound, with other weeks focusing on nutrition, physical therapy, and other traditional health areas. In Stockdale’s experience, patients who incorporate sound-based therapies and music into their health regimens attain the best results.

“Using a technology of sound can round out a wellness program,” says Stockdale.

Her facility also plans to replace televised news in the waiting room with spa-like music. This will soothe patients who exhibit what Stockdale calls “white coat syndrome”—i.e., nervousness about visiting a doctor.

“That is a great way for medical facilities to start using sound from the moment a person comes in, creating a healthier atmosphere,” says Stockdale.

Music might even influence our genes, Stockdale adds: Her colleague Barry Bittman, medical director of Meadville Medical Center’s Mind-Body Wellness Center, leads sessions in which patients play spontaneous tunes and rhythms on musical instruments. Findings suggest that, following many sessions, the genes each patient carries for heart disease and other conditions are less likely to become active.

We cannot actually change our genes, but outside stresses and conditions may determine whether certain genes will be expressed, Stockdale explains. Music is a healthy influence, and when patients add it to their environs, they raise the odds that genes for health problems will not activate.

“We’re changing the cellular environment. We’re helping healthy genetic expression,” she says.

Stockdale cautions that this study is ongoing, and it is too early to draw hard conclusions from it. But if the preliminary results prove valid, then physicians might eventually design targeted music regimens to actively shape gene expression.

“We will have enough information of the genetic potential, and we will have enough information with all the markers available, to start using music intentionally,” she says.

In the meantime, music therapy is a growing treatment field. Its practitioners use music and sound sequences to help patients manage or relieve chronic pain, immune system disorders, brain damage, mental and emotional disorders, and some developmental disabilities.

Elena Mannes, documentary producer and author, recounts many applications of music therapy in her 2011 book, The Power of Music. For example, she notes that patients in England who had received anesthetics recovered more quickly and had fewer complications if they listened to classical music. Canadian patients who listened to soothing music at regular intervals needed half as much anesthetic as other patients. And at the Beth Abraham Hospital in New York City, patients whose speech was impaired due to a stroke regained some of their speaking abilities after undergoing music therapy.

“To see tears come to the eyes of a neuroscientist as music enables a stroke patient to speak is to witness a moment filled with promise. Science is opening doors to medical applications of music that were unimaginable a decade or so ago,” Mannes writes, adding that “scientists predict a future in which music will routinely be used as a prescription.”

Stockdale leads her patients through sessions using music recordings such as “auto-genics,” which feature acoustics tailored to help the listener’s brain wave rhythm slow down into a more relaxed state. When the brain registers ambient music, Stockdale explains, it secretes chemicals linked to many desirable health effects, including boosting immunity and slowing the heart rate.

“Years ago, we had a mechanistic view of the body. Now we know that the mind and body communicate seamlessly. It’s a constant conversation between mind and body,” Stockdale says. “It is mind and matter affecting each other.”—Rick Docksai

Sources: Brenda Stockdale (interview), RC Cancer Centers, www.brendastockdale.com.

Elena Mannes (interview), Mannes Productions, www.mannesproductions.com. See The Power of Music: Pioneering Discoveries in the New Science of Song by Elena Mannes (Walker, 2011).

Fast Fashion: Tale of Two Markets

Should retailers put the brakes on quick-response manufacturing?

Fashion trends are perpetually changing. A number of large clothing retailers have successfully capitalized on this via a streamlined system involving rapid design, production, distribution, and marketing—what’s known as the “quick-response method.” This method often involves hiring low-paid factory workers in developing countries to manufacture the apparel, keeping prices low for consumers in developed countries.

The quick-response method is an integral part of the “fast fashion” industry, which became prevalent around five years ago and continues to grow. Fast fashion is centered on relatively inexpensive, cheaply made designer knockoffs that go in and out of style faster than the traditional cycle of four fashion seasons.

Fast fashion has grabbed a large share of the apparel market in the United States, according to a study by Iowa State University assistant professor Elena Karpova and graduate student Juyoung (Jill) Lee. This may indicate what could be deemed a long-term fashion trend: Consumers in the United States are gravitating toward lower-priced attire over high quality, longer-lasting clothing. But the opposite holds true in the Japanese market.

In their study, Karpova and Lee compared U.S. and Japanese government data on the issue over a 10-year period.

“I think because U.S. consumers have been price conscious, they generated the whole trend in the industry called ‘fast fashion,’” says Karpova. “American consumers want styles to change quickly, and they want to see new merchandise in their favorite stores almost every week—and at affordable prices.”

Karpova and Lee report that, in general, American consumers frequently replace the inexpensive, lower-quality clothes they purchase. The U.S. apparel industry has attempted to compete with more successful overseas chains by manufacturing its own fast fashions, yet this has backfired somewhat: Americans believe imported clothing to be of higher quality than that produced domestically. To put it another way, American consumers believe that cheap clothing from large European retailers is somehow superior to cheap clothing from large American retailers. (Clothing exports from the United States are not faring much better abroad, either.)

On the other hand, Japanese consumers are willing to pay the higher price tags of domestically made products, which has resulted in fewer purchases of less-expensive imports. The Japanese apparel industry’s emphasis on more expensive, higher-quality goods distinguishes it from foreign competitors in a positive way.

Marketing efforts help drive these trends, Karpova and Lee note. Clothing stores in Japan target older consumers, who are likely to be more interested in long-lasting quality than keeping up with the latest styles, while American advertising targets younger consumers interested in just the opposite.

While it’s likely that both trends will continue, there also exists the possibility that they could reverse course. Recently, Swedish “fast fashion” chain H&M debuted with a big splash in Tokyo’s trendy, style-conscious Harajuku and Shibuya districts and is quickly making inroads in Japan. Meanwhile, Parisian designers at high fashion house Hermès have begun emphasizing what is being dubbed “slow fashion.”

Regardless, fast fashion is not as socially or ecologically responsible as clothing that is well-made, long-lasting, free of sweatshop labor, and capable of being appreciated for longer than a few weeks, Karpova and Lee conclude.—Aaron M. Cohen

Sources: Iowa State University, www.news.iastate.edu.

“The U.S. and Japanese Apparel Demand Conditions: Implications for Industry Competitiveness,” Journal of Fashion Marketing and Management (volume 15, issue 1), Emerald Group Publishing, www.emeraldinsight.com.

The Gamification of Education

Why online social games may be poised to replace textbooks in schools.

The world has entered a bright new technology-driven era, yet the education system remains rooted in a gray industrial past. At least, this is the argument that a growing number of education professionals are making.

One idea for reform that is steadily gaining popularity involves moving learning almost entirely online and declaring textbooks more or less obsolete. Some suggest taking Web-based learning one step further: Online social gaming may become the educational tool of choice.

While traditional education proponents may be quick to dismiss computer games as inconsequential, others argue that a strong precedent for independently motivated online game-based learning has already been established. Examples include PBS KIDS’s interactive whiteboard games, which teach basic subjects to very young children, and the Learning Company’s hugely popular historical learning game, The Oregon Trail.

Advocates for gaming in education also point to professional training situations where games are increasingly replacing lectures and presentations. Further afield, Jane McGonigal, the director of game research and development at the Institute for the Future, has designed award-winning games to help ignite real-world solutions to pressing social and environmental challenges, such as global food security and a shift to renewable energy.

In their book, A New Culture of Learning (CreateSpace, 2011), Douglas Thomas and John Seely Brown argue that curiosity, imagination, and a sense of play—three aspects integral to learning—are largely missing from the traditional textbook-and-test based education system. What’s more, the authors point out, these are all present in massively multiplayer online role-playing games (MMORPGs) like World of Warcraft.

In Thomas and Brown’s view, such games “are almost perfect illustrations of a new learning environment.” In the social gaming world, “learning happens on a continuous basis because the participants are internally motivated to find, share, and filter new information on a near-constant basis,” they write. Unlike midterms and final exams, games associate learning with fun and allow for trial and error (basically, the freedom to make mistakes). They can also encourage exploration, collaboration, and the exchange of ideas while removing unwanted pressures that can interfere with students’ abilities.

Thomas and Brown further point out that players must do a great deal of reading and research (typically on blogs, wikis, and forums) in order to complete quests in MMORPGs. In other words, well-designed games can also motivate kids to read, the authors believe.

Already, one well-funded experimental New York City public charter school, Quest to Learn (Q2L), has practically eliminated textbook-based learning and largely replaced it with game-based learning. (A sister school, ChicagoQuest, is scheduled to open in September 2011.) Q2L describes itself as “a school that uses what researchers and educators know about how children learn and the principles of game design to create highly immersive, game-like learning experiences in the classroom.” There, basic classes such as math, science, languages, and social studies take place in virtual game worlds. There are bad guys and monsters to defeat along the way.

The school also utilizes game design as a teaching tool, with the goal of creating a solid game-based pedagogical model. “Games work as rule-based learning systems, creating worlds in which players actively participate, use strategic thinking to make choices, solve complex problems, seek content knowledge, receive constant feedback, and consider the point of view of others,” according to Q2L.

That being said, some subjects, such as math and science, are more easily “gamified” than others, such as discussion- and essay-based subjects in the humanities (it would be difficult to parse out the subtleties of, say, To Kill a Mockingbird, by teaming up against bad guys in an MMORPG).

Other advantages and disadvantages need to be weighed. One potentially large drawback is that addiction to game play is engineered into the games themselves, according to Scot Osterweil, research director of MIT’s Education Arcade, which develops (and advocates for) educational games. Parents may want their children to study calculus every night, but they might become concerned if that practice became habit-forming, Osterweil noted during a panel on gaming and education at South by Southwest Interactive (SXSW) in March 2011.

Game addiction becomes a much more complex issue when studying and learning are involved, observes Alan Gershenfeld, who serves on the advisory board of the nonprofit organization Games for Change. At another 2011 SXSW Interactive panel, Gershenfeld noted one potential solution for game addiction that is being considered by designers: The characters in the game might be programmed to get tired and ask the kids to take a break.

Then there is what Gershenfeld termed the “chocolate and broccoli question”: How do you convince children to play games that are educational and thus less appealing and hip than games like World of Warcraft? It’s not easy, but it’s doable, he said.

Gamified learning is in the early experimental stage. The jury is still out on whether game mechanics may be more effective than linear presentations of educational content with intermittent quizzes. The only thing that can be said with almost certainty is that the number of such experiments is poised to increase.—Aaron M. Cohen

Sources: Quest to Learn, www.Q2L.org.

A New Culture of Learning by Douglas Thomas and John Seely Brown (CreateSpace, 2011).

Biomimicry to Fight Blindness

Doctors design neuron-compatible implants to restore lost eyesight.

The human eye and a digital camera are structurally very similar, according to University of Oregon physicist Richard Taylor. For that reason, he hopes to adapt computer chips into components that surgeons might use to restore blind patients’ eyesight.

“The ultimate thrill for me will be to go to a blind person and say, ‘We’re developing a chip that one day will help you see again,’” says Taylor. “For me, that is very different from my previous research, where I’ve been looking at electronics that go into computers.”

Taylor’s idea is to use nano-sized, flower-shaped electrodes topped with photodiodes, which collect incoming light. The electrodes will relay signals from the photodiodes to the eye’s own nerve cells (neurons), which carry the signals along a pathway to the brain for processing into visual images.

The “nanoflowers” would undergo construction in a specialized factory. Once they are built, a surgeon could place them in a patient’s eye using a scalpel and other basic surgical tools that are already found in operating rooms everywhere.

A healthy human eye has photodetectors and an optic nerve, explains Rick Montgomery, a doctoral student working with Taylor on the project. When light hits the eye’s surface, the photodetectors respond by sending signals to the optic nerve, which relays them to the brain. The brain then creates sight. Blindness results when neurons are damaged, preventing the signals from reaching the brain. The nanoflower components would bridge these disconnections by receiving light with their own photodiodes and sending signals to still-functioning neurons.

Photodiodes are now a common fixture in solar panels. In Taylor’s electrode model, they generate working vision instead of electrical energy. “It’s like putting a panel of solar cells in the eye and using that energy generated by that cell to let the brain know what it’s seeing,” says Montgomery.

Medical labs have previously tried to jumpstart vision in impaired eyes by means of photodiodes, according to Montgomery. But computer chips and neurons don’t fit well together, since neurons are slender and branchlike and computer chips are square. This means that, no matter how finely crafted a computer chip may be, many of its outgoing signals will be misdirected and never reach the neurons; consequently, the person’s vision will remain limited.

Taylor and Montgomery get around this problem by shaping their electrodes, literally, like flowers, whose branching petals imitate the neurons’ geometry. The nanoflowers’ tips reach close enough to the neurons to deliver many more of their signals.

“We’re mimicking biology,” says Montgomery. “We’re trying to use what evolution has come up with, with the complex geometry that the neuron has, and we’re mimicking that in our electrode.”

Montgomery began working this summer with Simon Brown at the University of Canterbury, in New Zealand, on experiments with various metals to grow the nanoflowers on implantable chips. The two researchers are refining the production techniques and determining which metals would be most compatible with patients’ bodies. The technology could be ready for testing on people in the next 10 years, Montgomery believes. Taylor and Brown will probably start a company that grows and sells the nanoflowers in conjunction with other nanotech companies.

Brown told THE FUTURIST that nanoflowers could achieve even more ambitious goals than curing blindness: If a nanoflower can interact with human neurons to generate eyesight, it might also work with neurons to restore mobility in a person suffering paralysis, vastly improve the functionality of prosthetic limbs, or undo effects of Alzheimer’s and Parkinson’s diseases.—Rick Docksai

Sources: Richard Taylor (interview), University of Oregon, www.uoregon.edu.

Rick Montgomery (interview), University of Oregon, www.uoregon.edu.

Simon Brown (interview), University of Canterbury, New Zealand, www.canterbury.ac.nz.

Futurists and Their Ideas: Marvin J. Cetron on Terrorism and Other Dangers

By Edward Cornish

To protect the United States against terrorists and other aggressors, Defense Department agencies often call on Marvin J. Cetron and his private consulting firm, Forecasting International.

Marvin J. Cetron, founder of Forecasting International Ltd., was born in Brooklyn, New York, in 1930. His father, an accountant, moved the family frequently during Marvin’s early years, but eventually settled in Lebanon, Pennsylvania.

Cetron attended Pennsylvania State University, where he majored in industrial engineering. After graduating, he got a job with the U.S. Navy Department. It was the start of a 20-year career with the Navy.

His first assignment was to the Naval Applied Science Laboratory in the old Brooklyn Navy Yard, where he specialized in planning and resource allocation—tasks that required a great deal of forecasting.

In addition to his day-to-day work, the Navy sent Cetron to Columbia University to earn a master’s degree in production management. “On my own time!” Cetron notes. “It took three years.”

In 1953, the Navy transferred him to Washington, D.C., where he testified before Congress on the need to raise the pay scale for government engineers. “We could not hire them because companies like Sperry Gyroscope were paying 50% more.” While in Washington, Cetron spent two hours briefing then-senators John Kennedy and Lyndon B. Johnson.

“But the proudest thing I ever did for the Navy Yard was setting up a program at New York’s Pratt Institute in which students could study for six months and then work for the Navy’s Applied Science Laboratory for six months. We hired 50 bright students who had not been able to go on to college. They got their degrees in five years and then worked for the Navy for at least three years. Every one of them graduated.”

In the 1960s, the Navy transferred Cetron first to the Marine Engineering Laboratory at Annapolis and then to the Navy’s Advanced Concepts Group in Washington. The Bureau of Ships had 19 laboratories at the time, and Cetron was in charge of forecasting for all of them.

“A lot of my work was resource allocation,” he says. “We would compare Navy missions with what was going on in science and applied research. Then we would allocate dollars according to the importance of the mission.”

During that time, Cetron planned and carried out one of the largest studies of American science and technology ever conducted. It was called QUEST, for Quantitative Utility Estimates for Science and Technology, and it attempted to anticipate new technologies and how they could be applied to naval and marine missions. The Marine Corps’s Harrier vertical take-off fighter jet and ground-effect landing craft both emerged from this work.

In this period, Cetron also toured NATO countries to explain what the U.S. Navy was doing in forecasting, in an attempt to get other governments to establish their own forecasting programs.

Meanwhile, he spent six years of his rare spare time earning a doctorate in research and development management at American University in Washington. He recalls that his most difficult challenge in those years came from then–Secretary of Defense Robert S. McNamara.

“McNamara was determined to cut government waste by combining duplicate functions,” Cetron reports. “The Army and Marines had to use the same tanks. The Navy and Air Force would use the same airplanes. And we had to combine the service laboratories whose functions overlapped. I was responsible for all the basic and applied research labs. Fortunately, my master’s and doctoral theses had been in the Program Evaluation and Review Technique, or PERT. I had first used it in what later became the Polaris program. For McNamara’s plan, it was just what we needed.”

After 20 years of government service, Cetron retired from the Navy and founded his own firm, Forecasting International Ltd., in Arlington, Virginia. The firm prospered immediately and has remained active ever since. Over the years, Forecasting International has carried out forecasts of the computer industry for Apple and IBM; the hospitality industry for Marriott and Best Western hotels; energy technologies for Siemens; and policy planning for the Indonesian Ministry of Economics, the Kenyan Ministry of Finance, and the Brazilian Ministry of Planning.

In 1977, two years before the fall of the Shah in Iran, Cetron advised his clients to pull their investments out of the country.

“The gap in income and wealth between the richest and poorest tenths of Iranian society was enormous, and that is always a warning of instability,” Cetron explains. “Then the Shah very hastily doubled the salaries of his imperial guard and top officers. He was obviously afraid. Once that happened, we knew the end was near.”

Forecasting Terrorist Activities

In 1994, the U.S. Defense Department selected Cetron and his colleagues to plan and manage its Fourth Annual Defense Worldwide Combating Terrorism Conference.

“The first three meetings of the conference were limited to specialists in the terrorism field,” Cetron reports. “But I had the idea of inviting some general forecasters as well. They might not know much about terrorism, but they understood how to look into the future.”

Cetron’s innovation worked out “spectacularly well,” he says. The meeting report, called Terror 2000, “anticipated virtually the entire course of global terrorism in the years ahead.

“The use of coordinated attacks on distant targets and the probability of a second, much more successful attack on the World Trade Center all appeared in that report,” Cetron notes. “We even predicted the use of hijacked aircraft to attack the White House or Pentagon, but this last forecast was later removed from the report at the request of the State Department, which feared giving terrorists ideas they might not have on their own.

“All of these insights came from the futurists,” Cetron continues. “The subject specialists rejected most of them, but we were sure enough about our forecasts to include them over their objections. I’m sorry we turned out to be right, but it was hard not to feel some satisfaction. When I first suggested consulting futurists, back in the early 1950s, the admiral in charge of the Bureau of Material told me that if I got involved with some of those ‘nuts’ I would lose my security clearance!”

Teamwork at Forecasting International

Cetron attributes much of the success of Forecasting International to his long-time colleague, Owen Davies.

“We met in 1985 when an editor suggested that Owen help me write a book. I had written a number of textbooks by then, but it was not until Owen and I began working together that my writing for the general public began to take off.

“It turned out that Owen is a very capable forecaster in his own right. Over the last 20 years, we have probably carried out 200 studies together, and he has participated fully in all of them. In many ways, he has become the air beneath my wings.

“In the year 2000, Forecasting International undertook a study of a large Asian nation for a government agency aligned with the intelligence community. Part-way through our work, it became clear that trends alone would not be enough for this forecast. We needed a set of scenarios to guide our analysis. Owen prepared them in an afternoon, and they shaped the remainder of our research. When the project was over and he wrote up the result, that effort formed nearly one-fourth of our report.”

Then early in 2004, Davies sent Cetron a brief note about the nature of Islam and the origins of extremist antipathy toward the West. Cetron encouraged him to expand his thoughts, and these eventually were supplemented by a survey of futurists, terrorism specialists, military officers, and industry executives whose companies were likely to be affected. The resulting report became required reading at all three of the major military graduate schools.

More recently, Davies has speculated that, if the United States loses access to Middle Eastern oil, the nation might rapidly develop its shale oil resources and thus become a world leader in oil production. “In the long run,” Davies suggested, “America would grow much richer.”

After the tsunami struck Japan in March 2011, Forecasting International received a request for a quick study of natural disasters that might devastate American cities. Davies’s research revealed that Honolulu has twice been the target of tsunamis vastly greater than the 30-foot wave that struck Japan. One of these waves was 255 feet high, and the other, more than 1,000 feet.

Davies also found that a subduction zone similar to the one responsible for Japan’s tsunami stands at the end of a narrow sea channel leading to Anchorage, Alaska, and that a wave resulting from an earthquake there would likely destroy parts of Oakland and San Francisco, California.

But the most endangered U.S. city, Davies reported, may be St. Louis, Missouri, which faces earthquakes from the New Madrid fault, flooding by the Mississippi River, tornadoes, and, among less-natural disasters, massive environmental pollution and the highest crime rate in the United States.

Cetron now worries a little less about Islamic terrorism but more about home-grown terrorism in the United States, as alienated Americans attack the people and institutions they are angry at. The need to protect America from its own citizens may lead to further invasions of people’s privacy and to security measures suggestive of the world George Orwell described in his novel 1984.

About the Author

Edward Cornish is the founding editor of THE FUTURIST. E-mail ecornish@wfs.org.

As Blogged: Insights on the Futuring Profession

Futurist bloggers reflect on what it means—and what it takes—to be a futurist.

What does it mean to be a professional futurist? Or a student of futures studies? What are the most necessary skills, the most important attributes, the most integral responsibilities? Here are a few excerpts from bloggers weighing in on the subject on WFS.org.

When They Say You Cannot Know the Future, They Are Planning It For You

Posted by Eric Garland, Wednesday, June 6, 2011

In 15 years of work in the field of foresight, I have learned two things:

1. You can always know more about where your future is heading.

2. When somebody says it’s impossible to know the future, it is usually because they are planning yours for you, and theirs for them.

Notice at no time do I mention predicting the future. … This is a question of knowing who you are, where you are, and where you are going—as an individual, a group, a nation, a species. The people who say we cannot know more about our future through a simple understanding of large, powerful trends are not only wrong, they are doing harm.

… If somebody tells you that you cannot do this, that you should not do this, that it is impossible—ask yourself why they want you to keep thinking the way you are. I bet it is not so that you can be more innovative, flexible, or successful. Perhaps it is because they like the way things are just fine.

… Our world absolutely cries out for the ability to see over horizons, to anticipate the next shock and the golden opportunity.

Top 10 Attributes of FS [Futures Studies] Students

Posted by Alireza Hejazi, Thursday, April 28, 2011

… Become a skillful questioner. As a student of FS you need to ask good questions. Asking good questions at good times is an art, and one of the missions that FS students should accomplish is mastering this art. Different questions can be raised in different areas and times, but a good question is one that is targeted at a definite goal at the most appropriate time.

… Rationalize your expectations. … Our expectations should be based on logical and reasonable foundations. We are not going to solve all of the world’s problems in just one night.

… Learn to teach FS to others. … Lifelong education is needed for all of us, but our college years are [finite]. After some years of education you’ll graduate and perhaps find an opportunity to teach FS to others in both academic and informal ways.

… Develop your personal strategic plan. Not only as a futurist, but also as a [normal] person, you need to develop your personal strategic plan. You may be always asked to forecast for others, but firstly you should learn to [forecast] for yourself. After or during your college years, you should learn how to apply FS tools and techniques in your personal strategic planning. Designing such a plan gives you necessary direction and leads you through your life and your futuring endeavor.

An old saying, “If you are a physician, heal yourself first,” reminds us of the necessity of personal futuring before forecasting for other [people’s] lives and work.

Small Business Futures

Posted by Verne Wheelwright, Sunday, March 20, 2011

I strongly believe that everyone can best learn about futures tools and methods by starting with Personal Futures. This is not just because I am so invested in Personal Futures (which I acknowledge). It’s about learning systems. How do we learn?

We learn best and quickest from what we can experience, and Personal Futures is based on each individual’s life experience. This allows individuals to learn a totally new method or tool and relate that method or tool to personal experience. The result is instant learning, because the experience is already built in. This approach also appears to be effective in large organizations for leadership training in long-term thinking.

About the Authors

Eric Garland is the founder and managing partner of Competitive Futures Inc. and author of How to Predict the Future and WIN!!! (Competitive Futures, 2011).

Alireza Hejazi is founder and developer of the FuturesDiscovery Web site.

Verne Wheelwright is author of It’s YOUR Future … Make It a Good One! (Personal Futures Network, 2010).

Turbulence-Proofing Your Scenarios

By Rick Docksai

Investing in an effective scenario-planning exercise and using the experience wisely can have a big payoff for organizations.

Scenario Planning in Organizations: How to Create, Use, and Assess Scenarios by Thomas J. Chermack. Berrett-Koehler. 2011. 272 pages. Paperback. $34.95.

Plenty of scenario-planning books tell readers how to build scenarios, but some pieces are missing. Few books offer advice for implementing scenarios or for determining if one’s organization is achieving optimal results from them, according to Colorado State University scenario-planning professor Thomas Chermack.

“Pick up any of the popular scenario planning books and check the index for assessment, evaluation, or results. I predict that you will not find these entries,” Chermack writes in Scenario Planning in Organizations.

Trying to pick up where he thinks other books have left off, Chermack introduces “performance-based scenario planning.” In his view, the work does not end with building scenarios. He presents tools for first developing scenarios, then carrying them out and measuring their comparative worth.

Chermack speaks through a live narrative, using a real-life tech firm—he assigns it the alias “Technology Corporation”—that adopted scenario planning in order to better formulate mission strategies, more efficiently manage team projects, and execute needed internal reforms. A team of the corporation’s staff met for six rounds of planning over eight weeks.

The team compiled copious amounts of data about their organization’s business model, the industry environment, and the critical forces at play. They identified and ranked by strategic importance the factors that were certain and those that were uncertain. As Chermack explains, “When truly uncertain forces have been isolated, energy can be spent trying to understand those forces and how they might play out across a range of possible futures.”
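Chermack gives no formula for this step, but the sorting he describes resembles the impact/uncertainty ranking common in scenario work: Forces that score high on both axes become the axes of the scenarios themselves, while high-impact but largely certain forces are treated as givens. The following is a minimal sketch in Python, with hypothetical force names and 1-to-5 scores; it illustrates the general technique, not Chermack’s own tooling.

    # Illustrative only: rank driving forces by impact and uncertainty,
    # a common scenario-planning step. Names and scores are hypothetical.
    forces = [
        # (driving force, impact 1-5, uncertainty 1-5)
        ("Regulatory change in core market", 5, 4),
        ("Commodity input prices", 4, 5),
        ("In-house engineering capacity", 4, 1),
        ("Customer shift to mobile platforms", 3, 3),
    ]

    # Sort so the forces that most deserve scenario attention come first.
    ranked = sorted(forces, key=lambda f: f[1] * f[2], reverse=True)

    for name, impact, uncertainty in ranked:
        role = "scenario driver" if uncertainty >= 3 else "predetermined element"
        print(f"{name}: impact={impact}, uncertainty={uncertainty} -> {role}")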

Then they developed sets of scenarios that explored the external environment and how their organization would likely respond to changes within it. In follow-up sessions, they speculated how their scenarios might change if certain elements in the environment changed. Chermack calls this latter phase “wind tunneling,” in reference to the wind tunnels that aerodynamics researchers use to test new airplane models.

“Turbulence is an environmental characteristic that puts stress on the object in question, be it an airplane or an organization,” he writes.

Chermack lays out specifically how the team gathered data and analyzed it, and how they used Web sites, podcasts, and other digital media to communicate scenarios to each other and to the rest of their organization. He adds further suggestions for how any leader can effectively manage scenario projects and avoid many potential pitfalls.

Technology Corporation’s endeavor ends with the team members agreeing to expand from exclusive production of intellectual property to production of new, useful technology products. Toward that end, they formulate plans for more contracts with R&D partners, selling new technologies, and increasing cross-functional collaboration. Chermack observes that, as they worked, they noted approvingly that communication and understanding among their organization’s staff had improved markedly.

“Many expressed surprise that such a simple exercise could have such profound results,” he writes.

Finally, the team members assessed the results post-implementation. They completed short surveys about their satisfaction or dissatisfaction with the exercise and its degree of usefulness, and—more importantly, according to Chermack—what they learned: They detailed what they knew now that they did not know before, and how they and others would function differently following the scenario project. Chermack specifies several ideal survey formats and the kinds of questions that they should include.

“Learning is a prerequisite to change,” Chermack writes. “People cannot change their behaviors, have strategic insights, or create a novel way of seeing a situation if they have not learned.”

Chermack’s assessments also include performance questionnaires on the improved productivity, new ideas, and cost savings that participants expect will result from the scenario projects. Technology Corporation’s participants reported that a $100,000 investment in the exercise generated ideas expected to bring in $250,000 in new revenue, for a net benefit of $150,000.
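The arithmetic behind that figure is simple enough to verify, and it generalizes to any scenario project with an estimated cost and expected return. A quick check, using only the numbers reported above:

    # Reported figures for Technology Corporation's scenario exercise.
    cost = 100_000         # investment in the scenario-planning project
    new_revenue = 250_000  # revenue expected from ideas it generated

    net_benefit = new_revenue - cost  # $150,000, matching the reported benefit
    roi = net_benefit / cost          # 1.5, i.e., a 150% return on investment

    print(f"Net benefit: ${net_benefit:,}; ROI: {roi:.0%}")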

“While costs of scenario projects can seem high at first, consider the implications of saving from one major catastrophe or one major strategic insight,” writes Chermack.

Not all organization leaders are convinced that scenario activities hold merit. Chermack’s Scenario Planning in Organizations addresses their doubts head-on. The author acknowledges where prior scenario literature may have left unanswered questions, and he combines background theory and real-world strategizing to fill in the gaps. His book will be a useful addition to the libraries of organization leaders everywhere.

About the Reviewer

Rick Docksai is an assistant editor of THE FUTURIST. E-mail rdocksai@wfs.org.

The Uncertain Future of the English Language

By Edward Cornish

Parlez-vous “Globish”? If English is your only language, you’re probably doing okay now. But you might not be prepared for the future, suggest the authors of Globish and The Last Lingua Franca.

Globish: How the English Language Became the World’s Language by Robert McCrum. W.W. Norton. 2010. 331 pages. $26.95.

The Last Lingua Franca: English Until the Return of Babel by Nicholas Ostler. Walker & Company. 2011. 330 pages. $28.

When a Spaniard talks with a Chinese person, what language do they speak?

Chances are good that it isn’t either Spanish or Chinese. Instead, it’s English, the language they are most likely to have in common.

The rise of English as the leading language for international communications makes a fascinating story, and Robert McCrum, associate editor of the London Observer, tells it well in Globish: How the English Language Became the World’s Language.

McCrum begins with the humble origins of English among the Angles, or Anglii, a people living in what is now Denmark and northern Germany during the days of the Roman Empire.

During the Dark Ages, many Anglii migrated to England, where their German dialect gradually evolved into the English language of today. Along the way, English picked up words from French, Latin, and other languages.

Living on an island, the English people became intrepid seafarers, who carried their language around the world. Today, every continent has a substantial group of English speakers, and English has gained increasing importance as a lingua franca, a language used among people who do not share the same mother tongue. Other languages, such as Greek, Latin, and French, have served this purpose in the past, but English is now the most popular choice.

The need for a lingua franca has intensified in recent years with the growth of travel and international sports, as well as the globalization of the economy. To succeed in today’s world, individuals and governments alike recognize the value of knowing how to speak and read English.

To make things easier for people whose native language is not English, Jean-Paul Nerrière, a French-speaking former IBM executive, has developed a simplified version of English that he calls Globish.

“Globish,” reports McCrum, “starts from a utilitarian vocabulary of some 1,500 words, is designed for use by non-native speakers, and is currently popularized in two handbooks: Découvrez le Globish and Parlez Globish.”

Nerrière believes that Globish will not only improve global communications, but will also limit the spread of English. Many French people are horrified when English words like hot dog and jumbo jet infiltrate their beloved French language.

Globish is not the first attempt to simplify the English language. Back in 1930, the English linguist Charles K. Ogden invented what he called Basic English, which got much publicity after World War II. Basic English had an 850-word list for the beginner’s vocabulary.

Interest in Basic English later faded, but recently it influenced the creation of the Voice of America’s “Special English” for news broadcasting and “Simplified English,” designed for technical manuals.

Globish, Basic English, and other simplifications of English can help non-English speakers to acquire a working knowledge of the language, but most people will need to go beyond a stripped-down vocabulary if they want to get the full benefit of the world’s vast English-language resources. So regular users of English-language resources will want easy access to a good dictionary.

Meanwhile, totally artificial languages continue to have advocates. Esperanto, a language developed by Polish scholar L. L. Zamenhof in the nineteenth century, has a vocabulary based on a variety of European languages, so it is more “neutral” than a language based solely on English. However, the world’s intellectual resources are largely in English; relatively little is written in Esperanto or any other invented language.

Surveying the Lingua Francas

In contrast to McCrum, Nicholas Ostler, chairman of the Foundation for Endangered Languages, takes a less triumphalist view of the English language in his recent book, The Last Lingua Franca: English Until the Return of Babel.

Ostler describes the rise and fall of lingua francas through the centuries. Greek, Persian, Latin, French, and many other languages have had their day in the sun but later declined as other languages came into favor.

So it will likely be with English, Ostler suggests in his concluding chapter, “Under an English Sun, the Shadows Lengthen.” However, Ostler admits that “the current status of English is unprecedented.”

He adds that, simultaneously, English “has a preeminent global role in science, commerce, politics, finance, tourism, sport, and even screen entertainment and popular music. With no challenger comparable to it, it seems almost untouchable. Even in China, the only country with a language that has more native speakers, every school child now studies English. And India, set to overtake China in population by 2050, is already trading on an expertise in English inherited from the British Empire and studiously preserved and fostered ever since.”

So, Ostler concludes, “two polar opposites define the extremes of what is possible. International English might grow to become Worldspeak, as a single fully global lingua-franca might be called, available as a universal auxiliary (or indeed primary) language to every educated adult. Or it might retreat as other powers advance, losing its global users and status until it is confined to the lands where it is still spoken as a mother tongue. A third, intermediate, option would see English retained as a world language, but developing on a separate standard from that used by native speakers.”

Ostler offers some intriguing explanations for the rise and fall of languages. When the Romans ruled much of the world, their language became popular with people who wanted to get ahead. When Roman power declined, Latin might have been expected to decline with it. But Latin found new strength as the official language of the Roman Catholic Church, and most books were written in Latin until Johannes Gutenberg developed movable type and books began to be printed in quantity.

McCrum explains in Globish that, before Gutenberg, books were costly, handmade, and rare. Gutenberg’s development of movable type allowed books to be published quickly and cheaply, so people of modest means could buy them, and they did. But readers preferred books published in their own languages—French, German, Italian, and so on—rather than in Latin, which most of them had difficulty reading.

During the Enlightenment, Latin got another boost when it became, for a time, the language of science and scholarship. Physicist Isaac Newton had his Principia published in Latin in 1687, and many other scientists published in Latin well into the nineteenth century. But by then the tide had turned decisively against Latin, because most readers preferred to read texts in the vernacular (their mother tongues). For a time, German became popular as the favored language for scientific publishing, but the popularity of German in science declined sharply after the Nazis took control in Germany.

Both McCrum and Ostler do well in outlining the history and current situation of English and its rivals, but they fail to tackle the policy issue: Would it be desirable for the world to have a single language, and, if so, should it be English?

From an economic standpoint, a single language might seem highly desirable: Business transactions would be easier, and considerable money could be saved by not having to hire translators. On the other hand, the initial cost of training millions of non-English speakers to be fluent in English would be enormous, and there would later be the problem of finding new employment for thousands of teachers who have long made a living teaching French, Spanish, German, Chinese, and other languages.

About the Reviewer

Edward Cornish is the founding editor of THE FUTURIST.

Books in Brief

Edited by Rick Docksai

Living Libraries

The Atlas of New Librarianship by R. David Lankes. MIT Press. 2011. 408 pages. Illustrated. $55.

A library in this century will be valuable not so much for its book collections as for its community space, argues library information sciences professor R. David Lankes in The Atlas of New Librarianship. He describes a new ethos of “participatory” librarianship taking hold in the profession: librarians as dynamic facilitators of conversation and knowledge creation in their communities.

Lankes cites one survey in which a majority of teenagers said they wanted their local librarians to run blogs that would review and recommend books, with space for readers to comment. This would enable them not only to see book recommendations, but also to know who was recommending them.

Although librarians aren’t blogging en masse just yet, some are hosting faculty blogs and servers through which users can explore academics’ articles. Also, many are quickly adopting social-networking sites, such as Flickr and Facebook. Lankes further describes how library catalog systems are becoming more user-friendly; they may reach the point where, as with iPhones, users can tailor them for personal use by adding or removing custom apps.

Some libraries construct live social space, such as a café or a music performance center that has a stage with pianos on which musicians can practice. Lankes also discusses how libraries can encourage aspiring local entrepreneurs and cultivate civic awareness among their neighborhoods’ elementary- and secondary-school students.

Lankes wrote The Atlas of New Librarianship with librarians and scholars in mind, but the text covers such a vast array of pertinent subjects that almost any reader—parent, community leader, business professional, student, job seeker, etc.—may find a few topics of personal interest.

Neighborhood-Based Futuring

Collective Visioning: How Groups Can Work Together for a Just and Sustainable Future by Linda Stout. Berrett-Koehler. 2011. 198 pages. Paperback. $17.95.

You don’t have to be a prolific speaker, brilliant writer, or gifted organizational leader to bring about change in your community, says nonprofit director Linda Stout in Collective Visioning. What you need, in her view, is a collective vision that you can rally people to work together to achieve.

Stout’s principle is “collective visioning,” and it means focusing on an ideal of what you want your community to be, rather than on the particular problem that you want to solve. She shares stories of organizations, faith groups, and circles of neighbors and friends who successfully applied collective visioning. For example, residents of a low-income community in Louisiana prevailed on the state’s legislature to close down a juvenile prison that had been abusing its inmates, and then convert the property into a community college.

In another case, after Hurricane Katrina struck in 2005, a group of students at a dilapidated school in New Orleans took charge of planning repairs to its classrooms and buildings. They also added brand-new garden plots, outdoor meeting spaces, and energy-efficient architectural designs.

Stout guides readers on how they, too, can carry out collective visioning in their own communities. She explains how one would bring together a diverse group of people and get them to interact in an atmosphere of equality and acceptance; then, through session exercises and activities such as storytelling, he or she would inspire them, break down barriers of mistrust, and make sure that everyone is sufficiently heard.

Collective Visioning is a powerful depiction of the positive impacts a motivated group of people can have on their community. Community activists and all who want to improve their neighborhoods’ quality of life may find in it both inspiring examples and useful tips.

Arctic Ice in the Hot Seat

The Fate of Greenland: Lessons from Abrupt Climate Change by Philip Conkling, Richard Alley, Wallace Broecker, and George Denton. Photographs by Gary Comer. MIT Press. 2011. 216 pages. Illustrated. $29.95.

As Greenland’s climate goes, so may go the climate of the rest of the world, according to conservationist Philip Conkling, glaciologist Richard Alley, oceanographer Wallace Broecker, and geologist George Denton. In a firsthand account richly illustrated with dozens of photographs of Greenland’s landscapes and glaciers, they explain how researchers’ findings about the land mass’s geological past and present raise grave concerns about its future—and ours.

Researchers agree that Greenland experienced several major climate shifts in its past, and each one precipitated weather changes and sea-level rise across the globe. Greenland seems to be on the verge of yet another major shift due to warming trends that are melting ever-larger quantities of its ice sheet. The world cannot afford not to pay attention.

Uncertainty lingers over exactly how much warming will take place. Some amount is inevitable, however, and it will surely be higher if humans persist with business as usual, the authors warn. As small amounts of ice continue to disappear from the ice sheet’s edges, the center will lower and warm up. Eventually, warming will imperil all of the remaining ice. The full process would take place over the next few centuries, but coastal cities everywhere could be in jeopardy from flash floods within the next few decades. Meanwhile, the changing climate would inflict desertification and storm patterns that wreck economies and food supplies on every continent.

The Fate of Greenland beautifully presents the challenges of forecasting climate change and the care that researchers must put into getting it right. It also compellingly explains the serious harms that humanity stands to suffer if it mistakes forecasters’ uncertainty for an excuse to take no action on greenhouse gas emissions. Scientists and non-scientists from all walks of life will find this an eloquent and timely read.

Forward-Thinking Classrooms

The New Digital Shoreline: How Web 2.0 and Millennials Are Revolutionizing Higher Education by Roger McHaney. Stylus. 2011. 247 pages. Paperback. $29.95.

Web 2.0 is second nature to millennial-generation students, but it baffles many educators, notes management information systems professor Roger McHaney. He has good news for the grownups: If they learn to understand Web 2.0 and incorporate it into their classroom practices, they will stay relevant and their students will stay engaged.

McHaney profiles many virtual learning software programs, educational Web sites, and mobile apps, and how teachers can use each. He also identifies larger market trends, such as printed textbooks’ replacement by wikibooks and e-books.

He further describes how digital media influence the millennials’ learning patterns—e.g., they are more inclined toward peer collaboration, creativity, and processing multiple streams of information. Over time, he speculates, schools will adapt by basing more course material on projects from previous classes of students and by expanding their provision of video-editing software, recording facilities, and Internet interfaces. The most effective teachers, according to McHaney, will act less like instructors and more like facilitators, guiding the students as they take charge of their own learning experiences.

Also, mobile Web services will become components of classroom instruction. Students will consult search engines during class discussions and ask professors questions by texting them, while the professors podcast their own lectures for reuse by classrooms everywhere.

As McHaney makes clear, teachers have much to learn. But they have much to contribute, as well. Students need teachers’ help to separate valuable information from useless information, and to use digital technologies properly while avoiding the pitfalls of laziness, sloppy scholarship, and compliant thinking.

The New Digital Shoreline is a fascinating overview of where education is heading. Parents, teachers, and everyone else involved in learning would be well-advised to read this book.

New U.S. Leadership for a New World

The Next Decade: Where We’ve Been … and Where We’re Going by George Friedman. Doubleday. 2011. 243 pages. $27.95.

This century will challenge U.S. leaders to exercise wider foreign-policy vision than ever before, according to George Friedman, founder and CEO of geopolitical intelligence firm STRATFOR, in The Next Decade.

The United States has traditionally considered certain countries more strategically important than others, but in this century practically every country on earth will matter, Friedman argues. Leaders will need to develop a balanced global strategy focused not solely on combating terrorism, but on the myriad issues playing out in all corners of the globe.

Friedman sees major shakeups ahead in U.S. foreign policy. For example, the United States will distance itself from Israel and strive to accommodate Iran; it will also attach far more importance to several countries now regarded as only somewhat important, such as Poland and Singapore.

Across the globe, alliances will shift, Friedman predicts. Germany will build closer economic ties with Russia, while Turkey and the Arab states increasingly eye Iran as a competitor and adversary. Europe will struggle with internal economic rivalries and fade as a global power center. Brazil might become a formidable economic and military influence in Africa.

As Friedman assesses each global region, he details how it will affect U.S. national interests and how leaders should respond. In general, he advises pragmatic policy focused on cultivating balances of power within each region, rather than building democracy or preserving historic alliances.

Friedman displays fresh thinking on many of the oldest, most complex diplomatic problems facing the United States and its allies. Foreign-policy enthusiasts may not all agree with every argument he presents in The Next Decade, but they will surely admire its depth of research and clarity of voice.

July-August 2011, Vol. 45, No. 4

  • My First Meltdown: Lessons from Fukushima
  • Technology's Role in Revolution: Internet Freedom and Political Oppression
  • Eroding Futures: Why Healthy Soil Matters to Civilization
  • Treading in the Sea of Data
  • Augmented, Anonymous, Accountable: The Emerging Digital Lifestyle
  • Our Naked Data
  • The Case Against Cash

About this Issue (July-August 2011)

Dealing with Data

Fifteen years ago, Future Survey editor Michael Marien organized a debate at the World Future Society’s annual meeting to address the issue of whether the ongoing Information Technology Revolution would turn out to be all that good for us. In other words, is more information coming at us a good thing or a bad thing?

I’ve always felt that more information is better than less information, but now I’m not so sure. As Marien said, “having much more information is bad for our heads. … It produces infoglut, which may well be the greatest under-studied problem of our time.” (“Information Technology Revolution: Boon or Bane?” THE FUTURIST, January-February 1997, page 11.)

Among the consequences of infoglut that Marien forecast were the devaluation of information and increased stress and workloads. All of these problems have largely come to pass. So now the question becomes: What do we do about it?

In this issue of THE FUTURIST, business trend analyst Erica Orange offers a “data abacus” to help organizations to assess and leverage the increasingly digital lifestyles of consumers and citizens (“Augmented, Anonymous, Accountable: The Emerging Digital Lifestyle”). One aspect of that digital lifestyle involves our money, so David R. Warwick, an advocate for the cashless society, shows what governments need to do to advance digital transactions that improve society’s safety and security (“The Case Against Cash”).

As for dealing with the data itself, one major issue is keeping it secure. International IT security advisor William H. Saito advocates higher standards at the design level, with better self-regulation of the industry (“Our Naked Data”).

Also, in a special Web exclusive, Eli Pariser, the former executive director of MoveOn.org, warns that more-personalized Internet searching may have hidden side effects (“Escaping the Filter Bubble”). Pariser will be exploring this topic in greater depth for our September-October issue, so stay tuned.

And since the “data deluge” is likely to continue accelerating, analyst Richard Yonck offers insights on a variety of technologies, from nanotech to genetics to search engines, that will keep us from drowning in data (“Treading in the Sea of Data”).

* * *

While I admit to often feeling overloaded by the information coming at me at the World Future Society’s conferences, I always look forward to them because of the variety of perspectives that can be found nowhere else. The opportunity to sift through the data and to tap into the great energy store of new ideas is one not to be missed!

For information about WorldFuture 2011: Moving from Vision to Action, to be held July 8-10 in Vancouver, visit www.wfs.org/content/worldfuture-2011.

Cynthia G. Wagner
Editor

Tomorrow in Brief (July-August 2011)

Detecting Callers’ Stress Levels

A computer program that recognizes different levels of urgency in callers’ voices could help crisis centers respond more quickly to the most serious emergencies.

“Stress and negative emotions, in general, have a strong influence on voice characteristics,” according to researchers at Delft University of Technology and the Netherlands Defense Academy.

Rapid talking, variations in pitch (rising or falling intonation, for example), and changes in breathing rates are among the vocal cues that allow the program to gauge urgency and alert responders who may already be overwhelmed with calls during a major crisis. The system may also prove beneficial in military situations.
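
The Delft model itself is not published in this brief, but a minimal sketch can illustrate how such vocal cues might be fused into a single urgency score for triage. The baselines, weights, and threshold below are hypothetical, chosen only to show the idea:

```python
# Minimal sketch: combining vocal cues into an urgency score for call
# triage. The baselines and weights below are hypothetical illustrations,
# not the Delft researchers' published model.

def urgency_score(speech_rate, pitch_std, breath_rate):
    """Estimate urgency on a 0-1 scale from three vocal features.

    speech_rate -- syllables per second (rapid talking suggests stress)
    pitch_std   -- standard deviation of pitch in Hz (intonation swings)
    breath_rate -- breaths per minute (breathing changes under stress)
    """
    # Normalize each feature against an assumed "calm caller" baseline,
    # capping each factor so one extreme cue cannot dominate the score.
    rate_factor = min(speech_rate / 4.0, 2.0)     # ~4 syllables/s baseline
    pitch_factor = min(pitch_std / 30.0, 2.0)     # ~30 Hz baseline swing
    breath_factor = min(breath_rate / 16.0, 2.0)  # ~16 breaths/min at rest

    # Weighted average: a perfectly calm caller (all factors = 1.0) maps
    # to 0.0; a caller at twice every baseline maps to 1.0.
    raw = 0.4 * rate_factor + 0.35 * pitch_factor + 0.25 * breath_factor
    return max(0.0, min(1.0, raw - 1.0))

# A dispatcher console might flag high-scoring calls for immediate routing:
if urgency_score(speech_rate=6.5, pitch_std=55.0, breath_rate=24.0) > 0.5:
    print("Priority call: route to first available responder")
```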

Source: International Journal of Intelligent Defence Support Systems (2011, Vol. 4, No. 2), Inderscience, www.inderscience.com.

WordBuzz: Followee

A followee is someone you follow (or “fan” or “friend”) in an online social network such as Twitter. It is just one of many neologisms submitted by wordsmiths to Merriam-Webster’s Open Dictionary.

New words and clever coinages showcase the rapid and fluid movements of the English language, but they are not necessarily accepted as words in the official Merriam-Webster dictionary (which, incidentally, still spells e-mail with a hyphen, unlike just about everyone else).

More new words at Open Dictionary, http://nws.merriam-webster.com/opendictionary/.

3-D Movies Go Mobile

Film lovers on the go may soon have the best of all worlds: 3-D movies delivered to their cell phones.

Combining new mobile radio standards with advanced video coding, researchers at the Fraunhofer Institute for Telecommunications in Berlin have developed a data compression technique that permits the transmission of the two video streams required for a 3-D effect.

The 3-D option promises to enhance the consumer experience as more people access YouTube and other favorite Internet sites via their smartphones. The technology also will increase the options for businesses, medical responders, and other communicators who need high-definition mobile imaging.

Source: Fraunhofer-Gesellschaft, www.fraunhofer.de/en/.

Supersizing Microscopes

Bringing microscopic samples up to where researchers can see and “touch” them is the aim of a new touchscreen developed by researchers in Finland.

The innovation merges Web-based virtual microscopy with supersized (minimum of 46 inches) multitouch displays. The samples are digitized with microscopy scanners and stored on shared image servers. Researchers accessing the image can zoom in and move around it, much like using Google Maps.
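
The sources do not detail the viewer’s internals, but Google Maps-style navigation conventionally works by slicing each zoom level of the gigapixel scan into fixed-size tiles and fetching only those that intersect the viewport. A minimal sketch of that tile math follows; the tile size and server URL layout are assumptions:

```python
# Minimal sketch of tile-pyramid viewport math, the usual mechanism behind
# "zoom and pan like Google Maps." Tile size and URL scheme are hypothetical.

TILE_SIZE = 256  # pixels per square tile; a common convention

def visible_tiles(left, top, width, height, zoom, tile_size=TILE_SIZE):
    """List (zoom, col, row) for every tile covering the viewport.

    left, top     -- top-left corner of the viewport, in pixel coordinates
                     of the chosen zoom level
    width, height -- viewport size in pixels
    """
    first_col, first_row = left // tile_size, top // tile_size
    last_col = (left + width - 1) // tile_size
    last_row = (top + height - 1) // tile_size
    return [(zoom, col, row)
            for row in range(first_row, last_row + 1)
            for col in range(first_col, last_col + 1)]

# Panning a 1024x768 display window across zoom level 5 of a scanned slide:
for z, col, row in visible_tiles(4096, 2048, 1024, 768, zoom=5):
    # Hypothetical tile-server layout; only these 12 tiles are fetched.
    print(f"https://tiles.example.org/slide42/{z}/{col}_{row}.jpg")
```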

The developers believe that the multitouch microscope will enhance interactive teaching as well as virtual research.

Sources: University of Helsinki, www.helsinki.fi. MultiTouch Ltd., www.multitouch.fi.

From Album to App

The creation of albums—collections of songs built around a theme or cohesive musical sound—could have become a lost art in the age of single-tune downloads. But thanks to application development for tablets like the iPad, artists and their record labels now have a new way to offer their fans an affordable augmented musical package.

In his blog for Forrester Research, consumer products analyst Mark Mulligan reports that music giant EMI has released an app version of the band Swedish House Mafia’s Until One album. The app includes the trio’s book and photos, along with lyrics, videos, interviews, discussion forums, games, social-networking feeds, and dynamically updated news content.

“This is a great innovation in music product,” says Mulligan, “but EMI needs to understand that that is exactly what this is: the start of the next generation of music products, not ‘just an app.’”

Source: “Finally a 21st Century Music Product, And It’s From EMI!” blog post by Mark Mulligan, March 24, 2011, Forrester Research Inc.

Futurists and Their Ideas: Visionary Forecasting with Graham T. T. Molitor

By Edward Cornish

One futurist ventures where few other forecasters dare to go, offering forecasts for the next thousand years and explaining how he does it.

Graham T. T. Molitor, former vice president and legal counsel for the World Future Society, bases his forecasts on the assumption that successive waves of economic change are the primary forces shaping human life.

“The dominant economic activities around which humanity is centered constitute the single most influential force shaping humanity, its institutions and civilizations,” Molitor asserts.

So if you can anticipate a major economic shift, you can anticipate other changes in human life.

“Past eras have been molded by shifts from agriculture to manufacturing, then to services, and now (among the so-called modern nations) to information era enterprises,” Molitor says. “Now emerging to sustain strong economic growth through the next millennium are at least five economic activities, which are poised to dominate advanced nations. These five enterprises include leisure, hospitality, recreation, and entertainment; life sciences; meta-materials; a new atomic age; and a new space age.”

Each of these economic activities can become the leading wellspring of jobs and gross domestic product in country after country. Previously dominant activities will not disappear, however; they will decline in relative importance. Major changes do not occur instantaneously, but rather develop in stages over time, so knowing these stages allows a forecaster to anticipate much of what is to come.

Molitor’s thinking about change draws on his long experience dealing with public policy issues as a lawyer, political scientist, and consultant for U.S. defense agencies, the White House, and politicians running for office, including presidential candidates Richard Nixon and Nelson Rockefeller.

Governmental policies, Molitor discovered, typically follow a series of steps, starting with people’s ideas about what should be done. Also, a significant change may occur in one nation or region long before it occurs in others.

This observation led him to the idea of forerunner jurisdictions—nations or areas that tend to be on the leading edge of change—as well as laggard jurisdictions, which typically cling to old ways. The jurisdiction may be a city, a province, or even an entire nation.

Molitor began thinking about forerunner nations while touring Europe. He noted that Sweden led the way in adopting social policies, so he made a special study of that nation. Why, he wondered, was Sweden moving ahead rapidly toward visionary goals of providing a better life for its people?

Molitor recognized that political leaders in most countries rarely have the resolve, tenure of office, and popular support to pursue distant future goals. So typically, political leadership succumbs to the pressures and problems of the moment. Commitments to immediate situations leave little room for addressing long-range visionary goals.

Sweden, along with a few other countries, has become adept at utilizing high-level, blue-ribbon gatherings of experts to come up with serious visionary propositions and targets.

“Sweden’s Royal Commission reports are exemplary,” Molitor asserts. “These undertakings are structured to reflect the viewpoints of experts representing almost every important sector of the nation that is materially affected. Opportunities for additional views also are structured into the rigorous review process. It comes as no surprise that most of the new and novel concepts put forward in these Royal Commission reports are swiftly passed into law.”

Britain’s Royal Commission reports are also exemplary, according to Molitor. In fact, he says the blue-ribbon Royal Commission’s reports are probably the most carefully thought out and technically proficient of any nation and are always right on the mark intellectually.

“Unfortunately,” he adds, “while Britain’s scholarship is superb and frequently unsurpassed, there tends to be a reluctance to implement a new concept swiftly. As a result, a quick response is often lacking, and hopes may be dashed.”

The Acceleration of Change

Though change takes place faster in some nations than others, it is now faster everywhere than it was in times gone by.

The Stone Age dominated human activities for millions of years. Then came—in increasingly rapid succession—the Copper Age, the Bronze Age, the Iron Age, and the Steel Age.

Broadly characterizing contemporary times, it might be fair to identify an ongoing succession of materials technologies, such as Plastics (1900s) and Silicon (1950). Beginning in the current millennium, Molitor foresees at least four potential eras based on a further succession of materials technologies: a Bio-materials Age (starting around 2100), followed by a Meta-materials Age (around 2200-2300), Atomic Matter (around 3000), and Anti-matter (beyond 3000).

These expectations for the future offer an opportunity for governments or private industry to plan their future and develop forward-looking strategies. Government policies and other factors can significantly delay, accelerate, or prevent the realization of these anticipated developments. What actually happens will depend heavily on the public policy decisions made by various governments.

Molitor has made a specialty of analyzing how government policy is shaped and has identified 22 stages in the formulation of a policy. At the beginning of the process is the generation and discussion of ideas about what should happen. The ideas lead to innovations, events, and issues (the emergence of topics about which there is a clear difference of opinion). Near the end of the process come governmental regulations, judicial reviews, and, finally, revisions of state or national constitutions.

A Life’s Work of Futuring

The capstone of Molitor’s career has been the editing (in collaboration with George Thomas Kurian) of the two-volume, 1,115-page Encyclopedia of the Future (published by Simon & Schuster Macmillan, New York, 1996).

Molitor has now closed his consultancy, Public Policy Forecasting in Potomac, Maryland, and has retired, with his wife Anne, to western Pennsylvania. During the course of his career in forecasting, Molitor amassed a vast library covering a wide variety of subjects. In 2007, he donated some 13,000 of these books to Regent University Library in Virginia Beach, Virginia.

About the Author

Edward Cornish is the founding editor of THE FUTURIST and author of Futuring: The Exploration of the Future (WFS, 2004).

Graham T. T. Molitor may be reached by e-mail at gmolitor@comcast.net.

A Chemical Mission to Mars

Methane lures astrobiologists to look below the Martian surface.

For millennia, humans have suspected that life exists on other planets. Science writer Marc Kaufman thinks that it does—and that we might have to look no further than Mars.

Life Among the Stars?

A cosmologist cautions against optimism in our search for extraterrestrial beings.

FUTURIST staff editor Rick Docksai asks noted scientist Paul Davies for his views on our prospects for finding life elsewhere in the universe. Davies is an Arizona State University cosmologist and the author of The Eerie Silence (Houghton Mifflin Harcourt, 2010).

Rick Docksai: We know that there are billions of stars in our galaxy, and that quite a few of them have planets. We’ve seen a few of those planets. But—just referring to our galaxy—what are the odds of at least one of those planets not only being habitable, but actually being inhabited?

Paul Davies: The odds are not just unknown, but unknowable in our present state of ignorance. The reason is simple: We have no idea what is the probability that life will emerge on an Earthlike planet if you have one. Traditionally, that number was assumed to be very close to zero. Today, it is fashionable to assume it is near one. There is no justification for that fashion. If we knew the physical mechanism that turns non-life into life, we could have a go at estimating the odds. We have no idea what that mechanism was, so guessing how likely it is to happen is completely pointless.

Docksai: If another planet has given rise to microbial life, how likely or unlikely is it that microbial life would give rise over time to intelligent life, as happened on Earth?

Davies: That is a tractable question, because at least we know what the physical mechanism is that evolves simple cells into intelligent beings—variation and natural selection. Darwinism could be used in principle to estimate the odds for intelligence to emerge, but sadly in practice we have no idea how to do that calculation. There seem to be many very unlikely accidents of evolution that preceded the evolution of intelligence on Earth, so it’s easy to believe that the probability is very small. But it may not be.

Docksai: How much information can we currently obtain about planets in other star systems and said planets’ potential to sustain life?

Davies: This is getting better all the time. Kepler [NASA’s program searching for habitable planets] is compiling an impressive inventory of extrasolar planets. Unfortunately, Earthlike planets [i.e., habitable, roughly] are still hard to spot with current technology. But I am convinced there are lots of Earthlike planets out there.

Docksai: Our tools for studying other stars and other planets continue to evolve and enable us to learn more. Looking ahead, what new information do you hope we might learn about the planets in other star systems in the next 10–15 years? How might our capacities to study them expand?

Davies: The holy grail is to measure the composition of the atmospheres of Earthlike planets. We know how to do this, for example with fancy space-based spectrometers combined with systems to blot out the glare of the parent star. But it will be expensive. I don’t see NASA doing it in the next 15 years in the current financial climate.

Docksai: Given the tough economic climate worldwide, space agencies have seen their already-sparse revenues shrink further. How optimistic are you that astronomy will progress in years ahead despite thin funding streams?

Davies: I am cautiously optimistic about finding traces of life on Mars, as both NASA and ESA have good Mars exploration programs. If we find extant life, for my money it would just be Earthlike life that got there from here in impact ejecta (or vice versa).

For more information, see The Eerie Silence: Renewing Our Search for Alien Intelligence by Paul Davies (Houghton Mifflin Harcourt, 2010. 242 pages. $27). Davies’s Web site is http://cosmos.asu.edu.

“There is a very good chance that [a near-future Mars mission] will say definitively, ‘Here are organic compounds,’” says Kaufman, a Washington Post science reporter and the author of First Contact (Simon & Schuster, 2011).

Astronomers believe that, billions of years ago, Mars was warm, had liquid water, and could have supported life. Mars cooled over time, and its surface water disappeared, Kaufman explains. But there could be much more ice below the surface, and maybe some remaining microbes.

“Potentially there was life on Mars three or four billion years ago, when it was wetter and warmer. And as conditions changed, it might have migrated underground,” he says.

Kaufman cites Michael Mumma, a NASA astronomer who in 2009 discovered frequent methane emissions from several surface sites across Mars. This could indicate life, says Kaufman, because on Earth, more than 90% of methane comes from living organisms; the rest is produced by volcanoes and other geological phenomena.

Mumma told THE FUTURIST that, in the two years since his methane discovery, he and his research team gathered data on Mars’s atmosphere from 24 Earth-based observatories. NASA will publish the findings in late summer 2011. Mumma is hopeful that the information will help scientists judge whether Mars’s methane implies Martian life—or just Martian geological activity.

“We’re following up on the methane production to identify if the release repeats annually, but also to measure gases that might be key tests of biology versus volcanism,” says Mumma.

More insights may emerge from ExoMars, a pair of robotic missions that NASA and the European Space Agency (ESA) will co-launch to examine methane and other trace gases on Mars. According to Mumma, ExoMars may find important chemical clues. For example, if the methane is accompanied by sulfur dioxide, it is probably geological in origin; if hydrogen sulfide accompanies it, then it almost certainly came from something living.

“We are hopeful that, if we see a methane hotspot, we will look closely for these other gases to see what’s really happening,” says Mumma.

While finding life would be momentous, according to ESA ExoMars Project scientist Jorge Vago, so would finding signs of geological movement. Geological activity would suggest a warm inner core. This means that Mars might have enough heat in zones below its surface to sustain microbial life.

“In either case, it will mean the planet is not dead—either from a geological point of view or from a biological point of view,” says Vago.

The first ExoMars mission will launch in 2016 and, upon arriving at Mars in mid-2017, will deploy a rover that will land on the surface and spend four days analyzing the soil and air. The spacecraft above will continue orbiting and analyzing atmospheric gases for the next two years.

A follow-up ExoMars mission in 2018 will carry a rover with a drilling arm that will tunnel two meters into the soil to obtain dirt and rock samples. Vago says that this drilling will be the deepest that humans have ever pried into the planet.

“This is a 3-D rover,” says Vago. “For the first time, we will be able to look into Mars’s third dimension, that of depth.”

The rover will deposit the samples into a cache. Sometime after 2024, a third robotic mission may retrieve the samples and fly them back to Earth for astronomers to study in person.

“This mission in 2018, you can think of it as the first element of Mars sample return,” says Vago. “The long-term aim is Mars sample return.”

Viewing Mars’s inner core will probably not be possible any time in the near future, Mumma concludes, but he considers studying the gas emissions up close to be the next-best thing. They may reveal much about what is taking place under the Martian surface.

“Detecting these effluent gases would be one of the most direct clues. Other than that, we’re stuck with looking at surface land forms,” he says.—Rick Docksai

Sources: Marc Kaufman (interview), The Washington Post; author of First Contact: Scientific Breakthroughs in the Hunt for Life Beyond Earth (Simon & Schuster, 2011. 213 pages. $26).

Michael Mumma (interview), NASA, Goddard Space Flight Center, www.nasa.gov/goddard.

Jorge Vago (interview), European Space Agency, www.esa.int.

Four Scenarios for Co-Working

Sharing workspace offers potential benefits in an uncertain economy.

Telecommuters, freelancers, and others without a regular office to anchor their workday may suffer from loneliness or need a more professional environment than a local coffeehouse in which to conduct business. One solution, co-working, may offer ways to improve both working lives and productivity.

With co-working, independent and freelance workers voluntarily share an office space, if not necessarily a common employer. Emphasizing cooperation over competition, the arrangement enables remote (i.e., otherwise placeless) workers to create a community, a support system, and a strong professional network among themselves. Co-workers report having found opportunities to collaborate, share skills, and subcontract with one another, and perhaps not surprisingly, many find they are more productive in such an environment.

Regardless of the type of work that is performed, the co-working spaces themselves can be run on either a nonprofit or a for-profit basis; they typically charge a monthly membership fee (generally inexpensive), and the level of membership can vary depending on how much time one plans to spend at the office.

There were more than 700 such spaces around the world as of March 2011, according to online co-working magazine Deskmag.com. While that number may seem small, it is significant: It represents around twice as many facilities as there were just twelve months prior. The movement is clearly growing fast, but its direction is not entirely certain at this point.

To help gain a clearer picture of co-working’s possibilities, a scenario analysis workshop was conducted by Thomas Chermack, the director of Colorado State University’s Scenario Planning Institute, and Angel Kwiatkowski, the founder of Cohere, a co-working space in Fort Collins, Colorado. Projecting several years ahead, the group at Cohere developed a set of near-term scenarios.

After going through hundreds of sticky notes, the group pinned down the two variables that participants agreed would be most important in determining co-working’s future. The first was an internal game changer (will a given co-working group hold together?). The second was an external one (is the economy stable or unstable?). The group then created four scenarios based on these two variables:

1. Stable economy/stable community. In the best-case scenario, co-working has gone mainstream and its appeal has expanded. More and more companies recognize it as a viable way to increase efficiency, productivity, and employee satisfaction and well-being. As a result, employees are offered this option upon being hired. Most co-working spaces are staffed 24 hours a day to accommodate everyone’s schedule.

2. Stable economy/unstable community. As in the above situation, everyone is generally doing well career-wise. However, most of the advantages of co-working—including side benefits such as educational classes, guest speakers, social mixers, and other activities—have disappeared. According to this scenario’s authors, “new members often arrived to an empty or near empty space and received no orientation or details about their membership.” In the absence of any genuine leadership or investment in the community, those hired to run these facilities don’t really know what they’re doing or why they’re doing it (aside from the paycheck).

With no real sense of community and little to keep people there, members come and go. As trust and camaraderie evaporate, those who remain erect cubicle walls. There is very little in the way of communication (much less collaboration) in the space.

3. Unstable economy/stable community. Despite the ongoing recession, “fierce loyalty and tight networking bonds” among long-term members enable co-working spaces to flourish.

Meanwhile, corporate co-working franchises begin popping up. Cheaper to join, these offer “more lavish amenities” but lack the same sense of community as the smaller, less-profit-oriented spaces. As a result, they tend to attract a different, less tightly knit crowd, and the turnover rate is higher. Ultimately, the smaller model proves more sustainable, while the larger franchises struggle.

4. Unstable economy/unstable community. Upon arrival, co-workers (if you can even call them that) walk through a turnstile, slide their credit cards through a plexiglass partition, and then choose an empty stall in which to work in isolation. Everything is pay-per-minute. There is little to no interaction between people. There is also little trust and security (you’d be wise to take your valuables with you if you leave the room).

While this scenario may be more satirical than realistic, the point is clear: Without an emphasis on community building, co-working as it exists today will either take a turn for the worse or disappear altogether.

In the end, Kwiatkowski believes co-working will most likely evolve somewhat along the lines described in the third scenario. On the one hand, there will be co-working spaces “run by really passionate people who love what they’re doing,” she says. They may not believe in growth or have any interest in scaling their activity. However, they will be fully engaged and immersed in the communities they are creating.

“On the other side of the continuum,” Kwiatkowski notes, “you have the ‘chain restaurant’ [model] of co-working: people franchising and opening multiple spaces and hiring community managers.” With the support of corporations, partnerships, and sponsorships, these franchises may or may not be interconnected, and may ultimately be more accountable to their investors than their members.

“That’s the division I’m already starting to see,” she says. “We’re polarizing on opposite ends.”

So while many “officeless” workers may increasingly be tempted to join the ranks of co-working in the future, they may want to look for those office spaces that emphasize community over facilities. When it comes to co-working’s future, smaller may be better.—Aaron M. Cohen

Source: Angel Kwiatkowski (interview), Cohere LLC, www.coherecommunity.com.

Gaming for Better Decision Making

Overconfidence can lead to poor decisions, as gamblers should know.

A new Web-based game developed by researchers in the U.K. endeavors to help users quantify their level of confidence to improve decision making.

World of Uncertainty draws on mathematics, statistics, critical thinking, knowledge management, and educational psychology, and consists of a 10-question quiz designed to help people make better decisions. After answering each question (on religion, politics, general knowledge, and so on), the player is asked to indicate how confident she is in her answer.

This latter metric determines the number of points awarded, in a manner similar to a gambling game. If the player is supremely confident in one of her answers, she can wager all of her points on being right to double her total, but she will receive virtually no points if she is wrong. If she is unconfident and indicates as much by betting nothing, she earns roughly the same number of points whether she is right or wrong, as some points are awarded just for answering the question.
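
In effect, the scoring behaves like a wager that pays even odds, plus a small flat award for participating. Here is a minimal sketch under those assumptions; the article describes only the qualitative behavior, so the base award and payout ratio are illustrative:

```python
# Minimal sketch of confidence-weighted quiz scoring as described above.
# The flat award and 1:1 payout ratio are assumptions, not the game's
# published rules.

BASE_AWARD = 10  # points granted just for answering a question

def score_question(total, bet, correct):
    """Return the player's new total after one question.

    total   -- points held before the question
    bet     -- points wagered (0 <= bet <= total), a proxy for confidence
    correct -- whether the answer was right
    """
    assert 0 <= bet <= total
    payout = bet if correct else -bet  # the wager pays even odds
    return total + payout + BASE_AWARD

# Supreme confidence roughly doubles the stake when right and nearly
# wipes it out when wrong ...
print(score_question(100, bet=100, correct=True))    # -> 210
print(score_question(100, bet=100, correct=False))   # -> 10
# ... while betting nothing earns the same modest award either way.
print(score_question(100, bet=0, correct=True))      # -> 110
print(score_question(100, bet=0, correct=False))     # -> 110
```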

At the end of the quiz, the player receives a score for the number of questions she answered correctly and—much more importantly—how her knowledge on the subject related to her confidence level. “The more quizzes you will try, the more accurate you will get in estimating and expressing your confidence,” reads the message at the end of the ordeal.

Later iterations of the game may involve graphics or more action-packed game play.

“Whether the choices facing us are simple or complex, a greater awareness of uncertainty and of our own biases can improve the quality of our decision making. We believe there’s real potential for people to acquire that awareness through computer games,” says David Newman of Queen’s University Belfast, one of the Web site creators.

In a paper that the team submitted as part of its funding request, the researchers outline their goals for the project and describe why video games are ideal tests for human reactions to uncertainty. In the game environment, “the play is not limited to following a pre-written story.… A player may explore, gather evidence, estimate risks, make decisions, and see the consequences of these decisions.” Faced with immediate effects of over- or underconfidence, such as a high or low score, players gain the ability to grasp their propensity to commit errors of false self-assurance.

“Our vision is of a society transformed from one in which most people prefer simple stories, and avoid discussing uncertainty, to one where a large proportion of the population has the skills of exploring uncertain evidence and can estimate uncertainty,” the researchers write.

Patrick Tucker

Source: The Engineering and Physical Sciences Research Council, www.epsrc.ac.uk.

Purification at the Nano Scale

A Japanese water-filtration system could help quench the world’s growing thirst.

Recent events in Japan have sparked concerns about freshwater availability in many parts of the country. Fortunately for Japan, the nation is also the world’s leader in water filtration.

Japanese manufacturer Nitto Denko is currently marketing what it claims is the world’s most efficient desalination filter, the SWC6 MAX, a reverse-osmosis nanomembrane system released in 2010. According to the company, the filter can remove “salt and other minerals as well as bacteria and viruses from seawater, and lower the 3.5% of salt in seawater to 0.0075%”—lower than the salt content of freshwater. The SWC6 MAX was invented by Hisao Hachisuka and is currently in use in a water treatment facility in Australia.

At present, SWC6 MAX water is rather expensive. The cost of filtering an acre-foot is more than $650, because of the amount of energy required to push water through the filter. That price tag is beyond the means of the world’s poorest inhabitants but within reach for the Japanese. The company has not said whether it will use the technology in the areas affected by the March 2011 tsunami or radiation. However, numerous other technologies exist for effective wastewater filtration that could be used in Japan, including ozone injection and nanofiltration.
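
A quick unit conversion puts that figure in more familiar terms; the $650 price comes from the article, and the conversion constants are standard:

```python
# Converting the quoted filtration cost from acre-feet to everyday units.
# The $650-per-acre-foot figure comes from the article; the conversion
# constants are standard.

CUBIC_FEET_PER_ACRE_FOOT = 43_560   # one acre flooded one foot deep
LITERS_PER_CUBIC_FOOT = 28.3168

cost_per_acre_foot = 650.0          # USD, the article's lower bound
liters_per_acre_foot = CUBIC_FEET_PER_ACRE_FOOT * LITERS_PER_CUBIC_FOOT

print(f"{liters_per_acre_foot:,.0f} liters per acre-foot")  # ~1,233,480
print(f"${cost_per_acre_foot / liters_per_acre_foot * 1000:.2f} per 1,000 liters")
# -> roughly $0.53 per 1,000 liters (about one cubic meter)
```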

One of the more interesting water purification technologies to emerge recently is electro-filtration through silver nanowire fiber. The silver nanowire mesh, connected to a 20-volt power source, zaps bacteria and pathogens, making the water drinkable. This method, pioneered by Stanford University professor Yi Cui, has been shown to be more effective and less energy-intensive than other filtration methods that require large amounts of energy to push water through filters.

At present, silver nanowire filtration is also cost-prohibitive for the world’s poorest regions, due to the high cost of constructing silver nanowires. But in 2010, Taiwanese chemist Yi-Hsiuan Yu patented a process for mass production of silver nanowires. If this method is effective, it could greatly reduce the cost of production for these nanowires, making Yi Cui’s filter more practical for the world’s poor. Korean firm Toptec has patented the world’s first nanofiber mass production system.

The Global Water Recycling and Reuse System Association of Japan has a large, government-funded mandate to “develop [a] comprehensive water recycle system and expand the system, making the most use of Japanese technologies and know-how.” The Japanese government sees water filtration and green infrastructure as a key export area for the future.

The United Nations estimates that 2.8 billion people will live in a water-stressed environment by 2025. The world’s poorest people need access to cutting-edge desalination technologies, coupled with advanced filtration, to increase the availability of freshwater and to remove toxins from wastewater. Wastewater recycling on the community level is essential for water stability, many experts contend. According to the Japanese government, there will be a $1 trillion market for safe water reclamation and recycling by 2025, so the potential private client list is considerable.

Patrick Tucker

Sources: Nitto Denko Corporation, www.nitto.com. Stanford University, http://stanford.edu. Personal interviews.

Future Active (July-August 2011)

Edited by Aaron M. Cohen

Mapping New Zealand’s Long-Term Future

Over four days in March 2011, conference goers at StrategyNZ: Mapping our Future envisioned the most preferable long-term future for New Zealand and searched for innovative ways to meet future challenges. The conference, sponsored by the Sustainable Future Institute, was held in Wellington.

StrategyNZ participants discussed the past, present, and future of the country, looking ahead almost half a century to the year 2058. With an eye on policy, attendees explored a variety of issues, including health, education, technology, the environment, and the economy.

The event kicked off with a two-day futures-studies course conducted by Peter Bishop, associate professor of strategic foresight and coordinator of the graduate program in futures studies at the University of Houston. Bishop gave overviews of futuring and forecasting techniques as well as strategic planning methods.

Afterwards, during a workshop held over the next two days, participants broke into groups of 10 to create “strategy maps”—the strategic foresight tool from which the conference gets its name.

Strategy maps are diagrams that graphically depict a set of goals and strategies. These visual tools can help people see more clearly the ways in which objectives, resources, and various other facets of a given strategy interrelate with one another, providing a clearer sense of cause and effect and the best way to move forward with a plan of action.

At the conference’s conclusion, workshop groups had an opportunity to present their strategy maps to several members of New Zealand’s Parliament.

Coordinators plan to present the results of StrategyNZ at the World Future Society’s annual meeting, WorldFuture 2011: Moving from Vision to Action, in Vancouver, Canada. Co-presenter Wendy McGuinness, chief executive of the Sustainable Future Institute, chairs both the New Zealand chapter of the World Future Society and the Millennium Project’s New Zealand node.

Audio files and PowerPoint presentations from most of the speakers are available for free download on the Sustainable Future Institute’s Web site, as are the various workshop groups’ outputs.

Attendees were also involved in the preparation of an e-book: StrategyNZ: Mapping our Future Reflections. This followed the publication of two reports (also available online): StrategyNZ: Mapping our Future Workbook and StrategyNZ: Mapping our Future Strategy Maps. Wendy McGuinness hopes to publish a further book next year, which currently has the working title Exploring New Zealand’s Long-Term Future.

Sources: StrategyNZ, http://strategynzsite.info. Sustainable Future Institute, www.sustainablefuture.info.

Delphi 2.0: Wild Cards and Weak Signals

The European Commission–funded iKnow (Interconnecting Knowledge) project is reaching out to those in the larger futuring and foresight community for help evaluating key wild cards and weak signals in its extensive, ever-expanding database. The project calls its online evaluation an “international Delphi 2.0 survey.”

In classical Delphi polling, groups of experts are individually and anonymously surveyed in a series of rounds. They are presented with a summary of responses in each subsequent round and work to narrow those down, such as by assigning the responses a rating. The process may continue until the experts reach consensus on a given issue, if that is the end goal.

In a Real-Time Delphi survey, the entire process takes place online and is opened up to anyone interested—it’s a method of crowdsourcing, in a way. The series of rounds is eliminated (but anonymity is preserved), and responses are available for viewing as soon as they are submitted.
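
A minimal sketch of that mechanic: each anonymous rating immediately updates the group summary shown back to respondents, with no discrete rounds. The data structures are illustrative only, not the iKnow project’s actual software:

```python
import statistics

# Minimal sketch of the Real-Time Delphi mechanic: no rounds, anonymous
# ratings, and a group summary recomputed the moment a response arrives.

class RealTimeDelphiQuestion:
    def __init__(self, prompt):
        self.prompt = prompt
        self.ratings = []  # anonymity: no participant identifiers stored

    def submit(self, rating):
        """Record one rating (say, 1-10) and return the up-to-the-moment
        group summary that is shown straight back to the respondent."""
        self.ratings.append(rating)
        return self.summary()

    def summary(self):
        return {
            "responses": len(self.ratings),
            "median": statistics.median(self.ratings),
            "spread": round(statistics.pstdev(self.ratings), 2),
        }

q = RealTimeDelphiQuestion("How likely is wild card X before 2030? (1-10)")
for rating in (3, 7, 6, 8, 5):
    print(q.submit(rating))  # each submission instantly updates the group view
```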

The iKnow Project’s database is intended to aid the practice of studying, understanding, and anticipating the wild cards and weak signals that are “potentially ‘shaping and shaking’ the future of science, technology, and innovation.”

Wild cards are widely understood as low-probability, high-impact events; iKnow divides them into three categories: intentional, unintentional, and nature-related “surprises.”

Weak signals, on the other hand, are trickier to define. The iKnow project classifies them as “unclear observables warning us about the probability of future events (including wild cards).” Examples the organization gives include current policies, past wild cards, and emerging issues, which reveals just how wide-ranging the term can be.

The database’s subject categories range from information and communication technologies to the social sciences and humanities.

Source: iKnow, www.iknowfutures.eu.

A Playful Utopia

This past March, South By Southwest Interactive (SXSW) attendees in Austin, Texas, had the chance to attend Plutopia 2011: The Future of Play. Held at the Mexican American Cultural Center, the event showcased configurable, experiential, and interactive works—art installations, projections, demonstrations, games, live performances, and more. The works on display exemplified some of the ways that emerging technologies are being incorporated into the arts.

San Francisco–based game manufacturer Sifteo showed off its interactive gaming cubes, originally developed at the MIT Media Lab. (Co-founder David Merrill was a featured speaker at Plutopia 2011.) This electronic gaming system features small blocks with color LCD screens, built-in wireless communication, and motion sensors that enable them to respond to players as well as other Sifteo Cubes. According to Merrill, the company “aims to empower people to interact with information and media in physical, natural ways that approximate interactions with physical objects in our everyday lives.”

In the courtyard of the cultural center, French artists Grégory Lasserre and Anaïs met den Ancxt, who collaborate as Scenocosme, displayed a hanging garden of interactive musical plants. These digitally enhanced “hybrid” plants, which they call Akousmaflore, respond to motion and touch by making different sounds.

Nearby, the Edge of Imagination Station invited partygoers to create digital stop-motion animation sequences using various toys, props, and chalk drawings. Across the courtyard, the University of Texas Department of Computer Science showed off its robot soccer team. (Robot soccer was reported about in the January-February 2011 issue of THE FUTURIST.)

Other highlights included the improvisational group Text of Light (featuring Lee Ranaldo of seminal art-punk band Sonic Youth and artist/composer Christian Marclay, among others), who spontaneously composed moody soundscapes to accompany projections of experimental filmmaker Stan Brakhage’s abstract films. Futurist, design critic, and science-fiction author Bruce Sterling was also on hand to give the opening speech, as he has done at Plutopia’s previous SXSW parties.

Founded in 2007, Plutopia is a future-focused entertainment company that creates what it calls “sense events.” The company, whose name is a “mash-up” of pluralist utopias, looks ahead to a technologically enhanced, interconnected future that is worth getting excited about.

Source: Plutopia, www.plutopia.org.

As Tweeted: Science Fiction Futures: Aliens and Robots Need Not Apply

On a recent afternoon on Twitter, we pondered some of the stereotypes about how the future is depicted in science fiction books and movies.

Our Twitter followers responded to our quest to find robot- and alien-free scifi with a very diverse reading and movie-renting list of suggestions. And they reminded us that good science fiction also doesn’t have to be about the future, though of course that is our preference. ;)

@WorldFutureSoc: Can anyone name a good #scifi story (book, play, movie) that does not involve aliens or robots?

@ebonstorm: The Foundation Series—No aliens or robots are harmed in the making of these three books by Isaac Asimov. #scifi

@Geofutures: It has robots.

@heathervescent: I don’t think these have either: The Stars My Destination. Trouble on Triton.

@Geofutures: 11 of my top 20 #futurist #movies involve neither (Blade Runner more clones than robots) http://bit.ly/cr7Km1 #scifi

@WorldFutureSoc: @Geofutures Thx for link to www.Futuristmovies.com! Just bookmarked it.

@heathervescent: I believe you can substitute the Foundation series as God Emperor Leto’s unwritten 10000 yr rule. #scifi #mashup

@heathervescent: If that last tweet was too cryptic it was a Dune + Foundation Series mashup. :)

@WorldFutureSoc: Yeah, it kinda was! ;)

@heathervescent: Its a challenge to fit two major scifi series’ mashed up in a 140char tweet. :)

@jasonporath: Primer. The Fountain.

@jlindenger: I love “The Fountain”

@MattCCrampton: Easy, any book by William Gibson. Start with Neuromancer.

@JasonSwanson: OHHH good suggestion!

@jlindenger: Sharon Shinn’s Samaria series which starts a bit more fantasy and shifts to scifi as things are explained.

@jlindenger: Octavia Butler’s Parable of the Talents/Parable of the Sower are probably among the best in that category.

@jlindenger: I think she takes issue with labeling it SF, but... @margaretatwood ‘s “The Handmaid’s Tale”

@johnadamyates: What about giant bunny rabbits?- Donnie Darko

@heathervescent: Here’s the FTW submission: Clockwork Orange. (via Andrew on FB) Extra Credit: Solaris.

@jotaace: Those who mentioned Solaris: It’s a story about a BIG alien.

@Geofutures: And an unusually realistic one http://bit.ly/guBpEJ

@ahnie: No idea if they’re “good” but CSI (TV)-based should qualify as #scifi w/o aliens/robots

@WorldFutureSoc: @ahnie My mind-set is toward the future, so I sometimes forget that not all #scifi is set in the future.

@ahnie: I *think* Firefly/Serenity was w/o aliens except (d)evolved humans. Can’t remember if any robots. Clearly time to re-watch.

@jlindenger: bazinga!

@ahnie: had to check Urban Dictionary for “bazinga” Is it definition 2, 5, or 6? #TragicallyUnhip

@heathervescent: Just search for Sheldon and Big Bang Theory.

@ahnie: BBT & Sheldon are UD def #1. Sometimes being TV-less is el-sucko. #ThemeSongIsNowMyEarworn

@WorldFutureSoc: The #scifi chat took on a life of its own! #BBT is science, fiction, robotless except for Sheldon’s “virtual presence”

@OscarMopperkont: I believe 1984 by Orwell isn’t mentioned yet #scifi

@jlindenger: there’s a ton. Enjoy!

@WorldFutureSoc: Great list! Thx so much! (#scifi books/movies w/o robots, aliens)

@WorldFutureSoc: smacks forehead for forgetting great movies like Jurassic Park and WarGames.

Follow the World Future Society’s Twitter page and search all of our tweeps at http://twitter.com/WorldFutureSoc.

The Futurist List (SciFi free of aliens and robots)

Literature

Air by Geoff Ryman (St. Martin’s Griffin, 2004). @justinpickard

The Chrysalids by John Wyndham (Michael Joseph, 1955). @OscarMopperkont

A Clockwork Orange by Anthony Burgess (William Heinemann, 1962; also a film). @heathervescent

Dune by Frank Herbert (Chilton Books, 1965; also a film). @JITHyperion

The End of Eternity by Isaac Asimov (Doubleday, 1955). @jimmath

Fahrenheit 451 by Ray Bradbury (Ballantine Books, 1953; also a film). @Whaatson

Flashforward by Robert J. Sawyer (Tor Books, 1999). @Geofutures

Glasshouse (Orbit [U.K.], Ace [U.S.], 2006) and Halting State (Orbit [U.K.], Ace [U.S.], 2007) by Charles Stross. @jlindenger

The Handmaid’s Tale by Margaret Atwood (McClelland and Stewart, 1985; also a film). @jlindenger

The Moon Is a Harsh Mistress (1966) and I Will Fear No Evil (1970) by Robert A. Heinlein (G. P. Putnam’s Sons). @Ouroboros4ever, @jlindenger

Neuromancer by William Gibson (Ace Books, 1984). @MattCCrampton

Nineteen Eighty-Four by George Orwell (Secker and Warburg, 1949). @OscarMopperkont

Parable of the Sower (Four Walls Eight Windows, 1993) and Parable of the Talents (Seven Stories Press, 1998) by Octavia E. Butler. @jlindenger

River of Gods by Ian McDonald (Simon & Schuster, 2004). @abranches

Samaria series by Sharon Shinn (e.g., Archangel, Ace Books, 1997). @jlindenger

Seeker by Jack McDevitt (Ace Books, 2005). @heathervescent

Snow Crash (Bantam Books, 1992) and The Diamond Age (Bantam Spectra, 1995) by Neal Stephenson. @jlindenger

The Stars My Destination by Alfred Bester (Sidgwick and Jackson, 1956). @heathervescent

Trouble on Triton by Samuel R. Delany (Bantam Books, 1976). @heathervescent

Film

Blade Runner (directed by Ridley Scott, 1982; based on the novel Do Androids Dream of Electric Sheep? by Philip K. Dick). @Geofutures

Brazil (1985) and Twelve Monkeys (1995), directed by Terry Gilliam. @ryonck

Children of Men (directed by Alfonso Cuarón, 2006; based on the novel The Children of Men by P. D. James). @jotaace

Code 46 (directed by Michael Winterbottom, 2003). @justinpickard

Donnie Darko (directed by Richard Kelly, 2001). @johnadamyates

Eternal Sunshine of the Spotless Mind (directed by Michel Gondry, 2004). @jasonporath

The Fountain (directed by Darren Aronofsky, 2006). @jasonporath, @jlindenger

Frau im Mond (Woman in the Moon) (directed by Fritz Lang, 1929). @jotaace

Gattaca (directed by Andrew Niccol, 1997). @JasonSwanson, @justinpickard

Jurassic Park (directed by Steven Spielberg, 1993; based on a novel by Michael Crichton). @WorldFutureSoc

Just Imagine (directed by David Butler, 1930). @jotaace

Kosmicheskiy Reys (Cosmic Rays) (directed by Vasili Zhuravlov, 1936). @jotaace

Minority Report (directed by Steven Spielberg, 2002; based on the short story “The Minority Report” by Philip K. Dick). @ryonck, @WorldFutureSoc

Primer (directed by Shane Carruth, 2004). @jasonporath

Serenity (directed by Joss Whedon, 2005). @transhumanistic, @ahnie

Stereo (1969), Crimes of the Future (1970), and eXistenZ (1999), directed by David Cronenberg. @jotaace

Things to Come (directed by William Cameron Menzies, 1936; based on The Shape of Things to Come by H. G. Wells). @jotaace

WarGames (directed by John Badham, 1983). @WorldFutureSoc

Television

Big Bang Theory (created by Chuck Lorre and Bill Prady, CBS, 2007). @jlindenger, @heathervescent, @ahnie, @WorldFutureSoc

CSI (Crime Scene Investigation) (created by Anthony E. Zuiker, CBS, 2000). @ahnie

Firefly (created by Joss Whedon, Fox, 2002). @transhumanistic, @ahnie

Technology’s Role in Revolution: Internet Freedom and Political Oppression

By Evgeny Morozov

Revolutions depend on people, not on social media, and the Internet both promotes democracy and thwarts it, says a foreign-policy scholar. Cyber-utopians be warned: Authoritarian regimes are adapting to the Internet age.

You Say You Want a Twitter Revolution?

Author Evgeny Morozov’s new book looks critically at Internet-based democracy activism.

The Net Delusion: The Dark Side of Internet Freedom by Evgeny Morozov. PublicAffairs. 2011. 408 pages. $27.95.

In 2009, reports that dissidents in Iran were using Twitter prompted many Western commentators to proclaim that social media was fomenting a democratic Iranian revolution—only to be disappointed when the “revolution” fizzled and died. New America Foundation fellow Evgeny Morozov attributes the commentators’ misplaced hopes to cyber-utopianism, a widespread but naïve expectation that the Internet will empower oppressed peoples and advance democracy.

According to Morozov, cyber-utopians failed to anticipate that authoritarian regimes would also benefit from the Internet. In fact, such police states as Belarus and Iran pay bloggers to spread propaganda and frequent social-networking sites to monitor dissidents. Other states, such as Russia, disseminate crass entertainment through video-sharing sites to distract viewers from social and political issues.

Morozov debunks many widely held assumptions about how politically repressive states and their opposition both work. He follows with advice for democratic lawmakers who want to help the dissidents.

Pro-democracy lawmakers must engage with the Internet, he says, but they must observe how it impacts different countries in different ways and shape their policies accordingly: What works in Tunisia might not work in Burma. Also, they must never treat Web-based platforms as substitutes for diligent, committed human activists who mobilize people to action in real life.

The Net Delusion is a sobering assessment of the limits of Internet activism. It offers practical advice for policy makers and nonprofit activists across the globe.

Rick Docksai

The only place where the West is still unabashedly eager to promote democracy is in cyberspace. Enthusiastic belief in the liberating power of technology, accompanied by the irresistible urge to enlist Silicon Valley start-ups in the global fight for freedom, is of growing appeal to many policy makers. In fact, many of them are as upbeat about the revolutionary potential of the Internet as their colleagues in the corporate sector were in the 1990s.

We shouldn’t give the Internet too much credit, however, and we should probably give it credit for some of the negative things that are happening. We shouldn’t be biased and just look at the brighter side. We should be more critical in thinking about its impacts.

The idea that the Internet favors the oppressed rather than the oppressor is marred by what I call cyber-utopianism: a naïve belief in the emancipatory nature of online communication that rests on a stubborn refusal to acknowledge its downside.

Cyber-utopians ambitiously set out to build a new and improved United Nations, only to end up with a digital Cirque du Soleil. Failing to anticipate how authoritarian governments would respond to the Internet, cyber-utopians did not predict how useful the Internet would prove for propaganda purposes, how masterfully dictators would use it for surveillance, and how sophisticated modern forms of Internet censorship would become.

Fidel Castro’s Twitter page has been around for a few years. But very few people in Cuba own computers, because the Cuban government restricted the sale of computers to its population, so most of them just don’t have the equipment to tweet. They don’t have Internet cafés. They do have a small blogging culture, a few bloggers who have to be very careful. The government modified the restrictions on computers only a short while ago, so I wouldn’t expect Facebook or Twitter to matter much in Cuba in the next five to ten years.

Take a closer look at the blogospheres in almost any authoritarian regime, and you are likely to discover that they are teeming with nationalism and xenophobia. Things don’t look particularly bright for the kind of flawless democratization that some expect from the Internet’s arrival.

Likewise, bloggers uncovering and publicizing corruption in local governments could be—and are—easily coopted by higher-ranking politicians and made part of the anti-corruption campaign. The overall impact on the strength of the regime in this case is hard to determine; the bloggers may be diminishing the power of local authorities but boosting the power of the federal government. Authoritarian regimes in Central Asia, for example, have been actively promoting a host of e-government initiatives.

Normally a regime that fights its own corruption has more legitimacy with its own people. From that perspective, I wouldn’t go so far as to say that the Internet is making the government more accountable, but I would say that it is making local officials more responsible.

The government may be eliminating corruption in the provinces, making the people happier, but that doesn’t mean that they’re eliminating corruption at the top. So the distribution of corruption might be changing. But I do think government might use the Internet to solicit more citizen input. That won’t undermine the government. It will bolster its legitimacy.

It’s not paradoxical. The fact that the government is soliciting their opinions does not mean that the government is listening to them. It wants to give the people the impression that it is listening to them. In some sense, it creates a semblance of democratic institutions. It’s all about creating a veneer of legitimacy.

The Internet’s Role in Middle Eastern Revolutions

Digital activists in the Middle East can boast quite a few accomplishments, particularly when it comes to documenting police brutality, but I don’t think the Internet will play much of a role in Middle Eastern democratic revolutions compared with other factors. The things to watch for are how the new leaders shape the new constitutions and how they deal with the elements of the previous regimes. All those things are far more important than what happens online. I wouldn’t bet that the Internet will be a great help.

As for the extent to which these new regimes become democracies—it’s a wild guess for anyone, me included. They have a chance, but outcomes will depend upon many factors, including internal policies and external conflicts. I don’t buy into the cultural notion of Arabs not being ready for democracy. Democracy in the Middle East may succeed. But it will depend on how they work with the existing challenges.

The revolts were driven by people who had economic grievances and were politically oppressed. They turned to the Internet to publicize their grievances and their resistance. The fact that new media and blogs were present probably set a different tempo for the revolts. If the Internet had not been around, the regime might have been tempted to crack down in a much more brutal way. The revolts themselves would have taken a different shape, and they might have happened three to six months later.

Any account of how today’s democratic revolutions would have happened without the Internet is hypothetical, but revolutions throughout history have been driven by cultural factors. The events probably would have happened differently and probably would have turned out differently. We have to entertain the possibility that these events could have been much more violent and taken much more time if they hadn’t had the publicity that they had thanks to the Internet.

But ultimately, a regime’s response to a revolt depends on the regime, not on the Internet. Just because people can tweet and blog doesn’t stop the Libyan government from instituting a violent crackdown.

In all, it’s hard to generalize about the future of the Internet. We don’t have a one-size-fits-all approach to every country. We adapt our policies for each country. That’s how foreign policy works. But with the Internet, we have a tendency to generalize that this must be how it works everywhere, and that isn’t the case.

How Russia Handles the Internet and Activism

While civic activism—raising money for sick children and campaigning to curb police corruption—is highly visible on the Russian Internet, it’s still entertainment and social media that dominate. In this respect, Russia hardly differs from the United States or countries in western Europe. The most popular Internet searches on Russian search engines are not for “What is Democracy?” or “how to protect human rights,” but for “What is love?” and “how to lose weight.”

The Kremlin supports, directly or indirectly, a host of sites about politics, which are usually quick to denounce the opposition and welcome every government initiative, but increasingly branches out into apolitical entertainment. From the government’s perspective, it’s far better to keep young Russians away from politics altogether, having them consume funny videos on Russia’s own version of YouTube, RuTube (owned by Gazprom, the country’s state-owned energy behemoth), or on Russia.ru, where they might be exposed to a rare ideological message as well.

Many Russians are happy to comply, not least because of the high quality of such online distractions. The Russian authorities may be on to something here: The most effective system of Internet control is not the one that has the most sophisticated and draconian censorship, but the one that has no need for censorship whatsoever.

I don’t think there is anything unique about Russia per se. It’s just that the government is smarter than the Egyptian government was about how to use the Internet. The Egyptian government didn’t do anything online. It didn’t engage in propaganda, deploy bloggers, or launch cyberattacks. They missed the train.

I think the difference is that the people who built up the Russian Internet ended up working for the government. The Egyptian government’s approach to the Internet was very shallow, and it had to pay the price, eventually.

Giving everyone a blog will not by itself increase the health of modern-day democracy; in fact, the possible side effects—the disappearance of watchdogs, the end of serendipitous news discovery, the further polarization of society—may not be the price worth paying for the still unclear virtues of the blogging revolution. This does not mean, of course, that a smart set of policies—implemented by the government or private actors—won’t help to address those problems.

Revolutions Require Training and Organization

The people who were instrumental in making the Egyptian revolution happen weren’t new to politics. Almost all of them were part of existing political and social forces. They had had plenty of training and organization by various Western foundations and governments. I don’t think the view of this as a spontaneous revolution was accurate. I myself have been to several democracy workshops in Egypt. I wouldn’t necessarily view these people as atomized individuals. They have been trained offline.

But of course, you wouldn’t have heard as much about it. Who’s paying for those workshops? It’s the U.S. government and U.S. foundations. In this sense, Facebook and Twitter are much better covers, because the uprisings they enabled appeared to be spontaneous. It would be very misleading to suggest that all the connections forged by these activists are virtual. Revolution is much more about building human networks.

In 1996, when a group of high-profile digerati took to the pages of Wired magazine and proclaimed that the “public square of the past” was being replaced by the Internet, a technology that “enables average citizens to participate in national discourse, publish a newspaper, distribute an electronic pamphlet to the world … while simultaneously protecting their privacy,” many historians must have giggled.

From the railways, which Karl Marx believed would dissolve India’s caste system, to television, that greatest “liberator” of the masses, there has hardly appeared a technology that wasn’t praised for its ability to raise the level of public debate, introduce more transparency into politics, reduce nationalism, and transport us to the mythical global village.

In virtually all cases, such high hopes were crushed by the brutal forces of politics, culture, and economics. Technologies tend to overpromise and underdeliver, at least on their initial promises.

Which of the forces unleashed by the Web will prevail in a particular social and political context is impossible to tell without first getting a thorough theoretical understanding of that context. Likewise, it is naïve to believe that such a sophisticated and multipurpose technology as the Internet could produce identical outcomes—whether good or bad—in countries as diverse as Belarus, Burma, Kazakhstan, and Tunisia. There is so much diversity across authoritarian regimes.

I wouldn’t have much hope in the Internet in North Korea. First, it’s a country with some of the fewest Internet connections in the world. And second, average North Koreans have been brainwashed to such an extent that you have serious psychological challenges that you can’t overcome just by using blogs and Twitter. It would be much harder than for a country like Belarus, for example, where one-third of the country is online. Mobile phones might play a role in getting more information out. But it’s unlikely that Facebook or Twitter will play much of a role.

Policy makers need to abandon both cyber-utopianism and Internet-centrism, if only because of their lack of accomplishment. What would take their place? What would an alternative, more down-to-earth approach to policy making in the digital age—let’s call it cyber-realism—look like?

Cyber-realists would strive to find a place for the Internet within the existing pillars of foreign policy. Instead of asking the highly general, abstract, and timeless question of “How do we think the Internet changes closed societies?,” they would ask, “How do we think the Internet is affecting our existing policies on country X?” Instead of operating in the realm of the utopian and the ahistorical, impervious to the ways in which developments in domestic and foreign policies intersect, cyber-realists would be constantly searching for highly sensitive points of interaction between the two.

They wouldn’t label all Internet activism as either useful or harmful. Instead, they would evaluate the desirability of promoting such activism in accordance with their existing policy objectives.

Cyber-realists wouldn’t search for technological solutions to problems that are political in nature, and they wouldn’t pretend that such solutions are even possible. Nor would cyber-realists search for a silver bullet that could destroy authoritarianism—or even a next-to-silver bullet, for the utopian dream that such a bullet could exist would have no place in their conception of politics.

Instead, cyber-realists would focus on optimizing their own decision-making and learning processes, hoping that the right mix of bureaucratic checks and balances, combined with the appropriate incentive structure, would identify wicked problems before they are misdiagnosed as tame ones, as well as reveal how a particular solution to an Internet problem might disrupt solutions to other, non-Internet problems.

Most important, cyber-realists would accept that the Internet is poised to produce different policy outcomes in different environments and that a policy maker’s chief objective is not to produce a thorough philosophical account of the Internet’s impacts on society at large, but, rather, to make the Internet an ally in achieving specific policy objectives. For them, the promotion of democracy would be too important an activity to run out of a Silicon Valley lab.

About the Author

Evgeny Morozov is a contributing editor to Foreign Policy, a visiting scholar at Stanford University, a Schwartz fellow at the New America Foundation, and the author of The Net Delusion: The Dark Side of Internet Freedom (Public Affairs, 2011). E-mail evgeny.morozov@gmail.com.

This article draws from his book as well as an interview with staff editor Rick Docksai, which may be read at wfs.org.

Eroding Futures: Why Healthy Soil Matters to Civilization

Lester R. Brown

The earth beneath our feet is the infrastructure for the resources that sustain our civilizations—and our futures. A leading agricultural policy expert shows what we must do to save the soil.

The signs that our civilization is in trouble are multiplying. During most of the 6,000 years since civilization began, we lived on the sustainable yield of the Earth’s natural systems. In recent decades, however, humanity has overshot the level that those systems can sustain.

We are liquidating the Earth’s natural assets to fuel our consumption. Half of us live in countries where water tables are falling and wells are going dry. Soil erosion exceeds soil formation on one-third of the world’s cropland, draining the land of its fertility. The world’s ever-growing herds of cattle, sheep, and goats are converting vast stretches of grassland to desert. Forests are shrinking by 13 million acres per year as we clear land for agriculture and cut trees for lumber and paper. Four-fifths of oceanic fisheries are being fished at capacity or overfished and headed for collapse. In system after system, demand is overshooting supply.

For some past civilizations, a single environmental trend was primarily responsible for their decline; for others, multiple trends were at work. For ancient Sumer, decline could be attributed to rising salt concentrations in the soil, the result of an environmental flaw in the design of its otherwise extraordinary irrigation system. After a point, the salts accumulating in the soil led to a decline in wheat yields. The Sumerians then shifted to barley, a more salt-tolerant crop, but eventually barley yields also began to decline. The collapse of the civilization followed.

Although we live in a highly urbanized, technologically advanced society, we are as dependent on the Earth’s natural support systems as the Sumerians and Mayans were. If we continue with business as usual, civilizational collapse is no longer a matter of whether but when. We now have an economy that is destroying its natural support systems and has put us on a decline and collapse path. We are dangerously close to the edge. Among other actions, we need a worldwide effort to conserve soil, similar to the U.S. response to the Dust Bowl of the 1930s.

On March 20, 2010, a suffocating dust storm enveloped Beijing. The city’s weather bureau took the unusual step of describing the air quality as hazardous, urging people to stay inside or to cover their faces when they were outdoors. Visibility was low, forcing motorists to drive with their lights on in daytime.

Beijing was not the only area affected. This particular dust storm engulfed scores of cities in five provinces, directly affecting more than 250 million people. It was not an isolated incident. Every spring, residents of eastern Chinese cities, including Beijing and Tianjin, hunker down as the dust storms begin. Along with the difficulty in breathing and the stinging eyes, there is a constant struggle to keep dust out of homes and to clear doorways and sidewalks of dust and sand. The farmers and herders whose livelihoods are blowing away are paying an even higher price.

These annual dust storms affect not only China, but neighboring countries as well. The March 20 dust storm arrived in South Korea soon after leaving Beijing. It was described by the Korean Meteorological Administration as the worst dust storm on record. In a similar event in 2002, South Korea was engulfed by so much dust from China that people in Seoul were literally gasping for breath, reported Howard French for The New York Times. Schools were closed, airline flights were canceled, retail sales fell, and clinics were overrun with patients having difficulty breathing. Koreans have come to dread the arrival of what they call “the fifth season”—the dust storms of late winter and early spring.

While people living in China and South Korea are all too familiar with dust storms, the rest of the world typically learns about this fast-growing ecological catastrophe when the massive soil-laden storms leave the region. In April 2010, a National Aeronautics and Space Administration (NASA) satellite tracked a dust storm from China as it journeyed to the east coast of the United States. Originating in the Taklimakan and Gobi deserts, it ultimately covered an area stretching from North Carolina to Pennsylvania. Such huge dust storms carry off millions of tons of topsoil, a resource that will take centuries to replace.

Civilization’s Earthy Foundation

The thin layer of topsoil that covers much of the planet’s land surface and is typically measured in inches is the foundation of civilization. Soil is “the skin of the earth—the frontier between geology and biology,” writes geomorphologist David Montgomery in Dirt: The Erosion of Civilizations. After the Earth was created, soil formed slowly over geological time from the weathering of rocks. This soil supported early plant life on land. As plant life spread, the plants protected the soil from wind and water erosion, permitting it to accumulate and to support even more vegetation. This relationship facilitated an accumulation of topsoil that could support a rich diversity of plant and animal life.

As long as soil erosion on cropland does not exceed new soil formation, all is well. But once it does, it leads to falling soil fertility and eventually to land abandonment. Sadly, soil formed on a geological time scale is being removed on a human time scale.

Soil erosion is “the silent global crisis,” observes journalist Stephen Leahy in Earth Island Journal. “It is akin to tire wear on your car—a gradual, unobserved process that has potentially catastrophic consequences if ignored for too long.”

Losing productive topsoil means losing both organic matter in the soil and vegetation on the land, thus releasing carbon into the atmosphere. The 2,500 billion tons of carbon stored in soils dwarfs the 760 billion tons in the atmosphere, according to soil scientist Rattan Lal of Ohio State University. The bottom line is that land degradation is helping drive climate change.

Soil erosion is not new. It is as old as the Earth itself. What is new is that it has gradually accelerated ever since agriculture began. At some point, probably during the nineteenth century, the loss of topsoil from erosion surpassed the new soil that is formed through natural processes.

Today, roughly a third of the world’s cropland is losing topsoil at an excessive rate, thereby reducing the land’s inherent productivity. An analysis of several studies on soil erosion’s effect on U.S. crop yields concluded that, for each inch of topsoil lost, wheat and corn yields declined by close to 6%.
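
As a rough illustration of that rate, here is a minimal sketch in Python. The 6%-per-inch figure comes from the studies cited above; treating the losses as compounding inch by inch is an assumption made for illustration, not a claim from the analysis.

```python
# Illustrative only: roughly a 6% decline in wheat and corn yields per inch
# of topsoil lost. Compounding the losses per inch is this sketch's own
# assumption, not something the cited studies specify.
def remaining_yield(inches_lost, decline_per_inch=0.06):
    return (1 - decline_per_inch) ** inches_lost

for inches in (1, 3, 6):
    loss_pct = (1 - remaining_yield(inches)) * 100
    print(f"{inches} inch(es) of topsoil lost -> ~{loss_pct:.0f}% yield decline")
# -> about 6%, 17%, and 31%, respectively
```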

In August 2010, the United Nations announced that desertification now affects 25% of the Earth’s land area, threatening the livelihoods of more than 1 billion people—the families of farmers and herders in roughly 100 countries.

China may face the biggest challenge of all. After the economic reforms in 1978 that shifted the responsibility for farming from large state-organized production teams to individual farm families, China’s cattle, sheep, and goat populations spiraled upward. The United States, a country with comparable grazing capacity, has 94 million cattle, a slightly larger herd than China’s 92 million. But when it comes to sheep and goats, the United States has a combined population of only 9 million, whereas China has 281 million. Concentrated in China’s western and northern provinces, these animals are stripping the land of its protective vegetation. The wind then does the rest, removing the soil and converting rangeland into desert.

Wang Tao, one of the world’s leading desert scholars, reports that, from 1950 to 1975, an average of 600 square miles of land turned to desert each year. Between 1975 and 1987, this climbed to 810 square miles a year. From then until the century’s end, it jumped to 1,390 square miles of land going to desert annually.

China is now at war. It is not invading armies that are claiming its territory, but expanding deserts. Old deserts are advancing and new ones are forming like guerrilla forces striking unexpectedly, forcing Beijing to fight on several fronts.

While major dust storms make the news when they affect cities, the heavy damage is in the area of origin. These regions are affected by storms of dust and sand combined. An intense 1993 sandstorm in Gansu Province in China’s northwest destroyed 430,000 acres of standing crops, damaged 40,000 trees, killed 67,000 cattle and sheep, blew away 67,000 acres of plastic greenhouses, injured 278 people, and killed 49 individuals. Forty-two passenger and freight trains were either canceled or delayed, or simply parked to wait until the storm passed and the tracks were cleared of sand dunes.

Other Regions in the Dust

While China is battling its expanding deserts, India, with scarcely 2% of the world’s land area, is struggling to support 17% of the world’s people and 18% of its cattle. According to a team of scientists at the Indian Space Research Organization, 24% of India’s land area is slowly turning into desert. It thus comes as no surprise that many of India’s cattle are emaciated and over 40% of its children are chronically hungry and underweight.

Africa, too, is suffering heavily from unsustainable demands on its croplands and grasslands. Soil scientist Rattan Lal made the first estimate of continental yield losses due to soil erosion. He concluded that soil erosion and other forms of land degradation have cost Africa 8 million tons of grain per year, or roughly 8% of its annual harvest. Lal expects the loss to climb to 16 million tons by 2020 if soil erosion continues unabated.

On the northern fringe of the Sahara, countries such as Algeria and Morocco are attempting to halt the desertification that is threatening their fertile croplands. Algeria is losing 100,000 acres of its most fertile lands to desertification each year, according to President Abdelaziz Bouteflika. For a country that has only 7 million acres of grainland, this is not a trivial loss. Among other measures, Algeria is planting its southernmost cropland in perennials, such as fruit orchards, olive orchards, and vineyards—crops that can help keep the soil in place.

Mounting population pressures are evident everywhere on this continent where the growth in livestock numbers closely tracks that in human numbers. In 1950, Africa was home to 227 million people and about 300 million livestock. By 2009, there were 1 billion people and 862 million livestock. With livestock demands now often exceeding grassland carrying capacity by half or more, grassland is turning into desert. In addition to overgrazing, parts of the Sahel are suffering from an extended drought, one that scientists link to climate change.

The incidence of Saharan dust storms—once rare—has increased 10-fold during the last half century, reports Andrew Goudie, professor of geography at Oxford University. Among the African countries most affected by soil loss from wind erosion are Niger, Chad, Mauritania, northern Nigeria, and Burkina Faso. In Mauritania, in Africa’s far west, the number of dust storms jumped from two a year in the early 1960s to 80 a year recently.

And the impacts are global. Dust storms leaving Africa travel westward across the Atlantic, depositing so much dust in the Caribbean that they cloud the water and damage coral reefs.

Nigeria, Africa’s most populous country, reports losing 867,000 acres of rangeland and cropland to desertification each year. While Nigeria’s human population was growing from 37 million in 1950 to 151 million in 2008, a fourfold expansion, its livestock population grew from 6 million to 104 million, a 17-fold jump. With the forage needs of Nigeria’s 16 million cattle and 88 million sheep and goats exceeding the sustainable yield of grasslands, the northern part of the country is slowly turning to desert. If Nigeria’s population keeps growing as projected, the associated land degradation will eventually undermine herding and farming.

In East Africa, Kenya is being squeezed by spreading deserts. Desertification affects up to a fourth of the country’s 39 million people. As elsewhere, the combination of overgrazing, overcutting, and overplowing is eroding soils, costing the country valuable productive land.

In Afghanistan, a UN Environment Programme (UNEP) team reports that in the Sistan region “up to 100 villages have been submerged by windblown dust and sand.” The Registan Desert is migrating westward, encroaching on agricultural areas. In the country’s northwest, sand dunes are moving onto agricultural land in the upper Amu Darya basin, their path cleared by the loss of stabilizing vegetation due to firewood gathering and overgrazing. The UNEP team observed sand dunes as high as a five-story building blocking roads, forcing residents to establish new routes.

An Afghan Ministry of Agriculture and Food report reads like an epitaph on a gravestone: “Soil fertility is declining,... water tables have dramatically fallen, de-vegetation is extensive and soil erosion by water and wind is widespread.” After nearly three decades of armed conflict and the related deprivation and devastation, Afghanistan’s forests are nearly gone. Seven southern provinces are losing cropland to encroaching sand dunes. And like many failing states, even if Afghanistan had appropriate environmental policies, it lacks the law enforcement authority to implement them.

Neighboring Iran illustrates the pressures facing the Middle East. With 8 million cattle and 79 million sheep and goats—the source of wool for its fabled Persian carpet-making industry—Iran’s rangelands are deteriorating from overstocking. In the southeastern province of Sistan-Balochistan, sandstorms have buried 124 villages, forcing their abandonment. Drifting sands have covered grazing areas, starving livestock and depriving villagers of their livelihood.

In Iraq, suffering from nearly a decade of war and recent drought, a new dust bowl appears to be forming. Chronically plagued by overgrazing and overplowing, Iraq is now losing irrigation water to its upstream riparian neighbors—Turkey, Syria, and Iran. The reduced river flow—combined with the drying up of marshlands, the deterioration of irrigation infrastructure, and the shrinking irrigated area—is drying out Iraq. The Fertile Crescent, the cradle of civilization, may be turning into a dust bowl.

Dust storms are occurring with increasing frequency in Iraq. In July 2009, a dust storm raged for several days in what was described as the worst such storm in Iraq’s history. As it traveled eastward into Iran, the authorities in Tehran closed government offices, private offices, schools, and factories. Although this new dust bowl is small compared with those centered in northwest China and central Africa, it is nonetheless an unsettling new development in this region.

Food and Forage

One indicator that helps us assess grassland health is the change in the goat population relative to the populations of sheep and cattle. As grasslands deteriorate, grass is typically replaced by desert shrubs. In such a degraded environment, cattle and sheep do not fare well, but goats—being particularly hardy ruminants—forage on the shrubs. Between 1970 and 2009, the world cattle population increased by 28% and the sheep population stayed relatively static, but the goat population more than doubled.

In some developing countries, the growth in the goat population is dramatic. While Pakistan’s cattle population doubled between 1961 and 2009, and the sheep population nearly tripled, the goat population grew more than sixfold and is now equal to that of the cattle and sheep populations combined.

As countries lose their topsoil, they eventually lose the capacity to feed themselves. Among those facing this problem are Lesotho, Haiti, Mongolia, and North Korea.

Lesotho—one of Africa’s smallest countries, with only 2 million people—is paying a heavy price for its soil losses. A UN team visited in 2002 to assess its food prospects. The team’s finding was straightforward: “Agriculture in Lesotho faces a catastrophic future; crop production is declining and could cease altogether over large tracts of country if steps are not taken to reverse soil erosion, degradation, and the decline in soil fertility.”

During the last 10 years, Lesotho’s grain harvest dropped by half as its soil fertility fell. Its collapsing agriculture has left the country heavily dependent on food imports. As Michael Grunwald reported in the Washington Post, nearly half of the children under five in Lesotho are stunted physically. “Many,” he wrote, “are too weak to walk to school.”

In the Western Hemisphere, Haiti—one of the early failing states—was largely self-sufficient in grain 40 years ago. Since then, it has lost nearly all its forests and much of its topsoil, forcing it to import over half of its grain. Lesotho and Haiti are both dependent on UN World Food Programme lifelines.

A similar situation exists in Mongolia, where over the last 20 years nearly three-fourths of the wheatland has been abandoned and wheat yields have started to fall, shrinking the harvest by four-fifths. Mongolia now imports nearly 70% of its wheat.

North Korea, largely deforested and suffering from flood-induced soil erosion and land degradation, has watched its yearly grain harvest fall from a peak of 5 million tons during the 1980s to scarcely 3.5 million tons during the first decade of this century.

Soil erosion is taking a human toll. Whether the degraded land is in Haiti, Lesotho, Mongolia, North Korea, or any of the many other countries losing their soil, the health of the people cannot be separated from the health of the land itself.

Restoring Earth’s Soil Foundation

Restoring the Earth will take an enormous international effort, one far more demanding than the Marshall Plan that helped rebuild war-torn Europe and Japan after World War II. And such an initiative must be undertaken at wartime speed before environmental deterioration translates into economic decline, just as it did for the Sumerians, the Mayans, and many other early civilizations whose archaeological sites we study today.

Protecting the 10 billion acres of remaining forests on Earth and replanting many of those already lost, for example, are both essential for restoring the planet’s health. Since 2000, the Earth’s forest cover has shrunk by a net 13 million acres each year, with annual losses of 32 million acres far exceeding the regrowth of 19 million acres.

Thus, protecting the Earth’s soil warrants a worldwide ban on the clear-cutting of forests in favor of selective harvesting, simply because each successive clear-cut brings heavy soil loss and eventual forest degeneration. Restoring the Earth’s tree and grass cover, as well as practicing conservation agriculture, protects soil from erosion, reduces flooding, and sequesters carbon.

We also need a tree-planting effort to both conserve soil and sequester carbon. To achieve these goals, billions of trees need to be planted on millions of acres of degraded lands that have lost their tree cover and on marginal croplands and pasturelands that are no longer productive.

Planting trees is just one of many activities that will remove meaningful quantities of carbon from the atmosphere. Improved grazing and land management practices that increase the organic matter content in soil also sequester carbon.

Lessons of the Dust Bowl

The 1930s Dust Bowl that threatened to turn the U.S. Great Plains into a vast desert was a traumatic experience that led to revolutionary changes in American agricultural practices, including the planting of tree shelterbelts (rows of trees planted beside fields to slow wind and thus reduce wind erosion) and strip cropping (the planting of wheat on alternate strips with fallowed land each year). Strip cropping permits soil moisture to accumulate on the fallowed strips, while the alternating planted strips reduce wind speed and hence erosion on the idled land.

In 1985, the U.S. Department of Agriculture, with strong support from the environmental community, created the Conservation Reserve Program (CRP) to reduce soil erosion and control overproduction of basic commodities. By 1990, there were some 35 million acres of highly erodible land with permanent vegetative cover under 10-year contracts. Under this program, farmers were paid to plant fragile cropland in grass or trees. The retirement of those 35 million acres under the CRP, together with the use of conservation practices on 37% of all cropland, reduced annual U.S. soil erosion from 3.1 billion tons to 1.9 billion tons between 1982 and 1997. The U.S. approach offers a model for the rest of the world.

Another tool in the soil conservation toolkit is conservation tillage, which includes both no-till and minimum tillage. Instead of the traditional cultural practices of plowing land and discing or harrowing it to prepare the seedbed, and then using a mechanical cultivator to control weeds in row crops, farmers simply drill seeds directly through crop residues into undisturbed soil, controlling weeds with herbicides. The only soil disturbance is the narrow slit in the soil surface where the seeds are inserted, leaving the remainder of the soil covered with crop residue and thus resistant to both water and wind erosion. In addition to reducing erosion, this practice retains water, raises soil carbon content, and greatly reduces energy use for tillage.

In the United States, the no-till area went from 17 million acres in 1990 to 65 million acres in 2007. Now widely used in the production of corn and soybeans, no-till has spread rapidly, covering 63 million acres in Brazil and Argentina and 42 million in Australia. Canada, not far behind, rounds out the five leading no-till countries. Farming practices that reduce soil erosion and raise cropland productivity such as minimum-till, no-till, and mixed crop–livestock farming usually also lead to higher soil carbon content and soil moisture. In Kazakhstan, the 3 million acres in no-till seemed to fare better than land in conventional farming during the great Russian heat wave and drought of 2010.

In sub-Saharan Africa, where the Sahara is moving southward all across the Sahel, countries are concerned about the growing displacement of people as grasslands and croplands turn to desert. As a result, the African Union has launched the Green Wall Sahara Initiative. This plan, originally proposed in 2005 by Olusegun Obasanjo when he was president of Nigeria, calls for planting a 4,300-mile band of trees, nine miles wide, stretching across Africa from Senegal to Djibouti. Senegal, which is losing 124,000 acres of productive land each year and which would anchor the green wall on the western end, has planted 326 miles of the band. A $119-million grant from the Global Environment Facility in June 2010 gave the project a big boost. Senegal’s Environment Minister, Modou Fada Diagne, says, “Instead of waiting for the desert to come to us, we need to attack it.” One key to the success of this initiative is improving management practices, such as rotational grazing.

In the end, the only viable way to eliminate overgrazing on the two-fifths of the Earth’s land surface classified as rangelands is to reduce the size of flocks and herds. Not only do the excessive numbers of cattle, sheep, and goats remove the vegetation, but their hoofs pulverize the protective crust of soil that is formed by rainfall and that naturally checks wind erosion. In some situations, the preferred option is to keep the animals in restricted areas, bringing the forage to them. India, which has successfully adopted this practice to build the world’s largest dairy industry, is a model for other countries.

A Sustainable Plan to Preserve Soil

Conserving the Earth’s topsoil by reducing erosion to the rate of new soil formation or below has two parts. One is to retire the highly erodible land that cannot sustain cultivation—the estimated one-tenth of the world’s cropland that accounts for perhaps half of all excess erosion. For the United States, that has meant retiring nearly 35 million acres. The cost of keeping this land out of production is close to $50 per acre. In total, annual payments to farmers to plant this land in grass or trees under 10-year contracts approach $2 billion.

In expanding these estimates to cover the world, it is assumed that roughly 10% of the world’s cropland is highly erodible, as in the United States, and should be planted in grass or trees before the topsoil is lost and it becomes barren land. In both the United States and China, which together account for 40% of the world grain harvest, the official goal is to retire one-tenth of all cropland. For the world as a whole, converting 10% of cropland that is highly erodible to grass or trees seems like a reasonable goal. Since this costs roughly $2 billion in the United States, which has one-eighth of the world’s cropland, the total for the world would be $16 billion annually.

The second initiative on topsoil consists of adopting conservation practices on the remaining land that is subject to excessive erosion—that is, erosion that exceeds the natural rate of new soil formation. This initiative includes incentives to encourage farmers to adopt conservation practices such as contour farming, strip cropping, and, increasingly, minimum-till or no-till farming. These expenditures in the United States total roughly $1 billion per year. Assuming that the need for erosion control practices elsewhere is similar to that in the United States, we again multiply the U.S. expenditure by eight to get a total of $8 billion for the world as a whole. The two components together—$16 billion for retiring highly erodible land and $8 billion for adopting conservation practices—give an annual total for the world of $24 billion.
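
The scaling behind these figures is simple enough to verify. Below is a minimal sketch in Python that reproduces the arithmetic; all dollar amounts and the one-eighth cropland share are the article’s own.

```python
# Reproducing the scaling arithmetic above (amounts in billions of dollars).
# The multiplier of eight reflects the premise that the United States holds
# roughly one-eighth of the world's cropland.
US_RETIREMENT = 2        # retiring highly erodible U.S. cropland
US_CONSERVATION = 1      # conservation practices on remaining U.S. cropland
WORLD_MULTIPLIER = 8

world_retirement = US_RETIREMENT * WORLD_MULTIPLIER      # $16 billion
world_conservation = US_CONSERVATION * WORLD_MULTIPLIER  # $8 billion
print(f"Annual world total: ${world_retirement + world_conservation} billion")
```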

Altogether, then, restoring the economy’s natural support systems—reforesting the Earth, protecting topsoil, restoring rangelands and fisheries, stabilizing water tables, and protecting biological diversity—will require additional expenditures of just $110 billion per year. Many will ask, Can the world afford these investments? But the only appropriate question is, Can the world afford the consequences of not making these investments?

About the Author

Lester R. Brown is president of Earth Policy Institute and author of World on the Edge: How to Prevent Environmental and Economic Collapse (W.W. Norton & Company, 2011), from which this article was adapted with permission. He may be contacted at Earth Policy Institute, 1350 Connecticut Avenue, N.W., Suite 403, Washington, D.C. 20036. Web site www.earth-policy.org; e-mail epi@earth-policy.org.

References and Resources

Data, endnotes, and additional resources can be found on Earth Policy’s Web site, at www.earth-policy.org. Also see:

• Dirt: The Erosion of Civilizations by David R. Montgomery (University of California Press, 2007). A geomorphologist argues that we are running out of sufficient soil to feed future populations, making a case for organic inputs and conservation.

• The Grapes of Wrath by John Steinbeck (Viking Penguin Inc., 1939) puts environmental damage into a human context.

• Food and Agriculture Organization (www.fao.org) provides information on soil and soil resources, conservation, desertification, land assessment, plant and crop nutrition, and more.

• NASA’s Earth Observatory site (http://earthobservatory.nasa.gov) offers satellite imagery showing the extent and impacts of dust storms, droughts, and more.

• U.S. Department of Agriculture Agricultural Research Service (www.ars.usda.gov) oversees the National Soil Erosion Research Laboratory, among many other programs promoting innovation in resource management.

Dust Bowl Redux

Dust storms provide highly visible evidence of soil erosion and desertification. Once vegetation is removed either by overgrazing or overplowing, the wind begins to blow the small soil particles away. Because the particles are small, they can remain airborne over great distances. Once they are largely gone, leaving only larger particles, sandstorms begin. These are local phenomena, often resulting in dune formation and the abandonment of both farming and grazing. Sandstorms are the final phase in the desertification process.

In some situations, the threat to topsoil comes primarily from overplowing, as in the U.S. Dust Bowl, but in other situations, such as in northern China, the cause is primarily overgrazing. In either case, permanent vegetation is destroyed and soils become vulnerable to both wind and water erosion.

Giant dust bowls are historically new, confined to the last century or so. During the late nineteenth century, millions of Americans pushed westward, homesteading on the Great Plains, plowing vast areas of grassland to produce wheat. Much of this land—highly erodible when plowed—should have remained in grass. Exacerbated by a prolonged drought, this overexpansion culminated in the 1930s Dust Bowl, a traumatic period chronicled in John Steinbeck’s novel The Grapes of Wrath. In a crash program to save its soils, the United States returned large areas of eroded cropland to grass, adopted strip-cropping, and planted thousands of miles of tree shelterbelts.

Three decades later, history repeated itself in the Soviet Union. In an all-out effort to expand grain production in the late 1950s, the Soviets plowed an area of grassland roughly equal to the wheat area of Australia and Canada combined. The result, as Soviet agronomists had predicted, was an ecological disaster—another Dust Bowl.

Kazakhstan, which was at the center of this Soviet Virgin Lands Project, saw its grainland area peak at just over 25 million hectares in the mid-1980s. (One hectare equals 2.47 acres.) It then shrank to less than 11 million hectares in 1999. It is now slowly expanding, and grainland area is back up to 17 million hectares. Even on the remaining land, however, the average wheat yield is scarcely 1 ton per hectare, a far cry from the 7 tons per hectare that farmers get in France, western Europe’s leading wheat producer.

Today, two giant dust bowls are forming. One is in the Asian heartland in northern and western China, western Mongolia, and central Asia. The other is in central Africa in the Sahel—the savannah-like ecosystem that stretches across Africa, separating the Sahara Desert from the tropical rain forests to the south. Both are massive in scale, dwarfing anything the world has seen before. They are caused, in varying degrees, by overgrazing, overplowing, and deforestation.

Lester R. Brown

My First Meltdown: Lessons from Fukushima

By Patrick Tucker

Japan’s nuclear disaster carries a number of important lessons, such as how and when to communicate a worst-case scenario. While working in Kyoto, THE FUTURIST’s senior editor observed Japan’s nightmare and the costs of poor communication during a crisis.

Scene: The date is March 11, 2011. I am in my apartment in Kyoto, Japan, watching my first partial nuclear meltdown 335 miles away in Fukushima. Because the word “melt” suggests a visible and even transition between physical states, I had always thought of a meltdown as a fast and fluid event. The experience does not conform to my expectations; it seems to proceed at a lurching pace.

Chief Cabinet Secretary Yukio Edano becomes a regular fixture on our televisions and laptops. Because he and the Kan administration are reliant on the plant’s operator, Tokyo Electric Power, for information, he can offer little more than reassurances that the situation is under control. These stand in stark contrast to the ever more frightening scenes behind him. We watch as problems spread from one part of the facility to another. Hydrogen explosions literally blow the walls off several of the reactor buildings.

We turn to Twitter and Facebook. In the hours after the earthquake, 177 million tweets are sent and 572,000 new Twitter accounts are created. We discover that radiation levels have reached 8,217 microsieverts per hour near the front gate of the Fukushima Daiichi nuclear power station and that anyone in this kind of environment would be exposed to more than three years’ worth of naturally occurring radiation within 60 minutes. We also learn that the government is venting steam laced with cesium and iodine and that iodine exposure can result in thyroid cancer.
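
That dose comparison checks out on the back of an envelope. The sketch below assumes the commonly cited world-average natural background dose of roughly 2.4 millisieverts per year; that average is my assumption, not a figure from the broadcasts.

```python
# Checking the dose comparison. The 8,217 microsieverts/hour reading is the
# reported figure; the ~2.4 mSv/year natural background dose is an assumed,
# commonly cited world-average estimate.
GATE_RATE_USV_PER_HOUR = 8_217
BACKGROUND_USV_PER_YEAR = 2_400

years_per_hour = GATE_RATE_USV_PER_HOUR / BACKGROUND_USV_PER_YEAR
print(f"One hour at the gate ~= {years_per_hour:.1f} years of background dose")
# -> about 3.4 years, consistent with "more than three years' worth"
```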

We begin to compulsively massage our thyroid glands and text fellow American expats around Japan.

“We’re definitely getting out of Tokyo and coming to Kyoto,” says my friend, the father of a two-year-old.

“I’m definitely going home to New York,” says my neighbor. He’s on a flight 12 hours later. We make fun of him for overreacting. Within the day, we, too, begin to contemplate leaving the country. A family of three nuclear refugees, as they’ve come to be called, has taken up temporary residence in my bedroom. Emails from family back home are hysterical in tone. Emails from Japanese friends within the country politely suggest that the situation has been blown out of proportion.

In the days that follow, my wife and I begin to pursue contrasting avenues of research. I gravitate toward articles and sources that confirm the narrative to which I have already subscribed, that the mainstream Western media is dramatizing the situation at the power plant. I am encouraged and impressed by the enterprising young people in Tokyo who have taken it upon themselves to monitor the radiation from their homes and offices and tweet the results—which show that radiation levels are not dangerous.

“In a worst-case scenario, the alpha radiation would be contained to a relatively small area. The main threat is to the food supply and only the food from that prefecture,” I tell my wife.

“The USS Ronald Reagan was picking up radiation on deck, and it was a hundred miles offshore. They moved the entire fleet,” she says, emphasizing “fleet” as though this nuance in the story indicates that a truly remarkable naval maneuver has occurred.

My wife begins to follow a different line of research. She seizes on the movements taking place behind the official statements, the diplomatic breakdown between the U.S. and Japanese governments over the proposed size of the evacuation zone. Her faith in the Kan administration has completely evaporated. But she is not panicked. She is calm, collected, and open-eyed as she weighs various bits of information against the credibility of their respective sources.

In the days that follow, we learn that the French government has advised its citizens to evacuate Tokyo. The U.K. government chief scientist states that the French response is “not based on science.”

The discussion within our tiny apartment turns to the future. The evacuation area is steadily swelling, first to 10 then to 20 kilometers. The government continues to proclaim that radiation levels in Tokyo and beyond are not dangerous. Officials also anticipate that they may have to vent more steam. We know that we are probably safe where we are. But the amount of radiation detectable at the mouth of the plant seems to be rising roughly in tandem with the price of airfare out of Japan. Finally, in a calm and careful manner, like so many others, we purchase tickets to leave the country.

The Fallout from Bad Communication

The week of the March 2011 earthquake saw a massive exodus of foreigners from Japan, a 16% drop in the Nikkei 225 stock market index (which has partially bounced back), and runs on bottled water and toilet paper in Tokyo. Could the government have handled the situation better? Tragedies like a tsunami can’t be prevented. Other disasters, like the Kan administration’s public relations response to the breakdown at the Fukushima Daiichi nuclear power plant, offer lessons for the future.

Upon my return to the United States, I contacted crisis communications expert Peter Sandman. When disasters strike, he says, most people have three questions for the government man or woman at the podium: What happened? What do you expect to happen? And what are you worried might happen?

“By far the biggest crisis communication error of the Japanese government [was] failure to answer the second and third questions satisfactorily: i.e., its failure to forewarn people about tomorrow’s and next week’s probable headlines, and its failure to guide people’s fears about worst-case scenarios,” says Sandman.

In the midst of the unfolding disaster, the Kan administration refused to speculate publicly about how bad the situation could get. The result was the bizarre scene I saw on my television: Yukio Edano, in his bright blue jumpsuit, issuing public reassurances and hastily revising them as the nuclear power plant exploded behind him. In some parts of the country, these press conferences were followed by public-service announcements calmly advising people to stay indoors, wear masks to limit exposure to radiation, and avoid tap water. Competing messages like these led rational people like my wife and me to conclude that the Kan administration and Tokyo Electric Power weren’t giving us the full story.

Sandman says that the Japanese government “failed to predict that there would probably be increasing radiation levels in local milk, vegetables, and seawater; that Tokyo’s drinking water would probably see a radiation spike as well; that plutonium would probably be found in the soil near the damaged plants; that the evidence of core melt would probably keep getting stronger… etc. After each of these events occurred, the government [said] they were predictable and not all that alarming. But it failed to predict them.”

This vicious cycle—public official downplays situation, situation worsens, repeat—is one that Sandman has seen before, when he served on the congressional investigation into the Three Mile Island nuclear incident. In that instance, the operating utility, Metropolitan Edison, quickly worked to paint an optimistic but not inaccurate portrait of what was going on inside the plant. When the picture worsened, the public was left to speculate that Metropolitan Edison was either lying about the risks or unaware of what they were.

Sandman’s first piece of advice to any government or company spokesman tasked with addressing the public during a crisis is, in a word, speculate. Do it gloomily, alarmingly, but above all else, do it loudly. He suspects that in the case of Fukushima, as at Three Mile Island, the people in control communicated only what they knew for certain. Because they kept their worst fears private, people were left to invent their own worst-case scenarios.

“Talking about what’s likely and what’s possible is necessarily speculative. Some commentators and even some crisis communication professionals have argued that authorities shouldn’t speculate in a crisis. This is incredibly bad advice,” says Sandman.

If my wife and I and the many other people who fled the country had known the government’s worst-case scenario, we likely would not have left. But there’s another lesson to be learned from the Fukushima disaster, evinced by the many who stayed to volunteer in the areas most affected by the tsunami: Trust your people not to panic. They’re probably steadier than you think.

About the Author

Patrick Tucker is senior editor of THE FUTURIST magazine and director of communications for the World Future Society. Contact him at ptucker@wfs.org. A longer version of the interview with Peter Sandman can be found at wfs.org.

Treading in the Sea of Data

By Richard Yonck

Information: Our world is swimming in it. With each passing day, our lives become more dependent on it. Yet, the very magnitude of this torrent of data compromises its benefits to us. New strategies and technologies are now evolving that may save us from drowning—and even help us thrive.

The desire for information is rooted deep within us, evolved into our genes. Essentially an outgrowth of food foraging behavior, information foraging provides similar neurological payoffs. In a now-famous 2009 study on monkeys, Ethan Bromberg-Martin and Okihide Hikosaka demonstrated that dopamine neurons treat information as a reward. In other words, looking for and finding information makes us feel good: The behavior reinforces itself and makes us want to do it again.

At the same time, the growing volume of information available to us makes us increasingly inclined to seek breadth of knowledge rather than depth. We delve into a task or subject just a bit before we’re drawn away to something else. Our attention is continually being pulled toward a different target than the one we’re currently semi-focused on. In the end, it’s a little like being at a smorgasbord buffet: There are so many dishes, we can’t properly savor any single one of them.

All this information and the technologies that accompany it have led to an ongoing dialogue about the pros and cons of our advances. In his most recent book, The Shallows (W.W. Norton, 2010), Nicholas Carr argues that information technology is changing our brains, making us less focused, less capable of deep thought. Others, such as technology writer Clay Shirky, futurist Jamais Cascio, and cognitive scientist Steven Pinker, have acknowledged that, while we are changing in response to all of our progress, this is a pattern that has occurred throughout human history. Time and again, we’ve adjusted our ways of thinking in response to our technological advances. As toolmakers, we’ve used our devices to change the world, and in turn they’ve changed us.

There’s no denying that this relentless inundation of information severely hampers our ability to concentrate. Interruptions and distractions abound, invading our mind, making focused thought far more difficult. A study by Microsoft Research found that, following even a minor interruption, it typically takes us 15 minutes to fully refocus on the subject at hand. The study’s authors reported that they were “surprised by how easily people were distracted and how long it took them to get back to the task.”

The Coming Data Deluge

Data grows exponentially. According to market research and analysis firm IDC, the world’s digital output is doubling every one and a half years. IDC estimated that, in 2010, the world created and replicated a record 1.2 zettabytes of data. That’s over a trillion billion bytes, or a stack of DVDs reaching to the Moon and back. By 2020, IDC expects this number to grow to 35 zettabytes, or enough DVDs to reach halfway to Mars. But there are reasons to believe this estimate may fall woefully short.
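
For readers who want to see the math, here is a minimal sketch in Python that derives the average growth rate implied by IDC’s two endpoint figures. The endpoints are the article’s; the calculation is mine.

```python
# Implied compound growth from IDC's endpoints: 1.2 ZB (2010) to 35 ZB (2020).
initial_zb, final_zb, years = 1.2, 35.0, 10
annual_growth = (final_zb / initial_zb) ** (1 / years) - 1
print(f"Implied average growth: {annual_growth:.0%} per year")  # roughly 40%
```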

Right now, data only seems to be everywhere, but in the near future it really will be. High-speed wireless technologies will soon enable us to access information from almost any location at speeds approaching those of wired networks. At the same time, devices that generate that data will increasingly be distributed throughout our environment. Embedded networked processors and smart dust—sensor networks made up of billions, even trillions, of nodes—will be everywhere, providing real-time data streams about everything, all the time.

Lifelogging is another development that could exacerbate our data problem. As cameras, recording devices, and storage media continue to shrink, the ability to record every instant of our lives becomes not only feasible, but possibly even appealing. Used in conjunction with intelligent search methods, lifelogging could provide us with the equivalent of near total recall. Where was I on the night of the thirteenth? What was the name of that associate I met for a few seconds five years ago? And perhaps most importantly, where did I leave those darn keys? These kinds of questions could become trivial using such a system, but the storage and data processing involved would not.
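
How non-trivial? A hypothetical back-of-envelope calculation suggests the scale; every rate below is an assumption chosen for illustration, not a specification of any actual lifelogging system.

```python
# Hypothetical lifelog storage estimate. All rates are illustrative
# assumptions: modest compressed video at 1 Mbps, 16 waking hours a day.
BITRATE_MBPS = 1.0
WAKING_HOURS_PER_DAY = 16
YEARS = 80

bytes_per_day = BITRATE_MBPS * 1e6 / 8 * WAKING_HOURS_PER_DAY * 3600
lifetime_tb = bytes_per_day * 365 * YEARS / 1e12
print(f"~{bytes_per_day / 1e9:.1f} GB/day, ~{lifetime_tb:.0f} TB per lifetime")
# -> about 7 GB a day, on the order of 200 TB for 80 years of waking video
```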

Gordon Bell, formerly of DEC, now works for Microsoft Research, where he is the subject of the MyLifeBits lifelogging project. In his recent book, Total Recall (Dutton, 2009), he writes, “e-memory will become vital to our episodic memory. As you live your life, your personal devices will capture whatever you decide to record. Bio-memories fade, vanish, merge, and mutate with time, but your digital memories are unchanging.” Such technology will bring with it many benefits as well as many unintended consequences, not the least of which will be an explosion of additional digital information.

Then there’s the sheer volume of metadata that will be created by computers. The examination of primary data—whether it’s Web links or cell-phone habits or demographic voting habits—yields a tremendous amount of secondary or derivative information. Analysis of smartphone records can generate information about traffic flow and population movement. Tweets and search-engine queries can contribute data for analysis in epidemiological studies of infectious diseases. As each set of data is recombined and reanalyzed, it generates still more data.

This brings us to the Semantic Web. Conceived by Tim Berners-Lee, the father of the World Wide Web, the Semantic Web aims to take information that is currently only machine readable and make it machine understandable.

The Semantic Web alters the relationship between data and machine. It gives data meaning. Currently, computers treat most information on the Web merely as strings of letters and numbers, so that “the quick brown fox” has about as much meaning as “Sgd pthbj aqnvm enw,” at least at the machine level. But with the Semantic Web, “quick,” “brown,” and “fox” are all formally represented concepts with defined relationships to other concepts. The ontologies that define these concepts establish meaning that can be understood by our computers.
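
To make this concrete, here is a minimal sketch using Python’s rdflib library (my choice of tool; the namespace and triples are invented for illustration). It encodes “the quick brown fox” as formally defined concepts and relationships rather than as a string.

```python
# A tiny Semantic Web-style graph with rdflib. The example.org namespace
# and these particular relationships are illustrative, not a standard.
from rdflib import Graph, Literal, Namespace

EX = Namespace("http://example.org/")
g = Graph()

g.add((EX.fox, EX.hasColor, EX.brown))          # fox --hasColor--> brown
g.add((EX.fox, EX.hasSpeed, Literal("quick")))
g.add((EX.brown, EX.isA, EX.Color))             # brown is defined as a Color

# The machine can now answer a question rather than just match a string:
for color in g.objects(EX.fox, EX.hasColor):
    print(color)  # -> http://example.org/brown
```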

With these improvements, our computers will be able to readily compile information from a range of sources without human oversight and consolidate it into a format that best suits our needs. As information comes to be better structured and defined, all sorts of new ways of working with it will become possible. Existing information will be analyzed and recombined in ways we’ve never even thought of—all at the speed of our fastest computers.

Body area networks (BANs) will also be a source of new information. A BAN is a set of wearable or implanted sensors that monitor body functions; it would keep us and our health-care providers apprised of our well-being with continuous data streams. As sensor costs plummet, such monitoring holds the potential to drastically reduce health costs by alerting us at the earliest stages of an illness. But while such devices may have considerable benefit, they also threaten to add greatly to the world’s data load.

Under this onslaught of information, how will we function, much less use these resources effectively? We already use filtering technologies, like the ad-zappers used in digital video recorders that enable us to circumvent the commercials that checkerboard the television schedule. Similarly, ads, banners, and other commercial efforts might be filtered out by software that’s able to distinguish them from relevant content. But because the data collected from these efforts is so useful to advertisers, the advertisers will find ways to disable such filters. In many ways, the battle for our attention will be a technological escalation between media and viewer.

Coping with Data

All this brave new data will result in many changes, as we try adapting our behavior, improving the existing technologies, and developing better interfaces with them.

Adapting ourselves. Changing our own behavior is both the simplest and the most difficult option. On the one hand, we can decide the how and when for ourselves, whether it’s checking e-mail or surfing the Web or watching TV. We can even choose to opt out entirely, cutting off all but the most basic forms of communication. On the other hand, such habits are often very difficult to break for the same reasons they can lead to compulsive behavior. And while there may be certain benefits to going completely “cold turkey,” such a decision could find the user increasingly cut off and at a disadvantage in society.

Despite the possible difficulties involved, setting aside regular time each day to shut down the information flow can yield benefits. Such a hiatus creates time to absorb, digest, and reflect on what’s been learned. Taken even further, incorporating regular meditation into one’s schedule can help to diminish the negative physiological and psychological effects of information overload. It can also contribute to further insight, as can REM sleep. But such methods can only take us so far, especially when the volume of data in our world continues to escalate.

Adapting existing technologies. A number of possible strategies for dealing with information overload can be found within existing technologies. For instance, various software can already be used to direct, consolidate, and filter information, channeling only what is useful and relevant to our attention. We have “dashboards” that aggregate information streams such as RSS feeds, and “radars and filters” to manage what we get.

Advances in natural language processing of unstructured data will give us another means to better access data. A good example of this is IBM’s DeepQA Project, better known as Watson, which captured the public imagination in early 2011 on the popular quiz show Jeopardy! As this already impressive technology matures, it will find applications in many fields, including health care, business analytics, and personal assistance.

A very different approach to processing and improving the way we access information can be found in the knowledge engine Wolfram|Alpha. The brainchild of Stephen Wolfram, the eponymously named program computes answers to queries based on structured data. Rather than returning lists of documents as in a Google search, Wolfram|Alpha consolidates the information into relevant answers and visualizations.

According to the project’s mission statement, “Wolfram|Alpha’s long-term goal is to make all systematic knowledge immediately computable and accessible to everyone.” While this may strike some as an extremely lofty objective, no one can accuse the creator of Mathematica and author of A New Kind of Science (Wolfram Media, 2002) of ever thinking small, his work in particle physics notwithstanding. Wolfram has stated that Wolfram|Alpha’s processing of structured data is very different from the way DeepQA works with unstructured data. He’s also suggested that, if there is ever a Watson 2.0, it could benefit from integrating the Wolfram|Alpha API.

Once the Semantic Web and knowledge engines become more widespread, one category of software that should develop rapidly is that of intelligent agents. These machine assistants are programs that will be able to perform routine tasks for us, whether it’s making appointments, locating supplies, handling inquiries, or planning a vacation. Over time, these agents will become increasingly intelligent, capable of learning our individual preferences. Eventually, they’ll become so good that they’ll almost be able to mirror our own thought processes.

At some point, these “virtual selves” might even be able to go into the world (or at least virtual worlds) as autonomous avatars, our representatives in the world at large. As technology advances further, it may become possible to reintegrate these virtual selves, acquiring their experiences with such fidelity that it would seem like we’d been there ourselves. Such tools could go a long way toward helping us deal with a world swimming in information.

New interfaces. The development of new interfaces will change not only how we think about and visualize information, but also how we work with it. New large-scale multitouch screens and gesture interfaces already allow us to work with virtual 3-D models in ways that are far more like manipulating objects in the real world and therefore much more intuitive. As these develop further, Minority Report–like interfaces will give us the means to work with large amounts of complex information quickly and with ease.

Three-dimensional displays are another tool that will allow us to pull much more information from visual displays. Currently, these use special glasses, but high-quality 3-D displays that don’t require glasses will be available later this decade. This will allow for the use of complex spatial relationships in visualizing information.

Augmented reality. Augmented reality applications are already available for our smartphones and are developing rapidly. Nevertheless, they are still very much in their infancy. Augmented reality superimposes digital information and artifacts over maps and real-life images to convey additional information to the user. The combination of features available in today’s smartphones—mobility, camera, display, GPS, compass, accelerometer—makes them the medium of choice for these applications.

Already, augmented reality apps can direct you to a nearby bus stop or subway station, recommend a local restaurant, and act as a travel guide. In coming years, more-sophisticated applications will provide virtual devices and dashboards seemingly in mid-air; personalized, contextual datafeeds; and advertising customized to our individual preferences. While this technology will be responsible for still more information finding its way to us, it will also play a major role in compressing and consolidating information that will be almost instantly available for our needs.

Options for Human Augmentation

Transforming our tools will only go so far in helping us keep our heads above the rising sea of data. In order to stay afloat, we may eventually find it necessary to transform ourselves. Such augmentation, generally called cognitive enhancement, will probably follow a number of parallel paths.

Pharmacological enhancements. Caffeine and other stimulants have long been used as “productivity enhancers” to help us focus on tasks. More recently, pharmaceuticals such as Adderall, Modafinil, and Ritalin have grown in popularity, particularly among college students. But there is a lot of anecdotal evidence indicating that, while some abilities such as focus are improved, other functions related to creativity can suffer. Additionally, these drugs can be addictive and increase the potential for psychosis over time. Since this usage is off-label—these drugs weren’t developed or prescribed for cognitive enhancement in healthy users—it seems likely that purpose-built versions with fewer side effects are possible. Other categories, such as vasodilators—ginkgo biloba, for example—are claimed to improve brain function by delivering more blood and oxygen to the brain. Here again are potential avenues for improving brain function.

True smart drugs, or nootropics, hold significant potential to improve learning and retention. Current research aimed at helping Alzheimer’s and dementia patients may eventually lead to drugs that have other uses, such as learning augmentation. Ampakines, for instance, are a new class of compounds that improve attention span and alertness and facilitate learning and memory. Ampakines have been studied by the Defense Advanced Research Projects Agency (DARPA) for use by the military.

Genetic and biotechnology enhancements. Many genetic studies are being done to identify therapeutic strategies that promote neuroplasticity—the formation of new neural structures in the brain—and improve learning ability. A study at the European Neuroscience Institute published in 2010 found that the memory and learning ability of elderly mice were restored to youthful levels when a cluster of genes was activated through the introduction of a single enzyme.

A number of stem-cell research studies offer hope not only for degenerative mental pathologies but also for restoring our ability to learn rapidly. In a 2009 study, older mice predisposed to develop the plaques associated with Alzheimer’s were treated with neural stem cells. These cells stimulated an enhancement of hippocampal synaptic density, which resulted in better performance on memory tests a month after receiving the cells. (The hippocampus is a region of the brain that plays important roles in long-term memory and spatial navigation. It is one of the first regions to suffer damage from Alzheimer’s.)

Another recent study of mice exposed to the natural soil bacterium Mycobacterium vaccae found that their learning rate and retention greatly improved. It’s been speculated that this was due to their brains’ immune response to the bacterium. As we learn more about the chemical and genetic processes our brains use in acquiring knowledge, it should eventually become possible to enhance them in very targeted ways.

Brain–computer interfaces. While still some way off, technology may one day allow us to offload a small or large portion of our memory and processing to machines. To some, this may seem farfetched, but there is already considerable research taking place in this and related fields. Today, there are already interfaces that give quadriplegics and people with locked-in syndrome the ability to control computers and operate wheelchairs. There are even headsets available that allow users to operate computer games, all through the power of thought. Someday, we will no doubt look back on these as primitive devices, but in the meantime, they offer a glimpse of what may become commonplace.

The information-management potential of advanced brain–computer interfaces (BCIs) would be significant. We might have the ability to generate separate threads that take care of several tasks at once, transforming us into true multitaskers. We could gather information on a subject from a broad range of sources and have it condensed into just the format we needed. We could draw on immense external computer resources to rapidly resolve a problem that might take months for a team of present-day experts. We could learn at the speed of thought—only the speed of thought would be many orders of magnitude faster than it is today.

Futurist Jamais Cascio and others believe we will forgo BCI in favor of one of the other forms of cognitive enhancement, and they may be correct. The problem of being lumbered with last year’s BCI model as these technologies continue to develop could well dissuade many potential augmenters. But this presumes that the BCIs of tomorrow will be as permanently fixed as the computer hardware of yesteryear. Due to just this sort of concern, the neural equivalent of a firmware upgrade may be devised. Also, nanotechnology may offer a means for “rewiring” the interface in a straightforward manner as new advances are made. It’s far too early to say for sure, but the possibilities will (and should) continue to be explored.

Can Data Escalation Promote Intelligence Escalation?

Rapidly increasing amounts of data, improvements in technology, and augmentation of our own mental processes, combined with competitive pressures, are already creating a positive feedback loop. This is producing additional incentives for generating more information, leading to more and better technology to work with it, and giving us further motivation to make ourselves even more capable of accessing and utilizing it. The result of such a cycle will be an escalation of intelligence, both in our technology and ourselves.

Like so many technological trends, this one could potentially accelerate and continue up to the point when limiting factors bring it to a halt. However, because improved intelligence would give us better tools for discovering and creating new ways to manipulate the primary physical laws of the universe, this threshold may be a very distant one.

Some theorists have speculated that our computers will continue to shrink and improve until every particle of matter in a block of material could be utilized for computation. Quantum theorist Seth Lloyd has referred to this as the “ultimate laptop,” and its upper bounds are defined by the fundamental limit on quantum computation and by the maximum information that can be stored in a finite region of space. Such a device would be 10^33 times faster than today’s fastest supercomputer. (That’s a billion trillion trillion times faster.) Moore’s law, loosely construed, holds that computer performance doubles every one-and-a-half to two years. If this trend were maintained—and that’s a big “if”—then this upper limit could be reached in a little over two centuries.
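
The arithmetic behind that estimate is easy to verify (a sketch, taking the article’s figures at face value):

    import math

    doublings = math.log2(1e33)  # ~109.6 doublings to reach a 10^33 speedup
    for years_per_doubling in (1.5, 2.0):
        print(f"{doublings * years_per_doubling:.0f} years")
    # prints 164 and 219 -- "a little over two centuries" at the slower rate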

What would we do with so much computer processing power, so much data, and presumably so much intelligence? Would we spend our days pondering the remaining mysteries of the universe? Or would we become a world of navel-gazers, tweeting and friending at the speed of thought (or whatever it is we’ll be doing with Web 327.0)? In all likelihood, it will be something in between—something that appears utterly fantastical today and will seem quite mundane tomorrow. We may even still be arguing about how some new technology is going to render us less focused, less capable, or less human than our forebears—just as we always have when confronted with new information technologies.

In the face of all this, only one thing seems certain: Whether we’re swimming in the shallows or diving to the deepest depths, we’ll continue to work hard to stay afloat in an ever-growing sea of information.

About the Author

Richard Yonck is a foresight analyst for Intelligent Future LLC. He writes a futures blog at Intelligent-Future.com and is the founder of FutureNovo.com, a site about emerging technologies. His previous article for THE FUTURIST, “The Age of the Interface,” appeared in the May-June 2010 issue. E-mail: ryonck@intelligent-future.com.

This article draws from his paper in the World Future Society’s 2011 conference volume, Moving from Vision to Action, which may be preordered from www.wfs.org/wfsbooks.

Augmented, Anonymous, Accountable: The Emerging Digital Lifestyle

By Erica Orange

From social networking to location-based mobile applications, our digital devices are increasingly shaping our lives. A business futurist examines what this will mean for organizations and individuals in the coming decade, highlighting some of the key “beads” on the “data abacus.”

We live in an age in which the digital and the real worlds comingle effortlessly. In a relatively short period of time, a variety of computing and communications devices have seamlessly incorporated themselves into our lives. New applications and tools for these devices compete to grab (and keep) people’s attention. On the surface, what is happening may seem to have a certain frenetic mindlessness to it, but at the heart of it, a critical transformation is occurring that will be important for companies and organizations to understand, and even leverage, in the coming economy.

Digital technologies, especially the “social media” that facilitate connections, operate within a certain framework. Imagine this framework as an abacus, with each bead representing a different opportunity, a different facet or characteristic of the digital sphere. No bead is fixed. Each can slide easily back and forth, reconfiguring the entire landscape and presenting entirely new and untapped areas of exploration.

The easiest way to visualize each bead on this “data abacus” is to list them, with the understanding that any product, service, or brand can use any combination of beads to be better positioned in the digital space in the coming decade. Listed below are some key emerging characteristics of this new era.

Anonymous Crowdsourcing

Honest, unfiltered, and anonymous or pseudonymous feedback may become the new norm. Web sites such as BetterMe and SideTaker allow people to gather anonymous constructive criticism and objective opinions from friends on specific matters or disputes and to give feedback to others as well. Honestly (formerly Unvarnished) aims to create an open forum to rate professionals in the workplace. Honestly’s founders express hope that their profiles and performance reviews could eventually function in the same capacity as job references, thus reshaping the employee recruitment process, much as Facebook has done in recent years. While many employers have made it a practice to screen social media profiles for information about potential hires, the ability to discern reality from perception on those profiles will be an increasingly valuable skill.

Yet, anonymity is not always a good thing. While these sites tout built-in safeguards, wherever there is a lack of personal accountability there is also a good deal of risk. Even if outright “flame wars” never spring up on sites such as these, anyone with a slight grudge or dislike for a co-worker could potentially damage that person’s career with a few thoughtless keystrokes.

Anti-Authoritarian

The explosive growth of social networking is giving people more of a voice around the world. Activism on Facebook and other sites has successfully rallied citizens against oppressive governments and posed challenges to bureaucrats and government officials. Twitter, Facebook, and LinkedIn have all been officially blocked in China due to protests being organized via the sites. Currently, more than a dozen countries block Internet sites for political, social, and security reasons. Meanwhile, the Obama administration is permitting technology companies to export online services like Instant Messenger and photo sharing to Iran, Cuba, and Sudan as a way to make it more difficult for restrictive governments to clamp down on free speech. Businesses and corporations are increasingly being held to similar levels of accountability.

Accountable

On the other hand, federal agencies in the United States and elsewhere are beginning to make statistical data and other information more widely available to the public, in an effort to boost government transparency, efficiency, and responsiveness. In addition, the amount of information available at the push of a “search” button is making it ever easier for citizens to fact-check statements coming through official channels.

Then there is the rise of e-government, or Government 2.0, in which public agencies utilize available technologies such as social media, cloud computing, and even mobile applications both to improve their own efficiency and to enable more open and direct participation in governance. Such crowdsourcing efforts engage more citizens in the governing process. In addition, social networking sites (Twitter in particular) have made it easier for voters and politicians to interact.

This is analogous to changes happening in the business world. Technology is redefining our relationship to time, space, and location. The private sector has seen many examples of this (over the past year especially) and now it is starting to spill over into the public sector as both citizens and government agencies have grown more comfortable with new technology.

It will become equally important for organizations to track the ways in which marketplace democratization is spreading, and to explore how to leverage it. Governments and companies alike will continue to grapple with issues about an increasingly vocal and connected public.

Analytical

The amount of digital information increases tenfold every five years. As The Economist reported (February 25, 2010), data management and analytics are worth more than $100 billion and are growing at almost 10% a year, roughly twice as fast as the software business as a whole. Roger Bohn of the University of California, San Diego, quoted in The Economist, says that “information created by machines and used by other machines will probably grow faster than anything else.”

Known as “database-to-database” information exchange, this removes people from the equation altogether. It represents something I call the shift from Big Brother to Big Sister. What is emerging is a complex “system of systems” that has the ability to monitor and control everything from municipal power grids, integrated toll networks on major highways, and water distribution systems to employee communication, behavior, and productivity—all without human inclusion.

It can also benefit futurists. Google Ventures is investing in Recorded Future, a data-analytics company whose technology could be used to anticipate future events by tracking how frequently an entity or event is referred to in the news and around the Web over a period of time. This foresight technique, known as scanning, can be tedious and time consuming when conducted by individuals or groups. Fully automated scanning would likely lead to more (and more accurate) indications of what is to come.
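
In miniature, frequency-based scanning amounts to little more than counting mentions over time. The sketch below is a toy illustration with invented data; Recorded Future’s actual analytics are far more sophisticated:

    from collections import Counter
    from datetime import date

    # Hypothetical (entity, publication date) mentions harvested from news feeds.
    mentions = [
        ("solar energy", date(2011, 1, 3)),
        ("solar energy", date(2011, 1, 19)),
        ("solar energy", date(2011, 2, 2)),
        ("vertical farming", date(2011, 2, 7)),
    ]

    # Tally mentions per entity per month; a rising tally is a crude trend signal.
    counts = Counter((entity, d.strftime("%Y-%m")) for entity, d in mentions)
    for (entity, month), n in sorted(counts.items()):
        print(month, entity, n)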

Attention Grabbing

The ability to communicate—and be communicated to—constantly, cheaply, and effortlessly is creating so much noise in the system that it is a wonder anyone can pay attention to anything for very long. Often, people are simultaneously immersed in the digital and the “real” world, and multitasking online and offline. All of this is leading toward what technology writer and consultant Linda Stone, a former senior executive at Microsoft, has termed “continuous partial attention.”

It has been found that cell-phone conversations do not just interfere with driving; driving also interferes with the processing and recall of those conversations and messages, impeding one’s ability to relay information accurately and to remember key details. Research shows that interruptions can decrease accuracy, judgment, creativity, and managerial effectiveness.

Arrival Oriented

Currently, a lot of attention is being paid to location-based applications that harness GPS-enabled mobile technology to let users broadcast their location. Such services as Foursquare, Gowalla, SCVNGR, and Geoloqi have enticed millions to digitally “check in,” and more sites are popping up each year. Facebook has been testing the waters with Facebook Places, signaling that the company recognizes this potentially profound shift in the way people interact online. These mobile applications encourage users to share which business locations they visit with selected friends. Sometimes there is a gaming component involved.

Location-based mobile check-in services do not simply offer free word-of-mouth advertising. They are also designed to give businesses a chance to tailor deals to patrons and forge relationships with them.

For traditional businesses, one of the advantages of these services is the ability to reach customers on-the-go. Shopkick identifies consumers via their smartphones the moment they enter a retail environment, and the mobile app automatically begins to accumulate “rewards” and exclusive product discounts from the stores they choose to visit. Users can also earn virtual currency, called “kickbucks,” which can be cashed in for gift cards at any of Shopkick’s retail partners.

Many small local businesses have had difficulty competing online, but location-based apps could help level the playing field a little. At the moment, Shopkick’s partners are limited to large retail chains, but other services tailor themselves to local businesses as well.

And there is yet another side to all of this data transferring. Eventually, businesses may know exactly who their customers are, where they are, and when they are nearby.

If that weren’t enough, digital billboards will increasingly be able to track the age and gender of pedestrians who walk by them, providing advertisers with a more accurate reading of the potential audience. Clothing retailer Forever 21, for instance, recently unveiled a billboard in Times Square that interacts with the crowd on the street using high-tech surveillance equipment and computer vision technology. The software identifies and maps the people below, enabling the computer to build a composite image of them in real time.

As this shows, businesses are exploring completely new ways to reach consumers. Radio frequency identification technology and global positioning systems are being developed arm-in-arm with the imagination of businesses (not to mention governments), which are creating new ways to use these tiny electronic sensors to monitor and track consumer behavior, as well as their own supply chains and product inventory—all in real time. However, the growth and interconnectedness of embedded systems will undoubtedly raise major challenges and privacy implications, especially as complex systems become increasingly self-adaptive.

Aggregative

Data systems are beginning to take on a life of their own, becoming more autonomous and more fully integrated into daily life. The Economist projects that, by 2017, there could be as many as 7 trillion wirelessly connected devices and objects, which translates to approximately 1,000 per person. Self-contained systems will increasingly pool their resources and capabilities to create new, more complex, and fully independent meta-systems that will offer more functionality, operability, and computing power.

As systems become vaster and more complex, interfacing with them will undoubtedly become more difficult. The average American consumes about 34 gigabytes of data and information every day, and the 2010 “Digital Universe” study by IDC projects the amount of information to rise 44-fold in the next decade. This will become harder to manage and control. Also, as we’ve seen through developments in reality mining, mobile phones are able to predict the patterns of our movements and whereabouts. In fact, we can be found as much as 93% of the time, no matter how far we travel.

Affordable

Low-cost technology, available to just about everyone, is radically changing the world’s economic structure. The drastic reduction in global poverty in the last couple of decades is due in no small part to the declining cost and increasing availability of equalizing technologies such as the computer and the mobile phone. As these tools become even cheaper and even more available, there is hope that this trend will accelerate, particularly in the developing world.

Aid and Assistance

We’ve seen for a while now how the Internet and mobile phones can be leveraged as tools for social development in Third World countries. Txteagle, for instance, distributes small jobs via text messaging to people in developing countries in return for small payments, which are transferred to a user’s phone by a mobile money service. These tools will be increasingly utilized by leveraging the power of the mobile phone to help educate and grow emerging economies.

Adding Up the Impacts

There are a number of other “beads” we could add to this list, including adaptable, aesthetically pleasing, aware, authentic, algorithmic, artistic, actionable, and more. The point is that traditional brick-and-mortar establishments are merging with the online world, in-store experiences are merging with gaming platforms, and real-world brands are merging with location-based social media platforms. In the future, businesses will combine the real and the virtual in increasingly innovative and expansive ways.

In this budding age of digital entanglement, businesses will have more in-depth knowledge about their customers and clients—who they are, their comings and goings, and their purchasing patterns. As humans and data systems become more intertwined, we may also see a greater level of intimacy emerge between the business and the consumer (as well as other organizations and their stakeholders, including governments, schools, and even religious institutions).

Also, as more consumers voluntarily surrender more information, there is likely to be an accelerated trend shifting the focus from gathering information to processing it. Now that companies have the data, they aren’t sure how to best use it or sort through it all. Other third-party companies will likely emerge to handle the complex metrics. A host of powerful computational tools will mash up vast quantities of data from many sources, in many ways.

Yet, in certain ways, individuals will be more in control than ever before. Businesses and other organizations will be forced to engage in greater transparency and openness about their practices. They will have to interact with their customers on their customers’ terms, as well.

In this new environment, we are likely to see more companies, both large and small, operating effectively in both the traditional brick-and-mortar world and the digital realm. Thanks to smartphone apps and social media, small local businesses, which have traditionally found it hard to compete online against large, established brands, have the chance to gain more of a foothold, rather than losing out to online retailers.

Yet, journalist and cultural critic Douglas Rushkoff offers a word of warning in Program or Be Programmed (OR Books, 2010), writing: “If the social urge online comes to be understood as something necessarily comingled with commercial exploitation, then this will become the new normative human behavior.” In other words, something that seems almost unfathomable at the moment—for example, voluntarily allowing companies to gather private information on you—may become accepted as simply a part of life.

Meanwhile, all of our systems, networks, structures, electronic devices, and virtual entities are becoming increasingly connected and dependent on each other—in many cases, without human inclusion. This process began several years ago, and is happening more and more, faster and faster.

For organizations, this represents and requires a monumental shift in management, especially as organizational energy input continues to migrate away from human labor. At the end of the day, the number of hours spent managing labor may pale in comparison to those spent managing systems.

In the meantime, the visual of the data abacus can be used as a way to provide fresh insights, spot new business trends, and unlock new sources of economic value as companies work to determine how to gain a competitive advantage from this sea of digital data.

About the Author

Erica Orange is vice president of Weiner, Edrich, Brown, Inc., a leading futurist consulting group in the United States. Her previous article for THE FUTURIST, “From Eco-Friendly to Eco-Intelligent,” was published in the September-October 2010 issue. Her address is 200 East 33rd Street, Suite 9I, New York, New York 10016. E-mail erica@weineredrichbrown.com.

Our Naked Data

By William H. Saito

The ease of communicating on modern networks has meant a rise in data vulnerability. A security specialist outlines the steps that the IT industry should take to protect consumers from data attacks—and itself from reactionary regulators.

Many of us find ourselves with multiple gadgets—in our pockets, our homes, our cars, our offices—and these gadgets are increasingly built to talk to each other, often automatically and invisibly. Camera phones upload straight to the Web and connect through WiFi and Bluetooth to unseen computer networks; the printer next to your desk can suddenly start printing out documents sent from a branch office on the other side of the world; and our cars automatically pull down information from the sky on the latest traffic and weather conditions.

A 2010 survey by Unisys Corporation showed that most Americans are largely unaware of the threat posed by data vulnerability. For instance, while a majority (73%) of Americans said they regularly update the virus-detection software on their home computers, only a minority (37%) said they updated their cell-phone passwords regularly, and nearly as many (36%) said they do not update mobile passwords at all.

Even common documents (licenses, passports, payment cards) that we carry around with us contain RFID chips. All these sensors and transmitters are constantly busy, silently collecting and giving away our personal information to other devices, often without our knowledge. Every time such information is transmitted and received, there is a very real risk that the data may be intercepted by people other than those for whom it was originally intended, and tampered with or abused for criminal, terrorist, or other purposes.

Scientists actually may be more at risk than the average population, especially those in academic circles. For all the theoretical discussion of computer security, those inside the academic environment often do not take real security issues as seriously as do those in the business world. This indifference puts researchers at risk with regard to their data, especially those who are involved in research with potential commercial applications.

Scientists working on politically controversial or emotionally charged projects have also famously found themselves targets for security attacks: In late 2009, the e-mail accounts of climate researchers at the University of East Anglia were hacked by conservative activists, who then attempted to use private messages to discredit the researchers academically and professionally. The researchers were subsequently cleared of any wrongdoing or impropriety, but their exoneration received much less public attention than the initial scandal.

The Global Positioning System And the Risk of Convenience

Numerous types of sensors were designed for our convenience, usually not with security in mind. By the end of 2010, almost 80% of cell phones had a built-in global positioning system (GPS) device, according to iSuppli. That’s up from about 50% in 2009. These devices can be used to send information on the user’s whereabouts to another place. For the most part, we see such technology as a welcome innovation, helping us find the nearest coffee shop when we are in a strange city, for example, or discover which of our friends is close at hand, thanks to social media applications.

We may have the option of allowing such information to be transmitted or of blocking it when we first start to use the application, but there are other ways of tracking phones (and people) without our consent or knowledge. The phone network is not the only system that provides information on our whereabouts; many digital cameras now also include GPS receivers, permitting the automatic geotagging of photos—i.e., instantly identifying the photographer’s real-time location. Most modern cars are equipped with satellite navigation systems, which also transmit location information.
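
How trivially accessible such geotags are is easy to demonstrate. The sketch below uses Pillow, an open-source Python imaging library; the filename is hypothetical, and tag 34853 is the GPSInfo entry defined by the EXIF standard:

    from PIL import Image

    exif = Image.open("vacation.jpg").getexif()  # a hypothetical geotagged photo
    gps = exif.get(34853)                        # 34853 = GPSInfo EXIF tag
    print(gps if gps else "no GPS data embedded")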

Back Doors, RFIDs, and Hidden Vulnerabilities

Our computer systems at home and at work are obvious security targets, but the existence of “back doors”—methods for bypassing normal authentication—may not be that obvious. Networking over the air (WiFi) or over power lines and the use of Bluetooth gadgets help to reduce clutter and introduce flexibility, but they also introduce risk. “Free” wireless access points are sometimes set up to capture WiFi traffic, and it is now possible to spoof a Global System for Mobile Communications (GSM) cellular tower to capture all cellular telephone calls in a targeted area. Clearly, politicians and celebrities are not immune to hacking, as seen in the recent revelations that members of the British press were routinely listening in on the voice mails of private citizens, including members of the royal family.

To prevent channels between devices from being compromised, it is possible to encrypt the traffic; however, such encryption can slow down and impede users, and many “secure” products are quite vulnerable since the protocols are not well implemented. Often, the security and encryption on these devices is so troublesome to set up that many users (including corporate IT departments) don’t bother, or set things up incorrectly, falsely assuming they are protected.
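
Encrypting a channel between two devices need not be onerous. Here is a minimal sketch using the Fernet recipe from the open-source Python cryptography package; note that it sidesteps the genuinely hard part, which is distributing the key securely in the first place:

    from cryptography.fernet import Fernet

    key = Fernet.generate_key()  # must reach the other device over a secure path
    channel = Fernet(key)

    token = channel.encrypt(b"print job: quarterly-report.pdf")
    print(channel.decrypt(token))  # only holders of the key can read the traffic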

Even if you’re not using a wireless network or a Bluetooth keyboard, the electromagnetic emissions from the equipment you use can be monitored remotely, and in extreme cases may actually allow someone to read your screen through walls or from across the street.

You would think that most people by now would know something about the risks of viruses on their computers, yet many people happily download and install unknown applications from dubious sources, oblivious of the fact that their new software could hijack their PC’s camera and microphone and surreptitiously transmit audio and video to parties unknown. In fact, the simple microphones found in all laptops can be used to determine what keys are being typed on those keyboards.

Misusing computer peripherals is sometimes an officially sanctioned activity, as shown in the case of the Pennsylvania school district that distributed student laptops with what the district termed “tracking security” features (but could better be described as Big Brother “spyware”), taking photographs of unsuspecting students in their homes.

While the proliferation of USB devices over the past few years has been a boon for computer users, it has also increased opportunities for data hacking. Small USB keyloggers, similar in appearance to thumb drives or keyboard cable extenders, can remain undetected for months at a time, faithfully recording every password, confidential memo, and private thought before the device is retrieved (or the data automatically uploaded) and the contents analyzed, regardless of how tightly locked down your office’s network is. Even innocent-seeming devices such as USB flash drives and CD-ROMs distributed at trade fairs can be used to install back doors and “Trojan horses,” sending confidential data such as banking passwords back to base, just as a “free” game downloaded to a mobile phone can open that device up to unlimited abuse.

Nor, in case you’re wondering, is the written word any more secure. Many office printers, copiers, and faxes now incorporate hard disks or other memory devices to capture, store, and transmit the printed and scanned images (we don’t think of them as such, but modern copiers are actually sophisticated computers that can be easily compromised).

These memory devices are designed to be accessible for maintenance purposes: They can be removed and their contents read at leisure. The printouts and copies from many of these devices incorporate microscopic anticounterfeiting information, which can also be used for tracking purposes. And when you leave the building, all the smart cards and RFID chips that you carry around—the corporate entry cards, mass transit cards, passports, credit and debit cards, etc.—can also let people know who and where you are and what you’re up to.

We can regard many of these security and privacy violations as essentially harmless, if irritating. Vending machines in Japanese train stations, for example, can automatically recommend drinks to customers based on their age, sex, and other factors. Annoying text messages may pop up on your cell phone, inviting you to enjoy a discounted latte every time you come within 100 yards of a coffee shop that you’ve visited in the past. But far more frightening is the prospect of criminals obtaining or abusing such information. The “safety blanket” supposedly provided by these RFID chips is an illusion, since the chips, together with their content, can be cloned, with all the attendant problems of identity theft.

Three Steps to a More Data-Secure World

Computer scientists both outside and inside the IT industry need to understand the essence of security and how the data that they collect will affect the overall system. The goal is to mitigate the risk of unintentional data leakage, which leads to other security issues. One way to do this is for researchers to find the flaws in the systems they use, but manufacturers seldom welcome these efforts.

A change in attitude needs to take place regarding the responsible disclosure of exploits by independent researchers: The discoveries need to be welcomed and acted upon, rather than seen as challenges to professional competence. Currently, there is a surprising lack of awareness of the risks posed by data breaches; the majority of technology companies are more concerned with (and devote large amounts of R&D to) business continuity than with security.

As sensors become cheaper and more commonplace, the IT industry needs to take a consistent approach with regard to alerting consumers, at a user interface level, about the privacy risks resulting from the use of different sensors and applications, as well as a unified hardware and basic software approach to security. The emphasis on continuity rather than security illustrates that companies and organizations, including research institutions, need to take active security and privacy protection measures within their domains. The industry as a whole should be aware of the risks to their businesses posed by security breaches, and should take the necessary steps to guard against these risks.

At least three steps are vitally necessary to head off what I see as a serious crisis developing—serious for individuals who will suffer as a result of abuses of their privacy and personal information, and also for the many companies and organizations that will suffer from all-too-predictable legislation enacted to protect citizens from the “evils” of technology being perverted by unscrupulous forces.

1. Inform the public of data products’ vulnerabilities. The makers of all devices that are capable of collecting and/or transmitting data should inform the public of any known vulnerabilities associated with their products. Whether or not this should be a legal duty is another matter, as it is probably impossible for a company to come up with an exhaustive list of every way in which its products could be abused.

Industry, too, needs to create standardized guidelines on the use of sensor data that contains personal information. There needs to be a cross-industry “best practices” standard to govern the implementation of these sensors at the device level, one that can be explained to end users in a standardized format so that their use is consistent.

2. Make security a data design priority. Companies engaged in such designing and manufacturing must proactively incorporate security in their products and the design process. Designers must balance the accelerated demand for new features against a possible regulatory backlash that may occur if security becomes a populist consumer issue.

There are real-world examples of how security is already being taken seriously in areas that may seem surprising. Some copier manufacturers, such as SHARP, offer and promote encryption on the hard drives built into their copiers and printers. Such encryption significantly reduces the value of a stolen or illegally accessed hard drive. Many laptop manufacturers now offer the option to disable USB ports (this is standard operating procedure in many corporate Windows desktop builds), and several cell phone manufacturers promote models without cameras. Unfortunately, these solutions fail to address the root cause of the issue; they are merely “patches” for a few of the holes in what is a veritable Swiss cheese of data insecurity.

3. Set high standards and enforce them. Perhaps most important of all, industry players must collaborate and implement stringent self-regulation to better define the collection and use of data from the different sensors in our lives. Moreover, global business must work closely with government to strengthen the penalties for any interception of information containing personal data not intended for the person or organization reading it.

We stand at a crossroads in terms of dealing with data security, and both paths are, for different reasons, highly unattractive. To some, prompt, meaningful self-regulation to avert the coming crisis seems impossibly difficult; the alternative is to suffer the painful, throw-the-baby-out-with-the-bathwater overreaction of technically unsophisticated, politically motivated government regulators.

I argue that self-regulation is far preferable to government control. I am aware that cross-industry cooperation, not to mention industry–government cooperation, is no easy matter, but the consequences of delaying could be catastrophic. It is essential to avert this crisis so that consumer choice isn’t restricted, manufacturers aren’t shackled, and researchers aren’t thwarted in their development work by a new wave of draconian personal data protection laws.

About the Author

William H. Saito is an entrepreneur, venture capitalist, educator, and advisor on security issues worldwide. Currently, he serves as an advisor on innovation and entrepreneurship for several Japanese ministries and lectures at Keio University (SFC) and at Tokyo University of Agriculture and Technology (TUAT) in Japan. He was recently selected as a Young Global Leader (YGL) for 2011 by the World Economic Forum. Web site http://saitohome.com/.

The Case Against Cash

By David R. Warwick

Alternatives to cash would not only reduce violent crime, but also deter underground economic activity that goes untaxed. An advocate for a cash-free economy asks why the U.S. government isn’t promoting it.

Twenty years ago, Harvey F. Wachsman, a lawyer and medical doctor, wrote an op-ed piece published in the New York Times proposing that cash be abolished and replaced by government bank cards. His purpose was to reduce tax evasion, muggings, and other crimes. Many observers over the years have expressed the same idea. Yet, to this day, rudimentary research on abolishing cash, let alone a cost-benefit study, is nonexistent.

Aside from local news of an occasional mugging or robbery—and the strong suspicion that workers who prefer cash do not report such earnings to the IRS—few people in the United States are aware of the extent and severity of cash crime and how heavily it weighs on the economy. Americans’ reliance on cash is a contributor—albeit an indirect one—to the high U.S. prison rate and the bloody Mexican drug war.

Consider the fact that the nation’s prisons are overflowing with the highest inmate population in the world. With only 5% of the world’s population, the United States holds about one-quarter of its prisoners. At an average outlay of $25,000 per inmate, the United States spends more than $60 billion annually just to house prisoners.

Now consider that low-level drug offenses comprise 80% of the rise in the federal prison population since 1985 (though those numbers have begun to go down in more recent years). Harsh penalties for nonviolent drug offenses are a central component of prison overcrowding. Whether or not the War on Drugs or minimum sentence requirements are an appropriate or fair response to U.S. drug use, one thing is certain: The vast majority of those illegal transactions are cash-based.

Greenbacks are also the currency of choice for Mexican drug cartels, which funnel between $19 billion and $29 billion in profits out of the United States annually, according to the U.S. government. That money goes across the border by the truckload. More than half of it quickly disappears into Mexico’s cash-based economy, where it goes toward off-the-record purchases of big-ticket items: tracts of land, car dealerships, and even hotels, according to L.A. Times writer Tracy Wilkinson.

The good news is that, for law-abiding consumers in the United States, cash is disappearing. A prediction made 15 years ago by economist Milton Friedman is proving correct: cash “will die a natural death.” But you would not get this impression judging from the amount of currency in circulation: In December 2010, it reached $829 billion, and that sum rises every year. However, about two-thirds of that is located abroad. Most of the rest may be hoarded by a small minority, because a recent survey by the Boston Fed reveals that the average U.S. consumer carries only $79 in cash and keeps another $157 in cash elsewhere (home, car, office).

The best indicator of where cash is headed is the percentage of point-of-sale (POS) payments made in cash compared with other types of payment. The Boston Fed survey reveals that Americans now pay in cash at the POS only 22.7% of the time. Cash is clearly trending downward. Indeed, the recent surge in debit-card usage has come not so much at the expense of credit cards as at the expense of cash and checks.

Can Reducing Cash Reduce Crime?

The post-cash era will see a reduction in those crimes in which cash is the typical payment medium. If it becomes more risky and difficult for thieves to sell stolen goods, then criminals will steal fewer goods in the first place, and crime rates for simple theft and burglary of goods will fall. Even identity theft and wire fraud will decline. Fraudulently acquired goods are typically sold for cash, and fraudulently wired funds are most often redeemed in cash in order to break audit trails. Cash also cloaks the links between thefts and subsequent sales of the stolen property in online auctions and at flea markets.

The greatest single benefit will be the elimination of cash robberies. Around 800 Americans are murdered every year in these crimes. A recent study by Iowa State University places the overall societal cost of a single armed robbery at $335,732 and that of a murder at $1.75 million. Armored car service alone is a $15 billion industry. Based on 2009 FBI statistics, this means that ending cash robberies (even allowing for noncash robberies) would save the United States about $144 billion per year.
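
A rough reconstruction of that figure is sketched below. The 2009 robbery count used here, about 408,000, is an assumption for illustration, and the article’s $144 billion presumably folds in further costs:

    robberies = 408_000        # assumed 2009 FBI robbery count (illustrative)
    murders = 800              # annual cash-robbery murders (article's figure)
    total = robberies * 335_732 + murders * 1_750_000   # Iowa State estimates
    print(f"${total / 1e9:.0f} billion per year")  # on the order of $140 billion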

Criminals may turn to cash alternatives, but all will prove to be poor substitutes. Credit and debit cards are easily traced. Barter is impractical. Prepaid cards or foreign currency may work in limited scenarios, but on analysis they have other snags. None of these comes close to cash in providing ease of use, secrecy, universal acceptability, and value-storage. Drug crime will continue, no doubt, but today’s freewheeling narcotrafficking empires are destined to disintegrate.

A further benefit of the demise of cash is that it will compel the unbanked—some 17 million Americans—to establish bank relationships. This will not only save them often exorbitant fees for payday loans, pawn shops, refund anticipation loans, and rent-to-own financing, but it will also further the government’s imperative to provide financial access to all citizens.

Privacy protections will evolve with the post-cash society. Subpoenas or consent will still be required to access payment data. Yet, current investigative methods, particularly “following the money,” will yield greater crime-solving results. Most significantly, the much-increased chance of being caught will deter many crimes in the first place. Deterrence promises to be particularly effective against tax evasion: The widespread practice of failing to report and pay taxes on cash income costs honest taxpayers at least $300 billion per year. Cash is the essence of the underground economy—which the Cato Institute, a libertarian think tank, estimates at 13% of U.S. GDP.

Getting to a Post-Cash Future

Obviously, the Fed cannot simply pluck cash out of circulation without a replacement plan. Notwithstanding the terminal decline of cash, actively abolishing it is a formidable challenge: Abolition interrupts cash’s economic functions abruptly, rather than simply allowing cash to succumb to attrition by other payment media, a gradual process during which glitches tend to work themselves out. No country has ever abolished its currency.

Government could speed its descent, for example, by imposing a federal tax on cash withdrawals from ATMs and on receiving “cash back.” This might well precipitate the final process. Belgian economist Leo Van Hove suggests “nudging” cash out of circulation by means of such surcharges. The Fed could also transform cash into an electronic currency. This would preserve seigniorage and government control of the national unit of money, to name but two advantages.

Powerful and influential privacy advocates extol cash because of its anonymity; it leaves no trail and does not involve third parties. But for the public, cash-based crime is violent and real, and a more tangible threat than concerns about government “spying” on citizens. Bank cards are deemed safe and easy to use, and losses from such use are minimal and insured. Americans are making their choice clear: Nearly 80% of their point of sale purchases are made in trail-leaving transactions in which they freely give their names.

While government economists likely recognize most privacy concerns as hyperbole and are cognizant of the damage that cash facilitates, particularly in the underground economy, actually acting to abolish cash remains controversial. A cashless future is a policy labeled “radioactive” by some—too politically dangerous to discuss.

Enlightened economists around the globe are breaking away from such constraints. I would urge America’s policy makers to recognize the profound social and economic merit in the proposal, join ranks, roll up their sleeves, and get to work on it.

If they do not, cash will crash anyway. But that may take 20 or more years, and it will cost many thousands of lives, even more serious injuries, and trillions of wasted dollars.

About the Author

David R. Warwick is a real-estate developer, investor, and former attorney. He is author of Ending Cash: The Public Benefits of Federal Electronic Currency (Quorum Books, 1998) and many articles in various publications. E-mail drwarwick@comcast.net.

Connectivity and Its Discontents

A book review by Edward Cornish

Once a cheerleader for new electronic technologies, an MIT professor now worries about their long-term effects.

Alone Together: Why We Expect More from Technology and Less from Ourselves by Sherry Turkle. Basic Books. 2011. 360 pages. $28.95.

Sherry Turkle, a clinical psychologist and professor at the Massachusetts Institute of Technology, has spent more than 15 years studying the relationship of humans with robots and other electronic technologies. Once a techno-cheerleader, she now questions whether our relationship with new electronic technologies will work out well over the long run.

“Technology now reshapes the landscape of our emotional lives,” Turkle writes in her new book, Alone Together, “but is it offering us the lives we want to lead? Many roboticists are enthusiastic about having robots tend to our children and aging parents, for instance. Are these psychologically, socially, and ethically acceptable propositions?… And are we comfortable with virtual environments [e.g., Second Life] that propose themselves not as places for recreation but as new worlds to live in?”

Overwhelmed by demands on our time, says Turkle, we turn to technologies that promise to help us but in fact make us more harried than ever, so we eagerly escape to the Web. “Then, we gradually come to see our life on the Web as life itself!”

Technologies like BlackBerrys and cell phones do connect us to friends all over the world, and robots can serve us as helpers, friends, lovers, and playthings. But such technologies also impose serious costs.

One problem is that these technologies distract us from attending to the people we are actually with. Coming home after a long day, a schoolteacher talks to a robot dog because her husband is too busy on his cell phone to listen to her.

Children longing for a bit of “quality time” with their parents find that mothers are online chatting with faraway friends while fathers text-message during Sunday dinner.

Meanwhile, in schools, students update their Facebook status instead of listening to teachers.

Affectionate Robots, Dialed-Down Friends

Deprived of meaningful contact with human friends, many people now turn to robot pets such as the Furby—a highly sociable, owl-like robot that plays games and seems to learn to speak English as its owner plays with it.

A robot owl or other pet that can speak soothingly at bedtime can be a comfort to people both old and young. So in a few years, airlines may routinely distribute Furbies, Tamagotchis, and other sociable robots to passengers wanting a little rest and relaxation on long overseas flights.

At the same time, we may be losing the real human touch of other people, even as we try to stay more connected: “I don’t use my phone for calls any more,” reports a college student. “I don’t have the time to just go on and on. I like texting, Twitter. Looking at someone’s Facebook wall, I learn what I need to know.”

Young people now expect to be continuously connected to their friends wherever they are and to be always “on.” Meanwhile, people have become increasingly impatient with dealing face to face with other people. Hardly anybody now seems to have time for a relaxed conversation; instead, people compulsively text-message while driving to work, despite the risk.

Alone Together offers a wealth of information about the numerous uses being made of new technologies, but Turkle does not offer a clear answer to the problems she describes. It seems highly unlikely that these technologies will disappear (unless they are replaced by even more powerful technologies) or that people will refrain from using them in ways that many of us will not always be happy with.

On the other hand, a glance at history reveals that major technological innovations have frequently alarmed the contemporary world when they first appeared but became completely accepted as time passed.

About the Reviewer

Edward Cornish is founding editor of THE FUTURIST and the World Future Society’s futurist-in-residence.

Cautions about Techno-Faith

By Rick Docksai

No technological innovation can substitute for human critical thinking, argue two engineering and science professors.

The Techno-Human Condition by Braden Allenby and Daniel Sarewitz. The MIT Press. 2011. 216 pages. $27.95.

What if two people plugged into an electronic brain–brain interface that transmitted all their thoughts to each other? In spring 2001, according to Braden Allenby and Daniel Sarewitz in The Techno-Human Condition, a panel at the National Science Foundation considered this very scenario. The participants, who included health researchers and IT executives, agreed unanimously that this interface would erase misunderstanding and usher in world peace.

It didn’t occur to them that humans who understand each other might still want to kill each other, or that good diplomats sometimes have to keep some information to themselves. Allenby, an Arizona State University engineer, and Sarewitz, an Arizona State science professor, cite this as one example of well-intentioned humans placing too much faith in technology.

People often fail to anticipate a new technology’s undesirable side effects, the authors argue. The twentieth-century physicists who discovered nuclear energy did not foresee the atom bomb, global arms races, or toxic fallout from defective nuclear power plants.

Blind faith in technology and failure to gauge new technology’s long-term consequences both receive scrutiny in The Techno-Human Condition. Allenby and Sarewitz stress that technology doesn’t exist in a vacuum: It interacts with larger human and natural systems in both helpful and harmful ways.

“Technology is best understood as an earth system—that is, a complex, constantly changing and adapting system in which human, built and natural systems interact,” the authors write.

Central to their discussion is the use of technology to enhance humans’ mental and physical performance. The transhumanist movement anticipates humans merging with machines, with consequently huge accelerations in people’s life spans, intelligence, and overall well-being. Allenby and Sarewitz agree that technology will transform life, but they call for caution to ensure the best results.

For instance, many people would opt for medical treatments that boost their cognitive skills. But cognitive enhancement does not make someone a better person, and a malicious person who undergoes cognitive enhancement might become even worse—by becoming more capable, he or she also becomes more dangerous.

“If a lot of jerks improved their concentration, the cumulative effect on the rest of us might well be unpleasant,” the authors write.

Western societies, in particular, have a track record of overly trusting technology, according to Allenby and Sarewitz. The Enlightenment era of the seventeenth and eighteenth centuries bred successful traditions of scientific inquiry but also left many Westerners assuming that reason and analysis would solve or manage all societal ills.

Allenby and Sarewitz hold that Enlightenment-era thinking has run its course, for our world increasingly defies understanding and management. Humanity will not make much further progress until it learns to embrace complexity and contradiction instead of instinctively trying to resolve them.

The authors state that any technology exists on multiple levels. On one level, a human user operates it and benefits from it. On another, the technology interacts with society and generates long-term change. Social media exemplifies this. People first used Facebook, Twitter, and similar applications to interact with other users. Then resultant changes in social interaction, marketing, political activism, reading habits, and human thought patterns emerged.

Allenby and Sarewitz identify five technologies that are poised to rapidly evolve and generate massive societal change: nanotechnology, biotechnology, robotics, information and communications technology, and applied cognitive science. Societies will benefit more from them if, while creating them, they also create frameworks to guide their continuing evolution.

The frameworks will have to be adaptable. Technology will change quickly, so the rules governing its use will need to change along with it.

“The lessons of yesterday’s experience are not easily transferred and applied to today’s problems,” write Allenby and Sarewitz.

The authors place more hope in forums for exploring scenarios of innovations and their consequences. Critical analysis, experiential learning, forecasting, and emergency planning will all be vital to helping people navigate the often-bewildering paths of innovation.

“Intelligence must co-evolve with, and emerge from, experience,” they write.

The Techno-Human Condition is a thoughtful and analytical discussion of how humanity might continue to develop technology while preserving the best of human nature. The authors’ tone is philosophical and academic; it is not light reading. But readers who look forward to an informative debate will be highly satisfied.

About the Reviewer

Rick Docksai is a staff editor for THE FUTURIST and an assistant editor for World Future Review. E-mail rdocksai@wfs.org.

Books in Brief (July-August 2011)

Edited by Rick Docksai

Crisis Modeling

Crashes, Crises, and Calamities: How We Can Use Science to Read the Early-Warning Signs by Len Fisher. Basic. 2011. 233 pages. $23.99.

A stock market crash and a volcanic eruption have at least one thing in common, according to science writer Len Fisher: People can foresee them if they know what to look for. As Fisher explains, most of the worst natural and human disasters strike quickly but with some prior warning signs. In Crashes, Crises, and Calamities, he describes how new models of system processes and system breakdowns can help observers to anticipate hurricanes, tornadoes, economic crashes, mass riots, and many other deadly situations.

Fisher describes the four processes—positive feedback, acceleration past the point of no return, domino effects, and chain reactions—that are the primary causes of runaway disasters in the environment, economics, society, and even our personal lives. Each one, in its own way, pushes a system toward a “critical transition” point at which massive change takes place in a very short time.

It used to be nearly impossible to spot a critical transition until it had already begun. New computer models developed in the last decade, however, have vastly expanded our ability to spot them, revealing common patterns that run through many natural and human system failures. It is now possible to foretell when many natural or human-made systems are about to collapse.

Fisher offers examples of research models and how researchers use them. For example, a “fold catastrophe” model helps ecologists understand ecosystem degradation, psychologists make sense of mood swings, and anthropologists chart the rise and fall of human civilizations. Another, the “cusp catastrophe” model, illustrates the thought processes that lead college students to engage in binge drinking. No matter what the subject area, the right models help researchers to make sense of data and, ultimately, help craft sound public policies.
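
Readers curious about the underlying mathematics can find it in standard catastrophe theory; the canonical fold-catastrophe potential below comes from that general theory, not from Fisher’s text:

    V(x) = x^{3} + a\,x, \qquad
    V'(x) = 3x^{2} + a = 0 \;\Rightarrow\; x_{\pm} = \pm\sqrt{-a/3} \quad (a \le 0)

For a ≤ 0 the system has one stable and one unstable equilibrium; as the control parameter a rises through zero, the two collide and vanish, and the system jumps abruptly to some distant state. That sudden loss of equilibrium is the “critical transition” Fisher describes.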

Some models are more accurate than others. Fisher advises that readers apply skepticism to critique the models themselves. He lays out sets of questions that an observer can ask to ascertain that the data, the model, the calculations, and the people conducting the tests are each demonstrably reliable. A model that fails on any of these four points will probably not offer reliable forecasts.

A book that describes scientific research models could easily intimidate average readers. Fisher’s book does no such thing. The author lays out the theories and their uses in very direct, conversational language. Crashes, Crises, and Calamities is well-suited for all readers who want to think creatively about their future and the world’s future.

A Former Oil Advisor Talks Climate Change

Earth: The Operators’ Manual by Richard B. Alley. W.W. Norton. 2011. 479 pages. $27.95.

Richard B. Alley is admittedly not the stereotypical environmental advocate: He worked many years for an oil company, and his political registration is right-of-center. But the Penn State geosciences professor, who is also an Intergovernmental Panel on Climate Change researcher, takes the greenhouse-gas threat seriously. In Earth: The Operators’ Manual, he explains why.

In simple—but data-rich—prose, he illuminates carbon dioxide’s role in warming Earth’s climate from prehistory onward. He lays out the conclusive evidence that dangerous warming is now under way, and that human-generated carbon dioxide is the culprit. Along the way, he rules out every other proposed cause of warming, from sunspots to volcanoes. As a corollary, he highlights the looming crisis in fossil-fuel supplies.

He also tackles many objections raised by global-warming skeptics: He explains why the apparent worldwide cooling after 1998 does not disprove warming, why the 2009 “Climategate” incident does not discredit the evidence for climate warming, and why some countries had record snowfalls and cold spells in recent years even though the planet is allegedly heating up.

Alley calls for a measured transition away from fossil fuels, with taxes on carbon emissions imposed to incentivize progress. Then he walks readers through a variety of renewable-energy systems. None can replace fossil fuels just yet, he says, but the creativity and initiative that their inventors demonstrate should give us great hope.

Much of Alley’s information will be old hat to environmental researchers and energy insiders, but general audiences may gain many new insights. He shows a firm understanding of both the business world and climate science, laying out the facts in a direct but well-balanced and ultimately hopeful voice of authoritative scholarship. Lingering skeptics—and those wishing to debate said skeptics—will find Earth: The Operators’ Manual a worthwhile read.

The Dollar Isn’t What It Used to Be

Exorbitant Privilege: The Rise and Fall of the Dollar and the Future of the International Monetary System by Barry Eichengreen. Oxford University Press. 2011. 215 pages. $27.95.

The U.S. dollar was the world’s currency of choice throughout most of the twentieth century and the first few years of the twenty-first, but within a few years it may become just one international currency among many, says Berkeley economist Barry Eichengreen.

Confidence in the dollar dropped precipitously in the wake of the 2008 financial crisis, he notes. Whereas international banks had bought billions of dollars of U.S. securities every year, U.S. assets were now viewed as toxic.

Eichengreen believes that the recovered, post-recession global economy will hold a reduced role for the dollar. India and its trade partners will use the Indian rupee for more transactions; Brazil and its partners, the Brazilian real; etc.

The twenty-first century world marketplace is distinctly multipolar: The U.S. economy thrives alongside those of Europe, Japan, and the surging BRIC countries. Also, mobile technology makes it easier for buyers and sellers in different countries to conduct transactions with different currencies.

But Eichengreen does not expect the dollar to go away. Not only are markets everywhere accustomed to using it, but some would suffer losses if they disposed of their dollars too quickly. Also, each rival currency has major problems of its own. Eichengreen concludes that the dollar will retain a dominant presence in international commerce for many years to come. However, the dollar’s stock will fall further if the United States does not rein in its annual budget deficits, he warns.

Some authors trumpet American exceptionalism, while others hold that the United States is on a path of irreversible decline. Eichengreen presents a well-reasoned middle path. Economists, public-policy specialists, and others curious about the future of the global marketplace will find Exorbitant Privilege to be an analytical and provocative read.

The Roots of Economic Recession

Fault Lines: How Hidden Fractures Still Threaten the World Economy by Raghuram G. Rajan. Princeton University Press. 2010. 260 pages. $26.95.

Economies may be recovering from the 2008 recession, but another meltdown may be inevitable unless major changes in worldwide financing take place, warns finance professor Raghuram G. Rajan. The “fault lines” that led to the last recession—income inequality, exorbitant debts, and high-risk capital—are still with us and must be fixed, he says.

The current system still rewards banks that issue high-risk loans and encourages financial institutions to buy delinquent debts en masse, according to Rajan. Traders assume that, even if their accounts fail, their governments will cover their losses.

Regulatory authorities are to blame, also. Many do not demand enough information from the financial institutions, and even when they do, they often do not disclose it all to the public.

At the country level, the world’s most prosperous economies, such as Japan and the United States, cannot grow without borrowing from China and other lender nations. Developing economies, meanwhile, depend too heavily on exports.

Rajan encourages light, targeted regulations that reduce risks but hold financial institutions responsible for their own success or failure. He also calls for open information channels that keep investors, consumers, and regulators fully aware of risks.

Social equality is also a must, according to Rajan. When a country’s citizens are deeply unequal in health-care access and opportunities for education, they will be more prone to unemployment and high debts. A country that maximizes opportunities for all its citizens, however, will better withstand economic ups and downs.

Recession-weary readers may consider Fault Lines grim news. Rajan makes no secret that the existing market systems greatly worry him. But he also expresses hope that, if the world community learns the right lessons from the latest recession, a more stable and equitable world economy may be in our future.

A Post-Western Global Economy?

How Asia Can Shape the World: From the Era of Plenty to the Era of Scarcities by Joergen Oerstroem Moeller. Institute of Southeast Asian Studies. 2011. 540 pages. $49.90.

A new global economic system will emerge this century, and Asia will probably be its center of gravity, argues Joergen Oerstroem Moeller, an Institute of Southeast Asian Studies senior fellow, in How Asia Can Shape the World.

Demographic and environmental challenges demand new economic models that use fewer materials and more manpower while balancing the drive to accumulate wealth with the realities of finite resources. The countries of Asia are well positioned to form that model, according to Moeller. Their millennia-old philosophical traditions prize harmony, sustainability, and social responsibility while discouraging materialism—and Asian businesses exemplify these principles by, in general, showing a greater sense of corporate social responsibility than their Western counterparts do.

Moeller also cites the growing size of Asia’s financial sectors and the increasing investments by Asian businesses in Africa, Latin America, and the Middle East. Also, Asia is becoming the primary market for the developing world’s raw-material exports. Over the next 25 years, China—not the United States—could be the most influential power in some developing regions.

Asian leadership of the global economy, however, is not a foregone conclusion. China, Japan, and their regional neighbors urgently need to modernize their educational systems to graduate enough qualified workers. They must also increase their domestic spending rates to compensate for shrinking workforces and aging populations. China faces the additional challenges of bringing more women into the workforce and creating new social-welfare programs to attend to massive future populations of retirees.

How Asia Can Shape the World is a global view of the twenty-first-century economy, how it may evolve, and the role that the nations of Asia may play in its evolution. Its title may appeal foremost to scholars of Asia, but economists and policy analysts in any corner of the world may find it relevant.

Debating Geoengineering

How to Cool the Planet: Geoengineering and the Audacious Quest to Fix Earth’s Climate by Jeff Goodell. Houghton Mifflin Harcourt. 2010. 262 pages. $26.

Could purposely deploying clouds of pollutants that blot out sunlight be our last, best hope for averting catastrophic climate change? Proponents of “geoengineering” think so. Jeff Goodell, a journalist who contributes to Rolling Stone and the New York Times, explores current research into how—and whether—geoengineering might work.

In classic journalistic style, Goodell draws on interviews with a range of climate experts who have studied geoengineering. Their viewpoints vary: Some ardently support geoengineering, while others think it is ludicrous and maybe even dangerous.

Geoengineering would be foolhardy if we attempted it without understanding in full how Earth’s systems work, the critics tell Goodell. They contend that our knowledge is still very limited: Inaccurate weather forecasts and computer models of climate change that actually underestimate the extent of planetary warming effects both attest to this, they argue. They fear that geoengineering might unleash many unintended side effects, such as weather changes and altered water currents.

Plus, while it might stop carbon-dioxide-induced warming, it would do nothing to stop carbon dioxide’s other harmful effects, such as the ocean acidification that destroys coral reefs. Only serious reductions in emissions by humans can achieve that.

Knowledge gaps do not faze the proponents, who remind Goodell that climate change is happening quickly and with intensifying force. They see little time ahead for humanity to take action adequate to mitigate it. Geoengineering will be a winning solution, they argue, since it is faster-acting and less expensive than other alternative fixes.

How to Cool the Planet is a behind-the-scenes look at the debates and speculations surrounding geoengineering. Goodell gathers a range of voices, both pro and con. Environmentally aware readers who like a good debate will enjoy this book.

[Ed. note: See also “Dimming the Sun,” World Trends & Forecasts, May-June 2011.]

Towns That Take Care of Themselves

Walk Out Walk On by Margaret Wheatley and Deborah Frieze. Berrett-Koehler. 2011. 264 pages. Paperback. $24.95.

When the World Bank suspended its shipments of fertilizer and seeds to Zimbabwe, residents of Kufunda, Zimbabwe, feared that they would go hungry: Their farms had always depended on outside support. Then they conferred with their village’s elders and, under the elders’ guidance, created a new, localized agricultural system that could feed them year after year without government subsidies or NGO donations and with enough resilience to withstand droughts, pest infestations, and other hazards.

Kufunda is one of seven success stories told by Margaret Wheatley and Deborah Frieze, both former presidents of the nonprofit Berkana Institute, in Walk Out Walk On. The authors profile seven communities that implemented unique solutions to the problems that afflict cities and towns throughout the world.

The communities’ residents are “walk outs,” according to Wheatley and Frieze, because they exited from economic models, situations, and ideas that had confined them. Then they “walked on” to new concepts, practices, and opportunities by ceasing their reliance on outside help and direction. They drew upon their own initiative and ancestral wisdom to grow their own crops, produce their own energy, and create vibrant public spaces out of previously abandoned buildings and parking lots.

Although the communities’ development models are unique, they are guided by core principles that communities anywhere can emulate, the authors conclude: self-sufficiency, compassion, community wisdom, and listening to all of one’s neighbors, especially the poorest and most disadvantaged. Wheatley and Frieze hope that more communities will join them in walking out of resource-depleting and exploitative practices, and walking into empowerment of all people.

Community activists on every continent long for communities to become more sustainable and more self-reliant. Wheatley and Frieze’s Walk Out Walk On speaks directly to them by showing how some communities are making that transition in the here and now.

May-June 2011, Vol. 45, No. 3

  • The Top 20 (Plus 5) Technologies for the World Ahead
  • Global MegaCrisis: Four Scenarios, Two Perspectives
  • Solar Power from the Moon
  • Finding Eden on the Moon
  • Why Farmers Need a Pay Raise
  • Building a Better Future for Haiti

The Top 20 (Plus 5) Technologies for the World Ahead

By James H. Irvine and Sandra Schwarzbach

Breakthroughs now emerging in biotechnology, robotics, and other key areas bear the potential to reshape life on Earth. Two military analysts describe the 20 innovations that will have the biggest impacts in the near future, plus five prospective technologies that could have major repercussions in the longer term.

About 10 years ago, we at the Naval Air Warfare Center in Southern California set out to determine how emerging technologies might change armed conflict over the next 25 to 50 years. We selected 200 new technological applications, projecting out their growth and how they might influence future military strategy and warfare.

Our conclusion: These technologies would be major drivers of not only future military affairs, but of virtually all of human life. From these 200, we examine here what we consider the top 20 innovations that will have the greatest effect in the near term; in addition, we’ve selected five other feasible technological developments that could significantly change our world in the more distant future.

1. Computer Technology

Computing power has increased by a factor of 10^6 since 1959. Based on present-day central processing technology, we can expect a further 10^8 improvement in the next 30 to 40 years. Advances of up to 10^18 (a quintillion-fold) could result if any of the following innovations (which already exist at the laboratory level) undergo further development:

  • Parallel processing.
  • Advanced computer architecture.
  • Special function processing chips.
  • Special function analysis chips.

Such a level of enhanced computer performance would require much more advanced production technologies, including new chip production technologies; new types of computer chips, circuit elements, and computer architectures and software; and a projected 10^5 improvement in telecommunications-transmission rates over the next 25 years.
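
As a rough check on what such factors imply, assuming smooth exponential growth (a simplification; real progress comes in jumps), a short calculation in Python converts an overall gain into an annual rate:

    # Back-of-envelope: what annual improvement does a 10^8 gain
    # over 30 to 40 years imply? Assumes smooth exponential growth.
    def annual_rate(total_factor: float, years: float) -> float:
        """Compound annual growth rate implied by an overall gain."""
        return total_factor ** (1.0 / years) - 1.0

    for years in (30, 40):
        print(f"10^8 over {years} years -> {annual_rate(1e8, years):.0%} per year")
    # Roughly 85% per year over 30 years, 58% per year over 40 years,
    # i.e., a doubling every 13 to 18 months, in line with Moore's law.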

2. Ubiquitous Computing

Household appliances and many other items in our everyday lives will be embedded with cheap and barely detectable microchips, sensors, microcontrollers, and microprocessors that sense our presence, anticipate our wishes, and read our emotions. Imbued with these tiny yet powerful computing components, appliances and consumer products will become “intelligent.” They will interconnect and communicate with each other via network grids.

The ubiquitous-computing phenomenon will be further enabled by three new technologies:

MEMS. Micro-electro-mechanical systems integrate items such as sensors, computers, data storage, and transmission systems onto a single computer chip. MEMS are small, lightweight, low-power, and easy to mass-produce. They also measure a wide range of physical phenomena, such as acceleration, inertia, and vibration. They can be analytical instruments to measure biological or physical states and can also be active response systems.

Bots. Formally known as semi-intelligent specialized agent software programs, bots can automatically sort data based on set preferences, keep track of specific dynamic data sets (such as checkbook balances or inventories), maintain schedules and calendars, and track movement of things and people while integrating them with outside events. Bots are also capable of interacting with other computer software and other bots on their own initiative to accomplish tasks independently of a human user.

The general deployment of bots is projected to occur in the next seven to 10 years, pending the rollout of more advanced processor hardware. Masses of bots and bot-inhabited equipment will work together without human initiative—or even human knowledge—to automate large portions of society’s routine activities. Bots will also manage computer networks. By 2025, the Internet will have evolved into a bot-coordinated, bot-directed “information grid” that connects billions of devices, nodes, and sensors to each other. Under bot management, the Internet will be much more dynamic than it is today.
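
A minimal sketch may make the idea concrete; the class name, threshold, and action below are invented for illustration, not drawn from any existing product:

    # A toy "bot": it watches one dynamic data set (a checkbook balance)
    # and acts on a user-set preference without further human initiative.
    class BalanceBot:
        def __init__(self, low_water_mark: float, notify):
            self.low_water_mark = low_water_mark  # the user's standing preference
            self.notify = notify                  # action taken on the user's behalf

        def observe(self, balance: float) -> None:
            """Called by the data feed whenever the balance changes."""
            if balance < self.low_water_mark:
                self.notify(f"balance {balance:.2f} fell below {self.low_water_mark:.2f}")

    bot = BalanceBot(low_water_mark=100.0, notify=print)
    for balance in (250.0, 130.0, 80.0):  # simulated account updates
        bot.observe(balance)              # prints once, when balance hits 80.0

Chain many such agents together, let them message one another rather than a human, and the “information grid” the authors describe begins to take shape.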

Swarm technology. Network command-and-control system architecture will be very unlike that of networks today. The ability to understand and manage the collective movements, reactions, and interactions of masses of interconnected items will be critical. Swarm technology—i.e., decentralized arrays of agents or programs interacting locally with one another and with their surroundings, thus carrying out “intelligent” large-scale behavior (much like an ant colony, bacterial culture, or school of fish)—will be important in the near future for controlling and managing this new system.
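
The essence of swarm technology (local rules, no central controller, coherent global behavior) can be shown in a few lines of Python; the agent count and update rule below are arbitrary choices for illustration:

    # Toy swarm: each agent sees only a few neighbors, yet repeated
    # local averaging aligns the whole group. Parameters are arbitrary.
    import random

    N, NEIGHBORS, STEPS = 50, 5, 200
    headings = [random.uniform(0.0, 360.0) for _ in range(N)]  # degrees

    for _ in range(STEPS):
        updated = []
        for i in range(N):
            lo, hi = max(0, i - NEIGHBORS), min(N, i + NEIGHBORS + 1)
            local_mean = sum(headings[lo:hi]) / (hi - lo)
            updated.append(headings[i] + 0.5 * (local_mean - headings[i]))
        headings = updated

    print(f"heading spread after {STEPS} steps: {max(headings) - min(headings):.4f}")
    # The spread shrinks toward zero: global alignment emerges from purely
    # local interactions. (Wrap-around at 0/360 degrees is ignored for brevity.)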

3. Human Language Interface for Computers

Another great technological advance of the next 20 years will be the development of computers with human-language interfaces that fully comprehend human words—both spoken and written—and their meanings and that will talk, listen, and read aloud in humanlike voices. Some applications will permit information retrieval using natural language and automated foreign language translation for print and voice. Also, semi-intelligent personal search agents will use human-language interfaces to search the Internet’s databases and archives to compile information in specialized fields of knowledge and areas of interest based on the human user’s specific interests and wishes.

The human-language computer interface could potentially transform society from a written culture to one relying more on verbal interactions. This interface will also automate a large number of voice-based activities, such as placing orders, asking directions, and executing verbal instructions to perform complex tasks. Education and service sectors will both become more automated.

4. Machine Vision

Machine vision will become available in five to 15 years and will grow more sophisticated over time. Mature machine vision will have capability far beyond the range of the human eye (infrared, ultraviolet, multispectral). Robotic systems equipped with machine vision will recognize, classify, sort, and manipulate objects and respond to changes in their environments in unique ways. They will be put to a wide variety of industrial, laboratory, and surveillance uses, such as automatic guidance systems for vehicles and accident-avoidance systems for machinery.

5. Robot Technology

We are now in the process of developing human-directed, virtual presence machines capable of remote-controlled movement and manipulation of objects. These devices are often called robots, which they are not. The technology to build real robots is on the way, however.

In the near-term future, our world will be driven by two emerging technologies that are advancing simultaneously: robotics and biotechnology. These technologies will overtake information technology and give us a new Socio-Technological Age around the year 2025. This new age will continue for 50-plus years.

The technologies needed to build robots that can perceive their surroundings, move themselves, and perform tasks without human oversight should reach fruition between 2015 and 2025. By 2040, robotlike machinery will inhabit the world alongside people, doing much of the work.

As robots enter the mainstream, they will probably exert major economic and societal impacts. On the positive side, labor productivity will vastly increase, which could be a lifesaver as populations of retirees swell in the developed world. Replacing the retiring labor force with high-tech robotic equipment will ensure that economies remain productive enough to support their retirees. Additionally, hundreds of thousands of new jobs could become available for human professionals who possess the skills to program robot-to-human interface systems, movement control, harm-avoidance systems, vision packages, tasking systems, and speech-recognition programs.

At the same time, major disruptions of the world’s workforces could result. Studies estimate that robots could replace as much as one-third to one-half of human labor in some industrial and service sectors. Additionally, robot technology could almost completely take over agriculture and displace most, if not all, of the world’s farm workers. Even worse, this precipitous fall in human labor will probably occur at a rapid pace: in about a five- to seven-year period.

6. Telecommunications Revolution

Mass interconnection of computerized data systems has enabled people and machines to talk to each other at high data rates. These data rates will get progressively higher over the next half century. Optical fiber network transmission systems will continuously increase their capacities and reach transmission capabilities as high as 100 terabytes per second once new photonics switches, photonics circuit elements, optical routers, and plasmon switches—all now under development—go into widespread use. This will ultimately produce a seamless, all-optical network for data communications four to five times more powerful than the current one.

The next telecommunications revolution will offer mass sharing and transfer of databases, unrestricted worldwide communications and an ability to locate and communicate with anyone, a much higher diffusion of work via telecommuting, rapid and widespread dissemination of knowledge (unlimited access for everyone to the sum total of knowledge of the human race), and a much wider variety and availability of education and entertainment.

7. Fullerene Chemistry

In September 1985, Nobel Prize–winning chemist Richard “Rick” Smalley and his colleagues discovered the original C60 molecule, buckminsterfullerene (“buckyballs”), which comprises 60 pure carbon atoms. In 1990, a means to mass-produce buckyballs was discovered, making them available for large-scale study and establishing the new field of fullerene chemistry. Since then, chemists have learned not only how to form fullerene molecules, but also how to attach other kinds of molecules to them and build new structures and materials, such as nanotubes and graphene.

Nanotubes are hollow, tubelike structures composed of carbon atoms. They are very strong under linear tensile loads, conduct electricity with little resistance, can store items in their hollow interiors, can filter substances that pass through them, and conduct heat better than any other known material. Researchers are exploring carbon nanotubes’ potential commercial uses as fiber in composite structures, as superconductive wire, and as a storage medium for hydrogen fuel. Other uses may include transporting fluids into and out of the body, molecular sieves and filters, superconducting interconnections on circuit chips, computer memory storage devices, thermal regulators, and small electric plasma guns.

Graphene, first produced in a lab in 2004, is a flat, two-dimensional carbon fullerene consisting of a carbon sheet a single atom thick that can be extended indefinitely along its edges. It is an amazingly good conductor of electricity and has many potential uses in the electronics and semiconductor industries. Graphene ribbons made on an industrial scale, for example, could be used as connectors on computer chips. An experimental nanoscale graphene transistor was first demonstrated in a laboratory in April 2008.

Large-scale production of graphene wafers could produce a new class of semi-superconducting substrate with which to build computer chips. This would make possible several revolutionary advances in chip technology: development of a superconducting substrate layer to connect components, processing elements, and multiple core dies and development of graphene-based superconducting transistors.

Graphene wafers might also make Josephson junctions, induction switches, and “Y” switches work at room temperature. These three devices are three times faster than transistors, but at present they only work at cryogenic temperatures. If, by using graphene wafers, engineers successfully made them work at room temperature, they could create extremely efficient electrical networks that would not require active switching—i.e., fewer moving parts and fewer resources required.

8. Multi-Level Coding System in DNA

Scientists now recognize that DNA has at least six levels of coding. Some birth defects, cancers, and other genetic disorders may not actually be the result of genes themselves, but of coding errors in these outer layers of the DNA system. Some of the “non-gene” control layers may be easier to manipulate than the classic, first-line gene layer. Recent discoveries have opened several new lines of genetic research, making this the dawn of a new era in molecular genetics.

9. Biotech Analysis Instrumentation

Development of new instruments to examine biological phenomena is revolutionizing the fields of biological research and medicine. One of the most important new instruments is the DNA microarray, a compact robotic system that detects DNA and other biochemical matter. Modern microarrays’ detector systems are made with postage-stamp-sized coated glass wafers. Each wafer includes a grid of strands of DNA that bind only to their complementary DNA matches (or, alternatively, dots of some biochemical reagent).

The grid elements can measure the presence and level of a given gene or gene product (mutants, abnormal variants, dysfunctional genes) in a sample. These wafers can also find and analyze chemical and biological compounds within the body. Current machines can analyze only statistical samples. Scientists would like a machine capable of analyzing the entire human genome and its biochemical environment, with all its variants, in a single pass. It will probably be the late 2020s before a full human-body biochemical scan can be performed.
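
The binding rule each grid spot exploits is simple enough to state in code. The sequences below are invented, and real hybridization tolerates partial matches, which the exact matching here ignores for brevity:

    # Watson-Crick rule behind each microarray spot: a probe strand
    # captures only its reverse complement. Sequences are invented.
    COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

    def reverse_complement(seq: str) -> str:
        return "".join(COMPLEMENT[base] for base in reversed(seq))

    def binds(probe: str, fragment: str) -> bool:
        """True if a sample fragment hybridizes to this probe."""
        return fragment == reverse_complement(probe)

    probe = "GATTACA"               # one spot on the wafer's grid
    print(binds(probe, "TGTAATC"))  # True:  target present, spot lights up
    print(binds(probe, "TGTAAAC"))  # False: one-base mismatch, no signal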

At present, however, our greater challenge is not how to detect the chemicals, but how to interpret the results. Too few tests have been conducted to establish the normal level of most of these chemicals in the human body. By the mid-twenty-first century, we will have enough data analyzed to tell what chemicals, proteins, and enzymes are normal in the human body; whether their physical form is a mutation or merely a normal statistical variation; and whether certain measurements are metabolic disorders or just normal variations of human metabolism. This information will have a significant impact on health.

10. Human Biogenetic–Chemical Computer Model

New biotechnology and computer science breakthroughs are revealing the body’s biochemical secrets and spurring creation of new methods for attacking metabolic and genetic disorders. We have begun to examine the body’s biochemical nature to determine whether it is functioning correctly and is in balance and to determine what effect this balance or imbalance has on health.

Biochemists’ ultimate goal—a full-scale computer model of human genetics, biochemistry, and all their interactions—will be achieved within 10 to 20 years. The amount of data and calculation involved will require a more powerful computer than any available today, but this deficiency will be overcome within that time.

By the mid-twenty-first century, we will have a working computer model of human genetics, biochemistry, and major portions of their interactions. This will permit the modeling of an individual’s genetics and biochemistry, which can be used to diagnose and isolate individual biochemical deficiencies, including a number of conditions that today may be considered psychological but are actually statistical variations in metabolism. This will also be used to determine the effect of drugs and nutrients.

11. Treatment of Hereditary Diseases

The human race is now afflicted by some 4,000 hereditary diseases caused by genetic abnormalities. These diseases have until now largely been untreatable. However, the new knowledge of gene structure and function could possibly lead to new treatments. Eventually, genetic intervention could prevent or treat a large number of diseases. More successful treatments might be possible via selected artificial protein therapy and/or micronutrients.

It is also possible that this new biotech knowledge will uncover a variety of “minor” genetic diseases that people haven’t recognized or have assumed to be normal variations. Since these minor diseases affect a larger portion of the working population than the major hereditary disorders do, mitigating or curing them could lead to bigger increases in workforce productivity and performance.

12. Control of Bio-Metabolic Disorders

New means will arise to measure how the body is working at a biochemical level, to assess the body’s biochemicals (types and amounts), and to determine whether the body’s metabolism displays proper balance. Biochemical “retuning” will treat a number of chronic, long-term conditions—including Parkinson’s disease, Alzheimer’s, and possibly even the aging process itself—by supplying chemical compounds that the patient’s body lacks, realigning its biochemical functions.

13. Blood and Tissue Matching of Drugs

At present, only about 40% of the population reacts favorably to a new drug. The rest have either minimal reaction or adverse reactions. As knowledge of human bio-metabolism advances, however, clinicians will learn to group patients into bio-metabolism classes and tissue-type groups to determine who will benefit from a specific drug and who will have adverse reactions. Use of bio-metabolism classes and tissue-type groups will be widespread by 2050 and result in increased drug effectiveness, fewer negative drug reactions, and lower drug-treatment costs.

14. Tissue Engineering

The creation of self-replicating biomaterials for healing wounds and bone fractures, including the combining of synthetic materials and structures with living cells, is another area of scientific exploration. Tissue engineering will revolutionize body and wound repair, organ transplantation, and surgery in general.

New polymers that satisfy safety and effectiveness requirements are being researched and developed for many surgical uses, including tissue scaffolding, bone grafts, cartilage repair, tissue regeneration, wound repair, and tissue joining. Soon, artificial organs and body parts will be available for replacement surgery. Research programs are now under way to develop artificial ears, hearts, pancreases, lungs, kidneys, livers, and legs.

15. Neurotechnology

Neuroscientists have developed a set of scanners capable of determining how and where the brain performs specific functions. The new brain-scanning and brain-mapping tools are opening up a whole new understanding of how humans think and act. Using them, researchers can observe brain activity, measure its intensity, chart the general pattern of brain operation, and identify the type of chemical reactions occurring in the brain.

Brain-scanning technology will soon be upgraded by the use of atomic magnetometer sensors—a new magnetic sensor technology that uses cesium vapor as a sensing element. These devices are 100 times more sensitive and 1,000 times faster than present sensor elements. They will, with time, better discern how people think, how the brain performs tasks, how thought processes differ among individuals, and what those differences mean in relation to task performance and personality.

The new knowledge of brain operation and its effects will be one of the major, socially transforming events of the twenty-first century. Understanding how the brain operates on an individual basis will permit society to match the individual to task performance, to individualize educational programs, and to identify and mitigate mental illness.

16. Neuropharmacology

We can now see the operation of the brain. We can also systematically study and measure the effects of nutrients, micronutrients, and drug treatments on the brain and on various mental conditions. This has led to the new science of neuropharmacology—the study of how we change the brain’s operation through the use of drugs, food, and other nutrients, micronutrients, and proteins. Over time, this field will apply knowledge of the brain’s biochemical operations to systematically treat mental disorders and mental conditions pharmaceutically, as well as enhance people’s natural mental abilities.

17. Cellulose-to-Glucose Process

One of the major goals of the biotech and chemical industry is the production of glucose, the principal food of many microorganisms, from cellulose. If cheap, plentiful glucose were available, microbes could be genetically engineered to make almost anything. An economical cellulose-to-glucose process would revolutionize the world’s chemical industries and allow the conversion of much agricultural cellulose-based waste into useful raw materials.

18. Nanotechnology

Instrumentation has begun to permit us to see and manipulate matter at a nano level—10^-6 to 10^-9 meter—the level of atoms and molecules. This has created the new field of nanotechnology. The ability to create smaller structures using modern chip-manufacturing technology will permit us to change and modify materials one atom or molecule at a time and to develop super-fine powders, quantum dots, and nanotubes. These capabilities have now started to shrink things into the “upper nano” range—a range that advancements in production technology will push us into over the next 10 to 15 years. The scale of objects will continue to shrink, and some useful upper-nanoscale devices and phenomena will be developed and deployed.

19. Chaos Theory and Complexity Models

Our world is much more complex, interconnected, and dynamic than we once thought. New mathematical concepts are challenging the rationalized, deterministic, scientific models of the Industrial Age. The Industrial Age paradigm held that there is one best way to organize a given thing and that, in all cases, a given “rational” outcome is predetermined by nature. The new scientific paradigm will ultimately replace this older mentality.

The new Information Age is being driven by applied technology and by two major advances in theoretical science that are altering our view of how the world works: an ecological/ecosystem model, which supports ecological and environmental diversity, and modern chaos and complexity theories, which emphasize unpredictability, self-organizing systems, and the coexistence of the linear and the random. In the near term, this paradigm shift will significantly change people’s views of society, of themselves in relation to society, and of how the world and the greater universe work.

20. Fuel Cells to Allow Deep-Sea Habitation

A major effort is under way to develop advanced fuel cells for cars. The greatest social effect of fuel cells will not be in automobiles, however, but in the opening of the undersea world to exploration and habitation. Fuel cells that produce electricity directly, without producing toxic fumes as a byproduct, will bring down the costs of submarines and keep them running for days as opposed to hours. This will permit human exploration and—eventually—colonization of the continental shelves and the shallow oceans.

Fuel cells will lead to the development of extensive deep-sea business sectors and myriad human habitations out in the ocean. Mining operations to exploit the shallow ocean floor’s mineral wealth, as well as commercial aquaculture enterprises to exploit the ocean’s biological resources, will follow. Earth’s available resource base will expand significantly, and Earth’s population—which could reach more than 9 billion people by mid-century (or even 11 billion if medical advances extend average life spans)—will have much more room to grow.

Five Future Technologies and the Problems They Could Solve

Several technologies yet to come could significantly affect the nature of our world. Our top five are as follows:

1. Superconductivity at room temperature. When certain metals and ceramics are cooled to ultra-low temperatures, they become superconductive—i.e., they can carry huge amounts of electrical current for long periods without losing any of the current’s energy as heat. Diverse research is under way on making materials superconductive at room temperature. If it succeeds, we could substantially increase the efficiency of electrical machines and power grids (a back-of-envelope illustration follows this list) and also develop new types of computer chips, improved medical-imaging devices, and high-efficiency ion drives for space vehicles.

2. Low-cost space lift. Lifting objects into orbit is expensive—a problem that slows human improvement in space capabilities. The advent of a cheap space lift would allow exponential growth, and perhaps a new technological age. It would be attainable either by politicians agreeing to the massive funding needed for such a development or by some unforeseen, dramatic technological breakthrough. Neither, however, can be guaranteed to happen within the next 25 years.

3. Artificial intelligence of human-level capability in computers. The development and widespread use of human-level AI in computer systems stands to be one of the major advances in computer technology over the next 75 years. Human-level AI has been promised for 40 years, but to date those promises have not been fulfilled. Furthermore, there appears to be no current, fundamental breakthrough that will alter this in the near future. However, research grants bolster those who think that the big breakthrough is right around the corner.

4. Cellulose-to-liquid-hydrocarbon path. A number of new, synthetic fuel processes can produce diesel fuels from agricultural products. The means now exist for converting vegetable oils into biodiesel fuel, protein matter into diesel oil, various agricultural substances into synthetic oil, and sugars and starches into fuel-grade ethyl alcohol. Unfortunately, all these biosynthetic fuel processes are much more expensive than fossil-fuel generation, largely due to the costs of harvesting and processing.

One lower-cost option may exist, however: converting low-end agricultural waste (largely cellulose) into synthetic oil. A number of experimental processes to derive fuel from cellulose waste are now in R&D. A successful low-grade, agricultural-product-to-fuel path would enrich agricultural economies throughout much of the world, and in addition make energy independence more attainable for communities everywhere.

5. Improved medicine and life span. The question is not whether we are going to get some life-span extension, but how much: Will the extension be a moderate increase in life expectancy to 100–120 years, a significant increase to 150–170 years, or a very significant extension to 250–300 years? At the extreme, radical life extension could lead to life spans of 1,000-plus years.

Life extension has both positive and negative social implications. It will alleviate suffering caused by age deterioration and will result in a longer-lived, more productive workforce. On the other hand, it may cause issues with pension plans, Social Security, life insurance, and other retirement programs. It could result in overpopulation, food shortages, pollution, wars for resources, and extinction of species. Another important consideration is that of control: Who would determine how this precious technology would be shared?
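
To make the superconductivity payoff noted in item 1 concrete: transmission loss scales as the square of the current times the line’s resistance, and vanishes when resistance does. A minimal sketch, with invented line figures:

    # Ohmic loss in a transmission line: P = I^2 * R. A superconducting
    # line (R = 0) dissipates nothing. Current and resistance are invented.
    def line_loss_watts(current_amps: float, resistance_ohms: float) -> float:
        return current_amps ** 2 * resistance_ohms

    current = 1_000.0                     # amps on a bulk transmission line
    print(line_loss_watts(current, 5.0))  # copper-like run: 5,000,000 W lost as heat
    print(line_loss_watts(current, 0.0))  # superconducting run: 0 W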

The Effects of Emerging Technologies on Society

As the technology areas covered in this article advance along their individual development curves, their combined effect will remake society as we know it. Ultimately, they will give humankind two new socio-technological ages in the first half of the twenty-first century: the Information Age and the Robotic-Biotech Age. The current Information Age, which should continue for the next 20 to 40 years, is being driven by advances in computers, telecommunications, and electronic instrumentation, plus major advances in materials, space, energy, and manufacturing.

The Robotic-Biotech Age will follow at around 2025, driven by the simultaneous advances of robotics and biotechnology, and reinforced by advances in nanotechnology, materials, and manufacturing technology. The Robotic-Biotech Age will continue for 50-plus years, until another great technology emerges as a new force in the world.

The Information Age came on very quickly and will be relatively short-lived (about 50 years). Society and social structure will not have had the time to fully adjust before the next wave of technological innovation comes along. This speed of change is going to continue for the next 50 to 75 years as the current wave of emerging technologies matures.

In the twentieth century, many people viewed the philosophical movements to which they belonged (communism, fascism, various radical nationalisms, socialism, social democracy, liberalism, etc.) as belief systems that would and should govern how the world runs. In the name of these systems, 500 million people died by war, genocide, war-related famine and disease, politically motivated terror, and régime-based terror.

Members of the emerging generation, operating under the new paradigm, are much more likely to see themselves as cellular automata—trying to optimize themselves in their environment—rather than as governors of the universe. Whether this is good for society as a whole is not yet known, but it will represent a new social viewpoint.

New socio-technological ages tend to produce new social structures and new social mores. Historical precedent suggests that this new age will also produce a new and different societal basis for war and the use of military force, along with a new social perception of the legitimate application of war.

It is possible to project with some confidence the social structure of both the new Information Age and the Robotic-Biotech Age. With each transition to a more advanced stage of civilization, certain things transpire:

  • The social structure acquires an increasingly large number of small, specialized niches.
  • There is a significant increase in the number of players in the political power structure.
  • There is an increasing spread of knowledge out to the masses.
  • The average person’s standard of living goes up.
  • Human control over nature increases.

As society has advanced, class structure has become more complex. In the later Information Age and Robotic-Biotech Age, there will be simply too many classes for a dominant one to emerge. The complexity of the new social structure, coupled with the rise in general knowledge level, will require recognition that specialized knowledge is necessary and that all classes serve useful functions and are needed for society to operate properly.

It is usually hard to change the direction of society, absent great social perturbations, such as war or economic disaster, which can force rapid social change. In an age of peace and prosperity, it takes a long time to modify social norms, regardless of the level of new technological progress that occurs. Technological progress alone is relatively slow at driving social change. However, the near future will see society change markedly as a result of new emerging technology and demography.

This need not be any cause for alarm. With some exceptions, most of the changes described promise to be highly positive. Barring bad luck and bad management, the world will—when all the technologies are deployed—be a better place to live in.

About the Authors

James Irvine is director of the Revolution in Military Affairs Program at the Naval Air Warfare Center, Weapons Division (NAWCWD) in China Lake, California. He has worked as a systems engineer for the U.S. Navy for the last four decades, and has authored numerous studies on future military geopolitics and technology. E-mail james.irvine@navy.mil.

Sandra Schwarzbach is senior strategic analyst for the Naval Air Warfare Center at China Lake, California. She advises the Office of the Secretary of Defense and the Chief of Naval Operations on security issues and contributes regularly to Department of Defense planning initiatives. She has also taught courses on military strategy development and participated in the design of multiple weapons and weapon systems. E-mail Sandra.schwarzbach@navy.mil.

Global MegaCrisis: Four Scenarios, Two Perspectives

By William E. Halal and Michael Marien

Two futurists map out the convergence of multiple global challenges, offering divergent viewpoints—one optimistic and one pessimistic—on the likelihood of successfully meeting these challenges and turning them into global progress.

Killer pandemics, financial meltdowns, runaway global warming, environmental decay, nuclear war, cyberdisasters: These catastrophes are becoming increasingly routine headlines. But as the mainstream press focuses only on individual extreme events, attention is drawn away from an issue far more complex: the convergence of multiple problems into a Global MegaCrisis. This article offers an explanation of this complex issue, as well as four plausible scenarios based on how we and our institutions approach it.

The Global MegaCrisis cuts across all sectors in an era of multiple transformations. The Iraq War demonstrated the limits of U.S. military power, and the 2008 global financial crisis highlighted the limits of deregulated markets. With these foundations of the old global order shaken badly, the growing threat of climate change, looming energy shortages, huge government deficits, terrorism, and a host of wild cards now form a complex interplay of destructive forces that are straining established systems to the breaking point. These multiple threats converge like a multi-vehicle freeway pileup in slow motion. If it had not been bad mortgages and arcane derivatives, other driving forces in these complex systems might have caused roughly the same type of global failure. And more failures seem all too likely.

The Global MegaCrisis: What Is It And What Does It Look Like?

The MegaCrisis, simply defined, is a global environmental and economic collapse or near collapse, along with attendant problems of rising prices, mass protests, widespread psychic stress, and lawlessness. We present the following tentative outline to paint a clearer picture of what the MegaCrisis might look like.

Some Trends Driving the MegaCrisis

Climate Change, No Matter What. The year 2010 marked the hottest year (and decade) on record. The world has already seen a 1°F temperature rise, and an additional 4°F–6°F rise is likely even if all proposed actions are taken; expect possibly 10°F over the next few decades if greenhouse gases keep growing. In addition, the 2007 Intergovernmental Panel on Climate Change (IPCC) report projected a sea-level rise of 16 inches by 2100; the projection is now about three to six feet by 2100.

Complicating this first point is the fact that reducing CO2 is costly. The science indicates that greenhouse gases must be reduced by 60% from 1980 levels to avoid severe climate change. Done soon, this would cost roughly $20 trillion in total, on the order of 1% to 3% of global GDP per year if spread over the coming decades; done later, it would cost far more. The problem is even more daunting because most developing nations are likely to industrialize, and most industrialized nations are likely to grow, increasing all these threats over the long term.
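
The relationship between a $20 trillion total and "1% to 3% of global GDP" is easiest to see annualized. Here is a minimal back-of-the-envelope sketch in Python, assuming a 20-year spending horizon (the article does not specify one) and the roughly $60-trillion-a-year world economy cited later in this issue:

    # Back-of-the-envelope check of the abatement-cost claim.
    # The 20-year horizon is an assumption; the article names none.
    total_cost = 20.0   # claimed total cost of a 60% greenhouse-gas cut, $ trillions
    years = 20          # assumed spending horizon
    world_gdp = 60.0    # approximate world GDP, $ trillions per year

    annual_share = total_cost / years / world_gdp
    print(f"About {annual_share:.1%} of world GDP per year")  # About 1.7%

On that assumption, the bill lands comfortably inside the quoted 1% to 3% range.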

Political Will to Reduce CO2 Is Lacking. There are as yet no global agreements that would decrease carbon emissions significantly. Meanwhile, China, India, and the United States are planning to build a total of 850 coal-fired plants, adding five times as much CO2 to the atmosphere as present treaties intend to reduce.

Methane May Be Worse Than CO2. Keep your eye on methane, a potent greenhouse gas that is 23 times worse than CO2, although it doesn’t stay in the atmosphere as long. Large quantities of methane are being released from thawing tundra in the Arctic region, and still larger quantities may be released from icelike methane clathrates on the ocean floor in coastal areas.

Freshwater Is Becoming More Scarce. Nearly a billion people lack clean water, and 2.6 billion lack good sanitation. Water tables are falling on all continents, and the World Bank estimates that, by 2025, half of the world population could face water scarcity due to climate change, population growth, and increasing demand for water. Unless major changes occur, global water shortages are likely to cause mass migrations, higher food prices, malnutrition, and major conflicts.

Recession Likely to Last for Years. The Great Recession that began in 2008 is often compared to the Great Depression, which began in 1929 and dragged on through the 1930s. The International Monetary Fund forecasts growth for the next two years at slightly above 2% in developed nations, though growth should remain near 8% in the developing world. Some economists think unemployment rates between 8% and 9% are quite likely for several years, much like Japan’s “lost decade” in the 1990s.

Severe Institutional Failures. The near collapse of the world’s financial system in 2008 highlighted structural failures in the financial industry, government, and other institutions. A study of 1,500 CEOs noted: “The world’s leaders think their enterprises are not equipped to cope with complexity in the global environment.” Nobel Prize–winning economist Joseph Stiglitz wrote, “The financial collapse may be to markets what the Berlin Wall was to Communism.”

Cyberwarfare/Cyberterrorism. Computer hacking is growing, commensurate with the boom in global e-commerce. U.S. military networks, nuclear facilities, banks, air-traffic-control systems, and electrical grids are under constant attack. The U.S. Naval War College was shut down by hackers for more than two weeks in 2006. The threat is so great that one expert suggested installing “cyberwar hotlines” similar to the special phones that the United States and Soviet Union used to avoid nuclear Armageddon.

Weapons of Mass Destruction. The old status quo of MAD (mutually assured destruction) may have kept two superpowers locked in a stalemate, but it is no longer viable with nine contending nuclear powers (and more likely to emerge, including terrorist groups). Between 1993 and the end of 2009, the Illicit Trafficking Database recorded 1,784 nuclear trafficking incidents.

Suddenly, many of the concerns we were forewarned of over recent decades are at hand. The future is arriving—and with a vengeance. There is a palpable and widespread fear that the present world is unsustainable and that events could easily spin out of control. Scientists are convinced that a 60% reduction in carbon-dioxide emissions is needed to stave off ruinous climate change, but achieving that goal looks so unrealistic that many are girding to withstand a significant rise in sea levels, scorching heat, withering droughts, and more extreme weather patterns. Policy makers in major world capitals, including Washington, are seriously considering geoengineering the planet as a last-ditch effort to stave off disaster. The MegaCrisis represents what could occur if the human species fails to transform its economies, technologies, politics, and lifestyles into something more sustainable within the next two decades.

Debating the Global MegaCrisis And Its Outcomes

With these political, financial, and ecological crises threatening the world, the two of us engaged in a spirited e-mail discussion, later published in World Future Review (“Letter to the Editor: A Dialogue Between William E. Halal and Michael Marien,” June-July 2009). We then published a survey on TechCast.org to encourage discussion and to learn what others think. The survey summarizes our differing views and asks TechCast experts and visitors to evaluate the severity of the Global MegaCrisis and the probability of four alternative scenarios.

The four scenarios run along a single axis from pessimistic to optimistic. This enables us to focus on alternative outcomes for the entire world or entire societies moving through a period of crisis.

Scenario 1: Decline to Disaster

The world fails to react to the Global MegaCrisis in time. Indecision reigns due to too many choices, too many entrenched interest groups, and too few resources to make necessary changes. Huge government deficits persist, leading to failures of public services and an inability to make crucial transition investments in energy, education, and infrastructure. Governments are unable to reform financial systems, curb global warming, reduce military spending, or conquer deficits. Most corporations remain focused on short-term profit. Technological advances are shelved, delayed, controversial, or fail to help. Climate change accelerates, thanks in part to large amounts of methane compounding the carbon dioxide being released into the atmosphere, resulting in more extreme weather events, massive migrations, and crop losses.

The bottom line: a global economic depression, crippling energy shortages, ecological collapse, local and regional wars, rampant terrorism, crime, corruption, and more.

Scenario 2: Muddling Down

Halfhearted, inadequate actions result in the apparent paradox of a high-tech dark age. Political stalemates, general ignorance about the complexity of the problems, and lack of resources stymie all but the most modest changes in financial systems, governance, energy, and education. The promise of new technologies is only partly met, and pollution and population pressures continue as the world population passes 7 billion in late 2011. The effects of climate change become even more extreme. Meanwhile, recovery from the Great Recession is slow and uneven, and the number of failed states rises. Local wars and terrorist attacks increase.

Despite claims of progress by political and corporate leaders, high unemployment persists and the quality of life declines for most people.

Scenario 3: Muddling Up

Governments and corporations act slowly, but with increasing knowledge. Mounting threats spur generally successful efforts. Far more sophisticated information technology (IT) and artificial intelligence (AI) provide powerful technical capabilities to help counter the challenges. The sense of urgency builds as problems increase, so public attitudes shift enough to favor needed changes, and reasonably good leadership is able to provide guidance. There are relatively minor disasters along the way but little that is catastrophic for an entire region or the planet. A rudimentary but functioning global order emerges to manage this advanced society in time to avert widespread disaster. Many new problems arise nonetheless, but most are adequately addressed.

Scenario 4: Rise to Maturity

The transition to a new global order is made quickly and easily. Governments and corporations act wisely and with determination, and are supported by the majority of people. The world surpasses the United Nations Millennium Development Goals of halving poverty by 2015, and many countries approach ecological sustainability (at least as it is currently defined). A conversion to clean, renewable energy happens quickly and provides a solid boost to many national and regional economies.

Early Survey Results

As of January 2011, our exploratory survey has been completed by 60 respondents, and more replies are coming in. It’s not a random sample; these are smart and thoughtful people. Here is the breakdown of responses to the initial question, “How severe is the potential threat posed by the Global MegaCrisis?”

Table 1. Severity of the Potential Threat

Severity | Description | Respondents
Catastrophic (Decline to Disaster) | Could be the end of civilization for many, if not all | 22%
Severe (Muddling Down) | Major declines in central aspects of life | 60%
Bad (Muddling Up) | Serious challenges likely to be met in time | 13%
Overblown (Rise to Maturity) | Problems greatly exaggerated; technology and the market can handle them | 4%
Don’t know | Too murky; can’t even make a guess | 2%

We also asked respondents to estimate the probability of each of the four scenarios along the pessimism–optimism axis. This question frames the issue differently but produces roughly the same general results: a 60% probability for the two most pessimistic scenarios, compared with a 40% probability for the two most optimistic. (A quick arithmetic check follows Table 2.)

Table 2. Probability of Four Scenarios
Scenario | Description | Probability
Decline to Disaster | World fails to react, resulting in accelerated climate change, widespread energy and water shortages, economic depression, conflict, etc. | 25%
Muddling Down | World reacts, but only partially; problems continue to outdistance policies and technologies. Ecological damage continues, as do increasing poverty, inequality, and conflict. | 35%
Muddling Up | World reacts out of need. Policies and technologies help make headway on problems. Widespread disaster is avoided, but many problems remain. | 28%
Rise to Maturity | World transitions to a humane and responsible global order. | 12%
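
The split reported above reads directly off the table; as a quick sanity check in Python (a sketch only, with the table’s numbers hard-coded):

    # Grouping the Table 2 probabilities into the pessimist/optimist split.
    probs = {"Decline to Disaster": 25, "Muddling Down": 35,
             "Muddling Up": 28, "Rise to Maturity": 12}

    pessimistic = probs["Decline to Disaster"] + probs["Muddling Down"]
    optimistic = probs["Muddling Up"] + probs["Rise to Maturity"]
    print(pessimistic, optimistic)  # 60 40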

The rough timetable for these four scenarios is estimated as follows. Note that the Muddling Down scenario is thought to occur earlier than the others; indeed, some think it has already begun. Here are the dates that respondents suggested:

Table 3. Mean Arrival Dates
Scenario | Year
Decline to Disaster | 2029
Muddling Down | 2023
Muddling Up | 2027
Rise to Maturity | 2033

Many respondents identified the key problems as chronic failures in governance, leadership, and cultural attitudes. They also believe that, despite such failures, humanity has a proven capacity to survive, usually by muddling up.

Halal’s Analysis: The World Is Entering an Advanced Stage of Evolution

Despite the enormity of the challenges, there is reason for hope. Advanced IT, along with the rise of green technologies and other new industries, will help spur an economic upcycle starting about 2015, and it is likely that the Global MegaCrisis will be largely resolved by 2020. That is why I rate the four scenarios as follows: Decline to Disaster, 10%; Muddling Down, 25%; Muddling Up, 60%; Rise to Maturity, 5%.

The forces involved are so historic and powerful that a long-term evolutionary perspective is necessary to understand what is taking place. Our work at the TechCast Project shows that the Global MegaCrisis is the inevitable result of high-tech globalization that is causing what we call a “global crisis of maturity.” This is a critical growth phase in the life cycle of the planet, marked by unprecedented transition points in climate change, energy consumption, economic systems, and all other facets of an emerging global order. We also believe that the relentless advance of information technology is driving a transition to an advanced stage of civilization powered by new technologies, interrelated global systems, adaptive social institutions, mounting knowledge and intelligence, and global consciousness.

By combining our 70 forecasts of technology breakthroughs, we are able to produce “macroforecasts” that suggest that the Muddling Up scenario could occur in about 10 years, give or take three years. Worldwide e-commerce is likely to take off in about five years to form a rudimentary version of the “global brain” that futurists have long anticipated. Around 2020 or so, we are likely to see second-generation computing (optical, nano, bio, and quantum) and artificial intelligence that can automate routine knowledge.

These developments will enable people to concentrate on values, beliefs, ideologies, and other higher levels of thought and to focus most of their attention on solving crucial global challenges. This constitutes the next logical phase in the progression of society from agriculture to manufacturing, services, knowledge, and even consciousness itself.

The central role of IT/AI is a game changer because it shifts the relationship between humans and machines in profound ways. Contrary to the assertion that AI will surpass human abilities, AI liberates us from mental drudgery and releases the unique human capability for higher consciousness at the very time that the world faces unprecedented challenges. This is hardly a coincidence, but rather the playing out of historic forces in the evolutionary cycle. Sure, there will be plenty of information overload and confusion as the world struggles to take responsibility for its future, knowing it will suffer enormous consequences if it fails. However, pollster John Zogby’s research shows a “fundamental reorientation of the American character: away from wanton consumption and toward a new global citizenry in an age of limited resources.”

Events are likely to culminate around 2020, when we expect IT/AI to mature and the threats to reach intolerable levels as the global GDP almost doubles. Yes, the situation looks bleak, but it’s always darkest just before the dawn. The rise of consciousness can be seen even now in the way the economic crisis has provoked a widespread awareness of the need to transform business and government institutions, stabilize the world’s financial system, promote renewable energy, and halt climate change.

It is not possible to know much more about this coming “Age of Global Awareness,” just as we never could have guessed that the Information Age would entail us being virtually inseparable from our PCs, laptops, and smart phones for practically every waking hour. I suspect we will use what I call “Technologies of Consciousness” to see us through the crisis of maturity.

Technologies of Consciousness (ToC) are methods that shape awareness, emotions, values, beliefs, ideologies, choices, and states of mind. The ToCs in this survey range from so-called “hard” ToCs, such as artificial intelligence, biofeedback, virtual reality, and even cybernetic brain enhancements, to “soft” or “social” ToCs, such as collaborative enterprise, conflict resolution, and even meditation and prayer.

The key tool in the ToC arsenal is the little-used power of collaborative problem solving. In a knowledge society, collaboration creates new solutions that can benefit all parties, but this is not yet well recognized. Maybe this collaborative article can serve as a small example.

When we (Michael Marien and I) started working together on this project, I thought many times that we could not go on because our views were so strikingly at odds. We were dealing with a tough issue, of course, but the problem was exacerbated because both of us have thought about futures for many decades, albeit from different perspectives. One of us is guardedly optimistic, while the other is decidedly pessimistic (though hoping to be proven wrong). By examining our differences in the light of compromise, we made important breakthroughs. Collaboration is a powerful approach to problem solving, and possibly the single best way to resolve the Global MegaCrisis. Technologies of Consciousness such as those mentioned above could greatly encourage collaboration.

Marien’s Analysis: Infoglut, Ignorance, Indecision, and Inadequacy

The two of us agree that both a Global MegaCrisis and an IT/AI explosion are under way, and that there are other technology revolutions ahead, as nicely summarized by the TechCast Project. The question is: Will the IT/AI explosion make things better? It is indeed “a game changer,” and it will change many games—for good and ill. It could bring convergence of thinking about important global issues and move attention to “higher levels of consciousness.” It is also just as likely to cause further information glut, fragmentation, degraded consciousness, indecision, and, ultimately, half-baked inadequate action. Based on the first decade or so of the Internet and vastly expanded information abundance of all sorts, I see no reason for unfettered optimism, which is simply wishful thinking in the end.

In my essay “Futures Thinking and Macro-Systems: Our Era of Mal-Adaptive, Non-Adaptive, and Semi-Adaptive Systems” (World Future Review, April-May 2009), I argue that our increasingly complex social systems are adapting in the wrong direction, not adapting at all, or only partly adapting, which could well result in the paradox of “improvement and growing inadequacy.” As a consequence, I rate the four scenarios as follows: Decline to Disaster, 20%; Muddling Down, 60%; Muddling Up, 20%; and Rise to Maturity, 0%.

Certainly there is more consciousness about global issues nowadays, and some actions are being taken to improve global governance. There is growing awareness of climate change. The “greening” of communities, businesses, and governments is under way in many places, and there is a veritable gold rush to develop a wide variety of clean energy technologies (for example, ExxonMobil’s recent claimed investment of $600 million to produce liquid fuels from algae). And yet the latest assessments of climate experts are increasingly dire—thus, “improvement and growing inadequacy” seems likely.

The biggest blind spot in the IT/AI vision has to do with governance. In the “Rise to Maturity” scenario, governments and corporations do the right thing—and are supported by the public. This happens even in the more likely “Muddling Up” scenario. It may be desirable, but it is not likely in our chaotic new information environment of tweets, twitters, trivia, sound bites, floods of emails, superficiality, commercialism, and ever more fragmentation. Huge deficits, run up by many governments, are leading to draconian cuts in essential services and inattention to decaying or inadequate infrastructure, while fueling overreactionary fears that we are headed toward fiscal ruin, “evil” socialism, and/or unwelcome centralized global government.

Also, despite the hyperabundance of information, there is no evidence that people are better informed about current affairs today than they were in the past. Newspapers and magazines are closing down or shrinking their coverage of national and global issues. In the United States, financially stressed schools and colleges are still deficient in civic education, let alone serious futures education, and socioeconomic inequalities continue to grow. We may still see some shift to enlightened views, but, more likely than not, too little too late. And it may well be offset or rolled back by simplistic reactionary movements.

Granted, Facebook and Twitter have sparked a spectacular and welcome string of regime changes in the Middle East. However, once the post-dictator euphoria passes, the harsh realities of rising prices and a bulging youth population in need of employment may lead to further discontent.

This is not “doom and gloom,” but mainstream social-science thinking, based on my synthesis of hundreds of recent books on environmental issues, governance, IT impacts, and education. Perhaps we can return to an undisputed path of evolutionary progress, but it will require a major restructuring of industrial-era knowledge and education/learning, especially adult/voter learning, and serious consideration of ethics and the quality of public discourse. What Halal refers to as “Technologies of Consciousness” are not a solution in and of themselves.

Your Turn

You have now encountered four scenarios and two differing arguments about where the world is heading. Now it’s your turn to think and respond, and to encourage others to do the same. We invite readers to take the MegaCrisis Survey at www.TechCast.org.

About the Authors

William E. Halal is professor emeritus at George Washington University and president of TechCast LLC (www.TechCast.org). Portions of this article are adapted from his forthcoming book, Through the MegaCrisis: The Technology Revolution to a World of Knowledge, Intelligence, and Global Consciousness.

Michael Marien is the founder and former editor of Future Survey, published by WFS for 30 years, and is now the director of GlobalForesightBooks.org. Despite their differences, Halal and Marien share the common bond of having studied for advanced degrees at the University of California, Berkeley.

The authors gratefully acknowledge contributions to this analysis by Jerome C. Glenn, director of the Millennium Project, and Mike MacCracken, chief scientist at the Climate Institute. Readers are invited to take the MegaCrisis Survey at www.TechCast.org. E-mail comments to halal@gwu.edu and mmarien@twcny.rr.com.

Recently Published Books: Other Perspectives on the Global MegaCrisis

To provide a broader sense of the MegaCrisis, we offer a summary of the problem as seen by a variety of prominent futurists and other writers.

It is important to realize that there is no shared language on the general global condition. Nor is there any shared approach. Some writers use a balanced perspective that looks at both pessimistic and optimistic indicators, but most decidedly take one side or the other. Here is a sampling of both general overviews and one-sided views.

Perhaps the best starting point is the “State of the Future Index” in the Millennium Project’s annual State of the Future report, assembled by Jerome C. Glenn, Theodore J. Gordon, and Elizabeth Florescu (The Millennium Project, 2010). The Index reviews 30 trends to provide a “report card for humanity,” divided into four categories: where we are winning (improved literacy rate, more Internet users, improved life expectancy, etc.), where we are losing (fossil fuel emissions, unemployment, terrorist attack casualties, etc.), where there is little change (HIV prevalence, for example), and where there is uncertainty (infectious diseases, for example). How the trends are weighted is problematic, however, and there is doubt as to whether the 30 indicators cover all essential developments.

A recent report prepared by the Rockefeller Foundation, along with Peter Schwartz and the Global Business Network, parallels somewhat the four single-axis scenarios presented in our article. Scenarios for the Future of Technology and International Development (2010) provides four scenarios for the next decade or so in a 2x2 matrix along two axes: strong versus weak political/economic alignment, and low versus high adaptive capacity. The scenarios are “Hack Attack” (an unstable and shock-prone world, with weak governments, thriving criminality, and dangerous technologies), “Lock Step” (tighter top-down government control after a 2012 pandemic, with limited innovation and growing citizen pushback), “Smart Scramble” (an economically depressed world, with local makeshift solutions and “good enough” technology addressing a growing set of problems), and “Clever Together” (a world of highly coordinated and successful strategies addressing global issues). A free PDF is available at www.RockFound.org; Global Foresight Books selected this as its Book of the Month for November 2010.

Essential reading, as always, is provided by Lester R. Brown, founder of the Earth Policy Institute, in World on the Edge: How to Prevent Environmental and Economic Collapse (W.W. Norton, 2011). He warns that “ecological and economic deficits are now shaping not only our future, but our present. … [T]he ‘perfect storm’ or the ‘ultimate recession’ could come at any time.”

In The Great Disruption: How the Climate Crisis Will Change Everything (for the Better) (Bloomsbury USA, 2011), Paul Gilding, a faculty member of the Cambridge University Program for Sustainability Leadership, sees loss, suffering, and conflict in the coming decades, as our “planetary overdraft is paid,” but believes that compassion, innovation, resilience, and adaptability will win out.

John L. Petersen, founder of The Arlington Institute, focuses on a wide range of converging global trends, breakdowns, and breakthroughs in A Vision for 2012: Planning for Extraordinary Change (Fulcrum, 2008), concluding with an exploration of various possibilities after a massive catastrophe, ranging from a failed global system to a new world of global cooperation and harmony with nature. His brief version, “A New End, A New Beginning,” appears in the World Future Society’s 2009 conference volume, Innovation and Creativity in a Complex World.

Another and still broader view of world-scale systems crises and civic collapse by the 2020s, to be followed by “our maturity as a species,” is provided by Duane Elgin in The Living Universe (Berrett-Koehler, 2009).

Acceleration: The Forces Driving Human Progress by Ronald G. Havelock (Prometheus Books, 2011) makes a strong and thoughtful case for long-term progress of humanity, and a somewhat successful attempt to address various “fears for the future.” However, the 15-page annotated bibliography is a bit spotty, with favorable comments on Julian Simon and John Naisbitt, negative reviews of Paul Ehrlich and the 1972 Limits to Growth report, and no consideration of Lester R. Brown and current thinking of the vast majority of climate scientists.

An upbeat view looking beyond the Great Recession is provided by urbanist Richard Florida in The Great Reset: How New Ways of Living and Working Drive Post-Crash Prosperity (Harper, 2010). This is countered by the grim view of Dystopia: What Is to Be Done? by Canadian sociologist Gary Potter (CreateSpace, 2010), who sees capitalist-driven disaster already afflicting at least one billion people and coming soon for the rest of us. Collapse: How Societies Choose to Fail or Succeed by UCLA geography professor Jared Diamond (Penguin, 2005) was a best-seller for more than six months and is still relevant. Our Final Century: The 50/50 Threat to Humanity’s Survival by UK Astronomer Royal and Cambridge professor Martin Rees (Basic Books, 2003) covers a broad range of science and technology risks and is also still very relevant.

Severe climate change scenarios in particular deserve our attention. Climatic Cataclysm: The Foreign Policy and National Security Implications of Climate Change, edited by Kurt M. Campbell of the Center for a New American Security (Brookings Institution Press, 2008), offers three plausible scenarios: Expected Climate Change by 2040, Severe Climate Change by 2040, and Catastrophic Climate Change in the 2040-2100 period, as average global temperatures rise to 5.6°C above 1990 levels.

In a more popular style, former U.S. Assistant Secretary of Energy Joseph J. Romm provides three scenarios in Hell and High Water (Morrow, 2007) on developments in three periods: 2000-2025, 2025-2050, and 2050-2100 (when a sea level rise of 20–80 feet will be “all but unstoppable” if current trends continue). A longer-term view of our world in 2050, 2100, and 2300 is enabled by University of Washington geologist Peter D. Ward in The Flooded Earth: Our Future in a World without Ice Caps (Basic Books, 2010), who argues that sea-level rise will happen no matter what we do.

Our own previous contributions to thinking about the MegaCrisis include Democracy in the 21st Century by Michael Marien (Future Survey Mini-Guide #3, 2008), on problems of democracy and today’s ill-informed citizens, and Technology’s Promise by William E. Halal (Palgrave Macmillan, 2008), which covers TechCast forecasts of the technology revolution.

—William E. Halal and Michael Marien

Note: Longer reviews of many of these books are available online at GlobalForesightBooks.org.

Defining and Anticipating the Global MegaCrisis

How to Define the Global MegaCrisis

At the personal level, it is a MegaCrisis to lose one’s home, job, and/or spouse. At the community or national level, a city or country (like Haiti) reeling from high unemployment and/or a natural disaster is in a MegaCrisis. In a broader sense, a MegaCrisis is more than a “catastrophe,” and it can bring about a natural turning point in social evolution. It is thus not only a threat but may also be an opportunity.

The Global MegaCrisis is a constellation of major issues such as climate change, ecological collapse, economic depression, nuclear threats, and/or high-impact wild cards that threaten civilization. Worth noting is that, in the most hopeful scenario, the Global MegaCrisis could initiate the creation of an advanced stage of development based on knowledge, high technology, and global community.

How to Understand the Global MegaCrisis

Trends such as those listed in this article suggest that we are moving toward a MegaCrisis, and there are many other indicators to consider as well. If Iran demonstrates a nuclear bomb, for example, this would heighten the chances of war, which could destabilize the Middle East and deepen a global MegaCrisis. Many would argue that failed or failing states, such as Somalia and Haiti, are already in a condition of MegaCrisis. There will always be contending perspectives when it comes to anticipating crises and gauging their severity. However, avoiding the issue, forestalling painful but necessary changes, or simply thinking, “it can’t happen here” will increase the probability of catastrophe.

What Might Happen When the Global MegaCrisis Arrives?

Could it be the beginning of “The End” (complete extinction or major decline in civilization)? Or could such a breakdown ultimately lead to a breakthrough—a shift in global consciousness, for example—as Ervin Laszlo and others have postulated? Such a transition could be rapid or slow. It could be a clear upturn or downturn, or mixed paths, as in our “Muddling Down” and “Muddling Up” scenarios. The uncertainty is huge. What is certain is that sharply differing visions of what is likely to happen will be hotly contested, as illustrated in this article.

—William E. Halal and Michael Marien

Solar Power from the Moon

By Patrick Tucker

A Japanese company is pitching an alternative energy plan that’s out of this world—and potentially the largest public infrastructure project in human history.

The year is 2050 and it’s morning on the Moon. The Sun is rising over a landscape that is bleak and featureless with one exception: a wide belt of photovoltaic panels that cuts across the ash-gray lunar surface like a straight river. Not a single astronaut is in sight, but a troop of robots is busily making repairs to the installation where tune-ups are needed. Beneath the panels, superconducting cables are ferrying the Sun’s power to transmission centers. The power will be beamed to a receiving station near the Earth’s equator, and from there, it will be distributed to energy-hungry cities and towns across the globe where it will keep the lights on in offices, hospitals, and schools.

Meet the LUNA RING, the brainchild of Tetsuji Yoshida and his colleagues at CSP, the research arm of Shimizu, one of the largest construction firms in Japan. The LUNA RING is an idea that could only come from the land of the rising sun, a country boasting many of the world’s best-known technology companies, like Sony, Hitachi, and Panasonic, but also saddled with a shortage of natural resources.

The LUNA RING speaks to a future global need that’s keenly felt in the present in Japan, a nation now also coping with the impacts of the devastating March 2011 earthquake on its nuclear power capacity.

It’s also an exercise in long-term planning.

“My very optimistic forecast is 25 years,” Yoshida told me when I visited the company headquarters in Tokyo last November. He explained that this is the time required before they could even begin the lunar-surface activity, assuming that Japan, the United States, or some other investor was actually willing to fund the project. “The scale is so huge; I don’t know how long it would take to construct. We may have to adjust the plan and the scale,” he says.

If the most exciting part of Yoshida’s job is coming up with bold engineering concepts, the most difficult part, except for the math, is keeping people’s expectations realistic. Shimizu’s company president, Yoichi Miyamoto, was hoping to pitch the project to potential investors with a start date on the Moon of around 2035. Yoshida sees this as ambitious, to say the least. The technical, practical, and monetary obstacles to building a solar laser power station on the Moon are unprecedented.

But the LUNA RING is buildable. Photovoltaic panels, remotely guided robots, and microwave transmission and lasers are already proven technologies. The project is simply raising the proverbial bar on the current state of innovation—raising the bar to the Moon.

“It’s very challenging, a good project for a company like Shimizu. So this is a type of campaign for us,” says Yoshida.

Why We Need the Moon for Solar Power on Earth

By David R. Criswell

Lunar-based solar-power production should have been developed decades ago, argues one space expert.

Our Sun is the primary power source driving life on Earth. It has enabled us to use massive flows of oil, coal, and natural gas burned with oxygen to provide approximately 85% of the 15 trillion watts of commercial thermal power that energizes the $60-trillion-a-year world economy.

Every year, more of this thermal power is converted into electricity, and by mid-century most power will be delivered as electricity. Since 1980, Japan and western Europe have achieved $42 trillion per year of gross national product for every 1 trillion watts of electric power consumed. Two kilowatts of clean electric power per person can sustain economic prosperity, so 10 billion people will need 20 trillion watts of power.
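
The demand figure follows directly from the per-capita target; a minimal sketch of the arithmetic:

    # Global demand implied by 2 kW per person for 10 billion people.
    power_per_person_w = 2_000        # 2 kilowatts, in watts
    population = 10_000_000_000       # 10 billion people

    total_w = power_per_person_w * population
    print(f"{total_w:.0e} W")         # 2e+13 W, i.e., 20 trillion watts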

Our Sun is the only reasonable source for sustainable global-scale commercial power, but we cannot gather it dependably and inexpensively on Earth. Our biosphere interrupts the flow of solar power with varying day–night cycles, clouds, fog, rain, smoke, dust, and volcanic ash. These forces act with floods, wind, sandstorms, industrial chemicals, biofilms, animals, earthquakes, and the like to attack the necessary large-area solar installations. Extremely expensive, planetary-scale power storage of indeterminate capacity, along with global-scale power distribution systems, would be required to deliver electricity somewhat reliably to consumers all around the world. Japan’s nuclear power plants deliver approximately 50 GWe of commercial power. An Earth-based station receiving solar energy from the Moon (a rectenna) could easily be built to produce that amount of power for commercial use. Moreover, such rectennas would never release radioactivity or CO2 and could be quickly replaced at low cost after a disaster.

For these reasons and others, solar power from the Moon is our best shot at meeting future energy demands. If the United States had stayed on the Moon during the 1970s, focusing on using the common lunar materials to manufacture at low cost the simple standard components of a lunar solar power system, then today, not only the United States but also the rest of the world would be green, prosperous, and secure. Such a system would pay for itself with 15 years of use.

Our primary challenge is mental. We must refocus our actions from battling each other and Earth for the declining resources within our limited biosphere and instead tap the Moon for solar power that is engineered to meet our needs.

About the Author

David R. Criswell is the director of the Institute for Space Systems Operations at the University of Houston. E-mail drcriswell@comcast.net.

A Feat of Futurism

To the jaded technology watcher, the LUNA RING may read not so much bold as old-fashioned. In its size and scope, and in the faith it expresses in large-scale, long-term, government-funded initiatives, the project harks back to the 1970s, a decade synonymous with many things, not least of which was U.S. space-program euphoria. It was during the 1970s that the U.S. Department of Energy and NASA first conducted a series of studies on the feasibility of sending energy to Earth from satellites.

These studies, called the Satellite Power System Concept Development and Evaluation Program, were nothing less than an exercise in super-futurism, with a group of scientists from around the world writing back and forth in reports, letters, and journal articles, trying to design something in the distant future using tools and technologies that did not exist in the present.

The proceedings of the program note more than a few major obstacles to collecting and transmitting power in space. “The space infrastructure requirements were projected to be significant,” John C. Mankins, the manager of Advanced Concepts Studies in NASA’s Office of Space Flight, told Congress in 1979, in what might be considered something of an understatement.

The program explored a variety of concepts, design plans, and scenarios. One proposal emerged as a leader: a network of dozens of satellites working together to catch solar energy and beam it to Earth, rather than a single satellite. But even with a network, the objects and their solar arrays would need to be enormous to do the job: large enough to collect and transmit 5 gigawatts of power each, according to Mankins’s testimony. (They would be transmitting power for use in the United States exclusively.) Sending objects into orbit becomes more costly and complicated as the size of the satellite increases. These wouldn’t be simple Sputniks, either, but rather floating power stations a kilometer or so in diameter—far larger and more complex than any communications satellites in space today.

The ongoing maintenance costs of the network would thus be enormous. Mankins testified that the cost to build the system would be more than $250 billion in present-day dollars. The program concluded in 1979, leaving many questions unanswered. Then, between 1980 and 1981, the U.S. energy crisis ended, and interest in space-based solar power hit a wall.

Fifteen years later, NASA initiated a three-year Fresh Look Study. A brief Exploratory Research and Technology Program followed. The agency found that many of the technical obstacles it first faced decades ago no longer seemed so insurmountable. Photovoltaic arrays in the 1970s could convert into power roughly 10% of the solar energy that struck them. By 1995, they were far more efficient and much lighter. New ideas were on the table, such as satellites that used inflatable trusses rather than metal to decrease object weight.

Mankins himself ditched the dispersed satellite network scheme and came up with a new idea for designing, building, and launching satellites. In his 1995 plan, many thousands of smaller, identical solar-gathering modules come together to form a much larger whole, the same way that thousands of similar ants come together to form colonies and millions of quite similar Web sites and Web servers form the Internet—a “super-organism,” Mankins calls it. The logistics of building and launching a type A mini-satellite 9,000 times (then type B, then type C) are less daunting than figuring out how to launch a few extremely complex, independently functioning machines. Mankins calls this realization his eureka moment. “It led me for the first time to believe that space-based solar power was technically possible,” he says.

Despite this encouraging progress, the question remained: How do you conduct tens of thousands of satellite launches, keep the devices working together collecting and transmitting energy safely, and keep the maintenance costs under control?

According to Yoshida, this is the wrong series of questions.

The Moon-Based Power Station

A solar collection satellite launched from Earth, even using the most advanced materials available in 2011, would weigh close to 10,000 tons, says Yoshida. This number, he later explained in an e-mail, is his estimate of the weight of a 1-million-kilowatt power plant in geosynchronous orbit.

“So heavy and hard to control, you will need so many rocket launch pads. Too much money.… So we chose the Moon as a power station,” he says. “We already have a natural satellite, one with minerals and resources. And it already receives sunlight across its surface area.”

The Moon’s face receives 13,000 trillion watts (13,000 terawatts) of solar power continuously. This is 650 times the amount of power the entire human population would need to continue to grow economically, according to space power expert David Criswell. Solar collection on the lunar surface would also be about 10 times more efficient than it is on Earth, where the ozone layer and the rest of the atmosphere filter and scatter incoming sunlight.
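
Both figures can be sanity-checked from first principles; here is a sketch using the solar constant, the mean lunar radius, and the 20-terawatt demand figure from Criswell’s sidebar above:

    # Sunlight intercepted by the Moon's disk, and the ratio cited above.
    import math

    solar_constant = 1361       # watts per square meter near Earth's orbit
    moon_radius_m = 1.737e6     # mean lunar radius, in meters

    intercepted_tw = math.pi * moon_radius_m**2 * solar_constant / 1e12
    print(f"Intercepted: ~{intercepted_tw:,.0f} TW")              # ~12,900 TW
    print(f"Ratio to 20-TW demand: ~{intercepted_tw / 20:.0f}x")  # ~645x, close to the 650x cited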

Here’s how the LUNA RING would work.

Robotic staff. The lunar base would require some human personnel, but the bulk of the work on the Moon would be performed by remotely controlled robots. Japan has been conducting experiments with giant robotic arms in space since the 1997 launch of the ETS (Engineering Test Satellite) No. 7. “I don’t think [it will be] a big problem to control the robots on the Moon,” Yoshida says.

Panels. Sending enough photovoltaic arrays to encircle the lunar equator would require a lot of costly launches and burn up a lot of rocket fuel. The LUNA RING plan calls for the robotic construction of those panels on the Moon directly from lunar soil. This increases the overall efficiency and energy savings of the program compared with others. It also bumps up the complexity level of the proposal considerably.

Photovoltaic panels are constructed from silicon, which makes up 23% of the lunar surface. The Moon also hosts aluminum and aluminum oxide, which factor into many solar cell designs. “Theoretically, we have enough materials on the lunar surface” to build solar panels, Yoshida says. But finding significant deposits of these minerals is a lot harder on the Moon than on Earth, where the formation and movement of oceans, rivers, lakes, and streams created accessible mineral stores. “There’s no concentration of these minerals,” says Yoshida, “so all these resources are spread over the lunar surface.”

Shimizu scientists are working on ways to derive sufficient quantities of the minerals they need using hydrogen reduction. But building solar panels from moon dirt (and doing so via remote-controlled robot) remains the most ambitious aspect of the plan.

Once constructed, those panels would produce a lot of power. A 4 × 400 km portion of the lunar solar belt would produce power equal to the energy consumption of Japan, says Yoshida. A 30 × 400 km portion would equal the energy consumption of India. Sixty by 400 km would power the United States, and a 400 × 400 km square would collect enough energy to satisfy the power needs of the entire human population, by Yoshida’s calculations.
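
Combined with a global demand figure, the 400 × 400 km square implies a delivered power density. A rough sketch, assuming (hypothetically) that the square must meet the roughly 20 terawatts of demand cited in Criswell’s sidebar:

    # Delivered watts per square meter implied by Yoshida's figures,
    # assuming (hypothetically) 20 TW of global demand for the
    # 400 km x 400 km "whole world" square.
    belt_area_m2 = 400e3 * 400e3    # 400 km x 400 km, in square meters
    demand_w = 20e12                # assumed global demand

    print(f"{demand_w / belt_area_m2:.0f} W per square meter")  # 125

Against a solar constant of roughly 1,361 watts per square meter, that would be an end-to-end efficiency on the order of 9%: ambitious, but not physically absurd.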

Laser transmission. Like those solar-based power plans from the 1970s, the LUNA RING would beam energy to Earth in one of two ways, using either a microwave or a laser.

Microwave transmission experiments have been ongoing since the 1960s and space laser studies since the 1980s. In that time, science agencies have demonstrated power transmission in space, between orbiting objects and the Earth and between planes and the ground. These, however, were low-level power exchanges. The most famous of these took place in Goldstone, California, on June 5, 1975; the NASA Jet Propulsion Laboratory successfully transmitted 34 kilowatts of power over a distance of 1.5 kilometers. A space-based power station would have to transfer a lot more power a lot farther. More tests will be conducted around the world between now and 2015, including in the Tokai region of Japan where researchers are working with a 2 kilowatt infrared laser. This isn’t a lot of power, either—not enough to run a car, but sufficient to boil water in a matter of seconds.
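
The boiling claim is easy to test on paper. A minimal sketch, assuming (hypothetically) a 50-milliliter sample at room temperature and perfect absorption of the beam:

    # Time for a 2 kW beam to bring a small water sample to a boil,
    # assuming (hypothetically) 50 mL at 20 C and perfect absorption.
    power_w = 2_000      # beam power, watts
    mass_g = 50.0        # assumed sample size, grams
    delta_t_c = 80.0     # 20 C up to 100 C
    c_water = 4.186      # specific heat of water, J per gram per degree C

    seconds = mass_g * c_water * delta_t_c / power_w
    print(f"~{seconds:.0f} s to reach boiling")  # ~8 seconds

A full cup would take closer to a minute, so “a matter of seconds” holds only for small quantities.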

The ultimate test of space power-beaming could occur on the Moon itself. If NASA sets up a lunar base at either of the Moon’s poles—one of many projects under perennial consideration at the agency—a satellite flying around the Moon could conceivably power that base via microwave or laser transmission, thus proving the feasibility of using the Moon as a power station.

“Because there is no population on the Moon, it’s a good test spot for laser tests,” Yoshida says. “On Earth, it’s too dangerous. We have to spread out the energy concentration.”

The LUNA RING station would beam 220 trillion watts (220 terawatts) toward Earth continuously. Of that, only about 8.8 terawatts would be usable on the ground; the rest would be lost in transit.
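
The implied delivery fraction is stark; a one-line check:

    # End-to-end delivery fraction implied by the LUNA RING figures.
    beamed_tw, usable_tw = 220, 8.8
    print(f"{usable_tw / beamed_tw:.0%} of beamed power usable on the ground")  # 4%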

Over the Moon

Reaction to the LUNA RING among space experts whom THE FUTURIST contacted was optimism tempered by skepticism.

“It’s good that a major corporation is considering the Moon as a platform for gathering solar power and providing it to Earth,” said David Criswell, the director of the Institute for Space Systems Operations at the University of Houston, in an e-mail. “I’ve argued for years that the Moon [is] the only means to provide adequate commercial power to Earth to enable sustainable prosperity.”

Criswell is a long-time advocate for using the Moon as a power station. Although he’s a cheerleader, he acknowledges that much more research needs to be done before a Moon-based power plan can attract serious consideration. Much of that research would have to take place aboard the International Space Station, which, according to Criswell, presents something of a problem. “The fully staffed International Space Station will be hard pressed to do its few authorized experiments in low-Earth orbit and keep the station operating. It doesn’t have the capability to support the logistics for a major lunar infrastructure project or the staff to monitor lunar surface operations,” he said. “However, the station does provide the operational experience for building other specialized facilities in orbit about the Earth and Moon and on the Moon for power production.”

Power from the Moon would have to travel 10 times farther to reach Earth than would the same juice collected from a satellite. Mankins believes that a giant wireless transmitter floating in space would need to play a part in sending microwave or laser power from point A to point B. Robots building solar arrays out of lunar dirt? Maybe one day, Mankins says, but he insists that, when space-based solar power comes online, it will have to use hardware built on Earth, at least initially.

“I believe that the first [space-based power] pilot plant could (with funding) be on orbit within 10–15 years; waiting for a lunar base to be established first would delay the availability of space solar power by decades,” he wrote in an e-mail. “From time to time, Shimizu develops a very visionary future large-scale engineering concept that they then articulate to a broad audience. Their LUNA RING concept is only the latest of these.”

John Hickman, a member of the board of advisors of the MarsDrive project and author of Reopening the Space Frontier (Common Ground Publishing, 2010), is known as a space-policy realist. He’s argued that the problem with most super-large space projects is that they require too much from potential investors: too much up-front capital, too much patience, and too much faith.

“If attracting capital for projects using proven technologies like communications satellites remains difficult, imagine the difficulty of attracting sufficient capital to construct a mining facility on the Moon or terraforming Mars or Venus,” he wrote in his 1999 essay, “The Political Economy of Very Large Space Projects,” a critical analysis of why mega-scale space schemes almost never get off the ground.

Hickman says that the LUNA RING boasts a few advantages over other similar projects. It could provide returns within a reasonable time frame, but would probably make for a better investment if ownership of lunar real estate were part of the deal. He suggests that Shimizu obtain legal title to the land on which it plans to build. “Unfortunately, the 1967 Outer Space Treaty made the Moon an international commons. That means that Shimizu would be constructing the LUNA RING on land ‘owned’ by all of the states on Earth,” he wrote in an email. But Japan could withdraw from the treaty and “claim the lunar equator as its sovereign national territory.”

Hickman is curious about what funding streams the company may draw upon but thinks the LUNA RING would probably need a large public investment to be economically viable.

The project is well suited to Japan, he says, in that it makes use of the country’s expertise in public works construction and robotics. But that doesn’t mean Japan is a good funding source. Japan carries more debt than almost any other highly industrialized country: almost 200% of the country’s GDP. Financially, Japan is in a terrible position to sponsor a project of this size.

“For Japanese decision makers to commit the capital necessary to launch construction of the LUNA RING would be a demonstration of unusual political will,” says Hickman.

The United States is another potential investor, if not for the LUNA RING, then for some competing space-based solar power program, perhaps of the sort that Mankins has suggested. The Obama administration has made repeated statements in favor of alternative-energy research initiatives and big public works. But the administration is also facing record deficits, a Congress fighting to repeal its signature health-care program, a retirement wave of historic proportions, and reelection in two years. Pitching a speculative and fantastically expensive lunar energy project to the American people under such conditions would be a loser.

“National political and economic decision makers in every advanced industrial democracy are especially risk averse at present about government expenditures for new projects,” Hickman points out.

Ask Yoshida about cost and he’ll shake his head and cross his arms tightly across his chest. “It’s always cost,” he grumbles. “Cost is a problem.… But price is a human tool for exchanging goods. Maybe this type of project could be out of range of cost considerations. We would have to find a new word for it?”

An energy plan beyond the realm of cost considerations? It’s an optimistic idea, even more so than sending robots to the Moon to build solar panels. In broaching it, Yoshida is also acknowledging that the greatest impediment to space-based power isn’t rockets or robots or physics; it’s a dearth of public resources. A project of such size and scope would require the willingness of hundreds of millions of souls to reembrace government-funded space programs. It would require sacrifice in the form of higher taxes, cuts in other areas, or both. At present, this seems beyond the capacity of the developed world.

But then, not long ago, we said the same thing about reaching the Moon.

About the Author

Patrick Tucker is the senior editor of THE FUTURIST magazine and the director of communications for the World Future Society.

Finding Eden on the Moon

By Joseph N. Pelton

At a time when world leaders see few compelling objectives for space exploration, here is one: a colony on the Moon. The economic and scientific benefits would more than compensate for the up-front costs and time investment, argues a former dean of the International Space University.

In recent months, there has been a great deal of debate about the future of space activities. Those of the United States, in particular, have been highly uncertain since spring 2010, when President Obama called for the end of Project Constellation. He thought that this program, which would send several human missions to the Moon and would only develop a new launcher much like the 40-year-old Saturn V—i.e., Apollo on steroids—was too expensive. In short, this program seemed unlikely to achieve major new technical or scientific breakthroughs.

A number of questions continue to be debated about the future of space programs around the world. What should we do with the International Space Station (ISS)? How soon can there realistically be commercial human space flight to low-Earth orbit, the ISS, and even private space habitats? What should the world’s priorities in space be: improving global communications, enhancing national and international security, monitoring climate change, promoting scientific understanding, or exploring other worlds?

It is clearly appropriate to question the objectives of space enterprise and debate the many options. Should we return to the Moon? Or should we build a space colony to beam power back to Earth? Maybe we should go to Europa, or to an asteroid—perhaps one filled with platinum. And we always have Mars!

Why We Should Go to the Moon

I believe strongly that we would generate major scientific and economic gains if we were to embark on creating an economically viable colony on the Moon. This would be achievable largely through lower-cost robotic missions that could, within a decade, create a livable environment on a lunar outpost, allowing humans to carry out a wide range of economically viable tasks. Let us call this future lunar colony Eden 1.

There are many reasons we should focus on the Moon, including the following:

  • Communications. We have the technology today to establish an affordable, broadband system to communicate with astronauts on the Moon with only a few seconds’ delay in transmission. It is much easier than attempting broadband communications with astronauts at a deep-space destination, such as Mars, where transmissions would be hugely expensive and suffer delays as long as 20 minutes (which could prove critical in emergency situations).
  • Transportation costs. Landing a few human crews on the Moon to build a habitable colony would indeed be expensive: An earlier study for Project Constellation projected the costs to be near $100 billion. Sending robots to build a lunar colony, however, is a much more affordable proposition. Teams of robots could dig the lunar surface and build a permanent human habitat there for a fraction of the cost of a human construction crew.

“Smart” robots might first land on the Moon and build a radiation-hardened living environment. Robotic missions to create a permanent human habitat would push back a human return to the Moon by four years or more, but such a modest delay would help ensure that the inhabitants of a lunar colony would be able to stay longer and accomplish much more. Once the robots have completed the initial groundwork, people can move into this new living area and activate commercial projects that will ultimately make the investment of capital resources in lunar enterprise profitable.

This would be doubly cost-effective if developers from the United States, Russia, Europe, Japan, and perhaps China collaborated on the effort. There have also been out-of-the-box suggestions to add shielding and nuclear thrusters to the International Space Station and then send the station to the Moon as a staging habitat (i.e., a cosmic construction trailer) while a lunar colony is being constructed.

  • Viable, cost-effective, and radiation-hardened habitat. The discovery of billions of gallons of water and other resources on the Moon suggests that such a habitat could be created largely from resources available on the lunar surface, and that a permanent colony could be established on the Moon at much lower cost than would be required to create any other off-Earth habitat. The Moon may hold as much as 80% of the materials needed to create the colony, and digging down to build it would provide significant radiation protection.

Creating a space colony at a Lagrange point (one of five gravitationally stable positions in an orbital configuration with Earth) would require lifting hundreds of tons of materials to this location at huge cost. Our proposed Eden 1 colony, in contrast, could include a material-processing area from which communications, solar power, and remote-sensing satellites could be fabricated and lowered to desired Earth orbits at substantial launch-cost savings. Lunar manufacturing sites could also produce components for use on space stations, or even for buildings and homes back on Earth. With the Moon’s material capacity, human centers there could make any number of things more affordable: Lowering materials from space to Earth is much less costly than building them on Earth and rocketing them up into space.
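
A crude way to see why shipping material off the Moon is so much cheaper energetically: the kinetic energy needed to reach escape velocity scales with the square of that velocity. The sketch below uses textbook escape velocities; real launch costs depend on far more than raw energy, so treat the ratio as indicative only.

    # Idealized energy per kilogram to reach escape velocity, Earth vs. Moon.
    EARTH_VESC_KM_S = 11.2
    MOON_VESC_KM_S = 2.4

    def escape_energy_mj_per_kg(v_km_s):
        """Kinetic energy (MJ per kg) at escape velocity: 0.5 * v**2."""
        v_m_s = v_km_s * 1000.0
        return 0.5 * v_m_s ** 2 / 1e6

    earth = escape_energy_mj_per_kg(EARTH_VESC_KM_S)  # ~62.7 MJ/kg
    moon = escape_energy_mj_per_kg(MOON_VESC_KM_S)    # ~2.9 MJ/kg
    print(f"Earth: {earth:.1f} MJ/kg; Moon: {moon:.1f} MJ/kg")
    print(f"Launching from the Moon takes ~{earth / moon:.0f}x less energy per kg")

On this simple measure, lifting a kilogram off the Moon takes roughly one-twentieth the energy of lifting it off Earth, before even counting the absence of atmospheric drag.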

  • It could turn a profit relatively soon. A colony such as Eden 1 is the only human off-world activity that might reasonably be expected to realize an economic return within 20 to 30 years, and it could offer huge long-term economic returns. For example, the Moon could be the site for manufacturing satellites used in remote sensing, climate monitoring, and telecommunications. Compared with building satellites on Earth and launching them into space, the cost and power savings would be substantial. In time, such a station could provide substantial returns, based on a 20- to 30-year business plan that included consideration of reduced launch costs, power cost savings, and other such factors.

Other imaginative possibilities could make Earth–Moon commercial activity even more economical. These include construction of a space elevator with pods that lift materials and people from Earth’s surface into orbit. At the far end of such an elevator, the pull of Earth’s gravity is 560 times less. You could exit the elevator and fly to the Moon, Mars, or other destinations using very low-thrust, high-efficiency rocket propulsion systems.
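
The “560 times less” figure is easiest to interpret with Newton’s inverse-square law: Earth’s pull falls off as the square of your distance from its center. The sketch below (my reading of the figure, not the author’s stated assumption) shows that a 560-fold reduction corresponds to a point roughly 150,000 km out, far beyond geostationary orbit and consistent with the upper reaches of a space elevator rather than low-Earth orbit.

    # Inverse-square falloff of Earth's gravity with distance (Newtonian).
    import math

    R_EARTH_KM = 6_371  # mean Earth radius

    def weakening_factor(r_km):
        """How many times weaker Earth's pull is at r_km from Earth's center."""
        return (r_km / R_EARTH_KM) ** 2

    # Distance at which the pull is 560 times weaker than at the surface:
    r_560 = R_EARTH_KM * math.sqrt(560)
    print(f"560x weaker at ~{r_560:,.0f} km from Earth's center")  # ~150,800 km

    # For comparison, geostationary orbit is only ~44x weaker:
    print(f"GEO (42,164 km): ~{weakening_factor(42_164):.0f}x weaker")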

Lunar Colonization as a Business Enterprise

Commercial involvement in building Eden 1 would be important because it could help provide a viable business case. For instance, the business plan would have to demonstrate that there are sufficient material-processing capabilities on site to build and deploy applications satellites. The creative and entrepreneurial power of international business enterprise can make going to the Moon not only possible, but even profitable within a couple of decades or so.

Space enterprise is a fast-growing area of R&D. For instance, the entrepreneurs Paul Allen and Burt Rutan recently demonstrated how, for a few tens of millions of dollars, they could create a space plane capable of flying to the edge of Earth’s atmosphere and returning safely.

Also, a number of innovative companies build robotic lunar explorers each year and enter them in the Google Lunar X Prize, a competition for robots designed to land on the Moon, travel around its surface, and relay images and data back to Earth. A total of $30 million in prizes will be awarded ($20 million to the first team to succeed). The designs and ideas generated through this competition might not ordinarily be funded by a governmental space agency, because they are too unconventional and daring.

Robert Bigelow has launched two Earth-orbiting inflatable habitats with private money, and he plans to deploy a private space station larger than the ISS. Most significantly, he has adapted NASA-developed technology to meet his entrepreneurial goals.

There was sound logic in the Aldridge Commission report, which advised, among other things, that NASA expand opportunities for international cooperation and limit its role to the development of cutting-edge technologies, space sciences, and governmental functions that are not easily or appropriately carried out by private industry. This is because private industry is more driven to achieve end results at the lowest cost, can carry out international cooperative projects with fewer formal constraints and greater flexibility in partnerships and contractual relationships, and is more entrepreneurial and better able to take innovative approaches, with risk-taking bounded by insurance or reinsurance agreements.

Practical Space Development

This is not to suggest that private industry should do everything. The space agencies around the world, including NASA, the Japan Aerospace Exploration Agency, the European Space Agency, and many others, need to develop the advanced technology that lies beyond the means of corporate R&D. There are projects that are properly the role of governments and that do not involve a profit motive, such as space telescopes, satellites to monitor climate change and space weather, and nuclear-propulsion systems.

But publicly funded projects must meet public needs, including budgetary ones. In the United States, for example, the public would likely expect international cooperation to reduce the costs of space enterprise and focus on activities that are vital to sustaining the human race or improving life on Earth. Going forward, space agencies should not only set new priorities, but also communicate them more effectively to citizens through major television networks, the Internet, and other media channels.

Today, U.S. space activities still dominate world spending on the space frontier. Perhaps in time this will change, but for now creative leadership in developing space vision and goals—for better or worse—remains with the United States. Looking further ahead, new ways to build cooperative relationships with “smart machines” will become critical as we approach what Ray Kurzweil calls the “Singularity,” or what I have called in other writings the “Age of Super Automation.”

If we are to develop Eden 1—a habitable colony capable of independently sustaining life—the best hope to do so cost-effectively and within a reasonable time span entails an approach that is both entrepreneurial and cooperative, primarily capitalized by a team of international corporations that are bonded together in a unique way.

Here it would be wise to consider models such as Arianespace, which developed the successful Ariane launch vehicles, or Intelsat, which pioneered global satellite communications. The key for Eden 1 to succeed would be to assemble the world’s best technical skills and the best possible entrepreneurial and business management team, and to start with a business plan based on viable economic return. Elements of this plan might include:

• Processing materials on the Moon to create new products.

• Building satellites that could be “lowered” to Earth orbit at a fraction of today’s launch costs.

• Creating solar-energy systems or thermocouple power units that could beam clean energy to the Earth’s higher latitudes during their “long nights” and “solar winters.”

• Monitoring environmental, meteorological, and climatic trends and events on Earth.

• Conducting astronomical or other cosmic research that cannot be carried out on Earth.
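
The article gives no figures for these elements, so the following is purely a hypothetical sketch of how a 20- to 30-year plan of this kind might be tested: discount each year’s projected net cash flow back to the present and see whether the total exceeds the up-front investment. Every number below is an invented placeholder.

    # Hypothetical net-present-value test for a long-horizon lunar business plan.
    def npv(upfront, annual_cash_flow, years, discount_rate):
        """Sum of discounted annual cash flows minus the up-front investment."""
        discounted = sum(annual_cash_flow / (1 + discount_rate) ** t
                         for t in range(1, years + 1))
        return discounted - upfront

    # Invented placeholder figures, in billions of dollars:
    result = npv(upfront=60.0, annual_cash_flow=6.0, years=30, discount_rate=0.07)
    print(f"NPV over 30 years: ${result:.1f} billion")  # positive => viable on paper

With these made-up numbers the venture clears a 7% hurdle rate; the point is only that a multi-decade plan of this kind stands or falls on its discount rate and the credibility of its projected cash flows.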

An Evolutionary Move Forward

The human species is now potentially in danger of becoming a dead branch on the evolutionary tree. We simply must address climate change in much more aggressive ways. We also must limit population expansion. We must restore the ozone layer that protects us from ultraviolet radiation. In short: We need to move forward to survive.

We must also make space enterprise both economically viable and a sustainable pathway to the future. In time we may indeed find that we need an ultimate doomsday escape plan for the human race, but unless we start now in a systematic way, it could be too late to save human life as we now know it.

It is technically possible to create an “off-world” presence that is actually profitable within a few decades, based largely on energy production and material processing. Eden 1 could be a commercially viable pathway forward to new knowledge, new jobs, and new wealth.

About the Author

Joseph N. Pelton is the founder and vice chairman of the Arthur C. Clarke Foundation, as well as the founding president of the Society of Satellite Professionals. He has directed strategic policy for Intelsat, served the International Space University as dean and board chairman, and been elected a full member of the International Academy of Astronautics. He also serves as THE FUTURIST magazine’s contributing editor for telecommunications. E-mail joepelton@verizon.net.

This article draws from his paper, “Eden 1-2-3: A Sustainable, Post-Doomsday Strategy,” to be published in the World Future Society’s 2011 conference volume, Moving from Vision to Action.

Why Farmers Need a Pay Raise

By Julian Cribb

Global commercial trends threaten farmers’ livelihoods—and the global food supply along with them, argues an agricultural policy watcher. The consequences for human beings everywhere could be dire.

The world’s farmers need a pay raise, or else, come mid-century, the other 8 billion of us may not have enough to eat.

As the Earth Policy Institute notes, the world produced more grain than it consumed throughout the 1970s, 1980s, and 1990s. Today, those surpluses are gone. While the world harvested 20.4 billion tons of grain between 2001 and 2010, it consumed 20.5 billion tons. This gap may sound small, but it will surely widen later this century as world population and food demand continue to rise.

At its “How to Feed the World” meeting in October 2009, the UN Food and Agriculture Organization stated that world food production would have to increase 70% by 2050 to adequately feed the growing world population. This would require an investment of $83 billion a year in the developing world alone. However, it also noted, “Farmers and prospective farmers will invest in agriculture only if their investments are profitable.”
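
Spread over four decades, the FAO’s 70% target implies a surprisingly modest compound annual growth rate, as this quick calculation (my arithmetic, not the FAO’s) shows; the difficulty, as the rest of this article argues, is sustaining even that rate while the resource base shrinks.

    # Compound annual growth implied by "70% more food by 2050" (from roughly 2010).
    target_multiple = 1.70
    years = 40
    annual_rate = target_multiple ** (1 / years) - 1
    print(f"Required production growth: {annual_rate:.2%} per year")  # ~1.34%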

Unfortunately, farming in the last few decades has not been particularly profitable. The real prices of rice, wheat, soybeans, and maize fell by an average of 2%–3% per year between 1975 and 2008, according to University of Minnesota economists Julian Alston, Jason Beddow, and Philip Pardey.

Cheap food is a boon for consumers, but not for farmers and not for the planet. Among the effects are disincentives for farmers to grow more food (and thus reduced agricultural productivity gains), a disincentive for young people to work in agriculture, huge wastage, and spreading ill health in society. Cheap food prices also reduce national and international investment in agriculture, as investors consider farming less profitable than other opportunities. Because of these disincentives to investment, farmers cannot so readily adopt more sustainable and productive farming techniques.

Food Becomes Scarcer and Costlier

Ominous warning signs lie within the most recent data on global food production. Farming sectors everywhere are contracting. Agricultural employment in the European Union fell 25% between 2000 and 2009, according to the European Commission. In all, according to the International Labor Organization, worldwide agriculture shed more than 550,000 jobs between 2001 and 2007, a 4.7% decline.

Should these trends continue, all of the basic resources for food production will likely become much scarcer. Global food supplies will tend to tighten over time, making the world more vulnerable to sudden unanticipated shortages and price spikes whenever seasonal conditions in key farming regions are unfavorable.

Food prices soared to record highs in 2008, according to the UN Food and Agriculture Organization (FAO). The agency warns that we may witness many more, and more extreme, price spikes if we continue to ignore the plight of agriculture. The FAO’s November 2010 Food Outlook report notes that, since 2008, harvests of cereals, including wheat and coarse grains, have declined by several percentage points each. Further, due to stagnant production, food prices will likely rise to record-high levels this year, and unless production expands substantially, high demand will lead to critical food shortages in many parts of the world.

“With the pressure on world prices of most commodities not abating, the international community must remain vigilant against further supply shocks in 2011 and be prepared,” the report states.

Sources: Eurostat, http://epp.eurostat.ec.europa.eu; International Labor Organization, www.ilo.org; Food and Agriculture Organization, www.fao.org.

The dramatic increases in world crop prices in 2008 and 2010 have not made farming more profitable. The reason is a growing imbalance in market power between farmers and the businesses that dominate the food supply and input chains.

Two decades ago, most consumers bought their farm produce from local farmers in local markets. In the twenty-first century, market power is increasingly concentrated in a very small number of food corporations and supermarkets sourcing food worldwide. The food corporations minimize their input costs by paying farmers less for farm commodities. The power of the farmer to resist downward price pressure has weakened, as farmers in rich and poor countries alike now compete intensely with each other to sell at the lowest possible prices.

At the same time, the manufacturers of fuel, machinery, fertilizer, chemicals, seeds, and other farmers’ necessities have grown much larger, more globalized, and more powerful. This makes it easier for them to raise the prices of their products. When farm commodity prices rise, the industrial firms increase the prices of their wares, often by far more. In 2008, when grain prices rose 80%, fertilizer prices went up 160% in some cases, while oil climbed to nearly $150 a barrel, with proportionate increases in farm fuel costs. Many farmers have noted the irony: They earn lower profits when commodity prices are higher.
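
A stylized example (hypothetical numbers, not drawn from the article) makes the irony concrete: when inputs are a large share of revenue and their prices rise faster than commodity prices, profit falls even as prices soar.

    # Stylized margin squeeze: grain prices up 80%, input costs up 160%.
    revenue_before, costs_before = 100.0, 70.0  # hypothetical units

    revenue_after = revenue_before * 1.80  # commodity prices up 80%
    costs_after = costs_before * 2.60      # input costs up 160%

    print(f"Profit before the price spike: {revenue_before - costs_before:+.0f}")  # +30
    print(f"Profit after the price spike:  {revenue_after - costs_after:+.0f}")    # -2

In this toy case, a farmer spending 70% of revenue on inputs moves from a healthy margin to a small loss despite an 80% price rally.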

Farmers are thus trapped between muscular globalized food firms that drive down the prices of their produce and muscular industrial firms that drive up the cost of their inputs. The economic message now reaching most of the world’s farmers from the market is “Don’t grow more food.” As a result, world food output is increasing too slowly to meet rising demand, overall farm productivity gains are sliding, and yield gains for major crops are stagnating.

Global Resource Degradation and Productivity Decline

In a recent satellite survey, FAO researchers reported that 24% of the Earth’s land surface was seriously degraded, compared with 15% estimated by an on-ground survey in 1990. The FAO team noted that degradation was proceeding at a rate of around 1% a year. This degradation is caused primarily by the low profitability of agriculture, which drives many farmers (especially in poorer regions) to overuse their land. If we continue to sacrifice 1% of the world’s productive land every year, there is going to be precious little left on which to double food production by 2060.
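
Compounding makes the 1%-a-year figure starker than it sounds. If 1% of the remaining productive land is lost each year, only about 60% of today’s stock is left by 2060, as this simple geometric-decay sketch (my arithmetic, on one reading of the FAO rate) illustrates.

    # Cumulative effect of losing 1% of remaining productive land per year.
    land = 1.0  # fraction of today's productive land
    for _ in range(50):  # roughly 2010 through 2060
        land *= 0.99
    print(f"Productive land remaining by 2060: {land:.0%}")  # about 61%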

Much the same applies to irrigation: “In order to double food production we need to double the water volume we use in agriculture, and there are serious doubts about whether there is enough water available to do this,” Colin Chartres, director general of the International Water Management Institute, told the 2010 World Congress of Soil Science in Brisbane, Australia.

Solutions to land and water degradation are fairly well known and have been shown to work. Unfortunately, most farmers cannot afford to implement them, even though many would like to do so.

As a result, world agriculture is today primarily a mining activity. We all know what happens to mines when the ore runs out.

University of Minnesota economists Alston, Beddow, and Pardey attribute much of the productivity decline to falling investment worldwide in agricultural science, technology, and extension of new knowledge to farmers. In the United States, public expenditures on agricultural R&D grew 3.6% a year from 1950 to 1970, but only 1.7% a year from 1970 to 2007.

“A continuation of the recent trends in funding, policy, and markets is likely to have significant effects on the long-term productivity path for food staples in developed and developing countries alike,” they write.

The role of low returns in discouraging farmers, in both developed and developing countries, from adopting more productive and sustainable farming systems cannot be ignored. While a few highly efficient and profitable producers continue to make advances, the bulk of the world’s farmers are being left behind. Since small farmers feed more than half the world, this is a matter of some concern.

Cuts in support for farm research have been made in most developed countries and even in places such as China, where agricultural R&D support is falling as a proportion of total science investment. With agricultural R&D comprising a mere 1.8 cents of the developed world’s science dollar in 2000, you can get a very clear idea of how unimportant most governments now consider food production to be.

Solving the Food Challenge

Although most experts agree that we should be seeking ways to double food output sustainably over the coming half century, the ruling economic signal is: “Don’t do it.” We could obey the economic signal and allow agricultural output to gradually fall behind—but that would expose 8 billion consumers to massive, unprecedented price spikes, imperil the poor, and maybe start wars and topple governments. It would not benefit farmers nearly as much as would stable, steady increases in their incomes, which would provide incentives for investment and innovation.

Policy makers need to move much faster and farther toward totally free trade in agricultural products, thus encouraging efficient producers around the world. But we also need to be aware of the universal dangers of undervaluing agriculture as we approach the greatest demand for food in all of history. Here are a few ways to address the issue:

  • Consumers, supermarkets, and food processors could agree to pay more for food so as to protect the resource base and enable farmers to invest in new technologies.
  • Governments could pay farmers a social wage for exercising proper stewardship of soil, water, atmosphere, and biodiversity, separate from their commercial food production.
  • Regulations could limit the practices or technologies that degrade the food resource base and reward those that improve it.
  • A resource tax could be imposed on all food to reflect the true environmental cost of producing it; the proceeds could be reinvested in researching and implementing more sustainable farming systems.
  • Markets could be established for key farm resources that offer farmers higher returns for wise and sustainable farming practices.
  • Public education programs could be launched to demonstrate how to eat more sustainably, and industry education programs could showcase sustainability standards and techniques.

If we all want to eat securely in the future, it is imperative that a more serious debate take place about how to deliver fairer incomes to farmers worldwide, countering the unintended effects of market forces that now weigh overwhelmingly against farmers.

About the Author

Julian Cribb is an author, journalist, editor, and science communicator, and principal of Julian Cribb & Associates consultancy in Nicholls, ACT, Australia. His latest book is The Coming Famine. E-mail Julian.Cribb@work.netspeed.com.au.

Building a Better Future for Haiti

The former Haitian ambassador to the United States visited the offices of the World Future Society in January, seeking help for rebuilding his country. This remarkable meeting offered the Society the opportunity to outline the futuring process and to clarify what it can—and cannot—do.

At its small office in downtown Bethesda, Maryland, the World Future Society recently welcomed Raymond Joseph, the former Haitian ambassador to the United States. He was accompanied by his son, Paul Joseph, a futurist and activist, and by Emmanuel Henry, a retired Panasonic executive. The goal of the meeting was to explore ways that futuring tools can help rebuild a nation.

Joseph is an ambitious man. Not only does he want to save his own country, but he also wants Haiti to become a role model for other countries written off as “failed states” with no futures.

As one of many would-be candidates in Haiti’s 2010 presidential election whose eligibility was revoked (allegedly because he had abandoned his duties as ambassador in order to make a bid for the presidency), Joseph conceded that his ambitions are political. The first thing his country needs, he said, is leadership based on trust.

The Josephs and their compatriot Henry, who helped manage the “Friends of Raymond Joseph for President” campaign in 2010, spoke with Society President Timothy Mack and FUTURIST magazine editor Cynthia G. Wagner on January 13, one day after the first anniversary of Haiti’s devastating earthquake.

The following is an edited transcript of the discussion that took place in our office.

Raymond Joseph: I was in Washington at the time [when the earthquake struck Haiti on January 12, 2010]. The [Haitian] leadership was absent, they were not to be seen anywhere, so all of a sudden I became the face of Haiti for the world. And I had to make the first decisions in the first 48 hours, to get help to the country.

Based on that, quite a few of my friends, both Haitian and foreign, came to me and said, “You know what? We need new leadership in Haiti. You should consider a candidacy for president,” which I did. And for no reason at all, they disqualified me.

Mack: Let me speak frankly to you. I believe they felt they had lots of reasons, because you posed a threat. You were too well known and too popular.

Raymond Joseph: Yes, because of that I was a threat. Yesterday I wrote a piece in The Wall Street Journal, and in there I say what needs to be done if we’re going to get Haiti back on track. And what I said should be done is for the president who’s there now, whose term ends February 7th, to exit on February 7th with his team and not try to hang on as he wants to until May 14th. Because in three more months, he will not be able to accomplish what he could not do in five years.

I was quite forceful in that and quite forceful last night again, repeating it. Now, what I seek in [terms of] government for Haiti is a large coalition, and that’s what I’m working for, that’s why I stayed in the country after they disqualified me. They thought I was going to go back abroad. I did not do that.

I feel that we need to look at ways of changing Haiti. And to do that, we have to change the leadership. That’s what I’m working on.

But besides changing the leadership of Haiti, people know that I have some ideas for the future. One of the ideas I have is about energy, … and another major idea for us is reforestation.

To get moving on these things, I feel that we have to entice a percentage of Haitian intellectuals and professionals living abroad. … According to the Inter-American Development Bank, that’s 83% of our intellectuals and professionals. I feel we have to entice a percentage of them to come back.

Wagner: To reverse the brain drain.

Raymond Joseph: To reverse the brain drain. So, knowing that you work with the future, I felt I may come and tell you what I think I need.

Mack: Well, let us be honest in the sense of full disclosure. We [the World Future Society] are a convening organization and a publishing house. We do articles on a range of issues, but certainly one of the most powerful stories that we are able to tell is the story of organizations, countries, and even individuals who have taken their own future under advisement and are working to make it better, in order to avoid repeating the mistakes of the past and improve the quality of life for those who cannot speak for themselves.

Wagner: I would welcome an article that would tell the rest of the world what it is you need, step-by-step. How do you build a future?

Mack: Another thing we should be clear about: The Society is in fact a neutral clearinghouse, and that gives us our authenticity and the trust that we have with our readers. But also it makes us very interested in finding the truth and making it clear to an international audience. And that international audience is spread across citizens and academics, policy makers, corporate leaders—a wide range of people who would be very interested in the future of Haiti.

Wagner: We also tell stories when other media outlets aren’t interested in them. And that is a very big problem with the attention span in the United States. We had this horrific crisis in Haiti with the earthquake, and people reached out to their fellow man, because that’s what we humans do.

Mack: But we don’t do it for very long.

Wagner: We don’t do it for very long, and that’s the media problem that we have. And that’s where I think THE FUTURIST is very different. We had a story on alternative technologies that are very low cost—energy, water filtration, a bicycle built for cargo. That’s the kind of story that doesn’t really make headlines, so that’s what we try to do: focus attention on problems and how they can be solved. The story of Haiti’s potential catastrophes was very well known to people who were watching the trends.

You mentioned reforestation—that was the first thing that came to my mind. If you’re starting to rebuild the country, you need to build the natural resources back up and get your entire population involved, one person at a time: “Plant one tree and you will help your country.” That’s very motivating and it’s very doable.

Mack: Right. And we’ve seen reforestation models in other parts of the world (Mongolia, for example, which is very arid) work very well.

Wagner: Getting into the politics of international aid: People become very frustrated when their donations sit on the docks and don’t go into the country. Then you come into the problem where people want to help you and then stop helping you. And you can’t have that stop. You still need people to contribute, but for your own people [to contribute as well]; they’re the ones who live there. They also can contribute—more than they think they can.

Mack: And as well, self-reliance is a strong position to negotiate from. When the country is rejuvenating itself, you don’t have to rely on what I would call unreliable assistance.

Wagner: You also don’t need the experts to do the futures for you. We have found that community groups—in Michigan, for example—have been very useful in dealing with the auto industry crisis in their own communities. They get town hall groups together to start discussing “Where do we want our community to be? What do we want?” You start with that vision, and then you work back and build the steps to get there. The term for that is “backcasting.” You can call it “envisioning the future,” whatever you want to, but it is a process, and communities can do that.

Mack: It is a trusted process, and it’s worked well elsewhere. That point is very important, because it seems to me one of the great crises that Haiti faces—and perhaps one you respond to—is the trust in the present government. That must be addressed, and that trust must be rebuilt, regardless.

Wagner: So part of the enticement, of bringing the intellectuals back into Haiti, has to be from Haiti itself.

Raymond Joseph: Right.

Mack: The chance is for them to have a real hand in building Haiti’s glorious future. It’s much more possible for change to occur in smaller countries. The United States is so large and it has so many people wrestling for the future of the country, while Haiti has one national culture instead of 40 national cultures, as we see in the United States. And a vision that can be built with a country that has a scale that is workable and a sense of the national culture is extraordinary and could be done very quickly. So I’m saying that there are real opportunities here.

Raymond Joseph: So if I understand, you work in the realm of ideas.

Mack: And the possible, too; we work not just in the realm of ideas, but in the realm of making those ideas practical and implementable.

Paul Joseph: If I may, I would like to have the theme of this meeting go from the possible, which is the art of politics, to the implementable. I talked with some friends of mine at the church I go to; we have our goals set out for our congregation for the year, and I said one of the things we have to approach every one of our goals with is, How do we get it done?

Part of the reason that I set this meeting up was because I understood the synergies between the two entities. Wheaton College anthropology [indicating Raymond Joseph, who holds a bachelor’s degree in anthropology from Wheaton College in Illinois], political destabilizer, and very successful at it. You’re the futurist, that’s anthropology and projections, the modeling of case histories.

Mack: Right.

Paul Joseph: Then, here you have the editor [indicating Wagner], and he’s an editor [indicating Raymond Joseph]—that’s how the destabilization of the Duvalier regime came about, through the newspaper my father and my uncle founded. I looked at all the synergies, and I thought you’re speaking the same languages, just not in the exact same animal, for lack of a better term, with which Haiti now is identified. In shaping the future of what the country can be, that [becomes a] blueprint that you can use as a model. If another Katrina hits someplace else, or a tsunami, here’s what happened in Haiti, and here’s how we rebuilt, here’s what we’re doing in Mongolia, here is what’s going on there.

Mack: Right. And one [goal] is to bring implementable, on-the-ground, transportable, and affordable technologies that can be put in place quickly. We certainly heard a lot about the use of communication technologies in Haiti: When the networks were cut off, the people were able to keep communications and information flowing about need, about damage, about fatalities, about the immediate triage that was required. Those were very helpful.…

As we all know, Haiti will always be in the path of harm. And I don’t mean politically, I mean from the environment, from the growing problems that we see with climate change, from the instability of the land. We really focus on how new technology affects people’s lives—is it practical? One of the things that happens in a country which has seen crisis is that entrepreneurial forces come from around the world—largely from the United States—and say, “I have such a wonderful deal: I’d be glad to share this technology with you, only five million dollars.”

Wagner: That was the other point I think we can make about starting small and at the grassroots. We talk about new technologies, but there are also social technologies. And one thing that I think would be very implementable would be the microlending programs that have been very widely …

Raymond Joseph: Microfinance.

Mack: Yes, microfinance.

Wagner: Yes, absolutely, lending to your neighbors, community lending: “What do I have that can help you?” But what you face is a collection of problems, and the decision has to be made, what do you tackle first?

Mack: And of course the biggest problem you face is leadership.

Raymond Joseph: That’s it, that’s it.

Mack: And how do you get the strong leadership that is necessary to make this change even be considered.

Paul Joseph: This is where I feel the first step had to be made. I thought that all day and last night as well, and how specifically the World Future Society can help, because it has such an extensive reach. These two men [indicating Emmanuel Henry and Raymond Joseph], with their collective experience, have a vast wealth of knowledge and an extensive network in Haiti. If you want the facts, if you want the figures, if you want the information that very few people know and you show that you can use it to the best advantage of the country, I’m sure they would be willing to make some of it available. …

Now, where that information can best be utilized and with the right parties, … that’s the way that the story of what’s gone wrong with Haiti can gain a much larger international audience and institute the changes instead of the OAS [Organization of American States] and the UN and whoever else saying, “Let’s have a runoff of the candidates” in a fraudulent election already. Change that story to, let’s say, “If you have a runoff of this kind of election, then you deny, historically speaking, the legitimacy of the United States’ birth, because it was a country that rebelled against unjust rule.” You have to support the rebellion against unjust rule today, or the hypocrisy is too outlandish.

Wagner: [But] if you can outline the vision of your future, that’s the story that we can tell.

Mack: We can certainly help you with shaping that story. And we can help you with telling that story. But the telling may be on a little longer timeframe than the immediate March crisis … or, you know, the 7th of February.

Wagner: Think of this as post-crisis thinking.

Raymond Joseph: Mr. Mack here, he says: Work on the blueprint, the future you want to see. And then come and visit and see …

Wagner: And instruct us.

Mack: Right.… It seems to me the first step would be for us to put together a list of people you should be talking to.

So let me ask you, What are your next steps? What are you hoping to accomplish in the next few weeks?

Raymond Joseph: My next steps. That’s what I’ve been working on. Since I was bumped off the ballot, I have stayed in Haiti and worked with various candidates—some who were running, and some who were not running—and looking towards having a large coalition for future change. That’s my goal. I’m not even considering myself as a candidate for the president of Haiti.

Mack: Well you know who comes to mind, I mean, you look at South Africa and the history of South Africa, you know, there were years and years of struggle. No, I’m not suggesting you should spend any time in prison like Nelson Mandela, but another name in that group is Desmond Tutu: you know, people who had not a formal role in the government, but enormous influence.

Raymond Joseph: That’s the way I’ve been through the years, you know? I fought the dictatorship of Duvalier, I fought against Aristide and … his kind. And I was condemned to death in absentia.

Mack: Yes, I know that. And, as you know, some people who are condemned to death in absentia had it come and visit them.…

Raymond Joseph: What I have tried to do in the past, building a coalition of ethical leadership, has been successful. Since they bumped me off the ballot for the presidency, I’ve come back. Now we have quite a few candidates for the presidency. I want to tell them you cannot all be president of Haiti, but you can all work for change.

Mack: Yes. You can all be friends of Haiti.

Raymond Joseph: Exactly. So, let’s work together to do this. And immediately, the next thing I’m doing is to help annul the election that took place, which was not an election. That’s what I’m working on right now.

Mack: Are you also working on observers for the coming elections, too, or is that something that will happen no matter what?

Raymond Joseph: We haven’t gotten there yet. However, the first democratic elections in Haiti, in 1990—December 16th—I was the one that signed the agreement with the OAS back then. I was the representative of the country to the OAS; the UN took that agreement and expanded on it, and we had 3,000 observers in Haiti the week of the elections. So I’m used to doing that. And I will want—in the elections coming up, after we get through this harrowing year—to have the best observer teams. I brought President Carter to Haiti in 1990, and others. I want to get to that point in the next elections coming up, which will probably be in a year, because this thing here that they’re trying to patch up, they cannot patch it up. They’re trying to patch it up at the level of the presidential elections. However, the fraud was widespread. It was at the legislative [level] also.

Mack: And that may be very self-defeating in the sense that a weak government does not last, especially if that government is clearly founded on fraud.

Raymond Joseph: Exactly.

Mack: We know many people, but mobilizing them within days or weeks—I would be honest with you and say it’s unrealistic based on our capabilities, our staff, and our resources.

However, mobilizing the kind of organization for change that you’re talking about, and helping you not only put together a plan, but also [figuring out] who should be part of that and give you some nonprofits from a range of areas, or at least people that are not seen as political to say, “Yes, this is the direction, this is how Haiti should think about its own future,” we can be helpful with that.

Paul Joseph: … and because they don’t have any political allegiance as well. They’re more credible because they’re not interested for the profit motives.

Raymond Joseph: And to be frank with you, since the earthquake, Haiti has had too many NGOs, so much so that now they’re calling Haiti “The Republic of NGOs.” They’re saying ten thousand. There’s no coordination.

Mack: They all have their own agendas, and therefore they step on each other.

Raymond Joseph: And you don’t see what they accomplish.

Wagner: There’s duplication and gaps.

Mack: Also, they are there to accomplish what they are built for, which is their own, their own …

Paul Joseph: … agendas.

Mack: Not just agendas; their own pride. You’ve seen that. You know, NGOs are very proud. And they are very moral, but not always in a good way: “Maybe you should change the way you live your life because I say so.” Too much of that in NGOs.

Let me just say one last thing, which is, I think that what we bring is tools for the people of Haiti to use, as opposed to rules for the people of Haiti to follow.

Henry: That’s well said.

Raymond Joseph: Good. That’s well said. I will take that. I want to take that sentence.

Henry: And when you have ten thousand NGOs, everybody wants to pull you in different directions. “My direction is better; yours is better,” and nothing is accomplished, nothing is achieved.

Mack: But we’re very, very pleased that you would come here and talk to us about this, and we want to be as helpful as we can.

Raymond Joseph: And I’m going to tell you, also, Paul has tried to get me to talk to various people, and you know …

Mack: Some you say, “Yes,” some you say, “No.”

Raymond Joseph: When he talked about you, I said I want to come. Not because I know you’re going to help me solve the problem right away, but that you can help me think about the future.

Mack: And one of the things we can do is bring together a group that could meet with you at some time in the future, when you have a better sense of what the next year, for example, is going to look like. That we’d be very interested in. And certainly we know a lot of groups that were active in Haiti in a positive way, more in the way that I described, bringing tools to the people.

Henry: More the tools than the rules!

Paul Joseph: Yeah!

Mack: Yes.

Raymond Joseph: Yes.

Editor’s note: Three days after our meeting, on January 16, exiled dictator Jean-Claude “Baby Doc” Duvalier returned to Haiti, accompanied by heavy security. Ousted President Jean-Bertrand Aristide also returned from exile, arriving in Port-au-Prince just two days before a runoff between the two top vote getters in the disputed 2010 election. The results were not known at the time of publication.

Changing Agriculture From the Ground Up

By Rick Docksai

Africa’s farmers innovate to meet formidable challenges, offering lessons for the rest of the world, says the Worldwatch Institute.

State of the World 2011: Innovations That Nourish the Planet by the Worldwatch Institute. W.W. Norton & Company. 2011. 237 pages. Paperback. $19.95.

Hunger, water shortages, and environmental devastation are looming global problems, but farming communities in Africa have workable solutions, according to the Worldwatch Institute’s State of the World 2011. The report documents improvements that growers throughout the continent are implementing, sometimes with outside help and sometimes on their own, to increase yields while reducing their ecological footprints.

The report follows the completion of the Institute’s Nourishing the Planet project, whose researchers traveled through 25 countries in sub-Saharan Africa, meeting with individual farmers and community-based organizations working to solve the intersecting problems of hunger, poverty, and environmental degradation. The researchers witnessed reforms that they believe could be exported to other continents and bring about a massive—and much-needed—transformation in global food production and distribution.

“These approaches can feed a large portion of the world—while at the same time addressing a host of present and looming problems of environmental degradation, livelihood insecurity, and poverty,” the authors write.

Chapters written by Worldwatch Institute researchers and contributing authors detail these innovations. Among them are:

  • The nonprofit Heifer International Rwanda imported a South African dairy cow breed known for high milk production and gave cows to Rwandan farmers. Recipient farmer Helen Bahikwe spent a government subsidy on construction of a biogas collection tank that takes methane from cattle manure and converts it to electricity. The tank emits far less pollution than a wood-burning furnace, and it frees Bahikwe from the time-consuming chore of collecting firewood.
  • Corn farmers in Malawi planted nitrogen-fixing trees alongside their corn plants to enrich the soil. This technique quadrupled their corn yields without using artificial fertilizer.
  • The Solar Electric Light Fund, a U.S. nonprofit, introduced a solar-powered drip irrigation system to farmers in Benin. Villages that installed the system could for the first time grow fruits and vegetables year-round. Residents’ diets improved and their incomes increased.
  • Poor storage methods lead to much produce rotting or being infested by insects before it ever reaches markets. But a Village Community Granaries microcredit scheme enabled 27,000 small farmers in Madagascar to build new storage facilities for their rice. They cut crop contamination by 50%.

Aid organizations serve farmers best when they equip farmers to implement their own solutions, the researchers argue. Scientists can provide critical assistance, also, by partnering with farmers to help them conduct their own experiments. What is key is that the aid organizations and scientists listen to the farmers. The farmers know their crops and their ecosystems, and they have the best perspectives on what will work for their unique locales.

Farmers themselves can be great resources for other farmers, as well. The project researchers reported some farmers forming research committees, farmer-to-farmer educational programs, and radio broadcasts for spreading innovations throughout whole regions.

The authors make clear that it is not just Africa that needs farming innovation, however. Food supplies everywhere stand at a critical juncture: Production increased substantially in the last century, but the increase exacted huge ecological tolls that set the stage for a looming agricultural disaster this century. Agriculture is a major producer of greenhouse gases. In addition, overgrazing and excess cultivation have depleted soils and compromised their ability to nurture bountiful crop yields in the future.

Meanwhile, flaws in the distribution chain keep food from reaching all the consumers who need it. At least a billion people on Earth continue to suffer from severe malnutrition. In Africa, child malnourishment has increased 30% in the last 30 years.

As the world population continues to climb, and climate change strains communities everywhere, keeping food supplies stable will be more important than ever. Present-day adoption of sustainable farming practices stands to benefit not only the farmers, but all of humanity, in the long term.

“Healthy rural economies are also fundamental to global sustainability,” the authors write.

State of the World 2011 tells of the ground-level successes taking place on a continent not often associated with success. The authors objectively state the problems facing Africa and the rest of the world, but illuminate a multitude of encouraging answers to them that are already saving lives and livelihoods. It is an eloquent, painstakingly researched warning and an expression of hope.

About the Reviewer

Rick Docksai is a staff editor for THE FUTURIST and World Future Review. E-mail rdocksai@wfs.org.

Introduction to Personal Futuring

By Rick Docksai

You can’t predict your future, but you can direct it, says a professional futures workshop leader.

It’s Your Future … Make It a Good One! by Verne Wheelwright. Personal Futures Network. 2010. 253 pages. Paperback. $17.50. An accompanying workbook is available as a free download from www.personalfutures.net.

You may find yourself living in a shotgun shack.
You may find yourself in another part of the world.
You may find yourself behind the wheel of a large automobile.

(“Once in a Lifetime,” song lyrics by David Byrne et al.)

Anyone who has heard the Talking Heads’ hit song “Once in a Lifetime” will agree with the lead singer that a person’s future holds many alternative possibilities. But unlike the song’s protagonist, you don’t need to look around one day and tell yourself, “Well, how did I get here?”

Verne Wheelwright, a professional futurist, emphasizes in his new book It’s Your Future … Make It a Good One! that the years of life ahead of you are much less mysterious than you might believe. With proper thinking and evaluating, you can obtain a clear sense now of the direction your life is heading in and what you can do to guide it toward the outcome that you want.

“You will be surprised to find out how much you can know about your future. And, you will be surprised at how much influence you can have over your future,” he writes.

Government agencies and businesses throughout the world rely on formal foresight exercises to help them identify plausible futures and plan ahead how they will navigate them. Wheelwright adapts these exercises to the personal level to show how you can thoroughly map out where you might go—and where you might want to go—in the next 10, 20, 30, or more years of your own life.

“If you have a plan for your life, then as you make daily decisions, small as they may be, you will keep moving toward your plan and toward the future that you want for yourself,” he writes.

Wheelwright’s methods begin with you observing your present situation and your past. Next, you develop several scenarios for what your future might entail: best case, worst case, most likely, and a few unexpected “wild card” scenarios.

Alternatively, you could backcast—i.e., start in the future and work backward. This entails having a preferred destination in mind and then working through the steps that you would have to take to reach it.

Wheelwright demonstrates how you can use Excel sheets to list the “stakeholders” in your life—family members, co-workers, supervisors, elected officials, and other individuals who can impact your future for good or ill. Then you can similarly chart the “forces” that motivate you: finances, social relationships, housing, health, etc. Don’t forget to employ “environmental scanning” methods, which Wheelwright explains are how you look around to identify events and people likely to impact your future: marriage, job change, illness, divorce, and so on.
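
Wheelwright’s worksheets map naturally onto simple data structures. The sketch below is my own illustration, not something from the book: one way a reader comfortable with a little code might capture stakeholders, driving forces, and scenarios so the plan can be revisited and updated over time.

    # Illustrative personal-futures worksheet, loosely following Wheelwright's
    # categories of stakeholders, driving forces, and alternative scenarios.
    from dataclasses import dataclass, field

    @dataclass
    class PersonalFuturesPlan:
        stakeholders: list = field(default_factory=list)
        forces: dict = field(default_factory=dict)     # force -> current status
        scenarios: dict = field(default_factory=dict)  # name -> description

    plan = PersonalFuturesPlan(
        stakeholders=["spouse", "manager", "business partner"],
        forces={"finances": "stable", "health": "good", "housing": "renting"},
        scenarios={
            "best case": "promotion and a move to the coast",
            "worst case": "job loss during a downturn",
            "most likely": "steady progress in the current role",
            "wild card": "an unexpected inheritance",
        },
    )
    print(plan.scenarios["most likely"])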

Self-awareness is integral to Wheelwright’s methods, also. He advises you to determine your values, as well as your strengths and weaknesses. You must know what you want and what would be the best approach you could use for attaining it.

The future can be a bewildering and intimidating concept, but Wheelwright helps readers not to be daunted. The exercises and strategies he lays out in It’s Your Future can help any reader apply the long-term perspective necessary to find a desirable future and proceed confidently toward it.—Rick Docksai

Breakthroughs Gone Wild

The Very Next New Thing: Commentaries on the Latest Developments That Will Be Changing Your Life

Woolly mammoths could once again roam the frozen tundra. People recently killed by freezing or drowning could be brought back to life. And chimpanzees might take up day jobs in professional movie studios as cinematographers and camera operators.

These are just a few of the seemingly impossible developments that Gini Scott, founder of Changemakers Publishing and Writing, argues could become possible within our lifetimes once cutting-edge scientific research taking place today comes to fruition.

Many of these developments are bound to be controversial. For instance, Scott tells readers that medical researchers recently inserted human DNA into newborn pigs. The pigs grew to adulthood and were able to receive donated human blood, which would normally be incompatible with pigs. Scott speculates on how scientists might one day build upon this experiment: Could actual human-ape, human-dog, or human-cat hybrids live among humans? Human–hybrid marriages and questions over whether to bestow citizenship on hybrids would loom large.

The Very Next New Thing is a walking tour of our future world radically made anew by technologies and discoveries that the scientific community has just recently grasped. General audiences who are curious about what today’s science could bring to tomorrow’s world will find it an exciting and engaging read.

Bringing the Planet Back from the Brink

World on the Edge: How to Prevent Environmental and Economic Collapse

Many great civilizations collapsed due to exhaustion of their resource bases, and our civilization is on the brink of repeating history, warns Lester Brown, president of the Earth Policy Institute, in World on the Edge. He traces a plethora of present crises brought on by unchecked human activity: rising food prices, shortages of freshwater, instability in dozens of failing states, pervasive malnutrition, and tangible effects of climate change, among others. The global community must change course before it is too late, he warns.

Brown presents an ambitious plan to stabilize energy supplies, conserve resources, diminish poverty, halt pollution, and cut carbon-dioxide emissions by 80%—all by 2020. The technologies that would make each goal possible are with us today. Burgeoning solar industries are taking off in the Middle East, Germany is on course to get 30% of its energy from renewable sources by 2030, and growing numbers of Northern Hemisphere communities are producing their fruits and vegetables locally in greenhouses powered in winter months by geothermal turbines.

Brown has much promising news on the poverty front, also. Liberia is a successful test case in rescuing a failing state, and Iran showcases how a government can use education and incentives to lower its population’s birthrate. And the percentage of children attending regular schooling is rising worldwide.

In a brief 210 pages, Brown compellingly describes a wide array of looming problems and then spells out how the world can fix them. All readers who are concerned for human health and the planet’s health may take great interest in what he has to say.

Editor’s note: An excerpt from World on the Edge is scheduled for the July-August 2011 issue of THE FUTURIST.

It Pays to Share

The Mesh: Why the Future of Business Is Sharing

A new business model is emerging based on sharing rather than selling and owning. Entrepreneur Lisa Gansky calls the new model the Mesh and reports that a variety of new businesses are using it to become far more responsive to their customers’ wants and needs.

Mesh businesses rely heavily on social media, online marketing, and word-of-mouth recommendations to gain new customers, interact with them, and deliver to them extra-personalized services at far lower economic and environmental costs.

Gansky profiles dozens of Mesh businesses and describes the strategies that most often help them succeed. Some Mesh businesses rent products: Netflix lends movies, for example, and Zipcar offers cars that customers borrow and drive on an as-needed basis. Others sell wares that they produce in close collaboration with their customers: Chocolate merchant TCHO rolls out new flavors in as little as 36 hours by continuously testing “beta versions” of recipes on customers.

Aspiring entrepreneurs should take great interest in The Mesh. As Gansky notes, all those who have products that their communities would enjoy sharing could launch successful Mesh businesses.

Public-Service Futures

Jobs That Matter: Find a Stable, Fulfilling Career in Public Service

Even in slow job markets, those who use the best job-search strategies will find many opportunities for rewarding careers in public service, says career advisor Heather Krasna in Jobs That Matter.

“Public service” jobs are ones whose main objective is solving societal problems, rather than earning profits or promoting an association’s members’ interests. Public-service opportunities abound in government, the nonprofit sector, and the growing field of social entrepreneurship. Krasna gives readers a detailed breakdown of dozens of job categories, projections of their future hiring rates, and resources for finding jobs in each field.

Krasna projects rapid increases in hiring for many types of public-service jobs, such as social work, public relations, human resources, and epidemiology. Career opportunities in public works—including urban and regional planning, civil engineering, and water treatment—are also set to increase at rapid rates. And although print journalism jobs are disappearing, the future looks promising for digital media strategists who know how to utilize social media to launch effective viral marketing campaigns.

Jobs That Matter thoroughly assesses the job market and what it will offer in the years ahead for job seekers who want to use their skills to serve others. With the book’s consumer-oriented focus, job seekers will find it very approachable and useful.

Reality Check for Virtual Living

Virtually You: The Dangerous Powers of the E-Personality

The Internet has a propensity for bringing out reckless, cruel, and sometimes psychopathological behaviors in people who are normally rational and stable, says psychiatrist Elias Aboujaoude in Virtually You. Citing clinical surveys and a series of patients whom he personally treated for Internet-related behavioral disorders, he describes how the seeming unreality of cyberspace can lead Internet users to say or do regrettable things online and wreak real damage on their careers, relationships, and health.

We do not think, talk, or behave online as we would in everyday life, he explains. On the Internet, our personalities become “e-personalities”: more impulsive, more ambitious, and less restrained by common sense and personal responsibility.

Web users who are disciplined, rational, and polite in everyday life are known to fire off brusque e-mails that offend colleagues or co-workers, shop or gamble compulsively in online retail outlets and casinos, or create online profiles that brim with uncharacteristic bravado and overconfidence. And many young people are unable to pay attention to anything in everyday life for more than a few minutes at a time because Web surfing has atrophied their attention spans.

Virtually You is a reality check on the Internet’s power to enrich life and, conversely, impoverish it. Readers will find a thorough, firsthand account of the destructive side of Internet use and a challenge to reevaluate who they are on and off the Web.

Sustainability’s Dividends

Climate Capitalism: Capitalism in the Age of Climate Change

A business that lowers its fossil-fuel use is not only benefiting the planet’s health; it is also increasing its own profitability, argue L. Hunter Lovins and Boyd Cohen. In Climate Capitalism, they demonstrate how businesses in a variety of industries are adapting to the recession by adopting policies of environmental sustainability.

It’s no fluke that Toyota and Volkswagen became the world’s largest car companies in recent years by marketing fuel-efficient cars, according to the authors. Nor is it too surprising that General Motors regained solvency after its 2009 bankruptcy by selling hybrid cars. Companies are increasingly recognizing that wasting energy and materials is a high-risk strategy, while building environmental sustainability into their business models creates jobs. They also increasingly view corporate environmental responsibility as the most promising path toward improving performance, government relations, brand reputation, and supply-chain management.

Lovins and Cohen profile major companies, such as Google and Walmart, that are embracing environmentally friendly innovations. They also profile the fast growth of new alternative-energy markets, green venture capital, and energy-efficient building design.

Climate Capitalism portrays a hopeful, sustainable future for global commerce: Even if some business leaders used to think that their profit margins and the environment’s health were mutually exclusive, they will very likely think otherwise in the years ahead. Market watchers, environmental advocates, and general readers of all kinds will find in Climate Capitalism a compelling counterweight to business as usual.

Tapping the Fountain of Entrepreneurial Youth

Young World Rising: How Youth Technology and Entrepreneurship are Changing the World from the Bottom Up (Microsoft Executive Leadership Series)

Geography and income may separate the young people of developing and industrialized countries, but digital technology is a powerful common ground, according to tech entrepreneur Rob Salkowitz. In Young World Rising, he describes the spread of digital communications technology among developing nations and the new opportunities that it creates for disadvantaged young people to patent new products and launch new businesses.

With Internet access, business-minded youths anywhere can more easily study markets, acquire training, and connect to people and resources. Civic-minded young professionals employ digital systems to make government agencies more effective and root out corruption. Youths create new software programs at low cost by “open-sourcing” their development. And young entrepreneurs start tech companies that are hugely profitable while embodying sustainability and investing back into their communities.

Since many developing nations’ populations are composed disproportionately of people younger than 30, young “consumer entrepreneurs” have vast potential to raise developing-world standards of living. It is not certain that they will succeed: Troubled economies, unstable governments, and blowback from established business interests all threaten their success. But if this young entrepreneurial wave navigates the challenges, it could make profound and lasting impacts on the global marketplace.

Young World Rising tells of the vast changes that young people could bring to economies everywhere. It is well suited for public policy analysts, global development advocates, and all who are interested in how developing nations might attain greater prosperity and influence in the twenty-first century.

When the Oil Wells Run Dry

Life Without Oil: Why We Must Shift to a New Energy Future

Petroleum enabled the world population to reach its present-day total of 7 billion, argue environmental scientist Steve Hallett and journalist John Wright. They doubt that this population can sustain itself once oil supplies run low this century.

Credible evidence suggests that we have already entered the era of peak oil—the point at which global oil production reaches its maximum and supplies steadily shrink thereafter. Oil yields have been declining in the United States, Venezuela, and most other major producing nations, and most large oil companies have been reducing their investments in exploring for new reserves.

The oil companies are planning for a future beyond oil, and the rest of us would be wise to do so as well, Hallett and Wright warn. The authors expect oil supplies to be noticeably depleted as soon as 2015. Vast disruptions in modern life will follow.

Nations will rush for coal, the easiest substitute for oil, and greenhouse gas emissions will accelerate. Astronomical spikes in energy prices will set in. Russia, which possesses some of the largest remaining oil reserves on earth, will flourish, but the United States, India, and most other countries will be at risk of dramatic economic contractions. The Middle East will descend deeper into violence as national governments clash for remaining reserves.

Global hardship is inevitable, Hallett and Wright conclude. Alternative energy and ecosystem conservation will not save us from it, even though they are both necessary. The world will essentially have to rebuild itself into a new civilization that exists within nature’s limits.

Life Without Oil is a grim forecast that is sure to encourage deep thinking and debate about human society’s future. It may resonate with conscientious economists, environmentalists, and public policy analysts.

Tomorrow in Brief

  • Recycled Heat
  • Greener Architecture with Bio-Buildings
  • Anti-Stress Devices
  • Brain Pacemaker
  • Curious Case of Contagious Cancer

Recycled Heat

Personal energy self-sufficiency is coming closer to reality as micro-scale systems allow homes and small buildings to recycle their own waste heat.

Combined heat and power (CHP) systems capture heat that would otherwise be wasted by space or water heaters and convert it to electricity. The technology could potentially cut carbon-dioxide emissions by up to 30%, according to the U.S. Environmental Protection Agency.

Large applications of CHP systems have been in use for many years, but only recently have they been scaled down to sizes suitable for residential or small-business use.

Source: U.S. Environmental Protection Agency Energy Star Emerging Technology Award, www.energystar.gov/emergingtech.

Greener Architecture With Bio-Buildings

Future buildings, from homes to skyscrapers, may be more responsive to fluctuations in the surrounding climate, improving their resource efficiency, thanks to architectural research under way at the University of Greenwich.

“Protocell cladding” using bioluminescent bacteria or other materials would be applied on building facades to collect water and sunlight, helping to cool the interiors and produce biofuels. The protocells are made from oil droplets in water, which allow soluble chemicals to be exchanged between the drops and their surroundings.

“The big drive in the construction industry in the next growth period is going to revolve all around sustainability and ecological planning,” says Neil Spiller, head of the university’s School for Architecture and Construction.

Sources: University of Greenwich School of Architecture and Construction, www.gre.ac.uk/schools/arc. British Council, www.britishcouncil.org.

Anti-Stress Devices

Pens, steering wheels, and other products we handle daily could one day actively reduce our stress.

People tend to play with their pens when they are nervous, so Delft University of Technology researcher Miguel Bruns Alonso developed a pen that senses this fidgety habit. To remind the worrier to calm down, the pen’s built-in electronics and electromagnets provide a counterweight to these movements.

Applied to steering wheels in automobiles, the stress-sensing and counterforce system could help reduce aggressive driving, Bruns believes.

Source: Delft University of Technology, http://home.tudelft.nl/en/.

Brain Pacemaker

Targeted stimulation of areas of the brain could provide relief for patients whose severe depression is unresponsive to other treatments.

A tiny “brain pacemaker” is implanted under the patient’s clavicle, allowing doctors to control electrodes implanted in the brain.

The technique, developed by physicians at the University of Bonn and colleagues in the United States, was devised for Parkinson’s patients but now raises hopes for significantly improving conditions for the severely depressed.

Source: University of Bonn, www.uni-bonn.de.

Curious Case of Contagious Cancer

An unusual form of cancer that can be transmitted between individuals has been observed among dogs, wolves, and coyotes. Understanding the phenomenon may help advance techniques for stopping the progression of cancer in other species.

The canine transmissible venereal tumor, spread by licking, biting, or sniffing tumor-infected areas, survives through a process of stealing mitochondria from the host animal.

The research is being conducted at Imperial College London, supported by the U.K. Natural Environment Research Council.

Source: Imperial College London, www.ic.ac.uk.

Daniel Bell and the Post-Industrial Society

By Edward Cornish

The late sociologist was best known for defining and describing the new era and social realities that information technologies were helping to create in the twentieth century.

Daniel Bell, who died January 26, 2011, at the age of 91, left a lasting legacy of imposing books analyzing the economic and social trends that have shaped and now are reshaping American society.

Bell was born on Manhattan’s Lower East Side in 1919. His parents were Polish Jewish garment workers and, until the age of six, Bell spoke only Yiddish. By the time he was 13, however, he had no difficulty reading and speaking English.

Intensely interested in socialist ideals, Bell joined the Young People’s Socialist League but soon became critical of the ideological dogmas he found among its members.

At the age of 19, he graduated from the City College of New York and began writing regularly for the liberal weekly The New Leader. Later, he became the labor editor of Fortune magazine after writing a memorandum on labor-management relations that impressed the editors. He went on to write a monthly column for Fortune but maintained his association with the academic community as a lecturer in sociology at Columbia University and, later, at the University of Chicago.

Bell’s reputation as a social thinker grew with the publication, in 1960, of his book The End of Ideology, which argued that U.S. society had passed through its ideological phase, having outgrown the need for simple rubrics to describe and justify public conduct. Ideologies, Bell decided, offer attractive but often unworkable solutions for human problems.

The End of Ideology won high praise from reviewers like political scientist Andrew Hacker, who said Bell “clearly ranks among the outstanding essayists of our generation.” Hacker added:

There is a sense of relief in being able to discuss Medicare or civil rights or the anti-trust laws without having to cope with the specter of Socialism, Wall Street, or Mongrelization. Not only have intellectuals and politicians thrown aside the prisms that once clouded their eyes, but the general public too is increasingly suspicious of catchalls and catchphrases.

In 1965, Bell became chairman of the Commission on the Year 2000, organized by the American Academy of Arts and Sciences in Boston. The Commission brought together a stellar group of thinkers, including Daniel P. Moynihan, Karl Deutsch, James Q. Wilson, Erik Erikson, and Samuel P. Huntington, to think about the future of America and the world.

The work of the Commission was summarized in a volume edited by Bell, Toward the Year 2000: Work in Progress (Houghton-Mifflin, 1968).

Bell’s masterwork The Coming of Post-Industrial Society (Basic Books, 1973) noted that, in the nineteenth century, America shifted from an agricultural economy to an industrial economy as workers abandoned farming for better-paying jobs in manufacturing. Then in the twentieth century, increasing efficiency in manufacturing led to such a sharp decline in industrial jobs that the United States could no longer be classed as an industrial society.

But if America is not an industrial society, what is it?

Various thinkers have suggested that today’s U.S. economy might be described as a “service society,” an “information society,” or a “cybersociety,” but Bell felt that the basic character of today’s society is uncertain. Preferring to be cautious, Bell insisted on calling it simply a “post-industrial society,” and he went on to cogently describe some of its problems.

First, he said, social problems are now national in scope due to the revolution in communications and transportation, such as the rise of newsweeklies, jet transportation, and, more recently, the World Wide Web.

Second, America’s present administrative structure is inadequate. The United States is still composed of quasi-sovereign states, each with tax powers resting on varied and often inadequate tax bases.

“What is the rationale for the present crazy-quilt pattern of townships, municipalities, counties, and cities, plus the multifarious health, park, sewage, and water districts?” he wrote. The functioning of the U.S. government “is largely out of step with the needs of the times.”

Third, the rise of plebiscitary politics poses a serious challenge. The ease with which tens and even hundreds of thousands of people can pour into Washington, D.C., within a 48-hour period “makes the national capital a cockpit for mobilization pressures in a way this society has never before experienced.”

Fourth, capitalism is threatened by its “cultural contradictions.” Bell worried that capitalism may destroy itself if the polarities between its affective (emotional) and rational elements are not reconciled.

Capitalism as an economic system requires ever greater applications of rationality to solve problems of organization and efficiency, and to find the right balance between cost and benefit. On the other hand, capitalist culture places an ever-greater emphasis on such values as self-fulfillment and personal gratification. As these two tendencies grow stronger, the rift between them widens. In other words, capitalism demands a strong work ethic for efficient production—that is, hard work and an emphasis on saving for the future—plus a fun ethic to ensure robust consumption.

In short, Bell argued, modern society can best be thought of as “an uneasy amalgam of three distinct realms: (1) the social structure (principally the techno-economic order), (2) the polity or political system, and (3) the culture.” The three realms are ruled by contrary principles: efficiency, equality, and self-gratification.

About the Author

Edward Cornish is the founding editor of THE FUTURIST.

Future Scope

  • E-Books Will Replace Textbooks
  • Global Model Forecasts Civil Unrest
  • Teens Trust Parents More Than the Internet
  • Threats to (and from) Sharks
  • WordBuzz: Hetail

E-Books Will Replace Textbooks

Education

Students’ book bags will soon be considerably lighter, as e-books replace traditional textbooks within three years, predicts the New Media Consortium, an international nonprofit organization exploring media technologies.

Other advantages of the technology—ease of updating and sharing material, support for multimedia experiences, and online access—give learners powerful incentives to push textbook publishers to move more quickly to e-books.

In addition, augmented reality with computer-generated imagery will be common on university campuses within two to three years, and gesture-based interfaces for computing will arrive within five years, according to the Consortium’s Horizon Report 2011.

Source: New Media Consortium, www.nmc.org.

Global Model Forecasts Civil Unrest

A government’s coerciveness, its capacity to protect itself, and its citizens’ ability to mobilize against it are the three key factors in determining how vulnerable a country may be to civic violence.

The Domestic Political Violence Forecasting Model, developed by a team of political scientists from Kansas State and Binghamton universities, has already successfully predicted unrest in Tunisia, Peru, Ecuador, Ireland, and Italy.

The model shows that outbreaks of violence are not limited to repressive states, but can also predictably occur in Western democracies. The researchers warn that responding to unrest with crackdowns on human rights tends to fuel insurgency rather than suppress it.
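
The news item does not spell out how the model combines these three factors. Purely as a toy sketch in Python—with invented weights, not the researchers’ actual model—the stated directions of effect might be encoded like this:

```python
# Toy sketch only: invented weights, NOT the Domestic Political
# Violence Forecasting Model. It merely encodes the directions
# stated above: coerciveness and citizen mobilization raise
# vulnerability; the state's protective capacity lowers it.
def unrest_vulnerability(coerciveness: float,
                         protective_capacity: float,
                         mobilization: float) -> float:
    """All inputs on a 0-1 scale; returns a 0-1 vulnerability score."""
    return (0.4 * coerciveness
            + 0.4 * mobilization
            + 0.2 * (1.0 - protective_capacity))

# A repressive state with weak defenses and a mobilized public
# scores as highly vulnerable in this toy scheme.
print(unrest_vulnerability(0.9, 0.2, 0.8))  # 0.84
```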

Sources: Kansas State University, www.k-state.edu. A list of the top 37 countries projected to experience civil unrest through 2014 is available at the Domestic Political Violence Forecasting Model, http://radicalism.milcord.com/blog.

Teens Trust Parents More Than the Internet

Teenagers may seem to spend most of their lives on the Internet, but when they need answers to intimate questions about sexuality and health, they are more likely to seek information from parents and other people they trust.

According to a study by the Guttmacher Institute, high-school students are wary of Internet sources because they know that much of the content is user generated and thus likely to be incorrect. Also, having to sort through a sea of prurient content to find useful information may be another deterrent to many young users.

However, some Internet sites do offer a great deal of accurate, useful, teen-friendly information about sexuality. Capitalizing on teens’ trust in their parents and schools could help these sites bridge the information gap, the study’s authors conclude.

Source: Guttmacher Institute, www.guttmacher.org.

Threats to (and from) Sharks

If sharks wrote the news, the headlines would be apocalyptic. Sharks can claim only a handful of human fatalities a year (just six deaths from 79 attacks worldwide in 2010), while humans kill between 30 million and 70 million sharks a year in fisheries. Thirty percent of all shark species are now threatened or near threatened with extinction.

As human populations increase and more people enjoy recreation in sharks’ habitats, the number of shark-on-human attacks is likely to continue to increase, says University of Florida ichthyologist George Burgess.

But another consequence of rising human populations is demand for fish. Sharks seeking their next meal are lured into growing numbers of fishing lines, Burgess explains.

Sharks are also directly sought for their fins, used in popular East Asian dishes. Only 13 of the top 20 shark-catching nations have developed protection plans, according to the Pew Charitable Trusts.

Sources: University of Florida, www.ufl.edu. Pew Charitable Trusts, www.pewtrusts.org.

WordBuzz: Hetail

The metrosexuals have grown up, settled down, and gotten serious about their manly duties, which increasingly include shopping for things besides tools at Home Depot. Men want to feel cool when they shop, and they need the experience to be convenient, whether online or in stores.

Hetail—marketing to mainstream male consumers—involves understanding what appeals to them and curating their experience, writes Euro RSCG Worldwide PR blogger Karina Meckel. One-stop shopping and an atmosphere appealing to a specific aesthetic (academic, sporty, nostalgic, casual, or rock star, for example) are ways to please the masculine shopper.

Source: Euro RSCG Worldwide PR, http://eurorscgpr.com.

Future Active

  • Workshop Targets Domestic Violence in Uganda
  • The New “Peace Building”
  • Asia’s Next 50 Years

Workshop Targets Domestic Violence in Uganda

Domestic violence is a growing issue in Uganda, particularly in rural areas, where there tends to be greater poverty and less access to quality education.

At a three-day conference convened by the United Nations Children’s Fund (UNICEF) in Uganda, policy makers, community leaders, and concerned citizens came together to find ways to reduce domestic violence in the African nation. The Future Search Workshop on Violence Against Children and Women utilized a community-oriented futuring method intended to prompt fast action on pressing issues.

Developed by Marvin Weisbord and Sandra Janoff, a Future Search workshop is a highly interactive planning meeting that facilitates dialogue among people who often differ in opinions and backgrounds. It gathers a wide cross-section of people over a three-day period—including people who can make change happen as well as those who express the need for change—and enables them to cooperatively plan for the future.

Future Search workshops are divided into three parts. The first part focuses on reexamining the past and creating timelines that reflect the history of the issue in question (in this case, domestic abuse). Moving on to the present, the next step is to identify and analyze key trends, see where they may be heading, and then brainstorm what can and should be done in order to move them in the right direction. The last step is to determine plans of action in order to arrive at the most desired future.

It is here, in the third part of the workshop, that participants begin building scenarios and describing their ideal future. Proposed action plans presented on the third day in Uganda included training police officers in child and family protection services.

UNICEF-Uganda previously used the Future Search method to find ways to improve life in Uganda’s most poverty-stricken region, Karamoja. UNICEF has conducted future searches in many other countries around the world, including Iran, Bangladesh, Kenya, and Indonesia.

Sources: UNICEF, www.unicef.org. Future Search Network, www.futuresearch.net.

The New "Peace Building"

The new headquarters of the United States Institute of Peace

If the Washington, D.C., skyline seems a little more peaceful these days, there is a reason.

In March 2011, the United States Institute of Peace began moving into a new, $186-million headquarters located on the National Mall. The five-story building faces the Lincoln Memorial and is located near both the Korean War and Vietnam War memorials. Constructed on top of an old parking lot, it incorporates sustainable building methods and is conceptual in design: The translucent white glass rooftop is intended to evoke the undulating white wing of a dove of peace.

Visiting members of the public will be able to view office work taking place through floor-to-ceiling glass windows that open onto the Great Hall inside.

“The design of the new building embodies the open, transparent, and inclusionary nature of peacebuilding,” says USIP President Richard H. Solomon.

Boasting a state-of-the-art workspace, the building will also be home to the Global Peacebuilding Center, an interactive public education center geared especially toward students and young people. A rotating series of exhibitions will raise awareness of international issues and introduce viewers to various methods of preventing, analyzing, managing, and resolving conflicts. Exhibits and activities will include “an immersion theater [that] will put visitors ‘on the ground,’ transporting them from the Global Peacebuilding Center to, for example, the Cambodian killing fields,” according to the USIP’s Web site.

The building was designed by Moshe Safdie, a Boston-based Israeli architect whose many groundbreaking designs include the Yad Vashem Holocaust Museum in Jerusalem and the Khalsa Heritage Memorial Complex in Punjab, India (a museum dedicated to preserving the history and culture of the Sikh people).

Established by Congress in 1984, the USIP takes a multidisciplinary approach to conflict prevention, conflict resolution, and peacebuilding. THE FUTURIST covered its birth from idea to Act in the early 1980s; then-Senator Spark Matsunaga, who played a key role in founding the USIP, described it in the magazine’s February 1985 cover story, “An Academy of Peace: Training for a Peaceful Future.” He noted that the idea for an academy to train Americans in peaceful resolution of conflict had been around since the aftermath of the Revolutionary War:

In first introducing legislation more than two decades ago to establish a U.S. Academy of Peace, it was my intention that the academy should train the best and brightest of America’s youth to undertake the waging of peace. … Peacemaking represents a growing body of knowledge drawn from diverse disciplines and honed to professional skills in conflict resolution techniques. It is a dynamic function, not a passive or static condition, utilizing the same human energy we observe under conditions of war, but applied to more humane ends.

The new “peace building” will open to the public in September 2011.

Sources: United States Institute of Peace, www.usip.org. Safdie Architects, www.msafdie.com.

Asia's Next 50 Years

Attendees of Global Transitions and Asia 2060
Vahid Motlagh listening to speaker K. V. Kesavan

A select group of academics, politicians, and NGO representatives from across Asia and beyond gathered together in November 2010 to project what the next 50 years may hold in store for Asia as economic and political power shifts eastward.

The invitation-only conference, entitled “Global Transitions and Asia 2060: Climate, Political-Economy, and Identity,” examined possible long-term futures of the continent. Hosted by Tamkang University’s Graduate Institute of Future Studies in Taipei, Taiwan, and co-sponsored by Korea’s Kyung Hee University and the United States–based Foundation for the Future, the three-day workshop took an interdisciplinary approach to problem solving.

Much of the conversation was geared toward developing a long-term policy perspective across many sectors, with particular focus on three core issues: climate change and a shift to renewable energy, the transformation of national and regional identities across Asia, and the possible creation of a politically and economically unified Asia—in other words, an Asian Union similar to the European Union.

WFS member Vahid Motlagh, the founder and editor of Vahid Think Tank and co-author of several award-winning futures studies books in Farsi, was among the speakers who addressed the topic of changing identities. His presentation, entitled “Multiple Longer-Term Futures of Asia,” in part examined the possible impacts of breakthroughs in artificial intelligence, genetics, and biotechnology. He argued that Eastern cultures are more likely than Western cultures to accept the “benefits” of these breakthroughs (such as gene therapy, designer babies, and human cloning).

A number of speakers stressed the need for long-term economic planning with an emphasis on protecting the environment. They delved into such issues as environmental education in Korea, a switch to renewables in Oman, and freshwater scarcity and desertification in China. Economic growth and environmental sustainability go hand-in-hand, noted Kyung Hee University chemistry professor Young Sik Lee.

The likelihood that an Asian Union will emerge, with shared values as well as shared currency, seems slim, but nevertheless the scenario offers an intriguing “what-if” possibility and an avenue toward increased regional cooperation and security.

Toward the end of the conference, participants engaged in breakout sessions, dubbed “fishbowl conversations.” In small groups, they built 50-year scenarios, ranging from best case to worst case, and brainstormed ways to successfully bring about the most desirable future for Asia.

Sources: Foundation for the Future, www.futurefoundation.org. Vahid Think Tank, www.vahidthinktank.com.

Computers Making the Quantum Leap

One branch of physics holds huge implications for information technologies.

Quantum computational devices with calculating power greater than any of today’s conventional computers could be just a decade away, says Bristol University physicist and electrical engineer Mark Thompson. He anticipates accelerated research and development breakthroughs in many fields of science, thanks to quantum computing.

At a January 2011 Cambridge University forum, Thompson presented two Bristol-developed quantum photonic computer chips, which process photons (particles of light). One chip used a quantum algorithm to find the prime factors of 15. Thompson notes that factoring large numbers is hard for conventional computers but would be relatively easy for quantum computers.

With further development, quantum processing could create powerful simulation tools for modeling many natural processes, such as superconductivity and photosynthesis. Quantum computers might also model molecular and subatomic systems with greater precision than today’s computers can.

“We plan to perform calculations that are exponentially more complex, and will pave the way to quantum computers that will help us understand the most complex scientific problems,” says Thompson.

A conventional computer stores information in bits, each bit either a 0 or 1. A quantum computer would store information in “qubits,” and each qubit could be both 1 and 0 at the same time. David Lee Hayes, a researcher at the University of Maryland’s Joint Quantum Institute, explains that a particle in a quantum state is in “superposition”: It can be in more than one place at the same time. It assumes one location, however, once someone observes it.
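
To make the bit-versus-qubit distinction concrete, here is a minimal classical simulation of that bookkeeping—an illustrative sketch of our own in Python with numpy, not code from any of the labs mentioned:

```python
# Illustrative sketch: simulating qubit amplitudes classically.
import numpy as np

# A classical bit is 0 or 1. A qubit's state is two complex
# "amplitudes"; squaring their magnitudes gives the probability of
# reading 0 or 1 when the qubit is observed.
zero = np.array([1, 0], dtype=complex)            # the |0> state

# The Hadamard gate rotates |0> into an equal superposition.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)
qubit = H @ zero
print(np.abs(qubit) ** 2)      # [0.5 0.5]: 0 and 1 equally likely

# A register of n qubits carries 2**n amplitudes at once--the source
# of the "many problems at once" claim below.
register = np.kron(np.kron(qubit, qubit), qubit)  # three qubits
print(np.abs(register) ** 2)   # eight outcomes, each with probability 1/8
```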

“You can think of the observer as getting entangled with the quantum bit in a weird way,” says Hayes.

Entanglement, another property of quantum particles, links two particles so that measuring one instantly determines the corresponding properties of the other, no matter how far apart they are.

Since qubits can hold more than one value at once, a quantum computer could work on many more problems at once, according to Carl Williams, chief of the Atomic Physics Division at the U.S. National Institute of Standards and Technology.

Such a computer would be a powerful tool for pharmaceutical developers, says Williams. Drug researchers now use conventional computers to model the human body’s chemical systems and project how certain chemical compounds might interact with it. The models guide the researchers’ synthesis of experimental new drugs.

The modeling processes involve millions of calculations. A quantum computer might complete the same calculations much more quickly and speed up drug development.

“Our time scale for developing new drugs would become cheaper and faster,” says Williams. “Researchers would only have to synthesize those things that are going to work.”

The quest to build a quantum computer is becoming a race, according to Martin Rotteler, head of the quantum computing research group at NEC Laboratories. He says that NEC has built a quantum computing device that has two qubits of memory, but other labs have built devices with three qubits of memory, and someone may build a four- or five-qubit device in another three to five years.

Rotteler says that quantum computers would be optimal for problems with a lot of structure, such as graphs. They could also model magnetic fields, protein folding, and other natural systems at magnitudes of detail that are impossible today.

Building a quantum computer will require more efficient ways of controlling quantum phenomena, according to Williams. Quantum particles can easily entangle with particles they are not supposed to entangle with, or interact with each other in ways that the researchers do not intend.

Also, creating qubits and photons requires massive system components. But just as the first conventional computers filled entire rooms and were later replaced by progressively more-compact successors, quantum computing could evolve into smaller and cheaper systems.

“Build the first one,” says Williams, “and in 25 years, they will be 25% of the size. I bet that, after the first quantum computer, the cost of one 10 years later will be significantly reduced.”—Rick Docksai

Sources: David Lee Hayes, University of Maryland Joint Quantum Institute, http://jqi.umd.edu.

Martin Rotteler, NEC Labs, www.nec-labs.com.

Mark Thompson, Bristol University, www.bris.ac.uk.

Carl Williams, NIST, www.nist.gov.

Holographic Videoconferencing

The next breakthrough in digital communications may be 3-D and 360-degree.

Imagine having a long-distance conversation with a colleague who, to your eyes and ears, appears to be right in front of you. Such 3-D telepresence has moved closer to reality, thanks to research at the University of Arizona supported by the National Science Foundation.

The system the researchers are working on features a holographic video display that refreshes every two seconds. That two-second refresh rate represents a huge step up—a 120-fold improvement—from where the technology was a couple of years ago, when the display refreshed once every four minutes.

A three-dimensional image of a moving person or object, with 360-degree viewing capability, projected from afar in something approximating real time, could represent a major breakthrough in communications technology. Unlike depictions of holograms in popular science-fiction movies, however, the images are not projected into empty space but onto a transparent sheet of plastic—a key part of the process.

“The heart of the system is a new plastic material that we have come up with which we call … a photorefractive polymer,” says Nasser Peyghambarian, project leader and chair of photonics and lasers at the University of Arizona. Peyghambarian is also the director of the National Science Foundation’s Engineering Research Center for Integrated Access Networks.

As new images are “written” on the polymer screens, old ones are erased. The material is also able to store the projected images, and, unlike face-to-face conversations, there is a pause button. Viewers can circle the projection and view it practically in its entirety, which results in a more realistic simulation.

The process begins with 16 computer-controlled cameras arranged in a semicircle around the person or object, taking two-dimensional pictures from different angles simultaneously. “The 16 views are processed into hogel data by the host computer and sent to the holographic recording controller through an Ethernet link,” Peyghambarian explains. Hogel is a nickname for holographic pixel; hogels are the 3-D version of pixels.

When the recording has been sent, a pulsed laser inscribes the images into the polymer screen. “Once a hologram has been written, the system uses the next available hogels to update the information. The hologram is displayed using a color LED that gets scattered off the image to the viewer’s eyes,” Peyghambarian adds. This optical effect renders the 3-D image perceptible to the naked eye, no special glasses required.
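
Expressed as code, the capture-to-display loop might look roughly like the sketch below. Every name in it is our own invention for illustration; the team’s actual software interface is not described here.

```python
# Hypothetical sketch of the telepresence pipeline; all names invented.
from dataclasses import dataclass
from typing import List

@dataclass
class View:
    camera_id: int
    pixels: bytes                  # one 2-D image from one angle

def capture_views(n_cameras: int = 16) -> List[View]:
    """Sixteen cameras in a semicircle shoot simultaneously."""
    return [View(i, b"") for i in range(n_cameras)]

def views_to_hogels(views: List[View]) -> List[bytes]:
    """The host computer processes the 16 views into hogel
    (holographic pixel) data; stubbed here."""
    return [v.pixels for v in views]

def send_to_recorder(hogels: List[bytes]) -> None:
    """Hogel data goes over an Ethernet link to the holographic
    recording controller (stub)."""

def write_and_display(hogels: List[bytes]) -> None:
    """A pulsed laser writes the hogels into the photorefractive
    polymer, erasing the previous frame; an LED then scatters light
    off the recorded image toward the viewer (stub)."""

# The loop currently completes about every two seconds; the design
# goal is full-motion video at 30 frames per second.
for _frame in range(3):
    hogels = views_to_hogels(capture_views())
    send_to_recorder(hogels)
    write_and_display(hogels)
```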

The designers’ main goal is to achieve full-motion video rate—30 frames per second. They point out that other improvements need to be made as well before commercializing the technology. For instance, the color palette is very limited right now (although it is worth noting that adding color into the process doesn’t slow down the refresh rate at all). Size also presents a challenge—the maximum projection size is currently 17 inches, but the design goal is to increase that to encompass at least the average size of a person. The resolution of the projection and sensitivity of the materials need improvement as well, and the research team is working on ensuring that the optics can competently handle indoor low-light settings.

Many other important uses for the technology exist besides holding long-distance business meetings, say the researchers. These uses include digital design and engineering, and telemedicine for complex surgical procedures. Such a telepresence system would also improve 3-D printing capabilities, better enable 3-D mapping, and enhance entertainment experiences.

Affordable large-scale holographic projections may still be a long way off; however, they are moving closer to becoming a reality.—Aaron M. Cohen

Sources: The National Science Foundation, www.nsf.gov.

Nasser Peyghambarian, University of Arizona (email interview).

Reversing the Mexican “Brain Drain”

By Concepción Olavarrieta

Investing in more tech opportunities may lure the best and brightest back home.

After investing more than a billion dollars (or 25% of the Ministry for Education’s budget) in postgraduate studies for young students abroad, Mexico is looking for a return on that investment—literally. Many of those students never come back to Mexico once their studies are completed. Their reasons for remaining abroad include superior wages and salaries; the ability to work in research centers, offices, and labs equipped with the latest technologies; and the opportunity to be involved in cutting-edge research projects.

Of these former students, 66% reside in the United States, 26% in Europe, and the rest in Canada and elsewhere. Half of the 5,000 scientists who did not return to Mexico obtained PhDs, and some went on to obtain postdoctoral positions. An estimated 575,000 Mexican professionals and academics now live and work in the United States and Europe, and this number is growing. Every year, 20,000 highly educated Mexicans search for better working conditions outside Mexico. Most of them ultimately get hired.

This brain drain has policy repercussions as far as investment in higher education is concerned, but, more importantly, it signifies an irreplaceable drain of human resources, the retention of which is vital for the country’s development. For every five Mexicans with master’s degrees and every three with PhDs working in Mexico, there is one with an equivalent degree working in the United States.

Both public and private investment in science and technology research and development is needed in order to attract and retain these “brains.” However, that investment is currently perilously low.

In 2010, the amount set aside in the Mexican government’s budget for research and development represented 0.4% of the GDP, while the contribution from the private sector was 0.1%. Together, this amounted to a mere 0.5% of Mexico’s GDP, placing the country last among members of the Organization for Economic Cooperation and Development, which recommends that developed countries devote 4% of the GDP to R&D.

Not surprisingly, The OECD Reviews of Innovation Policy: Mexico (2009) recommends that the government increase public spending on science and technology. It adds that, given the current global crisis and economic recovery, there are two fundamental issues to which the Mexican government should give priority.

First, the government should mitigate the negative impact of the world’s financial crisis on the actors involved in innovation. Continuous support by the National Council on Science and Technology (CONACYT) and the Ministry of Economy is critical for maintaining research and development as well as preserving long-term projects in the public sector and in partnerships between the public and private sectors.

Second, it should view the innovation process as a key component of a green recovery program. Green technologies, green jobs, and innovation and investment in renewable energy will drive future growth.

Moreover, the OECD has proposed that the Mexican government create a Ministry of Science.

Currently, Mexico’s National Researchers Program is intended to abate the brain drain. It offers researchers and academics the ability to earn an annual tax-free bonus calculated on the basis of individual performance. The OECD recommends that the bonuses be incorporated into the regular salaries of all 15,000 participants in this program. Within the criteria used for assessing the performance of researchers, the organization advocates that more credit be given to collective work and research carried out by international teams and networks as well as in university research institutes. The OECD argues that these steps will enable the National Researchers Program—which consumes a third of CONACYT’s budget—to fulfill its aims.

The Mexican government has taken steps toward implementing these recommendations, including the creation of an Innovation Stimulus Program and a Sector Funds program for monitoring and evaluating scientific, technological, and innovation activities. There are also plans to invest more in graduate education programs in Mexico.

These funds channel investment into certain sectors of the economy—such as alternative energy, information technologies, and poverty alleviation—fostering greater investment in science and technology. By the end of 2009, there were 20 such funds with federal support, and contributions exceeded $2 billion.

Mexico has also become the leading promoter of the Latin America and Caribbean Innovation Network. This network aims to further the exchange of ideas about how innovation policies can be evaluated, and to identify common challenges and effective policy responses that will inform the strategic analytical framework that the OECD will soon launch.

Further enhancing science and technology opportunities at home are programs such as the Institute of Mexicans Abroad, the Mexican USA Foundation for Science, and CONACYT, which have all been promoting the Mexican Talent Network. This network encourages liaisons, synergies, business development, and education for global innovation; fosters Mexico’s prestige abroad; supports Mexican communities in other countries; and facilitates a better understanding of Mexicans’ contributions to their adopted countries. These efforts also aim to introduce Mexican technology companies to the world market. There are associations of the Mexican Talent Network in tech hotspots such as Silicon Valley, Houston, Austin, Boston, Los Angeles, and Redmond.

Former NASA astronaut José Hernández, an American of Mexican descent, has predicted that, if Mexico were to invest seriously in space, in five years the Mexican Space Agency could be reaping its first fruits, and within 10 years it could count itself as one of the eight major space agencies in the world. Studies have shown that, for every dollar spent by NASA, it gains six from the technology it develops and commercializes. According to Hernández, Mexico has to use its reservoir of talent at home and abroad in developing such technologies.

With these proposals, the brain drain could be transformed from a net loss into an opportunity for globalization.

Concepción Olavarrieta is the president of the Mexican Node of the Millennium Project.

Dimming the Sun

Humans could reduce Earth’s sunlight intake, but are they playing with fire?

When a volcano erupts and dims the sun with ash-laden clouds, one noteworthy effect is immediate cooling. Can this effect be replicated worldwide—without the ash?

The British government is sponsoring prospective studies of “solar radiation management” procedures to halt global warming by blocking some of the sun’s radiation from reaching Earth. But some of the studies’ researchers are not sure that solar radiation management’s benefits will outweigh its harms.

Solar radiation management would deploy clouds of gas, sulfate aerosols, or water vapor into Earth’s upper atmosphere to reflect some of the sun’s rays back into space. The Stratospheric Particle Injection for Climate Engineering (SPICE) project is assessing candidate substances, how to deploy them, and the likely impacts. SPICE is one of several solar radiation management projects receiving grants from the Engineering and Physical Sciences Research Council, the main UK government agency for funding science and engineering research and training.

Peter Braesicke, a SPICE researcher and Cambridge University atmospheric science professor, suspects that solar radiation management might cause major disruptions in world weather patterns. In a study published in January 2011, he argued that interfering with sunlight could change the “teleconnections” that link Earth’s wind, water, and temperature currents. As he told THE FUTURIST, the consequences could include increased droughts in some parts of the world.

“Circulation regimes like monsoons and associated precipitation patterns are likely to change—and that might mean that some areas will be drier than now,” he says, adding that some countries will suffer more than others. “Regional changes will almost certainly always produce winners and losers.”

Ben Kravitz, a doctoral student in Rutgers University’s Atmospheric Sciences Department, wrote a companion paper to Braesicke’s study. Kravitz evaluated the consequences of using sulfate aerosols to manage solar radiation and concluded that they might reduce summer rainfall in Africa and Asia, thus threatening billions of people’s food supplies.

There are other technical issues, too, as Kravitz explained to THE FUTURIST. First, the sun’s radiation rises and falls over time, so any clouds created for radiation management would have to be adjusted continually.

“This involves changing the amount of aerosols you make, which—assuming everything works as it’s supposed to—would not be particularly hard to do. The most effective climate modification ideas will be adjustable and reversible in a certain time frame,” he says.

A more serious problem, according to Kravitz, is knowing when and how to stop these sun-dimming measures. If engineers halted them too abruptly, the rebounding sunlight would shock Earth’s whole climate system.

“If you stop solar radiation management abruptly, the climate will rebound according to greenhouse gas concentration. Since adaptation to climate change depends upon how long you have to adapt, this rapid change would be disastrous,” he says.

Adapting to climate change includes reducing the emissions of climate-changing pollutants, Kravitz adds. But achieving this could actually be more difficult in a solar radiation-managed world: As Kravitz notes, less sunlight tends to mean less solar energy.

“If we decide to switch to a greener economy and vastly increase the portion of our energy that comes from solar power, solar radiation management could have a huge impact,” says Kravitz, adding that, if solar generators no longer produce as much electricity, people might try to fill the energy void by burning more coal, oil, and natural gas. “If we’re generating less energy from one source, we have to compensate for it from a different source, and that’s likely fossil fuels.”

According to Kravitz, that could defeat solar radiation management’s whole purpose. To cool Earth’s climate and keep it cool, the world needs to let atmospheric greenhouse gases dissipate and not replace them by continuing to emit them in large quantities. Otherwise, solar radiation management’s benefits, if any, will not last.

“The only permanent solution is to stop putting CO2 into the atmosphere,” says Kravitz.—Rick Docksai

Sources: Peter Braesicke, National Centre for Atmospheric Science, Cambridge University, www.atm.ch.cam.ac.uk/.

Ben Kravitz, Rutgers University, http://envsci.rutgers.edu/~benkravitz/.

Predicting Our Own Happiness

Why we’re usually wrong about how we’ll feel in the future.

Will acing an exam truly make you happy? Will the snub of a cute co-worker send you into throes of despair? Maybe not. New research shows that people routinely discount their own personality biases when they envision how happy or sad they will be as a result of changing external circumstances.

Individuals who are naturally pessimistic imagine that they will be far more euphoric as a result of big life events than usually turns out to be the case. Folks who are usually in a great mood underestimate how much happier particular events will make them (which must make for a pleasant surprise later on).

The new study comes from psychological researchers Jordi Quoidbach of the University of Liege, Belgium, and Elizabeth Dunn of the University of British Columbia. To test their hypothesis that both pessimists and optimists tend to incorrectly predict their future happiness, they surveyed a group of college students to determine their base-level personality (from “optimistic” to “neurotic”). The subjects were then asked to imagine how they would feel, on a scale from one to five, if they received a certain grade in a class.

Six weeks later, when grades actually came out, the researchers surveyed the subjects again. They found a wide gap between how the students expected to feel and how they actually felt. But Quoidbach and Dunn did find a close correlation between how the subjects felt earlier and how they felt when they received their grades.
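
The two comparisons are easy to illustrate with a few lines of Python (the numbers below are invented for illustration, not the study’s data):

```python
# Invented numbers, not the study's data: forecasts barely track
# actual feelings, while baseline disposition tracks them closely.
import numpy as np

disposition = np.array([2.0, 2.5, 3.0, 3.5, 4.0, 4.5])  # baseline mood, 1-5
forecast    = np.array([4.5, 2.0, 4.0, 1.5, 3.0, 5.0])  # predicted feeling
actual      = np.array([2.5, 2.5, 3.5, 3.0, 4.0, 4.5])  # feeling at grade time

def corr(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.corrcoef(a, b)[0, 1])

print(f"forecast vs. actual feeling:    {corr(forecast, actual):+.2f}")
print(f"disposition vs. actual feeling: {corr(disposition, actual):+.2f}")
# With these toy numbers the second correlation (about +0.92) dwarfs
# the first (about +0.45), mirroring the "personality neglect" finding.
```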

“Results supported our hypothesis that dispositions would shape participants’ actual feelings but would be largely neglected when people made affective forecasts,” they write.

In a second test, participants (Belgian adults) were asked to describe how happy they would be in the event that Barack Obama won the 2008 U.S. presidential election. After the election was called, the researchers again found that the participants’ actual level of happiness reflected how happy they were when they were asked the question, not how happy they expected to be later.

Why are people so bad at predicting their future happiness levels? The problem may be in the brain. Previous studies have shown that the part of the brain responsible for envisioning future states is the same part tasked with remembering situations we’ve already experienced: the episodic memory center. Neurologically, the act of imagining a scenario is a lot like the act of remembering. But we process thoughts and ideas about our own personalities in a different part of the brain, the semantic memory center, which is tasked with learning and analyzing abstract concepts but not remembering specific events.

“For example, an amnesic patient was able to rate his personality in a highly reliable and consistent manner even though he was unable to recollect a single thing he had ever done,” write the researchers. When we envision the future, we use the part of the brain we use to remember the past, not the part that knows our personality the best. This is why our personal-happiness forecasts are so often off the mark.

Quoidbach and Dunn’s research provides further support for hedonic adaptation, a 40-year-old theory holding that most people have a baseline level of happiness, whether or not they’re aware of it. So while we may experience blips of joy when we rush out to make a big consumer purchase, or bouts of melancholy when we suffer a setback, eventually we return to a default emotional setting.

Quoidbach and Dunn hope their research will help people take their personality into account when making big decisions or forming expectations. “For example, individuals high in dispositional happiness who are planning their next vacation might not need to waste money and effort finding the perfect location (because they will be happy in the end anyway). By contrast, people with less happy dispositions might be more prone to regret the slightest annoyance, so carefully planning every detail of the trip might be the best strategy for their future well-being,” they write.

In other words, if you want to know how a big event will make you feel in the future, consider how you feel right now and you’ll have your answer.—Patrick Tucker

Source: “Personality Neglect: The Unforeseen Impact of Personal Dispositions on Emotional Life” by Jordi Quoidbach and Elizabeth W. Dunn, Psychological Science (December 2010), www.psychologicalscience.org.

Envisioning a Global Economic Dashboard

Economic futurist Hazel Henderson offers alternative measures.

A growing number of economists and policy makers argue that statistics such as gross domestic product (GDP) and gross national product (GNP) may be useful as snapshots of a nation’s total economic activity, but they are limited in scope. Critics advocate for a new metric that calculates the overall standard of living in a country by factoring in environmental and public health, social welfare, infrastructure, and other quality-of-life factors.

While there has been much talk around the issue, little reform has actually occurred at the national level, says economist and futurist Hazel Henderson. The United Nations’ Human Development Index, which includes education, health, and income, is perhaps the best-known and most widely cited alternative.

Henderson, author of Ethical Markets: Growing the Green Economy (Chelsea Green, 2006) and president of Ethical Markets Media, tells THE FUTURIST that a revamping of GDP hasn’t happened for a number of reasons. Chief among them is the potential drawback that factoring in social and environmental costs “would lower the apparent performance, both of companies and of a country,” she says.

Nonetheless, a majority of people around the world agree that a new model is needed, according to Ethical Markets’ research.

“Health, social, and environmental statistics are as important as economic data, and the governments should also use those to measure national progress,” according to more than two-thirds of the approximately 12,000 individuals in a dozen countries surveyed in 2010 by Ethical Markets Media and the international polling firm GlobeScan. Less than a quarter of respondents identified most strongly with the second statement—that national progress is best gauged by “money-based economic statistics” such as GDP and GNP.

These findings update the initial GlobeScan–Ethical Markets public opinion survey, which was undertaken at the behest of the European Commission as part of the 2007 Beyond GDP conference. (The follow-up survey was conducted independently, Henderson says.)

However, in some countries, support for the traditional GDP/GNP methodology has risen slightly during the three-year interim. These include the United States and several European countries. Furthermore, people in emerging economies such as Kenya were less likely overall to side with GDP reform than those in developed countries. Henderson believes that this may be due to financial concerns brought about by the recession—and overall financial well-being in general. The executive summary of the report notes: “The stronger support in developed countries for this expanded measure suggests that, once a level of material well-being has been attained, many people feel that it is critical to take other measures of life quality and sustainability into account, and that these are a valid way of expressing national progress.”

Henderson emphasizes that significant numbers in all 12 countries included in the survey expressed interest in reforming traditional economic metrics to incorporate long-term quality-of-life indicators. She hopes that the survey contributes to a growing awareness of GDP’s limitations in terms of depicting a country’s overall quality of life. Purely economic statistics neglect countries’ genuine wealth, she says: “Well-educated workforces, efficient infrastructure, and productive ecosystems and resources … all [are] ignored and missing from GDP.

“The good news is that we no longer need to have macroeconomists control the GDP model,” she continues. “We can now use the Internet and Web sites to unbundle these indicators (as we do at Calvert-Henderson.com) and display these 12 indicators of quality of life on a ‘dashboard.’ This is the new approach and it simply bypasses the current formulations of GDP and makes them politically transparent and available to all who are interested.”

Another Ethical Markets project is the Green Transition Scoreboard, which tracks private investment in green businesses around the world. The latest Scoreboard reveals a growing economic emphasis on environmental sustainability and shows that interest and investment continue to rise. By mid-2010, total private investment in the so-called “green economy” had surpassed $1.6 trillion, an increase of approximately $400 billion since the end of 2009. Henderson projects that annual investment in green businesses could soon reach $1 trillion.—Aaron M. Cohen

Sources: Hazel Henderson (interview), Ethical Markets Media, www.EthicalMarkets.com.

Beyond GDP International Initiative, www.beyond-gdp.eu.

As Blogged: Futuring the Revolution

Watching Egypt’s 30-year-old dictatorship come to an abrupt end inspired futurists to reflect on wild cards, tipping points, and the power of information-empowered people.

For 18 days in January and February, the virtual voices of protest were united to bring about a new reality for Egypt.

Called the People’s Revolution, it was truly one of the world’s first socially networked revolutions, embracing not only the activists organizing flash-mob protests and the demonstrators filling Cairo’s Tahrir Square, but also a worldwide community of keenly interested witnesses.

In our own community of futurists, our Web site hosted the observations of several expert trend watchers, including More Than Human author Ramez Naam, an Egypt-born U.S. citizen. Futurists’ role in analyzing the Egyptian crisis was to provide context, showing the present as the outcome of identifiable trends, and to offer ideas about what may happen next.

Here are a few excerpts from our bloggers’ comments during these extraordinary events. To read the postings in their entirety, please visit www.wfs.org/blog.

Egypt: Lessons for U.S. Foreign Policy

Posted by Ramez Naam, Sunday, January 30, 2011

portrait of Ramez Naam

… Egypt was the first Arab country to recognize and make peace with Israel. For that, Egypt is rewarded with aid. In addition, Egypt is a key military partner. U.S. and Egyptian forces conduct joint exercises in the area every year. … For those reasons and more, the U.S. has continued to prop up the government of Hosni Mubarak for decades.

There are good reasons for the United States to want a stable and pro-U.S. government in place in Egypt. Yet the protests on the street today show how supporting convenient dictators can have negative consequences. …

In the long run, democracies make the best friends and allies. In the long run, encouraging democracy—through free and fair elections, through personal freedom of expression, through the establishment of a free and uncensored press—is the best foreign policy investment any free nation can make.

“Malecontentment” in Egypt

Posted by Erica Orange, Thursday, February 3, 2011

portrait of Erica Orange

… In Egypt, unemployment among young males (aged 15 to 29) was 32% in 2009. In other words, one in three young men was out of a job, and, because of increased education, many more were affected by underemployment. Clearly, growing unemployment has led to insecurity over their future, which, to many, seems bleak. And when a generation of young males has no future and no outlet for its aggression (and testosterone), a range of potentially dangerous problems can emerge. …

So the question then becomes this: What do we do with the young males? As we’re seeing now, testosterone-fueled aggressiveness can disrupt or even tear apart societies that don’t find ways to channel those drives into activities that aren’t destructive to their communities. In a worst-case scenario, countries afflicted by the imbalance could go to war as a means of directing young men’s aggressiveness to where it can do no harm internally.

Egypt and Changing Units of Analysis

Posted by Eric Garland, Thursday, February 3, 2011

portrait of Eric Garland

… One of the biggest implications of the past few weeks of major unrest in the Arab/Middle Eastern world is that the units of analysis are being scrambled. Remember: foreign policy experts use the nation-state as the key unit of analysis. …

September 11th screwed things up by suggesting that non-state actors would no longer play bit parts, but could influence the whole geopolitical game. …

A nation-state is truly the result of a social contract, and when the millions of people who form that contract decide it’s no longer for them—it’s not the same thing anymore. It can’t be used as a unit of analysis in the same way. Let’s say the people of Egypt follow through on their popular revolt and elect a parliament of all taxi drivers. Can a foreign policy analyst in Paris seriously expect the same type of future behavior that they got from foreign-educated elites who understood what was expected of Cold War nation-states?

Nope, it’s a whole new world.

Egypt, Twitter, and the Collapse of Top-Heavy Societies

Posted by Ramez Naam, Saturday, February 5, 2011

portrait of Ramez Naam

… The weight that eventually caused the collapse of both the Maya and the Roman Empire wasn’t just any sort of complexity, it was an upper layer of society that was largely parasitic, consuming more and more of the resources of society without producing much value.

I’m struck by this in the case of Egypt. The protests in Egypt are fueled by the frustration of lack of opportunity and the anger of lack of ability to change the system or even speak out against it. …

Neither state control of the economy nor rampant corruption that lines the pockets of ministers and high officials is truly a form of additional “complexity.” It’s parasitism.

By contrast, services like Twitter and Facebook or more basic telecommunication via cell phones, SMS, and email do increase the societal complexity of a country. They increase the number of voices being heard. They add density to the social graph.

Yet that complexity does not belong to the old world of Hosni Mubarak’s government or its elite friends. It belongs to the younger generation on the street. Facebook, Twitter, cell phones, email, and SMS add complexity, but it’s a peer-to-peer complexity that empowers those who use those tools. That peer-to-peer complexity may cause a collapse, but not of the side that uses it. …

I’m optimistic about the future of both Egypt and of modern society as a whole. … We should expect the collapse of parasitic and top-down societies and institutions, and the emergence of more and more network-centric institutions and societies.

North African Dominoes

Posted by Stephen Aguilar-Millan, Monday, February 7, 2011

portrait of Stephen Aguilar-Millan

First Tunisia, then Egypt, and on to Jordan and Yemen. Ought we to have been surprised by recent events in North Africa and the Middle East? No! Despite the timing of the revolutions now under way, I don’t think that we ought to be surprised at all.

… At a seminar at the World Future Society conference in Chicago in 2009, as a demonstration of the International Futures computer simulation model, Professor Jay Gary and Dr. Tom Ferleman showed us that a combination of economic and demographic trends, in conjunction with a number of social and political trends, was leading to the possibility of a significant event in North Africa and the Middle East in this decade. For a reasonably sustained period, the warning bells have been ringing, and those investors and businesses that have tuned into this potential hotspot are now able to deploy their contingency plans.

… The important factor now is to consider what might happen next—to look to the future rather than to the past. To my mind, the most significant future factor is that the “youth bulge” in North Africa and the Middle East has yet to peak. Over the course of this decade, even more unemployed, impoverished, and bored young men will reach the age when they might be predisposed to action in changing their world. If this cohort’s aspirations can be fulfilled, then the prospect of the future (growth, employment, and prosperity) is very bright. If, on the other hand, nothing changes, then the prospect is quite dim.

Mom and Mubarak

Posted by Cynthia G. Wagner, Friday, February 11, 2011

portrait of Cynthia G. Wagner

My mother, who died two and a half years ago, probably would have had some sympathy for Hosni Mubarak this week, for no other reason than that she once shook his hand.…

From her diary [1994]:

We were resting near King Tut’s tomb when a motorcade suddenly appeared—out jumped security guards—young, lean, in dark suits with white shirts and ties. In moments they were positioned all round—and President Maburak [sic] appeared. I asked the guard in front of me if I could take pictures—at first he said “no”—but then the President gave different orders. Before I quite realized what was happening, I was shaking his hand and chatting with him about the opera and my appreciation of all that had been done for that event—and my enjoyment of Egypt. When we got back to the hotel, I discovered that I was an instant (though temporary) celebrity. I was on the 6 o’clock TV news and people started recognizing me everywhere.

Mom was far more interested in the history of Egypt—its ancient beauties and mysteries—than in the turmoil of contemporary geopolitics. Shaking the man’s hand was enough to charm her. Politics isn’t just local; it’s personal.

I think about Mom and Mubarak when I look back on how differently I feel about people after I have met them. I was as charmed by Newt Gingrich as by Al Gore when I met them at World Future Society conferences.

But of course I would not want either gentleman running my country for 30 years.

About the Authors

Ramez Naam is a computer scientist and author.

Erica Orange is vice president of Weiner, Edrich, Brown, Inc.

Eric Garland is the founder and managing partner of Competitive Futures Inc.

Stephen Aguilar-Millan is director of research at the European Futures Observatory.

Cynthia G. Wagner is editor of THE FUTURIST.

March-April 2011, Vol. 45, No. 2

  • From Hospital to Healthspital: A Better Paradigm for Health Care
  • Health Insurance in America After the Reform
  • Could Medical Tourism Aid Health-Care Delivery?
  • Bike to the Future
  • Relationships, Community, and Identity in the New Virtual Society
  • Imagineers in Search of the Future
  • Understanding Technological Evolution and Diversity

From Hospital to Healthspital: A Better Paradigm for Health Care

By Frank W. Maletz

Hospitals should not simply be places where people go to get well (or, worse, where they go to die). Future hospitals could become wellness information centers and proactive partners in community well-being, says a practicing orthopedic surgeon.

Is health-care delivery in the United States so broken that it cannot be repaired, remediated, rejuvenated, reformed, or reorganized? Should all existing delivery mechanisms be torn down so we can start from scratch?

My unequivocal answer is no to creative destruction, but creative rethinking is imperative. Nowhere on the planet is there a “perfect delivery system” for health-care modeling. In the United States, what is currently called a “system” is certainly not one in the sense of an ecosystem—i.e., controlled, sustainable, natural, with known inputs and outputs, with precise and defined resources and resource management, and with holistic feedback loops. An ecosystem also responds to perturbations in a balanced and proportionate way. A health-delivery system requires the same kind of open adjustability.

The current U.S. health-delivery system does have many strengths, beginning with strong expertise at universities and other research hubs. Its free-market structure for product development and dissemination is inventive and innovative. Its safety is ensured through oversight by the U.S. Food and Drug Administration and organizations such as the Joint Commission. The robust National Institutes of Health provides funding and research prioritization. We now also have the social networking tools (wikis, Facebook, Twitter, LinkedIn, and the like) to deploy seamless and remarkable change on the magnitude of a paradigm shift.

But the biggest asset of the current system is the network of 5,010 community hospitals that deliver care to unique individuals locally, one provider to one patient in need, day or night, weekend or holiday. Thus, the United States already has the fundamental building blocks for a strong, personalized health-care-delivery system. So what else is needed?

Goals for Health: Elements of a Redesigned Approach

According to the Institute of Medicine report “Crossing the Quality Chasm: A New Health System for the 21st Century,” the U.S. health-care system should strive to effect the following changes:

  • Redesign care processes.
  • Make effective use of information technology (IT).
  • Improve knowledge and skills management.
  • Develop effective teams.
  • Coordinate care across patient conditions, services, and settings.
  • Use performance and outcome measurement for continuous quality improvement and accountability.

Reforming health care is a ubiquitous topic in the national dialogue because of the amount of resources that health care consumes—16% of GDP. For all the ideas and opinions brought forth, however, all we seem to get is more GDP devoted to the problem, with partial solutions that gain traction, then fizzle, doing little to improve quality or reduce the chaos in the system. Then the blame game begins: Rising costs are the “fault” of providers, or of insurers, attorneys, pharmaceutical and product companies, patient demands and expectations, for-profit hospitals, or government leaders who lack will.

It is time now for a true health renaissance, with constructive, holistic, integral, paradigm-shifting thinking and action. I believe that, until we can fix the delivery systems, we cannot begin to correct the reimbursement mechanisms.

What We Already Know about Health

First, we know that prevention is more cost-effective than treatment. Emergency-room visits are more expensive than routine maintenance. Chronic disorders such as diabetes, hypertension, heart disease, strokes, and renal failure consume an inordinate share of health-care dollars. Smoking cigarettes is bad. Obesity and nutritional deficiencies are epidemic. Fruitless, futile care at the end of life dominates a large proportion of the Medicare allocation. Reckless behaviors are responsible for much loss of productive and functional young lives. Cure and precision diagnosis are much more desired than mere control, maintenance, or palliation.

We also know that waste and redundancy in a paper-based information system have extraordinary costs both in real dollars and in time that could be allocated much more productively. A systematized, constantly updated, searchable, linkable database available at each point of care would reduce waste, repetition, redundancy, and the tendency for hand-off errors. Care could then be coordinated among all providers.

On the positive side, we know that workers who are healthy function more productively. Jobs, income, and reliable, portable health-insurance benefits add to security and productivity. Happy, contented people live longer and better, and many people already spend huge amounts of money on a host of programs to improve their health and well-being.

We know that regular exercise, especially aerobic, improves clarity, mental functioning, and wellness. Having a meaningful, fulfilled, goal-directed life and trying to contribute to society also increase longevity. And meeting our basic needs, including shelter, nutrition, and clothing, and maintaining appropriate levels of stress, balance, and moderation, are essential ingredients for physical and mental well-being.

Thus, the goal is not simply to eliminate sickness or delay death. We must take a much more holistic and expansive view of health care that embraces wellness and enrichment, a view that is flexible and that adopts the best practice from moment to moment.

Hospitals Today

Hospitals and sanitaria were developed to house the sick and to treat or quarantine the diseased, deformed, or demented. Today, care is usually delivered locally to one patient by one provider at a time. Community hospitals provide the vast majority of the contact visits. Patients are generally not fluent in health-related matters, and this lack of understanding leads to major failures to comply with providers’ best advice and recommendations.

Providers are not infallible. Patient problems are inherently complex, and there are many unknowns. Medicine itself is becoming more complex. Natural healing using biological, biochemical, and immune-enhancing remedies will function more predictably than artificial implants, prosthetics, xenograft replacements, and the like, and we are on the verge of advanced treatments with nanotechnology, bioengineering, genomics, proteomics, metabolomics, stem cells, and immunomodulation. Such advances bring us closer to cures and disease elimination.

Hospitals could do more to experimentally model an integrated, holistic health-delivery system that effects a real shift. They would collate the best research and most promising ideas; incorporate the best of wellness, well-being, natural, and alternative options; improve oversight of chronic debilitating conditions; and mobilize and coordinate effective preventive strategies.

We have the tools to craft a better, more healthful future and enable more-productive lives for everyone on the planet. The first step does not require much more than a creative paradigm shift in thinking and approach—a paradigm I call Healthspital 2.0.™

Elements for Integration

The biggest need is for data and information management. The need for privacy and confidentiality of health information has not disappeared in the age of social networking, even as people’s desire to be heard, noticed, and connected grows. Thus, rather than locking down all health information as a matter of privacy, we need to reconstruct laws regarding inappropriate use of data, such as the discriminatory use of genetic information.

Seamless availability and transfer of health knowledge allow in-depth understanding of confounding variables, reduce redundancy, and potentially eliminate hand-off errors. A computerized health “passport” would serve as a template and enable interconnectivity, benefiting not just the patient but also broader public-health research. This systemization of information would be able to highlight best outcomes and best practices through true tracking and social networking.
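
The passport is, at bottom, a portable record template that any point of care can read and append to. The Python sketch below shows one way such a record might be structured; the field names and layout are illustrative assumptions rather than the author’s design, and a production system would build on an interchange standard such as HL7.

```python
# A minimal, illustrative sketch of a computerized health "passport": one
# standardized record that travels with the patient, so each new provider
# sees the full history. Field names and structure are assumptions for
# illustration, not a real standard.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Encounter:
    when: date
    provider: str
    notes: str

@dataclass
class HealthPassport:
    patient_id: str                                    # opaque ID, not a name
    allergies: list[str] = field(default_factory=list)
    medications: list[str] = field(default_factory=list)
    encounters: list[Encounter] = field(default_factory=list)

    def add_encounter(self, when: date, provider: str, notes: str) -> None:
        """Append a visit so the next point of care sees the full history,
        reducing redundant tests and hand-off errors."""
        self.encounters.append(Encounter(when, provider, notes))

passport = HealthPassport("PX-0001", allergies=["penicillin"])
passport.add_encounter(date(2011, 3, 1), "orthopedics clinic", "post-op follow-up")
print(len(passport.encounters), "encounter(s) on record")
```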

The Healthspital model also requires more-effective use of expert systems. With integrated data management, experts could render opinions from afar on questions within the database. Doctors and other practitioners would have access to remote monitoring, enabling them to render remote advice. They could find answers to questions and discover best practices, as well as share their own discoveries, ideas, and best practices. Innovation could be instantly disseminated globally.

Patients and families would more easily engage with extensive information and support networks, and self-education would expand.

Healthspitals in the Community

Each Healthspital would appreciate the norms, mores, and expectations of the community it serves on issues such as end-of-life ministrations. Dialogue could begin in earnest regarding hospice services. Part of the Healthspital’s mission could be to celebrate each patient as a life well lived, honoring individual care preferences during life-and-death decision making. Throughout the community, such openness would reenergize relationships between younger and older generations and promote mutual caring, which would contribute to the curing function across the health-care continuum.

The Healthspitals’ integrated delivery system at the community level would allow a much truer triage at emergency departments. As these are often the places of first resort for patients with all levels of care needs, a system-wide approach to triage would help refer all patients to the appropriate (and often less expensive) level of care. This would lessen the issue of “dumping” and allow tracking of referral patterns to provide a feedback mechanism for improving triage throughout the system.
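
The triage idea reduces to a simple routing-plus-feedback loop. The Python sketch below is our own illustration of that logic, not a clinical protocol: the acuity scale, destinations, and arrival data are all invented.

```python
# Illustrative sketch of system-wide triage: route each arrival to the
# least-intensive appropriate care setting and tally referrals as a feedback
# signal. Acuity scale, destinations, and sample data are all invented.
from collections import Counter

CARE_LEVELS = {
    1: "self-care guidance",     # lowest acuity
    2: "community clinic",
    3: "urgent care",
    4: "emergency department",   # highest acuity
}

referral_log: Counter[str] = Counter()

def triage(acuity: int) -> str:
    """Map a 1-4 acuity score to a destination and record the referral."""
    destination = CARE_LEVELS[acuity]
    referral_log[destination] += 1
    return destination

for score in (1, 2, 2, 4, 3, 1, 2):  # one hypothetical morning's arrivals
    triage(score)

# The log is the feedback mechanism: if low-acuity referrals dominate, the
# emergency department is being used as a clinic, and upstream capacity
# (clinics, self-care outreach) needs attention.
print(referral_log.most_common())
```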

A Healthspital 2.0 approach could proactively intervene against negative health modulators such as smoking, impaired driving, and other reckless behaviors, and it would promote behavior modification.

Healthspitals would also assess and promote healthy lifestyles, such as appropriate nutrition and exercise regimens. Wearing personal monitoring devices on walks would allow people to compile and monitor their health data in a database, which would be accessible to their physicians as well as to researchers tracking public-health trends.

Healthspital 2.0 would, by virtue of eliminating redundancy and improving health, allow huge savings from current health-care expenditures. These savings could be reinvested into promoting more healthful programs such as building walking trails, biking areas, parks, and local organic farms. Public health and wellness would thus become self-sustaining.

Once the Healthspital is fully functional, true reform of medical malpractice would be possible, as errors would decline and overall health of the community would be improved. Also, the integration would allow risk sharing across the system, which would require understanding the rights and responsibilities of all stakeholders, from patients to the Healthspital personnel, all of whom are truly invested in providing and maintaining the health of the entire community—the health ecosystem.

Barriers to the Healthspital Paradigm

Professor Randy Pausch, in The Last Lecture (Hyperion, 2008), taught that barriers are put in front of us to see how much we want what is beyond them. Here are some of the challenges facing the Healthspital 2.0 paradigm.

  • Legal issues: Many modifications of current laws will be needed, especially in the areas of information use and availability at point of care. Issues that will need to be addressed include HIPAA (Health Insurance Portability and Accountability Act), patient dumping, conflict of interest, and discrimination. HIPAA, in particular, was originally enacted as a privacy guard. I contend that the American population is more comfortable sharing personal health-care information than current legislation indicates, so long as they have confidence that the information will be used responsibly. With 500 million people already utilizing Facebook, I believe the vast majority of people (therefore, patients) would make health-related data available to providers and researchers in the interests of preserving health.
  • Financial issues: Compared with building a new hospital, the Healthspital model offers potentially tremendous cost savings, but care must be taken that these savings are reinvested into more healthful projects rather than shifted to various nonhealth-related special interests.
  • Political issues: The creation of the Healthspital 2.0 concept will require substantial commitment, investment, and will on the part of politicians. The paradigm shift is monumental, so it is certainly appropriate to work at the experimental project level where results can be analyzed in terms of cost savings and improved health care. However, politicians with appropriate foresight would also be helpful in providing leadership and serving as champions for concepts such as this.
  • Educational issues: As with all major changes, the educational ramifications of a health-system paradigm shift are tremendous. Health awareness should be taught at the earliest levels, starting in preschool. Science and nutrition coursework throughout formal schooling is imperative, as is good example setting. Patients currently receiving treatment under the older delivery model will need the tools that the local community Healthspital will provide. Lifelong education could thus enable individuals to become more involved in their own health future, allowing them to assist responsibly in the delivery of care to themselves and family members.
  • Punitive and unconstructive programs: Bashing and the blame game must be eliminated throughout the health-delivery system. No one—individual or institution—functions well with a stick at the back. The current pay-for-performance model does not allow the raising of all boats toward improvement, but rather widens the gap between the great performers and the health programs and systems that are performing poorly.

Building a Healthspital Model

I currently work at Lawrence & Memorial Hospital in southeastern Connecticut, a 250-bed community hospital. We serve a number of employers and provide care along the continuum from birth to death, from neonatal intensive care to skilled-nursing facilities, with a robust hospice presence. We care for patients in 10 counties, and our primary service area includes both the destitute and the wealthy. We are regional, and our facility would be a perfect venue for an experimental design incorporating any and all of the above suggestions.

How would this work? First and foremost, it would be an experiment requiring bright investigators to provide oversight and analysis of data. All elements of health care and wellness should be incorporated. Every member of the community in the 10 primary service areas should be enrolled, and a swipe-card passport developed so that information is standardized at any point of care. Any and all good ideas would be welcomed for inclusion in a central repository of ideas and best practices. Through instant messaging, such bright ideas would be disseminated throughout the system for consideration, ensuring equal access to the best thinking.

No person requiring care or requesting information would get anything less than the best available. Funding sources could include venture capital, information-system vendors, federal pilot-project grants, and American Hospital Association new-investigator funding.

Pilot projects shown to work effectively would merge databases and coalesce into a national or even global health-delivery ecosystem, addressing the big five issues: waste and redundancy, expensive access, prevention, chronic-disease management, and fruitless ministrations at the end of life.

About the Author

Frank W. Maletz, MD, is an orthopedic surgeon specializing in spine and trauma at Lawrence & Memorial Hospital in East Lyme, Connecticut. For more information about the Healthspital 2.0™ concept, contact the author at malfam5@aol.com. He will also speak on this topic at WorldFuture 2011: Moving from Vision to Action in Vancouver.

Health Insurance in America After the Reform

By Jay Herson and David Pearce Snyder

If for-profit health insurers find that business is too unprofitable under the new law, where will Americans find affordable coverage? One solution may rise from the nonprofit sector led by credit unions, which have already demonstrated an ability to keep up with for-profit banks.

The primary objective of President Obama’s 2009 health care reform initiative was to provide health insurance for an estimated 46 million people who did not have it. The Act requires insurers not to reject coverage on the basis of preexisting health conditions, and it requires all citizens to purchase health insurance or to pay a tax should they decline to purchase.

Should the Reform Be Reformed?

Conventional post-election wisdom holds that, in spite of heavy rhetorical assault, the 2010 health insurance reforms will survive the new Congress largely intact. Since conventional political wisdom is right only half the time, this offers little assurance. However, the predictable demographic and economic realities underlying the coming decade will be sufficient, by themselves, to produce the sequence of developments summarized in this article—even if the Patient Protection and Affordable Care Act (PPACA) were to be overturned.

Without PPACA’s constraints, for-profit insurers can be expected to increase premiums in line with health-care providers’ costs, which are rising at two to four times the rate of the Consumer Price Index. At the same time, the United States will experience a 50% increase in the “high-maintenance” over-65 patient population—plus the retirement of the baby boomers, who represent one-third of the nation’s current caregivers—just as the nation passes through five to seven years of projected stagnant income growth, chronic high unemployment, fiscal deleveraging, and shrinking public-sector budgets.

Absent the PPACA reforms, with each passing year a growing percentage of U.S. households will simply be unable to afford the premiums set by for-profit insurers. Nonprofits would emerge naturally to fill the growing unmet marketplace need.

In short, PPACA will largely serve to facilitate and accelerate the adaptive free-market behavior that is almost certain to occur in the austere circumstances that will confront most Americans for the foreseeable future.—David Pearce Snyder

Omitted from the final version of the Patient Protection and Affordable Care Act (PPACA), passed in March 2010, was the so-called “public option,” a government-run health insurance program designed to compete with profit-making companies. Legislation notwithstanding, it has generally been marketplace forces—not government interventions—that have shaped the U.S. future, so we shall examine how these market forces will create a new source of competition for the health-insurance market: nonprofit organizations.

Health Insurance Forecast to 2030

Under the 2010 health care reform legislation, the health-insurance business is expected to become less attractive for investor-owned public insurance companies. This will especially be the case if courts decide that requiring citizens to purchase health insurance is unconstitutional.

More particularly, insurers’ inability to reject applicants or to cap the benefits (or even terminate the policies) of patients incurring serious and costly illnesses will make health insurance increasingly unattractive as a profit-making business. As for-profit insurers exit the affordable health insurance market, nonprofit institutions may step up to meet consumer demand.

There are already a number of nonprofit organizations that serve large pools of people, such as credit unions, which may offer their members health insurance. These programs would be administered by large data-processing organizations similar to those that currently have service contracts with Social Security, Medicare, and Medicaid and other state-run programs.

There are now approximately 7,800 credit unions (CUs) in the United States, including federally insured, state insured, and self-insured institutions. These serve tens of millions of members and hold hundreds of billions of dollars in assets, which increased significantly during the recent banking crisis.

Credit unions should have little concern about competing with for-profit insurance companies since they have been competing with the for-profit banks for the past 75 years. Health insurance would be a logical extension of providing low-cost services to members, as well as an extension of their current offerings of health savings accounts.

Interstate cooperatives of CUs—already in existence—could serve a critical mass of insured, taking advantage of their existing institutional infrastructure such as data processing and electronic funds transfer. The CUs would initially offer low-cost health insurance primarily targeted at the uninsured. However, people insured with individual policies or group insurance might also choose CU health insurance as an alternative. In fact, employers could offer CU health insurance as a benefit. As insured pools increase, the number of providers (doctors, hospitals) accepting the insurance would increase and the insurance coverage would become more attractive to the public and employers.
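
The economic logic of pooling can be made concrete with a toy calculation. The Python sketch below is purely illustrative; the cost figures and the break-even pricing rule are our assumptions, not the authors’ model.

```python
# Toy illustration of why pool size matters for nonprofit insurers: expected
# claims per member are unchanged by pool size, but fixed administrative
# overhead is spread across more members. All figures are invented.
FIXED_ADMIN_COST = 2_000_000    # annual administration, $ (assumption)
EXPECTED_CLAIMS = 3_000         # expected annual claims per member, $ (assumption)

def annual_premium(members: int, margin: float = 0.02) -> float:
    """Break-even premium plus a small reserve margin for a nonprofit insurer."""
    return (EXPECTED_CLAIMS + FIXED_ADMIN_COST / members) * (1 + margin)

for pool in (5_000, 50_000, 500_000):
    print(f"{pool:>7,} members -> ${annual_premium(pool):,.0f} per year")
# The premium falls toward the pure claims cost as the pool grows, which is
# the economic rationale for interstate credit-union cooperatives.
```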

The nonprofit organizations offering health insurance would by no means be limited to CUs. New health insurance companies can be created by all sorts of nonprofits banding together to represent a sufficiently large pool of insured. For example, public radio and TV stations could unite to form insurance groups, as could university alumni associations or retirement funds such as the Texas Teachers Retirement System and the California Public Employees’ Retirement System (CalPERS).

The 2010 health care reform act provides for subsidies to people who cannot afford to purchase health insurance. Presumably, these subsidies could be used to purchase the nonprofit health insurance described above. Federal subsidies, however, may be insufficient for some families to afford health insurance. Should this be the case, there will be pressure on the states to provide subsidies. Some states may be more progressive than others in helping citizens get the necessary coverage, and those that do not provide a path to health insurance may see a dwindling labor supply as workers and businesses move to more progressive states. Under the health reform act, state health insurance programs will become a tool of economic development policy.

Scenarios for Nonprofit Health Insurance

Much of the foregoing discussion is, admittedly, speculative. The following are four possible scenarios, plus a most-likely scenario, that could emerge if nonprofits began a process of providing health insurance as a consequence of the sweeping, congressionally mandated reform.

1. Business as Usual. Although a nonprofit initiative is widely discussed, actual health insurance policies issued by credit unions and other nonprofits never get off the ground as Congress comes to a stalemate over legislation that would enable it. Perhaps because of intense lobbying by for-profit health insurance companies, Congress eliminates some aspects of the Act. Proposals for state-run, high-risk insurance pools to merge over state lines and provide expanded coverage are also widely discussed, but fail to get the approval of state legislatures due to budget constraints and problems foreseen in governance. Meanwhile, health-care costs continue to rise rapidly, and, as premiums charged by for-profit health insurers soar, more people are forced to abandon their coverage; there are growing lines at public health clinics for minimal care.

2. At Least We Tried. Credit unions launch several health-insurance companies, but they fail to enroll enough people fast enough to sustain the enterprise. Although subsidies from state government and charities do materialize, interest dwindles because of failing CU health insurance initiatives and a declining sense of urgency, in spite of the fact that tens of millions of Americans remain uninsured.

3. Nonprofits Succeed. The demand is so great that credit-union-based health insurance takes off, and 30 states create funds to subsidize premiums for those who qualify. The success of the first CU groups creates the experience base for other groups to be quickly formed. Competition is healthy for all. By 2030, 93% of the U.S. population has some form of health insurance—40% from nonprofits and 53% from public company health insurance and government agencies.

4. Watch What You Wish For. After 10 years of success, CU health insurance becomes commonplace and a workplace standard. However, with the increased visibility that comes with success, fraud among providers and patients is making headlines. This causes a drop in governmental and charity subsidies for premiums. The existing CU insurance companies feel they need to grow more, and mergers begin taking place. This reduces the amount of competition and consumer choice. To compete, some of the remaining CU insurance companies decide that they can reduce costs and attract more members by actually becoming direct health-care providers, ultimately building (or buying) their own medical facilities. This leads some of the CU insurers to go public and, thus, cease to be nonprofit. By 2030, the medical insurance industry has begun to look the way it did in 2010.

5. Most-Likely Scenario. The most-likely scenario for the next 20 years lies somewhere between scenarios 3 and 4 above. By 2030, population demographics will make scenarios 1 and 2 politically unviable. Although scenario 4 is possible, the pendulum never swings completely back. While most Americans are likely to be covered by private and government health insurance in 2030, there will continue to be a need for the nonprofit alternatives described here. Still, barring further legislative intervention, it seems unlikely that more than 93% of the population will have health insurance in 2030.

Amtrak emerged when private railroads did not want to continue providing passenger service. Rural electric cooperatives emerged when it was not profitable for private industry to provide power to rural areas. Similarly, some form of cooperative health insurance is likely to emerge to fill the void created by omission of a public option in the health insurance reform.

It is difficult to forecast beyond the year 2030, but the information, communication, and health-care-management technologies that exist by 2050 should make a single-payer system easy to implement and the only logical way to provide quality health care to the U.S. population. Out of necessity, nonprofit organizations will pave the way to 2050.

About the Authors

Jay Herson is a senior associate at the Johns Hopkins Bloomberg School of Public Health and managing editor of FutureTakes. E-mail jay.herson@earthlink.net.

David Pearce Snyder is a consulting futurist and principal of The Snyder Family Enterprise and THE FUTURIST’s contributing editor for Lifestyles. E-mail david_snyder@verizon.net.

This article draws from and updates their essay in the World Future Society’s 2010 conference volume, Strategies and Technologies for a Sustainable Future (WFS, 2010, 450 pages), which may be ordered from www.wfs.org/wfsbooks for $29.95 ($24.95 for Society members).

Could Medical Tourism Aid Health-Care Delivery?

By Prema Nakra

Medical tourism—wherein patients seek more affordable or specialized treatment outside their home countries—represents a major challenge for health-care delivery in developed countries such as the United States. It also offers an opportunity to integrate and improve medical delivery globally.

Health care has long been one of the most local of all industries, but in today’s world, people, information, ideas, and technologies are increasingly crossing national borders. The move to “go global” is such a strong force that hardly any human activity is exempt from its impact.

Medical tourism, an outgrowth of the globalization of services, has emerged as an innovative, border-crossing industry, and many developing countries are poised to take advantage of this opportunity. But this opportunity also represents a challenge to health-care-delivery systems in developed countries such as the United States.

U.S. health-care costs, already an estimated $2 trillion a year, are predicted to double in the coming decade. By 2020, health-care spending is projected to consume 21% of U.S. GDP, up from about 16% today.

Today, more than 40 governments are involved in supporting medical tourism, and the number is growing each year. The medical community in developed countries has started to recognize medical tourism as a real phenomenon with significant impacts on both practitioners and patients. Yet “medical tourism” is not a phrase that has come up openly in the U.S. debate on health-care reform.

Just after the 2010 Patient Protection and Affordable Care Act (PPACA) was passed, President Obama signed into law the Health Care and Education Reconciliation Act, which made a number of significant changes to the PPACA. According to Chris Brandt and Michael Cohen of Deloitte Consulting, these reforms represent one of the most significant disruptive events for U.S. health-care providers in the last century. Key challenges that providers will face due to this reform include:

• Estimating the potential impact of increased coverage and associated revenues on profit margins.

• Reviewing the operational capacity to ascertain whether or not the providers can respond to the pent-up demand from the newly insured.

• Handling the approximately 32 million people added to the list of those seeking primary medical care, typically provided by an internist or family-care physician.

A nationwide shortage of doctors—projected by the American Academy of Family Physicians to reach 40,000 primary-care physicians by 2020—may eventually mean long hours in the waiting rooms at busy clinics, less quality time available with doctors in examining rooms, and emergency rooms packed with patients who couldn’t find physicians elsewhere.

For the past 30 years, the United States has relied heavily on foreign-born and foreign-educated doctors to help meet the demand for health-care services. About a quarter of all physicians now practicing in the United States came from other countries. In 2007, more than 38% of U.S. family-medicine residents were international medical graduates, according to the American Academy of Family Physicians.

If medical tourism continues to grow at its current rate, recruiting foreign-born physicians and nursing staff to the United States will become more challenging. In 2006, the Association of American Medical Colleges recommended that medical schools increase their student enrollment 30% by 2015 in order to address the nation’s growing shortage of physicians.

No matter what shape the current health-care reform takes or how it is implemented, health-care costs in the United States will continue to increase and consume more of the public’s discretionary spending. By 2017, as many as 23 million Americans could be traveling internationally and spending almost $79 billion per year for medical/surgical care, according to a 2008 report from the Deloitte Center for Health Solutions.

Stated differently, if these predictions are correct, U.S. health-care providers stand to lose $79 billion per year to medical tourism. If the gap between the cost of major medical procedures performed in the United States and other countries continues to grow, low-cost providers will capture a larger share of the market for complex surgical procedures. Top U.S. health-services managers, policy makers, and physician and surgeon groups appear to be strategically unprepared for globalization in the health-care services industry and the resulting international competition.
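
The scale of that projection is easier to grasp per patient. The arithmetic below is our own illustration using the article’s rounded figures.

```python
# Illustrative arithmetic on the Deloitte projection cited above, using the
# article's rounded figures.
travelers = 23_000_000         # Americans projected to seek care abroad by 2017
spending = 79_000_000_000      # projected annual outbound spending, $
print(f"Average spend per medical traveler: ${spending / travelers:,.0f}")
# Roughly $3,400 per traveler per year, plausible when major procedures
# abroad cost a fraction of their U.S. price.
```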

When patients travel out of the country for surgical care and then return home, they need follow-up care. Their providers are then faced with such challenges as the unavailability of adequate medical records and the potential for complications after their patients’ overseas surgeries. The issue of adequately reimbursing the surgeons providing the follow-up care also remains unresolved.

Turning “Medical Tourism” into Globalized Health

The medical-tourism industry has introduced new business models to deal with global health-care challenges. These business models are bringing about significant changes in the way that governments around the world deal with financing hospitals, recruiting physicians, reimbursing health-care providers, and building adequate health-care systems for current and future generations.

It is time for policy makers in the United States and other developed countries to embrace medical tourism: It could save money by taking advantage of more-efficient health-care systems outside the country, while also enabling providers to learn from the best practices in this increasingly globalizing industry.

Globalization and medical tourism are changing the health-care landscape in industrialized and developing countries alike. A “globalized health system” of the future should include international networks of highly specialized, virtually connected providers, organized around mid-sized district hospitals that function as planning, management, and communication hubs to offer a variety of local, community-oriented, preventive, and curative services.

Medical tourism is largely a consumer-driven trend. In order to survive and thrive, the health-delivery industry must keep up with its consumers’ demands and needs.

About the Author

Prema Nakra is a professor of marketing at the School of Management, Marist College, Poughkeepsie, New York 12601. E-mail prema.nakra@marist.edu.

VISIONS: Imagineers in Search of the Future

By Gary Dehrer

In 1955, Walt Disney Imagineers achieved virtual reality with Disneyland. Eight Imagineering principles explain how they did it.

Here You Leave Today, and Enter the World of Yesterday, Tomorrow and Fantasy —Sign at the entrance to Disneyland

The opening of Disneyland in the middle of the twentieth century saw Walt Disney unleashing the forces of Imagineering to create a true “virtual reality” world of entertainment and adventure. When the first paying customers entered Disneyland on July 18, 1955, they walked through one of two tunnel passageways leading to Main Street, U.S.A. Many thought they were about to encounter an upgraded amusement park, but Walt Disney knew he had created something much more than that. From Town Square, guests looked down Main Street to see Sleeping Beauty Castle beckoning in the distance. This first impression was designed to make guests feel they were being absorbed into a cinematic experience, the sensation of stepping from everyday life into an extraordinary world.

Eight Principles of Imagineering

According to Disney historian Alex Wright and contributors to The Imagineering Field Guide to Disneyland, Imagineering consists of the following eight basic principles.

1. Area Development: “The interstitial spaces between the attractions, restaurants, and shops. This includes landscaping, architecture, propping, show elements, and special enhancements intended to expand the experience.”

2. Blue Sky: “The early stages in the idea-generation process when anything is possible. There are not yet any considerations taken into account that might rein in the creative process. At this point, the sky’s the limit!”

3. Brainstorm: “A gathering for the purpose of generating as many ideas as possible in the shortest time possible. We hold many brainstorming sessions at WDI [Walt Disney Imagineering], always looking for the best ideas.” The rules include remembering that there is no such thing as a bad idea and that nothing should stifle the flow of ideas.

4. Dark Ride: “A term often used to describe the charming Fantasyland attractions, among others, housed more or less completely inside a show building, which allows for greater isolation of show elements and light control, as needed.”

5. Elevation: “A drawing of a true frontal view of an object—usually a building—often drawn from multiple sides, eliminating the perspective that you would see in the real world, for clarity in the design and to lead construction activities.”

6. Kinetics: “Movement and motion in a scene that give it life and energy. This can come from moving vehicles, active signage, changes in lighting, special effects, or even hanging banners or flags that move as the wind blows.”

7. Plussing: “A word derived from Walt’s penchant for always trying to make an idea better. Imagineers are continually trying to plus work, even after it’s ‘finished.’”

8. Show: “Everything we put ‘onstage’ in a Disney park. Walt believed that everything we put out for the Guests in our parks was part of a big show, so much of our terminology originated in the show business world. With that in mind, ‘show’ becomes for us a very broad term that includes just about anything our Guests see, hear, smell, or come in contact with during their visit to any of our parks or resorts.”

Source: The Imagineering Field Guide to Disneyland by Alex Wright and the Imagineers (Disney Editions, 2008).

Virtual reality is most often defined as a simulated sensory experience made possible by computer software, creating a convincing, three-dimensional experience that—at its best—looks, feels, and sounds like the real thing. It can be likened to any virtual environment that a person can literally walk into and perceive as true to life; another term for it is enhanced reality. While various applications of simulated virtual reality will be increasingly possible in the future, people actually experienced it at Disneyland in 1955, without the aid of computer-generated special effects or other advanced technology.

Ground was broken for the Disneyland Park in July 1954, with opening day set for only 12 months later. A frenzy of construction activity swept over the former Anaheim, California, orange grove. In just a few months, the outlines of now-familiar landmarks began to emerge, with Main Street, Sleeping Beauty Castle, the Jungle Rivers of the World, and the larger Rivers of America visible. The Tomorrowland site, which lagged behind in construction, lacked the clear identity of the other lands. The Imagineers, specialists who combine creativity with technical know-how, grew frustrated and suggested that the Tomorrowland of 1986 be concealed behind an attractive fence until it was ready. Although Walt Disney agreed to this at first, he changed his mind, saying, “We’ll open the whole park.… Do the best you can with Tomorrowland, and we’ll fix it up after we open.”

Now Is the Time for the Future

At the entrance to the original 1955 Tomorrowland, the first attraction to come into view was a tall clock structure. This was the Clock of the World, which declared that now is the time for the future. The clock was intended to symbolize the incredible futuristic world about to be entered. Standing more than 17 feet tall, it looked much like a squeezed soda can topped with a half-sphere sun of gold-spiked anodized aluminum and a stylized silver-crescent Man in the Moon face. The blue tiles encircling its base depicted the vast universe.

Few passersby stopped to notice that the timepiece showed not only the time in Anaheim, California, but also the time around the world. Other than serving as a convenient place for parents to meet their kids, the clock rapidly faded into obscurity. The towering red-and-white TWA Rocket was a far more memorable symbol of Tomorrowland.

The Clock of the World is now gone, and only some first-generation Disneylanders can recall it. The clock faithfully performed its timekeeping duties until it was removed in 1966, during the widespread demolition of the original 1955 Tomorrowland. Its exit was captured in a photo showing the timepiece, minus its top ornamentation, being hauled away with the lower edge of its blue “universe” mosaic tiles broken off at the base.

Sometimes the future can be treated rather shabbily.—Gary Dehrer

Imagineering Realism and Fantasy

To realize his Disneyland vision, Walt Disney assembled a talented team of Imagineers, who would transform ideas and dreams into reality. Looking up at the second-floor windows along Disneyland’s Main Street, you can see painted signs with the names of people and their businesses. While the businesses are somewhat fictitious, the people are not. These are the names of Imagineers—such as Harper Goff, Ken Anderson, Herb Ryman, and Sam McKim—and others who played significant roles in making Disneyland happen. Even Walt Disney’s father, Elias Disney, has a window painted with his name and “Contractor Est. 1895.”

Goff, with his background in designing movie sets, would lend a hand with Main Street and the Jungle Cruise ride. Anderson, trained as an architect and all-around designer, worked on many last-minute Disneyland projects. Ryman, a versatile artist who rendered the dazzling overview of Disneyland in 1953, would later help conceptualize New Orleans Square. McKim, a multitalented artist, rendered concept sketches for Disneyland and other Disney projects.

These and many other Imagineers to follow helped dream up Disneyland and bring it into existence.—Gary Dehrer

Imagineering Principles: How a Dream Is Built

Eight basic Imagineering principles were essential to the creation of Disneyland’s virtual reality: Area Development, Blue Sky, Brainstorm, Dark Ride, Elevation, Kinetics, Plussing, and Show [see sidebar, “Eight Principles of Imagineering”].

1. Area Development. The original 1955 master plan for Disneyland envisioned Main Street, U.S.A., as the initial experience funneling people to a central plaza hub and then drawing them into one of four adjoining lands: Adventureland, Frontierland, Fantasyland, and Tomorrowland. Creating an expansive and interactive 60-acre venue such as Disneyland was a monumental undertaking; with no prior experience to draw on, the Imagineers faced a Herculean task.

In reviewing Walt Disney’s plan to have everyone enter Disneyland at Town Square, amusement-park experts questioned why there was only one entrance. They warned that this would create unnecessary congestion. They also questioned the expense of Town Square, especially since it was not going to produce any revenue.

Disney responded that this entry space was designed to create an essential first impression and special mood for his guests. All guests had to enter the Park the same way to share an identical illusion. Even the Main Street transportation, which included a fire wagon and horse-drawn trolleys, was not intended to make any money but to help add to the overall sensory experience. Town Square was to serve as the gateway to Disneyland’s virtual reality.

The dramatic, one-two punch of the Main Street environs with Sleeping Beauty Castle looming down the street convinced Disney that he was on the right track in lifting his guests to a higher entertainment experience.

Disney was able to use his experience in animation and films, especially his extraordinary storytelling skills, to add believability to his Park creation. He grasped the importance of quickly altering the perception and attitudes of guests entering Disneyland, thereby drawing them into a new reality. This is similar to what video-game designers would be doing decades later using an interactive electronic visual format.

2. Blue Sky. Disneyland was the first project for Walt Disney Imagineering (WDI), which was created on December 16, 1952, as part of WED (Walter Elias Disney) Enterprises. Walt Disney, considered to be the foremost Imagineer of modern times, had built a major animation and film studio by the early 1950s. WED was to address all Disney activities outside the film studio and this would come to include Disney parks, resorts, special attractions at World’s Fairs, cruise ships, and other diverse entertainment activities. Disneyland offered the Imagineers an opportunity to demonstrate that anything is possible.

Disney was creating something that would bring people from across disciplines—engineering, animation, scriptwriting, and filmmaking—together to tackle specific projects. Early in the development of the Disneyland project, Walt Disney realized that creating his park illusion or “show” needed mechanical know-how as well as artistic expertise. To make his “big dreams” a reality, he would have to enlist an army of Imagineers, versed in an ever-widening range of disciplines. The Disneyland show needed not only people who could design and illustrate the dream, but also writers, architects, interior designers, engineers, lighting experts, graphic designers, set designers, craftsmen, sound technicians, landscapers, model makers, sculptors, special-effects technicians, master planners, researchers, managers, construction experts, and more.

Disneyland was first envisioned as a “place for people to find happiness and knowledge.” Here, people would not be watching a movie, but rather participating in it. They would be walking through a tunnel and emerging in another world. Even the landscaping and specially scaled architecture would add to the credibility of this dream place. Disney was intent on creating an illusion of time and space, taking people away from their daily cares on a journey of imagination that was different from anything they had ever experienced before.

In explaining the secret of his success, Walt Disney had one word for it: curiosity. “There’s really no secret about our approach,” he said. “We keep moving forward—opening up new doors and doing new things—because we’re curious. And curiosity keeps leading us down new paths. We’re always exploring and experimenting.” And curiosity was forever wrapped in endless “Blue Sky” possibilities that begged to become realities.

3. Brainstorm. Brainstorming was used to shape and define the Park, as well as to solve practical problems. The collaborative-thinking process energized the designing of Disneyland as the Imagineers pursued ideas both good and bad. Brainstorming is a continuous process in which success is often intermingled with failure, as evidenced by Disneyland’s 1955 opening. Two of Tomorrowland’s brightest ideas—the freeway Autopia and Rocket to the Moon—both experienced initial failure. Bob Gurr, a young Imagineer with a bachelor’s degree in industrial design but scant mechanical knowledge, was put in charge of the Autopia’s first fleet of cars. On opening day, the Autopia drew a good-sized crowd, but by closing time, half of the cars were disabled. By the end of the first week, only two cars were still moving.

Walt Disney came by to inspect the ravaged car fleet and said, “Well, we’ve got to do something.” Gurr responded that he didn’t have a place to repair the broken cars. The Park, by this point, was already built, so there was no place to construct a shed. Some outside-of-the-box thinking was in order. Half an hour later, a tractor showed up towing a small wooden shed. The driver asked Gurr, “Where do you want your damn garage?” An enhanced Autopia with its sporty cars and meandering freeway is still thriving in the twenty-first century.

4. Dark Ride. Of all the rides in Fantasyland, Walt Disney’s favorite was Peter Pan. He particularly appreciated its fly-through concept, with its tiny galleon cars suspended on ceiling cables, allowing passengers to soar over landscapes. It was one ride that he rode over and over again. Peter Pan was an original 1955 dark ride housed completely inside a building.

Dark rides formed the backbone of Fantasyland’s entertainment experience, as special effects could be used to further create illusion and magic. In 1965, John Hench, one of Disney’s first and longtime Imagineers, rendered a concept sketch that would evolve into Space Mountain, housing a dark-ride roller coaster. The Space Mountain ride was finally achieved in 1975 as Tomorrowland continued to be reworked. Hench said, “The ride is above all an experience of speed, enhanced by the controlled lighting and projected moving images. But it evokes such ideas as the mystery of outer space, the excitement of setting out on a journey, and the thrill of the unknown.”

The power of dark rides pulled guests deeper into the Park experience, whether it was riding with Mr. Toad or flying with Peter Pan. Guests would themselves pass through the live-action scenes and physically experience being part of the story. The rides and attractions were designed to work in harmony to produce a series of sensations. Arguably, the Park setting and attractions worked well to subliminally capture moods and influence attitudes that are so important in creating virtual reality. Fantasy would become real.

5. Elevation. Imagineering ushered in the concept of three-dimensional storytelling. Imagineers detailed the images and settings they felt were important to telling stories through mood and sensation.

Even Main Street, U.S.A., had a story to tell. John Hench explains, “Mood is created mainly by the sensation of carefully orchestrated and intensified stimuli, of color, sound, form, and movement. Disneyland’s Main Street, U.S.A., which represents the main shopping street in an idealized American turn-of-the-century small town, is a good example of mood created by sensation that results in enhanced reality.”

Disney historian Jeff Kurtti notes, “While the first Imagineers had no formal training in urban design, the nature of the animator’s art made them natural systems architects. As storytellers, they ‘wrote’ the park, giving it consistency of narrative that is matched by few other public spaces.” As the architectural elevation drawings of Disneyland were made into real buildings, Walt Disney was achieving an unprecedented breakthrough in entertainment, causing people to directly experience and interact with a virtual world as stories and adventures came alive. The Imagineered elements of storytelling created a virtual-reality setting by placing Park guests in a fantasy, larger-than-life environment. Transferring imagination into blueprints, and blueprints into an actual park experience, was a singular achievement that foreshadowed a future world as yet unknown.

6. Kinetics. On an inspection tour of Disneyland when it was under construction, Walt Disney spent several hours riding around in a Jeep accompanied by several people, including Joe Fowler, his construction boss. Departing from Town Square, Disney and his small party drove over to Sleeping Beauty’s unfinished castle, where he described all of the attractions and how everything would look in full color. He was describing the kinetics of Fantasyland and how the carousel horses would be leaping.

Disney realized that transferring stories from film to real-life three dimensionality would be challenging but knew his guests could use their imaginations in the Park just as they did in movie theaters. Thus, the Park experience would become believable, allowing guests to trust and enjoy the attractions and illusions.

The Jeep visited all the lands, and everyone could feel the enthusiasm of Walt Disney. When the Jeep returned to the Park entrance, Disney looked back down an unpaved Main Street and remarked, “Don’t forget the biggest attraction isn’t here yet.” When asked what that was, he responded, “People. You fill this place with people, and you’ll really have a show.”

7. Plussing. Walt Disney said of Disneyland, “It’s something that will never be finished, something I can keep developing, keep ‘plussing’ and adding to. It’s alive.”

Disneyland has been compared to an animated movie, where main attractions are much like “key frames” in a film. Disney even went so far as to devise ways to fade from one Disneyland attraction and then focus guests into another, much as a film moves from scene to scene. John Hench said of Disney, “He would insist on changing the texture of the pavement at the threshold of each new land because, he said, ‘You can get information about a changing environment through the soles of your feet.’” Thus, through continuous plussing, the Disneyland experience would be both ordered and harmonious, not chaotic or confusing.

From opening day in 1955, Disneyland was meant to undergo continuous innovation and upgrading. Walt Disney and his Imagineers envisioned that Disneyland would embrace ongoing change and newly emerging technologies, while retaining its original footprint of a wondrous “magical kingdom.”

Imagineering plussing kept the park vision alive, with each “frame” being reedited to achieve the best real-life experience possible. Virtual reality is all about “plussing” an environment so that it is constantly being changed and improved.

8. Show. Crucial to the virtual-reality creation was its cast of characters. To further create his Disneyland illusion, Walt Disney instituted his Disneyland University, which would train Park personnel to not just do their jobs, but to perform as though they were onstage. Employees were expected to be happy and cheerful, further creating the feeling of an optimistic world. They would follow special protocols and a dress code to help guests feel comfortable about participating in the show.

Adding to this inclusive effect were Mickey and Minnie Mouse, along with other Disney cartoon characters, who would join guests in the Park. These costumed walk-around characters were meant to mingle with guests, posing for pictures but remaining silent. The physical impact of the walk-around characters enhanced the show and produced a convincing and compelling fantasy environment for adults and children alike.

Disneyland: A Living Virtual World and Portal into the Future

In 1955, Walt Disney made Disneyland a living virtual reality. It would pull generations of people into Town Square to start altering their moods and sensations, and then down Main Street, U.S.A., and on into the Park, enabling them to escape into their imaginations through carefully Imagineered experiences, settings, stories, and adventures. Imagineering architecture, landscaping, and storytelling created not only a compelling “show,” but also a living virtual world.

Walt Disney, who died in 1966, kept a family apartment over the Fire Station overlooking Town Square on Main Street, U.S.A., where he would sometimes stay overnight at the Park. Staff members knew that, when the front window lamp was on, their ever-watchful boss was on board. Few guests took notice of the apartment lamp, as there were many lights along Main Street. Today, if you look up to the second-floor Fire Station apartment, you realize that the lamp in the window behind the curtain is always on.

In assembling his team of Imagineers, Walt Disney had created an extension of himself that would pursue his dreams and the future long after he had died. Disneyland is a living virtual world that is a portal into an optimistic future. It is “another world” where everything is all right, people are innately good, and anything can be handled. In this sense, all of Disneyland is indeed a bright and hopeful Tomorrowland.

About the Author

Gary Dehrer is a retired principal of the San Bernardino City Unified School District (San Bernardino, California), a retired lieutenant colonel in the U.S. Army Reserves, author of Building a Championship Family (New Horizon Press, 2007), and a lifelong visitor to Disneyland. He resides in Yucaipa, California. E-mail gpdehrer@yahoo.com.

This article draws from his essay “Tomorrowland,” to be published in the 2011 World Future Society conference volume, Moving from Vision to Action.

The Disneyland Story: For Further Reading

Walt Disney: An American Original by Bob Thomas (Walt Disney Company, 1994). Thomas chronicles Disney’s keen attention to detail in perfecting an enhanced park experience, as with tree placement, the scale of the trains, and the noise level of cars in his dark rides. He also observes that Walt Disney challenged those around him to go the extra mile in their work, but that this was not always well received. According to Thomas, Walt Disney viewed the Park as a living motion picture that could change and grow with its guests.

Walt Disney: The Triumph of the American Imagination by Neal Gabler (Vintage Books, 2006). Gabler’s candid assessment of Walt Disney offers an excellent companion to Bob Thomas’s insightful biography. Gabler feels that Disney saw the Park as an interlocking series of movie sets, whereby guests were to be absorbed as participants in a cinematic experience. He sees Disneyland as both transforming and therapeutic in helping people feel good about themselves and in love with life, and he sees Walt Disney, as the master animator, pulling his audience or guests into his own creation.

Walt Disney’s Imagineering Legends and the Genesis of the Disney Theme Park by Jeff Kurtti (Disney Editions, 2008). Kurtti’s book is an informative overview of the men and women who created the Disney theme-park concept. Beyond Disneyland’s “architecture of reassurance” is a carefully crafted encounter with virtual reality. Kurtti writes, “Nothing looks fake. Fabricated, yes; fake, no. Disneyland isn’t the mimicry of a thing. It’s a thing.” Once through the entry tunnels, you are quickly absorbed into Disney’s imagineered world of fantasy.

Designing Disney: Imagineering and the Art of the Show by John Hench (Disney Editions, 2008). Hench, a legendary Disney Imagineer, had a 65-year Disney career, from 1939 until his death at age 95 in 2004. In this book, he relates how the 1955 Disneyland was to be a venue for a succession of new attractions within the park’s original Main Street and four-lands framework. Hench suggests that this enhanced simulated reality is achieved through carefully orchestrated and intensified color, sound, form, and movement.—Gary Dehrer

Tomorrow in Brief

The Broccoli Plan

Nutritionists tell us that broccoli is one of the healthiest foods around, but this super veggie must be shipped from far away to reach markets where it isn’t easily grown. For instance, 90% of broccoli sold on the U.S. Eastern Seaboard is shipped from California and Mexico—with less-than-desirable environmental impacts.

To solve this problem, researchers led by Cornell University horticulturalist Thomas Bjorkman are developing new strains of broccoli that can tolerate the more-humid East Coast climate. Once the right varieties have been developed, the project will also train local growers and marketers, organizing them into production networks.

With USDA support, the team aims to develop a $100 million broccoli industry on the East Coast over the next 10 years.

Source: Cornell University, www.cornell.edu.

Eye Exams via Smart Phones

Need an eye exam? There’s an app for that.

A $2 smart-phone application could tell you in minutes what prescription eyeglasses you need. Developed by the MIT Media Lab’s Camera Culture research group, the NETRA (Near-Eye Tool for Refractive Assessment) combines software with a small, lightweight plastic viewfinder that clips onto your smart phone.

Within minutes, NETRA can diagnose whether someone is nearsighted or farsighted, or suffers from astigmatism or the vision loss associated with aging. The researchers claim that NETRA is safe, fast, accurate, and easy to use.

Currently being field-tested, the device is intended primarily for use in poorer communities, such as those in the developing world, that lack access to proper eye care. While eyeglasses themselves can be inexpensive, the testing equipment has until now been prohibitively expensive, especially in underdeveloped areas.

Source: MIT Media Lab, www.media.mit.edu/press/netra.

Catching Up With the Stars

The Hubble Space Telescope has enormously accelerated astronomers’ ability to detect star movement: Measurements that once required 50 years of observation with ground-based telescopes now take just a few years.

Hubble’s razor-sharp visual acuity is what enables the measurement of the stars’ motion, and it has likewise sped up the prediction of their future movement: Astronomers at the Space Telescope Science Institute in Baltimore have used Hubble images collected from 2002 to 2006 to simulate the stars’ projected migration over the next 10,000 years.

Source: Hubble Site, http://hubblesite.org.

Artificial Experimenter

Software that can take over the routine aspects of experimentation could help reduce its costs.

An “artificial experimenter” developed at Britain’s University of Southampton autonomously analyzes a project’s data, builds hypotheses, and chooses the experiments to perform, according to one of the developers, PhD student Chris Lovell of the School of Electronics and Computer Science. The program will also help detect anomalies in error-prone areas such as biological experimentation.

The next step is to join the AI software with automated platforms—labs on a chip—to perform the experiments requested by the artificial experimenter, using fewer resources in the process.
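
Lovell’s team hasn’t published its design in these pages, but the general pattern is a familiar active-learning loop: fit several rival hypotheses to the data collected so far, then request the experiment that best discriminates among them. The short Python sketch below is our own illustration of that loop; the toy “experiment,” the polynomial hypotheses, and all names are invented for the example, not drawn from the Southampton software.

  # A minimal active-learning loop in the spirit of an "artificial
  # experimenter": fit rival hypotheses, then request the experiment
  # where they disagree most. Everything here is illustrative.
  import numpy as np

  def run_experiment(x, rng):
      """Stand-in for a real, noisy laboratory measurement."""
      return 2.0 * x + np.sin(3.0 * x) + rng.normal(0.0, 0.1)

  def fit_hypotheses(xs, ys, degrees=(1, 2, 3)):
      """Fit one polynomial model per candidate degree."""
      return [np.polyfit(xs, ys, d) for d in degrees]

  def choose_next(models, candidates):
      """Pick the input where the fitted models disagree most."""
      preds = np.array([np.polyval(m, candidates) for m in models])
      return candidates[np.argmax(preds.std(axis=0))]

  rng = np.random.default_rng(0)
  xs = list(rng.uniform(0.0, 4.0, size=4))      # a few seed experiments
  ys = [run_experiment(x, rng) for x in xs]
  candidates = np.linspace(0.0, 4.0, 200)

  for step in range(8):
      models = fit_hypotheses(np.array(xs), np.array(ys))
      x_next = choose_next(models, candidates)  # most informative next test
      xs.append(x_next)
      ys.append(run_experiment(x_next, rng))
      print(f"step {step}: experiment requested at x = {x_next:.2f}")

The same skeleton scales to real laboratories once run_experiment is replaced by automated hardware, which is precisely the lab-on-a-chip pairing the Southampton group describes.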

Source: University of Southampton, School of Electronics and Computer Science, www.ecs.soton.ac.uk.

WordBuzz: Weisure

Mobility, connectedness, and competitiveness have long been blurring the boundaries between activities performed in the workplace and everywhere else. Now, a term has been coined to define these omnitasking hours: weisure (work and leisure).

Attributed to Dalton Conley’s book Elsewhere, U.S.A. (Pantheon, 2009), the term was soon popularized by CNN in a story entitled “Welcome to the ‘weisure’ lifestyle.”

Comment: We are hoping someone can still come up with a less-unwieldy coinage (something less frighteningly similar to seizure). Please send your suggestions for renaming this concept of time-use-blurring to letters@wfs.org.

News from WFS: Renewal at THE FUTURIST Magazine

By Edward Cornish, Founding Editor

For 44 years, I have had the privilege of serving as Editor of THE FUTURIST magazine. I would like to thank all of you for your support during our journey along the frontiers of the future. It has been a thrilling ride, but the time has come for me to retire as Editor and assume a new role at THE FUTURIST.

So, starting with this issue, your editor will be Cynthia G. Wagner, who has served as Managing Editor of THE FUTURIST since 1992.

In my new role, I plan to act as a futurist-in-residence. After thinking and writing about the future for more than four decades, I believe I have learned some things about foresight and I would like to pass them on to readers of THE FUTURIST through the articles I plan to write.

The study of the future is a pioneering field that is still developing. The World Future Society today is, I believe, only a foreshadowing of what it could become in the future. As futurists, we can make major contributions to the improvement of human life around the world. This is an awe-inspiring challenge but one worthy of our best efforts.

Our New Editor

Cindy Wagner came to THE FUTURIST as an editorial assistant in 1981. She has a bachelor’s degree in English from the prestigious Grinnell College in Iowa and a master’s degree in communications, specializing in magazine journalism, from Syracuse University’s S. I. Newhouse School of Public Communications. Right from the start, she proved to be a highly capable editor and quickly developed into an outstanding one. When it came time to recommend a successor I could think of no one better qualified than Cindy to replace me as Editor.

Timothy C. Mack, president of the World Future Society, shares my enthusiasm for Cindy and has given her his full support.

Adding further to my confidence in the future of THE FUTURIST is the fact that we have in the last six years added three talented journalists to the staff. They are senior editor Patrick Tucker, who also serves as the Society’s director of communications, and staff editors Aaron M. Cohen and Rick Docksai, who also work diligently on the Society’s journal for professional members, World Future Review. In addition, we have on staff Lisa Mathias, a highly talented artist, as our Art Director.

All in all, THE FUTURIST has never had such a strong editorial staff, so I have never been more confident of the future of our magazine. We hope that you will continue to share our journey into the future.

THE FUTURIST versus the World Future Society

Some readers may wonder, “Which came first—THE FUTURIST or the World Future Society?” The fact is that they were born almost simultaneously and either one can claim priority.

President Ronald Reagan meeting futurists at the White House, February 1, 1985

Here’s why. Back in 1966, I prepared a six-page newsletter providing news about new scientific discoveries and the ideas that scientists and other thoughtful people were expressing about the future. I decided to call this newsletter THE FUTURIST and sent copies of it to people I thought might be interested. These people included comprehensive designer Buckminster Fuller, physicist Herman Kahn (author of On Thermonuclear War and other prescient works), science writer Arthur C. Clarke, science-fiction writer Isaac Asimov, and Glenn T. Seaborg, the Nobel Prize–winning discoverer of plutonium.

In my newsletter, I invited the recipients to join me in establishing an organization devoted to the study of the future. To my surprise and delight, a number of well-known people actually responded to my mailing with a keen interest in what I was doing.

Furthermore, a few of the respondents lived in the Washington, D.C., area where I lived, so I could easily invite them to lunch and try to enlist their support for the project. Happily, several people responded, and one, Charles W. Williams, said he could arrange space for a meeting in his suite at the National Science Foundation. This was perfect: We would be born in one of the world’s most prestigious scientific organizations. That fact, I hoped, would counter the view that people interested in the future were exclusively science-fiction fans or perhaps somewhat weird.

As our plans for the proposed World Future Society began to take shape, we started preparing for its official launch, but we immediately encountered a big problem: We needed money if we were going to do anything. So we decided we would have to ask members of our new Society to pay modest dues and also to pay for their own lunches at our first meeting. Fortunately, a number of attendees were willing to do so.

This policy made the Society economically viable, though money would remain, even to this day, a serious limitation on what the Society could do.

Slowly and erratically, we received membership applications and dues income while avoiding every possible expense by doing almost everything we could by ourselves. We pressed family members and colleagues into providing free labor for humble projects such as typing and stuffing envelopes. My wife, Sally, and our neighbors, friends, and children all were enlisted into doing Society chores.

So with a little money and lots of free labor, the newborn World Future Society—and its modest newsletter—could just barely manage to pay the bills. The Society’s membership gradually grew; though lack of money continued to dog us, we were able to survive and even grow.

To boost revenue, I decided we needed to offer members something more than just a crudely printed newsletter. So I decided to expand the newsletter, despite knowing nothing about typesetting, layout, art, and other skills needed for magazine publishing, and despite still having almost no money to pay suppliers for these services. However, I managed to recruit an unemployed friend who had had some experience in publishing, and with his help we produced the first issue of THE FUTURIST as a magazine (the March-April 1967 issue).

To our great joy, the response to this first issue was very encouraging and allowed us to persevere. We continued to improve the magazine and keep the World Future Society alive, but it was never easy.

Today, the members of the Society can take pride in what we have accomplished so far. We have come a long way, but I believe we have enormous opportunities to develop into a far stronger Society with an increasingly influential magazine that can help the people of the world toward a far better future than any known in the past.

To read more about the birth of WFS and THE FUTURIST, go to www.wfs.org/content/search-for-foresight.

The American Dream Moves Downtown

Revitalizing urban life with both nature and culture may benefit communities and citizens alike.

By Roger L. Kemp

In mid-twentieth-century America, the dream was to raise children in a single-family house with a yard, away from the traffic and noise in downtown areas. And the U.S. highway system stretched out to new residential subdivisions in the suburbs, as homes added more and more garages for everyone’s cars.

Downtown Trends

Major trends now under way in U.S. downtowns include:

  • Restoring and enhancing nature, such as ponds, parks, and even urban farms.
  • Integrating commercial and residential functions in multistory buildings.
  • Making public transit available, usually light-rail systems.
  • Restoring the public infrastructure to favor people over cars.
  • Combining landscaping with the restoration of all aspects of the public infrastructure.
  • Converting surface parking lots into parks, gardens, and open spaces.
  • Attracting culture, the arts, and entertainment facilities.
  • Attracting educational institutions and nonprofit organizations.
  • Attracting or keeping smaller specialized businesses downtown while bigger businesses relocate to malls or “big-box” sites.
  • Supporting ethnic and niche stores, such as markets, delicatessens, bakeries, and restaurants.
  • Providing a sense of “public place” in the core of downtowns to ensure that shared spaces feel truly shared.

This trend is now reversing. The children born in the middle of the twentieth century are now grown, and their older parents are relocating to more-convenient downtown areas. Young professionals focused on their careers are also heading toward inner-city areas, postponing the American dream of starting a family and moving to the suburbs until later in life. Another group of urban dwellers consists of those who would like to live without needing a vehicle. Hence, a new type of residential development, the Transit-Oriented Development, has emerged around public transit stations. The market for condominiums and townhouses located next to public light-rail transit systems has developed rapidly in recent decades.

the Riverwalk District in downtown Reno, Nevada

Now the challenge for communities is to make downtowns more attractive, more livable. Government planners at the state and local levels need to advocate for changes that will benefit downtown areas. One model is the high-rise residential area on the Lower East Side of New York City a century ago, where individuals and families lived in multistory residential structures that featured an assortment of commercial businesses on the ground floor. All of the restaurants, markets, and other types of commercial activity took place at street level.

It’s also a boon for commercial businesses established at ground level to have their market built in above them. Rezoning downtowns to allow more residential units above ground-level businesses is the wave of the future. If you build them, people will come, especially if there’s public transit in the area.

In addition to such mixed-use zoning, blending the commercial and the residential, thriving communities should increasingly bring arts, entertainment, and culture back to downtown areas. Some cities have used libraries and museums as tools to stimulate economic development, while others are trying to lure educational institutions and nonprofit organizations back downtown.

There is also a major trend toward preserving what’s left of nature in urban environments and restoring what’s been removed over the decades. Cities are expanding parks, wetlands, and waterways; they’re enhancing pedestrian access and movement by narrowing streets and widening walkways, bikeways, plazas, and other public areas, reversing the car-centric planning of the previous century. This trend, too, has facilitated the movement of people back to downtown areas.

When successful, these efforts stimulate the local economy and attract the type of businesses, educational institutions, and nonprofit organizations that would benefit from revitalized downtown areas. Additional economic-development incentives would help attract desirable private, educational, and nonprofit institutions to downtowns, but selling local public officials on such incentives requires a clear demonstration of their reasonableness and long-term benefits to the taxpayers and all of the citizens within the community. A nice downtown should serve as a great public place not only for those who live there, but also for other citizens in the area who come to work, shop, eat, or participate in cultural attractions.

Prudent economic-development incentives that promote downtown renewal are a wise way to generate revenues without raising taxes and can assist in balancing a community’s budget. Most cities evolved piecemeal over the years and now need to be retrofitted and redesigned for the future.

Planning and zoning regulations should be in place to accommodate mixed land-uses, infill, and redevelopment projects. Call it New Urbanism, Sustainability, Pedestrian Cities, Healthy Cities, Inner-City Renewal, or the Green Cities Movement—these practices can be applied to projects of all sizes to promote livability in a single building, on a full block, in a neighborhood, and even an entire community.

Roger L. Kemp is an adjunct professor in the Public Administration Program, University of New Haven, and in the Urban Studies Program, Southern Connecticut State University. E-mail rlkbsr@snet.net.

Hackers of the World Unite

Crowd-sourced attacks on networks are increasingly destructive.

Computer networks have been on guard for decades against individuals trying to “hack” them. But networks now face a larger danger from mass attacks, warns IT security analyst Richard Stiennon.

“The new trend is to mobilize forces over the Internet to engage in the equivalent of mass online protests,” writes Stiennon in his latest book, Surviving Cyberwar.

Political groups, organized-crime syndicates, and some governments launch distributed denial of service (DDoS) attacks, which direct hundreds, thousands, or millions of computers to simultaneously strike a single Web site. The target’s servers overload and shut down.

In 2007, when Estonia enacted laws that some Russian-Estonians opposed, denial of service attacks from some 80,000 IP addresses based in Russia sabotaged the Web sites of Estonian government agencies, banks, and telecommunications companies.

Stiennon blames many attacks on Nashi, a 120,000-member Russian nationalist youth association. Some Nashi operatives distribute the attack instructions and encourage members to use them against designated targets.

“They share a political mind and have the computer skills to join a call for an attack,” Stiennon writes.

In an exclusive interview with THE FUTURIST, Nashi member Alexi Kanskakof claims that Russian DDoS attacks have caused major economic disruption in Ukraine and may have contributed to Moscow-favored candidate Viktor Yanukovych winning Ukraine’s 2010 presidential election. Also, during Russia’s 2008 war against Georgia, Russian hackers co-opted Georgian television stations to run pro-Russian broadcasts.

“From these examples, one can see just how effective Russian cyberattacks can be at blackmailing the citizens of other nations or causing economic chaos,” says Kanskakof.

He points out that DDoS attacks carry few risks for the perpetrators. A Nashi member could attack the Web site of a business in Ukraine, for example, without ever leaving Russia. “Even if the Ukrainian police forces found out it was you who did the cyberattack, there is really nothing they can do about it.”

Of course, Russians are not the only ones who may be using this weapon. It is believed that such attacks were also deployed to thwart WikiLeaks in its attempt to distribute “anonymously submitted” diplomatic cables embarrassing to the U.S. government and its global partners. And DDoS attacks were also allegedly launched by WikiLeaks supporters against its “enemies.”

Businesses and government agencies worldwide are at risk, according to Daniel Gonzalez, director of information systems for the Software & Information Industry Association. He says that, while some denial of service attacks are orchestrated by masses of volunteers, others are carried out by “botnets,” networks of computers infected by automated software tools that make them emit malware without their owners knowing it.

“With botnets, what they’re doing is building a network of all these infected computers that they can use for their own purposes,” says Gonzalez. He adds that many organized-crime groups create botnets and sell them to buyers on every continent.

Social-networking sites provide huge opportunities for botnets. These sites have few spam filters, according to Gonzalez, so hackers increasingly use them to distribute malware.

“Someone I know opened up a Facebook message. It looked like it was coming from one of their Facebook friends. It said, ‘Hey, I found this photo of you.’ It turned out it wasn’t a photo. It was installing a virus,” says Gonzalez.

The simplest protections are often the normal precautions that many people fail to take, such as keeping software up to date, notes Stiennon. He also urges Web sites to have independent platforms and not share servers. That way, if one site suffers a DDoS attack, other sites won’t fail, too.—Rick Docksai
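
Neither source spells out an implementation, but one common first line of defense on the server side is per-client rate limiting: count how often each address has made requests recently and refuse service when the count becomes implausible for a human visitor. The Python sketch below is a generic illustration with made-up thresholds, not a technique taken from Surviving Cyberwar.

  # Generic per-client rate limiter, a common first line of defense
  # against flood-style attacks. Thresholds here are illustrative.
  import time
  from collections import defaultdict, deque

  WINDOW_SECONDS = 10        # inspect the last 10 seconds of traffic
  MAX_REQUESTS = 100         # per client address within that window

  history = defaultdict(deque)   # client IP -> recent request timestamps

  def allow_request(client_ip, now=None):
      """Return True to serve the request, False to throttle it."""
      now = time.monotonic() if now is None else now
      window = history[client_ip]
      while window and now - window[0] > WINDOW_SECONDS:
          window.popleft()       # discard timestamps that aged out
      if len(window) >= MAX_REQUESTS:
          return False           # too many recent hits: likely a flood
      window.append(now)
      return True

  # Simulated burst from one address: request 101 gets throttled.
  for i in range(105):
      if not allow_request("203.0.113.7", now=i * 0.01):
          print(f"request {i + 1} throttled")
          break

A limiter like this blunts a single aggressive client; a genuinely distributed attack arrives from thousands of addresses that each stay under the threshold, which is why Stiennon’s architectural advice about separate platforms still matters.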

Sources: Richard Stiennon, author of Surviving Cyberwar (Government Institutes, 2010), IT-Harvest, www.it-harvest.com.

Alexi Kanskakof, member of Nashi, private communications.

Daniel Gonzalez, Software and Information Industry Association, www.siia.net.

Alarms Ring as Wedding Bells Do Not

Trends in postponed marriages and births spark debate on economy’s role.

Americans are waiting longer to marry, and household size declined between 2000 and 2010, according to the U.S. Census Bureau. Marriage is also declining among young people, the Bureau reports. The media have been quick to point to the 2008 recession as the key cause.

“The United States crossed an important marital threshold in 2009, with the number of young adults who have never married surpassing, for the first time in more than a century, the number who were married,” Erik Eckholm of The New York Times reported. “A long-term decline in marriage accelerated during the severe recession, according to new data from the Census Bureau, with more couples postponing marriage and often choosing to cohabit without tying the knot,” he concluded.

Meanwhile, the U.S. National Center for Health Statistics has reported a 2.7% drop in fertility from 2008 to 2009, leading Marilynn Marchione of the Associated Press to comment, “The U.S. birth rate has dropped for the second year in a row, and experts think the wrenching recession led many people to put off having children. The 2009 birth rate also set a record: lowest in a century.”

But, while the fertility drop is recent, it’s actually linked to a longer-term trend. The Pew Research Center reports that the share of American women ending their childbearing years without giving birth has doubled since the 1970s, from 1 in 10 to 1 in 5. Childlessness rose among women without a high school diploma, which could be attributable to a bad economy. Another plausible explanation is the success of public information campaigns urging people to delay childbirth until after high school.

Meanwhile, rates of childlessness declined by 32% for women with doctoral or professional degrees. But this group is still the least likely to have a child, according to Pew.

A few researchers have cautioned that, while the economy may have played a role in some people waiting longer to wed or bear children, it is still too early to extrapolate a clear causal link between the bad economic environment of 2008 and 2009 and the recent marriage and childbearing statistics. Census Director Robert Groves, writing on his blog, noted, “Many factors can affect the estimates of the number and proportion of people currently married. For example, declining numbers could reflect the passing of members of an older generation that had higher marriage rates.”

Pew recently reported that young adults (under 30) with a college degree had become more likely to marry than their peers without a degree, representing a reversal in favor of marriage among that group. Despite this, the overall marriage rate was still down among both degreed and non-degreed young adults.

Pew points to what researchers call a clear “marriage gap” along economic lines. “Those in this less-advantaged group are as likely as others to want to marry, but they place a higher premium on economic security as a condition for marriage.”

In sum, data from the last 10 years show that more Americans are now waiting to marry than a few years ago, though college-educated Americans are less likely to wait than non-college-educated ones. A drop-off in fertility occurred in 2008–2009 and was more pronounced among non-college-educated women than among women with advanced degrees.

The state of the U.S. economy may have been a factor in the drop-off in fertility, and an income-based “marriage gap” may be emerging. However, these trends could turn out to be a blip. A longer-term decline in marriage is seen in decades-old trends of fewer weddings among twenty-somethings and rising cohabitation arrangements in lieu of tying the knot.—Patrick Tucker

Sources: The U.S. Census Bureau, www.census.gov.

Stephanie Ventura, the Centers for Disease Control and Prevention, www.cdc.gov.

Surviving the Great Recession's Aftershocks

By Patrick Tucker

Too much wealth in the hands of too few will result in less for all, warns a former U.S. labor secretary, who offers a prescription for rebalancing wealth.

Aftershock: The Next Economy and America’s Future by Robert B. Reich. Knopf. 2010. 192 pages. $25.

The inequality of wealth in the United States will result in a stagnant economy and political turmoil by the year 2020, argues public-policy scholar and former U.S. Labor Secretary Robert B. Reich in Aftershock. Millions of deeply indebted Americans will embrace isolationism, reject both big government and big business, and sever America’s ties with the rest of the world, he predicts.

To illustrate the size and scope of this disaster, Reich sets up a credible and horrifying scenario: The year is 2020. The recently elected president, Margaret Jones of the Independence Party, is about to set forth on a legislative agenda reflecting the frustrations of the broad, outsider constituency that elected her. Her objectives: a freeze on legal immigration and the swift deportation of all illegal immigrants; increased tariffs on foreign goods; prohibition against foreign investment; withdrawal from the World Bank, the United Nations, and other international organizations; and a default on the U.S. debt to China.

The results are immediate.

“On November 4, the day after Election Day, the Dow Jones Industrial Average drops 50 percent in an unprecedented volume of trading,” writes Reich. “The dollar plummets 30 percent against a weighted average of other currencies. Wall Street is in a panic. Banks close. Business leaders predict economic calamity. Mainstream pollsters, pundits, and political consultants fill the airwaves with expressions of shock and horror. Over and over again, they ask: How could this have happened?”

This aftershock, says Reich, is a direct result of Americans failing to learn the lessons of the Great Depression, thus setting the country up on a course for yet another economic crisis. The most important of these lessons is that too much money resting in the hands of too few people cannot grow an economy. What’s needed is an orderly division of income spread across lower, middle, and upper classes, he argues. When income (hence, wealth) is too concentrated among elites, the economy atrophies and declines.

It’s a classic Keynesian argument that would ring shrill and tinny if we didn’t live in such Dickensian times. Consider that, prior to the Great Recession of 2008, income and wealth inequality in the United States were higher than they had been at any time in the recent past other than just before the Great Depression, with the top 1%—those with incomes of more than $380,000 per year—taking in roughly 23% of total income. Median wages for workers have been stagnant since the 1970s, at about $45,000 a year, despite the fact that the economy itself is much larger than it was three decades ago. Those gains mostly went to those at the top.

This present situation is, of course, not without historical precedent. In the 1700s, wealth inequality in the American colonies was similar to that of the United States today. The climate was particularly wintry in Boston, where the top 5% of the population controlled 25% of the wealth in the 1720s (a share that would reach 50% by 1770). Too often we forget that the decades leading up to the American Revolution were marked by the burning of rich merchants’ shops, occasional riots, and massive resentment over the issue of debt and wealth inequality, as chronicled by the late historian Howard Zinn in A People’s History of the United States: 1492–Present.

Today’s wealth inequality is a moral failing, says Reich, but it’s also an operational malfunction at the root of many of America’s other problems. An economy that is growing across all income levels encourages people to buy more things: new cars, consumer electronics, bachelor’s degrees, bigger houses, and the like. Instead, over the last two decades, a larger portion of the wealth went to a smaller group; as a result, Americans resorted to a number of coping mechanisms to continue consuming at ever higher levels.

The first of these coping mechanisms was the two-income household. In the 1970s, the mass entry of women into the workforce increased household income, but only up to a point. Over the decades since, those economic gains have been eaten up by such things as the cost of child care.

The second coping mechanism that Americans employed to mitigate stagnant wages was longer working hours. This, too, worked only up to a point: By the mid-2000s, Americans were putting in 500 more hours—that’s 12 more weeks—of paid work a year than they were in 1970.

Finally, Americans resorted to saving less and borrowing more in order to continue consuming at ever higher levels. Reich points out that average household debt was 138% of household income in 2007, up from a manageable 55% in the 1960s. This represents the largest gap since the Great Depression. Much of that debt was tied up in home loans that people would never be able to pay off.

The question becomes, Does voluminous spending by the well-funded few necessarily lead to reckless spending on the part of the many? Reich argues that it does, and there is some recent independent research to back him up. In an October 2010 paper titled “Expenditure Cascades,” Robert H. Frank of Cornell University, Adam Seth Levine of Vanderbilt University, and Oege Dijk of the European University Institute show that “changes in one group’s spending shift the frame of reference that defines consumption standards for others just below them on the income scale....”

What of the gainers, the 10% who saw unprecedented wealth and income increases? They didn’t fare as well as you might expect. With too much capital to ever spend efficiently, many of them invested in a series of asset bubbles through unscrupulous Wall Street intermediaries, with predictably lackluster results.

The battle against falling middle-class wages is one that Reich has been fighting for decades, since serving as labor secretary in the Clinton White House. He acknowledges that, even in those instances when he’s had the ear of the president (he also served briefly on the Obama administration team), he hasn’t had much success in implementing the sorts of structural changes that would set the nation’s distribution of income on a more equitable path.

“We in the Clinton administration tinkered. We raised the minimum wage.… We offered students from poor families access to college and expanded a refundable tax credit for low-income workers.… All these steps were helpful but frustratingly small in light of the larger backward lunge.”

Reich lays out several proposals—either reasonable or radical depending on your point of view—to correct the imbalance of wealth in the next decade:

  • A reverse income tax. The government would put extra money into the paychecks of low-wage earners and cut taxes on middle-class Americans (those earning less than $90,000 per year). The policy would be modeled after the Earned Income Tax Credit but would be more ambitious in reach. Reich speculates that the cost to the government would be about $600 billion per year.
  • A carbon tax levied on energy companies. Reich estimates that, if set at $35 per metric ton of CO2, this tax would raise about as much as the reverse income tax (wage supplement) would cost—around $600 billion.
  • A one-time “severance tax” levied against employers who lay off long-term workers, equal to 75% of a worker’s yearly salary.
  • Federal subsidization of less-profitable but socially valuable college majors. Public universities, under a Reich plan, would be free, and loans for private schools would be available at low cost. Upon graduating, a student who took such a loan would pay about 10% of his or her income on the loan for 10 years. After that, the loan would be considered fully paid. “This way,” says Reich, “graduates who pursue low-income occupations such as social work, teaching, or legal services would be subsidized by graduates who pursue high-income occupations including business, finance, and corporate law.” (A rough worked example of this repayment arithmetic appears just after this list.)
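
The arithmetic behind that last proposal is simple enough to check. The short Python sketch below works through Reich’s 10%-for-10-years formula using salaries we invented for illustration; it also assumes, for simplicity, a flat income over the decade.

  # Worked example of the income-contingent repayment Reich describes:
  # graduates pay about 10% of income for 10 years, then the loan is
  # considered fully paid. The salaries below are hypothetical.
  def total_repaid(annual_income, rate=0.10, years=10):
      """Total repaid under a flat income-share arrangement."""
      return annual_income * rate * years

  for career, salary in [("social worker", 40_000),
                         ("teacher", 50_000),
                         ("corporate lawyer", 160_000)]:
      print(f"{career}: repays ${total_repaid(salary):,.0f} over 10 years")

On identical loans, the hypothetical corporate lawyer repays $160,000 while the social worker repays $40,000; that gap is the cross-subsidy Reich is counting on.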

The effect of these proposals, with the exception of the college-funding one, would be to transfer investing power away from the private sector (rich people and their money advisors) and put it in the hands of the federal government, which would then distribute those funds to the people to buy consumer goods.

There’s a libertarian argument against this, but also a practical one. As Reich himself points out, a rising share of consumer spending now goes abroad, as more Americans purchase products made in other countries. Taxing U.S. energy companies—at a time when a larger than ever portion of the fuel the country uses comes from Canada, Mexico, and Saudi Arabia—in order to pay Americans to purchase electronics from Malaysia, toys from China, and wine from Spain seems unlikely to have a positive effect on national GDP.

A better use of such money might be infrastructure or public works, which would put more money in the hands of Americans. Reich acknowledges the dilapidated state of the country’s roads and bridges, but he doesn’t propose a single large-scale public infrastructure project. In fact, he derides the 1990s as a time when too much private investment capital resulted in “more miles of fiber-optic cable than could ever be profitable.” The 1990s telecom asset bubble was certainly severe, but Reich disregards or ignores the types of services that can be offered over the Internet once bandwidth limitations are removed. Perhaps, while serving in the White House, he never experienced the frustration of a slow download.

The idea of a company paying severance costs of 75% of a terminated employee’s yearly salary—in essence paying the “social costs” of outsourcing—is a radical one for the United States. Businesses would argue that such a measure would crimp their flexibility and that the ability to hire and fire freely helps keep companies lean, nimble, and competitive. They might say that, faced with a 75% severance requirement, firing anybody would be too difficult, and American companies would come to resemble Japanese companies during the 1990s—the so-called “lost decade,” when every employee was guaranteed a high degree of job security regardless of whether or not he (it was mostly men) helped or hindered the overall corporation. The suggestion that companies be penalized for firing people reads like an open pander to labor interests, not a viable revenue-generating strategy. A straight tax hike on corporate entities, regardless of hiring or firing behavior, would seem to meet the same objective with fewer downsides.

The principal argument against Reich is that his proposals are politically untenable in an environment where any effort to raise taxes on any American, for any reason, meets with nearly insurmountable resistance from the Right and passionate charges of socialism on the floor of the House of Representatives. The 2010 election saw a number of Tea Party candidates rise to power in some very poor states like Kentucky—places that would benefit greatly from the wealth-redistributing policies that Reich proposes. How did these candidates win? They succeeded by promising to thwart any increase in taxation of the very wealthy, no matter what the cost; they promised to halt any remaining “bail-out” funds from being spent; and they vowed to undo the recently enacted health-care law and its provisions to expand health coverage to more Americans.

It’s one thing to argue that the country, running a record deficit, cannot afford such policies. It is another thing entirely to suggest that such policies are not in the interests of the growing poor. Yet, people in the first district of West Virginia and the first district of Arkansas voted against their own interests.

What does this show? Perhaps the worst enemy of the American middle class is not the most wealthy 1%, but the mistrustful and ever-angrier middle class itself, all of which adds to the timeliness and value of Reich’s achievement with this important book.

About the Reviewer

Patrick Tucker is the senior editor of THE FUTURIST magazine and the director of communications for the World Future Society.

What Hath Hawking Wrought?

By Edward Cornish

Scientists show how gravitational forces might create universes spontaneously, with no divine intervention required.

The Grand Design by Stephen Hawking and Leonard Mlodinow. Bantam Books. 2010. 119 pages. Color illustrations, including original art by Peter Bollinger. $28.

In their ambitious new book, The Grand Design, mathematician Stephen Hawking and his collaborator, physicist Leonard Mlodinow of Caltech, offer scientific explanations for many of the mysteries of the universe.

Why do we exist?

Why is there something rather than nothing?

Why do we live under this particular set of natural laws and not some other?

Philosophers have long struggled with such questions and typically ended by invoking God. But Hawking and Mlodinow insist on a strictly scientific view, commenting, “It is reasonable to ask who or what created the universe, but if the answer is God, then the question has merely been deflected to that of who created God. In this view it is accepted that some entity exists that needs no creator, and that entity is called God. This is known as the first-cause argument for the existence of God. We claim, however, that it is possible to answer these questions purely within the realm of science, and without invoking any divine beings.”

To warm up for Hawking’s expansive thinking, we might begin with his assertion that our universe is merely one of a vast set or assemblage of universes, a multiverse, an idea arising from what the authors call M-theory.

“Our universe seems to be one of many, each with different laws,” Hawking and Mlodinow assert. “The Multiverse Theory is the only theory that has all the properties we think the final theory ought to have.”

According to the authors, a whole universe can be created out of nothing because gravity shapes space and time.

“Gravity allows space-time to be locally stable but globally unstable,” they write. “On the scale of the entire universe, the positive energy of matter can be balanced by the negative gravitational energy, and so there is no restriction on the creation of whole universes. Because there is a law like gravity, a universe can and will create itself from nothing. Spontaneous creation is the reason the universe exists, why we exist. It is not necessary to invoke God….”

Hawking and Mlodinow write in a friendly, engaging style, but the average reader may still struggle with their mind-blowing ideas. Never mind: It’s worth making the effort. Most of us don’t stretch our minds nearly enough.

Readers will certainly expand their thinking by reading The Grand Design, but they may have difficulty finding immediate practical use for it. Let us be patient: Practical uses may well come in the future. In science, theory tends to precede practical applications. Benjamin Franklin’s theorizing about electricity (along with his experiments) led eventually to the huge electric-power industry that we know today. So sometime in the future, Hawking’s ideas may well reshape the world economy and other aspects of our world that we have yet to imagine.

About the Reviewer

Edward Cornish is the founding editor of THE FUTURIST magazine and founder of the World Future Society.

Tools for Problem Solving

By Rick Docksai

In order to meet the challenges ahead, we’ll need less control, more distributed action, and less resistance to change.

2030: Technology That Will Change the World by Rutger van Santen, Djan Khoe, and Bram Vermeer. Oxford University Press. 2010. 295 pages. $29.95.

Technology could contribute to solving many of the world’s problems, ranging from resource shortages to financial crises, state three Dutch scientists in 2030: Technology That Will Change the World. Citing interviews with researchers from health, information technology, energy, foreign policy, and other fields, they identify an array of innovations that could improve life across the globe.

The authors—chemist Rutger van Santen and electro-optical communication professor Djan Khoe, both of Eindhoven University of Technology, with science journalist Bram Vermeer—and the experts they cite agree on several fundamentals: that global systems are growing more interconnected, that information systems must become more adept at gathering information from the ground level and rapidly responding to it, and that humans must overcome their reticence to change.

“We need to pursue more flexible solutions so that technology can serve us more effectively in a fast-changing environment. And we must also come to grips with complexity itself,” the authors write.

Some promising research areas, according to the authors, include the following:

  • Water management. Droughts worldwide will worsen in the absence of new methods to reduce stress on water systems. Potential remedies include drought-resistant crops, improved irrigation, and water purification and desalination systems that operate at the neighborhood or household scale.
  • Energy efficiency. Humans already consume the earth’s resources more than 1.5 times faster than the planet can replenish them—and the deficit is widening. Some hope, however, lies in more energy-efficient buildings and household appliances.
    Prototype alternative-energy systems also show promise. Solar cells made with conducting polymers would be lighter, more flexible, and easier to manufacture than present-day solar cells, for example. Nuclear breeder reactors would provide massive amounts of energy with minimal nuclear waste. And hydrogen could be a practical fuel if combined with other, denser gases.
  • Medicine. In the future, medical scanning software will process more images in less time. Also, the scanners will analyze the images and advise physicians on follow-up tests and treatments.
    Cognitive decline may be inevitable for some people at advanced ages, but they may cope better by using technological applications such as cookers that turn themselves off or kettles that protect users against accidental burns, for example.
  • Manufacturing. Microplants—whole factories the size of a computer chip—will construct devices “to a precision of a few micrometers,” all with much less energy and waste than traditional manufacturing processes. Computer chips cannot get much smaller, but they can become far more capable. “Smart” computer chips will be aware of their environments and act upon them. Applications could include brain-wave monitors for patients who have epilepsy. The monitor would recognize an oncoming seizure and avert it.
  • Communications. Numbers of radio stations, TV stations, and mobile-phone and satellite connections are increasing, but room on the electromagnetic spectrum is limited. Networks will operate better if regulations governing bandwidth are loosened and control delegated to local units, the authors argue. Communications will further benefit from new systems that broadcast with less spectrum and from software-defined radio sets whose components change frequencies and perform upgrades automatically in accordance with changing airwave transmissions.
  • Finances. Major economic crashes are often preventable if market observers spot market instabilities before they spiral out of control. Use of computer simulations and other new network science tools would enable economists to better understand market mechanisms. Computers could even perform trades for people: “Automated” trading would eliminate unnecessary trading and lower market risk.
  • Conflict resolution. Civil conflicts are more numerous, and nuclear weapons are an ever-present danger. Satellites and environmental sampling can help keep the peace, however, by enforcing disarmament agreements.

Foreign policies have to evolve, too, according to the authors. Governments need to pursue greater integration, economic cooperation, and interdependence. In addition, every measure that nations take to use less oil and electricity will engender a more peaceful world.

The world and the challenges it faces are both becoming increasingly complex, the authors acknowledge. They are hopeful, however, that if humans expand their capabilities to cooperatively gather information, analyze it, and act upon it, they will thrive.

“Protecting the future of our industry is not about securing the status quo but fostering the dynamics needed to adapt to changes as they arise,” they write.

In 2030, the authors have provided an incisive report about the upcoming frontiers of modern scientific research. Readers will find this book an approachable guide to the new applications that we might realistically see come into use in the decades ahead.

Books in Brief (March-April 2011)

Edited by Rick Docksai

Keeping Connected with the Joneses

The Abundant Community: Awakening the Power of Families and Neighborhoods by John McKnight and Peter Block. Berrett-Koehler. 2010. 173 pages. $26.95.

Avid consumerism became a societal trend in the early twentieth century, and since then “keeping up with the Joneses” has impacted life in many harmful ways, according to social-policy professor John McKnight and workplace consultant Peter Block in The Abundant Community. They argue that the marketplace has essentially replaced the community in most people’s minds, and thus people’s neighborhoods no longer satisfy their emotional needs.

The incessant drive to buy and consume requires huge corporations, health-care infrastructures, and thousands of different types of specialists to feed it. People work nonstop and rely on specialists to look after their health, maintain their homes, keep their neighborhoods in order, and care for their children. Families spend less time together, neighbors scarcely know each other, and relationships become shallow and utilitarian.

Should consumerism persist, the health of communities everywhere will suffer greatly, the authors warn. No neighborhood can effectively prevent crime, educate its youth, create jobs, keep parks clean, and ensure that the elderly, the poor, and other people in need are cared for unless its residents work together to make all these things happen.

McKnight and Block hold out hope that communities everywhere will rediscover their own nonmaterial abundance and relearn how to create vibrant community life. They conclude by laying out the values a community must adopt to achieve this.

The Abundant Community is an in-depth evaluation of twenty-first-century society and the values that define it. Community activists, organizers, and leaders of all kinds will find it deeply meaningful.

Engines of Human Advancement

Acceleration: The Forces Driving Human Progress by Ronald Havelock. Prometheus. 2011. 363 pages. $28.

Humanity has much to look forward to in this century, argues technology consultant Ronald Havelock in Acceleration. He describes a sweeping transformation of human life by 2050: longer life spans, growing knowledge platforms, swelling ranks of scientists and engineers, exponentially more powerful computers, and the diffusion of a more inclusive human ethics.

Havelock identifies a powerful “Forward Function”—movement of societal and technological progress—that he says has been active throughout human history. Progress has been especially great over the last 60 years due to an array of new forces: expanded learning, increased information storage capacity, the evolution of social networking, a larger division of labor in the service of problem solving, more sophisticated problem-solving processes, and immensely enhanced power to distribute knowledge via media.

For the first time in human history, individual groups of researchers, producers, distributors, and consumers are all continuously connected. These communication ties will bring them more closely into alignment and enable them to work together to innovate more rapidly and consistently.

Pessimism about the future still runs deep, Havelock notes. Vast numbers of people believe that the future will be grim. Havelock encourages a more positive outlook: Pessimism not only lowers quality of life, but it also slows the Forward Function. He remains confident that the Forward Function will stay on course for as long as there is a human species and will continue to improve human life.

Acceleration is an upbeat philosophical perspective on humanity’s past, present, and future. Audiences from all walks of life will find it thought-provoking and inspirational.

Foresight in a Flash?

Flash Foresight: How to See the Invisible and Do the Impossible by Daniel Burrus with John David Mann. HarperBusiness. 2011. 268 pages. $27.95.

We’ve all had moments of “flash foresight”—i.e., intuitive grasps of what is to come—says executive consultant Daniel Burrus in Flash Foresight, written with business journalist John David Mann. The challenge, Burrus adds, is to know when to act on it; sometimes this foresight is counterintuitive and requires doing the opposite of what everyone else is doing.

You exercise flash foresight when you look to the future and try to discern what you already know. Then, once you’ve established your certainties, you attempt to fill in the uncertainties. There is much about the future that we can know in advance, Burrus says.

He describes real-life examples of people who exercised flash foresight to solve real problems. Apple Computer’s leadership used it to resurge from market failure to market domination. The phone company Mobile Telephone Networks used it to create burgeoning cell-phone markets throughout sub-Saharan Africa. And Burrus claims to have used it in the early 1980s to accurately predict the digital revolution, the explosive growth of fiber-optic cable networks, and the sequencing of the human genome by the year 2000.

Burrus also points out examples of people who failed to use it. They include the heads of General Motors, who had a hugely successful company in the mid-twentieth century but faced collapse and federal takeover in 2008.

Flash Foresight presents helpful case studies in how decision makers in any industry can more effectively shed light on their futures.

Islam’s Call to Sustainability

Green Deen: What Islam Teaches About Protecting the Planet by Ibrahim Abdul-Matin. Berrett-Koehler. 2010. 232 pages. Paperback. $16.95.

Conservation of the earth is integral to Islam, argues Muslim author and policy advisor Ibrahim Abdul-Matin. He presents multiple examples of what Muslims are doing and can do to improve human stewardship of the planet and its resources.

These include “green” mosques that incorporate sustainability into their architecture; urban and suburban food gardens that flourish in some Muslim neighborhoods; and Alpujarra, a Muslim community in Spain that draws all of its energy from localized solar and wind generators.

There are also individual Muslims who are leading sustainability changes in their own communities, such as Adnan Durrani, an organic food pioneer, and Qaid Hassan, an entrepreneur who delivers fresh produce to low-income communities in Chicago. Also, the Inner-City Muslim Action Network, a Chicago nonprofit, operates a Green Reentry Project that helps recently incarcerated men transition into green jobs.

None of the examples above is an anomaly, Abdul-Matin asserts. He notes that Muhammad, Islam’s foremost prophet, once said that “The Earth is a Mosque, and everything on it is sacred.” Abdul-Matin points to many verses in the Koran pertaining to daily living and how each actually contributes to solving global problems of energy use, food distribution, water supplies, and waste. He further explains how these teachings can be useful and relevant to anyone, Muslim and non-Muslim alike, who is concerned about the environment’s long-term health.

Green Deen offers a new perspective on Islam—the world’s second-largest religion—and its potential as a force for positive worldwide change. Secular and religious audiences of all faith traditions may find it informative and enlightening.

Can Information Empires Be Free?

The Master Switch: The Rise and Fall of Information Empires by Tim Wu. Knopf. 2010. 366 pages. $27.95.

Since the invention of the telephone, every information technology has evolved along a similar trajectory, says Tim Wu, chairman of the media reform organization Free Press, in The Master Switch. He calls this trajectory “The Cycle.”

At first, the technology is an open system that is controlled by no one and subject to extensive innovation by many different developers. Over time, however, one corporation or entity gains exclusive control. Then the technology becomes a “closed system,” and innovation grinds to a halt.

He traces the Cycle as it played out during the twentieth century in film, telecommunications, and broadcast media. Key industry players took over each market, and the outcomes were blander media content, stifled individual expression, and fewer choices for consumers.

The Internet is still an open system, Wu adds. But there are signs that it, too, could fall under centralized control. The consequences would be staggering, given that information industries are integral to almost every aspect of our lives.

Wu advises against aggressive government regulation of information markets. At the same time, he insists that those who develop information, those who own the networks on which it travels, and those who control the tools of information access must all be kept separate from each other. Government must also remain vigilant against excessively large corporate mergers. These basic checks are vital, Wu argues, to prevent any one corporation from becoming the sole arbiter of what consumers see and hear online.

The Master Switch is a provocative thesis on where the Internet has come from and where it is headed. It will interest technology enthusiasts and all who value a vibrant media market.

Putting Our Minds to Morality

My Brain Made Me Do It: The Rise of Neuroscience and the Threat to Moral Responsibility by Eliezer Sternberg. Prometheus. 2010. 244 pages. Paperback. $21.

As neuroscientists learn more about the influence that the brain’s neurons and neurotransmitters have on behavior, difficult questions arise over how much control people really have over their lives, according to Tufts University medical student Eliezer Sternberg in My Brain Made Me Do It.

Some neurologists believe that human behavior is entirely predetermined by brain chemistry and that free will does not really exist. Many philosophers object strongly to this viewpoint, however. They hold that to deny free will is to reduce human beings to mindless machines without capacity for moral responsibility.

Sternberg presents both sides and then concludes with his own nuanced view: The brain influences behavior, but it does not determine it. Humans still have the capacity to make their own decisions. Referencing numerous studies of brain activity, brain hormones, and mental disorders, he maps out the complex process of human decision making and the multiple factors—emotional, hormonal, logical, and situational—that underlie it.

Sternberg recasts complex theories about the human brain and human behavior in simple terms that almost any audience will readily grasp. My Brain Made Me Do It will be an engaging read for scientists and lay readers alike.

Humanity’s Next Great Evolution in Values

Thriving in the Crosscurrent: Clarity and Hope in a Time of Cultural Sea Change by James Kenney. Quest. 2010. 253 pages. Paperback. $16.95.

A cultural sea change is under way across the globe, says interfaith activist Jim Kenney in Thriving in the Crosscurrent. Old beliefs and new beliefs are clashing, and the end result will be the prevalence of cultural values that are better attuned to current realities.

Ethnocentric values—sexism, racism, war, materialism, greed, and exploitation of the environment—are receding. And world-centered values—gender partnership, intercultural dialogue, religious pluralism, nonviolence, spiritual awareness, social justice, and environmental justice—are taking their place.

At least three such sea changes have taken place in human history: the rise of agriculture, the emergence of the major Eastern and Western religious traditions, and the Copernican realization that the Earth is not the center of the universe. Each one signified a profound shift in human understanding and an affirmation of interdependence and creative complexity.

Kenney points out concrete examples of the sea change in academia, the nonprofit world, contemporary politics, and other areas of life. He describes current reactionary forces opposing change, but argues that the new values will ultimately prevail.

Readers who worry about humanity’s future will find in Thriving in the Crosscurrent a compelling case for hope.

Working with Millennials

The 2020 Workplace: How Innovative Companies Attract, Develop, and Keep Tomorrow’s Employees Today by Jeanne C. Meister and Karie Willyerd. HarperCollins. 2011. 294 pages. $26.99.

The Millennial generation—all those born between 1977 and 1997—will constitute nearly half the world’s workforce by 2014, according to workplace consultant Jeanne Meister and Sun Microsystems vice president Karie Willyerd in The 2020 Workplace. They call on employers to plan now for a new paradigm in how and where people will work, the skills they will offer, and the technologies they will use to communicate.

Workforces will exhibit greater diversity in age, gender, and ethnicity, the authors forecast. Also, due to the proliferation of virtual communications, more offices will consist of employees who are dispersed across remote corners of the globe. Professionals everywhere will have far more options as to how, where, when, and for whom they work—provided that they produce results. Leadership will have to be more global, culturally aware, and skilled at building alliances and sharing authority.

The authors describe the unique values that will set the Millennial workforce apart—such as freedom, personal choice, collaboration, corporate integrity, and innovation—and how these priorities will influence their professional lives. They advise employers on how to best engage this new generation while still keeping their senior employees satisfied.

Workplace managers and leaders in practically any industry or sector may find The 2020 Workplace a helpful guide to preparing their workplaces for success in the world of 2020.

Future Active

Edited by Aaron M. Cohen

Symposium Tackles Sustainable Transportation

University of Virginia professors from such diverse departments as business, nursing, urban planning, and architecture came together to discuss sustainable transportation at the symposium “The Car of the Future / Future of the Car.”

The event was conceived as a multidisciplinary exploration. “If you want to approach the subject properly, you need expertise that comes from many different disciplines,” said co-organizer Manuela Achilles, program director of UVa’s Center for German Studies.

Guest speakers included bestselling author and futurist Jeremy Rifkin, president of the Foundation on Economic Trends, who presented on “The Third Industrial Revolution and the Reinvention of the Automobile.” Christopher Borroni-Bird, GM’s director of advanced technology vehicle concepts and co-author of Reinventing the Automobile: Personal Urban Mobility for the 21st Century, spoke as well.

Daniel Sperling and Deborah Gordon, the co-authors of Two Billion Cars: Driving Toward Sustainability (Oxford University Press, 2009), also gave a presentation. Their book examines, among other things, the global emphasis on individual car ownership.

Most of the sessions were free and open to the public. University undergraduates also participated in “The Car and its Future,” a contest that gave them the option to either write an essay or design a project around the symposium’s theme.

Source: University of Virginia Center for German Studies, http://artsandsciences.virginia.edu/centerforgermanstudies.

Ten Likely Global Occurrences

“Much can happen in ten years—just review the past decade.” So begins the Copenhagen Institute for Futures Studies’ report “Ten Tendencies Towards 2020.” With this in mind, the CIFS analyzes 10 shifts that the organization believes are already well under way and examines how they could play out in the future, charting potential consequences.

The fact that things are already moving in these general directions, with some momentum behind them, is what distinguishes them as tendencies rather than trends. A panel of Danish executives from different industries rates the significance of each item on the list, on both industry-wide and global levels.

Some highlights from the report are as follows:

  • Due to a number of factors, ranging from aging populations to the financial crisis, companies will place increasingly greater value on employees’ talent and ability, to the point where talent will be “regarded as a company’s most important asset for future growth.” In order to retain talented employees, businesses will compete with each other to offer better work environments, larger salaries, and other benefits. Worth noting: “One method of identifying potential and existing employees’ undiscovered talents in the future could be brain scanning.”
  • The 10-year economic prognosis looks good for many countries in Africa—particularly the so-called African Lions, which include Botswana, Egypt, Libya, Mauritius, Morocco, South Africa, and Tunisia. Fueled in no small part by attention from investors in China and India, “the region’s economic capacity is one of the fastest growing in the world,” according to the report. However, there will continue to be substantial divides between haves and have-nots within these countries. Worth noting: Africa could become an increasingly popular vacation destination for Westerners.
  • In contrast, “the indications are that Europe’s glory days are coming to an end,” according to the CIFS. One (perhaps all-too-likely) scenario shows Europe and the United States experiencing zero economic growth. “On the other hand, it is possible to set up scenarios in which [basic] reforms are gradually implemented, and there is a return to growth, albeit at a lower level than we experienced during the decade from 2000-2010.” Worth noting: Spain, the EU’s fifth-largest economy, recently made headlines when it reported zero growth for the third quarter of 2010.
  • A renewed and lasting interest in collectivity and community will benefit global society. Examples of such range from social networking sites to urban car-sharing programs. The CIFS writes, “The communities of the future will be based on co-creation.” In other words, rather than competing against each other, talented people will work together to find innovative solutions to overarching problems. Worth noting: The report suggests that digital media could be facilitating a kind of collective intelligence on a global level.
  • Mental doping is on the rise. Prescription medications such as Adderall, Ritalin, and beta blockers are being used (and abused) more and more as brain stimulants by students and workers looking to improve their mental performance. Yet cognitive-performance enhancement does not carry the same stigma that physical-performance enhancement does. “Is this a development that gives cause for concern? Opinion on this is divided,” the report says. Whether such substances will be banned from schools and workplaces—or instead tacitly allowed—is a big question. Worth noting: This tendency overlaps with intensifying genetic research, personalized medicine, and the pioneering of such methods as in utero gene therapy.

The CIFS report is a follow-up to 2003’s “Ten Tendencies Towards 2010.”

Source: Copenhagen Institute for Futures Studies, www.cifs.dk.

Legendary Conservationists Share Award

President emeritus of the Missouri Botanical Garden Peter H. Raven and Harvard University entomology professor Edward O. Wilson were the co-recipients of the 2010 Linnaean Legacy Award. The award was presented by the Linnaean Society of London and the International Institute for Species Exploration at Arizona State University in recognition of the two colleagues’ contributions to the field of biological classification. The ceremony was held at the New York Academy of Sciences as part of the conference “Sustain What? The Mission to Explore and Conserve Biodiversity.”

During the conference, scientists also worked on developing an ambitious 50-year plan to discover and classify at least 90% of the Earth’s species. It is estimated that only 20% (1.9 million) of all species have been discovered and classified so far. What’s more, experts predict that around 30% of all species will become extinct during the twenty-first century. This massive extinction is “changing the entire character of life on Earth,” Raven told the crowd. Preserving the various species—the so-called living environment—is essential to protecting the physical environment, Wilson said during the joint keynote presentation.

Raven’s past articles for THE FUTURIST, including “A Time of Catastrophic Extinction: What We Must Do” (September-October 1995) and the cover story “Disappearing Species: A Global Tragedy” (October 1985), have also sounded this alarm. In “A Time of Catastrophic Extinction,” he suggests ways to prevent what he warns would be “an episode of species extinction greater than anything the world has experienced for the past 65 million years.”

Source: The International Institute for Species Exploration at Arizona State University, www.species.asu.edu.

Europe’s Blue Future: Offshore Energy

The Marine Board of the European Science Foundation presented a report at the EurOCEAN2010 conference that details how Europe could get half of its electricity from renewable marine resources by 2050. The plan entails researching and developing innovative ways to harness energy from offshore wind, tides, and ocean currents, as well as marine biofuels such as algae.

The report, entitled “Marine Renewable Energy: Research Challenges and Opportunities for a New Energy Era in Europe,” points to the fact that the EU currently imports more than half of its energy and that this amount is projected to increase if current trends are unchanged.

In making its case, the Marine Board highlights potential economic benefits, such as job creation and new business opportunities—which were dubbed “blue jobs” and “blue growth” at the conference. The Board’s projections show that “by 2050, the Renewable Ocean Energy sector could provide 470,000 jobs, which corresponds to ten to twelve jobs (direct and indirect) created per megawatt installed.”

Developing the technology means developing new bodies of knowledge in fields ranging from engineering to ecology. It also entails crafting innovative legislation to help facilitate it. “Marine renewable energy is in its infancy, but it has remarkable potential, so the target of 50% is ambitious, but achievable,” said Marine Board chair Lars Horn. “We just need research, industry and policy to come together.”

The report further recommends comprehensively assessing the available aquatic resources, and developing ways to properly monitor them, in order to keep track of the environmental impacts caused by large commercial-scale installations. Such issues could include electromagnetic disturbances and problems caused by altering water circulation patterns. The report states: “There is limited data or knowledge on the medium- and long-term environmental impacts of Marine Renewable Energy devices.” The Board advocates finding better ways to research, predict, and respond to potential cumulative impacts. To that end, it also advocates for the creation of an initial test site.

The Marine Board is a co-organizer of the EurOCEAN2010 conference, which was held in October 2010 in Belgium.

Sources: European Science Foundation, www.esf.org. EurOCEAN2010 Conference, www.eurocean2010.eu.

January-February 2011, Vol. 45, No. 1

The World Is My School: Welcome to the Era of Personalized Learning

By Maria H. Andersen

Future learning will become both more social and more personal, says an educational technology expert.

Tomorrow in Brief

  • Can Handedness Be Altered?
  • El Niño Events Gain Strength
  • Stress and Cancer
  • Clean-Energy Innovations
  • Artificial Leaf Mimics Solar Cells

Book Reviews

Human Civilization Migrates Northward

A book review by Rick Docksai

In The World in 2050: Four Forces Shaping Civilization’s Northern Future, geologist Laurence C. Smith notes world-changing population and economic shifts.

Books in Brief

  • Bottled and Sold
  • Climatopolis
  • How to Catch a Robot Rat
  • Outrageous Fortunes
  • Packing for Mars
  • Rethinking Risk

World Trends & Forecasts

As Tweeted: You Know You’re a Futurist If …

Recently on Twitter, a few of us were reflecting on what makes futurists special.

Members Only

Pleasure, Beauty, and Wonder: Educating for the Knowledge Age

By John M. Eger

The future workforce will need to be more innovative, argues a communications and public policy scholar. While math and science are important, they need to be infused with the creative spark that comes from the arts.

The Future of Medicine: Are Custom-Printed Organs on the Horizon?

By Vladimir Mironov

Medical researchers are creating robots that can bioprint new tissue and organs directly into patients’ bodies while performing surgery—without assistance from doctors.

A Convenient Truth about Clean Energy

By Carl E. Schoder

The earth is awash in energy; we just need new infrastructure to tap it. A chemical engineer shows how we could break free of fossil fuels by deploying the power of ammonia and hydrogen.

Special Section: 70 Jobs for 2030

“Job creation” starts with innovative thinking, so we invited some of the best futurist minds to envision where the ground may be most fertile for future opportunities.

Future View: Future, Fantasy, and Positive Volition

By Matthew Colborn

When futurists choose to be optimistic, it is sometimes mistaken for mindless fantasy. But a psychologist argues that optimism is vital for effective futuring, because it allows us to face reality with the fortitude to make things better.

Future Scope

  • Tipping Point in National Debt
  • Who’s High Now?
  • Men, Women, and Cognitive Impairment
  • Reducing Military’s Resource Consumption
  • WordBuzz: Halfalogue

World Trends & Forecasts

The World Is My School: Welcome to the Era of Personalized Learning

By Maria H. Andersen

Future learning will become both more social and more personal, says an educational technology expert.

Humans have always been learning, but how we learn has changed over time. The earliest means of education were highly personal: Oral histories passed from adults to children, informal or formal apprenticeships, and one-on-one tutoring have all been used in the early history of most cultures. It’s only been in the last two centuries that we’ve used formalized systems of mass public education (aka industrialized education).

Certainly, personalized learning is the more effective method. In 1984, educational researcher Benjamin Bloom found that average students who were tutored one-on-one outperformed 98% of students who were learning via conventional methods (this is referred to as Bloom’s two-sigma problem). However, personal learning is not cost-effective, and so we currently educate students in batches of 20, 30, or even 200 students at a time. This is likely to get worse before it gets better, with prominent philanthropists like Bill Gates declaring that “the best lectures in the world” will be online within the next five years. Certainly we can use technology to deliver those lectures to thousands, or even millions, of students at a time, but a lecture does not automatically produce learning any more than attending a class does.

Mass education is adequate, as long as students are highly motivated to learn and get ahead of their peers. In developing countries, a student who is successful in education will be able to climb the ladder of personal economic prosperity faster than those who are not successful. But in industrialized countries, where prosperity is the norm, an education does not necessarily translate into a significantly higher standard of living. In these countries, there is no longer a large economic incentive to learn, so the motivation to learn must become intrinsic. As we redesign mass education, we must address learners’ intrinsic motivations, which means that education must circle back to being personal again.

The vision of a modern education built around personalized learning is not new, but it is definitely tantalizing. Neal Stephenson’s novel The Diamond Age (Spectra, 1995) shares a vision of personalized learning in the future via an interactive book that possesses a conversational interface (CI) and “pseudo-intelligence,” a kind of artificial intelligence (AI) that is inferior to human intelligence. It’s likely that we’ll see decent conversational interfaces within the next decade, and certainly applications like Google Voice are moving us much closer to this reality. AI that is capable of directing the learning needs of a human will take much longer, developing in the next 20–50 years, but we can’t wait that long for the technology to catch up with education. The need for personalized learning exists in the here and now. So how does one bridge this vision of the future with the realities of the present?

Learning Technologies Today

Let’s start by taking stock of the personalized technologies for information that we already have. We have software that stores the content we like (e.g., Evernote, Posterous) and software that merely stores the location of that content (e.g., Diigo or Delicious). Even traditional media, like books, now have parallel digital systems that allow for note taking, highlighting, and bookmarking (e.g., Kindle, Nook, or iPad). While it’s useful to store and search information, I would venture that we rarely go back to look at the information we mark for storage.

This is a problem; for deep learning to occur, we need to have repeated exposure to the information, along with some time in between for reflection. We need to give our brains a repeated opportunity to process the information we take in so that it becomes knowledge, understanding, and wisdom. This means we’re going to have to find time in our busy lives to reflect on the information that flows past us on a daily basis, and we’re going to need some kind of technology that keeps us on track with our learning goals.

While it seems outrageous that we could find any more time in our busy lives, consider some of the disruptive changes we’ve seen quite recently that affect how we spend our free time. Facebook, now with 500 million users, has disrupted normal social interactions in a little over six years. Micro-blogging exploded when a Web site simply invited us to answer the question: What’s on your mind? Twitter users now send more than 50 million tweets per day, and big news stories break first on Twitter—in real time and with eyewitness accounts. As big as Twitter is, there were more people playing Farmville (a social media game on Facebook) at its peak than there were active Twitter users—a fact that has not gone unnoticed by game designers and educators. These Farmville players are choosing to spend their free time (their “cognitive surplus,” as media scholar Clay Shirky calls it) on collaborative activities: plowing virtual soil and planting virtual crops.

These innovative social disruptions have happened quickly, but not from within the existing organizational structures. For example, Facebook did not disrupt phone communication by changing the nature of phone calls or phones. Facebook built an entirely new system that eventually circled back around to phones by way of phone apps. In the same way, the trick to developing a personal learning system is to abandon thinking about how to build it from within the existing educational system and to begin pondering how such a system could be developed outside of education. Educational institutions form a vast interconnected network, and while small changes can occur within the system, individual parts only have the ability to flex within their existing boundaries. For a personalized learning system to take hold inside education, it will have to be built on the outside.

A Simple Idea: Learn This

Let me propose a realistic scenario of what a true personalized learning system might look like and how it would function. We first have to create (1) a new layer of learning media in the background of the existing Internet and (2) an ecosystem of software to easily manage the learning media we engage with. In the same way we’ve integrated buttons like Twitter’s “Tweet this” and Facebook’s “Like” at the end of videos, articles, and other media, imagine we now add a button for “Learn This.” Clicking this button (anywhere you find it) would bring you into an interface to help you learn the content.

We don’t need a humanlike artificial intelligence to begin this journey. The technology for such a journey already exists and is simple enough to use with traditional learning methods. In the first version, learning should simply be by way of Socratic questioning, where questions are used to analyze concepts, to prod at the depth of knowledge, and to focus on principles, issues, or problems. Socratic questions are elegant because, unlike with other formats (e.g., multiple choice), learners must self-generate the answers rather than rely heavily on the ability to recognize a correct answer when they see it. The personal learning system would use a spaced repetition algorithm (SRA) to reintroduce the Socratic questions over time so that biological memory is more likely to grasp onto the ideas and information. For now, let’s call this system SOCRAIT (a play on “Socratic” that includes SOC for social, AI for artificial intelligence, and IT for information technology within its name).

Learn This! SOCRAIT Questions for “The World Is My School”

Author Maria H. Andersen offers the following questions as sample Socratic-learning prompts for readers of this article.

• What technologies are we likely to see in personalized learning systems on the 20–50 year horizon?

• What arguments are made for the likelihood that we can “find” the free time to engage in a personal learning system?

• Why are Socratic questions and spaced repetition algorithms (SRA) an elegant solution to the personalized learning problem?

• How are responses evaluated in the proposed SOCRAIT system?

• What evidence do we have that people will be willing to put in the cognitive energy to create a learning layer on the Web?

• How could SOCRAIT be used by journalism to improve the revenue stream?

• How would the SOCRAIT model change the way we consume media?

• What are Socratic scholars and what function do they serve?

• If SOCRAIT were implemented, how would the role of educators shift?

• What is the “game layer for learning” and why is it necessary for something like SOCRAIT to work?

• What is needed to build a system like SOCRAIT?

For example, suppose I read an article about digital copyright in educational settings, and I decide that it’s important for me to remember some of the details of this article. At the end of this article, I choose “Learn This” to add a question to my SOCRAIT question bank. Two options would appear: (1) write your own question or (2) choose from a list of questions written by others. If I choose the first option, I might write a simple question and answer for myself: “What are the allowable uses for copyrighted video in an educational setting?” Following this, I’d write a short summary or clip a few sentences of content from the article to summarize the answer to the question. Along with the question and answer, SOCRAIT would save the source URL (link to the content), and I could tag the question with metadata tags I indicate (e.g., copyright, digital copyright, and education).
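
The entry just described is essentially a small record: a question, an answer, a source link, and tags. Here is a minimal sketch of that data model in Python; all names are hypothetical, since SOCRAIT is only a proposal and has no actual API.

```python
from dataclasses import dataclass, field


@dataclass
class QuestionEntry:
    """One "Learn This" entry in a personal question bank (hypothetical model)."""
    question: str                                  # the Socratic prompt
    answer: str                                    # short summary or clipped sentences
    source_url: str                                # link back to the original content
    tags: list[str] = field(default_factory=list)  # learner-chosen metadata tags


# The copyright example above, expressed as an entry:
entry = QuestionEntry(
    question="What are the allowable uses for copyrighted video in an educational setting?",
    answer="A short summary clipped from the source article would be stored here.",
    source_url="https://example.com/digital-copyright",  # placeholder URL
    tags=["copyright", "digital copyright", "education"],
)
```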

Later in the day or the week, when I have some down time, I could reengage with SOCRAIT. Here’s how it would work: I read or listen to a question, answer it in my head or out loud, view or listen to the answer, rate my understanding, and go to the next question. Since the learning is tailored to intrinsic motivations, learners could rate their own ability to answer a question (e.g., 1 = I have no clue, 2 = I knew some of it, and 3 = I nailed it!), and SOCRAIT could make decisions based on these ratings. If your rating of understanding is low or spotty, the system would offer to send you back to the source for another look. Notice that there is no need to develop software to verify the answers to questions—if you aren’t good at rating your own understanding (we call this metacognition), this will come out later in the process, and you’ll have to learn to get better at it.
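
The article leaves the spaced repetition algorithm itself unspecified, so the following is only one plausible sketch of how the 1–3 self-ratings could drive scheduling: confident answers stretch the interval before a question reappears, while a miss resets it and offers to send the learner back to the source. The function and its constants are illustrative, not part of the proposal.

```python
from datetime import date, timedelta


def next_review(interval_days: int, rating: int) -> tuple[int, bool]:
    """Map a 1-3 self-rating to (new interval in days, revisit_source).

    A simplified spaced-repetition rule, invented for illustration:
    3 ("I nailed it!") roughly doubles the wait, 2 ("I knew some of it")
    repeats at the same interval, and 1 ("I have no clue") resets to
    tomorrow and flags a return to the source material.
    """
    if rating == 3:
        return max(2 * interval_days, 2), False
    if rating == 2:
        return max(interval_days, 1), False
    return 1, True


interval, revisit = next_review(interval_days=4, rating=3)
print("next review:", date.today() + timedelta(days=interval), "| revisit source?", revisit)
```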

With a rudimentary conversational interface, like the one implemented in Google Voice, there’s no reason why SOCRAIT couldn’t be voice-based and available anywhere we interact with computers (e.g., cell phones, tablets, auto navigation systems). This would allow us to improve our learning while performing other tasks: commuting to work, making dinner, or walking the dogs.

Initially, the so-called “Pareto’s Vital Few” (the 20% of people who get 80% of the work done) would be the ones who would be most interested in creating and engaging with questions. But as the connectedness of the system matures, the need to write your own Socratic questions would lessen. Authors and media creators would write their own questions, targeting comprehension of important ideas and facts. Media consumers would be able to choose from a list of questions, perhaps seeing a sorted list based on their indicated learning priorities. Two readers of the same article would see different questions at the top of their “suggested questions” based on tags of the content. In some cases, the user might choose to pay for curated or reputable content so that their learning can later be certified by an employer, educational body, or organization.

Personal Learning’s Implications For Education

Now let’s take a step back and look at the big picture. Any content that exists on the Internet (or is connected to the Internet) would be tagged with Socratic learning questions and metadata for subjects. Learners would have their own bank of questions, personalized to their own learning interests. As a result, instead of learning that is designed around a physical place (e.g., schools), an educational space (e.g., learning management systems), or a person of authority (e.g., instructor), this system is designed around the learner.

It goes without saying that the implications for education are huge. In the space of a few years, we could develop a completely separate content learning system that’s incredibly flexible and personalized to the interests of the learner. The architecture needs to develop organically around Web-based content and grow tendrils into everything we produce in the future. It will take some time to go back and create a learning layer to integrate with all the content that we already have, but as we’ve seen from projects like Wikipedia, there are people willing to contribute their time and energy to these kinds of tasks. Wikipedia became the largest encyclopedia ever assembled within a mere six years after its creation, and was built using less than 1% of the time that Americans spend watching TV every year (as calculated by Clay Shirky).

A system like SOCRAIT has the potential to benefit other industries outside of education. For example, modern journalism has been struggling with a revenue problem. While revenue has shifted to online advertising, it is not enough to shore up the industry. At present, the vast majority of Internet content is free and, as Chris Anderson argues in his book Free (Hyperion, 2009), that is not likely to change. How do you get readers (or viewers) to pay for something that they already get for free? The answer: Add something to the content that’s not already there. If readers or viewers had the ability to quickly add reputable questions to their learning bank, this would be a value-added service. Cleverly, the media content would remain free, but access to the question bank would require a one-time payment or ongoing subscription by the consumer. This would certainly help modern journalism (or the textbook industry) shore up its revenue stream.

A New Learning Ecosystem

Books like Nicholas Carr’s The Shallows (W.W. Norton, 2010) cause us to question whether we might be trapped on the information superhighway—stuck on the line between data lanes and unable to scoot forward or backward. Twitter users regularly use the phrase “drink from the fire hose” when referring to their experience of dipping into the live data stream. Information, whether it be from radio, television, print, Web media, or social networks, is coming at us too quickly; all that most of us can do is surface-skim, rarely pausing to reflect or think deeply. To learn, to analyze, to innovate, and to think creatively, we must internalize some of the information we process.

An entirely new ecosystem could grow up around this Socratic learning system. Certainly a ratings system for questions could be built using the technology developed by companies like Netflix. For example, “Your friends John and Iveta chose this question. Would you like to see other questions/media they chose for this topic?” If you choose to do so, the questions you see when you add content to your question bank could be filtered by your existing social networks. Rather than showing all the possible questions in existence for that media (which could become a fairly lengthy list), you could choose to see only the ones people in your social network have also used.
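
Filtering suggested questions through your social network, as imagined above, could start as a simple set intersection. This sketch assumes (my assumption, not the article’s) that the system maps each question to the set of users who have adopted it:

```python
def questions_from_friends(adopters_by_question: dict[str, set[str]],
                           friends: set[str]) -> list[str]:
    """Return only those question IDs that at least one friend has also used."""
    return [qid for qid, users in adopters_by_question.items() if users & friends]


bank = {
    "q-copyright-1": {"john", "iveta"},   # chosen by two friends
    "q-copyright-2": {"a_stranger"},      # chosen by nobody you know
}
print(questions_from_friends(bank, friends={"john", "iveta"}))  # ['q-copyright-1']
```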

So far, I’ve discussed how the system would work if you engaged in reading and watching media as you do today. However, such a system could also shift how and when we seek out content. After all, a lot of time is wasted in modern education by re-teaching content that some of the learners already know. There is no incentive for students to get ahead when the reward is sitting through a lecture on something they’ve already learned.

Imagine: When you need to learn something new, you could subscribe to a curated collection of questions on that topic. For example, “Digital Copyright 101” might be a collection of questions developed by somebody who teaches digital copyright policy to beginners. The truly fascinating shift is that you wouldn’t necessarily start by consuming the media that goes with the questions. Instead, you would simply start answering the questions in your bank. As you encounter learning questions that you can’t answer, you could dive into the content at those points in time—this is the exact point between boredom (with things you already know) and frustration (with things you don’t know), the point to engage in learning.

Testing Knowledge Acquisition

Almost immediately after the personalized learning architecture is in place, we will need a new educational industry tasked with certifying knowledge and understanding. For lack of a better name, let’s call these folks “Socratic scholars.” Their job will be to rate how well you know what you claim to have learned. For example, let’s say I’ve engaged with and theoretically learned 500 tagged questions on biochemistry to prepare for teaching a new class. In order for this to count toward my professional development hours, my college asks me to certify the learning. I pay for a Socratic scholar who specializes in chemistry to rate my knowledge. We meet either in person or via the Web (more likely) and have a discussion about the questions in my learning bank on biochemistry.

The scholar has access to the 500 questions I say I’ve mastered and asks me to answer a random selection. Of course, this is where it would be valuable to have reputable questions in my learning bank (from authors, researchers, scientists, and leaders in the field). Since the scholar can see both my questions and the answers (linked back to original content), it should not be difficult to ascertain whether I have, in fact, mastered the knowledge and concepts as I have claimed. Because the certification is human-to-human, and not human-to-machine, the nuances of human language would be understood. So if the language of the verbal answer and the language of the written answer don’t match up exactly, that wouldn’t be a problem. At the end of the session, the scholar would “grade” my understanding of the 500 questions on biochemistry, and I could provide this certification to the human resources department.

In many respects, this is a much better system than what we have today. For most certification of learning, we simply look at a transcript. If the class is listed, we assume the learner has that knowledge. Of course, knowledge ages—sometimes it evolves into understanding or wisdom, and sometimes it fades out of existence. The fact that I earned a chemistry degree in 1996 does not mean you would want to hire me as a chemist today. Ideally, you’d want me to recertify before I entered the “chemist” job pool. Biological memory is not reflected in the metrics of transcripts or grade point averages.

I am not saying that this “certified” content knowledge equals the ability to function as a practitioner in the discipline. Even a diploma only indicates that the educational system has walked you through some series of appropriate paces for the discipline. Skills like critical thinking and creativity are often lost in education (especially in science and technology) because there is such an incredible amount of content to cover. However, if the content knowledge moved outside the educational system, then educators could focus on the learning that surrounds technical knowledge instead (e.g., problem solving, analysis, creativity, applications).

Let’s imagine what would happen if a robust Socratic learning system were at the heart of the educational system. A learning coach (a more appropriate term for the teacher or instructor in this learner-centered environment) will designate some core material that he or she wants you to learn. For example, in calculus, I might use a set of 500 curated concept-oriented questions from a well-known calculus textbook author, with each question linking to supporting media. Every student would be working on those questions, and so, as a learning community, we’d all work on that together. I would hope that this doesn’t sound like too radical a departure from normal.

This is where it changes: Because every student has different interests and career ambitions, I would also require that each student find an additional 100 questions tagged with both calculus and tags that are of interest to that student. For a student studying to be a doctor, questions tagged with medicine or epidemiology might be appropriate. For a student going into business, questions tagged with marketing or management might be more appropriate.

As the learning coach, my job is no longer to “deliver content” to the students. SOCRAIT does that. Now I can use my time to help students search for good questions, help them to understand the content they are learning, provide activities to help them work with the concepts or connect the material in an applied way, and foster discussion with other students on these topics.

When it comes time to certify the learning for each student, it is done by an oral interview in which I have access to the common questions and the personalized questions for each student. Even if I’m not an expert on all the personalized questions, the answers are provided and the content is related to a subject of my expertise. Again, I only have to ask about a random selection of questions to be able to assess understanding. At the end of the semester, all students have learned their own personal versions of calculus, while still learning a core of common material.

Such a system has implications for lifelong learning “on the job,” too. Instead of holding mandatory training, a human resources department could push out a bank of Socratic questions to all their employees about safety, new initiatives, mission statements, etc. For example, to train employees on Occupational Safety and Health Administration (OSHA) compliance, the employees would be invited to add a curated list of 40 questions about OSHA policies. Each question would lead back to a source that provides the necessary content to answer the question. After two weeks, someone in HR can act as the Socratic scholar and spend five minutes with each employee to test his or her knowledge of the policies, using a random selection of questions.

A Game Layer for Learning

Futurist John Smart writes about a coming “valuecosm” within 10 to 20 years, when we’ll be able to program our apps or avatars to make decisions for us based on what we say is our set of values. The real question is whether learning can become one of our new values, especially in the United States. In 2009, the U.S. Bureau of Labor Statistics estimated that the average American adult spent more than five hours per day on leisure activities (close to three of those leisure hours watching television) and about 30 minutes per day on educational activities. Given the 10:1 ratio of leisure to educational activities, is American culture likely to embrace learning as a choice? Initially my answer was no, but then I began to think about video-game design.

Entrepreneur Seth Priebatsch spoke at TEDxBoston (2010) about building a “game layer on top of the world.” What if one of the game layers we create surrounds learning? The same game dynamics used to build successful video games (e.g., appointment dynamics, influence and status dynamics, and progression dynamics) could be deployed to make learning the game itself. While this might still be a hard sell for the average adult, there will be subpopulations, such as early technology adopters, who will see the immediate value in cultivating and learning from their own question banks. Children who grow up learning with a Socratic question system might gain learning values naturally and carry these to their adult lives.

A successful Spaced Repetition Socratic Learning System (SRSLS) would have to entice you to keep to specific goals, like answering 50 questions per week or answering 100 questions with a certain tag in the next month. Any of these goals could be incentivized with points (1 question answered correctly = 1 point), incentive rewards for meeting certain goals (“you’ve earned your Silver Calculus badge for 100 questions learned”), and social status levels (“Maria has just become a Calculus Master—can you do it too?”).
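
Those incentives are straightforward to express in code. In this toy sketch, the thresholds and badge names are invented for illustration rather than anything the article specifies:

```python
def check_incentives(points_by_tag: dict[str, int], weekly_answers: int) -> list[str]:
    """Turn raw activity counts into the kinds of rewards described above.

    One point per question answered correctly; a Silver badge at 100
    questions learned for a tag; a goal of 50 questions per week.
    All thresholds are illustrative assumptions.
    """
    rewards = []
    for tag, points in points_by_tag.items():
        if points >= 100:
            rewards.append(f"Silver {tag.title()} badge ({points} questions learned)")
    if weekly_answers >= 50:
        rewards.append("Weekly goal met: 50 questions answered")
    return rewards


print(check_incentives({"calculus": 112, "chemistry": 40}, weekly_answers=53))
```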

Those engaged in formal education would participate with a far greater intensity of daily questions than those who are in the workforce. However, the wise worker would continue to learn, albeit at a slower pace. Résumés would boast levels of knowledge on particular topics and stats on the intensity at which you participate in learning.

Let’s Build It

A diploma has become a social signal to stop learning. In today’s world, where technical knowledge doubles every two years, this is absolutely the wrong thing to do. Careers shift overnight, and industries collapse rapidly. We have to learn, and learn faster than we ever have before, in order to stay ahead of the problems we are now creating.

The content for a system like SOCRAIT already exists; it is the architecture and interface we are missing. This new learning medium needs to be an interconnected network of user-generated, or author-generated, Socratic questions with a seamless question-management interface. The architecture needs to remain open so that anyone can create questions on any content, and any developer can build applications for the computing device of his or her choice.

A system for personalized learning will not grow from inside formal education. Education is like a field that’s been overplanted with only small patches of fertile soil. Too many stakeholders (parents, unions, administration, faculty, etc.) compete to promote various ideas about how to change, acting like weeds or plagues that choke off plant growth. The fresh and fertile soil of the open Web can foster the quick growth of a personalized learning system. Then, like a good fertilizer, it can be used to replenish the soil of formal education and help us to reach that “Holy Grail” of education: personalized learning for all.

About the Author

Maria H. Andersen is the Learning Futurist for The LIFT Institute at Muskegon Community College, Muskegon, Michigan. She has degrees in mathematics, chemistry, biology, business, and (ABD) Higher Education Leadership. She is considered an expert in educational technology and has been studying, researching, speaking, and writing about the future of education and learning for several years, including at the World Future Society’s 2010 meeting. E-mail busynessgirl@gmail.com or search @busynessgirl on Twitter.

Tomorrow in Brief (January-February 2011)

Can Handedness Be Altered?

“Choosing” which hand you use to reach for a cup or doorknob isn’t something you give a lot of thought to, but in fact the brain undergoes a complex decision-making process, pitting left versus right sides. Understanding this process may help researchers develop treatments for stroke patients and others with motor disorders.

Researchers at the University of California, Berkeley, found that they could increase the use of the left hand among right-handed individuals by applying magnetic stimulation to the left side of the parietal cortex (which governs the processing of spatial relationships and planning).

Beyond the clinical applications for helping patients with brain injuries, the researchers believe that magnetic stimulation could potentially be used to influence other decision-making processes.

Source: University of California, Berkeley, www.berkeley.edu.

El Niño Events Gain Strength

The El Niños occurring in the central Pacific Ocean have nearly doubled in intensity since 1982, according to researchers from the National Aeronautics and Space Administration (NASA) and the National Oceanic and Atmospheric Administration (NOAA). Intensity is measured by how much the sea surface temperature deviates from the average.

While climate change may be behind the shift of these more-intense El Niños from the eastern to the central Pacific region, it is their impacts on weather patterns that have the researchers concerned.

“El Niño’s impact on global weather patterns is different if ocean warming occurs primarily in the central Pacific instead of the eastern Pacific,” according to NOAA’s Michael McPhaden. “If the trend we observe continues, it could throw a monkey wrench into long-range weather forecasting, which is largely based on our understanding of El Niños from the latter half of the twentieth century.”

Source: National Oceanic and Atmospheric Administration, www.noaa.gov.

Stress and Cancer

Cancer may have found a partner in resisting radiation and chemotherapy: stress.

If a patient exercises intensely or experiences emotional stress within two days prior to therapy, a cell-repairing protein (Hsp27) is activated that protects the cancer cells, according to Ohio State University researchers.

The observation gives doctors hope of finding ways to counter Hsp27’s role in interfering with cancer-cell death. In the meantime, cancer patients may be able to improve their own treatment by avoiding stress, according to lead researcher Govindasamy Ilangovan, an associate professor of internal medicine.

Source: Ohio State University, www.osu.edu.

Clean-Energy Innovations

Patenting rates for clean-energy technologies have increased by approximately 20% per year since 1997, and nearly 80% of innovations originated in just six countries: Japan, the United States, Germany, South Korea, France, and the U.K., according to a new study, “Patents and Clean Energy: Bridging the Gap between Evidence and Policy.”

The surge in patent activity following the adoption of the Kyoto Protocol suggests that political decisions can help drive international competition, even in countries that did not ratify the treaty, such as the United States, the study’s authors conclude.

Source: “Patents and Clean Energy,” published by the European Patent Office, is available from www.epo.org/clean-energy.

Artificial Leaf Mimics Solar Cells

A leaf-like solar device made from a water-based gel and infused with light-sensitive molecules could offer a less expensive and more environmentally friendly alternative to silicon-based solar cells.

Researchers at North Carolina State University used plant chlorophyll coupled with electrodes coated by carbon materials. The devices are stimulated by the sun’s rays to produce electricity in the same way that plants are stimulated to synthesize sugars.

Though the device is currently low-efficiency, the researchers hope to improve the biologically inspired “soft” photovoltaic arrays, perhaps one day covering roofs with sheets of artificial-leaf solar cells.

Source: North Carolina State University, www.ncsu.edu.

Human Civilization Migrates Northward

By Rick Docksai

A geologist notes world-changing population and economic shifts.

The World in 2050: Four Forces Shaping Civilization’s Northern Future by Laurence C. Smith. Dutton. 2010. 322 pages. $26.95.

Brazil, China, or Iceland—which country’s population will grow the fastest between now and 2050?

The answer is Iceland, according to Laurence C. Smith in The World in 2050. The UCLA geologist envisions a “New North”—comprising Canada, the United States, Russia, Sweden, Finland, Norway, Iceland, Denmark, and Greenland—of intense activity, expansion, and economic growth. The populations of Canada, Iceland, and Norway will all grow by 20% or more.

Smith identifies four forces of change behind this great shift—demography, increasing strain on the earth’s resources, globalization, and climate change—and the specific ways that each force may shape human civilization in the next four decades.

“Big changes often just sort of ease their way in,” he says. “And quietly change the world.”

Four Forces Shaping the Future

Demography. In 2008, for the first time in history, more humans were living in cities than in rural areas. Urbanization will continue and will necessitate expert growth management.

Industrialized countries will also need to worry about their rapidly growing elderly populations. By 2050, the nursing homes of Brazil, Russia, India, and China may be full to the brim, and none of the four countries may have enough employees to staff them.

The highest fertility rates will be in developing nations. But before their young people can take up needed jobs in industrialized nations, their home societies will need to strengthen security and governance so that those young people have real opportunities for education and job training.

Growing strain on the earth’s resources. The world is projected to consume 106 million barrels of oil a day by 2030. Pressure will mount to tap any existing reserves. Russia in particular will aggressively develop its vast oil fields and compete with its neighbors to drill the pristine ice fields of the Arctic Circle.

The world’s population will grow 50% by 2050, and all that growth will require enormous increases in crop production to feed it. Worldwide freshwater sources are already running low due to pollution and massive irrigation.

Globalization. Canada, Russia, the United States, and other northern nations all depend heavily on migrant workers to fill needed labor jobs. These countries’ need for migrants will rise considerably to sustain growing industries despite population aging. All nations will have to ease their immigration restrictions and discourage any surges of xenophobia, according to Smith.

The now-sparsely settled Arctic will see major influxes of settlers. Smith expects that it could host growing urban centers and larger aboriginal communities.

Climate change. Shifts in the earth’s climate are bound to unfold erratically over time, though the long-term result will be significantly higher temperatures. The Arctic could be seasonally ice-free by 2050, and human infrastructures throughout the far north will be severely challenged.

Some amount of warming is inevitable no matter what actions the world takes now, but decisive reductions in carbon emissions during this century could keep the warming to a moderate 2.5°C increase instead of a rise of 5°C or higher.

Smith writes that he took great care to make his forecasts realistic and based on trends already under way. That meant steering clear of discussing wild-card shocks, apocalyptic doomsday scenarios, or other radical changes in the status quo.

“The described outcomes favor the likely over the unlikely. I honestly expect, should I live long enough, that I will see them within my lifetime,” Smith writes.

The World in 2050 is a plausible vision of what the world may look like four decades from today. Smith convincingly states not only what he expects to see, but also why he expects to see it.

About the Reviewer

Rick Docksai is a staff editor of THE FUTURIST and World Future Review.

Books in Brief (January-February 2011)

Edited by Rick Docksai

Message in Our Bottles

Bottled and Sold: The Story Behind Our Obsession with Bottled Water by Peter H. Gleick. 2010. 211 pages. AMACOM. $26.95.

Consumption of bottled water has skyrocketed over the last few decades, says globally recognized water expert Peter Gleick. However, he thinks that the tide may be turning. In Bottled and Sold, he describes a “war on bottled water” now under way in offices, recreation centers, restaurants, and private homes across the globe.

Consumers and businesses are increasingly forgoing bottled water and getting their water exclusively from the tap, Gleick notes. Cities are banning municipal purchases of bottled water, and some restaurants are removing it from their menus. Environmental concerns are a prime motivator: Every plastic bottle requires water and energy to produce and to move onto a store shelf. Other critics worry about the human costs, arguing that bottled water imposes undue burdens on low-income consumers. And some simply hold a philosophical grudge against corporate ownership of water.

Whatever their reasons, says Gleick, the boycotters are sufficiently numerous to put the most prominent bottled-water industry associations on “crisis footing.” Bottled and Sold is a book that environmentalists, water experts, and all who follow consumer trends will want to read.

Building for a Hotter Planet

Climatopolis: How Our Cities Will Thrive in the Hotter Future by Matthew Kahn. Basic. 2010. 260 pages. Paperback. $26.95.

Construction and housing in cities around the world will be forced to adapt to global climate change, forecasts environmental economist Matthew Kahn. In Climatopolis, he describes how increased flooding, higher temperatures, and erratic weather patterns in general will force planners to redesign urban housing units worldwide in the next few decades.

Climate change will affect different cities differently, he argues. Coastal cities will face unique health and economic difficulties, while inland cities may be confronted with mass migrations of “climate refugees.” In addition, architects and home buyers in many places will have to plan ahead for increased risks of floods, droughts, or wildfires.

Kahn expresses hope that, through innovation and careful planning, cities might ensure a desirable quality of life for their residents in the face of long-term climate change. He includes a “top five” list of U.S. cities that are currently best-protected against climate change’s effects: Detroit, Michigan; Minneapolis, Minnesota; Buffalo, New York; Milwaukee, Wisconsin; and Salt Lake City, Utah.

Climate researchers, engineers, and urban planners will all find Climatopolis educational.

Robotic Imitation of Life

How to Catch a Robot Rat: When Biology Inspires Innovation by Agnès Guillot and Jean-Arcady Meyer. 2010. 226 pages. MIT Press. $29.95.

Human engineers draw some of their best inspirations from nature, according to technology researchers Agnès Guillot and Jean-Arcady Meyer in How to Catch a Robot Rat. The authors introduce readers to “the new bionics,” a field that integrates biology and engineering.

Guillot and Meyer share examples of recent innovations in new bionics. For example, observations of the iridescence of butterfly wings gave rise to new high-resolution video screens. And studying the powerful auditory systems of owls, which can track even the most muffled sound in a nighttime forest, clued a German company in on how to build an acoustic camera whose hyper-sensitive microphones can locate and capture sounds inaudible to humans.

Many more breakthroughs are soon to come. Some aquariums have debuted robotic fish that look and swim almost exactly like real fish. Prototype walking robots modeled on the bodies of cockroaches move faster than conventionally designed machines. And new drone airplanes have wings that flap like those of birds and insects.

Guillot and Meyer are hopeful that, over time, new bionics will create robots that behave like animals, too: They will learn, think, and adapt to changes in their environments. How to Catch a Robot Rat is an engaging introduction to revolutionary new fields in robotics that is appropriate for experts and general audiences alike.

Forecasting for Economic Peaks and Valleys

Outrageous Fortunes: The Twelve Surprising Trends That Will Reshape the Global Economy by Daniel Altman. 2011. 244 pages. Holt. $25.

China’s seemingly invincible economy won’t flourish forever, says economist Daniel Altman. In Outrageous Fortunes, he forecasts that the Chinese economy’s structural weaknesses will overtake it later this century and cause the nation to grow poorer even after so many decades of growing richer.

Many nations around the world will experience financial turbulence as they strive for the highest possible living standards but are hampered by market instability and shortages of human and material resources. National economic policies will shift back and forth, and waves of immigration will challenge both developing and industrialized economies.

Tracking the global marketplace’s daily rising and falling indexes while keeping sight of the long term may seem difficult, says Altman, but he assures readers that the task is manageable: If we grasp the deep-rooted economic factors that sway a country’s path, we can make fairly accurate guesses as to where that path will lead.

He identifies a series of factors that he expects will markedly shake up the markets of China and Europe, demolish the World Trade Organization, and generate unexpected new job opportunities in the United States. He spots some impending risks, as well, such as worldwide expansion of “black market” activity. Outrageous Fortunes is an economic treatise that is incisive and approachable enough for economists and general audiences alike.

Reality Check for Space Travel

Packing for Mars: The Curious Science of Life in the Void by Mary Roach. 2010. 334 pages. W.W. Norton. $25.95.

Travel to Mars is feasible, but the astronauts who attempt it will have to contend with tremendous psychological and physical pressures—cramped confinements, sterile surroundings, and isolation more profound than any humans before them have ever experienced—says science writer Mary Roach in Packing for Mars. She visits space-travel research stations to witness the isolation chambers, antigravity rooms, and other experimental units that astronauts today are using to prepare for future voyages into deep space.

As she describes each exercise the flight crews undertake, she details the vertigo, disorientation, visual illusions, and other sensations that the low-gravity environs of space will impose on human travelers. Roach also covers the even more grueling experiments on how weeks or months of immobility would affect astronauts’ bodies—a key concern, since missions to other planets might require keeping the crew in hibernation-like states for the duration of the voyage.

Roach’s Packing for Mars is a reality check into the challenges of deep space and how humans can gear up now to meet them. It’s worthwhile reading for aspiring astronauts, space enthusiasts, and all who take great interest in humanity’s potential future in space.

Averting Risky Business

Rethinking Risk: How Companies Sabotage Themselves and What They Must Do Differently by Joseph W. Koletar. 2011. 242 pages. AMACOM. $29.95.

Most business executives are vigilant about identifying strong competitors and important technological developments, but they often fail to watch for the broader array of real-life risks, argues Joseph Koletar in Rethinking Risk. The consequences, he concludes, fill newspaper headlines: BP’s oil rig breaks and wreaks havoc on the Gulf of Mexico, Toyota must recall hundreds of thousands of cars due to faulty brakes, and 9 million Mattel toys made in China are found to be laced with lead paint.

Koletar, who directed the fraud investigations of Ernst & Young and Deloitte & Touche, argues that the mistake that BP, Toyota, and other leaders in these and similar incidents made was not greed or carelessness, but rather a failure to plan ahead and avert the approaching danger. Most disasters are preventable, he says, but the leadership has to be aware and pay attention to the information at the ground level.

Koletar presents action strategies for business and organization leaders who want to raise their own foresight and keep their operations safe over the long term. Through examples of businesses that did not watch for risks, he teaches lessons on risk analysis, employee training, accountability, organizational intelligence, and the risk mechanisms that a business can put in place to stay aware and secure. Rethinking Risk is a guide that leaders in any industry or sector may want to consult.

Forward to the Steam Age?

Geothermal plants could have a seismic impact on energy supplies—literally.

As the gradual shift from fossil fuels to renewables gets further under way, a number of researchers are beginning to look more closely at the promise of geothermal energy. While the geothermal process is not completely emission-free, the amount of greenhouse gases released is far less than that from conventional fossil fuels. Like wind and solar, heat from the earth is safer and cleaner than fossil fuels and provides an inexhaustible source of energy.

Currently, most geothermal energy comes from around 200 meters deep, at temperatures of less than 10°C—which is actually not very hot when compared with the temperatures just a little further down.

It is estimated that 99% of the earth has a temperature of more than 1,000°C. According to researchers at the Norwegian-based organization SINTEF, harnessing just a tiny fraction of this heat could theoretically provide enough energy for the entire world population.
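
A rough order-of-magnitude check of that claim (both figures below are illustrative assumptions, not SINTEF numbers):

```python
# Back-of-envelope check on "a tiny fraction could power the world."
# Both figures are order-of-magnitude assumptions, not SINTEF data.
earth_internal_heat_J = 1e31   # rough estimate of heat stored in the earth
world_energy_use_J = 5e20      # rough global primary energy use per year

fraction_per_year = world_energy_use_J / earth_internal_heat_J
print(f"Annual world demand is ~{fraction_per_year:.0e} of the stored heat")
# -> about 5e-11: a few hundred-billionths of the total each year
```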

Several Norwegian companies and organizations—including SINTEF, which has experience with petroleum exploration—are planning an ambitious pilot project that would harness the geothermal energy from 5,500 meters deep in the earth—about the same depth as some of the more recent onshore oil wells. According to researchers, the energy is there—all that is needed is a truly safe, effective means to tap into it.

Current wisdom favors binary-cycle geothermal power plants. This type of plant features two deep, interconnected wells that operate in a continuous loop: Cold water is pumped down an injection well and heated by the underground rock (to around 95°C at that depth), then pumped back up via a production well. At the surface, the hot water vaporizes a secondary working fluid with a lower boiling point, and that vapor drives turbines to generate electricity.
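
For a sense of the energy flows in such a loop, here is a minimal sketch; every parameter except the 95°C production temperature is an illustrative assumption, not a figure from the Norwegian project:

```python
# Rough output estimate for the two-well loop described above.
# All parameters except the 95 C production temperature are
# illustrative assumptions, not figures from the Norwegian project.
flow_kg_per_s = 50.0      # assumed water circulation rate
c_p = 4186.0              # specific heat of water, J/(kg*K)
t_production_c = 95.0     # production temperature cited in the article
t_injection_c = 25.0      # assumed re-injection temperature
efficiency = 0.10         # assumed thermal-to-electric conversion for a
                          # low-temperature binary cycle

thermal_w = flow_kg_per_s * c_p * (t_production_c - t_injection_c)
electric_w = efficiency * thermal_w
print(f"Thermal: {thermal_w / 1e6:.1f} MW, electric: {electric_w / 1e6:.2f} MW")
# -> Thermal: 14.7 MW, electric: 1.47 MW under these assumptions
```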

After around three decades, this process will have cooled the bedrock to the point where it is no longer hot enough to be productive, not unlike a tapped oil field run dry. The wells would be sealed off and the power plant shut down.

Then, three decades later, the temperature will have risen again, and it will be time to unseal the wells and begin the process anew, after upgrading the plant. This “ace in the hole” helps make geothermal power plants more cost-effective than oil rigs, researchers argue.

The Norwegian coalition is planning to go two or even three times deeper if the pilot plant is successful. That would require new technologies, however, and would greatly increase the risks. Most seriously, fracturing or eroding the crust’s bedrock in order to recover its heat could trigger earthquakes in the region. This notably occurred in Basel, Switzerland, in 2006, when Geothermal Explorers Limited’s field operations set off a series of quakes.

At such high temperatures, rock in the earth’s crust begins to soften and flow, so any equipment placed there risks being deformed or destroyed. Electronic equipment shorts out very quickly at temperatures over 200°C. New technologies are needed in order to meet those challenges.

ExxonMobil and other oil companies are beginning to drill exploration wells at 10,000 meters, depths once believed to be too risky. Geothermal researchers now hope to adapt these oil drilling technologies, perhaps working in tandem with oil companies, and make them safer and more affordable for clean, renewable energy.

“Geothermal energy is a unique opportunity for the oil industry to develop in a new way,” says Are Lund, senior researcher at SINTEF Materials and Chemistry. “They will come to realize this, it’s just a matter of time.”—Aaron M. Cohen

Source: SINTEF, www.sintef.no.

Robots as Athletes

Soccer-playing robots may help advance artificial intelligence.

Imagine robots that can play soccer (football) at the level of the World Cup championships. For researchers in artificial intelligence, such an event would rival—and possibly even surpass—that moment in 1997 when IBM’s Deep Blue supercomputer defeated then-world champion Garry Kasparov in chess.

The challenges are daunting. Autonomous, athletically capable humanoids that act together as a unit would require not just highly advanced software (the intellectual component) but also highly advanced hardware (the physical component). By sharing knowledge and code, and by developing and testing technologies together, AI designers hope to realize this vision.

Launched in 1993, the RoboCup international robot soccer competition (also known as the Robot World Cup Initiative) provides a platform for AI and robotics researchers to test their developments, work together, spur each other on, and create research breakthroughs. It is a competition in the best sense of the word—the kind that facilitates cooperation.

In his essay “Robot Soccer,” University of New South Wales computer science and engineering professor Claude Sammut describes the different levels of play, pointing out that the robotic soccer fields are smaller (and virtual in some low-level competitions), and the rules much simpler than in soccer played by humans. Currently, there are only three robots per team, as compared to eleven in human play. Sammut writes: “As the robots and their programming have become more sophisticated, the rules of the game, including field size and number of players, have been made tougher to encourage progress.”

French company Aldebaran Robotics’ humanoid Nao is the robot model currently in use in RoboCup. While still relatively basic, these humanoid robots use color cameras as their primary sensors (not unlike HAL in 2001: A Space Odyssey), operate autonomously (as opposed to being remote-controlled), and can communicate with each other wirelessly.

Sammut stresses that soccer is only a means to an end—not an end in itself. “In addition to soccer playing, the competition also includes leagues for urban search and rescue and for robotic helpers at home,” he writes. He emphasizes that soccer is good for developing the fundamentals that will be necessary for these and many other tasks. The basics include “perceiving” their surroundings, interpreting constantly changing situations, making quick decisions based on those situations, and then acting on them, adjusting tactics as necessary. The AI units must also be able to transmit information back and forth.
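
The fundamentals Sammut lists amount to a perceive-decide-act cycle. Here is a minimal sketch of that loop’s shape; every class and method name is a hypothetical placeholder, not part of any real RoboCup or Nao programming interface:

```python
import random
import time

class SoccerAgent:
    """Toy stand-in for a RoboCup player. Every class and method here is
    a hypothetical placeholder, not a real RoboCup or Nao API."""

    def perceive(self):
        # Stand-in for camera processing: estimate where the ball is.
        return {"ball_distance_m": random.uniform(0.2, 5.0)}

    def decide(self, state, teammate_reports):
        # Quick decision based on the current situation and shared reports.
        return "kick" if state["ball_distance_m"] < 0.5 else "approach_ball"

    def act(self, action):
        print(f"executing: {action}")  # stand-in for motor commands

    def broadcast(self, state):
        return state  # stand-in for the wireless link to teammates

agent = SoccerAgent()
reports = []
for _ in range(3):                          # a few control cycles
    state = agent.perceive()                # perceive the surroundings
    action = agent.decide(state, reports)   # interpret and decide quickly
    agent.act(action)                       # act, adjusting every cycle
    reports.append(agent.broadcast(state))  # keep teammates informed
    time.sleep(0.05)
```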

Whereas soccer fields always conform to the same basic grid layout and boast the same landmarks (goal posts, for example), less-structured environments present greater challenges. For example, a house or apartment and the possessions it contains (which can act as landmarks) may not change much over time, but such spaces are far more complex to move about in. It is harder still for an AI program to map a completely unfamiliar urban environment without any immediately identifiable landmarks. In search and rescue situations, “the robot has to simultaneously map its environment while reacting to and interacting with the surroundings,” Sammut writes. And off the soccer field, AI units must interact with actual people—not just other AI units.
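
The mapping problem Sammut quotes is known in robotics as simultaneous localization and mapping (SLAM). A toy sketch of the mapping half, using made-up sensor readings and omitting the harder step of correcting the robot’s own position estimate:

```python
# Toy flavor of mapping-while-moving: the robot keeps a coarse occupancy
# grid and nudges cells toward "occupied" or "free" as readings arrive.
# The readings are made up, and real SLAM also corrects the robot's own
# position estimate as it goes; that harder half is omitted here.
GRID = 10
occupancy = [[0.5] * GRID for _ in range(GRID)]  # 0.5 = unknown

def update_cell(x, y, hit):
    """Shift a cell's belief toward occupied (hit) or free (no hit)."""
    if hit:
        occupancy[y][x] = min(1.0, occupancy[y][x] + 0.3)
    else:
        occupancy[y][x] = max(0.0, occupancy[y][x] - 0.3)

# Pretend the robot moves down a corridor, sensing a wall on one side.
for x in range(GRID):
    update_cell(x, 2, hit=True)   # wall cells observed as occupied
    update_cell(x, 1, hit=False)  # corridor cells observed as free

print(f"wall belief: {occupancy[2][0]:.1f}, corridor: {occupancy[1][0]:.1f}")
```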

Despite the challenges, progress is being made year by year. And if the participants and organizers meet their stated goals, expect a team of fully autonomous humanoid robots to show no mercy to their opponents at the actual World Cup in 2050.—Aaron M. Cohen

Sources: Claude Sammut, “Robot Soccer,” Wiley Interdisciplinary Reviews: Cognitive Science, Volume 1, Issue 6, http://wires.wiley.com/WileyCDA.

RoboCup, www.robocup.org.

You Know You're a Futurist If ...

DramaTweep Personae:

@busynessgirl = Maria H. Andersen

@ISOLABELLA = Isobel Kramen

@jbmahaffie = John B. Mahaffie

@kristinalford = Kristin Alford

@latta = a writer, cyclist; Global IT/HIT/Consulting

@lisadonchak = Lisa Donchak

@nanofoo = Gerald Thurman

@OscarMopperkont = Adriaan

@ryonck = Richard Yonck

@WorldFutureSoc = World Future Society (Cindy Wagner, FUTURIST managing editor)

Recently on Twitter, a few of us were reflecting on what makes futurists special.

@WorldFutureSoc You know you’re a futurist if you start accidentally putting NEXT year’s year in the date line. *smacks forehead*

@jbmahaffie It’d be great to collect “you know you’re a futurist if...” jokes, with a nod to Jeff Foxworthy, got any more?

@WorldFutureSoc If I’d had foresight, I would have hashtagged.

@WorldFutureSoc #YKYAF You know you’re a futurist if you look both ways before crossing a one-way street (watch out for wild cards)

@latta if you live in a large metro area, you *need* to look both ways on a one way.

@lisadonchak In the UK, I look the wrong way first before crossing the street. Priming!

@WorldFutureSoc #YKYAF You know you’re a futurist if you think AI stands for artificial intelligence, not “American Idol.”

@lisadonchak #YKYAF You know you’re a futurist if you type AGI—artificial general intelligence—instead of AIG.

@WorldFutureSoc #YKYAF You know you’re a futurist if you would use your time machine to travel forward (and come back to change the present, not the past).

@ISOLABELLA You know you’re a futurist when you use a worm hole to commute.

@lisadonchak Is this possible yet?! I’d sign up instantly! ;)

@ISOLABELLA Wld I b here?

@WorldFutureSoc #YKYAF You know you’re a futurist if you ask “What’s next?” instead of “What’s new?”

@lisadonchak #YKYAF if you think increased prevalence of “Internet addiction” is really just a sign of the impending tech #singularity.

@lisadonchak #YKYAF when you try to click nonexistent hotlinks in your physical newspaper, book, or magazine.

@WorldFutureSoc Or fast-forward a live lecture.

@lisadonchak Or rewind a live radio program!

@ryonck #YKYAF You know you’re a futurist if someone mentions weak signals & you don’t assume they’re talking about their cell phone plan.

@busynessgirl #YKYAF You know you’re a futurist if you’re already getting annoyed with software features that don’t actually exist yet.

@OscarMopperkont #YKYAF when you ask yourself so many “what if …” questions that it drives you crazy.

@nanofoo #YKYAF if you believe we’ll have exa-scale computing by year 2019.

@lisadonchak #YKYAF when you’re up at 6am tweeting about how #ykyaf.

@WorldFutureSoc Clearly we know we’re #futurists this morning! Thx for the company.

Source: The World Future Society Twitter page, http://twitter.com/WorldFutureSoc.