2020 Visionaries Part 1: Andrew Hessel, Robert Freitas, Janna Anderson, and Mark Bauerlein

In this first series of essays, we tackle health and education. Andrew Hessel showcases his vision for open-source drug development, and noted nanoscientist Robert Freitas details the medical future of nanorobotics. Then two educators, Janna Anderson and Mark Bauerlein, present distinct visions for education in the twenty-first century.

Reinventing the Pharmaceutical Industry, without the Industry

The founder of the Pink Army Cooperative is bringing the open-source development model to breast cancer therapies.
By Andrew Hessel

If I were to tell you that volunteers working out of garages and bedrooms could play as big a role in the elimination of breast cancer by 2020 as a multibillion-dollar pharmaceutical company, would you believe me?

I’m convinced it’s possible. That’s why I founded the Pink Army Cooperative. The Cooperative is not your average biotechnology startup. It’s an open-source biotechnology venture that is member-owned, member-operated, and not-for-profit. It’s working to create individualized therapies for breast cancer. The mission is to build a new drug development pipeline able to produce effective therapies faster and for less money, without compromising safety.

Big Drug Makers versus Co-Op: Why Small Is Better

About six years ago, I realized that the cooperative model could change the future of medicine. I’d just spent years working inside a well-funded scientific powerhouse where R&D should have moved forward at breakneck speed, but somehow it hadn’t. Technologies change fast, yet drugs still frequently fail in development.

It costs hundreds of millions, or even billions, of dollars to bring a drug to market, and those costs are still growing faster than inflation. Even the largest pharmaceutical companies are struggling. The bottom line? Making a new drug is an adventure with no guarantee of success at any cost. The question I asked myself was, why hasn’t the pipeline been scrapped and replaced with something that lets drugs be developed faster, better, and cheaper?

There is no public route for drug development; virtually all development is industry-backed. I wondered, if open-source software could effectively challenge multibillion-dollar software franchises, could scientists and drug developers work cooperatively to compete with a product from a big pharmaceutical company? To my mind, breast cancer therapies were the obvious choice, since many people already give time and money toward finding a cure.

Perhaps the single most powerful tool for accomplishing this goal is openness, which allows everyone, amateur or professional, anywhere, to peek under the hood of the company, understand what is being done, and add his or her ideas or comments. I personally believe it’s the lack of transparency and the inability to share information easily that have held back the biopharma industry compared to the IT industry.

Overall, as biology becomes more digital, there is potential for massive change. Open access will make it easier to share ideas, publish protocols and tools, verify results, firewall bad designs, communicate best practices, and more. Individualized medicine development will be built on this open foundation, which will only help developers be more successful and lower risk.

It also permits a novel funding model — i.e., directly approaching those who would benefit from any breakthrough. Whereas traditional funding models require attracting a few individuals or groups able to make large investments, for which they expect a financial return, we can deliver our message widely, asking people to invest $20 in a membership in exchange for sharing our data with the community. Finding people to support us and running the cooperative itself are both made easier by social networking sites like Facebook and Twitter.

In the short term, I don’t see open-source drug development having a large effect on the U.S. health economy. The $2 trillion–plus system includes many products and services beyond just drugs. But there is room for a few examples to exist, make a real and measurable difference, and inspire others to experiment with nonprofit development. If Pink Army can treat even a single individual, I will consider the project a tremendous success, although I hope it will grow to treat millions of people with medicines that only get better and cheaper over time.

Personal Cures: From Individuals, For Individuals

The idea of cures or therapies that are unique to the individual is a critical component of the Pink Army Cooperative vision. A few years ago, the notion of cancer treatment specific to a person’s genome was seen as a fantasy. But thanks to rapidly advancing technologies like synthetic biology, the prospects are very different today. Synthetic biology is a powerful new genetic engineering technology, founded on DNA synthesis, that amounts to writing software for cells. It’s the ideal technical foundation for open-source biotechnology. Moreover, it drops the cost of doing bioengineering by several orders of magnitude. Small proteins, antibodies, and viruses are already amenable to the technology and within reach of a startup.

Readers familiar with Wired editor Chris Anderson’s The Long Tail will recognize individualized medicine as the very end of the tail — a future of one product sold only to one person. I don’t think any company had seriously considered making these types of drugs before Pink Army. Most people accept that drugs cost hundreds of millions to make. Who could pay that much for a custom medicine, other than a few billionaires?

But individualized drugs could lower the cost of drug development across the entire development chain. Only very small-scale manufacturing capability is necessary. Lab testing is simplified. And clinical trials are reduced to a single person: No large phased trials are necessary, so there’s no ambiguity about who will be treated, and every patient can be rigorously profiled. This shaves money and years off development. Moreover, with the client fully informed and integral to all aspects of development and testing, the developer’s liability approaches the theoretical minimum.

My interest in breast cancer is personal and professional. Because it affects so many women — roughly 12% over a lifetime — almost everyone has been touched by breast cancer, either personally or through someone they know. But cancer has always been central to my work as a genetic scientist, and I’m lucky to have been involved with several breast cancer–related projects during my time in biopharma. Conceptually, curing cancer should be straightforward: like making a better antibiotic, it comes down to selectively eliminating the unwanted cells. Yet the search for a cure seems to have stalled. It’s time to see if open-source drug development can reboot the process. That’s why Pink Army is important.

About the Author

Andrew Hessel is a geneticist and founder of the Pink Army Cooperative in Alberta, Canada. Web site www.pinkarmy.org.

The Future of Nanomedicine

The cofounder of the Nanofactory Collaboration is engineering the future of medicine molecule by molecule.
By Robert A. Freitas Jr.

© 2009 Robert A. Freitas Jr. All Rights Reserved.

For countless centuries, physicians and their antecedents have sought to aid the human body in its efforts to heal and repair itself. Slowly at first, and later with gathering speed, new methods and instruments have been added to the physician’s toolkit – anesthesia and x-ray imaging, antibiotics for jamming the molecular machinery of unwanted bacteria, microsurgical techniques for physically removing pathological tissue and reconfiguring healthy tissue, and most recently biotechnology, molecular medicine, pharmacogenetics and whole-genome sequencing, and early efforts at gene therapies.

In most cases, however, physicians must chiefly rely on the body’s ability to repair itself. If this fails, external efforts may be useless. We cannot today place the component parts of human cells exactly where they should be, and restructure them as they should be, to ensure a healthy physiological state. There are no tools for working, precisely and with three-dimensional control, at the molecular level.

To obtain such tools, we need nanotechnology (nanomedicine.com/NMI/1.1.htm). Nanotechnology is the engineering of atomically precise structures and, ultimately, molecular machines. The prefix “nano-” refers to the scale of these constructions. A nanometer is one-billionth of a meter, the width of about five carbon atoms nestled side by side. Nanomedicine is the application of nanotechnology to medicine.

The ultimate tool of nanomedicine is the medical nanorobot (http://www.nanomedicine.com/index.htm#NanorobotAnalyses) – a robot the size of a bacterium, built from many thousands of molecule-size mechanical parts perhaps resembling macroscale gears, bearings, and ratchets, possibly made of a strong diamond-like material. A nanorobot will need motors to make things move, and manipulator arms or mechanical legs for dexterity and mobility. It will have a power supply for energy, sensors to guide its actions, and an onboard computer to control its behavior. But unlike a regular robot, a nanorobot will be very small. A nanorobot designed to travel through the bloodstream must be smaller than the red cells in our blood – tiny enough to squeeze through even the narrowest capillaries in the human body. Medical nanorobotics holds the greatest promise for curing disease and extending the human health span. With diligent effort, the first fruits of this advanced nanomedicine could begin to appear in clinical treatment sometime during the 2020s.

For example, one medical nanorobot called a “microbivore” (http://www.jetpress.org/volume14/freitas.pdf) could act as an artificial mechanical white cell, seeking out and digesting unwanted pathogens such as bacteria, viruses, and fungi in the bloodstream. A patient with a bloodborne infection might be injected with a dose of about 100 billion microbivores (about 1 cc). When a targeted bacterium bumps into a microbivore, the microbe sticks to the nanorobot’s surface like a fly caught on flypaper. Telescoping grapples emerge from the microbivore’s hull and transport the pathogen toward the front of the device, bucket-brigade style, and into the microbivore’s “mouth.” Once inside, the microbe is minced and digested into amino acids, mononucleotides, simple fatty acids, and sugars in just minutes. These basic molecules are then harmlessly discharged back into the bloodstream through an exhaust port at the rear of the device. A complete treatment might take a few hours, far faster than the days or weeks often needed for antibiotics to work, and no microbe can evolve multidrug resistance to these machines the way it can to antibiotics. When the nanorobotic treatment is finished, the doctor broadcasts an ultrasound signal and the nanorobots exit the body through the kidneys, to be excreted with the urine in due course. Related nanorobots could be programmed to quickly recognize and digest even the tiniest aggregates of early cancer cells.
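The quoted dose is easy to sanity-check with a little arithmetic. The sketch below is only a back-of-envelope estimate: the roughly 10-cubic-micron per-device volume is an assumption chosen to represent a bacterium-sized machine, not a figure taken from the microbivore design paper.

# Back-of-envelope check of the microbivore dose described above.
# The per-device volume (~10 cubic microns, i.e., a machine a few
# microns across) is an illustrative assumption, not a published spec.

DEVICE_VOLUME_UM3 = 10.0      # assumed volume of one microbivore, in µm^3
DOSE_DEVICES = 100e9          # 100 billion nanorobots per dose
UM3_PER_CC = 1e12             # 1 cc = 1 cm^3 = 10^12 µm^3

dose_volume_cc = DOSE_DEVICES * DEVICE_VOLUME_UM3 / UM3_PER_CC
print(f"Total dose volume: {dose_volume_cc:.1f} cc")    # about 1.0 cc

# For scale: an adult carries roughly 5 liters of blood, so the dose
# adds only about 0.02 percent to total blood volume.
BLOOD_VOLUME_CC = 5000.0
print(f"Fraction of blood volume: {dose_volume_cc / BLOOD_VOLUME_CC:.2%}")

Under those assumptions the numbers are self-consistent: 100 billion micron-scale devices really do occupy about one cubic centimeter.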

Medical nanorobots could also be used to perform surgery on individual cells. In one proposed procedure, a cell repair nanorobot called a “chromallocyte” (http://www.jetpress.org/v16/freitas.pdf), controlled by a physician, would extract all existing chromosomes from a diseased cell and insert fresh new ones in their place. This process is called chromosome replacement therapy. The replacement chromosomes are manufactured outside of the patient’s body using a desktop nanofactory optimized for organic molecules. The patient’s own individual genome serves as the blueprint to fabricate the new genetic material. Each chromallocyte is loaded with a single copy of a digitally corrected chromosome set. After injection, each device travels to its target tissue cell, enters the nucleus, replaces old worn-out genes with new chromosome copies, then exits the cell and is removed from the body. If the patient chooses, inherited defective genes could be replaced with non-defective base-pair sequences, permanently curing any genetic disease and even permitting cancerous cells to be reprogrammed to a healthy state. Perhaps most importantly, chromosome replacement therapy could correct the accumulating genetic damage and mutations that lead to aging in every one of our cells.

Right now, medical nanorobots are just theory. To actually build them, we need to create a new technology called molecular manufacturing. Molecular manufacturing is the production of complex atomically precise structures using positionally controlled fabrication and assembly of nanoparts inside a nanofactory, much like cars are manufactured on an assembly line. The first experimental proof that individual atoms could be manipulated was obtained by IBM scientists back in 1989, when they used a scanning tunneling microscope to precisely position 35 xenon atoms on a nickel surface to spell out the corporate logo “IBM”. Similarly, inside the nanofactory, simple feedstock molecules such as methane (natural gas), propane, or acetylene will be manipulated by massively parallel arrays of tiny probe tips to build the atomically precise structures needed for medical nanorobots. In 2006, Ralph Merkle and I founded the Nanofactory Collaboration (MolecularAssembler.com/Nanofactory) to coordinate a combined experimental and theoretical R&D program to design and build the first working diamondoid nanofactory that could build medical nanorobots.

How are these ideas being received in the medical community? Initial skepticism was anticipated, but over time people have begun taking the concept more seriously. (In late 1999, when my first book on “nanomedicine” came out, googling the word returned only 420 hits, but this number rose fourfold in 2000 and fourfold again in 2001, finally exceeding 1 million hits by 2008.) Of course, most physicians cannot indulge themselves in exploring the future of medicine. This is not only understandable but quite reasonable for those who must treat patients today with the methods available today. The same is true of the medical researcher, diligently working to improve current pharmaceuticals, whose natural curiosity may be restrained by the knowledge that his or her success – no matter how dramatic – will eventually be superseded. In both cases, what can be done today, or next year, is the most appropriate professional focus.

But only a fraction of today’s physicians and researchers need look ahead for the entire field of medicine to benefit. Those practitioners who plan to continue their careers into the timeframe when nanomedical developments are expected to arrive – e.g., younger physicians and researchers, certainly those now in medical and graduate programs – can incrementally speed the development process, while simultaneously positioning their own work for best effect, if they have a solid idea of where the field of medicine is heading. Those farther along in their careers will be better able to direct research resources today if the goals of nanomedicine are better understood.

The potential impact of medical nanorobotics is enormous. Rather than using drugs that act statistically and have unwanted side effects, we can deploy therapeutic nanomachines that act with digital precision, have no side effects, and can report exactly what they did back to the physician. Test results, ranging from simple blood panels to full genomic sequencing, should be available to the doctor within minutes of sample collection from the patient. Continuous medical monitoring by embedded nanorobotic systems, as exemplified by the programmable dermal display (http://www.nanogirl.com/museumfuture/freitastalk.htm), can permit very early disease detection by patients or their physicians. Such monitoring will also provide automatic collection of long-baseline physiologic data, permitting detection of slowly developing chronic conditions that may take years or decades to emerge, such as obesity, diabetes, calcium loss, or Alzheimer’s.

Drug companies? Rather than brewing giant batches of single-action drug molecules, Big Pharma can shift to manufacturing large quantities of generic nanorobots of several basic types. These devices could later be customized to each patient’s unique genome and physiology, then programmed to address specific disease conditions, on site in the doctor’s office at the time of need. Could personal nanofactories (http://www.rfreitas.com/Nano/NoninflationaryPN.pdf) in patients’ homes eventually do some of this manufacturing? Yes, especially if creative designs for new devices or procedures are placed online as open-source information. But basic issues such as IP rights, quality control, legal liability, trustworthiness of design improvements and software upgrades, product branding, government regulation and the like should allow Big Pharma to retain a significant role in medical nanomachine manufacture even in an era of widespread at-home personal manufacturing.

Doctors and hospitals? For commonplace pathologies such as cuts or bruises, colds or flu, bacterial infections or cancers of many kinds, individuals might keep a batch of generic nanorobots at the ready in their home medical appliance, ready to be reprogrammed as needed, either remotely by their doctor or by some generally available procedure, allowing patients to self-treat in the simplest of cases. Doctors in this situation will act in the role of consultants, advisors, or in some cases gatekeepers regarding a particular subset of regulated conventional treatments. This will free up physicians and hospitals to deal with the most difficult or complex cases, including acute physical trauma and emergency care. These practitioners can also concentrate on rare conditions, as well as diseases that produce few symptoms and thus go unrecognized for a long time. Medical specialists will also be needed to plan and coordinate major body modifications such as cosmetic surgeries and genetic upgrades, as well as more comprehensive procedures such as whole-body rejuvenations that may involve cell repair of most of the tissue cells in the body and might require several days of continuous treatment in a specialized facility.

Cost containment? Costs can be held down because molecular manufacturing is intrinsically cheap (production costs probably on the order of $1/kg of finished product for a mature molecular manufacturing system) and can be a “green” technology generating essentially zero waste products or pollution during the manufacturing process. Nanorobot life-cycle costs can be very low because nanorobots, unlike drugs and other consumable pharmaceutical agents, are intended to be removed intact from the body after every use, then refurbished and recycled many times, possibly indefinitely. Even if the delivery of nanomedicine doesn’t reduce total health-care expenditures – which it should – it will likely free up billions of dollars that are now spent on premiums for private and public health-insurance programs.
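To get a feel for what a $1/kg production cost implies for a single treatment, here is a rough, illustrative calculation in the same spirit as the dose estimate above. The per-device volume and the diamond-like density are assumptions made for this sketch, not published design figures.

# Illustrative cost estimate for one nanorobot dose at ~$1/kg manufacturing.
# The per-device volume (~10 µm^3) and diamond-like density (~3.5 g/cm^3)
# are assumptions for this sketch, not published design figures.

DEVICES_PER_DOSE = 100e9        # e.g., the ~1 cc microbivore dose above
DEVICE_VOLUME_CM3 = 10e-12      # 10 µm^3 expressed in cm^3
DENSITY_G_PER_CM3 = 3.5         # roughly the density of diamond
COST_PER_KG_USD = 1.0           # assumed mature molecular-manufacturing cost

dose_mass_g = DEVICES_PER_DOSE * DEVICE_VOLUME_CM3 * DENSITY_G_PER_CM3
dose_cost_usd = (dose_mass_g / 1000.0) * COST_PER_KG_USD

print(f"Dose mass: {dose_mass_g:.1f} g")              # about 3.5 grams
print(f"Raw production cost: ${dose_cost_usd:.4f}")   # a fraction of a cent

Even after adding generous margins for quality control, programming, physician services, and regulation, the raw manufacturing cost of such a treatment would be negligible, and because the devices are recovered and recycled rather than consumed, even that small cost is spread over many uses.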

Many people are working to extend the bounds of conventional medicine, so in that arena it is relatively difficult for one person to make a big difference. Few have the combination of perspective, resources, and willingness needed to look a bit farther down the road, identify an exciting long-term vision for medical technology, and then plan the detailed steps necessary to achieve it. Planning and executing these steps toward the long-term vision has been my career and my passion for the last two decades. As the technologies I’m working on come more clearly into focus, more people will acknowledge them as realistic, and their enhanced trust in the longer-term vision will help speed the development of medical nanorobotics.

About the Author

Robert A. Freitas Jr. is senior research fellow at the Institute for Molecular Manufacturing (IMM) in California, after serving as a research scientist at Zyvex Corp. in Texas during 2000-2004. He is the author of Nanomedicine (Landes Bioscience, 1999, 2003), the first technical book series on medical nanorobotics. Web site www.rfreitas.com. Freitas is the 2009 winner of the Feynman Prize in nanotechnology for theory.

Remaking Education for a New Century

Communications scholar Janna Anderson is charting a new path for education outside of the classroom.
The following interview was conducted by FUTURIST senior editor Patrick Tucker.

THE FUTURIST: You’ve talked about entrenched educational institutions of the industrial age, and how those will be replaced as computer interfaces improve. You’ve said that developments in materials science will make learning into a process that happens via computer and video game, and that this may even be a precursor to learning by computer implant by 2030 or 2040. My first question is: What role does the classroom have in the education of the future?

Janna Anderson: I do believe that a face-to-face setting is an important element of learning. The era of hyperconnectivity will require that most professionals weave their careers and personal lives into a blended mosaic of activity. Work and leisure will be interlaced throughout waking hours, every day of the week. We need to move away from the format of school time and non-school time, which is no longer necessary. It was invented to facilitate the agrarian and industrial economies.

Faculty, teachers, and principals could inform students that they expect them to learn outside of the classroom and beyond homework assignments. The Internet plays a key role in that. Rather than classrooms, one can see the possible emergence of learning centers where students with no Internet access at home can go online, but everyone will be working on a different project, not on the same lesson. You can also imagine students making use of mobile and wireless technology for purposes of learning.

More importantly, we need to teach kids to value self-directed learning, teach them how to learn on their own terms, and how to create an individual time schedule. We need to combine face time with learning online. And we can’t be afraid to use the popular platforms like text-messaging and social networks. As those tools become more immersive, students will feel empowered and motivated to learn on their own — more so than when they were stuck behind a desk.

THE FUTURIST: One thing you and many others have said is that neuroscience has the potential to radically change the way we teach. As we develop a more real and full understanding of the way the brain accumulates knowledge, what technology, aside from IT, could change education?

Anderson: It’s hard to predict which new technology could capture people’s imaginations. I think bioinformatics — the combination of biology and information technology — could have the biggest impact in the next couple of decades. If we continue to see the digitization of all information, which renders even our chemistry knowable, the ramifications for education could be immense and unfathomable. But the far future is shaped by the confluence of too many different factors to see clearly.

THE FUTURIST: Right now, many educators perceive a digital divide between the members of different socioeconomic classes. You’ve talked about how scalability — technology becoming cheaper and more available in the future — could help solve that. But what if some people adopt the new technology faster than others? There are early adopters and late adopters. Being a late adopter is a small matter when you’re talking about the new iPhone, but as education becomes increasingly digitized, late adoption could have significant consequences for educational quality. Do you see any threat of an adopter divide?

Anderson: There’s no doubt that there are capacity differences. When we’re talking about the digital divide, we’re not talking just about access to equipment, but also the intellectual capacity, the training to use it, and the ability to understand the need for it, as well as its importance. There’s no doubt that cultural differences are also a huge factor. In areas that have been less developed, especially in the global south, a capacity gap in terms of adoption of a new technology may emerge because some societies are less able to adopt something new at this point in time.

THE FUTURIST: How can this cultural divide be overcome?

Anderson: This is why the effort to educate women is so important. In cultures where women are highly educated and tend to be heads of the family in terms of the upbringing of their children, there’s a higher likelihood that those children are going to show a more open cultural perspective and be more willing to take up new technologies.

THE FUTURIST: So, you still see an active role for actual physical teachers. In many ways, teachers will be more necessary than ever if they’re going to help people, especially in less-developed nations, to pick up these technologies to improve their own lives?

Anderson: There’s definitely a role for technology evangelists who can help people to understand how to use information technology no matter what level they happen to be at. But the traditional idea of the teacher may be much less valuable to the future, just like the traditional library will have much less value. We need to remove the old books that no one has opened in twenty years and put them in nearby storage. What we do need are places where people can gather — places that foster an atmosphere of intellectual expansion, where learners can pursue deeper meaning or consult specialists with access to deep knowledge resources. It’s all about people accessing networked knowledge, online, in person, and in databases. We need collective intelligence centers, and schools could be that way, too.

THE FUTURIST: The Internet is inherently disruptive to business models; the decimation of the newspaper industry is a case in point. One of the aspects of digital education that people don’t talk about much is how disruptive it could be to the career of teaching. On the one hand, really great teachers will be able to reach a broader audience than ever before, but younger educators — teachers who have not yet hit their stride — could be left out. What happens when the educational community one day realizes that they’re facing the same forces of creative destruction that newspapers are facing today?

Anderson: Today there’s actually an advantage for young teachers because they generally understand better than the oldest generation how to implement new digital tools. If we eventually are able to “patch in” to all of the knowledge ever generated with a cybernetic implant, or if we are able to program advanced human-like robots or 3-D holograms to deliver knowledge resources, “elders” will have more influence over the content delivered. Regarding forces of advancing technology and their influence on things such as the news industry, the story of the entrenched institutions fighting change is an old one. We have to overcome the tyranny of the status quo. Many media leaders understood in the 1990s that they had to prepare for a new day, but they had this great profit machine. They wouldn’t let go of it until the economics of the situation forced them to change. Economics is generally the force that pushes leaders of stagnating institutions to adopt new paradigms. It will be interesting to see how all of this develops over the next few years.

Maybe what we need is a new employment category, like future-guide, to help people prepare for the effects of disruptive technology in their chosen professions so they don’t find themselves, frankly, out of a job.

About the Interviewee

Janna Anderson is an associate professor in Elon University’s School of Communications and the lead author of the Future of the Internet book series published by Cambria Press. She is also the author of Imagining the Internet: Personalities, Predictions, Perspectives (Rowman & Littlefield, 2005). She will be speaking at the World Future Society’s 2010 conference in Boston.

Literary Learning in the Hyperdigital Age

Emory University professor Mark Bauerlein is fighting to preserve literary thought in an age of digital distraction.
By Mark Bauerlein

When the Boston Globe reported that an elite prep school in Massachusetts had set out to give away all its books and go 100% digital, most readers probably shrugged. This was just a sign of the times: Everyone now assumes a paperless future of learning through screens, not Norton anthologies and Penguin paperbacks. After all, the headmaster of the school told the Globe, “When I look at books, I see an outdated technology, like scrolls before books.” Who wouldn’t believe that every school a decade hence will display a marvelous, wondrous array of technology in every classroom, in the library, in study hall?

It won’t go that far, though, not in every square foot of the campus and every minute of the school day. In 2020, schools will indeed sport fabulous gadgets, devices, and interfaces of learning, but each school will also have one contrary space, a small preserve that has no devices or access, no connectivity at all. There, students will study basic subjects without screens or keyboards present — only pencils, books, old newspapers and magazines, blackboards and slide rules. Students will compose paragraphs by hand, do percentages by long division, and look up a fact by opening a book, not checking Wikipedia. When they get a research assignment, they’ll head to the stacks, the reference room, and the microfilm drawers.

It sounds like a Luddite fantasy, but even the most pro-technology folks will, in fact, welcome the non-digital space as a crucial part of the curriculum. That’s because over the next 10 years, educators will recognize that certain aspects of intelligence are best developed with a mixture of digital and nondigital tools. Some understandings and dispositions evolve best the slow way. Once they mature, yes, students will implement digital technology to the full. But to reach that point, the occasional slowdown and log-off is essential.

Take writing. Today, students write more words than ever before. They write them faster, too. What happens, though, when teenagers write fast? They select the first words that come to mind, words that they hear and read and speak all the time. They have an idea, a thought to express, and the vocabulary and sentence patterns they are most accustomed to spring to mind; with the keyboard at hand, phrases go right up on the screen, and the next thought proceeds. In other words, the common language of their experience ends up on the page, yielding a flat, blank, conventional idiom of social exchange. I see it all the time in freshman papers, prose that passes along information in featureless, bland words.

English teachers want more. They know that good writing is pointed, angular, vivid, and forceful. A sharp metaphor strikes home, an unusual word catches a perceptive meaning, a long periodic sentence that holds the pieces together in elegant balance draws readers along. These are the ingredients of style, the cultivation of a signature. It happens, though, only when writers step outside the customary flow of words, especially those that tumble forth like Yosemite Falls. Because writing is a deep habit, when students sit down and compose on a keyboard, they slide into the mode of writing they do most of the time on a keyboard — texting (2,272 messages per month on average, according to Nielsen), social networking (nine hours per week, according to the National School Boards Association), and blogging, commenting, IM, e-mail, and tweets.

It’s fast and easy, but good writing doesn’t happen that way. As more kids grow up writing in snatches and conforming to the conventional patter, problems will become impossible to overlook. Colleges will put more first-year students into remedial courses, and businesses will hire more writing coaches for their own employees. The trend is well under way, and educators will increasingly see the nondigital space as a way of countering it. For a small but critical part of the day, they will hand students a pencil, paper, dictionary, and thesaurus, and slow them down. Writing by hand, students will give more thought to the craft of composition. They will pause over a verb, review a transition, check sentence lengths, and say, “I can do better than that.”

The nondigital space will appear, then, not as an antitechnology reaction but as a nontechnology complement. Before the digital age, pen and paper were normal tools of writing, and students had no alternative to them. The personal computer and Web 2.0 have displaced these tools, creating a new technology and a whole new set of writing habits. This endows pen and paper with a new identity, a critical, even adversarial one. In the nondigital space, students learn to resist the pressures of conformity and custom, to think and write against the fast and faster modes of the Web. Disconnectivity, then, serves a crucial educational purpose, forcing students to recognize the technology everywhere around them and to see it from a critical distance.

This is but one aspect of the curriculum of the future. It allows a better balance of digital and nondigital outlooks. Yes, there will be tension between the nondigital space and the rest of the school, but it will be understood as a productive tension, not one to be overcome. The Web is, indeed, a force of empowerment and expression, but like all such forces, it also fosters conformity and stale behaviors. The nondigital space will stay the powers of convention and keep Web 2.0 (and 3.0 and 4.0) a fresh and illuminating medium.

About the Author

Mark Bauerlein is a professor of English at Emory University. He’s served as a director of the Office of Research and Analysis at the National Endowment for the Arts, where he oversaw studies about culture and American life. He’s published in the Wall Street Journal, The Weekly Standard, The Washington Post, and the Chronicle of Higher Education. His latest book, The Dumbest Generation: How the Digital Age Stupefies Young Americans and Jeopardizes Our Future; Or, Don’t Trust Anyone Under 30, was published in May 2008 by Penguin. Web site www.dumbestgeneration.com.