When I was working on my doctorate in history, people would quip: “Why study history? There’s no future in it!”
On the contrary, there may be a great deal of history in the future. Throughout my four-decade career as a historian engaged in futuring, I have used the past to explore the future. Like the study of history, futuring is heavily based on facts, evidence, solid research, and sound logic—more science, less science fiction.
Futuring is an example of what I call “applied history,” or the use of historical knowledge and methods to solve problems in the present. It addresses the question “What happened and why?” in order to help answer the question “How might things be in the future and what are the potential implications?” Futuring, at least in a management context, combines applied history with other methods adapted from science, mathematics, and systems analysis to frame well-considered expectations for the future. This process will help us to make decisions in the present that will have positive long-term consequences. In the language of business, futuring is an aspect of due diligence and risk management.
History provides indications of the future. Identifying historical trends helps us see patterns and long-term consistencies in cultural behavior. History may not repeat itself, but certain behaviors within cultures do. We can spot patterns in persistent traditions, customs, laws, memes, and mores. Debating whether a historical event is unique or a manifestation of a long path of behavior is like arguing whether light is a particle or a wave—the answer depends entirely upon your perspective.
The past provides precedents for future behavior. When you understand how things happened in the past, you gain much foresight into the things that might happen in the future—not as literal reenactments, but rather as analogous repetitions of long-term behavior that vary in their details according to historical conditions.
Let me hasten to qualify my view of history by saying that I see no immutable forces in the flow of history, no invisible hands of predestination, fate, or economic determinism. Time may be like an arrow, in the words of Sir Arthur Eddington, but I very seriously doubt that it has a prescribed target. I am also skeptical of the concept of political or economic cycles recurring with regular periodicity. If there is any determinism or predictability whatsoever in human behavior, it lies in our evolutionary biology and our cultures. Luck, randomness, and the idiosyncrasies of free will play important roles in determining the future as well.
While the study of history has been rich in philosophy, it has lacked theories such as those found in the natural and social sciences. Most historians have not pursued such theories, because they see each period of history as unique and as having little or no practical application to problem solving today. Futuring as applied history, however, needs basic principles upon which to build forecasts that can be used for long-term decision making.
The study of the future is very sparse in both philosophy and theory. Theories (which may also be seen as mental or analytical models) provide a framework for forecasts and give them a credibility that increases managers’ willingness to take calculated risks. In addition, they can help us utilize our knowledge of demonstrated trends, interactions, and causes to better anticipate the future. The theories do not have to be rigid, but they do need to provide an explicit framework that can be modified, expanded, and even rejected by experience.
To that end, I have been working on a set of theoretical principles for futuring from the point of view of an applied historian. I offer them now as working guidelines until others can offer better.
Futuring Principle 1>> The future will be some unknown combination of continuity and change.
After an event occurs, you can always find some evidence of the path that led up to it. Sometimes when viewed in hindsight, the path looks so linear that it is tempting to conclude that the outcome was inevitable all along. In reality, it is the historical research that is deterministic, not the events themselves.
No historical event has ever occurred without antecedents and long chains of cause-and-effect relationships. Nor was there ever a time when decision makers did not have choices, including the simple option to do nothing. Yet, in the present moment, one can never be certain which chains of events will play out. While there are continuities in the past and the present, there are also changes, many of which cannot be anticipated. Sometimes these changes are extreme, resulting from high-impact, low-probability events known as “wild cards.”
Thus, the future always has been and most likely always will be an unknown combination of both trend continuities and discontinuities. Figuring out the precise combination is extremely difficult. Therefore, we must study the trends but not blindly project them into the future—we have to consider historical trends, present conditions, and imagined changes, both great and small, over time. You might say that trend analysis is “necessary but not sufficient” for futuring; the same goes for imagined changes, too.
Futuring Principle 2>> Although the future cannot be predicted with precision, it can be anticipated with varying degrees of uncertainty, depending upon conditions. Forecasts and plans are expectations for the future, and they are always conditional.
As twentieth-century physicist Niels Bohr famously said, it is very hard to make predictions, especially about the future. Yet, we can and do form expectations about the future ranging from ridiculous to prescient. David Hume, Werner Heisenberg, and Karl Popper cautioned us to be wary of drawing inductive inferences about the unknown from the known. This caution applies as much to futuring as it does to science.
All events occur in the context of historical conditions; likewise, all events in the future will almost certainly occur within a set of conditions. Therefore, all forecasts are conditional.
We may not be able to anticipate specific events in the future, but we can form well-considered expectations of future outcomes by looking at specific conditions and scenarios. For example, “When will the United States experience again an annual GDP growth rate of 7% or higher?” is a much more elusive question to address than “Under what likely conditions would it be reasonable to expect the United States’ annual GDP growth rate to be 7% or higher in the future?”
Futuring Principle 3>> Futuring and visioning offer different perspectives of the future—and these perspectives must complement one another.
This principle draws a distinction between futuring and visioning. Futuring looks at what is most plausibly, even likely, to unfold, given trends, evolving conditions, and potentially disruptive changes. It emphasizes conditions that are partially if not largely out of your own control.
Visioning, on the other hand, involves formulating aspirational views of the future based on what you want to see happen—in other words, how you would like events to play out. Of course, just because you want a certain future to happen does not guarantee that it will.
Strategic planning is a manifestation of visioning. If an organization does not engage in forecasting with all the rigor of historical criticism and good science, strategic planning can be just so much wishful thinking. I find that wishful thinking is alive and well in many corporations and institutions. Both futuring and visioning are necessary and they go hand-in-hand—just be careful to correctly identify which you are doing and why.
Futuring Principle 4>> All forecasts and plans should be well-considered expectations for the future, grounded in rigorous analysis.
Futuring methods fall into three broad, fundamental categories: trend analysis, expert judgment, and scenarios (also known as multi-optional or alternative futures). Historical research methods and criticism play well in all three categories.
As a futurist, I have no data from the future to work with. I cannot know in the present whether a forecast of mine will turn out to be “right,” or “accurate,” or even “prescient,” but I know what I can and cannot convincingly defend as being well-considered expectations for the future.
In this regard, the soundness of philosophical premises and theories, along with familiarity with best research practices, will add much to your foresight credibility and to the usefulness of your futuring activities.
Futuring Principle 5>> There is no such thing as an immutable forecast or a plan for a future that is set in stone.
Forecasts and plans must be continuously monitored, evaluated, and revised according to new data and changing conditions in order to improve real-time frameworks for making long-term decisions and strategies.
A forecast is a well-considered expectation for the future; it is an informed speculation or a working hypothesis, and as such is always a work-in-progress. Forecasts, like historical research, can never be completed. There is always more to be said on the subject as time passes. We must continuously use new and better information to evaluate and modify our expectations for the future.
* * * *
Futurists, like historians, must examine events in a large and complex context. My challenge to futurists, forecasters, strategic planners, and decision makers is to apply a historian’s rigor to their futuring endeavors. Think through a foundational philosophy of the future and theories concerning why some futuring methods are more trustworthy than others.
When generating forecasts, rely upon well-tested theories and best practices to justify your methods and conclusions. Use the five futuring principles offered above to guide your formulation of forecasts as well-considered expectations for the future.
Stephen M. Millett is the president of Futuring Associates LLC, a research and consulting company, and teaches in the MBA program at Franklin University. He received his doctorate in history from The Ohio State University. His career at the Battelle Memorial Institute spanned 27 years. He is a past contributor to THE FUTURIST and World Future Review and he was a keynote presenter at WorldFuture 2003 in San Francisco. He may be reached at smillett@futuringassociates.com.
A more thorough discussion of these principles, along with supporting case histories, appears in his forthcoming book, Managing the Future: A Guide to Forecasting and Strategic Planning in the 21st Century, to be published by Triarchy Press, www.triarchypress.com/managing_the_future.
The theories of subjective probability advocated by eighteenth-century English mathematician and theologian Thomas Bayes and by twentieth-century Italian statistician Bruno de Finetti are very applicable today whenever we assign likelihoods to future conditions or outcomes.
Bayes (circa 1702-1761) used prior knowledge as a starting point for calculating the probabilities of events. “Prior knowledge” may mean a hunch or an educated guess offered in the absence of established facts. To illustrate this concept, Bayes’s one paper on the topic described how an initial estimate of the positions of balls on a billiard table could lead to progressively more accurate calculations of where they are likely to land next. With increasing information, one may see patterns that both explain unknown causes and anticipate the future.
Bayes’s approach underlies the principle that expectations for the future must always be revised in light of new information. Yet some critics have contended that probability should rest on data-based statistics rather than on subjective judgment.
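To make the updating idea concrete (the numbers below are hypothetical and purely illustrative, not drawn from this article), Bayes’s rule revises a prior probability P(H) of some future condition H when new evidence E appears:

\[
P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E \mid H)\,P(H) + P(E \mid \neg H)\,P(\neg H)}
\]

Suppose a forecaster’s initial, subjective estimate of a recession next year is P(H) = 0.30, and a leading indicator then turns negative, a signal observed in 80% of pre-recession years (P(E | H) = 0.80) but in only 20% of other years (P(E | ¬H) = 0.20). The revised expectation becomes (0.80 × 0.30) / (0.80 × 0.30 + 0.20 × 0.70) ≈ 0.63, markedly stronger than the original hunch yet still conditional on the evidence and the assumed rates.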
About two centuries later, de Finetti (1906-1985) provided a proof that all probabilities, particularly those concerning the future, are subjective. He concluded that it is better to admit your subjective judgment than to hide it in apparent objectivity. One way to do this is to assign a future event an a priori probability: While it may or may not be prescient, it can reveal how likely you think a future event may be according to your own biases—information that can give you a sense of how objective your forecasting actually is.
—Stephen M. Millett