The Future of Evidence-based Futures Research


Have you ever wondered why so many "assumption-based" forecasts have proved untrue? Have you considered that there might be more "evidence-based" ways of forecasting? Have you ever felt that assumption-based forecasting was simply too much of a struggle and, in many cases, failed to trigger timely action? Wait a minute: those questions have been around for about two decades, but what answers have we given them?

Normally, it is difficult to find any evidence of the future. There are logical criticisms of even the best-known techniques, such as trend analysis. In fact, trends are extrapolations of the past and the present, not future facts, and trends usually have uncertain future trajectories. So futurists may be interested in using more evidence-based and intelligence-driven techniques that enable them to enrich their scanning activities in more reliable ways. If we are going to be useful to future generations through our futuring efforts, we have to pay more attention to evidence-based techniques.
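To make that limitation concrete, here is a minimal sketch in Python (using hypothetical data and years chosen only for illustration) of a linear trend extrapolation with a rough uncertainty band around it:

```python
import numpy as np

# Hypothetical historical observations (e.g., yearly values of some indicator).
years = np.array([2005, 2006, 2007, 2008, 2009, 2010, 2011, 2012], dtype=float)
values = np.array([3.1, 3.4, 3.3, 3.9, 4.2, 4.1, 4.6, 4.8])

# Fit a straight-line trend to the past: value = slope * year + intercept.
slope, intercept = np.polyfit(years, values, deg=1)

# Spread of the data around the fitted line (two parameters were estimated).
residuals = values - (slope * years + intercept)
sigma = residuals.std(ddof=2)

n = len(years)
sxx = ((years - years.mean()) ** 2).sum()

# Extrapolate forward; the prediction interval widens with the horizon.
for year in range(2013, 2018):
    point = slope * year + intercept
    half_width = 2 * sigma * np.sqrt(1 + 1 / n + (year - years.mean()) ** 2 / sxx)
    print(f"{year}: {point:.2f} (roughly {point - half_width:.2f} to {point + half_width:.2f})")
```

The band widens as the horizon recedes, which is exactly the point: a trend summarizes the past and present; it does not certify the future.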

The growing application of evidence-based inquiry, which originated in medicine and is now extending to social policy (Wikipedia, 2012), may tempt futurists to update their current ways of research. In the years ahead, more quantitative research methods may jointly investigate traceable evidence, but the main idea behind all these initiatives is that futurists are keener than ever to set their research projects on measurable foundations. It's a wise echo of this well-known saying: “You can’t manage what you don’t measure.”

Clark (2010, p. 9) reports that in the late twentieth century medicine became the first applied field to formally adopt the incorporation of evidence into clinical decisions. Only in the last twenty years has evidence-based practice migrated to the social sciences. Today, futurists may wish to set professional guidelines based on both research data and expertise in a similar manner. This could lead us to more evidence-based practices and to a fundamental commitment to evidence-based rather than assumption-based professional foresight practice.

You might wonder whether evidence-based practice has any value in futuring. Unlike medical practitioners, most foresight practitioners do not see their work as having life-or-death consequences. But at the international level it really becomes a matter of life and death when decisions concern war or peace. Pointing to the differences between evidence, assumptions, and realities, Hines and Bishop (2006) show how the failure to recognize that each piece of evidence rested on one or more assumptions led decision makers to advise the Bush Administration to invade Iraq to find and destroy Saddam Hussein's WMDs, an effort that later proved to be completely wrong.

I remember how the well-informed Victoria Edge and Sherilee Harper devoted their workshop, "Scenario Analysis in Public Health," to this topic just last year at the WorldFuture 2012 conference. They pointed to the use of evidence-based futures planning at the national, provincial/territorial/state, and community levels, and they argued for the superiority of evidence-based foresight over assumption-based planning.

A historical review of forecasts shows that they can be very precise yet fairly inaccurate at the same time. Forecasts can be both self-fulfilling and self-defeating. Forecasting the potential existence of a condition may make that condition more likely. The mechanism is simple: people who read about the possibility work to make it happen. Yes, it becomes a matter of inventing the future, not merely exploring it.

Conversely, a forecast of war may make the war less likely if it triggers preventive action. Forecasting itself, then, can have political implications; like it or not, futurists have a stake in politics and policy making. On the other hand, if a self-defeating forecast triggers action that avoids the forecasted problem, the forecast may prove highly inaccurate; nevertheless, it may have been enormously useful!

Do you see the fascinating dilemma? On one hand, we have to forecast future problems or the consequences of our actions; on the other, we must always keep an eye on the outcomes that our forecasts themselves may (desirably or undesirably) produce. Either way, we face the challenge of making sound and timely decisions. In that sense, foresight is the science of decision.

As Glenn (2009) puts it, "futures research should be judged by its ability to help decision makers make policy now, rather than whether a forecast was right or wrong." If we accept that foresight is a decision-shaping profession, then we have to recognize that foresight practitioners seek to identify and describe current forces that affect our choices and thereby our decisions.

While futurists prefer asking "What difference does it make?" to "How well do we know it?", distinguishing between normative and exploratory forecasting is vital: normative forecasting asks "What future do we want?" while exploratory forecasting probes "What will our future be?"

When I use the term "evidence-based" methods in futures research, I primarily have quantitative research methods in mind. By emphasizing evidence-based research, I do not mean to erase the need for qualitative methods that enrich our insights and visions. I agree that no single method should be trusted by itself and that cross-referencing methods improves foresight outputs (Glenn and Gordon, 2009). I also believe that a combination of foresight methods is needed for both normative and exploratory forecasting, though some methods tend to be used more for one than the other, either now or in the future. Let me refer you to a recent case study.

Studying the role of "big data" in an evidence-based manner, McAfee and Brynjolfsson (2012) conclude that data-driven decisions are better decisions. They found that the more companies characterized themselves as data-driven, the better they performed on objective measures of financial and operational results. In particular, companies in the top third of their industry in the use of data-driven decision making were, on average, 5% more productive and 6% more profitable than competitors who relied on insight or vision alone.

Today, senior decision makers have to embrace evidence-based decision making. Companies need to hire scientists who can find patterns in data and translate them into useful business information, and this trend is growing rapidly. Aren't futurists made for such roles? Don't we seek to glean intelligence from data and translate it into business advantage? Yes, we do. It's our job. The art and science of foresight lies in skillfully injecting that intelligence into the alternative scenarios we propose.

Evidence-based research relies on the inclusion of diverse datasets in the analysis in order to obtain an in-depth and accurate understanding of scientific progression, competencies, and potentialities (Halevi, 2012). The term is often used to describe evidence-related reference resources. Medical researchers (Dynamed, undated) have suggested a seven-step evidence-based methodology: (a) identify the evidence, (b) select the best available evidence, (c) critically appraise it, (d) objectively report the evidence, (e) synthesize multiple evidence reports, (f) base conclusions on the evidence, and (g) update daily. These steps may be translated into foresight practice with some modifications.
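As a rough illustration of how such a translation might look, here is a minimal sketch in Python of a scanning pipeline shaped around those steps; the Evidence record, the 0.5 quality threshold, and the scoring rule are all hypothetical assumptions of mine, not part of the Dynamed methodology:

```python
from dataclasses import dataclass

# Hypothetical evidence record for a foresight scanning project; the fields
# and scoring scheme are illustrative assumptions, not a standard.
@dataclass
class Evidence:
    source: str       # where the signal was found
    claim: str        # what the signal suggests about the future
    quality: float    # 0..1 appraisal score assigned by the analyst

def identify(raw_signals):
    """Step (a): turn raw scanning hits into candidate evidence items."""
    return [Evidence(source=s["source"], claim=s["claim"], quality=0.0)
            for s in raw_signals]

def appraise(items, score):
    """Steps (b)-(c): score each item and keep only the best available."""
    for item in items:
        item.quality = score(item)
    return [i for i in items if i.quality >= 0.5]

def report(items):
    """Steps (d)-(f): report and synthesize, basing conclusions on evidence."""
    for i in sorted(items, key=lambda e: e.quality, reverse=True):
        print(f"[{i.quality:.2f}] {i.claim} (source: {i.source})")

# Step (g): in practice, the whole pipeline would re-run as new signals arrive.
signals = [{"source": "journal scan", "claim": "technology X maturing"},
           {"source": "blog rumor", "claim": "policy shift Y imminent"}]
report(appraise(identify(signals), score=lambda e: 0.8 if "scan" in e.source else 0.3))
```

The point of the sketch is the shape of the workflow, not the particular scores: evidence is identified, appraised, filtered, and reported, and the cycle repeats as the evidence base grows.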

In futures studies, evidence-based methods such as literature review, trend extrapolation, and horizon scanning may be more attractive than other methods to futurists who prefer to brand their research outputs with quantifiable features. Such futurists look for models (using judgmental or statistical knowledge) and frameworks (conceptual, methodological, and analytical) that provide the typical outputs of evidence-based foresight projects (Popper, 2012). They gradually add new material to their evidence base and determine how policy and strategy might need to change as things evolve.

Jackson (2011, p. 6) reminds us: "discriminating between the uses and usefulness of data is essential to manage the tension between requirements for evidence-based strategy and policy making, and the nature of horizon scanning which seeks to extrapolate possible outcomes from limited intelligence." Sometimes a shortage of relevant data calls for a dedicated effort to generate relevant insights. In such cases, assumption-based techniques enter the scene; they often rely more on expert practitioners than on more interactive approaches.

Miles and Keenan (2003, p. 65) put that distinction another way: "it would be easy to imagine that assumption-based methods are mainly quantitative in form. This is inaccurate. Delphis are expert-based and yield quantitative results. Some sorts of scenario work are mainly qualitative but highly assumption-based. Essentially, the point is that in some circumstances we are able to rely upon data and knowledge of processes and relationships that has already been codified and subject to some scrutiny. In other circumstances, we need to elicit opinions and “guesstimates” from experts when we are considering rapid change, qualitative breaks, social and technological innovations."

Perhaps we should turn our attention to "intelligence analysis" rather than to foresight alone. Systems like SEAS (Structured Evidential Argumentation System) show how effectively intelligence analysis can be done, even more easily than we usually perceive (Schum et al., 2009). However, we should remember that we sometimes present seemingly good evidence without really being sure of its accuracy. Iraq's WMDs, mentioned earlier, are a case in point.
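To give a flavor of structured evidential argumentation, here is an illustrative sketch in Python; this is not the actual SEAS software, and the hypothesis, evidence items, scores, and the minimum-based aggregation rule are all assumptions chosen for simplicity:

```python
# Illustrative sketch of a structured evidential argument, loosely inspired
# by systems like SEAS; the aggregation rule (minimum of child scores) is an
# assumption chosen for simplicity, not SEAS's actual method.

def support(node):
    """Return a 0..1 support score for a node in the argument tree."""
    if "evidence" in node:
        # Leaf: the analyst's appraisal of one piece of evidence.
        return node["score"]
    # Internal node: a claim is only as strong as its weakest supporting link.
    return min(support(child) for child in node["children"])

argument = {
    "claim": "Hypothetical: technology X will be widely adopted by 2020",
    "children": [
        {"evidence": "patent filings rising", "score": 0.7},
        {"claim": "manufacturing capacity is in place",
         "children": [
             {"evidence": "two plants announced", "score": 0.6},
             {"evidence": "supplier contracts signed", "score": 0.4},
         ]},
    ],
}

print(f"Support for root claim: {support(argument):.2f}")  # prints 0.40
```

The value of such a structure is traceability: a weak score at the root can be followed back down the tree to the specific assumption or piece of evidence that produced it.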

Even when we are sure of the accuracy of our evidence, numerous uncertainties and factors may later enter the scene and make it difficult to think comprehensively about the consequences of our actions or decisions. Yes, it is a matter of competitive intelligence analysis: who will be the first to make the best use of intelligence at the right time.

Porter’s model of competitive analysis (Daft, 2010, pp. 65-67) offers a useful lens for identifying the forces that influence evidence-based foresight practice; a sketch of the full model follows below. There is always the threat of new entrants, not just for futurists but for everyone working on strategic issues and topics.
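As a quick aid, here is one way to lay out Porter's five forces for the market in evidence-based foresight services (a sketch in Python; the five forces themselves are standard, but the foresight-specific readings attached to each are my own illustrative assumptions):

```python
# Porter's five competitive forces, read against the market for
# evidence-based foresight services; the readings are illustrative.
five_forces = {
    "Threat of new entrants": "strategists and analysts moving into foresight work",
    "Threat of substitutes": "alternative outputs that can replace foresight services",
    "Bargaining power of buyers": "public and governmental institutions as dominant customers",
    "Bargaining power of suppliers": "access to data sources, experts, and scanning networks",
    "Competitive rivalry": "providers competing on the value of the intelligence they deliver",
}

for force, reading in five_forces.items():
    print(f"{force}: {reading}")
```

Several of these forces reappear in what follows: buyers in the discussion of customers below, and substitutes and rivalry toward the end.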

We know that futurists are tasked with reducing uncertainty and providing timely warnings about potential threats to national and international interests. Success in that mission requires identifying the most accurate evidence pointing to alternative futures. In fact, one of the most important challenges ahead is the redefinition of the foresight profession in terms of the roles, duties, and responsibilities that futurists should assume for the evidence they find in their national and international activities.

While futurists are set to improve the quality and quantity of evidence-based foresight knowledge, how effectively will that knowledge be used, and who will benefit from it? Who will decide how evidence-based foresight knowledge is used, and what will be the purpose of evidence-based foresight projects at national and international levels?

In addition, who is going to buy evidence-based foresight knowledge? The dominant customers will be those with foresight needs spanning a wide range of missions and objectives. Naturally, public and governmental institutions will be the main consumers of such knowledge, with private users standing in the second row.

These customers ask different questions, require different foresight support, and have different tolerance levels for false alarms and different abilities to plan for worst-case scenarios. Their questions require analyses of complex, interrelated domestic and foreign issues, with players from multiple governmental and nongovernmental entities and a wide range of political, economic, social, and technical dimensions. Addressing these questions may require perspectives and expertise from outside the foresight profession; multidisciplinary scanning and analysis teams will surely be needed. Any evidence-based foresight project should be designed and conducted with those points in mind.

How about evaluation? Evidence-based foresight practitioners will be judged by the value of the intelligence they provide to their customers. If clients cannot find answers to their questions from one evidence-based foresight service provider, they will turn to others who can supply them. We should not neglect cultural considerations, either: Schultz (2006) warns of "cultural contradictions" between scanning methods and those preferred by conventional approaches to evidence-based policy making.

There is always the threat of substitutes for foresight outputs and services, which will intensify rivalry among competing foresight service providers. Those who can manage foresight "big data" better will stand a greater chance of surviving in that competitive market. In the end, everything will depend on our ability to develop a reliable evidence-based foresight practice, one that appreciates the value of future-oriented data and uses it at the right time and place to shape better decisions.

References:
Clark, Ruth Colvin. (2010). Evidence-based training methods. Alexandria, VA: ASTD.
Daft, Richard L. (2010). Organization theory and design. Mason, OH: South-Western Cengage Learning.
Dynamed. (undated). 7-step evidence-based methodology, retrieved from https://dynamed.ebscohost.com/content/7-step-process
Glenn, Jerome C. & Gordon, Theodore J. (2009). Futures research methodology—Version 3.0. Washington, DC: Millennium Project.
Halevi, Gali. (2012, May). Evidence-based science policy research seminar, retrieved from http://www.researchtrends.com/issue28-may-2012/evidence-based-science-po...
Hines, Andy & Bishop, Peter. (2006). Thinking about the future: Guidelines for strategic foresight. Washington, DC: Social Technologies.
Jackson, Michael. (2011, January). Practical foresight guide. Shaping Tomorrow, retrieved from http://www.shapingtomorrow.com/media-centre/pf-complete.pdf
McAfee, Andrew & Brynjolfsson, Erik. (2012, October). Big data: The management revolution. Harvard Business Review, 60-68. http://www.hbr.org
Miles, Ian & Keenan, Michael. (2003). Practical guide to regional foresight in the United Kingdom, retrieved from http://cordis.europa.eu/foresight/cgrf.htm
Popper, Rafael. (2012). Understanding foresight and horizon scanning (FHS), retrieved from http://community.iknowfutures.eu/pg/blog/popper/read/12065/understanding...
Schultz, Wendy. (2006). The cultural contradictions of managing change: Using horizon scanning in an evidence-based policy context. Foresight, 8(4).
Schum, David, Tecuci, Gheorghe, Boicu, Mihai & Marcu, Dorin. (2009). Substance-blind classification of evidence for intelligence analysis. In Proceedings of the conference "Ontology for the Intelligence Community: Setting the Stage for High-Level Knowledge Fusion," George Mason University, Fairfax, VA, 20-22 October. SEAS homepage: http://www.ai.sri.com/~seas/index.html

Alireza Hejazi is the founding editor of the Futures Discovery website: http://www.futuresdiscovery.com/. He is a graduate of the Strategic Foresight program at Regent University's School of Business & Leadership.
