8 Decisions and information
Dual-process theory
One of the striking scientific developments around the turn of the century was that several compatible and often complementary views of mental duality emerged independently, in different contexts and disciplines, ranging from social psychology and neuropsychology to decision theory and economics. The notion of many different systems in the brain is not new, but what distinguishes the dual-process theory from earlier views is that it builds on a better understanding of the brain’s biological and cognitive structure to suggest that there are two different types of thinking, each with different functions, strengths and weaknesses:
- Type 1 processes: these are autonomous, unconscious, automatically executed when relevant stimuli are encountered, unguided by higher-level control systems, fast, low-effort, often parallel and with a high processing capacity. Examples of Type 1 processes are simple mental calculations, such as 2 + 3, the recognition of a common animal species like a dog, the detection of hostility in a person’s voice and the identification of a roof in an elevation drawing.
- Type 2 processes, which are the opposite: controlled, analytical, slow, high-effort, generally serial and with a limited processing capacity. Examples of Type 2 processes are demanding mental calculations, such as 3943 × 2187, filling in an insurance form for a motor vehicle accident and looking for inward-opening swing doors wider than 67 cm in a floor plan of a large building.
For their immediate responses in a variety of situations, Type 1 processes rely on encapsulated knowledge and tightly compiled learned information, accumulated through exposure and personal experience common to the majority of people. Some of the best examples concern affordances: the actionable properties an environment presents to us, such as that we can walk on a pavement or go through a door. These are things anyone can do, usually equally well. Other things are restricted to a minority of individuals and are often an indication of expertise, for example in a hobby like fishing or a sport like table tennis. Nevertheless, they are all acquired and encoded in a similar manner. In fact, expertise in dual-process theory is seen as an extension of common capacities through practice: in the same way a child learns to recognize animal species, an expert learns to recognize familiar elements in new situations and deploy ready interpretations and actions. It should be noted that expertise is not a single skill but a large collection of small skills: an expert footballer is capable of simultaneously (or in quick succession) doing many small things both individually and in a team, from controlling a ball coming at different speeds and from different directions to passing the ball to teammates according to an agreed plan, taking into account their individual capacities and position on the field.
Type 2 processes are distinguished into two categories. The first consists of reflective processes, which evaluate Type 1 thinking and its results on the basis of beliefs and goals. Reflection often involves cognitive simulation and can be triggered by the outcome of Type 1 processes, such as the feeling of doubt that arises from frequent failure, for example constant slipping on a frozen pavement. It is also linked to interpersonal communication and the need to explain or justify proposed joint actions and goals, such as the tactics of a football team. Once activated, reflective processes can interrupt or override Type 1 processing, suppress its responses and reallocate expensive mental resources from failing Type 1 processes to searches for alternative solutions through Type 2 thinking.
These solutions are the subject of the second category in Type 2 thinking, algorithmic processes: strategies, rule systems, general and specific knowledge, usually learned formally and therefore bounded by culture. For example, unlike a simple DIY job, a loft conversion requires more meticulous organization, which can be based on empirically acquired knowledge, a textbook on home improvement or Internet tutorials. If the project is undertaken by AECO professionals, it inevitably also draws from their training in design, construction, time planning and site management. One of the basic functions of algorithmic processes is cognitive decoupling: the distinction between representations of the world used in Type 1 processes and representations required for the analysis of imaginary or abstract situations. Cognitive decoupling concerns both substituting the naïve representations implicit in daily life, such as that of a flat earth, and allowing for imaginary situations, such as a cubic earth, which form settings for hypothetical reasoning.
It should be pointed out that Type 2 processes do not necessarily return better results than Type 1 ones. Being highly demanding, mentally expensive, subject to personal intelligence and knowledge, and founded on Type 1 biases or professional thinking habits, they may also lead to failure. Being usually acquired through formal learning, they may also be limited or failure-prone because of theoretical and methodological issues. For example, before the Copernican revolution, learned astronomers made fundamental mistakes because they based their work on erroneous earth-centric models, not because they made errors in their calculations. What Type 2 processes certainly do is avoid the biases of Type 1 thinking and therefore produce smaller error margins that are, moreover, not skewed too high or too low with respect to the truth. Kepler’s laws of the heliocentric model and Newtonian mechanics form a sound basis for calculating planetary motion with sufficient precision and accuracy, regardless of the means used for the calculation.
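As a small illustration of that last point, consider Kepler’s third law, which for bodies orbiting the Sun reduces to T² = a³ when the period T is measured in years and the semi-major axis a in astronomical units. The sketch below (taking Mars’s semi-major axis as roughly 1.524 AU) gives the same result whether the law is worked out by hand, with a calculator or in code.

```python
# Kepler's third law for bodies orbiting the Sun: T^2 = a^3,
# with T in years and the semi-major axis a in astronomical units.

def orbital_period_years(a_au: float) -> float:
    """Orbital period (in years) of a body orbiting the Sun at semi-major axis a (in AU)."""
    return a_au ** 1.5

print(orbital_period_years(1.0))    # Earth: 1.0 year
print(orbital_period_years(1.524))  # Mars (a ~ 1.524 AU): about 1.88 years
```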
Dual-process theory has a number of advantages that explain its acceptance and popularity in many application areas. One advantage is that it makes evident why we are so good at some tasks and so poor at others. For example, we are good grammarians: children at the age of four are already capable of forming grammatically correct sentences. By contrast, we are poor statisticians: we are clever enough to have invented statistics but nevertheless fail to apply statistical thinking in everyday situations that clearly demand it, such as games of chance. The reason for this is that Type 1 processes represent categories by prototypes or typical examples. As a result, they rely on averages and stereotypes, avoiding even relatively easy calculations for drawing conclusions about individual cases from what is known about relevant categories. Interestingly, most people have little difficulty making these Type 2 calculations when asked to do so.
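What such a Type 2 calculation looks like can be sketched with Bayes’ rule; the scenario and the numbers below are hypothetical, chosen only to show the form of reasoning that Type 1 thinking skips when it falls back on the stereotype.

```python
# Hypothetical example: how likely is an element flagged as "non-compliant" by a
# quick visual check to be truly non-compliant, given the base rate of
# non-compliance in its category? Bayes' rule combines the base rate with the
# reliability of the check.

def posterior(base_rate: float, hit_rate: float, false_alarm_rate: float) -> float:
    """P(non-compliant | flagged)."""
    p_flagged = base_rate * hit_rate + (1 - base_rate) * false_alarm_rate
    return base_rate * hit_rate / p_flagged

# Assumed values: 5% of elements are non-compliant; the check catches 90% of
# these but also wrongly flags 10% of the compliant ones.
print(round(posterior(0.05, 0.90, 0.10), 2))  # 0.32: much lower than intuition suggests
```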
Such variability in cognitive performance should not be mistaken for inconsistency. It is instead an indication of conflicts that are inherent in our cognitive mechanisms. Dual-process theory has the advantage that it includes these conflicts in its core, as opposed to treating them as a loose collection of anomalies to be resolved afterwards, as exceptions to the rules. The most fundamental conflict is that Type 1 processes which undeniably serve us well in most situations are also the cause of frequent and persistent failures, some of which are discussed below. A related conflict is the constant struggle between Type 1 and Type 2 thinking for dominance and mental resources. At practically every moment of the day, we need to choose between what we do automatically and what requires analytical treatment. Sometimes what attracts our attention is an established priority, such as an exam question. At other times, it is a sudden occurrence or a new priority, for example the unexpected opening of a door or a cramp in the writing arm. At yet other times, it is anti-data, such as a pen that fails to write. We are constantly asked to prioritize in a continually changing landscape of often apparently unrelated tasks that nevertheless affect our performance both overall and with respect to specific goals. Unfortunately, we often fail because of Type 1 biases and underlying cognitive limitations.
Cognitive illusions, biases, fallacies and failures
Inattentional and change blindness
A typical failure caused by our cognitive limitations is inattentional blindness: we systematically fail to notice unexpected things and events, especially when we are concentrating on another, relatively hard task. This is why we may fail to see a cyclist appear next to the car we are driving in heavy traffic (unless of course cyclists are an expected part of this traffic, as in most Dutch cities). Inattentional blindness is hazardous in traffic but the same neglect for unexpected things around us is actually helpful in many other situations because it allows us to reserve our limited cognitive capacities for important tasks at hand. When taking an exam, for example, we do not want to be distracted by extraneous stimuli, such as noise coming from outside, unnecessary movement in the room or incoming messages on social media.
Closely related is change blindness: the failure to see obvious changes in something unfolding in front of our eyes, primarily because we are concentrating on something else, usually a narrative (a subject discussed in more detail later in this chapter). Typical examples are continuity errors in films, which most viewers miss until someone points them out, after which they become glaringly and permanently obvious to all. Just like inattentional blindness, change blindness is due to our memory limitations: perception is followed by recognition of meaning, which is what we encode in memory rather than every detail of the percept. What we retain in memory is an image or narrative that is above all coherent and consistent with its meaning. This also means that we may embellish the memory with fictitious details that fit the meaning and the emotions it elicits. Vivid details are often an indication of such reproduction after the memory was formed.
Planning and sunk-costs fallacies
A popular subject in news stories is projects poorly conceived by clients and developers, inadequately understood by the politicians and authorities that endorse them, and unquestioningly attempted by designers and engineers, with dramatic failures as a result. Such projects are often linked to the planning fallacy: we tend to make designs, plans and forecasts unrealistically close to best-case scenarios, neglecting to compare them to the outcomes of similar projects and therefore repeating mistakes that have led to previous failures. Common reasons behind the planning fallacy are the desire to have a design approved, pressure to have a project completed and a general tendency to act fast so as not to cause delays. These push decision makers to quick, overoptimistic, typically Type 1 decisions that remain unchecked.
Interestingly, these attitudes persist when the failure becomes apparent, usually through the height of the sunk costs. Rather than accept defeat and cut their losses, many stakeholders insist on throwing good money after bad, desperately continuing the project in the vain hope of turning its fortunes around. The escalation of commitment caused by the sunk-costs fallacy is often celebrated for merely reaching some goals, ignoring the devastation it has brought along. It seems that the height of the sunk costs actually increases commitment, as well as the misguided appreciation of partial goals (often just the completion of the project), which confuses stubbornness and incompetence with heroism.
In principle, optimism in the face of danger or failure is a positive characteristic. It encourages us to persist against obstacles and, in many cases, to overcome them. A defeatist attitude under difficult conditions is obviously unhelpful for survival but the same also holds for insistence that is uninformed and uncontrolled by rational analysis. For example, any decision to repair a damaged car should depend on the technical and economic feasibility of the repairs, in relation to the value and utility of the car. Endless expensive repairs of a car that can be easily replaced by another rarely make sense.
Inside and outside view
Such fallacies are reinforced by the tendency of stakeholders to stick to inside views: views shared by the project participants, relying too much on information produced in the project by the participants themselves. By repeatedly using and sharing this information, such as an early budget or time plan, they end up believing it unquestioningly, even when new developments should cast doubt on earlier decisions and lead to major adjustments in the project (a parallel with change blindness). Consequently, participants subscribing to an inside view tend to underestimate problems and dangers that are evident from an outside view, i.e. one that covers the whole class to which the project belongs and the statistics of this class. Failing to adopt outside views reduces the validity of any basic decision in a project. It also causes unwarranted, pervasive optimism: the mistaken belief that this project is surely not subject to the causes of failure in other, very similar projects.
Illusions of knowledge, confidence and skill
Planning fallacies and inside views are linked to the illusion of knowledge: we think that we know more than we actually do because we understand what happens and mistake it for an understanding of why it happens. The consequence is that we rarely doubt our beliefs and assumptions, and, when confronted with a task, we plunge into direct action before we fully understand the situation. The more complex the task, the more profound the illusion and the more dangerous its effects.
A complementary illusion is that of confidence: we tend to delegate tasks to actors on the basis of what they believe they can do. Even without relevant qualifications or a convincing track record, a person can be confident about their competence to do something and claim it. Even worse, others are inclined to accept the claim: we treat the confidence of a person in their abilities as a true measure of their skill or expertise and therefore overestimate their capacities. However, as talent shows make abundantly and embarrassingly obvious, incompetent people are often overconfident about their abilities. They audition for something that they clearly cannot do, not as a joke but because they genuinely believe in themselves. Moreover, the illusion of skill from which they suffer means that they are less inclined to improve their skills. By contrast, highly skilled persons, such as top athletes and celebrated musicians, always look for ways to improve themselves, e.g. through constant, demanding and often innovative training. Similarly, true experts are aware of their limitations and scope (unlike users of questionable or arbitrary expert-like heuristics), and constantly try to augment or refine their interpretations and solutions in every new situation they encounter.
The illusions of knowledge, skill and confidence are not confined to extreme cases, like the ones in talent shows. Most people suffer from them, typically undertaking jobs they botch and abandon, e.g. in DIY. More importantly, they tend to attribute good performance to superior skills or expertise rather than luck, and bad performance to accidental or unforeseen conditions that are beyond their control or to the incompetence or obstruction of others. Even seasoned professionals may be confident that they can achieve the necessary goals in a project, despite having failed to deliver in previous projects. However, experience is not the same as expertise or ability. The true hallmark of knowledge and skill is a persistently high performance, as attested by our expectations from e.g. professional musicians and surgeons.
Substitution
One of the clever strategies of Type 1 thinking is that difficult problems are routinely substituted by simpler ones. Rather than taking the trouble of calculating a product like 3943 × 2187 precisely, we approximate by calculating 4 × 2 and then adding zeros for the thousands. This is acceptable in many situations but can be misleading in others. For example, when people are asked how happy they are, they invariably base their answer on their current mood and give an answer that reflects only very recent events and conditions. Regrettably, the fluency with which Type 1 thinking finds solutions to the simple problems makes us think that we have found adequate solutions to the complex ones they have substituted. Consequently, we seldom attempt to solve or even understand the complex problems. Instead, we resort to approximations and received wisdom, and so frequently fail to utilize the tools, knowledge and information at our disposal. Simplifying a mental calculation may be a clever strategy but doing so in a spreadsheet or a calculator is pointless, especially if precision is required.
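The arithmetic of this particular substitution is easy to make explicit; the short sketch below simply compares the approximation described above with the exact product.

```python
# Substitution in arithmetic: round 3943 and 2187 to 4000 and 2000, multiply the
# leading digits (4 x 2 = 8) and append the six zeros of the two thousands.

exact = 3943 * 2187            # 8,623,341
approx = 4 * 2 * 10**6         # 8,000,000
error = (exact - approx) / exact
print(exact, approx, f"{error:.1%}")  # relative error of roughly 7%
```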
Framing
Fallacies and illusions often relate to narrow framing: focusing on the individual project rather than the enterprises involved in it or focusing on a single problem rather than the whole project. What appears to be the right decision in a narrow frame is often patently wrong when considered more broadly, in relation to other issues and options or for a longer term. However embarrassing, the termination of a failing project may be right for the enterprise and its fortunes, as it allows allocation of resources elsewhere instead of increasing sunk costs. Crucially, a situation can be framed positively or negatively. Our reactions to e.g. the Internet are influenced by how it is presented: either as a source of hidden dangers or a universe of new opportunities. Similarly, a budget overrun of 46.3 million euros for a major, demanding building may raise few eyebrows among people familiar with much worse cases in infrastructure and defence projects, but the same overrun expressed as 26% of the original budget sounds more serious.[1]
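The two framings of that overrun can be reproduced directly from the figures in the example: the absolute amount and the percentage together imply the original budget.

```python
# The same overrun in two frames: as an absolute amount and as a share of the
# original budget implied by the figures quoted above.

overrun = 46.3e6                    # euros
share = 0.26                        # 26% of the original budget
original_budget = overrun / share   # roughly 178 million euros

print(f"absolute framing: {overrun / 1e6:.1f} million euro overrun")
print(f"relative framing: {overrun / original_budget:.0%} of a {original_budget / 1e6:.0f} million euro budget")
```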
One of the most significant recent developments in decision theory goes beyond the realization that framing and the context it creates influence decision making: it actively deploys a choice architecture that structures this context and nudges decisions towards better choices without forcing any outcomes. This involves using suitable defaults (e.g. opting out of organ donation instead of opting in) and making the various options more comprehensible (by providing clear information on what they entail in both the short and the long term), so as to prevent people from resorting to simplistic solutions when the number of choices and the complexity of problems increase. Nudging accepts the possibility that people make errors and develops means to suppress them. In addition to the above feedforward mechanisms, a choice architecture should also provide feedback that informs people on the consequences of their decisions, warns them if things go wrong and allows them to learn about the tasks they are facing and their own behaviour and performance.
Narratives, coherence and causes
We have all experienced it in one form or another: we have attended a presentation that convinced us but afterwards it was unclear why; we have enjoyed watching a film only to realize later that the plot was full of holes. The success of a narrative is often due to the halo effect: a presentation delivered by someone we like, respect or admire is received positively by Type 1 thinking, often impeding Type 2 processes from analysing its content. But even without this effect, our love of narratives and the coherence they bring to our perceptions and experiences are decisive. Type 1 thinking looks for coherence in a story and confuses it with quality of evidence: if the narrative is coherent, it is plausible and therefore acceptable. Its probability matters little; it suffices that it is a simple, concrete story about a few striking events, with enough detail to make it realistic.
These are the kinds of stories that convince and appeal to us in real life, in literature or in films. They usually tell us that practically everything is due to talent, stupidity or intention. They can therefore be didactic or inspirational, allowing us to marvel at the leadership of a great military hero or a titan of industry. The role of luck and circumstance is often ignored, in a way that makes us believe not only that we understand the past and the present but also that we can predict and control the future. On the positive side, this allows the development of beliefs and confidence in them, so that we are not easily daunted by adversity or yet unsolved problems. The “never mind, we’ll find a solution; let’s crack on now” mentality can be beneficial for survival but not necessarily helpful with problems that require careful analysis and planning, such as big construction projects.
Unfortunately, narratives may contain invented causes that help facts connect and make sense together. This also implies selectivity as to which facts are included. As information is costly to obtain, store and process in our brains, we apply the same simplification as in our memories: we reduce narratives into something simple that makes sense and reveals agents, intentions and causes with clear characteristics and relations. Interestingly, we then embellish narratives (similarly to memories) with details that make them more believable, consistent and coherent — details that may have been invented or imagined, at the cost of others that reveal the complexity or randomness of reality. This retrospective distortion of facts enhances the illusion of fully understanding reality and being able to predict it, and sustains a dangerous eagerness to accept overgeneralizations like “generation X”, zodiac signs or perfect curves that profess to be analytical, despite their lack of foundation in real data.
Part of the appeal of narratives lies in their linear structure, which reinforces three biases that make us jump to conclusions: our minds are built to assume causality on the basis of precedence in time, to infer causes from coincidence and to detect meaning in patterns. These Type 1 cornerstones allow us to connect events like the flicking of a switch to a light turning on, identify malfunctions in a machine by the sounds it makes and recognize familiar persons in a crowd by the way they move. On the negative side, they lead to illusions of cause when we infer relations and patterns in random data. The illusion may affect even experts who, when confronted with randomness or unknown patterns, do not pause to consider alternatives but remain in Type 1 mode and seek confirmation of what they know or expect. Spurious correlations are quite easy to find given enough data but unfortunately not all of them are as obviously ridiculous as the link of US spending on science, space and technology to the number of suicides by hanging, strangulation or suffocation; margarine consumption to the divorce rate in Maine; the age of Miss America to murders by steam, hot vapours or hot objects; the consumption of mozzarella cheese to civil engineering doctorates; or superstitions like an athlete always wearing the same “lucky” socks.[2]
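How little it takes for such spurious links to appear can be shown with a small simulation, a sketch in which random series stand in for the real-world time series mentioned above.

```python
# Generate many unrelated random time series and report the strongest pairwise
# correlation: with enough series, impressive "links" appear by chance alone.

import numpy as np

rng = np.random.default_rng(0)
series = rng.normal(size=(200, 20))   # 200 unrelated series of 20 yearly values
corr = np.corrcoef(series)            # 200 x 200 matrix of pairwise correlations
np.fill_diagonal(corr, 0)             # ignore each series' correlation with itself
i, j = np.unravel_index(np.abs(corr).argmax(), corr.shape)
print(f"series {i} and {j}: r = {corr[i, j]:.2f}")  # typically |r| well above 0.7, purely by chance
```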
Narratives are also inherently unfair, as we realize from the need to change the narratives of the colonial past: the old depictions and descriptions of persons and events were at best partial and selective, providing coherence and simplicity by downplaying the complexity of a world that often seems inexplicable because it contains things we consider improbable or fixed. Change blindness makes us fail to notice things that change if they fall outside the narrative and so eliminate them from the story, which is then presented in a way that reinforces the remaining points. In this way, other facts and perspectives remain hidden, only to resurface too late, when the partiality and unfairness of the narrative has become painfully obvious.
Dual processes and information
A widely accepted explanation for our frequent, systematic cognitive failures is that the otherwise highly efficient Type 1 processes stem from our evolution and adaptation to a world quite different to today’s highly technological, fast and busy environment. Back then, our cognitive systems developed ways to allocate their limited resources to the requirements of fundamental tasks at walking or running speed, not to demanding problems in mathematics or the much higher speed of motorized traffic. The world has since changed, as have our activities and priorities in it, but Type 1 processes remain the same, making us prone to biases and errors. What makes us able to use simple construction tools to build a garden wall is not what is required in planning and executing the realization of a skyscraper.
Unfortunately, our capability at many everyday tasks makes us overconfident about our cognitive abilities in general. We confuse the fluency of Type 1 processes with deep understanding and treat it as a general performance guarantee. This causes the cognitive illusions that make us underperform with worrying regularity. Even worse, we seem unable to appreciate the significance of these illusions and the frequency or magnitude of our failures. Law courts, for example, insist on putting too much weight on eyewitness accounts, despite known limitations of our memory and especially the tendency to reconstruct memories and complement them with fictitious details that enhance their meaning.
Most scientists agree that our cognitive limitations cannot be avoided. All we can do is be aware of them and remain alert to situations and conditions that require activation of Type 2 reflective processes: learn to recognize and neutralize biases, so as to avoid the pitfalls of intuitive, automatic decisions and actions. To do so, we require information that helps us both fully understand the situation and identify better solutions through Type 2 algorithmic processing. Making explicit the information surrounding every task in a process is therefore a prerequisite to any improvement in our decision making. Finding this information and processing it towards better solutions often involves the use of technologies that aid memory, such as writing, or processing, such as calculators. It follows that digital information technologies are a key part of the further, hybrid evolution of human thinking. Matching their structure and potential to our cognitive capacities is the starting point for understanding how IM can support decision making.
AECO and dual-process theory
Among the examples of failure mentioned in dual-process literature, AECO has a prominent place: its history is full of examples of the planning fallacy. Defence, industrial, ICT and infrastructure projects may result in higher overruns but the AECO examples are far more frequent and widespread. The reasons behind the planning fallacy appear to be endemic in AECO and regularly lead to hasty designs or project plans, built on largely unfounded Type 1 decisions. As one would expect, the disregard for realism and reliability does not change when projects start to fail or are affected by sunk costs. AECO stakeholders stubbornly keep investing in failures, focusing on project goals and anchoring on plans that are clearly faulty. Despite the availability of data, knowledge and advanced tools, many decisions are taken on the basis of norms and rules of thumb, which encourage superficial treatment of building performance and process structure. The result is that costs increase disproportionately, while even minor cuts or concessions dramatically reduce quality and scope. In the end, it seems that all that matters is that the buildings are realized, however expensive or poor. Even when we avoid outright failure, we generally produce buildings with the same severe limitations, through the same processes that have returned so many earlier mediocrities or failures.
The planning fallacy in AECO is reinforced by strong inside views. The dictum “every building is unique” is misleading because it ignores similarities not only between the composition and performance of buildings at various levels (from that of individual parts like doors and corridors to common configurations like open-plan offices) but also between the processes of development, realization and use in different projects. It leads to an arrogant, persistent repetition of the same mistakes, coupled with a lack of interest in thorough analyses that could reveal what goes wrong. It seems that the main priority of AECO is to keep on producing, even though the high turnover in construction is linked to a rather low mark-up for many stakeholders.
Under these conditions, substitution is rife in AECO. Practically everything is kept as simple as possible, regardless of what is actually needed. This especially affects forecasts, such as cost estimates, and analyses, which are reduced to mere box ticking against basic norms like building codes. Compliance clearly weighs more heavily than real performance, and this adds to the persistence of outdated approaches and technologies. While other production sectors increasingly invest in advanced computerization, AECO insists on doing too much manually, while relying heavily on cheap labour.
Such issues are exacerbated by the frequently loose structure of AECO processes, which include large numbers of black boxes with uncertain connections between them. These black boxes generally relate to the illusion of confidence, which underlies the delegation of aspects and responsibilities to specific actors. It is often enough that these actors represent a discipline relevant to an aspect; if they also exude confidence in their knowledge and decisions, we tend to take it for granted that they know what they are doing and so impose few if any checks on their contributions, even though any sensible manager should insist on such checks, given the high failure rate in AECO projects and the blame games that follow failure.
As for narratives and their capacity to obscure true causes, the halo effect has a strong presence in AECO, as attested by the attention given to famous architects and the belief that their designs are good by default. Any failure by a grand name is either a heroic one, attempted against all odds, or the result of unjust lack of acceptance or support by others. The culture of learning from prominent peers means that lesser AECO professionals are also fully aware of the power of a coherent narrative and therefore often choose to focus on simple, strong ideas rather than detailed, fully worked-out designs and plans. This is facilitated by ritual processes, formulated as prescriptive, box-ticking sequences of actions, products and stages: a process can usually proceed only if the previous steps have been completed, for example, if there are drawings of the design at the agreed scale, budgets and time schedules. The presence of these carriers is more important than the quality of their content, which remains unchecked until a problem emerges later on. So long as the coherence of the core narrative holds, we see what we want to see, oblivious to our inattentional or change blindness. This also leads to illusions of cause: we often attribute problems and failures to the immediately previous stage of a project, instead of searching for true causes throughout the project.
The fallacies and illusions in AECO are closely related to narrow and biased framing that isolates problems and solutions. For example, policy makers propose intensified, denser and higher housing construction in the Dutch Randstad in order to meet demand, without linking this to other issues, such as environmental concerns (e.g. the negative effects of urban heat islands) or transportation problems. These issues are subjects of other ongoing debates and current policies, which are kept separate, even though they are obviously related to urbanization and will be directly affected by housing development. By keeping them out of the housing demand frame, their resolution is deferred to a later stage, when the effects of this urbanization wave may have become painfully apparent and possibly irreversible. The danger is that they will be addressed only then, again in a piecemeal, narrow-frame fashion.
In short, despite the nature of its problems and solutions, AECO thinking appears dominated by Type 1 processes. Quick decisions based on norm, habit, principle or goal proliferate, notwithstanding the availability of detailed specifications (e.g. construction drawings) that could be used for precise evaluations of validity and feasibility. Moreover, such specifications and evaluations usually come after the decisions are taken and are constrained by the resulting narratives: what does not fit the basic message that should be conveyed in a project is frequently underplayed.
The social and information sides
An ingrained bias in process management is the frequent overemphasis on the social side: the stakeholders and actors, their interests, what they should do and when, and how to align them towards common goals and joint products. While this side of management is obviously significant, it also entails the danger of black boxes, formed out of generic, vague assumptions and expectations. We roughly know the capacities and remit of each participant in a project, so we feel disinclined to specify their contributions to the project or the connections between contributions in detail. Instead, we assume that everyone will do their bit and problems will be solved automatically, perhaps even before we realize their existence.
However, as the dual-process theory explains, this view is a highly suspect product of Type 1 thinking. The consequences of uncontrollable black boxes and undefined connections can be quite grave. A manager can always adopt a laissez-faire attitude and wait for crises to emerge in a project, in the knowledge that crises trigger action. Unfortunately, not all problems qualify as crises. In many cases, failure is the sum of many smaller problems that remain systematically unsolved (a characteristic malaise in AECO). But even when the problems are big enough for the crisis to be recognized, it may be too late for effective and economic solutions (as the sunk-costs fallacy indicates). The obvious solution is to structure processes clearly and consistently as a sequence of specific tasks for particular actors (the subject of the chapter on process diagrams), so that we can deploy constructive Type 2 reflection. However, the resulting critical review of a process is not enough. We additionally need to understand and control the process in terms of its main currency, information: what connects stakeholders, actors and tasks, what is consumed and produced in each task.
By managing information in addition to social interactions, we ensure that each task receives the right input and returns the right output, that tasks are unambiguously linked (as the output of one becomes input to another) and that all tasks are consistently and coherently specified. Each participant can consequently know what is required of them, and what they should have to do their job. This makes the management of a process transparent and controllable, devoid of black boxes and grey areas. In this sense, the information side validates the social side, revealing possible gaps and inconsistencies, and removing unnecessary vagueness from the process.
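A minimal sketch of what this amounts to in practice follows; the task names and information items are hypothetical, but the principle is simply that every task declares what it consumes and produces, so that gaps and ambiguities become visible before they cause problems.

```python
# The information side of a process: each task declares the information it
# consumes and produces; a simple pass over the planned order reveals inputs
# that no earlier task (or the project start) supplies.

from dataclasses import dataclass

@dataclass
class Task:
    name: str
    inputs: set[str]   # information the task consumes
    outputs: set[str]  # information the task produces

tasks = [  # hypothetical tasks in their planned order
    Task("brief", set(), {"requirements"}),
    Task("design", {"requirements", "site survey"}, {"floor plans"}),
    Task("cost estimate", {"floor plans"}, {"budget"}),
]

available: set[str] = set()  # information available at the start of the process
for task in tasks:
    missing = task.inputs - available
    if missing:
        print(f"{task.name}: missing input {sorted(missing)}")  # -> design: missing input ['site survey']
    available |= task.outputs
```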
Decision making in AECO is ostensibly based on expertise and often takes place in settings that justify the emphasis on expertise: confusing and dynamic situations, unclear goals and poorly defined procedures, a looming larger context and time pressure. These are typical of conditions in the field, where true experts, such as firefighters, pilots and nurses, must take rapid, difficult decisions under pressure and uncertainty. One could argue that similar conditions exist in many AECO situations, from design reviews to construction sites. However, the similarity is generally superficial or even artificial. There are very few situations in AECO that qualify as true emergencies. In most cases, it is the emphasis on the social side of management and the power games it involves that define the conditions, underplaying the information side and its significance for decision making. In essence, emergency conditions are created in AECO through poor preparation, inadequate procedures and lack of analyses or development in depth. Such inadequacies create the illusion that decisions must be taken on principle or on the basis of expertise, in total disregard of the huge amounts of information AECO professionals typically produce in documenting a situation and developing a design or plan. Relegating this information to a mere backdrop for the personalities involved in a project makes little sense, both for the functioning of decision makers and with respect to the information-intensive character of AECO.
Even when expertise is called for, it is not a matter of gut feeling alone. Analyses of decision making by experts suggest that it is a two-stage process: it does start with intuition (recognition of how one needs to respond to a situation) but this is followed by deliberate evaluation (simulation of response and its outcomes). In this blend of intuition and analysis, the simulation can be mental but is nevertheless based on information, including both general rules and specific characteristics of the particular situation. A firefighter, for example, needs to know the materials and structure of a building on fire in order to predict how the envelope and the load-bearing structure may behave or how smoke may develop in egress routes. The more relevant information is available, the better for the decision making. Furthermore, experts often rely on explicit information tools for aiding their memory and structuring their procedures, such as data charts and checklists.
The conclusion we can draw from this is that even if we treat every participant in an AECO project as an expert, capable of amazing Type 1 feats, there is every reason to invest in IM for a number of critical reasons:
- Clear information structures and flows allow managers to understand what takes place and guide the process.
- Reliable and meaningful information around each task helps other participants evaluate and adjust their own actions, generally without the need for interventions by the manager.
- Any such intervention can be made less confrontational, as well as more operational, if conveyed through information.
Recommended further reading
Four books that explain dual-process theory, in order of accessibility to a wider audience:
- Kahneman, D. (2013). Thinking, fast and slow. New York: Farrar, Straus and Giroux.
- Chabris, C. F., & Simons, D. J. (2010). The invisible gorilla: and other ways our intuitions deceive us. New York: Crown.
- Stanovich, K. E. (2011). Rationality and the reflective mind. New York: Oxford University Press.
- Evans, J. S. B. T., & Frankish, K. (2009). In two minds: dual processes and beyond. Oxford: Oxford University Press.
Nudge theory and choice architecture are presented in: Thaler, R. H., & Sunstein, C. R. (2021). Nudge: the final edition. New Haven: Yale University Press.
An enlightening analysis of true expertise is: Klein, G. A. (1998). Sources of power: how people make decisions. Cambridge, MA: MIT Press.
The benefits of checklists for medical and other experts are presented in: Gawande, A. (2010). The checklist manifesto: how to get things right. New York: Metropolitan Books.
Key Takeaways
- Thinking comes in two different types, Type 1 (fast but biased) and Type 2 (analytical but slow and expensive)
- Type 2 processes can be reflective (evaluation of Type 1 results) or algorithmic (strategies, rule systems etc., often based on cognitive decoupling)
- Failures due to Type 1 thinking include: inattentional and change blindness; planning and sunk-costs fallacies; inside views; illusions of knowledge, confidence, skill and cause; inappropriate substitution; narrow framing; false coherence, invented causes and partiality in narratives
- The activation and execution of Type 2 processes depends on information
- AECO exhibits many failures that suggest a dominance of Type 1 thinking
- Management has a social and an information side
- True expertise relies on information
Exercises
- Analyse a project known from literature to have suffered from planning or sunk-cost fallacies:
  - What were the causes of the failure in terms of the dual-process theory?
  - What would be the right frame and outside view for the project?
  - Which information is critical in this frame and view?
- Study the MacLeamy curve (any variation):
  - Is it a law, a generalization (like Moore’s “law”) or an overgeneralization?
  - Which information is necessary for constructing the curve?
  - Give three examples that do not fit the curve, including specific information and the resulting curve.
1. The budget overrun example concerns the Amare cultural complex in The Hague, as calculated by the local Court of Auditors: https://www.rekenkamerdenhaag.nl/publicatie-onderwijs-en-cultuurcomplex/.
2. For a number of amusing spurious correlations, see http://tylervigen.com.