Resolving Bias Through Data
This is Part 5, the final part, of a series on the top problems with Project Management Tools. See Part 1: Old Tools, New Rules, Part 2: The Illusion of Control, Part 3: Breaking the Cycle, and Part 4: One Source of Truth to catch up.
So much of our work and personal lives is about communication. Everything we do, whether building companies or families, collaborating with business or romantic partners, or even negotiating the price of a new car, comes down to our ability to communicate effectively. It’s a fine blend of listening, interpreting, understanding, and sharing that takes a lifetime to master.
Image credit: https://theimmeasurable.org/cognitive-bias
Part of the challenge is an almost endless list of cognitive biases that play out in our brains, shaping our thoughts and decisions in ways so subtle we are almost never aware they even exist. The unconscious biases are particularly nefarious, guiding us to regularly make what seem like highly informed decisions without any real awareness of where that certainty came from. As of the time of this post, Wikipedia lists 125 decision-making, belief, and behavioral biases, plus a whole bunch more social (attributional) and memory error biases. Our brains are complex machines, full of levers and valves we’ll never even see or understand.
A handful of those decision-making biases are particularly relevant to the question of project management and collaboration. Take, for example, the ambiguity effect: the tendency to avoid options for which the probability of a favorable outcome is unknown. Let’s say you’re managing a complex technology delivery project and your team calls an emergency meeting to figure out how to overcome a particularly thorny new obstacle. One of the developers has discovered a problem with the plan — maybe a third-party API doesn’t actually do what it’s supposed to — and it’s going to cause a significant delay. You have two options to choose from: continue down the current path and resolve the issue for sure, but with three extra weeks added to the critical path, or replace the API with a different, unknown vendor who claims they can have everything connected and working in one week. The ambiguity effect says you’re more likely to pick the first option, since you know it will result in a favorable outcome. Your decision might even be further complicated by anchoring (or focalism), the tendency to rely too heavily, or “anchor”, on one trait or piece of information when making decisions (usually the first piece of information acquired on that subject). If your first exposure to the new vendor was one developer on the team mentioning that she’d heard bad things, that might further anchor your tendency toward the first option.
Image credit: Scott Adams
Another particularly pernicious one is confirmation bias, the tendency to search for, interpret, focus on, and remember information in a way that confirms one’s preconceptions. The #FakeNews epidemic we find ourselves in today is an extreme version of this, in which people seek out news and opinions only from sources that agree with them, further confirming their existing perspectives and creating a bigger and bigger divide. The same happens when making decisions on projects. Let’s imagine this time that you’re working on a massive transformation project for a Fortune 500 consumer packaged goods firm. The strategists leading the charge have been given a blank-slate mandate by the leadership team and are working hard to reinvent the business model from the ground up, rethinking everything from packaging to distribution to digital services and products. Given such an unbounded canvas, it’s natural that groups form within the strategy team who believe the company should go in very different directions. Each of those groups will go off and do their research and discovery workshops and vision exercises, and each will come to the seemingly inevitable conclusion that theirs is the only path. Confirmation bias virtually guarantees that outcome, ensuring that each group goes off armed with strong conviction and an equally strong tendency to pay attention only to the data that confirms it.
Truth can be a very subjective thing. At some level, all data is inherently biased because it was created by humans. We unconsciously imbue everything with the biases rampant in our own heads. We look for loopholes and ways to game systems because it’s in our nature to find shortcuts and to optimize outcomes. All of those loopholes affect the data that gets collected, resulting in what may seem like a clean, data-driven view that is built on very unsound foundations. Even unintentionally, the design of our research can return highly biased results, leading to very faulty conclusions.
In our New York Times bestseller The Decoded Company we talk about two kinds of data:
There are two types of data: self-reported and ambient. Self-reported data requires someone to manually input it in order to measure it. Filling out time sheets, surveys, performance evaluations, and expense reports are all examples of this type of data. Ambient data is information about a behavior that is automatically collected without the subject having to actively enter each data point. Swiping into work with an active RFID badge, sending e-mails, making calls, and even adding events to an electronic calendar are all examples of ambient data.
We went on to explain that the crux of the problem with self-reported data is that it’s full of biases. Too much of the information being tracked has been filtered through the interpretations of the people who input it. They might have distorted it deliberately, much as a team member might fill in a time sheet to reflect what they think their boss wants to see rather than how they really spent their time. Or it might happen subconsciously. A self-evaluation can skew more positively or negatively depending on a person’s beliefs about their own behaviors, strengths, and aptitudes.
In the workplace, the tools that most people use are often full of self-reported data — both qualitative and quantitative — that paint a biased and usually self-serving perspective of project status. The real, hard truth lies somewhere in between and can only be reached through a combination of ambient data, including real-time status updates produced as a byproduct of the actual work getting done, and qualitative data that captures team members’ gut instinct. Consider the difference, for example, between a complex project plan that no one ever looks at (as we explored in Part 3: Breaking the Cycle) and a modern, collaborative work management tool like Conductor where all of the work happens in a shared platform that dynamically adjusts the plan in real-time. Or the incredible amount of time and effort that goes into producing regular status reports — our partners and customers estimate that about 25% of their work effort goes into that — rather than an automated Flash Report process that produces PowerPoint decks with the latest information already gathered from a single source of truth.
Only by moving away from slow, legacy systems filled with self-reported data toward real-time, modern tools automatically filled with ambient data can we escape a biased, rearward-looking view of reality and move toward true project orchestration.