Relatively small misconceptions about evidence inside of businesses that you can be pretty certain are common, and that have pretty big consequences imo:
- People are very bad at estimating base rates and at remembering them. We quite often think things are MUCH rarer or MUCH more common than they are. Check base rates, contextualize conversations with them, and drop real numbers in instead of letting the conversation slide around on vague claims.
-
- People don't like to talk about what we're NOT measuring. But it is easy to see what a measure doesn't capture while you're designing it: maybe a certain use case, or a certain type of information. Whatever it is, people probably knew it at the beginning. Document that absence and mention it when you report the measure. I've seen teams "hold the space" for the things they know they're not measuring ("we know we're not measuring X"), and that is really useful context and a really useful reminder.
-
- People are very bad at estimating what a *reasonably sized change* would be. This could be an entire piece of scientific work, but even your average person can get far by literally just asking, "hm, what do I think is a reasonable amount of change to expect in this amount of time, for the amount of effort/investment/intervention we're putting into this?" Better yet, find examples of similar changes to ground your prediction, but the question alone is a start.
-
- No surprise to UX or any other user-facing part of the business, but most people seem to prefer dwelling on ideal use cases to solving real-world implementation problems. Implementation in the real world determines everything. Asking whether we have honestly gathered enough data about that real world, and whether we're looking at and talking about the right data, really seems to make or break evidence-based reasoning in a lot of businesses.
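The base-rate point above can be made concrete with a toy Bayes calculation. All the numbers here are invented for illustration (the scenario, rates, and variable names are my assumptions, not from the thread): even a fairly accurate signal, applied to a rare event, produces mostly false alarms, which is exactly the kind of surprise that checking base rates prevents.

```python
# Hypothetical scenario: a CI test fails. How likely is it a real
# regression rather than a flaky failure? All rates are made up.
base_rate_regression = 0.02     # assumed: 2% of runs contain a real regression
p_fail_given_regression = 0.95  # assumed: test catches 95% of real regressions
p_fail_given_no_regression = 0.05  # assumed: 5% spurious (flaky) failure rate

# Total probability that any given run shows a failure
p_fail = (base_rate_regression * p_fail_given_regression
          + (1 - base_rate_regression) * p_fail_given_no_regression)

# Bayes' rule: probability a given failure reflects a real regression
p_regression_given_fail = (
    base_rate_regression * p_fail_given_regression) / p_fail

print(round(p_regression_given_fail, 2))  # prints 0.28
```

Despite a "95% accurate" test, only about 28% of failures are real regressions here, because real regressions are rare. Dropping the actual base rate into the conversation changes how the team reads every red build.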
-
@grimalkina As someone who bridges UI/UX/Eng/Data Science, so much THIS
Software teams are so used to making things up from their priors and then acting as if they were handed down from the heavens that they don't believe you when you show them things from their own real-world data.