One of the main arguments of the strategic model on this platform is that humans really suck at predicting the future. We are not even good at predicting our own behavior, let alone events we do not control. Yet we cling to the belief that we can. Om Malik writes about some high-profile — and wrong — predictions in the tech world.
Can we do better? Of course — one reason prediction is so difficult is identifying what “clarifying universals” are at work. What is a clarifying universal? It is a force that consistently drives events. The example that comes to mind is Jeff Bezos, who based his Amazon business model on three clarifying universals: people always want lower prices, wider selection, and faster delivery.
Learning to look for clarifying universals — and to avoid clinging to silly ones — is a critical part of building strategic capacity. That is why it is a sub-menu on our “future” page. Deal with the future, dude. Or it will deal with you.