Blackjack, also known as 21, is a card game in which you, the player, try to build a hand value as close to 21 as possible (ideally exactly 21) from the cards you are dealt.
The whole game is you versus the casino. At the beginning of every deal you get two cards, both face up. The croupier, on the other hand, has one card face up and one card face down.
Based on what you see, you have to decide whether to stay (take no more cards), hit (take another card), split (if you have two cards of the same value), or double down (double your bet at the beginning of a deal).
Depending on the strategy you choose, you’ll take different actions in different scenarios.
These scenarios are a good metaphor for how we should approach testing, marketing, and making decisions in general.
Namely, you should always know when to:
- stay - leave the situation as it is. Assume that what's in front of you is the best available option.
- hit - take a step forward and reach for more resources. Decide that the best move is to get more insight into the situation.
- split - diversify your decision. E.g. you may have two very different roads to take, both equally promising.
- double-down - double the initial input. If the decision you're about to make falls into the vital 20% of your 80/20 distribution, go all in.
Getting to know what to do is only the first step.
Real wisdom lies in knowing when to do it.
As in blackjack, in life we use different strategies and approaches to gauge what's going on around us. Without a framework in place, we find ourselves wandering around with no purpose, no goal, and no tangible path to follow.
There's no single answer to the question of when to do what. In blackjack, some players will hit (take another card) on 14 while others will stay.
In real life, such decisions depend on, among other things:
- our moral code
- past experiences
- teachings we follow
- authorities we look up to
Hence, there are many famous moral dilemmas that show how differently people assess the same scenarios.
Even the latest debates about morality with regard to AI show how different we are when it comes to our morality, and therefore to making decisions.
Here, you can play a moral machine game yourself. It shows how an autonomous car would react in a given situation based on your choices. Do you sacrifice the drivers to save the passersby, or the opposite?
To end this post, here are some frameworks for decision making that might give you a hint of whether you should stay, hit, split, or double-down:
- “Hell Yeah Or No” by Derek Sivers
- The Golden Rule - “Do unto others as you would have them do unto you” (Matthew 7:12)
- Skin In The Game - "Thou shalt not have the upside without bearing the downside yourself. You need to own your own risk." – Nassim Taleb
- Ergodicity - "Don't look at things as a single event, look at things as a series of events." These are very different situations (outcome-wise):
- 6 people each playing Russian roulette once for $1 billion each (if they survive)
- 1 person playing Russian roulette 6 times in a row for $1 billion
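The difference between those two scenarios is the heart of ergodicity, and it's easy to check with a little arithmetic. Here is a minimal sketch, assuming a standard 6-chamber revolver with one bullet (so each pull has a 5/6 survival probability):

```python
# Ergodicity sketch: same odds per trigger pull, very different outcomes.
# Assumption for illustration: 6-chamber revolver, one bullet,
# so each pull has a 5/6 chance of survival.

P_SURVIVE = 5 / 6

# Ensemble scenario: 6 different people each pull the trigger once.
# On average, 5 of the 6 walk away with their $1 billion.
expected_survivors = 6 * P_SURVIVE  # 5.0

# Time scenario: 1 person pulls the trigger 6 times in a row.
# They must survive EVERY pull to collect anything.
p_survive_all_six = P_SURVIVE ** 6  # ~0.335

print(f"Expected survivors out of 6 separate players: {expected_survivors:.1f}")
print(f"Chance a single player survives all 6 rounds: {p_survive_all_six:.1%}")
```

The per-pull odds are identical in both scenarios, yet the group loses one member on average, while the lone repeat player has only about a one-in-three chance of surviving at all. What looks fine across a crowd can be ruinous for an individual over time.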
Find the framework that works for you, stick to it, and monitor the decisions you make. If the outcome is what you're looking for, maybe it's worth doubling down?