Before you read on, take a minute and think about this question for a while.
Look back at your recent transactions.
How did you arrive at those decisions?
Are you happy with your past decisions?
C’mon. If you’ve never yelled “How could I have been such an idiot?” at yourself in a fury, you are not an investor and certainly not a trader.
Now back to my headline question:
My take is, you would like to believe that you invest like a scientist.
Purely interested in objective facts, drawing logical conclusions from them. You might even have an engineering background that helped you design the perfect trading system. You think that the more rules your system has and the more intricate the analysis, the better the decision-making process, and the more likely it is that you’ll crack the code of the market. Obviously, profitable trading is simply about hacking the market with the right set of tools. You’ve back-tested your system, and your software confirms that you would have made 200% last year. You are in control of your emotions.
But somehow, reality doesn’t work that way.
In reality you might be suffering from the Lotto Bias (or Toto Bias here in Singapore). It is the increased confidence people feel when they manipulate data in some way, as if manipulating the data were somehow meaningful and gave them control over the market. It is the same illusion of control people get when they play the Lotto. People think that because they get to pick the numbers, their odds of success are somehow improved.
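To make the illusion concrete, here is a minimal sketch of the arithmetic, using a 6-of-49 draw (the format of Singapore's Toto) as an assumed example. Every ticket is one of the same pool of equally likely combinations, so a hand-picked ticket has exactly the same odds as a machine-picked one:

```python
# Lotto Bias sanity check: picking your own numbers does not change the odds.
# Assumes a 6-of-49 draw, where a ticket is any 6 distinct numbers from 1-49.
from math import comb

total_combinations = comb(49, 6)          # number of possible tickets
odds_quick_pick = 1 / total_combinations  # random ticket
odds_lucky_numbers = 1 / total_combinations  # your "special" numbers

print(total_combinations)                       # 13983816
print(odds_quick_pick == odds_lucky_numbers)    # True
```

Whichever six numbers you choose, you hold one ticket out of roughly 14 million; the feeling of control changes nothing about the probability.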
The simple truth is this: Once you have settled on a system (I am talking about your trading / investing system and not your Lotto system), your trading / investing success does not depend on the system, but on how you use it.
The reality is that most of us probably invest more like lawyers, crafting the best possible justification for a predetermined conclusion. We are emotional and let emotions overpower our logic. Often. Too often!
We human beings are the most illogical of all animals.
Despite having the power of reason and logic, we are the only creatures on this planet that will go out of our way to make sure we are in harm’s way.
Let’s face it: most people will not be able to deal with the issues that come up in trading system design until they’ve solved some of their personal psychological issues with fear, anger, and greed.
If you want to create success in the markets, you need to look at the things that cause success or failure, instead of just looking at the end result (= Outcome Bias).
And emotions are the main causes and driving forces in the market.
Understanding the mind as a tool that tries to live in an uncertain world is an important challenge. And that mind of ours tricks us all of the time.
We suffer from many psychological quirks or cognitive biases. And there is a whole set of psychological quirks we are saddled with as part of our evolutionary baggage.
While these quirks might have helped us on the savannah to figure out how the seasons change and where food might be year after year, they are not always the most useful in our interconnected, highly complex, and fast-moving world.
Systematic errors often spoil everything
Typically, a cognitive bias is a systematic error in how we think, as opposed to a random error or one that’s merely caused by our ignorance.
To make better decisions, it could be quite valuable to learn more about cognitive biases. Certainly more valuable than spending endless hours tweaking your trading system and manipulating past data, whether you call that technical analysis or fundamental analysis.
My like-minded friend James Hon (hat tip) recently pointed me towards an excellent website containing a ginormous graphic categorizing the most common and uncommon cognitive biases. It is way too big to cover in this post. I will write about it in my next post.
In the meantime you may want to explore my purely alphabetical listing of the biases I have explored so far.
Or check out this Bonus-Bias-Of-The-Week:
Conservatism Bias: This bias represents an important aspect of our factual inertia. Even when we are confronted with facts that should cause us to update our understanding of the way the world works, we often neglect to do so. We persist in only adding facts to our personal store of knowledge that jibe with what we already know, rather than assimilating new facts irrespective of how they fit into our worldview.
This is akin to Daniel Kahneman’s idea of theory-induced blindness: “an adherence to a belief about how the world works that prevents you from seeing how the world really works.” A related quirk of our information-processing system is inattentional blindness (often confused with change blindness): when we are looking for one thing, we completely ignore everything else around us.