Signal vs. Noise

Est. Reading Time: 5-8 Minutes

As of this writing we are in a unique time for the market. Energy prices remain stubbornly high, CPI numbers keep climbing (even with food and energy removed from the equation), unemployment sits near multi-decade lows while workers switch jobs like they are speed dating, geopolitical risks not seen for decades are coming back, and now corporations are reducing their expectations for future demand.

Each one of those could be its own rabbit hole to go down, but overall it would seem there is much more noise in the system than usual, which increases the difficulty of making sound decisions.

But more data is useful, right?

To answer the above question, I’ll give an eyeroll-inducing response that was common in my days as an engineer… “that depends.”

In applications such as, say, meteorology, more data is preferred over less. In fact, any system aiming for a high degree of accuracy would seem to benefit from more information. But there is a point of diminishing returns on additional data, and in your personal life that point arrives well before it would for an engineered system.

The Pareto Principle, or the 80/20 rule, states that roughly 80% of the consequences come from 20% of the causes. Most people know this. It is a popular mental model and has found its way into common vernacular.

But let’s say, diminishing marginal utility be damned, you want to go after that remaining 20% for the sake of accuracy. This is the era of big data after all, and we are now capable of efficiently evaluating large swaths of data to capture as much “accuracy” as possible.

What is less spoken about is the slippery slope that exists on the other side of Pareto’s 80% threshold, and why it will cause subpar decision making.

How Long is Britain’s Coastline?

Lewis Fry Richardson, a mathematician, physicist, and meteorologist, was looking to test a theory that the probability of war between neighboring states was proportional to the length of their common border. Or, put another way, the longer the fence between you and your neighbor’s yard, the more likely the two of you are to argue. To test this, he delved into the riveting task of measuring borders and noticed sizable differences in the lengths reported by various published sources.

For instance, the border between Spain and Portugal was sometimes 987 kilometers and sometimes 1,214 kilometers. This was around 1950, so it was hard to believe such discrepancies existed when the height of Mount Everest had been known to within a few feet since before the end of the nineteenth century.

What he found was that the discrepancy was due to varying degrees of resolution. That is, if you approximate the border with measuring lengths of, say, 50 kilometers (borders are not straight lines, after all), you get one measurement of the overall distance. But reduce the individual measuring length to 5 kilometers (increase the resolution) and a longer overall distance results.

This is because boundaries like these have no absolute, objective length: as the unit of measure shrinks, the resulting length keeps growing. While the cause of this specific example has spawned its own entire area of study, what matters here is that it provides a useful example of more data being no more helpful past a certain point, even if accuracy is technically increasing. And if one insists on perfect accuracy, the endeavor will technically never cease.

Using this as an example, you can see how perfect accuracy can never be attained. But thankfully, in the real world, good enough will often suffice.
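To make the resolution effect concrete, here is a toy sketch of my own (an illustration, not Richardson’s data) using the Koch curve, a fractal whose measured length grows every time the measuring unit shrinks:

```python
# Toy illustration of the coastline paradox using the Koch curve:
# each time the ruler shrinks to a third of its length, the measured
# length of the curve grows by a factor of 4/3. Finer resolution,
# longer "coastline" -- without end.

def koch_measurements(refinements: int):
    """Yield (ruler_length, measured_length) for successive refinements."""
    ruler, length = 1.0, 1.0
    for _ in range(refinements + 1):
        yield ruler, length
        ruler /= 3          # measure with a ruler one third as long...
        length *= 4 / 3     # ...and the total measured length grows

for ruler, length in koch_measurements(6):
    print(f"ruler = {ruler:.4f}   measured length = {length:.3f}")
```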

It’s better to be approximately correct than precisely wrong.

This is probably not a revelation to anyone reading this. But there is an often unappreciated “tax” you are paying even if you decide to dive into the data deep end knowing full well that good enough will do.

The Volatility Tax

To pull this concept into the realm of real-world decision making, we’ll use the example of an investment portfolio that returns 15% a year with 10% volatility (standard deviation). This example is pulled directly from Nassim Taleb’s book Fooled by Randomness, which is worth a read.

Now, with the above portfolio, we can expect the chance of success (positive returns) in any one year to be 93%. Said another way, if you looked at the portfolio at the end of every year for one hundred years, you’d see a positive return in ninety-three of them. Assuming those seven down years didn’t come one after another, we’d spend those one hundred years pretty levelheaded about our investments. However, if we narrow the time scale to smaller and smaller units, things change dramatically.

Below is a table of the chances our hypothetical portfolio will be in the green when we check up on it at various intervals.
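The figures in that table follow directly from the stated 15% return and 10% volatility, so here is a minimal sketch that reproduces them, assuming normally distributed returns whose mean scales with time and whose volatility scales with the square root of time (the trading-day and trading-hour counts are assumptions on my part):

```python
# Chance the 15%-return / 10%-volatility portfolio shows a gain over
# shorter and shorter observation windows, assuming normally distributed
# returns (mean scales with t, standard deviation with sqrt(t)).
from scipy.stats import norm

MU, SIGMA = 0.15, 0.10  # annual return and volatility from the example

windows = {
    "1 year":    1.0,
    "1 quarter": 1 / 4,
    "1 month":   1 / 12,
    "1 day":     1 / 252,          # ~252 trading days a year (assumption)
    "1 hour":    1 / (252 * 6.5),  # ~6.5 trading hours a day (assumption)
}

for label, t in windows.items():
    p_up = norm.cdf(MU * t / (SIGMA * t ** 0.5))  # P(return over window > 0)
    print(f"{label:>9}: {p_up:.1%} chance of being up")
```

Run as-is, this prints roughly 93%, 77%, 67%, 54%, and 51% — the familiar slide toward a coin flip as the window shrinks.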

You can see that the more often we pull up our portfolio to give it a look, the closer we move to a 50/50 chance of being up or down, despite marching toward an average annual return of 15%.

Although I may draw the ire of the technical analysis crowd, it is hard to believe there is much useful information in a short time frame when there is a roughly 50/50 chance of being up or down on an hourly, or even daily, basis.

In fact, the only thing that can be said definitively, given how we are wired as humans, is that the back-and-forth oscillations of the portfolio will likely cause you to make a decision based purely on noise and not on a signal that will drive the ultimate increase or decrease of the investment’s value.

If we flip our view of this hypothetical portfolio around, we can say that for every one unit of signal (the facts that matter, like earnings) there are 0.67 units of noise that can be attributed to randomness (10% of volatility for every 15% of return).

So, restating the above table with this additional view gives:
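As a rough sketch, those noise-to-signal ratios fall straight out of the same figures: over a window of length t the expected move (the signal) is 15% × t, while the typical random wiggle (the noise) is 10% × √t. The 10.75 daily figure quoted below corresponds to roughly 260 trading days per year (my assumption; using 252 gives about 10.6):

```python
# Noise-to-signal ratio at different checking frequencies:
# signal ~ MU * t, noise ~ SIGMA * sqrt(t),
# so noise/signal = SIGMA / (MU * sqrt(t)).
MU, SIGMA = 0.15, 0.10  # annual return and volatility from the example

windows = {
    "1 year":    1.0,
    "1 quarter": 1 / 4,
    "1 month":   1 / 12,
    "1 day":     1 / 260,  # ~260 trading days a year (assumption)
}

for label, t in windows.items():
    ratio = SIGMA / (MU * t ** 0.5)
    print(f"{label:>9}: {ratio:.2f} units of noise per unit of signal")
```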

Now, if we are going to check our portfolio daily, there exist 10.75 units of noise for every single unit of signal. With the market open 252 days out of the year, that’s a lot of needles to find in a lot of haystacks.

This is an interesting characteristic of systems: the higher the complexity, the more fragile they are.

Better to focus on bigger-ticket items than on the daily oscillations of the market’s mood.

How to Find the Signal – An Investing White Noise Machine

It could be stated that every activity within the investment world is ultimately trying to find the signal. Because of this, there isn’t a clean, unified prescription. If there were, I am pretty sure Bridgewater would have found it already.

Since everyone’s goals are different, the definition of what the signal is changes.

If you want the lowest-touch, best-sleep-at-night approach, it is hard to argue against a diversified ETF portfolio. Just make sure you validate what is actually in those ETFs. And if you don’t care to find the signal OR the noise, you can’t make many irrational knee-jerk decisions when you’re not even looking.

For those trying to get beyond passive, macro-level returns, special situations not only provide a great pond to “fish” in but, by default, narrow your focus to identifying the variables that truly matter.

As an example, right now Microsoft (MSFT) is looking to acquire Activision Blizzard (ATVI) for $95/share, and the deal is expected to close in the first half of 2023.

I am choosing this because it’s one of the better-known acquisitions currently underway and there is no shortage of coverage. But I am not advocating for it one way or another.

Both boards have approved the transaction, and it will be paid for in cash, of which MSFT has no shortage. Shareholders still need to vote, but they rarely vote against an acquisition when the purchase price is well above the pre-announcement share price and the boards have approved and recommended the transaction.

At the time of this writing, the difference between the current price of ATVI stock and MSFT’s purchase price represents an annualized return of approximately 21%.

Why is that?

The main linchpin of the deal is MSFT receiving the necessary green light from regulators to allow the transaction. That risk is why the return potential exists.

Whether this approval is likely is not the point. The point is that this is an example where the structure of the acquisition allows you to narrow in on a few key items of merit against the backdrop of broader uncertainty.
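For what it’s worth, the arithmetic behind an annualized figure like that ~21% is simple. Here is a rough sketch where the ATVI price and expected close date are hypothetical placeholders rather than figures from this article:

```python
# Rough merger-arbitrage math: annualize the spread between the current
# share price and the $95 cash offer over the time until the expected close.
from datetime import date

OFFER = 95.00                        # MSFT's all-cash offer per ATVI share
price = 79.00                        # hypothetical current ATVI price
as_of = date(2022, 7, 15)            # hypothetical "as of" date
expected_close = date(2023, 6, 30)   # assumed close near the end of H1 2023

days_to_close = (expected_close - as_of).days
gross_return = OFFER / price - 1                              # spread if the deal closes
annualized = (1 + gross_return) ** (365 / days_to_close) - 1  # compounded to a yearly rate

print(f"Spread: {gross_return:.1%}   Annualized: {annualized:.1%}")
```

With those placeholder inputs, a spread of about 20% annualizes to roughly 21%; plug in the live price and your own close-date estimate to get the current figure.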

Ideally we can ignore the noise that doesn’t matter regardless of the investment approach or macro environment we are in, but we are human.

More of a Suggestion Than a Prescription

There is ample evidence that the risk-adjusted returns of situational or corporate-event investing increase in times of heightened volatility. But from a subjective standpoint, adjusting your investment approach toward areas that allow you to shut out the noise (however that approach might look) may keep you from paying unnecessary “taxes” due to data overload.
