
Human behaviour is inherently difficult to predict

Photo by Stockcake

Systems built on rigid, overly simplified algorithms will almost certainly fall short

A well-known example comes from the U.S. housing crisis: some companies used mathematical models to estimate home values and assess lending risk, assuming people would behave in stable, rational ways. But human behaviour didn’t follow the models. Banks approved loans for people who couldn’t realistically afford them, and when the market shifted, many homeowners simply walked away. The algorithms failed because they treated a complex human system as if it were predictable and linear, and the real world isn’t.

Companies like Zillow used automated valuation models (AVMs) to price homes, then bought and resold them through their “iBuyer” programs, and faced large losses when the market changed.

The AVM was trained on historical data reflecting “average” conditions, but the housing market went through rapid, unpredictable shifts, notably during the pandemic, so many of the model’s assumptions no longer held.
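To make that failure mode concrete, here is a minimal sketch using entirely synthetic data (not Zillow’s actual model or figures): a simple linear valuation model is fit on stable historical conditions and then applied after a regime shift.

```python
import numpy as np

rng = np.random.default_rng(42)

# Historical "average" conditions: price grows linearly with size,
# plus a mild yearly trend. (Synthetic data, for illustration only.)
sqft = rng.uniform(800, 3500, 500)
year = rng.integers(2012, 2019, 500)
price = 150 * sqft + 5_000 * (year - 2012) + rng.normal(0, 20_000, 500)

# Fit a toy linear "AVM": price ~ sqft + years-since-2012 + intercept.
X = np.column_stack([sqft, year - 2012, np.ones_like(sqft)])
coef, *_ = np.linalg.lstsq(X, price, rcond=None)

# Regime shift: a pandemic-style demand spike breaks the learned trend.
sqft_new = rng.uniform(800, 3500, 500)
price_new = 210 * sqft_new + 60_000 + rng.normal(0, 20_000, 500)

X_new = np.column_stack([sqft_new, np.full(500, 9.0), np.ones(500)])
print(f"mean error after the shift: ${np.mean(np.abs(X_new @ coef - price_new)):,.0f}")
```

In-sample, the model looks accurate; the error only appears once the data-generating process changes, which is exactly when the model is relied on most.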

Zillow underestimated the risk that having to hold and resell so many homes would expose it to market downturns, cost overruns (renovations, labour, materials), and other operational risks.

The mismatch between model outputs and real-world outcomes highlights how modelling human-led systems (people buying/selling homes, values shifting, sellers choosing offers) is much more complex than many algorithms assume.

Research shows that real-estate valuation models, whether regression-based, machine-learning, or logarithmic, struggle with real-world nonlinearities and complexity. One study, for example, models real-estate economics with a logarithmic growth curve.
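For context, a logarithmic growth model in this setting typically takes the generic form index(t) = a + b·ln(t). The sketch below fits one to hypothetical data; it is not the cited study’s exact specification.

```python
import numpy as np

# Hypothetical price-index series: fast early growth that gradually flattens.
t = np.arange(1, 21)  # years since a base year
index = 100 + 35 * np.log(t) + np.random.default_rng(0).normal(0, 2, t.size)

# Logarithmic growth model: index(t) = a + b * ln(t).
# np.polyfit returns the slope first, then the intercept.
b, a = np.polyfit(np.log(t), index, 1)
print(f"fitted: index(t) = {a:.1f} + {b:.1f} * ln(t)")
```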

People are emotional, situational, inconsistent, influenced by context, incentives, fear, hope, social pressure, misinformation, and countless other factors that no simple model can fully capture.

Algorithms that assume “if X, then Y” will break the moment humans behave like humans.

As an example, during the lead-up to the 2007–2008 housing crisis, many financial institutions used:

- mathematical models to estimate home values
- automated scoring systems to assess borrowers and approve loans
- risk models that priced lending on the assumption of stable, rational behaviour

These systems assumed:

- home prices would keep rising
- borrowers would behave in stable, rational ways
- people would keep paying their mortgages as long as they could

But when prices stopped rising, borrowers stopped behaving in ways the models anticipated. Many simply abandoned homes (“jingle mail”), leaving lenders exposed.
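A toy sketch of that gap, with all numbers hypothetical: a default model that only accounts for payment distress, compared against borrowers who also walk away strategically once they are deeply underwater.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Payment distress: the only default channel the toy lender model includes.
income_shock = rng.random(n) < 0.05

# Equity as a fraction of home value after prices fall (hypothetical spread).
equity = rng.normal(0.10, 0.25, n)

modelled_default_rate = income_shock.mean()

# "Jingle mail": deeply underwater borrowers walk away even when they
# could still afford the payments.
strategic = (equity < -0.20) & (rng.random(n) < 0.5)
actual_default_rate = (income_shock | strategic).mean()

print(f"modelled default rate: {modelled_default_rate:.1%}")
print(f"actual default rate:   {actual_default_rate:.1%}")
```

The model’s blind spot isn’t a bad parameter; it’s a whole channel of human behaviour it never represented.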

This is a textbook example of algorithmic determinism failing in a complex, human-led system.

The deeper lesson

Algorithms aren’t the problem; overconfidence in simplified models is. Human behaviour is nonlinear, contextual, and shaped by feedback loops. This is exactly what systems thinkers warn about:

“If you treat a complex system like it’s simple, it will find a way to surprise you.”
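Here is a minimal sketch of one such feedback loop, with illustrative parameters not calibrated to any real market: falling prices trigger defaults, foreclosures add supply, and the extra supply drags prices down further.

```python
# Toy feedback loop: price declines -> defaults -> forced sales -> further declines.
# Parameters are illustrative only.
price, inventory = 100.0, 1.0
for month in range(1, 13):
    default_rate = max(0.0, (100.0 - price) * 0.002)  # defaults rise as prices fall
    inventory += default_rate * 50                    # foreclosures add supply
    price -= inventory * 0.8                          # excess supply drags prices down
    print(f"month {month:2d}: price={price:6.2f}, inventory={inventory:5.2f}")
```

Each individual rule is linear on its own; the loop between them is what produces the accelerating, nonlinear decline.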

This housing example is a useful way to explain the unpredictable human behaviour, and the unpredictable events, that need to be taken into account when designing a product.

It’s important to consider:

- whether a model’s assumptions still hold when conditions shift
- the feedback loops between what a system predicts and how people actually respond
- the emotional, contextual, and social factors that no simple model fully captures

