
Human behaviour is inherently difficult to predict

This article explores a central tension in product design, systems thinking, and algorithmic decision-making: human behaviour does not follow neat, rigid, predictable rules. Systems built on simplified assumptions may appear logical on paper, but they can fail badly when exposed to the complexity of real life.

The piece looks at how deterministic thinking often breaks down in human-led systems, especially when technology is used to model messy, emotional, contextual behaviour as if it were stable and linear.

[Image: man walking through a blue and yellow vibrant energy burst. Photo by Stockcake.]

The challenge

Systems built on rigid, overly simplified algorithms will almost certainly fall short when they try to predict human decisions. A clear example comes from the U.S. housing crisis, where some companies used mathematical models to estimate home values and lending risk based on assumptions that people would behave in stable, rational ways.

But human behaviour did not follow those models. Banks approved loans for people who could not realistically afford them, and when the market shifted, many homeowners walked away. The algorithms failed because they treated a complex human system as if it were predictable and linear, when the real world was anything but.

Companies like Zillow used algorithms and automated valuation models (AVMs) to value homes and run iBuyer programmes, but they faced significant losses when the market changed. The models had been trained on historical data and average conditions, yet the housing market experienced rapid shifts, especially during the pandemic.

This exposed a deeper issue: models that work under familiar conditions can break quickly when context changes, incentives shift, or humans behave in ways the system did not anticipate.

Research shows that real-estate valuation models, including regression, machine learning, and logarithmic models, struggle with real-world nonlinearities and complexity.
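To make the failure mode concrete, here is a hypothetical toy sketch (not any real AVM, and the numbers are invented): a model that extrapolates from a stable, rising market stays accurate while conditions remain familiar, then misses badly the moment the regime shifts.

```python
# Hypothetical illustration, not a real valuation model.
# "Training" regime: a stable market where prices rise ~1% per period.
history = [100 * (1.01 ** t) for t in range(48)]

# Naive trend extrapolation, standing in for any model fit
# on historical data and average conditions.
slope = history[-1] - history[-2]

def predict(last_price):
    return last_price + slope

# Familiar conditions: prices keep rising -> tiny error.
error_stable = abs(predict(history[-1]) - history[-1] * 1.01)

# Regime shift: prices suddenly drop 10% -> the same model misses badly.
error_shift = abs(predict(history[-1]) - history[-1] * 0.90)

print(error_stable, error_shift)
```

The model is not "wrong" in its own regime; it simply has no concept of the regime changing, which is exactly what the research above points to.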

My role

My role in this piece is to frame this problem through a design and systems-thinking perspective. Rather than treating algorithms as neutral or automatically intelligent, I am interested in examining the assumptions built into them and what happens when those assumptions meet actual human behaviour.

The goal is to connect this thinking back to product design. When we design systems, services, or interfaces, it is not enough to rely on idealised user models. People are emotional, situational, inconsistent, and influenced by context, incentives, fear, hope, social pressure, misinformation, and many hidden variables.

In practice, that means a system based on “if X, then Y” may fail in a world where humans behave more like “if X, then maybe Y, maybe Z, maybe nothing, depending on twenty hidden factors.”
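As a toy illustration of that gap (function names and probabilities invented for this sketch), a deterministic rule always returns one outcome, while a behaviour function that depends on hidden context can return several:

```python
import random

def deterministic_model(price_drop):
    # "If prices drop, the homeowner keeps paying" - one fixed answer.
    return "keeps paying"

def human_behaviour(price_drop, rng):
    # Outcome depends on a hidden factor the model never sees:
    # finances, fear, social pressure, what the neighbours did...
    hidden_pressure = rng.random()
    if hidden_pressure > 0.9:
        return "refinances"
    if price_drop > 0.2 and hidden_pressure > 0.5:
        return "walks away"
    return "keeps paying"

rng = random.Random(42)
outcomes = {human_behaviour(0.3, rng) for _ in range(1000)}
print(sorted(outcomes))  # several outcomes; the rigid model allows only one
```

The point is not the probabilities, which are made up, but the shape of the problem: the same input produces different outputs depending on variables the system never observes.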

Process and approach

The approach in this article is grounded in a real-world case study and then expanded into a broader design lesson. During the lead-up to the 2007–2008 housing crisis, many financial institutions relied on:

- automated valuation models (AVMs) and statistical models of home prices
- lending-risk models trained on historical data and average market conditions

These systems assumed:

- that house prices would keep rising
- that borrowers would behave in stable, rational, predictable ways

But when prices stopped rising, borrowers stopped behaving in the ways the models anticipated. Many simply abandoned homes, leaving lenders exposed. This became a textbook example of algorithmic determinism failing in a complex human-led system.
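The walk-away decision itself can be sketched as a crude equity calculation (deliberately simplified: it ignores credit damage, moving costs, and moral weight, none of which the rigid models priced in either):

```python
def underwater(home_value, mortgage_balance):
    """Negative equity: the home is worth less than the debt on it."""
    return home_value < mortgage_balance

def walk_away_incentive(home_value, mortgage_balance):
    # Simplified: the deeper underwater, the stronger the pull to default.
    return max(0.0, mortgage_balance - home_value)

# While prices rose, equity stayed positive and the models looked right.
print(underwater(home_value=320_000, mortgage_balance=300_000))   # False

# After a 25% price drop, the same borrower is deep underwater.
print(walk_away_incentive(home_value=240_000, mortgage_balance=300_000))  # 60000
```

A model that assumes prices only rise never evaluates the second branch, so the walk-away behaviour looks like an impossibility right up until it happens at scale.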

The deeper lesson

Algorithms themselves are not the problem. Overconfidence in simplified models is. Human behaviour is nonlinear, contextual, and shaped by feedback loops. This is exactly what systems thinkers warn about.

“If you treat a complex system like it’s simple, it will find a way to surprise you.”
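One way to see that "surprise" is a toy positive feedback loop (all numbers invented for illustration): falling prices cause defaults, foreclosures add supply, and that extra supply makes prices fall faster each period instead of linearly.

```python
# Toy feedback loop: price declines accelerate instead of staying linear.
price = 100.0
drops = []
for period in range(6):
    defaults = max(0.0, (100.0 - price) * 0.5)  # more underwater -> more defaults
    drop = 1.0 + defaults * 0.4                 # base drift + foreclosure pressure
    price -= drop
    drops.append(round(drop, 2))
print(drops)  # each period's decline is larger than the last
```

A linear model fit to the first period or two would systematically underestimate every period after that, because the loop feeds on its own output.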

This example is especially useful in product design because it reminds us to consider:

- the assumptions about human behaviour baked into our systems
- how behaviour changes when context, incentives, or conditions shift
- what happens when people act emotionally or inconsistently rather than like idealised users

My contribution

My contribution is to bring systems thinking into the conversation about design, technology, and human behaviour. This article is not only about finance or housing. It is about the wider danger of treating human systems as if they can be fully controlled through rigid logic.

For designers, this matters because products are not used in a vacuum. They exist inside emotional, economic, political, and social systems. Good design needs to account for complexity, uncertainty, and context, rather than pretending they can be eliminated.
