Why Your Direct Approach Is Failing To Produce Change
The Case for Oblique Design
Welcome to The Design Loft.
What if the best way to address a problem was to approach it indirectly, rather than take it head-on? We’re often taught the importance of direct, focused action on a problem, but as research in complex systems shows, that is often not the most effective path. This obliquity has profound effects on how we design for change.
In this issue, we’re going to look at the concept of obliquity and what it means for designing for change in complex systems.
Obliquity and John Kay
The concept of obliquity in design is inspired by economist John Kay’s work, which synthesized ideas from multiple lines of theory and science. The concept originated with Nobel laureate Sir James Black’s observation that pharmaceutical researchers often succeeded financially when they focused on science rather than profit. Other examples include John Stuart Mill’s philosophical insight that happiness comes as a byproduct of meaningful pursuits (rather than from the pursuit of happiness itself) and Richard Dawkins’ work in evolutionary biology, which shows how successful adaptation emerges without conscious design. Kay’s core argument is that, in uncertain, complex environments where our actions trigger cascading responses, indirect approaches consistently outperform rigid, direct planning.
Kay writes: “Obliquity is the idea that goals are often best achieved when pursued indirectly. Obliquity is characteristic of systems that are complex, imperfectly understood, and change their nature as we engage with them.”
This is at the core of much of our design for living systems — organizations, communities, families, schools, and workplaces.
What this means for design is that we need to understand the kinds of systems and situations we’re dealing with, and the dynamics through which what we create both shapes and is shaped by that context in practice.
Designing with Obliquity
Urban design is a good example of where obliquity comes in. Kay argues that great cities lift our spirits not because some designer set out to achieve that effect, but because of their lack of planning, their diversity, and their vitality. It is the unexpected, the unconventional, and the distinctive within each city neighbourhood, rather than standardized consistency, that makes cities attractive places. He contrasts this with planned cities like Brasilia, Canberra, and Chandigarh, which he describes as dull and lacking that vitality.
Research by Deborah Rowland of the London School of Economics takes this concept of obliquity into organizational transformation, arguing that complex change efforts fail because leaders rely on overly direct, top-down programmatic approaches. For designing complex change, Rowland advocates three counterintuitive principles: first, change starts with leaders’ inner state rather than external programs; second, creating deliberate disturbance can paradoxically build safety (like the practice of allowing small fires to prevent larger ones); and third, structure should follow chaos through emergent processes rather than rigid pre-planning.
The relationship to designing for complexity is profound: rather than creating detailed change blueprints with fixed milestones, designers of complex change should relax their constraints, clarify their intentions, establish a few clear guidelines (sometimes referred to as “simple rules”), encourage experimentation, and intentionally build in open space for emergent patterns to develop, recognizing that you cannot leap directly from one structure to another without first moving through creative breakdown and uncertainty.
Using Experience, Data and Feedback
Reflective practice and data are two ‘secrets’ to designing for complexity. Reflective practice is the structured act of mindful attention to what we do, how we do it, and the experience and outcomes of it. It’s about creating systematic, if flexible, means of documenting the work we do, and reflecting on it as we do it. When done consistently, it provides a record of practice-related actions and activities that inform how we design for complexity.
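For teams that keep reflective-practice notes digitally, the structure described above (what we do, how we do it, and the experience and outcomes of it) could be sketched as a simple running log. This is a minimal illustration, not a prescribed format; the field names and example entries are assumptions for the sketch:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ReflectionEntry:
    """One structured reflective-practice note: what, how, and what was noticed."""
    when: date
    what_we_did: str       # the action or activity
    how_we_did_it: str     # the approach taken
    what_we_noticed: str   # the experience and outcomes observed

def practice_record(entries):
    """Sort entries by date, giving a running record of practice over time."""
    return sorted(entries, key=lambda e: e.when)

# Hypothetical entries reviewed during a reflection session
log = [
    ReflectionEntry(date(2024, 3, 12), "Ran a community feedback wall",
                    "Open prompts on sticky notes", "Skeptics engaged more than expected"),
    ReflectionEntry(date(2024, 2, 5), "Tested a sign-up prototype",
                    "Paper mock-up with five users", "Wording confused two participants"),
]
record = practice_record(log)
```

Whatever the tool, the point is the discipline: a consistent, lightweight record that can be revisited as the work unfolds.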
Other sources of data can complement reflective practice. By gathering feedback throughout our active design work, we build the sensing capacity that complex work requires. Design-driven evaluation approaches, which gather data throughout the lifecycle of a design process, are another way to build the evidence that allows you to tackle complex issues and use obliquity to inform your strategy.
(We’ll cover these design recommendations in future articles.)
Practice Considerations
Here are some practical, design-driven ways to build in the kind of sensing and thinking that can point you toward indirect solutions to the complex issues in your design work.
Embed Data Collection
Before You Start:
- Identify your different user types (not just beneficiaries—include skeptics, staff, intermediaries)
- Define 2-3 principles to serve as your ‘simple rules’ to guide and evaluate your initiative (e.g., low-cost, flexible, engaging, inclusive)
- Decide what you need to learn, not just what you need to prove
During Design & Development:
1. Make Feedback Continuous, Not Final
- Build feedback moments into every touchpoint (like interactive boards at events or feedback walls)
- Test small prototypes (workable models) with real users early and often
- Ask “What are we learning?” not just “Does this work?”
2. Co-Design Your Data Collection
- Involve users in creating how you’ll gather data (they know what questions matter)
- Prototype your evaluation methods the same way you prototype your service
- Keep it light and integrated—data collection should feel natural, not burdensome
3. Design for Multiple Perspectives
- Capture data from different stakeholder groups simultaneously
- Look for unintended effects and surprises, not just success metrics
- Create space for dissenting or unexpected voices
4. Build Learning Loops
- Schedule regular “sense-making” sessions to interpret what you’re seeing
- Use data to surface leverage points (places where we can make the greatest positive difference) and guide adaptation
- Treat insights as design inputs, not just reports
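For teams capturing feedback digitally, the loop above (continuous feedback, multiple perspectives, sense-making) might be sketched roughly as follows. The stakeholder groups, tagging scheme, and function names are hypothetical illustrations, not a required implementation:

```python
from collections import defaultdict

def collect_feedback(store, stakeholder, note, surprise=False):
    """Log one piece of feedback at any touchpoint, flagging unexpected observations."""
    store[stakeholder].append({"note": note, "surprise": surprise})

def sense_making(store):
    """Summarize feedback per stakeholder group and surface surprises,
    treating insights as design inputs rather than a final report."""
    summary = {group: len(notes) for group, notes in store.items()}
    surprises = [n["note"] for notes in store.values()
                 for n in notes if n["surprise"]]
    return summary, surprises

# Feedback gathered from multiple stakeholder groups simultaneously
feedback = defaultdict(list)
collect_feedback(feedback, "beneficiaries", "Sign-up felt welcoming")
collect_feedback(feedback, "staff", "Forms duplicated intake questions", surprise=True)
collect_feedback(feedback, "skeptics", "Unclear who the program is for", surprise=True)

# A regular sense-making session would review both counts and surprises
summary, surprises = sense_making(feedback)
```

The design choice worth noting is that surprises are first-class data here, not noise: the unexpected observations are exactly what an oblique strategy uses to adapt.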
Key Mindset Shifts:
- From: Evaluation at the end → To: Learning embedded throughout
- From: Measuring compliance → To: Understanding the system
- From: Rigid plans → To: Adaptive strategy
- From: Data as proof → To: Data as design material
Design your data collection the way you design your service—with users, through iteration, and for continuous learning.
This is among the reasons for incorporating learning, design-driven evaluation, and reflective practice into our design work, and for designing with obliquity in mind.
Thanks for reading.
As part of the Design Loft program, we’ll be doing deeper dives for paid members into the specific tactics and tools of each of these methods. To be a part of that, consider upgrading your membership. For less than two fancy coffees or a sandwich, you’ll cover a month’s tuition at your own design school.