Innovative design and digital product development are often high risk, requiring significant resources to bet on an uncertain outcome. Gaining insight early into possible future states can go a long way towards mitigating risks and delivering value. 

Traditional workflows 

Designers have many tools in their arsenal to empower the decision-making process. One of the most important of these is user testing – a robust set of techniques for gathering feedback and insight – but it has drawbacks. It can be expensive and time consuming, requiring asset creation for testing as well as the testers’ and users’ time. Because effective user testing demands this level of commitment, designers must be highly invested in a concept before approaching users, which frequently limits the number of ideas they can put in front of end users. Furthermore, even the most rigorous testing strategies are unlikely to help a team understand the cumulative impact of design decisions at the scales expected when a solution launches. Quantitative insight is simply not possible with standard qualitative methodologies.

Data-driven design

Data-driven design can help. Through data, there is an opportunity to augment design practice by introducing an intermediary step between problem definition and user testing. This process, designed to give insights at scale, empowers teams to experiment with concepts that have yet to reach standard user testing. Doing so allows teams and clients to explore a wide range of possible solutions.

Method’s Data Design team employs computational modeling to create this intermediary step. These data models provide three significant advantages:

  1. Facilitate the assessment of ideas by uncovering their potential value in a low-cost, low-risk environment, allowing us to test tentative and speculative solutions.
  2. Test rapidly and at scale, enabling many possible solutions to be interrogated for their potential.
  3. Provide insights at a system-wide level, allowing us to understand impact at scale across a broader ecosystem than just the isolated product or service.

So what is computational modeling, and how do we use the models?

Computational modeling: Digital twins

We use data to replicate real-world systems through mathematical models. The models that deliver this form of quantitative analysis are called digital twins. As the name suggests, a digital twin is a computational model that replicates a real system.

By employing the practices of systems thinking, we create the digital twin as a mirror of the system as it exists. In essence, we take the complex mechanics of a real system and condense them to their simplest suitable mathematical description. This enables us to describe, in a cost-efficient way, real-world processes that may seem beyond quantification, providing designers with a playground for experimentation and testing that they did not previously have.

To create a digital twin, we need real-world data that allows us to uncover, understand, and describe the underlying mechanics of the system being replicated. When data is sparse, statistical tools – such as Bayesian modeling – are used to take available aggregate-level data and build a complete view of the underlying behavioral relationships, which then informs and drives the digital twin. With experience, a digital twin can be created in a matter of days, allowing for truly rapid prototyping.
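As a rough illustration of the kind of inference involved – not the models built for any particular client – the sketch below uses a simple conjugate Beta-Binomial model in Python to turn sparse, aggregate counts into a distribution over an underlying behavioral rate. The screen, visit counts, and prior are all hypothetical.

    # A minimal sketch, assuming a single behavioural quantity (a "continue
    # rate" for one screen) and a conjugate Beta-Binomial model; real models
    # are richer, but the idea of turning aggregate counts into a full
    # distribution over behaviour is the same.
    from scipy import stats

    # Hypothetical aggregate-level data: of 1,200 recorded visits to a screen,
    # 340 continued deeper into the app.
    visits, continued = 1200, 340

    # Weakly informative Beta(1, 1) prior, updated with the observed counts.
    prior_a, prior_b = 1.0, 1.0
    posterior = stats.beta(prior_a + continued, prior_b + visits - continued)

    # The full posterior, not a single point estimate, drives the twin:
    # each simulated journey can draw its own plausible continue rate.
    print(f"mean continue rate: {posterior.mean():.3f}")
    print(f"95% credible interval: {posterior.interval(0.95)}")
    sampled_rates = posterior.rvs(size=100_000)  # fed into the simulations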

Facilitating the assessment of ideas: Simulations

Simulations are the tool we use to “switch on” a digital twin. For example, we can run millions of virtual uses of an app, of customers interacting with a business, or of cycles within any process. Simulations offer the means to evaluate the impact of design decisions with nothing more than a few hours of computational resources.

For example, a recent Method client wanted the user experience of their telecom app to be more delightful and fun for end users. How might they ensure interactions were functional, cohesive, and engaging? 

A digital twin was created that replicated the flow of users through the app. Hundreds of thousands of simulated journeys were run, following virtual users through the app from entry all the way to exit. Screens naturally lent themselves to a certain feeling based on their surrounding structure. For example, a deep and controlled navigation structure created a feeling of discovery as users navigated down through a sequence of screens. In contrast, a shallow and open structure lent itself to a sense of accessibility & utility, with points of interest just a few clicks away. The results of the simulations allowed us to quantify the feeling that a real user might have of a screen when they visit it. 
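The sketch below gives a flavor of how such journey simulations can work: it treats the app as a small graph of screens and follows virtual users through it as a random walk. The screen names and transition probabilities are invented for illustration and are far simpler than the client’s actual app or the twin we built.

    # A minimal sketch, treating the app as a graph of screens and each
    # journey as a random walk over it. Screen names and probabilities are
    # hypothetical, not the client's actual structure.
    import random
    from collections import Counter

    # Each screen maps to (next screen, probability) pairs; "exit" ends a journey.
    transitions = {
        "home":    [("browse", 0.5), ("account", 0.3), ("exit", 0.2)],
        "browse":  [("detail", 0.6), ("home", 0.2), ("exit", 0.2)],
        "detail":  [("browse", 0.5), ("exit", 0.5)],
        "account": [("home", 0.6), ("exit", 0.4)],
    }

    def simulate_journey(graph, start="home", max_steps=50):
        """Follow one virtual user from entry until they exit, recording screens."""
        screen, path = start, [start]
        for _ in range(max_steps):
            options, weights = zip(*graph[screen])
            screen = random.choices(options, weights=weights)[0]
            if screen == "exit":
                break
            path.append(screen)
        return path

    # Run a large batch of journeys; visit counts and average depth per screen
    # are simple proxies for the structural "feel" of each screen.
    visit_counts, depth_totals = Counter(), Counter()
    for _ in range(100_000):
        for depth, screen in enumerate(simulate_journey(transitions)):
            visit_counts[screen] += 1
            depth_totals[screen] += depth

    for screen in transitions:
        avg_depth = depth_totals[screen] / visit_counts[screen]
        print(f"{screen:8s} visits={visit_counts[screen]:7d} avg depth={avg_depth:.2f}")

Depth and visit frequency are only stand-ins here; the quantification of “feeling” in the actual project was richer than this sketch.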

Scaling insights: Going beyond current systems

Digital twins do not need to be constrained by the current, as-is state of reality. We can use simulations to conduct change experiments, exploring how an overall system might react if we alter certain elements.

Our teams run experiments that allow them to speculate about possible futures. Using systems thinking approaches, we identify leverage points within the current as-is system and explore the impacts of targeted change. By adjusting the critical parts of the system, we see what happens to outcomes as a whole.

The inputs to these experiments are the business decisions and ideas proposed by our clients and teams. Data designers translate those inputs into the computational modeling world, creating a mathematical description of the change to the system; doing so requires a deep understanding of the user, the system, and the data. Simulations are then run in this new, speculative world, and the emerging patterns & outcomes are computed.
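Continuing the hypothetical journey sketch above (it reuses the `transitions` graph and `simulate_journey`), the fragment below shows one way a proposed change can be expressed mathematically: as an edit to the screen graph, after which the same simulation is re-run and compared against the baseline. It is illustrative only.

    # A minimal sketch of a change experiment: a proposed idea is encoded as
    # an edit to the (hypothetical) screen graph, then simulated and compared.
    import copy
    from collections import Counter

    def with_changed_links(graph, screen, new_options):
        """Return a copy of the screen graph with one screen's outgoing links replaced."""
        updated = copy.deepcopy(graph)
        updated[screen] = new_options
        return updated

    # Speculative future: add a direct connection from "detail" to "account".
    speculative = with_changed_links(
        transitions, "detail", [("browse", 0.4), ("account", 0.2), ("exit", 0.4)]
    )

    def visit_profile(graph, runs=100_000):
        """Visit counts per screen across a batch of simulated journeys."""
        counts = Counter()
        for _ in range(runs):
            counts.update(simulate_journey(graph))
        return counts

    # Compare the speculative world against the as-is baseline.
    baseline, modified = visit_profile(transitions), visit_profile(speculative)
    for screen in transitions:
        delta = modified[screen] - baseline[screen]
        print(f"{screen:8s} baseline={baseline[screen]:7d} change={delta:+d}")

In practice, the outcome being compared would be the quantified “feeling” of each screen described above rather than raw visit counts, but the mechanics of the experiment are the same: encode the change, re-run the journeys, compare the worlds.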

Referring back to our case study of the telecom application, we tested simulated changes to the app’s structure. Alongside product designers, we experimented with modifications and augmentations to the app’s current state, such as removing or adding connections between screens. Speculative future states were created that either leaned into a screen’s natural structural feeling, creating regularity, or worked against it, providing a sense of personality.

New simulated journeys were run for each of these speculative futures, and the change to the feeling evoked in virtual users was computed. Experimenting in this manner, we were able to explore dozens of ideas, far beyond what was feasible with traditional user testing, allowing designers to investigate potential behaviors and gain insight over a greater breadth than would otherwise have been possible.

We used this insight to visualize how potential design interventions could affect the customer experience and, from there, identified the areas within the app best suited to an injection of delight and fun. Because they could rapidly test at scale, unconstrained by the feasibility and resources required to validate tentative ideas, the simulations became a tool for experimentation and collaboration between our teams and business stakeholders.

Bringing it all together with data-driven design

At Method, we are exploring how we apply simulations to multiple levels of problems, from product through to business operations and systemic change. Together, digital twins and simulations become tools to facilitate the quick and inexpensive exploration of decisions across a vast spectrum of problems.

In the hands of a data designer, these tools can efficiently mine for opportunities, testing and validating many different ideas quickly to surface the most valuable. This allows our design teams to scale up the ideation stages of design, evaluating dozens or even hundreds of ideas.

With wider adoption, the proposed methodology could evolve the current design process, pushing the boundaries of what we can do in partnership with our clients, mitigating risk, and enabling bolder, more extensive business decisions.

Recommended reading:

 How We Design Digital Experience Strategies at Speed