Today’s blog post is about how we test the assumptions we have with users before making decisions.
An assumption is something we assume to be true until tested. Some assumptions are thoughts and ideas that we have before a piece of research starts. Others develop based on what we learn once research has started. We then test these assumptions to reduce the risk of creating the wrong thing.
We’ve been doing discovery work on our Local Offer website. The Local Offer details services available for children, young people and families with Special Educational Needs and Disabilities (SEND).
We wanted to develop a new Local Offer that could guide these families to the next steps in their journey. We spoke to families and individuals with additional needs, as well as professionals and providers, to understand the routes people can take.
One idea we wanted to test was a smart answers solution. Smart answers is a tool that allows us to present a series of questions that lead the user to an outcome.
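Under the hood, a smart answers flow is just a small decision tree: each answer either routes to another question or ends at a recommended next step. Here's a minimal sketch of that idea in Python; the questions, answers and outcomes below are hypothetical, invented purely for illustration, and are not the actual content we tested.

```python
# Hypothetical smart answers flow: each question routes to another question
# or to a recommended outcome, depending on the answer given.

QUESTIONS = {
    "start": {
        "text": "Is your child in school?",
        "answers": {"yes": "school", "no": "early_years"},
    },
    "school": {
        "text": "Have you spoken to the school about your concerns?",
        "answers": {"yes": "outcome_send_team", "no": "outcome_school"},
    },
    "early_years": {
        "text": "Has a health visitor or GP raised a concern?",
        "answers": {"yes": "outcome_health", "no": "outcome_gp"},
    },
}

OUTCOMES = {
    "outcome_school": "Speak to your child's school as a first step.",
    "outcome_send_team": "Contact the SEND team about further support.",
    "outcome_health": "Follow up with the health professional involved.",
    "outcome_gp": "Book a GP appointment to discuss your concerns.",
}

def run(answers):
    """Walk the question tree using a list of answers; return the outcome text."""
    node = "start"
    for answer in answers:
        node = QUESTIONS[node]["answers"][answer]
        if node in OUTCOMES:
            return OUTCOMES[node]
    return None  # ran out of answers before reaching an outcome
```

A tool like Typeform gives you this branching without writing any code, which is exactly why it suited a quick test.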
What we wanted to learn
We wanted to test some of our assumptions:
- Assumption 1: We assume that using smart answers will help families to find the best next step for them.
- Assumption 2: We assume that designing for a person who is new to SEND will make the Local Offer more usable for all.
To validate these, we needed to know:
- whether families would find and use smart answers from the homepage
- whether families who are new to SEND would find value in answering a few questions that guide them towards their next action
- whether users trusted smart answers
- who our audience for the Local Offer is and what they look for
- where we should be directing families in certain situations
How we tested it
Before testing, we reached out to teams in Essex County Council to help us shape the questions and recommended next steps. These teams answer families’ calls, provide and enable support and direct people to the right services.
Designing a polished prototype would have taken more time and effort. Instead, we used Typeform to route questions to recommendations based on the answers given. This helped us to test smart answers quickly and with minimal effort.
We wanted to test whether answering questions online and getting recommendations gave users the confidence to know what to do next. We used mockups of a new Local Offer homepage to test whether people would move from there to smart answers.
We tested with families who had little or no previous knowledge of SEND. We asked them to imagine their child was 4 years old and about to start school, and that they had noticed their child starting to display different behaviours.
We shared the prototype and asked what they would do in this scenario: what they thought was happening, what they would do next and who they would speak to.
What we learned
We learned about which teams we should be directing families to in situations related to social care, health and education.
In testing smart answers, we learned that:
- it works when guiding families towards support
- parents found it reassuring to have their instincts validated
- it was useful to have next steps to take
But users without prior knowledge would not look to the Local Offer for support, as it is too early in their journey. They’re not likely to have made the connection with SEND without a teacher, GP or other professional pointing them towards it. The families we spoke to felt that this type of support is for children with high levels of need.
Also, because SEND has 'education' in the name, parents felt the Local Offer looked education-focused rather than covering support for a wide range of needs and disabilities.
What you can do

Before you test an assumption of your own, write down:
- what you want to test
- why you want to test it
- what you want to find out
Now ask yourself what the quickest way to test it is. It doesn’t need to be super slick and refined. Make something to show your organisation and test with users before committing.
It’s okay to get it wrong early. This will point you towards what’s right for your users.