Lessons from involving children and young people in service improvement

By Tony Ghaye

The challenge

To listen to and learn from children and young people, we need a new way to gather quality data. Children and young people often have pre-designed data-gathering tools (e.g. questionnaires) imposed upon them, and are rarely involved in either the choice of content or the design of those tools. In our project, children and young people have generated the content and designed (with adult IT support) a new touch-screen healthcare technology system that provides data, in real time, to those who have the power to improve services for them. We hope that the lessons we have learnt from our project will be transferable to research.

The project

REDY (Real-time Evaluation Device for Youth) gathers information on children and young people’s health and social care experiences using touch-screen technology. The project is informed by sustained engagement, over three years, with 1,000 children and young people aged 7 to 18, and with those working in the wider health and social care system in the English Midlands.

[Image: children at a computer]

We worked in 16 primary and secondary schools because we felt we could access children and young people, and work with them, more easily there. We worked closely with school nursing services to identify schools and classes, to negotiate regular access and to obtain full ethical consent. We explained the project to headteachers, classroom teachers, teaching assistants and school governors, some of whom were also parents of the children involved. On every occasion when we worked with children and young people, there were at least two other members of staff in the room (e.g. the class teacher and a teaching assistant). This was important because we wanted to be inclusive and wanted everyone to understand how we were working.

For primary-aged children we started with either (a) a small-group role play, “When I see ……. a doctor, nurse, dentist …”, or (b) draw-and-tell, where children drew pictures of “What I remember most when I saw a doctor, nurse, dentist …”. For secondary-aged young people we began with (a) a small-group role play or (b) a technology design challenge. The outcome of these activities was a set of ‘quality standards’, which were then incorporated into the design of the REDY system. These included: the system had to be touch-screen rather than paper-and-pencil, quick to complete (under two minutes) and confidential; it had to look interesting, colourful and uncluttered on screen; it had to offer both number and smiley-face scoring; and it had to be usable in audio versions and in different languages.

The REDY system produces bar and pie charts of children and young people’s experiences, either for a particular day or across longer time periods to show trends and variability. The charts are then used to fuel a service improvement conversation. Conventionally, data like this is accessed, analysed and discussed only by adults. REDY gives children and young people themselves the opportunity to be fully included in interpreting the results. Younger children can look at just one ‘bar’ (which we call a ‘coloured stick’) or one ‘pie’, so they are not overwhelmed by the data. Older pupils can filter the data in the ways most relevant to them (e.g. by gender, age or day). Adults can, of course, be part of the improvement conversation and can learn much from listening to what is said rather than containing or steering the conversation. REDY brings added (learning) value in that it helps young people develop their data analysis and conversational skills.

What have we learnt?

1.   The need to develop a flexible real-time feedback system:

The children and young people involved represented a wide range of ages, ethnicities, locations and individual circumstances, so the system they developed reflects this. It allows data to be ‘filtered’ by age, gender, location and day (morning and afternoon), and allows a question-by-question analysis. As adoption of the system spreads, for example to children and young people with special educational needs and disabilities and to looked-after young people, the questions can be changed and languages other than English used.

2.   Children and young people have a right to be listened to:

We need to respect children and young people’s views and experiences, listening to them for understanding, rather than listening to make a judgement. This does not mean that children and young people’s opinions should be automatically endorsed. Expressing an opinion is one thing. Making a wise service improvement decision is something else.

3.   The need to take children and young people’s views seriously: 

The fact that children and young people express themselves differently from adults does not justify dismissing what they say. Too often, token efforts are made to listen to children and young people, but little effort is subsequently made to take on board the views they express. Even where it is not possible to act on their concerns, some children and young people have told us that they feel entitled to an explanation of what consideration their views were given and why their suggestions cannot be implemented.

4.   How to use data to drive service improvement conversations:

If (practitioner) researchers are serious about putting children and young people’s experiences at the heart of what happens, we need to find authentic and appropriate ways to draw on their unique insight, so that they have an authoritative voice in conversations about service improvement. This is about more than consultation, engagement, participation and involvement. It is essentially about wholeheartedly creating ‘communication-friendly spaces’.

Feedback

The following comments are from children and young people and healthcare professionals involved in the project:

Children and young people:

“It was cool to feel like we were the experts.”

“I felt involved in something important that would help other kids.”

“For the first time, I really felt listened to.”

Clinical Director:

“It’s really helped us to have a different conversation and also to have more of a focus for our meetings. Analysing the results for each question, over time, has been very helpful for monitoring progress by the team and for the children.”

Nurse manager:

“It took us a while to learn to make the most of the richness of all the data, to cope with the surprises it contained and to use it positively. But once we got the hang of things our conversations were directly linked to making improvements. We couldn’t ignore what the graphs were telling us!”

Head of Looked After Care:

“It saves us so much time. It’s much better than the seven-page questionnaire we gave out, once a year, and then deploying someone to process all the data.”

Contact: Professor Tony Ghaye, Director, Reflective Learning-International

Email: tonyghaye@gmail.com