The report from the alpha assessment of DfE’s Manage Key Stage 2 and 3 curriculum resources service on 11 June 2019.

From: Government Digital Service
Assessment date: 11 June 2019
Stage: Alpha
Result: Met
Service provider: Department for Education

Service description

This service intends to reduce teacher workload and improve learner outcomes by changing teachers’ relationship with curriculum materials for lesson planning. Coherent programmes of sequenced learning objectives support teachers without the expertise to design their own.

Resources provided for each learning objective enable teachers to quickly and easily plan their lessons. Unit-level subject knowledge helps generalist teachers prepare themselves to teach the term ahead. In its essence, this is content-as-a-service.

It will be a discretionary service – those who wish to use it will be offered a range of curriculum programmes for their subject and key stage to choose from. These programmes will be procured from expert curriculum designers.

The service helps to deliver against a manifesto commitment to “provide greater support for teachers in the preparation of lessons and marking, including through the use of technology”.

Service users

Teachers. School leaders (often still teachers) responsible for choosing curriculum programmes. The service targets primary schools, underperforming schools and those with poor teacher retention, as their needs are strongest.

1. Understand users and their needs

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team did an outstanding job on their approach to access needs, embedding inclusive design into their thinking and processes from the start, rather than only in their testing
  • their understanding of their core user group (newly qualified teachers and curriculum leads) is good and well researched.

What the team needs to explore

Before their next assessment, the team needs to:

  • develop more detailed plans for selecting participants (i.e. schools and end users). The team could segment and characterise their selection of schools according to a wider variety of parameters than they currently have (e.g. high/low socio-economic status context, high/low retention rates, high/low outcomes, different school sizes and structures)
  • conduct more research with participants beyond their core user groups (NQTs and curriculum leads) to include other users of the system (e.g. headteachers/principals, teaching assistants, account administrators, senior teachers who have already accumulated a lot of materials). This is particularly important because the approach relies on adoption by schools, rather than by individual teachers
  • expand research methods: the current data collection emphasis for beta seemed to be on diary studies. The panel suggests the team spend some time considering different approaches, especially ways to benchmark how long a task takes now versus with the proposed new solution
  • conduct more thorough research with providers who would (bid for and) create the lesson content.

2. Solve a whole problem for users

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has done good work to understand the needs of teachers and their desire to adapt lessons beyond the given assets. The team should continue to monitor this to make sure they are supplying lesson material in the most appropriate formats.

What the team needs to explore

Before their next assessment, the team needs to:

  • demonstrate how the service fits into the wider landscape of services for teachers. In particular the team should speak to other DfE service teams to align their approaches and reduce unnecessary duplication of work
  • explore options for the future in more detail. Specifically, the team needs to define more clearly the people and assets required to deliver this service sustainably beyond development
  • think of the service as more than a temporary solution to tackle a gap in the market, detailing various options (some temporary, some less so) for sustainable delivery of this service so that these can be presented at the next GDS assessment
  • create a third ‘team’ or core priority to focus on options for the future sustainability of the service. Current plans for the beta team set-up include only two sides: content management and digital service development
  • test whether the current name of the service works for end users (e.g. test whether users are more likely to need to ‘Plan my lessons’ rather than ‘Manage curriculum resources’)
  • better understand the potentially differing needs of schools with differing resources.

3. Provide a joined-up experience across all channels

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is thinking beyond what happens in the digital portion of the tool, exploring printing and filing options to support users throughout their journey
  • the team has explored user behaviour patterns, especially how users might access the service at different times on different devices
  • the team has worked hard to understand the need for teachers to download lesson material in different formats.

What the team needs to explore

Before their next assessment, the team needs to:

  • conduct more research to better understand user support requirements. The prototype indicates this will be provided by webchat and phone. The team should make sure these are the appropriate channels and that it has the resources to provide this support. It should speak to other service teams across DfE to see where resources can be shared
  • continue to do research to make sure the service is meeting user needs across different devices and user contexts.

4. Make the service simple to use

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has done good work creating a service that is beginning to meet the needs of teachers and subject leads, including some excellent in-depth work on lesson objectives
  • the content structure around things such as ‘Misconceptions to address’ and ‘Building on previous knowledge’ is particularly strong
  • the team has done excellent iterative improvements to make content succinct and digestible based on strong research
  • the team has made good efforts to use design system patterns and should continue to do this.

What the team needs to explore

Before their next assessment, the team needs to:

  • do further iterative improvements to the user journey based on usability testing and research with end users of the service
  • work on more complex interaction patterns. For example, sorting and ordering of lists need to be put through thorough usability testing, and findings should be fed back into the GOV.UK Design System community. Accessibility of these patterns also needs to be fully considered
  • focus on researching and improving the journey of finding the service on GOV.UK and the onward journey from GOV.UK into the service
  • research account creation, structure and permissions to thoroughly understand and meet the needs of the different types of users across schools of varying sizes and structures
  • focus efforts on a greater understanding of the formats in which teachers need to download lessons after they’ve been planned online using the service
  • understand how suppliers will submit lesson content to the platform and the needs and governance around that process.

5. Make sure everyone can use the service

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has done strong research with teachers who have neurodiverse access needs (such as dyspraxia, dyslexia, attention deficit hyperactivity disorder and dyscalculia)
  • the team are aware of the need to look at access needs across the whole service scope, including the lesson material.

What the team needs to explore

Before their next assessment, the team needs to:

  • have planned and implemented appropriate accessibility audits
  • have put appropriate actions in place to make sure the lessons published on the service meet the access needs of school children
  • have expanded on the strong access needs work by researching with teachers who have other access needs such as sight or motor difficulties
  • do more research with users who have lower digital literacy to understand how comfortable they are using the service.

6. Have a multidisciplinary team

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team included an appropriate mix of skills and expertise at alpha
  • the team is empowered to make decisions about the service and the product owners are co-located with the team
  • the product owners work closely with the relevant policy teams, and form an effective bridge between policy and service delivery
  • the team has regular check-ins and a good working relationship with the SRO.

What the team needs to explore

Before their next assessment, the team needs to:

  • urgently address its reliance on temporary contractors to deliver the service. The panel’s biggest concern was the absence of internal DDaT capability and the likelihood that knowledge may not be effectively transferred to the next phase of delivery
  • fully explore routes for hiring in-house technologists in key roles (e.g. delivery management, service design, user research). The panel considers the lack of appropriately skilled civil servants to be a risk to the sustainability of service delivery
  • assess its staffing requirements for the content management aspects of the service, in particular evaluating whether this can be effectively delivered by business analysts only, rather than a full multidisciplinary team.

7. Use agile ways of working

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has used an effective range of agile methods, including:

    • regular stand-ups
    • sprint planning
    • managing a product backlog using Jira
  • the team iterated the service based on user feedback
  • some agile governance is in place to improve the service and respond to changing requirements
  • the team successfully changed the timeline for service delivery on the basis of lessons learned during delivery.

What the team needs to explore

Before their next assessment, the team needs to:

  • address issues around the composition of the team for beta to make sure it does not slow down delivery and that the transition out of the next phase is seamless
  • fully understand the role of the bridging team that will work between content management and the front-facing service.

8. Iterate and improve frequently

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team focussed its activities at alpha on the part of the service that delivers most value (i.e. the part accessed by teaching professionals)
  • the team showed good evidence of having changed the service based on user research, and of testing its riskiest assumptions. In particular, the work the team did around understanding how prescriptive to make the outputs of the service was carefully considered.

What the team needs to explore

Before their next assessment, the team needs to:

  • build on its good work and bring in appropriate capacity, resources and technical flexibility to continue to iteratively develop the service during beta.

9. Create a secure service which protects users’ privacy

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the team identified potential security threats and designed an architecture to mitigate them
  • the team considered how to collect only the minimum amount of information needed from teachers
  • the team contacted DfE Security to audit the service, and that the service was classified as low impact.

10. Define what success looks like and publish performance data

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is setting itself ambitious targets to meet high-level policy objectives
  • the team has contacted the Performance Platform (N.B. it is not measuring cost per transaction)
  • the team is planning to use standard tools for measuring KPIs, such as Google Analytics
  • the team has commissioned a market impact report to understand the implications of content provision to the service.

11. Choose the right tools and technology

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team plan to use common GOV.UK components, including the GOV.UK Design System and GOV.UK Notify (see the illustrative sketch after this list)
  • the team plan to use the standard DfE tech stack – Ruby on Rails and Postgres, hosted on Microsoft Azure
  • the team has considered whether it is preferable to buy or build high-level components that make up the service.
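
As an illustration only, and not a confirmed part of the team’s implementation, the sketch below shows how a Ruby on Rails service might send an email through GOV.UK Notify using the official notifications-ruby-client gem. The API key, template ID, email address and personalisation values are placeholders.

    # Gemfile: gem "notifications-ruby-client"
    require "notifications/client"

    # Placeholder API key read from the environment.
    client = Notifications::Client.new(ENV["NOTIFY_API_KEY"])

    # Send a templated email via GOV.UK Notify; the template ID and
    # personalisation fields below are illustrative placeholders.
    client.send_email(
      email_address: "teacher@example.sch.uk",
      template_id: "00000000-0000-0000-0000-000000000000",
      personalisation: {
        school_name: "Example Primary School"
      }
    )

Using the hosted Notify API in this way keeps notification templates and delivery reporting in GOV.UK Notify rather than in the service itself, which is consistent with the team’s stated intention to use common GOV.UK components.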

What the team needs to explore

Before their next assessment, the team needs to:

  • decide if DfE Sign-in is going to be a viable authentication solution and, if not, investigate alternatives.

12. Make new source code open

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team plan to develop in the open, which is the default approach for DfE.

What the team needs to explore

Before their next assessment, the team needs to:

  • make the source code for the alpha prototype publicly available.

13. Use and contribute to common standards, components and patterns

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team plan to use common GOV.UK components where appropriate, specifically the GOV.UK Design System and GOV.UK Notify
  • the team chose open standards, open source solutions and common platforms throughout the service (development, continuous integration, monitoring, alerts, logging).

What the team needs to explore

Before their next assessment, the team needs to:

  • consider how the service can produce materials in open formats, as per the Open Standards principles. The service team are already planning to speak to the Head of Open Standards at GDS about this.

14. Operate a reliable service

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team had gained significant insight into probable usage patterns and consequent demands on the service (e.g. noting that greatest use is likely to be on Sundays)
  • the team is considering the potential impact of shorter and longer periods of downtime
  • the team has started to think about workarounds for periods of downtime, e.g. providing temporary accounts for the digital bank of materials at the back end of the service.

What the team needs to explore

Before their next assessment, the team needs to:

  • address an issue they have identified with DfE Sign-in, which is currently unable to support the expected number of users of the service.

Published 2 August 2019