The report from the beta assessment of Her Majesty’s Land Registry’s Search for Local Land Charges and Maintain Local Land Charges services on 1 November 2018.

Service Standard assessment report

Search for Local Land Charges and Maintain Local Land Charges

From: Government Digital Service
Assessment date: 01 November 2018
Stage: Beta
Result: Met
Service provider: HM Land Registry

The service met the Standard because:

  • the team continue to make excellent progress on a complex and highly technical service challenge, working in an agile and iterative way, learning from doing
  • key technical decisions are within the team’s gift, and they are using the Beta phase to test their choices, but also refine the wider data migration approach, holding suppliers to account through their own research and development approach
  • the team are proactive in securing user insight, through primary research with a range of users, supplemented by an active and well-considered approach to the use of analytical insights.

About the service


The service records and presents data on ‘local land charges’: geographical constraints on property, such as smoke-control zones and tree preservation orders. HM Land Registry are building a register of local land charges, developing a service for adding and amending existing charges, and providing a service for people with an interest in property to search for information within a particular area.

Service users

The users of this service are local authority staff (and some additional trusted individuals), who add, edit or remove local land charges within their area (‘maintain’ users), and members of the public or other interested parties, who view charges in their area (‘search’ users). These searches will typically form part of a decision to buy, or the process of buying, property, and ‘searches’ can be guaranteed by HM Land Registry as part of securing loans or mortgages for property purchases. This guarantee is a ‘paid’ option within the service.


User needs

The team have done an impressive amount of research, and have spoken to a substantial number of local authorities and private search companies across the UK. User researchers on the team have done research with over 500 users, and contextual research with 15 users with cognitive impairments, using the users’ own equipment and devices.

The team have used a range of methods in their research. They have interviewed current beta users, tree-tested charge category content with citizens, and carried out contextual observations of people going through the current planning application process and using the private beta service. The team have also carried out usability testing in the lab on a range of devices, and pop-up research at a Citizens Advice bureau.

Weekly calls are held by the team to discuss feedback and findings. User researchers on the team are working closely with analysts. They should keep working closely together to learn more about users.

Research findings have enabled the team to create three personas for the maintain part of the service and five personas for the search part of the service, which include behaviours, pain points and needs. The team have also identified that there are currently a couple of ways that people can search for local land charges. The current ways of doing this can be complex, confusing or unknown to the end user, and often involve various roles, authorities and organisations. Research findings have enabled the team to discover and understand the pain points for those journeys.

The team used a screener that mainly focused on people’s attitudes to tech to recruit people with assisted digital needs. The team mentioned this didn’t work very well. The panel advises that the team revisit their screener (set of questions that will screen out potential participants to help ensure the team get the right users), adding questions around people’s behaviours when using technology as well as their attitudes towards technology. It might help if the team set clear parameters on a digital inclusion scale to ensure they are doing research with people who score in the lower quartile of this scale for both maintain and search journeys.

The team have now migrated four local authorities on to the service and another five are due to go live by April 2019. The team aim to take the total number of migrated local authorities to 26 during the remainder of 2019. As more local authorities onboard and more people start using the search part of the service, it’s crucial that the team ensure the service meets accessibility requirements and works for assisted digital users for both the search and maintain parts of the service. Following the recommendations from the accessibility audit (October 2018), the team should carry out more research with people with accessibility needs and ensure that all parts of the service meet accessibility requirements, especially focusing on the map functionality across a range of devices.

The team have done a lot of research with local authorities but should look at doing more research with all the other identified ‘maintain’ users, including national parks, highways, bishops adding markers for burials, and other authorised individuals.

The team should continue to do research with support centres, looking at end-to-end support journeys for both maintain and search journeys. The team should consider support staff for the service as a user group (or groups) in their own right. Although the service is self-service, support could be especially important when the service goes live and more people start to use it for local land charge searches.

The login journey for the paid official search should be explored further, looking at the differences and user expectations for both end-to-end search journeys, including payment and content.

The team should also consider doing more research to explore how users will know when a local authority becomes a searchable area on the service. It’s important that users can tell early on whether areas are searchable or not, so they’re not wasting time trying to search for something that’s currently unsearchable.

The team did outline that the offline journey will consist of an in-person paper service and a phone service. It would be good for these journeys to be clearly mapped so the team can be sure they meet user needs.

The team mentioned that their plan for public beta will involve contacting search and maintain users for follow-up research sessions. They will be doing research with search users in November to explore users’ expectations of and reactions to the redesign of the downloadable PDF of paid search results. The team are also planning to do more research into the ‘choose a charge category’ part of the journey, as more categories and subcategories will need to be added as more local authorities are onboarded. They also plan to do more research into the end-to-end journey for search, looking at how users could search by title number and how users filter searches.

Research has identified that there’s a need for local authority spatial and point data to be more consistent and accurate, so the team also plan to do more research into data migration and data quality.


Team

The team is a blend of permanent HMLR specialists and subject matter experts supported by a supplier. They are largely co-located and are working in a mature, agile and iterative way. The service owner is empowered, and effectively supported by a proportionate governance regime. There was a great spirit within the team, and it’s clear that they relish the challenge they have, and are working to a high standard to deliver it.

This programme of work was originally set up in relative isolation from other activities within HMLR. It is encouraging to see that the team are working more closely with other colleagues and teams in HM Land Registry, and this should increase further as rollout continues. We would expect to see more consideration being put into the end-to-end experience for users, with greater alignment between work to improve the land register, local land charges, and other relevant streams of work within HMLR.

The panel recognises the challenge of migrating a large volume of organisations’ data into a single repository, and that the ‘national’ focus of other programmes of work in HMLR does not fit well with the gradual rollout of this programme. However, we would expect to see a clearer view of the end-to-end service that this service is part of, and evolution of the team structure and governance models supporting it, to enable ongoing organisational and service transformation.

As discussed later in this report, the panel had some reservations on the current gap between the development team and HMLR’s existing Web Operations (WebOps) capability. Effort should be made to close this gap, and to ensure that the team is empowered and enabled to deploy at pace, with effective control and communication between functions within the organisation.


Technology

The team used the same technologies as in alpha – Python, Flask, PostgreSQL and AWS (Amazon Web Services) – which made sense for them. When choosing a technology, the team uses a combination of spikes and an open design proposal approach.
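As an illustration only (not the team’s actual code), a search endpoint on this Python/Flask stack might look like the following minimal sketch. The route, data shape and field names are assumptions, and an in-memory list stands in for the PostgreSQL-backed register:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Hypothetical stand-in for the PostgreSQL-backed register of charges.
CHARGES = [
    {"id": 1, "category": "tree-preservation-order", "area": "exampletown"},
    {"id": 2, "category": "smoke-control-zone", "area": "exampletown"},
]

@app.route("/search")
def search():
    # Return all charges recorded within the requested area.
    area = request.args.get("area", "").lower()
    results = [charge for charge in CHARGES if charge["area"] == area]
    return jsonify(results)
```

In the real service the search is spatial (an area drawn on a map) rather than a simple string match, so this only illustrates the overall request/response shape.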

The team considers auditing and monitoring part of their culture, with another dedicated team responsible for synthetic monitoring that checks the service three times every five minutes and triggers alerts when problems are detected.
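One possible reading of that cadence is a cycle that runs every five minutes and probes the service up to three times before alerting. The sketch below is an assumption about how such a synthetic check might be structured, not the monitoring team’s actual tooling; `probe` and `alert` are hypothetical callables:

```python
import time

def check_service(probe, retries=3, interval=0):
    """Run a synthetic check up to `retries` times; succeed on the first pass."""
    for attempt in range(retries):
        if probe():
            return True
        if attempt < retries - 1:
            time.sleep(interval)  # wait between attempts
    return False

def run_monitoring_cycle(probe, alert):
    """One five-minute cycle: three checks, alerting only if all of them fail."""
    if not check_service(probe, retries=3):
        alert("synthetic check failed 3 times in this cycle")
```

A scheduler (cron, or the monitoring platform itself) would invoke `run_monitoring_cycle` every five minutes; requiring all three probes to fail before alerting avoids paging on a single transient blip.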

The system uses common government platforms such as GOV.UK Pay and GOV.UK Notify, and is integrated with a legacy finance system. The system is also integrated with the Index Map API, which is shared with another Land Registry project – the Property Development system.

The team have automated their deployments, which include automated acceptance tests. The team has multiple environments, but some work is currently required to streamline these. Deployments are usually done once a sprint and require WebOps approval.

While the alpha source code is open source, the beta code is held in Land Registry’s GitHub and is yet to be made public, pending a security review. The review will ensure that sensitive information is removed; the code can then be published to an open source code repository such as a public GitHub repository.

The system does not currently have redundancy/failover as there are two other ways of accessing this data, with only search available to citizens. The team would consider implementing an architecture that would support redundancy/failover when more local authorities start using the system. The team relies on a separate HMLR WebOps team for supplier outages, infrastructure issues and addressing DDoS attacks.
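When the team do come to implement redundancy/failover, one common read-path pattern is to try an ordered list of data sources and fall back on failure. This is a generic sketch under that assumption, not the team’s design; the source names are illustrative:

```python
def read_with_failover(sources):
    """Try each data source in turn and return the first successful result.

    `sources` is an ordered list of zero-argument callables, for example a
    primary database read followed by a replica read (names illustrative).
    """
    failures = []
    for source in sources:
        try:
            return source()
        except Exception as exc:  # a real system would catch narrower errors
            failures.append(exc)
    raise RuntimeError(f"all {len(failures)} sources failed")
```

The same idea extends to the service level: because only the search journey is citizen-facing, a degraded read-only mode backed by a replica could keep searches available during a primary outage.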


Design

The team seems to have a very good and inclusive design process in place. They demonstrated thorough iteration and the integration of learnings into the service, drawing on approaches that worked as well as those that didn’t.

While the map was iterated on, it seems to have been iterated on mostly in isolation. The panel strongly recommends that the team’s designers form a working group with other teams in government who use maps, to share findings and issues and to add map elements to the GOV.UK style guide and pattern library.

The service has two different user types (1. ‘maintain’ users – professional users who enter, change or remove charges, 2. ‘search’ users – citizens, or their representatives such as ‘personal search’ companies, who want to check constraints on a property) who approach it in different ways and with different frequencies. ‘Maintain’ users already work with digital mapping tools/apps, and their specific mental model strongly informed the interactions with the map for both user groups. The panel recommends the team keep a close eye, via qualitative and quantitative research, on how both user types actually use the tools – and if behaviour differences emerge, they should refine the interaction options.

It is great to see the team is already monitoring different interaction patterns on different device types, and will be well placed to understand those user journeys better as more data becomes available.

The team seems to have good awareness of accessibility and assisted digital needs with alternative routes and support channels in place and user research being ongoing.

The design team has a clear list of priorities, with the next steps being to address content issues, the list page, the report PDF page and thumbnails.

The panel have provided a content review from the GOV.UK content team so the service team can address discrepancies with the GOV.UK style guide.


Analytics

It was good to see the team’s expanded use of data now that the service is being used by four local authorities, and especially good that their performance analyst attended the assessment.

The team demonstrated effective working between user researchers and analysts to enhance insight about user behaviour and the analyst discussed the team’s backlog for data stories. By developing a performance framework, the team is focused on user needs, developing hypotheses and features to test ‘what success looks like’.

The team makes use of a range of dashboards to socialise the performance of the service within the team and to wider stakeholders.

The team has done a lot of iteration with the mapping tool and has built a number of events into analytics to capture the users’ interaction with the maps. As a relatively complex interaction, this emphasis on testing the usability of tools and actively seeking to learn how to improve them is very positive – we would encourage the team to blog about the approach they’ve taken, and the results they’ve achieved, and promote these findings to others working on geospatial tools within government.

The team has engaged with the Performance Platform and a draft dashboard has been produced.


Recommendations

To pass the next assessment, the service team must:

  • do more research with users with accessibility needs, for both maintain and search journeys using a range of devices and equipment, focusing on the map functionality and drawing tools
  • explore end-to-end journeys, including support with all the other identified ‘maintain’ users, such as national parks, highways, bishops and authorised users
  • explore end-to-end journeys for both search journeys (paid and free searches), including login and payment, across a range of devices
  • implement redundancy and failover support
  • list the most likely causes of the service going offline and have a plan to stop them from happening. If this is done by WebOps, the team should have awareness of, and a copy of, this plan.

The service team should also:

  • explore the compensation journey for paid search users
  • research how expiry dates for local land charges work for both maintain and search users
  • test how users will know if and when a local authority is a searchable area or not
  • investigate how the service can ensure accurate information is displayed on the service for both maintain and search users
  • explore and map the offline journeys for both search and maintain parts of the service
  • continue to do research with more people who have low scores on a digital inclusion scale for both maintain and search journeys
  • continue their work on streamlining the environments
  • try to move towards continuous deployment and remove administrative blocks if possible.

Next Steps

You should follow the recommendations made in this report before arranging your next assessment.

This service is already running on a domain, so there are no formal steps to take at this stage.

However, recognising that the scale of migration means that it will be some time before full national coverage is achieved, we would strongly encourage the team to arrange a GDS workshop in 6 to 12 months’ time to review progress, rather than awaiting completion of the register and the subsequent live assessment.

Submit feedback

Submit feedback about your assessment.

Get advice and guidance

The team can get advice and guidance on the next stage of development from the Government Digital Service.

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met

Published 22 July 2019