SLC Centre for Evaluation and Monitoring – track application for student finance beta assessment

Service Standard assessment report

SLC Centre for Evaluation and Monitoring – track application for student finance

From: Central Digital & Data Office (CDDO)
Assessment date: 13/01/2022
Stage: Beta
Result: Not Met
Service provider: Student Loans Company (SLC)

Service description

SLC will replace and enhance an existing service that enables our customers to track their application for student finance. The service also includes prompts to complete any outstanding actions through their digital account. This will ensure that our customers receive their funding without delay, leaving them to focus on their studies.

It will provide an end-to-end digital experience for both students and their sponsors, allowing them to engage with SLC through preferred online channels.

Service users

  • Students - who have applied for student finance. They use the service to manage their student finance and keep up to date on the progress of their application and payments.
  • Sponsors - who live in the same household as the student at the time of their application. They could be parents, guardians or partners who provide their financial information to support the student’s application. They will use the service to manage their account and check on the progress of their part in the process.

1. Understand users and their needs

Decision

The service met point 1 of the Standard, with the caveat that the team needs to bring user researchers into the team as soon as possible.

What the team has done well

The panel was impressed that:

  • the team’s great enthusiasm for users came across in the session
  • user research is feeding into iterations, and progressive improvements are visible throughout the design of the service
  • the whole team is involved in observing and analysing user research – keep this up! It would be good to have this documented in the service assessment slide deck next time
  • they have planned user research with content in the medium term
  • user research is being continually conducted with users with accessibility and assisted digital needs
  • the team is looking towards creating a more comprehensive way of gathering all the insights about users’ experience of the service going forward
  • the team has adapted well to not having a full user research resource

What the team needs to explore

Before their next assessment, the team needs to:

  • have user researchers embedded in the team for this work
  • work to ensure the team is aligned, and establish key questions, hypotheses, goals and KPIs, and what success looks like
  • embed user researchers and performance analysts in the team and ensure they work together to establish the best methods for acquiring the qualitative and quantitative data they need to measure these agreed KPIs, goals and measures of success, and to answer outstanding questions the team has about the service (and, through this, resolve the heavy reliance on quantitative data from a feedback survey)
  • carry out user research on actual mobile phones in further rounds of testing – mobile testing is not only about screen size but also about interactions and the haptic/physical experience, which is different on mobiles
  • explore trust through user research: a lot of personal information is submitted to this service, so consider what transparency and messaging you can give users about data storage practices (a link to a general privacy policy might not be the right thing)

2. Solve a whole problem for users

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team clearly demonstrated the problem they were trying to solve and have developed a solution that meets that problem
  • the team have worked with other departments on the chatbot experience and have learnt from well-established chatbots to deliver a better experience that solves the whole problem for users
  • the team grounded their work in user research with a broad range of users (students and sponsors with varying needs) and an understanding of the user journey
  • the whole team is involved in reviewing user research sessions and creating user stories when looking at development of new features of the service

What the team needs to explore

Before their next assessment, the team needs to:

  • continue working towards joining services together, and demonstrate in future assessments the wider landscape of SLC services and how they join together
  • ensure they use feedback not as an end in itself but as the springboard for more in-depth research based on questions and hypotheses that arise from the surveys, areas of friction in the service, and so on – making improvements isn’t always about what users want but about understanding user behaviour through observation and analysis
  • explore trust and what transparency and messaging you can give users about data storage practices (a link to a general privacy policy might not be the right thing) – there is a lot of personal information being submitted to this service, so there may be some further user research to do on this

3. Provide a joined-up experience across all channels

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the team understand and have a process in place to continue iterating the chatbot journey as they discover new opportunities to automate chat journeys
  • the team showed how a user may navigate through the self-help services and the contact centre, and how the journey has been improved with the introduction of the chatbot feature

What the team needs to explore

Before their next assessment, the team needs to:

  • continue monitoring the automated contact routes and identifying opportunities to improve the service journey to reduce any customer contact
  • continue to monitor how users who need to submit paper evidence experience the service and how the digital elements of the service align with this

4. Make the service simple to use

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team showed how they have made improvements from the Private beta such as moving the navigation and implementing performance improvements to solve a ‘flickering’ issue
  • the service uses components from other Government departments such as the progress bar from the HMCTS design system reducing duplication of effort and inconsistencies between departments
  • the service has been tested with a range of devices that reflects users’ behaviour

What the team needs to explore

Before their next assessment, the team needs to:

  • address the remaining inconsistencies in the design, such as the account links near the questions and the lack of error summaries when error messages were triggered during the demo
  • continue working with GDS and the wider design community to introduce components such as the progress bar and alert panels to the main GOV.UK Design System. It would benefit the wider community to see how this service has iterated these components and captured feedback on them
  • note and action the comments made in the attached Content Review, particularly reviewing the ‘Understanding Student Finance’ content, which is described as ‘must review’ by GOV.UK.

5. Make sure everyone can use the service

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has been able to use automated accessibility testing with Axe (axe-core) to provide a degree of assurance around accessibility compliance (a minimal sketch of this kind of automated check follows below).
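
The following is a minimal sketch, not the team’s actual test suite, of how an automated axe-core check might run against a page in a browser session driven by Selenium WebDriver (which point 11 notes the team already uses). The page URL, the way the axe-core source is loaded and the failure behaviour are all assumptions for illustration.

```typescript
// Illustrative sketch only (not the team's actual test suite): inject axe-core into a
// page driven by Selenium WebDriver and fail the check if any violations are reported.
import { readFileSync } from 'fs';
import { Builder, WebDriver } from 'selenium-webdriver';

// Load the axe-core script so it can be injected into the page under test.
const axeSource = readFileSync(require.resolve('axe-core/axe.min.js'), 'utf8');

async function checkAccessibility(url: string): Promise<void> {
  const driver: WebDriver = await new Builder().forBrowser('chrome').build();
  try {
    await driver.get(url);
    await driver.executeScript(axeSource); // make window.axe available in the page
    const results: any = await driver.executeAsyncScript(
      'const done = arguments[arguments.length - 1];' +
      'axe.run().then(done);'
    );
    if (results.violations.length > 0) {
      // Fail the run so accessibility regressions surface in automated testing
      throw new Error(`${results.violations.length} accessibility violations found`);
    }
  } finally {
    await driver.quit();
  }
}

checkAccessibility('https://example-service.test/track-application').catch(console.error); // placeholder URL
```

A check like this only catches the subset of issues automated tools can detect, so it complements, rather than replaces, the manual audits referred to below.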

What the team needs to explore

Before their next assessment, the team needs to:

  • review how well the chatbot feature is meeting the needs of users on mobile devices. The team was able to demonstrate that users find the chatbot feature useful but, due to the limitations of their analytics, they are currently unable to break this down to compare desktop and mobile users. Given that mobile users are a very significant proportion of the users, the team must be able to show how well the chatbot tool works for mobile users and, if it is not meeting the needs of these users, provide alternative channels for its mobile audience. The MoJ and DVLA have published useful (but general-purpose) case studies on chatbots that may guide your research (see https://www.gov.uk/government/case-studies/how-the-ministry-of-justice-explored-using-a-chatbot and https://www.gov.uk/government/case-studies/how-the-dvla-deployed-a-chatbot-and-improved-customer-engagement).
  • address issues raised in the separate Accessibility review document and complete further accessibility auditing and corrections as necessary – please note the attached review is not a detailed audit, but highlights errors likely to be indicative of further problems.
  • resolve, before launch, the issues raised by SLC’s own independent accessibility audit.

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is multi-disciplinary with all the key roles covered by dedicated individuals
  • the team has, since alpha, improved its mix of delivery partner staff and civil servants; and is clearly on top of both its need for continual knowledge sharing and its contractual obligations
  • the team has autonomy to make decisions at the right level, as well as a governance structure that is proportionate and ensures the team can deliver at its own pace rather than being slowed down by decision-making

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to work on improving their supplier to permanent staff mix
  • ensure there are user researchers embedded in the team
  • start planning what size of team they will require for this product once it moves into a Live environment, to ensure it is kept evergreen and continues to meet user needs in the future

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has a commitment to working in the open so stakeholders can continuously review the outputs of their iterative development process.

What the team needs to explore

Before their next assessment, the team needs to:

  • explore how they might improve their collaboration with cross-government communities and software delivery teams working on similar problems. Access to tools such as the cross-government Slack channel (ukgovernmentdigital.slack.com) would help in this regard

8. Iterate and improve frequently

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • there has clearly been a lot of work to ensure the team collect qualitative data via user research and user feedback, which is then used to drive iterations and improvements to the service
  • the team was able to show multiple examples of how the service has been iterated to make things simpler and easier to understand for users.

What the team needs to explore

Before their next assessment, the team needs to:

  • fully implement Google Analytics on the service, including customising the implementation to capture event-level data (see the illustrative sketch after this list)
  • incorporate analytical insight into the iteration cycle. All iterations should have a hypothesis for how they will affect user behaviour, and be fully tested using analytics and quantitative data to understand their impact.
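
As a minimal sketch of what “event-level data” could look like in practice, the snippet below records a custom event with the standard gtag.js snippet. The event name and parameters are hypothetical examples, not the team’s actual tracking plan.

```typescript
// Illustrative sketch only: recording event-level data via the standard gtag.js snippet
// (assumed to be loaded on the page). The event name and parameters are hypothetical.
declare function gtag(command: 'event', eventName: string, params?: Record<string, unknown>): void;

// Fired when a user completes a step in the tracking journey, so the impact of an
// iteration on that step can be measured against the hypothesis for the change.
function trackStepCompleted(stepName: string, outcome: 'success' | 'error'): void {
  gtag('event', 'application_step_completed', {
    step_name: stepName,
    outcome,
  });
}

trackStepCompleted('evidence-upload', 'success');
```

Capturing events like this for each journey step is what makes it possible to test the hypotheses behind each iteration with quantitative data.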

9. Create a secure service which protects users’ privacy

Decision

The service did not meet point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the service is protected by a standards-compliant third-party provider (Okta)
  • there is a good level of governance over security and data protection and the team has access to a security architect to provide oversight
  • data is hosted in the UK and access to customer data has appropriate levels of control and auditing
  • password resets are protected by two-factor authentication
  • personal documents uploaded by the user are cleared from the system within a reasonable timeframe if they are not submitted
  • citizens are able to communicate safely with SLC through the secure messaging feature.

What the team needs to explore

Before their next assessment, the team needs to:

  • protect the service, or parts of the service, with a second factor. The current service uses only a username and password combination. Given the amount of sensitive data users can store in the system (personal contact details, plus a degree of information on disability and finances), two-factor authentication should, at the very least, be offered to users as an option. A ‘medium’ level authenticator is highly recommended for the service; more details can be found at: https://www.gov.uk/government/publications/authentication-credentials-for-online-government-services/giving-users-access-to-online-services
  • replace (or augment) the static, knowledge-based information currently used to help secure the service with a more modern authentication approach, such as SMS codes or an authentication app such as Okta, Google Authenticator or similar (see the illustrative sketch after this list). User research with your audience will help determine the most suitable approach
  • start working with GDS and the ‘One login for government’ project (also known as GOV.UK Sign In) to provide, as a minimum, single sign-on authentication and (eventually) proof of identity. The GDS GOV.UK Account project has two relevant streams of work: a) authentication and b) identity proofing. Authentication is currently in Private Beta and is accepting new services to onboard (see https://www.sign-in.service.gov.uk/ and https://gds.blog.gov.uk/2021/09/08/how-will-the-new-single-sign-on-and-a-gov-uk-account-work-together/ to start with – GDS has many other blog posts on this project, but this is a good starting point), while identity proofing is in a closed Private Beta with the DBS service
  • work more closely with DWP and look to remove the need to collect the citizen’s National Insurance number. GDS Service Manual guidance advises against using NI number for identity purposes (see https://www.gov.uk/service-manual/technology/protecting-your-service-against-fraud#avoid-using-national-insurance-numbers-to-verify-identity) and DWP could potentially provide the required functionality through its new ‘matching’ service. We have shared some details of this with your technical architect
  • evaluate the risk presented by users entering personal information into the chatbot feature and what measures can be put in place to prevent this
  • do further work to establish whether there is a need to store personal documents uploaded by the citizen indefinitely. Once the documents have been reviewed by SLC agents, there should be no further need to store them.
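
As a generic illustration of the ‘authentication app’ option mentioned above, the sketch below shows time-based one-time password (TOTP) enrolment and verification using the otplib package. This is not SLC’s implementation; in practice the second factor would more likely be delivered through Okta or SMS, and user research should drive the choice.

```typescript
// Illustrative sketch only: TOTP second factor using the otplib package.
// Secrets, account names and storage are placeholders for illustration.
import { authenticator } from 'otplib';

// At enrolment: generate a secret for the user and share it as an otpauth:// URI,
// usually rendered as a QR code for the user to scan into their authenticator app.
const secret = authenticator.generateSecret();
const enrolmentUri = authenticator.keyuri('user@example.com', 'Student Finance', secret);
console.log(enrolmentUri);

// At sign-in: check the six-digit code the user types in against the stored secret.
function verifySecondFactor(userSuppliedCode: string, storedSecret: string): boolean {
  return authenticator.verify({ token: userSuppliedCode, secret: storedSecret });
}
```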

10. Define what success looks like and publish performance data

Decision

The service did not meet point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have implemented a basic configuration of Google Analytics
  • the team have used call data to inform where to focus the initial minimum viable product.

What the team needs to explore

Before their next assessment, the team needs to:

  • implement Google Analytics fully, including all event data
  • build analytics into the iterative cycle – all iterations need hypotheses for how they are going to affect users, and these should be fully tested using analytics
  • identify data sources and calculate the 4 GDS metrics: cost per transaction, user satisfaction, digital take-up and completion rate (a minimal sketch of two of these calculations follows this list)
  • create a stable staging account to implement analytics
  • create a Performance Framework to work out key performance indicators for the digital product. This will help the team understand where the digital service is successful during Public Beta
  • give someone within the central team responsibility for analytics, and ensure there is someone in the department with the expertise to use it
  • interrogate the analytics on the current service to fully understand the pain points that exist.
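
To make the metric recommendation concrete, the sketch below shows how completion rate and digital take-up could be derived once the underlying data sources are identified. The field names and figures are hypothetical, not SLC data.

```typescript
// Illustrative sketch only: deriving two of the four GDS metrics from hypothetical
// quarterly figures. Field names and numbers are placeholders.
interface QuarterlyFigures {
  digitalTransactionsStarted: number;
  digitalTransactionsCompleted: number;
  nonDigitalTransactions: number; // e.g. completed by phone or paper
}

// Completion rate: completed digital transactions as a share of those started.
function completionRate(f: QuarterlyFigures): number {
  return f.digitalTransactionsCompleted / f.digitalTransactionsStarted;
}

// Digital take-up: completed digital transactions as a share of all completed transactions.
function digitalTakeUp(f: QuarterlyFigures): number {
  return f.digitalTransactionsCompleted / (f.digitalTransactionsCompleted + f.nonDigitalTransactions);
}

const example: QuarterlyFigures = {
  digitalTransactionsStarted: 12000,
  digitalTransactionsCompleted: 10200,
  nonDigitalTransactions: 1800,
};

console.log(completionRate(example)); // 0.85
console.log(digitalTakeUp(example));  // 0.85
```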

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has been able to deploy regular patches and improvements during Private Beta.
  • the team is using good testing tools (e.g., Selenium) to provide assurance.
  • GitLab is in place to manage source control and is being used appropriately.

What the team needs to explore

Before their next assessment, the team needs to:

  • improve their approach to prototyping. For building prototypes, the team referred to using Axure. Based on experience, we have seen that tools like Axure don’t provide the fidelity needed to get really good feedback from users when testing ideas. Instead, the team should make use of the GDS Prototype Kit (https://govuk-prototype-kit.herokuapp.com/docs) and understand the recommended approach to prototyping in government (see https://designnotes.blog.gov.uk/2014/10/13/how-designers-prototype-at-gds/).
  • get a better understanding of how the requirement for client-side JavaScript affects your users. The Service Manual strongly advocates ‘progressive enhancement’ (see https://www.gov.uk/service-manual/technology/using-progressive-enhancement) and this panel agrees that the requirement for client-side JavaScript is not ideal (an illustrative sketch of the approach follows this list). The team must test the service with users on slow connections or older devices to validate the choice they have made on client-side JavaScript. Regular, automated client-side performance testing using tools such as BrowserStack is also highly recommended.
  • review how JavaScript has been implemented, particularly to ensure that, where it is necessary, it is coded to maximise speed for users and reduce potential risks to service availability (particularly risks resulting from in-line JavaScript, where hosting standalone JavaScript files, compressing files and using content delivery networks (CDNs) would improve availability and performance).
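
The following is a minimal sketch of the progressive enhancement approach the Service Manual recommends: the underlying HTML form posts and works with no client-side JavaScript at all, and the script only adds an enhancement when it loads. The selector and behaviour are hypothetical examples, not the service’s code.

```typescript
// Illustrative sketch only: progressive enhancement. If this script never runs
// (slow connection, older device, JS disabled), the form still submits normally.
document.addEventListener('DOMContentLoaded', () => {
  const form = document.querySelector<HTMLFormElement>('form[data-module="evidence-upload"]');
  if (!form) {
    return; // nothing to enhance on this page
  }

  form.addEventListener('submit', () => {
    // Enhancement only: disable the submit button after submission starts to
    // prevent accidental double submission.
    const submit = form.querySelector<HTMLButtonElement>('button[type="submit"]');
    if (submit) {
      submit.disabled = true;
    }
  });
});
```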

12. Make new source code open

Decision

The service did not meet point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has joined and contributed to a cross-government Salesforce community.
  • the team is working with Salesforce and is contributing to the Salesforce Component Library.

What the team needs to explore

Before their next assessment, the team needs to:

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the service is integrated with GOV.UK Notify for email/SMS and with HMPO for UK passport validation checks (a minimal sketch of a Notify integration follows below).
  • the team have been able to rebuild GDS front-end components as Salesforce ‘Lightning Web Components’.
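
For illustration, the sketch below shows the general shape of sending a templated email through GOV.UK Notify with the notifications-node-client package. The API key, template ID, personalisation fields and recipient are placeholders, the exact client signature may vary between versions, and this is not SLC’s integration code.

```typescript
// Illustrative sketch only: sending a templated email via GOV.UK Notify.
// All identifiers below are placeholders.
import { NotifyClient } from 'notifications-node-client';

const notifyClient = new NotifyClient(process.env.NOTIFY_API_KEY as string);

async function sendProgressUpdate(emailAddress: string, firstName: string): Promise<void> {
  await notifyClient.sendEmail(
    'a-template-id-from-notify', // placeholder template ID
    emailAddress,
    {
      personalisation: { first_name: firstName },
      reference: 'track-application-progress-update', // optional reference for tracing
    },
  );
}

sendProgressUpdate('user@example.com', 'Sam').catch(console.error);
```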

What the team needs to explore

Before their next assessment, the team needs to:

  • have a plan for integrating with ‘One login for government’ (see point 9)
  • explore the option of using the DWP ‘matching’ service, rather than the National Insurance number batch process currently in place, to improve the user experience as well as business performance (see point 9)
  • have a plan for updating the Salesforce Lightning Web Components they have built if the GDS front end is updated
  • share the good work they have done on the chatbot feature more widely across government, for example by publishing a blog or case study or by proposing a pattern (see https://design-system.service.gov.uk/community/propose-a-component-or-pattern/). CDDO has published chatbot guidance (see https://www.gov.uk/guidance/using-chatbots-and-webchat-tools), but there are currently no patterns to work with. Given the amount of work going into this feature, the team should consider how other government departments could benefit from your work.
  • reach out to Nigel Emmerson ([email protected]), who leads for the HM Passport Office on OGD usage of LEVWEB, an API that provides access to births, deaths and marriage data which could be utilised by the service instead of asking users to submit their certificates by post.

14. Operate a reliable service

Decision

The service did not meet point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the service has a number of discrete environments for development, testing, integration, UAT, training and production.
  • elements of the service can be tested through an automated framework (approx. 60-70%) and this can be triggered as needed.
  • Terraform is used to automate the deployment of the infrastructure.
  • code is deployed into integration very regularly.
  • performance testing has been carried out.
  • there is static code analysis in place.
  • there are good levels of monitoring and alerting in place.
  • there is a plan for migrating users from the existing service.

What the team needs to explore

Before their next assessment, the team needs to:

  • integrate test automation into the build pipeline. This should run code quality and integration tests before merging code into the QA environment
  • strive for a higher level of test coverage. GDS has recently published a blog post on this subject (see https://technology.blog.gov.uk/2021/10/08/a-new-standard-of-testing-for-gov-uk/) which is helpful in terms of approach and guidance.
  • ensure that test automation and quality are consistent across environments, that the build of each environment is reliable and stable, and that users don’t encounter problems regardless of the environment they are using. This includes having an environment that can be used reliably for ongoing usability testing and Show & Tell type demonstrations.

Published 22 January 2024