Apply for a phytosanitary certificate beta assessment report
The report for Defra's Apply for a phytosanitary certificate service beta assessment on 13 April 2021
Service Standard assessment report
Apply for a phytosanitary certificate
From: Central Digital & Data Office (CDDO)
Assessment date: 13/04/2021
Stage: Beta
Result: Not met
Service provider: Department for Environment, Food and Rural Affairs
Service description
In order to export regulated plants and plant material from the UK, exporters need to get a phytosanitary certificate. This document provides evidence that the goods meet the plant health rules of the importing country. Without it, the goods will not be allowed to enter the importing country.
The Apply for a phytosanitary certificate service gives users in Great Britain a simple, online method of applying for this certificate. Once the application is received, the Animal and Plant Health Agency (APHA) will certify the goods by conducting a physical inspection or by commissioning a lab test. If the goods pass these checks, the inspector then issues a phytosanitary certificate to the exporter.
Service users
Apply for a phytosanitary certificate caters for the following users:
Exporters, including
- growers, producers or traders selling their goods overseas
- retailers moving goods from Great Britain to their stores outside GB
- agents, who apply for phytosanitary certificates on the exporter’s behalf
Inspectors, including
- office inspectors – conducting casework checks on applications
- field inspectors – conducting physical inspections of the goods and issuing the certificates
1. Understand users and their needs
Decision
The service did not meet point 1 of the Standard.
What the team has done well
The panel was impressed that:
- the team have conducted research with 82 users, both internal and external, from a range of export areas (although external users made up a large proportion of the research participants for this phase). These users included a mixture of those who had previously used the legacy (paper) service and those who were brand new to the service, to reduce research bias and learned behaviours
- a clear set of personas has been developed to articulate who the service users are and what they need - these personas cover three external users and one internal user
- user needs have been articulated for external users in order to build an intuitive service
What the team needs to explore
Before their next assessment, the team needs to:
- prioritise the research with internal users. Although the team had conducted an impressive amount of research with external users, no user needs for back-office staff (Inspectors) had been articulated, so the team risks creating a service that is difficult or unusable for Defra staff. Inability to use the service effectively could lead to certificates being issued in error, so prioritising this research moving forwards will reduce that risk
- conduct more research with users with assisted digital and access needs. The team had managed to speak to 1 user with an assisted digital need and 2 users with access needs in this phase. Given that the team have decisions to make around the assisted digital offering for this service, it is crucial that they understand the needs and capabilities of their users as a whole in order to refine their service offering. The service manual’s page on Making your service more inclusive gives more detail on how the team can achieve this
- researchers may wish to plot their user base against the digital inclusion scale, which will in turn allow them to understand their user base more holistically
- similarly, the team must conduct research with users with access needs in order to understand the overall usability of their service. So far, the team have relied on outsourced accessibility audits; however, these audits focus on technical accessibility rather than the extent to which the service makes sense to a person with access needs. The service manual includes more detail on how the team can find and recruit users with access needs for their research
- consider ways to articulate their user needs in line with the test of a good user need. Although the team had documented a host of external user needs, these were often solution-specific rather than behavioural. Reframing the user needs as behavioural will enable the team to build a truly valuable service for users, based on their goals and drivers
2. Solve a whole problem for users
Decision
The service did not meet point 2 of the Standard.
What the team has done well
The panel was impressed that:
- the service team showed a good understanding of where their service fits in the wider context of the user journey, clearly describing how users need to understand their obligations when exporting plant material
- the team are working with Government Digital Service, the Forestry Commission and other teams to make sure appropriate signposting is in place to take the user to the right place
- the team are exploring opportunities to exchange data automatically with HMRC to minimise duplicating entry of data for export declarations
What the team needs to explore
Before their next assessment, the team needs to:
- reprioritise its attention towards the Inspector user experience. Whilst the team demonstrated extensive examples of iterating and improving the application form, little had been done on the back-office side to ensure Inspectors are able to do what they need to do quickly enough for the exporter user need to be met
- continue to consider the wider trader journey, which spans many departments (DIT, HMRC and others). The team have considered this and are aware of other services across government, but working in the open and proactively sharing learnings is crucial to joining up an already fragmented user journey
3. Provide a joined-up experience across all channels
Decision
The service did not meet point 3 of the Standard.
What the team has done well
The panel was impressed that:
- there are representatives from each area looking at the operations function and a business analyst sitting across the online and offline parts of the journey
- research is carried out for the end-to-end user journey and fed into the operations team to improve the offline journey as well as the online service
- the team are carrying out qualitative and quantitative research into issues and queries that come through to the call centre to improve the digital journey
What the team needs to explore
Before their next assessment, the team needs to:
- collect evidence that Dynamics can be used on the devices Inspectors have access to. Defra provides devices to inspectors, including smartphones, tablets and laptops, and whilst the panel cannot be sure, it is highly likely that users will struggle to use Dynamics on smaller devices. The team should also assess whether there is an efficiency benefit in providing an in-house developed user journey rather than the native Dynamics front-end (which, in the absence of more definitive insight, should be considered a current 'minimum viable product' rather than an end state for the service)
- conduct more research and testing into the online versus offline offering, and develop a plan to improve digital take-up over time. There is a lot to learn from the use of the existing legacy service, e-Domero, and why some users are still choosing this route
4. Make the service simple to use
Decision
The service did not meet point 4 of the Standard.
What the team has done well
The panel was impressed that:
- the team were able to demonstrate how they have continually iterated and improved the content and interaction patterns used in the exporter user journey, and these designs had been validated with user research
- the team demonstrated several features such as the ability to clone submissions which would significantly improve the exporter user experience
- the team have made excellent use of research and data to ensure the service has a name that users recognise and understand
- the team demonstrated how they have reused existing patterns for things not found in the design system, such as file upload
What the team needs to explore
Before their next assessment, the team needs to:
- improve the Inspector-facing parts of the service. The demo given was extremely confusing and showed how easy it was for users to make a mistake, meaning extensive training would need to be provided for Inspectors in order for this to be usable. The UCD team had not had the scope to iterate and improve the Inspector user experience. By the next assessment, the panel would like to see the team applying their already established, excellent working practices to all parts of the service, end to end and front to back
- demonstrate how they have used data to prioritise their focus. The team have defined one of the core KPIs as the speed with which exporters receive a certificate, yet there was no evidence to suggest that continuing to iterate the application form would have an impact on this metric
5. Make sure everyone can use the service
Decision
The service did not meet point 5 of the Standard.
What the team has done well
The panel was impressed that:
- the application form for exporters has had an external accessibility audit, and the service team have addressed all the issues raised, helping to ensure the service is accessible
- the team have done some early thinking on assisted digital, such as the different kinds of channels users might need support on and at what point in the roll-out lifecycle this would be feasible to deliver
What the team needs to explore
Before their next assessment, the team needs to:
- confirm a formal mechanism to ensure that charges for offline support (resulting from the cost-recovery basis of the service) are entirely separate from any additional help needed to meet assisted digital or accessibility challenges with the service
6. Have a multidisciplinary team
Decision
The service met point 6 of the Standard.
What the team has done well
The panel was impressed that:
- there are multiple scrum teams (product, development) working together around the capability, and this is coordinated by the product owner with a clear roadmap
- the product owner's transition from the plant health inspectorate has really supported the join-up of the product team and the business unit
What the team needs to explore
Before their next assessment, the team needs to:
- have a plan for a sustainable model going forwards, not relying too heavily on contractors
7. Use agile ways of working
Decision
The service met point 7 of the Standard.
What the team has done well
The panel was impressed that:
- the team operate a scrum of scrums and come together regularly to agree on the priorities
- the team works closely with the rest of the department and there is a daily meeting where issues can be surfaced and escalated with the senior team
8. Iterate and improve frequently
Decision
The service met point 8 of the Standard.
What the team has done well
The panel was impressed that:
- the team have continuously iterated the service based on user research in order to make the public-facing aspect of the service as intuitive as possible
- the team were able to evidence multiple content iterations and design decisions that had been influenced by research
9. Create a secure service which protects users’ privacy
Decision
The service met point 9 of the Standard.
What the team has done well
The panel was impressed that:
- the team has built a secure infrastructure, using modern tools to assess and ensure that the service will protect users' data
- the team have a roadmap for enhancing security further
What the team needs to explore
Before their next assessment, the team needs to:
- implement this roadmap
- make sure the service is resistant to supply-chain attacks, eg npm package hijacking (a minimal sketch of one such safeguard follows this list)
- document what processes are in place in case of incidents such as data leaks: how to restore the service, root-cause analysis, interacting with stakeholders and so on
- generally implement all recommendations from the Service Manual for the Live phase
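As an illustration of the supply-chain point above, the sketch below shows one way a Node-based build could guard against dependency tampering. It is a minimal, hypothetical example: the panel has not seen the team's pipeline, and the script name and commands here are assumptions rather than the team's actual tooling.

```typescript
// ci-dependency-check.ts – a minimal, hypothetical CI guard against
// npm supply-chain attacks. Assumes a committed package-lock.json;
// this is not the team's actual pipeline.
import { execSync } from "node:child_process";

function run(cmd: string): void {
  console.log(`$ ${cmd}`);
  // execSync throws on a non-zero exit code, which fails the build.
  execSync(cmd, { stdio: "inherit" });
}

// Install exactly what the lockfile records, and skip install scripts,
// which are a common vector for hijacked packages.
run("npm ci --ignore-scripts");

// Fail the build if any dependency has a known high-severity advisory.
run("npm audit --audit-level=high");
```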
10. Define what success looks like and publish performance data
Decision
The service did not meet point 10 of the Standard.
What the team has done well
The panel was impressed that:
- the team have defined some KPIs that they will use to track their service
- the team is collecting data from a variety of sources
- the team has used data to help define the order that they are adding more functionality to the site
What the team needs to explore
Before their next assessment, the team needs to:
- create a performance framework that shows how they will measure their user needs for both external users and internal users (inspectors)
- understand where all the data will come from to meet the measures defined in the performance framework
- show how the collected data has been, or will be, used to measure and inform iterations in private and public beta
- explore the "so what" in the data. The panel can see that performance data is being collected, but would like to see evidence of how this is going to be used to monitor and improve user journeys
- ensure the cookie consent mechanism is in line with regulations and works appropriately (a minimal sketch of a consent gate follows this list)
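To illustrate the cookie consent point, the sketch below shows a minimal consent gate of the kind GOV.UK services typically use: no non-essential cookie is set, and no analytics script loads, until the user opts in. The cookie name, value shape and script path are illustrative assumptions, not the service's actual implementation.

```typescript
// cookie-consent.ts – a minimal, illustrative consent gate. The cookie
// name, JSON shape and analytics path are assumptions.
const CONSENT_COOKIE = "cookies_policy";

function hasAnalyticsConsent(): boolean {
  const match = document.cookie.match(
    new RegExp(`(?:^|; )${CONSENT_COOKIE}=([^;]*)`)
  );
  if (!match) return false;
  try {
    return JSON.parse(decodeURIComponent(match[1])).analytics === true;
  } catch {
    return false; // malformed cookie – treat as "no consent"
  }
}

// Would be wired to the banner's accept/reject buttons.
function recordConsent(analytics: boolean): void {
  const value = encodeURIComponent(
    JSON.stringify({ essential: true, analytics })
  );
  // One-year expiry, first-party only, sent over HTTPS.
  document.cookie = `${CONSENT_COOKIE}=${value}; max-age=31536000; path=/; secure`;
}

// Analytics only loads after an explicit opt-in.
if (hasAnalyticsConsent()) {
  const script = document.createElement("script");
  script.src = "/assets/analytics.js"; // illustrative path
  document.head.appendChild(script);
}
```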
11. Choose the right tools and technology
Decision
The service met point 11 of the Standard.
What the team has done well
The panel was impressed that:
- the service uses a modern tech stack, based on the Defra Common Technology Choices list
- the test stack is exhaustive (automated and manual)
- deployment is mostly automated
What the team needs to explore
Before their next assessment, the team needs to:
- re-examine why the service is written in three different languages, which seems too many given its relatively low complexity and may make the service harder to maintain in the future
- work with Defra Identity to remove the sign-on CAPTCHA, which is neither user-friendly nor accessible. Moreover, the protection a CAPTCHA typically provides is not needed here, as the service sits behind the Government Gateway
- revisit, or at least challenge, the processes and policy that impose infrequent releases. The CI/CD pipeline the team designed allows deploying to production multiple times a day, yet only one deployment per month is expected. Such infrequent releases are an impediment to good user-centred design, which requires fast iteration
12. Make new source code open
Decision
The service did not meet point 12 of the Standard.
What the team has done well
The panel was impressed that:
- the team are keen to make their work public, to demonstrate how the Government wants to be transparent, to show how it works, and to "give back" to the software community
- there is already a plan to make source code open
What the team needs to explore
Before their next assessment, the team needs to:
- publish the service’s source code
- finalise a solution to check that no secrets end up in the source code, as sketched after this list. The panel appreciate that the team have plans to do "secrets sniffing", but ideally this would have happened at the start of the beta phase, so that everyone could see, even before the service goes live, that the team's developers use modern software development practices and write good code
- consolidate the changes in departmental security policy, and the improved willingness to release code, that have come with the end of the EU Exit transition; any blockers or barriers can be raised with the Central Digital and Data Office to help address them
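On the "secrets sniffing" point, the sketch below shows the kind of check that could run pre-commit or in CI before code is published. The patterns are illustrative assumptions; in practice the team would more likely adopt an established tool such as gitleaks or truffleHog.

```typescript
// scan-secrets.ts – a minimal, illustrative secrets scan, run as
// `node scan-secrets.js <files...>`. Patterns are examples only.
import { readFileSync } from "node:fs";

const SECRET_PATTERNS: [string, RegExp][] = [
  ["AWS access key", /AKIA[0-9A-Z]{16}/],
  ["private key block", /-----BEGIN (RSA |EC )?PRIVATE KEY-----/],
  ["hard-coded credential", /(api[_-]?key|secret|password)\s*[:=]\s*['"][^'"]{12,}['"]/i],
];

let failed = false;
for (const file of process.argv.slice(2)) {
  const text = readFileSync(file, "utf8");
  for (const [name, pattern] of SECRET_PATTERNS) {
    if (pattern.test(text)) {
      console.error(`${file}: possible ${name}`);
      failed = true; // report every match before exiting
    }
  }
}
process.exit(failed ? 1 : 0);
```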
13. Use and contribute to open standards, common components and patterns
Decision
The service met point 13 of the Standard.
What the team has done well
The panel was impressed that:
- the service uses CSV as its import file type, rather than a proprietary format
- the service is built on the GOV.UK Design System
- the team has worked with members of the cross-government forms design community and intends to contribute to the new Forms task force
What the team needs to explore
Before their next assessment, the team needs to:
- feed back to the design system team any findings that would be of benefit to the community (eg insight on the design of the filtering component)
- contribute code or APIs where possible, for easy reuse; for instance, work with Defra to see if the plant name API could be made public
14. Operate a reliable service
Decision
The service met point 14 of the Standard.
What the team has done well
The panel was impressed that:
- the team have carried out many tests on the service, including performance, security and accessibility (internal and external reviews)
What the team needs to explore
Before their next assessment, the team needs to:
- make sure the system is not too tightly locked into the Microsoft ecosystem. Defra may one day change cloud providers, so it should be possible to migrate the service with as little rewriting as possible
- evaluate when, why and by how much traffic might increase once the service is live, based on realistic hypothetical events, then test the service against those numbers: usage peaks at specific dates and times, and incidents such as denial-of-service attacks or external APIs not responding. Make sure the service sustains the extra load or recovers gracefully, and document the results in case a root-cause analysis is needed in the future. A minimal sketch of such a test follows
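As an illustration of the load-testing point above, the sketch below is a minimal k6-style test. The URL, stage sizes and thresholds are placeholder assumptions; realistic figures should come from the team's own traffic estimates.

```typescript
// load-test.ts – a minimal, illustrative k6 script. The URL, stages
// and thresholds are placeholders, not the service's real targets.
import http from "k6/http";
import { check, sleep } from "k6";

export const options = {
  stages: [
    { duration: "2m", target: 50 },  // ramp up to an assumed busy period
    { duration: "5m", target: 200 }, // hold an assumed seasonal peak
    { duration: "2m", target: 0 },   // ramp down and observe recovery
  ],
  thresholds: {
    http_req_duration: ["p(95)<1000"], // illustrative response-time target
    http_req_failed: ["rate<0.01"],    // illustrative error budget
  },
};

export default function () {
  const res = http.get("https://service.example/start"); // placeholder URL
  check(res, { "status is 200": (r) => r.status === 200 });
  sleep(1);
}
```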