UX & product design
Deep dive
I worked at DLUHC in the Funding Service Design team, responsible for the design of onboarding, applications and assessments for new funds. After Brexit, DLUHC lost access to the EU's funding management tools, and there was no single service in central government for managing applications and assessments.
I worked on the DLUHC project for almost 12 months, during which I led the design of various product enhancements and new features for the assessment and application tools. I worked with different fund teams, such as the Community Ownership Fund, NightShelter, High Street Rental Auctions and more. As this was a central government project, we used coded GDS prototypes extensively before eventually moving to Figma. Everything was designed following GDS and accessibility best practice, as we were preparing for a GDS assessment pre-election.
Applicants could apply to a fund multiple times, at various points in the application process. An applicant might own multiple assets in their community, and each asset required a new funding application. This created a lot of repetition when applying for multiple funds, which was flagged in the analytical data we had access to on time taken to complete an application. We were unable to conduct observational research with applicants for legal reasons, so we had to use the data we had to make improvements.
Using the data we had access to, we carried out desk research to highlight the problems, and we also implemented feedback forms into the service to capture additional user feedback. We then identified the repetitive data entry points in the application. We worked with the technical partner on the project to understand any technical considerations that design needed to be aware of before ideating a suitable solution.
I designed a solution whereby users could reuse information from a previous application. In this early iteration of the feature, the information identified as reusable was organisation information, which often did not change across multiple applications from the same applicant. To keep the data valid, applicants still needed to go through the application and review the auto-filled data, ensuring it remained accurate and up to date. The impact of this design was that applicants saved a lot of time, allowing them to focus on the value of their asset instead of repetitive data entry.
Assessors needed to be able to understand all actions taken on an assessment to make informed decisions. They also needed to see all previous actions taken by other assessors, to get a clear view of what was done, why and when. There was no easy way to track these actions in the tool, and it was crucial to have an overview of actions when deciding whether an application was viable for funding.
We used desk research and observational research methods, which highlighted the difficulties in picking up an assessment from other assessors. There was currently no way to see all actions taken on an assessment in a format that was easy to understand and accessible. An audit trail of actions was vital to see what had taken place prior to a decision on an assessment.
I designed a solution whereby users could see an activity trail for an assessment, showing an assessor all actions taken on it. Additional filtering was included to allow for better control of the data when being audited. We went through a multi-stage delivery of this feature with the technical partner, ensuring that GDS best practices were followed, in addition to progressive enhancement to improve the user experience. We received positive feedback from Grade 7 Assessors, highlighting how this would improve their ways of working in assessment teams due to the flexibility and complete overview this solution provided.
We used observational research to understand exactly what an assessor needed to see at a high level when viewing more than one fund. We went back and forth with assessors to understand their needs and refine the design.
A new design was implemented that improved the layout, with the addition of search and filtering features to improve oversight of many funds and the rounds/windows within them.
There was a lack of design consistency on this page, and it also lacked the ability to leave comments at a high level.
We used desk research and observational research to gain a deeper understanding of what the user was experiencing. This highlighted that the current user experience was messy and unclear, and felt disconnected from the rest of the tool. We discovered that comments were being lost deep within assessment sections, and this was impacting the overall funding decision. Using GDS best practices and guidelines, in addition to user patterns already used elsewhere in the tool, I designed a new layout for the assessment overview page.
The new design incorporated design patterns and layouts already used elsewhere in the tool to create familiarity and a better user experience. Assessment actions were now stored neatly on the left, instead of in various locations at the top of the page. This new layout also allowed new features to be introduced into the tool without creating a messy or unclear interface. The new design was received very well by assessors from multiple funding teams, who all highlighted that it made the tool easier to use and the assessment process more straightforward.