Reflection - service assessment

Good news, we passed the service assessment!

What does this mean for the team and the product?

It means our service has met a rigorous government standard and can be considered a “Live” service.

On a practical level, it means the product will continue to be funded because it’s deemed valuable to citizens and performs to a high technical standard.

I’d like to take a bit of time to reflect on the service assessment experience.

What went well

Practice, practice, practice

First and foremost, I’m glad we practised, practised, and practised some more.

I gained a deep understanding of our systems and felt prepared for the questions from our assessors.

Secondly, it was productive to frame and practice the discussion around assessment criteria.

Technical standards and expectations can vary across government departments, so it was useful to have a predefined set of criteria to refer back to. It focused the conversation and kept us on the same page.

Finally, it was especially helpful to have weekly check-ins with the different team specialisms (UX, design, content, infrastructure, development, architecture, product, and delivery).

We reviewed each other’s work, provided feedback, and supported each other.

Clarify feedback

The initial feedback from our first assessment was unclear, and I’m glad I advocated for a clarification session with the technical assessor.

Through that conversation we realised there were assumptions that needed checking on both sides, and it ended up being a fruitful discussion about how to improve the service in the short, medium, and long term.

It also meant we could de-scope work because there had been a miscommunication, which is always a win.

Collaborate with experts

I reached out to the cybersecurity team to assist with the final assessment, because it’s impossible to know everything and important to recognise the boundaries of our own knowledge.

This was a good call, and I’d recommend it to other teams going through a similar assessment process.

Their expertise and support were invaluable.

I also learned a lot about attack vectors, best practices for securing data, and Wi-Fi security. I’d love to do a secondment to their team to dig into these ideas further.

What could’ve been better

Collaboration with assessors

Each assessor brings a slightly different perspective to the assessment. This is only human: we all come from different specialisms, as well as different government departments and professional cultures.

It’d be helpful if the assessment process were geared toward a collegial conversation rather than a grilling, although the format does lend itself to the latter.

Perhaps this says more about my discomfort with that kind of adversarial conversation, because at the end of the day it pushed us to deliver an improved service.

Highlight session best practices

Communication styles and expectations differ across teams, let alone across different parts of government.

I recommend that teams going into this process set session expectations and best practices at the start of the assessment.

Things like using the raise-hand feature (if remote) and not speaking over or interrupting one another can make a big difference to keeping the conversation running smoothly.

Conclusions

Navigating and leading the technical team through the service assessment was one of the most challenging leadership activities I’ve done in my time as an engineer.

It required a truly deep dive into our product and intense collaboration across our specialisms, as well as with external teams.

Clear, constant communication was invaluable, and I’m glad I had the tools, skills, and experience to do it effectively.

Most importantly I’m grateful for the team.

It was such an honour and a pleasure working with my teammates on a project we care deeply about.