How to run and optimise a 360-degree feedback process in a small team

Inga Driksne
14 min read · Dec 4, 2020

--

“I need to review two members of the team before the end of the week and I haven’t prepared a single thing — help me!”

This is how a 360-degree feedback process started at GoSquared (an all-in-one growth platform for SaaS businesses) when James and I caught up in June. I had been supporting and advising James Gill and the rest of the leadership team for 3 months by that time, and found out that they had never run an “official” feedback process at GoSquared, let alone one for the whole team.

In other words, not even the co-founders really knew how well they had been performing this entire time. It didn’t take much to convince James that we shouldn’t just review the 2 members of the team, but the whole team instead, including the 3 co-founders.

For those who are unfamiliar with a 360° feedback process, here is a good definition (stolen from Wikipedia — the source of all truth, obviously):

“A 360-degree feedback is a process through which feedback from an employee’s subordinates, colleagues, and supervisor(s), as well as a self-evaluation by the employee themselves is gathered. Such feedback can also include, when relevant, feedback from external sources who interact with the employee, such as customers and suppliers or other interested stakeholders.”

In other words, you try to get feedback from as many different sources and angles as possible, to build a 360-degree view of an employee’s strengths, weaknesses, performance at work, cultural fit and so on.

Warning — this is going to be a pretty thorough and detailed read, so tuck in with a cuppa (or some vino). Here is what we will be getting into:

Part 1:

  • Running a 360° feedback process in a small team for the first time
  • Results, feedback and learnings from round 1

Part 2:

  • Optimising a 360° feedback process for round 2
  • Results, feedback and further learnings

Part 1: Running a 360° feedback process

At Duel I used the SKS (Stop/Keep/Start-doing) framework to collect and categorise 360° feedback. I can’t quite remember how I came to know about it, but I’m pretty sure it was my then-boss Paul Archer who came to me one day and said it was the framework Netflix used.

At Duel we asked 3 qualitative questions and 3 “scoring” questions to keep things fairly simple:

Qualitative (open-ended) questions:

  • What should [Name] stop doing?
  • What should [Name] keep doing?
  • What should [Name] start doing?

Quantitative (rating out of 4) questions:

  • What is [Name’s] overall performance this quarter?
  • What is [Name’s] contribution to the company culture?
  • What is [Name’s] contribution to team work this quarter?

This was a good start, but after a quick round of feedback from the founders and a few team members at GoSquared, we ended up adjusting the 3 qualitative questions to better suit their team and company culture:

  • What could [Name] be doing more of? (i.e. things this teammate is currently doing well but not enough of)
  • What could [Name] be doing less of? (i.e. any development areas + provide suggestions on how to improve)
  • What should [Name] continue doing? (i.e. give praise and celebrate this teammate’s achievements)

We also agreed on a few other factors for how to run this process:

  1. All answers (apart from self-assessments) will be kept anonymous.
  2. Only I will have access to the responses (as a neutral external party, rather than the leadership team) to keep things as objective as possible.
  3. Once all the feedback is collected, I will summarise it for each team member and share it with their line manager, who will then go over it in a 121 session with each employee.

Results

Slack message from James introducing a 360° feedback process at GoSquared

James shared a link to the Typeform on the 3rd of July, and 10 days later:

  • We received a total of 47 responses (including self-assessments)!
  • That’s almost 6 feedback reviews per team member
  • Which amounted to a total of 35 pages of feedback

For a team of 8 people at the time, it was a very promising start, not to mention that all of this was happening right in the middle of a new OKR cycle.

But there were also a bunch of things that didn’t go so well:

  1. There was confusion around who to review and some of the questions had many overlaps, e.g. “could be doing more” vs “could be doing less”, or contribution to “company culture” vs. “team work”.
  2. It took more than the originally predicted 10 minutes to review each team member, because people needed to collect their thoughts first, then scribble some notes down, and only then submit their feedback.
  3. I ended up using an unnecessary number of tools, jumping from Google Sheets (to filter all the responses by name) → to Google Docs (to copy/paste responses, highlight themes and summarise feedback in a few bullet points) → to Notion (to share with the rest of the team).
  4. 360° summaries took forever to compile and finalise (on average 2.5 hours per team member, if not more), and I also spent a significant amount of time writing an “executive summary” for everyone (which, to be honest, no one had asked me to).
  5. It took over a month to run this entire process, from creating the initial Typeform, all the way to 360° summaries being delivered to everyone on the team.
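The tool-hopping in point 3 is the easiest part to automate. As a minimal sketch, a small script could group the form export by reviewee in one step — assuming the export is a CSV with one row per submitted review and a “Reviewee” column (the column names and sample answers below are hypothetical, not the actual GoSquared form):

```python
import csv
from collections import defaultdict
from io import StringIO

# Hypothetical form export: one row per review, one column per question.
SAMPLE_CSV = """Reviewee,More of,Less of,Continue doing
Alice,Pairing with new hires,Long stand-ups,Weekly demos
Bob,Writing docs,Context switching,Customer calls
Alice,Sharing roadmap updates,,Mentoring
"""

def group_by_reviewee(csv_text):
    """Group raw feedback rows by the person being reviewed."""
    grouped = defaultdict(list)
    for row in csv.DictReader(StringIO(csv_text)):
        grouped[row["Reviewee"]].append(row)
    return dict(grouped)

if __name__ == "__main__":
    # One bundle of feedback per teammate, ready to summarise.
    for name, rows in group_by_reviewee(SAMPLE_CSV).items():
        print(f"{name}: {len(rows)} review(s)")
```

The output of this grouping step could then be pasted straight into a per-person summary doc, cutting out the Sheets-filtering stage entirely.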

Feedback on Feedback

Since this was the first time running such a process at GoSquared, it was extremely important to follow up and collect as many suggestions and tips on the process from every team member as I could. The problem was that by the time all the line managers had delivered everyone’s summaries, 2 more weeks had passed, so some already struggled to remember every detail of the process.

So I prepared a list of questions for some guidance:

  • Was it straightforward to decide which team members you needed/wanted to review?
  • How did you find the format of submitting answers via Typeform?
  • Is there anything that could have made the submission process easier?
  • Did all the questions make sense, and were they easy to answer (including the scoring questions)?
  • Were there enough questions, too many or too few? Did you mention everything you wanted? What other questions would have helped you review a team member?
  • Was 10 minutes a fair estimate to review a team member, or did it take longer or shorter?

Aside from a few items mentioned above already, I got A LOT more feedback on the 360° feedback process — anything from design to format, to even suggestions of establishing a more open feedback culture at work.

Part 2: Optimising a 360° feedback process

Since the first round of 360s took me over a month and we agreed to do them quarterly, it was time to do another one… 2 months later. This time round I started off by organising a 15-minute Zoom call with the team to go over all the feedback they had given me and to remind them of the process again.

This was my top line summary of improvements from the first round:

1. Collecting feedback for the next round of 360s

  • Some team members mentioned that it took them way longer to complete 360s because they needed to collect their thoughts, think of specific examples and so forth.
  • Thanks to a recommendation from one of the co-founders, I created a “360° Feedback Template” for the team to use throughout the quarter, making notes in real time, on the assumption that this would make submitting 360s in round 2 much more efficient.
  • Sounds great in theory, but what happened in practice? When I asked everyone on the call how many had actually used the template… the answer was: no one. Not even the co-founder who had the original idea.

2. Creating a culture of feedback — work in progress?

  • This point certainly deserved most of the attention as we ended up having an open discussion about how everyone on the team felt about giving more real-time feedback (rather than every 3 months), and whether the team felt like they were on their way to creating an open feedback culture.
  • Personally, I think feedback should start with the leadership team. When I was working at Duel, Paul, the co-founder and CEO, was amazing at always asking for feedback. At the end of every sales call, investor pitch or strategy meeting, he always asked the same question: “What could I have done better?”. He also challenged the rest of the team to always reflect and give their opinion on how they thought the meeting went.
  • Feedback culture is not an easy thing to achieve, but it has to start somewhere. And it should start with you asking for feedback. It’s a lot easier to give feedback but much harder to ask how you can improve.

3. Who to review?

  • Some team members found it hard to judge who to review from the original description of “colleagues you work closely with”, so this time I provided a baseline table in Notion showing who should review whom. I also added “interact 3 or more times a week” for a clearer definition. Either way, the general rule of thumb for a team of 10 was that more is better.
  • In addition, I also reached out to a few external parties for round 2. The recommendation came from Russell, the Sales Engineer at GoSquared, who wanted to know how well he had been working with his largest customers and partners.

4. Tool for collecting feedback

  • I made a switch from the sexy Typeform to the classic Google Form. Some team members wanted to see all the questions upfront so that they could structure their responses better, and have a clear overview of what they had written before submitting.
  • Typeform also didn’t allow an ‘out of 4’ rating for the quantitative questions, which made the scoring a bit flawed, as the majority chose the “safe” option of 4 out of 5.
Not so sexy (but more practical) Google Form

5. House-keeping

Finally, I outlined all the other smaller amendments I had made for round 2:

  • Anonymity — was now optional rather than set by default.
  • Questions — had better explanatory notes. I also combined “contribution to company culture” and “contribution to teamwork”, and added a new question instead: “How much feedback from the previous round do you think you / this teammate had taken on board?”.
  • Scale — was changed to qualitative options of “Outstanding”, “Strong”, “Moderate”, and “Needs improvement”.
  • Examples — I emphasised that people should think of more examples when giving feedback as these have been the most useful when delivering 360° summaries.
  • Time — I also reminded that everyone should now estimate around 15 minutes per review.

Finally, I blocked two 2-hour slots for the whole team to ensure they dedicated some time to completing 360s, and set the following deadlines:

  • Everyone to complete the 360s: Fri, 2 Oct
  • Inga to complete 360 summaries: Fri, 9 Oct
  • Line Managers to deliver 360s in 121 sessions: Fri, 16 Oct

These deadlines ensured that the entire process would be completed halfway through the month, and not at the end like last time. However, they posed a couple of challenges:

  1. I was asking the whole team to complete 360s in 1 week, during the same week as the new OKR cycle (even though we already knew this had made things quite challenging the first time round).
  2. I was giving myself only 1 week to complete all the summaries. Last time, a one-page summary took me 2.5–3 hours per team member on average, and I finished them all in about 2 weeks (spread across and in between all other ongoing Ops projects).
  3. Finally, the line managers also took around 2 weeks to deliver all the 360° summaries. Half of the team members had their 121s one week and the other half the following week, so coordinating this was going to be tricky.

Having said that, halving the time it took me to run a 360° feedback process was my operational goal from the beginning so as challenging as it seemed at first, I had to get there (somehow).

Results from Round 2

I shared the Google Form with the team on the 25th of September, and these were the results:

  1. Time taken for the entire team* to complete 360s: < 2 weeks
  2. Total number of reviews (including the self-assessments): 67
  3. On average almost 7 reviews per team member (a ~14% increase)
  4. Average time taken per 360° summary: 1.5 hours (a 40% decrease)
  5. Date when all 360° summaries were delivered: Tue, 20 Oct

OK, let’s take a closer look at this.

  1. The team didn’t start reviewing their teammates until the 1st of October (almost a week after the 360s form was shared). The reason: everyone’s focus was on the ‘OKR Week’ and not 360s. They simply got de-prioritised and pushed aside (regardless of my follow-ups). In addition, one of the co-founders was playing catch-up after being away in the middle of the process. The timing simply wasn’t great.
  2. The team was now 10 people (including myself) and we managed to increase the average total number of reviews per team member to 6.7 from 6 last time! This was due to more people reviewing each other and the additional 7 reviews we received from the external parties.
  3. I tried a different format for my 360° summaries this time round — less filtered, longer format, with more original quotes and examples. It meant that the summaries were now 3+ pages long but it also enabled me to cut down 1–1.5 hours per summary, saving me a total of 10+ hours of work!
  4. This was by far my proudest result. I set the original deadline as Fri, 16th of October which meant that I only missed it by 2 working days! This was a big achievement, given that not all 360s were completed in one go and not all summaries were written in 1 week.

Prioritise

The trick here was to focus on who had their first 121 with a line manager coming up and to work backwards, i.e. making sure that everyone on the team had reviewed that team member first, and then prioritising writing their summary over anyone else’s. By coordinating this prioritisation, all summaries were delivered a mere 2 working days after the original deadline!
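In code terms, this prioritisation is just sorting the summary queue by each teammate’s upcoming 121 date, earliest first. A minimal sketch (the names and dates below are made up for illustration):

```python
from datetime import date

# Hypothetical 121 schedule: each teammate and the date of their
# next session with a line manager.
one_to_ones = {
    "Alice": date(2020, 10, 12),
    "Bob": date(2020, 10, 19),
    "Chris": date(2020, 10, 13),
}

def summary_queue(schedule):
    """Order teammates by their upcoming 121 date, earliest first."""
    return sorted(schedule, key=schedule.get)

print(summary_queue(one_to_ones))  # ['Alice', 'Chris', 'Bob']
```

Working through the queue in this order means every summary is ready just before the 121 where it is delivered, rather than all at once at the end.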

Retro time!

Even though the whole process took me 14 days (counting from the time the first 360° review was received to the last 360° summary being delivered by a line manager), there were still a lot of things to learn and improve on for the next round. I wanted feedback on the process to be more real-time, whilst still fresh in everyone’s minds, so I also made a few changes to how I collected it, e.g.

  1. I added a question at the end of the Google Form, asking for feedback on the process,
  2. I followed up with every team member after their 121 session to see how they found the new longer format and whether there was anything else they wanted to add or reflect on, and
  3. I did a retro* session (having collected all the feedback from the team), ran it past the leadership team for any additional feedback, and agreed on improvements for the next round.

*For those who are unfamiliar with a Retrospective session, you can find more information here.

Further Learnings

The whole team was very grateful for having run the 360° feedback process again but like with anything — there is always room for improvement!

So what could have gone better?

1. Having a deadline in the same week as the OKR week — this was by far the biggest learning.

  • To avoid this next time, I proposed moving the OKR week 1 week earlier, so that we can run the 360° feedback process the following week and have all summaries delivered by mid-month again.

2. Reviewing a team member took 5 minutes longer on average, i.e. 20 minutes in total. Also, blocking time for the whole team didn’t work; everyone completed their 360s differently.

  • Knowing the average time per review, and how many team members each person needs to leave feedback for, gives me a much more accurate time estimate per teammate. For example, by telling our CS Lead that 360s will take her around 3 hours in total, she can plan for it better.
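That estimate is simple arithmetic: reviews to write multiplied by the average time per review. A back-of-the-envelope sketch (the 20-minute figure comes from round 2 above; the reviewer count of 9 is illustrative, e.g. everyone else in a team of 10):

```python
MINUTES_PER_REVIEW = 20  # round-2 average time per review

def estimate_minutes(num_reviews, minutes_per_review=MINUTES_PER_REVIEW):
    """Total 360 time for one reviewer, given how many reviews they write."""
    return num_reviews * minutes_per_review

# e.g. a lead reviewing 9 teammates: 9 * 20 = 180 minutes
print(estimate_minutes(9) / 60)  # 3.0 hours
```

Trivial as it is, sharing this per-person number upfront is what lets each teammate block out a realistic slot in their calendar.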

3. There was a flaw in anonymity, e.g. in one case 2 out of 3 reviewers decided not to be anonymous, so it was relatively easy to guess who the 3rd person giving feedback was. The extended feedback format also made it easier to guess people’s writing styles, so the focus shifted to who said what rather than what the overall feedback was.

  • We agreed to keep anonymity optional but only remove names if the above scenario happened again.

4. Some team members struggled to answer “How much feedback from the previous round do you think you / this teammate had taken on board?”, whilst others couldn’t remember what actions from the last feedback round they committed to.

  • I decided to push for “Key Takeaways” again and asked every team member to write down 2–3 items from their 360° summary that they wanted to improve on in the next 3 months. This essentially turned into a “Personal Goals” table, which is now linked to the 121 catch-up template in Notion as a reference point. In other words, every time a line manager catches up with a team member, they check in on OKRs as well as Personal Goals.

Things that went well and we decided to keep for Round 3:

1. Collecting external feedback for Sales and Customer Success Leads (or for anyone else who works closely with external parties such as customers, suppliers, partners, collaborators).

“Inga I think you did a great job. I absolutely loved the external feedback!” — Russell, Sales Engineer

2. Qualitative rating system for performance, contribution to company culture and teamwork, and previous feedback taken on board — was received much better this time round.

“Also — preferred the rating system this time!” — Matt, Front-end Engineer

3. Displaying names of those who didn’t want to stay anonymous in 360° summaries (though only 40% of the team didn’t stay anonymous).

“I really like that the names are there…, and I think it creates a nice culture of feedback where there’s no shame or scare in giving and receiving feedback.” — Beth, Content Lead

4. More examples and longer format meant that there was more context to feedback this time round which helped the team a lot. Plus it saved me a heck of a lot more time!

”I found the session really helpful. It’s great to get such in-depth perspective from others in the team. It’s always hard to know one’s own blindspots or assumptions, so having feedback from others in this manner is insightful.” — Chris, Customer Support Associate

5. Prioritising 360° summaries in line with the upcoming 121 sessions with line managers.

“Quicker delivery made it more relevant.” — James, CEO

“Apply the concept of 360-degree feedback to the process of running 360s.”

360 the 360s

This process is far from being perfect but in just 2 rounds I managed to optimise the time it took me to run and complete it by 50% (give or take).

One of the key learnings to take away from this is simple: apply the same concept of 360° feedback (i.e. getting as many reference points as you can) to the actual process of running 360s (feedback inception, perhaps?). Collect ideas, feedback and suggestions from team members and the leadership, reflect on it yourself, and run the process again to see how you can improve, again and again.

Oh, and one last thing — any feedback process (be it 360-degree or not) — ALWAYS takes longer than you would expect so always leave some extra room!

P.S. Thanks for reading! This past month I’ve been doing a deep dive into salary review processes and bonus schemes, and will be sharing a summary of my findings and research with you later this month, so stay tuned ;-)

--

Inga Driksne

Interim Ops Lead and Advisor to early stage SaaS founders. Interested in supporting health tech and female-led startups.