This past spring I initiated a 360-degree review of myself. The general goal of a 360° evaluation is to gather feedback from people who relate to an employee in different ways within the organization. The name “360” refers to people on all “sides” of the employee in the hierarchy: people who report to the employee, their supervisor(s) or others above them, lateral colleagues and peers, and more. I initiated this activity to learn how people experience my leadership, supervision, communication, and facilitation, but also to learn about the process as a practice in general. Here’s how I approached it, along with reflections on process elements in case this project is of interest to others.
Question Design
As I conceptualized this process, a colleague reminded me that everyone has different values and beliefs about leadership. I realized the evaluation would be more effective if I shared what I was trying to do (e.g. communicate clearly, provide opportunities for input) and asked whether I was accomplishing that. Drafting questions based on my values and the areas that matter most to me produced feedback that was more meaningful.
Working with a Reviewer
One of my priorities was to create a process where people felt comfortable giving honest, quality feedback without holding back, and that would draw out perspectives I don’t currently have or can’t see. I opted to work with a trusted colleague who served as the reviewer, receiving the responses and summarizing them for me. Respondents could choose to give me access to their original comments, but this approach created an additional level of anonymity for folks who wanted it. I would take this route again; I appreciated having someone help me make meaning of the results.
Invitations and Response Rate
I invited 46 people to complete the review and received 30 responses, a response rate of about 65%. I built the Qualtrics survey, but my reviewer copied it and sent it out so I wouldn’t have access. The challenge in this format was not being able to monitor response rates: I had no way of knowing whom to nudge to complete it. I also found myself wondering what perspectives I missed because I made a targeted invitation rather than putting out a broader call.
Explaining Myself
In the early stages of this process, I encountered a few folks who heard I was doing this and asked, “What problem are you trying to solve?” There seemed to be a common perception that 360-degree reviews are for performance issues rather than part of a natural feedback cycle that helps leaders see potential areas for growth. When I sent out invitations, I included a document that detailed the rationale and purpose of the exercise. I also think that if we normalize this feedback loop, it won’t be viewed as reactionary or negative.
Unpacking Results
After the reviewer had completed the analysis of findings, we met and he walked me through a summary of responses by section and general themes, and he provided me with a Box folder containing the responses of the 19 participants who gave permission for me to read them. I appreciated the space to ask the reviewer follow-up questions and have a conversation about what I was hearing in the data. Being able to review the raw data for some responses was by far the most valuable part of this process, as it added nuance and context to individual comments. Following my own exploration of the data, I circled back with the invited participants to thank them for their insights. For the folks who report to me directly, I shared results and takeaways in more depth.
Strengths & Limitations
Spending time reflecting on my values and what I wanted feedback on was not originally part of my process, but it proved to be an invaluable exercise. Giving folks the opportunity to release their verbatim responses to me, with their permission, is an element I would keep; it gave me additional rich data.
While the format was helpful for gathering a range of feedback from a group, the anonymity was challenging at times. Some comments and suggestions would have been better with context, and some seemed so specific to one respondent that it would have been more useful to gather that feedback in a 1:1, where we could have created a plan to better meet those needs moving forward.
Overall, I really enjoyed both the process of setting this up and the learning that came from the results. I think this is a useful tool within the feedback toolbox, and I would recommend it for folks in any type of position. Questions? Comments? Interested in seeing the survey tool I created? Please be in touch; I’m happy to share more about my process and questions.