How LinkedIn turned to real-time feedback for developer tooling

Over the last year, we have been using real-time feedback to evolve our tooling and provide a more productive experience for LinkedIn’s developers. It’s helped us double our feedback participation, and more importantly, better tailor our recommendations and improvements. 

For any engineering organization looking to improve developer experiences, the following questions will provide a good starting point:

  • How can we make our developers more productive and happier? 
  • What investments should I make, as an owner of an internal-facing developer-focused tool, to best help my users? 
  • What if there was a way to get immediate feedback from developers right at the time they experience a problem or have a great experience? 

By sharing what we learned on our journey implementing real-time feedback, we hope to help other organizations improve their own processes and create a natural feedback loop between their developers and tool owners. 

The challenges of traditional surveys

For the past several years, we had relied on quarterly developer surveys as a way to reach engineers and solicit comprehensive feedback on the tools they used. However, we found ourselves running into a number of challenges with this approach: 

  • Developers sometimes provided opinions on tools that they hadn’t used in a long time. In some cases, we found that over 90% of the people who rated a specific tool had not used it in the six months before the survey was released.
  • Tool owners had very little context about the provided feedback. What was the developer providing a rating about? What specific tool or action were they referring to? This made large portions of the feedback unactionable.
  • It was difficult to measure what was going on between periodic surveys. When features and bug fixes were released between surveys, we couldn’t tell if the user rating was applicable to the “before” state or the “after” state. In addition, we couldn’t get a pulse on developers’ thoughts in between surveys.

At a high level, the results helped us understand general sentiment about the tooling ecosystem as a whole and allowed us to make significant progress on the tooling NSAT (net satisfaction) score. However, we lacked a solid understanding of the specific pain points and sought a more accurate pulse on how internal users felt about the tools at their disposal.

The solution: Real-time Feedback

To overcome these challenges, we developed a mechanism called Real-time Feedback: a system that first collects information about the actions developers take across our tooling ecosystem and then, based on this contextual information, decides if, when, and how to solicit feedback from the developer. 

For example, we might notice that a developer has just completed a deployment, and hasn’t been asked for feedback on any tooling activity in the last two weeks. This presents a case in which we can send an email asking for feedback on how the tooling experience was for that specific deployment. The seamless integration of feedback solicitation into the day-to-day workflow means that developers don’t need to sit down once a quarter and rely on their memory, but instead can just weigh in a little bit at a time, when they are automatically prompted for feedback.
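The decision rule described above can be sketched as a small eligibility check. This is a hypothetical illustration, not LinkedIn's actual implementation; the event shape, field names, and two-week cooldown are assumptions drawn from the example.

```python
from datetime import datetime, timedelta

# Minimum gap between feedback requests per developer (assumption from the
# example above; the real threshold may differ or be configurable).
COOLDOWN = timedelta(weeks=2)

def should_solicit(event: dict, last_solicited: dict[str, datetime]) -> bool:
    """Decide whether a completed action should trigger a feedback request.

    `event` is a normalized action record; `last_solicited` maps each
    developer to the timestamp of the last feedback request they received.
    """
    if event["status"] != "completed":
        return False  # only ask at the end of a flow, never mid-task
    developer = event["developer"]
    previous = last_solicited.get(developer)
    if previous is not None and event["timestamp"] - previous < COOLDOWN:
        return False  # throttle: this developer was asked too recently
    return True
```

In a real system, the cooldown bookkeeping would live in a shared store rather than an in-memory dict, and the rule set would likely weigh more signals (channel preferences, sampling quotas) than this sketch shows.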

Capturing context

When we deliver feedback to the owners of our tools, we are able to include context about what specific activity the developer was engaged in when they provided that feedback.

As noted above, one of the challenges with periodic surveys was extracting actionable information from the feedback. Comments that mentioned “sometimes” were especially difficult to act on. By knowing exactly what happened and asking in the moment, the feedback we pass along is much more precise, with additional detail on the specific session and use.

To capture this contextual information, we created a system that logs developers’ actions via multiple channels: internal web UIs (by utilizing Matomo), command-line interfaces (CLIs), and internal APIs (by tapping into our internal logging and auditing mechanisms that publish information to Apache Kafka). 
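Because events arrive from several channels, a normalization step that maps each channel's payload onto one common context schema is a natural building block. The sketch below is a hypothetical illustration of that idea; the channel names, payload fields, and output schema are assumptions, not the actual Matomo or Kafka record formats.

```python
# Hypothetical sketch: normalize action events from different channels
# (web UI analytics, CLI invocation logs, API audit events) into a single
# context record keyed by developer, tool, and action.

def normalize_event(channel: str, raw: dict) -> dict:
    """Map a channel-specific payload onto one common context schema."""
    if channel == "web":   # e.g., a page/action event from web analytics
        return {"developer": raw["user_id"], "tool": raw["site"],
                "action": raw["action_name"], "ts": raw["timestamp"]}
    if channel == "cli":   # e.g., a CLI invocation log line
        return {"developer": raw["user"], "tool": raw["command"],
                "action": "invoke", "ts": raw["ts"]}
    if channel == "api":   # e.g., an audit event consumed from a log stream
        return {"developer": raw["principal"], "tool": raw["service"],
                "action": raw["operation"], "ts": raw["event_time"]}
    raise ValueError(f"unknown channel: {channel}")
```

Keeping every channel behind one schema means the solicitation logic and downstream reports never need to know which channel an event came from.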

Context capturing is a key element of our strategy, and is used for more than just feedback. We plan on using it as the foundation of a new system that will help developers get help with tools. The theory is that if we have the full context of what the developer was doing when they asked for help, it’s going to be much easier to provide that support.

Participation rates

For any survey process, a major factor in its efficacy lies in the willingness of the user population to respond. Indeed, as we transitioned to the Real-time Feedback methodology, one major concern was maintaining consistent participation rates.

It turns out that using a real-time solicitation approach was much more effective, in terms of population coverage, than our traditional surveys. We managed to double our survey-participation population (from roughly 15% in the periodic surveys, to over 30% with the real-time approach). 

Thanks to this increased participation, we’re now able to implement better market segmentation on the developer population. At LinkedIn, there are many kinds of developers—UX developers, backend developers, site reliability engineers, and machine-learning experts, to name a few. These different developer types—groupings/cohorts of developers—have different needs, usage patterns, and productivity issues. More precise targeting and segmentation will lead to better, more personalized tooling that caters to the specific needs of each developer type. 

We’re also cognizant not to overwhelm developers with too many requests for feedback—otherwise, participation could drop dramatically. We therefore implemented smart throttling mechanisms to ensure that we only ask for feedback when a developer is done with their intent (that is, they are at the end of a flow). The solicitation itself is centered around their experience, rather than around a specific tool or layer of the tech stack. For example, we might ask them how their whole deployment went, as opposed to how their experience was with just one tool.

We also make sure to honor preferences with respect to feedback solicitation. Developers can choose how often they want to be asked for feedback, and what channels (such as email, web, Slack, etc.) they want to be asked on. We believe that allowing for this flexibility so that delivery of feedback is incorporated into each developer’s workflow has also played a role in increasing participation.

The supported channels include: 

  • Email: This relies solely on the email client to capture the feedback (single-click, based on mailto links). 
  • Pluggable UI widget: We developed an in-product pluggable UI widget that can serve all kinds of solicitation mechanisms, both passive and active (e.g., in-line, toast notifications, and pop-ups). 
  • Slack: By integrating with instant messaging, we also developed a way to collect feedback over Slack. 
  • Web portal: We developed a web portal that allows developers to provide feedback as a stand-alone experience, as well as interact (i.e., voting, commenting) with what other users may have reported. 
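To illustrate how the single-click email channel can work, the sketch below builds a `mailto` link that pre-fills the subject with a session identifier and a rating, so one click opens a ready-to-send feedback email. The feedback address, link format, and `rating_link` helper are hypothetical assumptions, not LinkedIn's actual scheme.

```python
from urllib.parse import quote

# Illustrative destination address; the real system would use an internal
# alias whose inbox is parsed automatically.
FEEDBACK_ADDRESS = "tool-feedback@example.com"

def rating_link(session_id: str, score: int) -> str:
    """Build a mailto URL encoding the session and the chosen rating.

    Embedding one such link per rating value (e.g., 1-5) in the
    solicitation email gives the reader a single-click way to respond.
    """
    subject = quote(f"Feedback {session_id} rating={score}")
    body = quote("Optional comments:\n")
    return f"mailto:{FEEDBACK_ADDRESS}?subject={subject}&body={body}"
```

Because the rating rides in the subject line, the collector only needs to parse incoming mail headers; the body stays free-form for optional comments.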

Taking action on feedback

Listening to feedback is a great starting point, but acting on it is what motivated us to create Real-time Feedback. Once we gather feedback from the various listening channels, we synthesize it into shareable reports (documents and other offerings, such as dashboards). We make sure that the relevant teams act on the feedback, and when they choose not to, that the reasoning is shared back with the developer who provided it. Completing this loop is important to sustaining a healthy culture of feedback: the providers of feedback know they’re being listened to and continue providing valuable input.


Our “Listen, Act, and Share” framework

Fostering those two relationship types—inwards (towards tool developers) and outwards (towards the engineers across the company)—allows us to have a symbiotic relationship. We have our ears to the ground gaining insights about what will make developers more productive and happy, and are also communicating with tool developers to suggest improvements that will better help the audiences they serve. 

Insights gained from feedback

We’d like to share two examples of major insights that we gained from the subjective feedback we’ve collected.

  • Our in-house CI system provides estimates of the expected pipeline runtime. It turns out that these estimates, which were sometimes inaccurate, led developers to doubt the reliability of the tooling itself.
  • Our canary-deployment monitoring system (EKG) added a protection layer that fails the analysis when traffic shifts occur, to make sure developers are aware the analysis is unreliable, since the shift may have changed the control’s behavior. Unfortunately, developers ended up concluding that EKG itself was unreliable and ignored the analysis. After receiving feedback about this phenomenon, we improved our systems and now handle this case in a better way.

To summarize, realizing the blind spots of our previous approach with traditional surveys helped us rethink how we collect feedback. By creating Real-time Feedback, an omnichannel, contextual way of collecting feedback from developers, we’ve been able to increase both the quantity and quality of feedback received, driving more actionable insights into how we can better support our developers.


We’d like to thank Vineet Juneja who’s helped guide us in this area based on his previous experience; Ben Lai, Awais Tariq, Narsi Nagampalli, Jeff Galdes, and Jared Green from the management team; Naman Jain, Troy Holsapple, Sahil Patwardhan, Yuting Sun, Aaron Dai, Barry Warsaw, and Max Kanat-Alexander from the engineering team (a special shoutout to Max for his help with this article!); our UX partners, Arun Yegappan, Kuan-Ying Chen, and Kyle Smith; and finally, our Data Science partner, Yue Wu. Many teams across LinkedIn collaborated with us to roll out Real-Time Feedback—we are thankful to them all for their support.
