Descriptive to Prescriptive Analytics and Beyond for Developer Productivity


Developer productivity teams make the most impact by presenting the right insights to the right people at the right time. Down in the trenches of development, developer pain is often obvious and acute. But at the mile-high view of the executive level, understanding bottlenecks and priorities conjures the imagery of a crystal ball. See how the Developer Insights team integrated subjective and objective telemetry sources into analytics presented through a single pane of glass to all levels. At LinkedIn, development occurs in an ecosystem, and many kingdoms of biological diversity exist: backend, frontend, mobile, and more. Each of these kingdoms has patterns and nuances that no single team can understand alone. Having moved through the four kinds of analytics: descriptive, diagnostic, predictive, and prescriptive, the Developer Insights team is now looking beyond. As LinkedIn continues to scale, the team is turning to data science and AI models for analytics and services.

Slide Notes

Title Slide

  • Smile! Excited to be here.
  • Descriptive to Prescriptive Analytics and Beyond for Developer Productivity

DPH Slide

  • Measure the experience of developers and gather feedback to provide actionable insights to teams and partners.
  • Richard Hamming
  • Data-informed decision making.

Scale Slide

  • Some are snapshot values and some are per quarter.
  • Call out builds as developer builds; SLOC is a snapshot; merges are like PRs.
  • Not everything grows linearly! Social connections grow by the square of network size. Software complexity, like dependencies, sometimes grows exponentially!
  • Story: “Weather” analogy — thermometers, barometers, altimeters, and more!
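The growth rates above can be made concrete with a small sketch. This is an illustrative toy, not LinkedIn data: linear growth, quadratic growth of possible social connections, and worst-case exponential growth of dependency combinations.

```python
def linear(n):
    # Grows in direct proportion to n, e.g. headcount.
    return n

def quadratic(n):
    # Possible connections in a network of n members: n * (n - 1) / 2.
    return n * (n - 1) // 2

def exponential(n):
    # Dependency combinations can grow exponentially in the worst case.
    return 2 ** n

for n in (10, 20, 40):
    print(n, linear(n), quadratic(n), exponential(n))
```

Even at n = 40, the exponential column dwarfs the others, which is why metrics that assume linear scaling mislead.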

HEE Slide

  • HEE is the “what” framework
  • Component and aggregate metrics:
    • From the duration of a single review response up to the duration from PR open to deployed
    • Success rate of individual test to success rate of CI overall
  • “How” Framework: Goals / Signals / Metrics
  • What you don’t see: volume/counts of builds, PRs, or deploys.
  • “Velocity over volume.”
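The component-to-aggregate relationship can be sketched numerically. A hypothetical example, assuming independent tests where the pipeline passes only if every test passes:

```python
from math import prod

def ci_success_rate(test_success_rates):
    """Aggregate CI success rate from component test success rates,
    assuming tests are independent and all must pass."""
    return prod(test_success_rates)

# Three tests, each passing 99% of the time, yield a CI pass
# rate of roughly 97% -- component reliability compounds.
rate = ci_success_rate([0.99, 0.99, 0.99])
```

The compounding is the point: a fleet of individually reliable tests can still produce a noticeably flakier aggregate signal.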

Descriptive Slide

  • X-axis is time; Y-axis is download bandwidth; bandwidth drop is the impact of shelter-in-place.
  • Big credit to Gradle Enterprise Buildscans for providing the telemetry!
  • Sneak peek: downloads occur during cold builds!
  • Could we engineer a system where cold builds never occur? See Shivani and Swati at 11 tomorrow.
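Descriptive analytics here amounts to rolling raw telemetry up into a time series. A minimal sketch with invented records (the real telemetry comes from Gradle Enterprise build scans, not this shape):

```python
from collections import defaultdict
from datetime import date

# Hypothetical build-scan records: (day, bytes downloaded, cold build?).
records = [
    (date(2020, 3, 2), 450_000_000, True),
    (date(2020, 3, 2), 0, False),   # warm build: cache hit, no download
    (date(2020, 3, 3), 500_000_000, True),
]

# Aggregate download bandwidth per day -- the descriptive time series.
downloads_by_day = defaultdict(int)
for day, nbytes, cold in records:
    if cold:  # downloads occur during cold builds
        downloads_by_day[day] += nbytes
```

Plotting `downloads_by_day` over time is what makes shifts like the shelter-in-place bandwidth drop visible.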

Diagnostic Slide

  • Peek at the feedback side — subjective analysis; CSAT scores
  • Story: imagine taking an Uber ride and being asked about it 3 months later.
  • Also present are dimensions: size of change, region/time zone of author, download bandwidth, etc.
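Diagnostic analysis means slicing the subjective scores by those dimensions. A hypothetical sketch using mean score as the CSAT summary (real CSAT definitions vary, e.g. percent of top-two-box responses):

```python
from collections import defaultdict

# Hypothetical survey rows: (CSAT score 1-5, dimension value, e.g. region).
responses = [(5, 'NA'), (4, 'NA'), (2, 'APAC'), (3, 'APAC')]

def csat_by_dimension(rows):
    """Group scores by a dimension and return the mean per group."""
    groups = defaultdict(list)
    for score, dim in rows:
        groups[dim].append(score)
    return {dim: sum(scores) / len(scores) for dim, scores in groups.items()}
```

A gap between segments (here, region) is the diagnostic clue that points at a cause, such as download bandwidth.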

Predictive Slide

  • Complicated: we want to predict the Next Best Action (NBA).
  • Mixed results. Taught us new things but didn’t “work”.
  • Interesting from the “referrer” perspective, as you’d have with online visitors.
  • Significant when looking at an individual node, e.g. support portal.
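The referrer-style view can be sketched as transition counts over session event streams, with the most common successor of a node serving as a naive Next Best Action. Event names and sessions below are invented for illustration:

```python
from collections import Counter, defaultdict

# Hypothetical ordered event streams, one list per developer session.
sessions = [
    ['build', 'test', 'support portal'],
    ['build', 'support portal'],
    ['build', 'test', 'deploy'],
]

# Count observed transitions: "referrer" -> next destination.
transitions = defaultdict(Counter)
for session in sessions:
    for src, dst in zip(session, session[1:]):
        transitions[src][dst] += 1

def next_best_action(node):
    """Most common next step observed after this node."""
    return transitions[node].most_common(1)[0][0]
```

Conditioning on a single node (e.g. the support portal) is where this view becomes significant, matching the mixed results overall.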

Prescriptive Slide

  • How do you accommodate both the executive wanting a 10,000-foot view and the senior IC wanting nitty-gritty details?
  • Combine objective (Experience Index) and relative measures.
  • Layering and incremental discovery.
  • Productivity is not the same as performance! Stay away from employee performance!
  • Important not to make a dashboard that emphasizes the LOC, PRs, or Deploy count for all to see.
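One way to combine measures into a single top-level number is a weighted blend of normalized metrics. This is a hypothetical sketch, not LinkedIn's actual Experience Index formula; the metric names and weights are invented:

```python
# Hypothetical weights; each metric is normalized so higher is better.
WEIGHTS = {'build_success': 0.4, 'ci_speed': 0.3, 'csat': 0.3}

def experience_index(metrics):
    """Weighted blend of normalized metric scores, each in [0, 1]."""
    return sum(WEIGHTS[name] * metrics[name] for name in WEIGHTS)

score = experience_index({'build_success': 0.9, 'ci_speed': 0.8, 'csat': 0.7})
```

The index gives executives one number, while the per-metric components underneath support the layered, incremental discovery the senior IC needs. Note that none of the inputs are raw volume counts like LOC or PRs.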

Types of Analytics Slide

  • Now to talk about the progression and future, Shailesh Jannu.

Final Slide

  • Highlight progression: descriptive analytics to “augmented” analytics.
  • Share credit with the broader team and organization.
  • Thank you to LinkedIn for letting us share.