How Datadog uses Datadog to gain visibility into the Datadog user experience | Datadog

Datadog leverages its own monitoring tools to bridge the gap between qualitative user interviews and quantitative performance data. By "dogfooding" features such as Real User Monitoring (RUM) and Logs, the product design team makes evidence-based UI/UX adjustments while gaining firsthand empathy for the user experience. This approach lets them see exactly how users interact with specific components and where current designs fall short of user expectations.

**Optimizing Font Consistency via CSS Font Tracking**

* To ensure visual precision in information-dense views like the Log Explorer, the team needed to move from a generic system font stack to a standardized monospace font.
* Designers used the CSS Font Loading API's `document.fonts` interface, surfaced through Datadog RUM, to collect data on which fonts were actually being rendered on users' machines.
* By analyzing a dashboard of these results, the team selected Roboto Mono as the standard, ensuring the new font's optical size matched what the plurality of users were already seeing and avoiding broken embedded tables.

**Simplifying Components through Interaction Logging**

* The `DraggablePane` component, used for resizing adjacent panels, suffered from UI clutter caused by dedicated buttons for minimizing and maximizing content.
* The team implemented custom loggers in Datadog Logs to track how often users clicked those controls versus interacting with the draggable handle.
* The data revealed that the buttons were almost never used; the team removed them and replaced the functionality with a double-click event, significantly streamlining the interface.

**Refining Syntax Support through Error Analysis**

* When introducing the `DateRangePicker` for custom time frames, the team needed to expand the component's logic to support natural-language strings.
* By aggregating "invalid input" events in Datadog Logs, the team could see the exact strings users were typing, such as "last 2 weeks", that the system failed to parse.
* Analyzing these common patterns allowed the team to update the parsing logic for high-demand keywords, cutting the component's error rate from 10 percent to approximately 5 percent.

Leveraging internal monitoring tools allows design teams to move beyond guesswork and build highly functional interfaces. For organizations managing complex technical products, tracking specific component failures and interaction frequencies is an essential strategy for prioritizing the design roadmap and improving user retention.
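The font-audit idea described above can be sketched in a few lines. This is a minimal illustration, not Datadog's actual code: it assumes the browser's `document.fonts` (a `FontFaceSet` from the CSS Font Loading API, convertible with `Array.from`) and uses the browser RUM SDK's `datadogRum.addAction` as the reporting channel; the action name `rendered-fonts` and the helper names are made up here.

```typescript
// Each FontFace in document.fonts exposes (among others) a family name and a
// load status ("loaded", "unloaded", "error", ...). We model just those two.
interface LoadedFont {
  family: string;
  status: string;
}

// Pure helper: collect the distinct font families that actually loaded,
// sorted so the result is stable for dashboard aggregation.
function loadedFamilies(fonts: LoadedFont[]): string[] {
  const families = new Set<string>();
  for (const font of fonts) {
    if (font.status === "loaded") {
      families.add(font.family);
    }
  }
  return Array.from(families).sort();
}

// Wiring sketch: report the loaded families as a RUM custom action.
// `reportAction` stands in for datadogRum.addAction(name, context).
function reportLoadedFonts(
  fonts: LoadedFont[],
  reportAction: (name: string, context: object) => void
): void {
  reportAction("rendered-fonts", { families: loadedFamilies(fonts) });
}
```

In a browser one would wait for `document.fonts.ready`, then call `reportLoadedFonts(Array.from(document.fonts), (n, c) => datadogRum.addAction(n, c))`; the resulting actions can be grouped in a dashboard to see which families users actually render.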
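The interaction-logging argument behind the `DraggablePane` decision amounts to tallying named UI events and comparing their relative frequencies. A minimal sketch, with event names (`drag-handle`, `minimize-button`) and the class name invented for illustration; in practice the tallies would be forwarded to Datadog Logs rather than kept in memory:

```typescript
// Tally named UI interactions so rarely-used controls stand out.
class InteractionTally {
  private counts = new Map<string, number>();

  // Call from the relevant event handlers, e.g. click or dblclick listeners.
  record(event: string): void {
    this.counts.set(event, (this.counts.get(event) ?? 0) + 1);
  }

  // Share of all recorded interactions attributed to `event` (0 when empty).
  share(event: string): number {
    let total = 0;
    for (const n of Array.from(this.counts.values())) total += n;
    return total === 0 ? 0 : (this.counts.get(event) ?? 0) / total;
  }
}
```

A near-zero `share("minimize-button")` across many sessions is the kind of evidence that justified removing the buttons in favor of a double-click gesture on the handle.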
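The `DateRangePicker` fix can be illustrated with a tiny natural-language parser. This is a sketch of the general technique, not Datadog's implementation: the supported pattern (`last <n> <unit>`) is exactly the kind of high-demand keyword shape that aggregated "invalid input" logs would surface, and anything the parser rejects returns `null`, which is what would be logged for the next round of analysis.

```typescript
type Range = { fromMs: number; toMs: number };

// Milliseconds per supported unit; extend as log analysis surfaces demand.
const UNIT_MS: Record<string, number> = {
  minute: 60_000,
  hour: 3_600_000,
  day: 86_400_000,
  week: 7 * 86_400_000,
};

// Parse strings of the form "last <n> <unit>(s)" relative to `now`.
// Unrecognized inputs return null and would be logged as invalid.
function parseRelativeRange(input: string, now: number): Range | null {
  const m = input
    .trim()
    .toLowerCase()
    .match(/^last\s+(\d+)\s+(minute|hour|day|week)s?$/);
  if (!m) return null;
  const span = Number(m[1]) * UNIT_MS[m[2]];
  return { fromMs: now - span, toMs: now };
}
```

For example, `parseRelativeRange("last 2 weeks", Date.now())` yields the trailing 14 days, while an input like "yesterday" still returns `null` and stays visible in the error logs until the parser learns it.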