Managing complex multi-page onboarding funnels often leads to documentation that quickly becomes decoupled from the actual codebase, creating confusion for developers. To solve this, the Toss team developed an automated system that uses static code analysis to generate funnel flowcharts that are never outdated. By treating the source code as the "Source of Truth," they successfully transformed hard-to-track navigation logic into a synchronized, visual map.
### The Limitations of Manual Documentation
* Manual diagrams fail to scale when a funnel contains high-frequency branching, such as the 82 distinct conditions found across 39 onboarding pages.
* Traditional documentation becomes obsolete within days of a code change because developers rarely prioritize updating external diagrams during rapid feature iterations.
* Complex conditional logic (e.g., branching based on whether a user is a representative or an agent) makes manual flowcharts cluttered and difficult to read.
### Static Analysis via AST
* The team chose static analysis over runtime analysis to capture all possible navigation paths simultaneously without the need to execute every branch of the code.
* They utilized the `ts-morph` library to parse TypeScript source code into an Abstract Syntax Tree (AST), which represents the code structure in a way the compiler understands.
* This method allows for a comprehensive scan of the project to identify every instance of navigation calls like `router.push()` or `router.replace()`.
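The post names `ts-morph` for this scan; as a minimal sketch of the same idea using the lower-level TypeScript compiler API (which `ts-morph` wraps), here is a walker that finds `router.push()` / `router.replace()` calls. The file name and sample source are made up for illustration:

```typescript
import * as ts from "typescript";

// Find every router.push(...) / router.replace(...) call in a source string.
function findNavigationCalls(source: string): { method: string; arg: string }[] {
  // setParentNodes = true so getText() works on nested nodes
  const file = ts.createSourceFile("page.tsx", source, ts.ScriptTarget.Latest, true);
  const calls: { method: string; arg: string }[] = [];

  const visit = (node: ts.Node): void => {
    if (
      ts.isCallExpression(node) &&
      ts.isPropertyAccessExpression(node.expression) &&
      node.expression.expression.getText() === "router" &&
      (node.expression.name.text === "push" || node.expression.name.text === "replace")
    ) {
      calls.push({
        method: node.expression.name.text,
        arg: node.arguments[0]?.getText() ?? "",
      });
    }
    ts.forEachChild(node, visit); // recurse into every branch of the AST
  };

  visit(file);
  return calls;
}

const sample = `
  if (user.isAgent) {
    router.push("/onboarding/agent");
  } else {
    router.replace("/onboarding/owner");
  }
`;

// Both branches are captured without executing either one.
console.log(findNavigationCalls(sample));
```

Because the walk covers the whole tree, both arms of the `if` are reported in a single pass, which is exactly the property that makes static analysis preferable to runtime tracing here.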
### Engineering the Navigation Edge Data Structure
* A "Navigation Edge" data structure was designed to capture more than just the destination; it includes the navigation method, query parameters, and the exact line number in the source code.
* The system records the "context" of a transition by traversing the AST upwards from a navigation call to find the parent `if` statements or ternary operators, effectively documenting the business logic behind the path.
* By distinguishing between `push` (which adds to browser history) and `replace` (which does not), the documentation provides insights into the intended user experience and "back button" behavior.
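The bullets above can be summarized as a data shape. The field names below are illustrative guesses, not the team's actual schema:

```typescript
// Hypothetical shape of a "Navigation Edge"; field names are illustrative.
type NavigationMethod = "push" | "replace";

interface NavigationEdge {
  from: string;                    // route of the page being scanned
  to: string;                      // resolved destination path
  method: NavigationMethod;        // push keeps history; replace overwrites it
  condition?: string;              // enclosing if/ternary test, i.e. the business logic
  query?: Record<string, string>;  // query parameters passed along
  line: number;                    // exact line number of the navigation call
}

const edge: NavigationEdge = {
  from: "/onboarding/user-type",
  to: "/onboarding/agent",
  method: "push",
  condition: "user.isAgent",
  line: 42,
};

console.log(`${edge.from} -> ${edge.to} [${edge.method}] when ${edge.condition}`);
```

An edge like this is enough to render both the arrow in a flowchart and its label (the condition), while `line` lets a reader jump straight from the diagram to the code.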
### Tracking Hidden Navigation and Constants
* **Custom Hook Analysis:** Since navigation logic is often abstracted into hooks, the tool scans `import` declarations to follow and analyze logic within external hook files.
* **Constant Resolution:** Because developers use constants (e.g., `URLS.PAYMENT_METHOD`) rather than raw strings, the system parses the project's constant definition files to map these variables back to their actual URL paths.
* **Source Attribution:** The system flags whether a transition originated directly from a page component or an internal hook, making it easier for developers to locate the source of a specific funnel behavior.
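Constant resolution reduces to a lookup table built from the project's URL definition file. A minimal sketch, with a made-up constant map and a naive fallback for string literals:

```typescript
// Hypothetical constant map, as it might be extracted from a URL definitions file.
const constantTable: Record<string, string> = {
  "URLS.PAYMENT_METHOD": "/payments/method",
  "URLS.HOME": "/",
};

// Resolve a navigation argument: if it names a known constant, substitute the
// real path; otherwise assume it is a string literal and strip the quotes.
function resolveDestination(arg: string): string {
  return constantTable[arg] ?? arg.replace(/^["']|["']$/g, "");
}

console.log(resolveDestination("URLS.PAYMENT_METHOD")); // "/payments/method"
console.log(resolveDestination('"/settings"'));         // "/settings"
```

In the real tool the table would itself be produced by parsing the constants file with the AST, but the mapping step is this simple once the table exists.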
### Conclusion
For teams managing complex user journeys, automating documentation through static analysis is a powerful way to eliminate technical debt and synchronization errors. By integrating this extraction logic into the development workflow, the codebase remains the definitive reference point while providing stakeholders with a clear, automated visual of the user experience.
Datadog’s Frontend Developer Experience team migrated their massive codebase from a fragile, custom Puppeteer-based acceptance testing framework to Datadog Synthetic Monitoring to address persistent flakiness and high maintenance overhead. By leveraging a record-and-play approach and integrating it into their CI/CD pipelines via the `datadog-ci` tool, they successfully reduced developer friction and improved testing reliability for over 300 engineers. This transition demonstrates how replacing manual browser scripting with specialized monitoring tools can significantly streamline high-scale frontend workflows.
### Limitations of Puppeteer-Based Testing
* Custom runners built on Puppeteer suffered from inherent flakiness because they relied on a complex chain of virtual graphics stacks, browser manipulation, and network stability that frequently failed unexpectedly.
* Writing tests was unintuitive, requiring engineers to manually script interaction details—such as verifying if a button is present and enabled before clicking—which became exponentially more complex for custom elements like dropdowns.
* The testing infrastructure was slow and expensive, with CI jobs taking up to 35 minutes of machine time per commit to cover the application's 565 tests and 100,000 lines of test code.
* Maintenance was a constant burden; every product update required a corresponding manual update to the scripts, making the process as labor-intensive as writing new features.
### Adopting Synthetic Monitoring and Tooling
* The team moved to Synthetic Monitoring, which allows engineers to record browser interactions directly rather than writing code, significantly lowering the barrier to entry for creating tests.
* To integrate these tests into the development lifecycle, the team developed `datadog-ci`, a CLI tool designed to trigger tests and poll result statuses directly from the CI environment.
* The new system uses a specific file format (`.synthetics.json`) to identify tests within the codebase, allowing for configuration overrides and human-readable output in the build logs.
* This transition turned an internal need into a product improvement, as the `datadog-ci` tool was generalized to help all Datadog users execute commands from within their CI/CD scripts.
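The "trigger tests, then poll for results" pattern at the heart of such a CLI can be sketched generically. The functions below are stand-ins for illustration, not the real Datadog API:

```typescript
// Generic trigger-and-poll loop of the kind a CI tool like datadog-ci runs.
// fetchStatus is a hypothetical stand-in for querying the test backend.
type RunStatus = "in_progress" | "passed" | "failed";

async function pollUntilDone(
  fetchStatus: () => Promise<RunStatus>,
  intervalMs = 1000,
  maxAttempts = 30,
): Promise<RunStatus> {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const status = await fetchStatus();
    if (status !== "in_progress") return status; // terminal state reached
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error("Timed out waiting for synthetic test results");
}

// Simulated backend: reports in_progress twice, then passed.
let checks = 0;
const fakeStatus = async (): Promise<RunStatus> =>
  ++checks < 3 ? "in_progress" : "passed";

pollUntilDone(fakeStatus, 10).then((status) => {
  // A real CLI would exit non-zero here when status is "failed",
  // which is what lets the CI pipeline gate on the result.
  console.log(`CI gate result: ${status}`);
});
```

The non-blocking rollout described below amounts to logging this result as a PR comment instead of letting a `failed` status break the build.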
### Strategies for High-Scale Migration and Adoption
* The team utilized comprehensive documentation and internal "frontend gatherings" to educate 300 engineers on how to record tests and why the new system required less maintenance.
* To build developer trust, the team initially implemented the new tests as non-blocking CI jobs, surfacing failures as PR comments rather than breaking builds.
* Migration was treated as a distributed effort, with 565 individual tests tracked via Jira and assigned to their respective product teams to ensure ownership and a steady pace.
* By progressively sunsetting the old platform as tests were migrated, the team managed a year-long transition without disrupting the daily output of 160 authors pushing 90 new PRs every day.
To successfully migrate large-scale testing infrastructures, organizations should prioritize developer trust by introducing new tools through non-blocking pipelines and providing comprehensive documentation. Transitioning from manual browser scripting to automated recording tools not only reduces technical debt but also empowers engineers to maintain high-quality codebases without the burden of managing complex testing infrastructure.