data-analysis

4 posts

toss

Enhancing Data Literacy for

Toss's Business Data Team addressed the lack of centralized insight into its business customer (BC) base by building a standardized Single Source of Truth (SSOT) data mart and an iterative Monthly BC Report. This initiative unified fragmented data across business units such as Shopping, Ads, and Pay, enabling consistent data-driven decision-making and significantly raising the organization's overall data literacy.

## Establishing a Single Source of Truth (SSOT)

- Addressed the inefficiency of fragmented data across departments by integrating disparate datasets into a unified, enterprise-wide data mart.
- Standardized the definition of an "active" business customer through cross-functional communication and a deep understanding of how revenue and costs are generated in each service domain.
- Eliminated communication overhead by ensuring all stakeholders used a single, verified dataset rather than conflicting numbers from different business silos.

## Designing the Monthly BC Report for Actionable Insights

- Visualized monthly revenue trends by segmenting customers into specific tiers and categories, such as New, Churn, and Retained, to identify where growth or attrition was occurring.
- Implemented cohort retention metrics by business unit to measure platform stickiness and help teams understand which services were most effective at retaining business users (a minimal computation sketch follows this summary).
- Provided granular raw-data lists of high-revenue customers showing significant growth or churn, allowing operational teams to identify immediate action points.
- Refined reporting metrics through in-depth interviews with Product Owners (POs), Sales Leaders, and Domain Heads to ensure the data addressed real-world business questions.

## Technical Architecture and Validation

- Built the core SSOT data mart using Airflow for scalable data orchestration and workflow management (see the DAG sketch after this summary).
- Leveraged Jenkins to handle the batch processing and deployment of the specific data layers required for the reporting environment.
- Integrated Tableau with SQL-based fact aggregations to automate the monthly refresh of charts and dashboards, ensuring the report remains a "living" document.
- Conducted "collective intelligence" verification meetings to check metric definitions, units, and visual clarity, ensuring the final report was intuitive for all users.

## Driving Organizational Change and Data Literacy

- Sparked a surge in data demand, leading to follow-up projects such as daily real-time tracking, Cross-Domain Activation analysis, and deeper funnel analysis for BC registrations.
- Transitioned the organizational culture from passive data consumption to active utilization, with diverse roles, including Strategy Managers and Business Marketers, now using BC data to prove their business impact.
- Maintained an iterative approach in which the report format evolves every month based on stakeholder feedback, keeping the data relevant to the shifting needs of the business.

Establishing a centralized data culture requires more than technical infrastructure; it requires a commitment to iterative feedback and clear communication. By moving from fragmented silos to a unified reporting standard, data analysts can transform from simple "number providers" into strategic partners who drive company-wide literacy and growth.
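The post describes the cohort retention metric only at a conceptual level. As a rough illustration of how monthly cohort retention by business unit could be computed, here is a minimal pandas sketch; the table layout, the column names (`bc_id`, `business_unit`, `activity_month`), and the sample rows are assumptions for illustration, not Toss's actual SSOT schema.

```python
import pandas as pd

# Toy activity log: one row per business customer (BC) per month in which it was active.
# Column names and values are illustrative assumptions, not the real data mart schema.
activity = pd.DataFrame({
    "bc_id":          [1, 1, 1, 2, 2, 3, 3, 4],
    "business_unit":  ["Shopping"] * 5 + ["Ads"] * 3,
    "activity_month": pd.to_datetime([
        "2024-01-01", "2024-02-01", "2024-03-01",  # BC 1: active three months in a row
        "2024-01-01", "2024-03-01",                # BC 2: skips February
        "2024-01-01", "2024-02-01",                # BC 3: retained for one month
        "2024-02-01",                              # BC 4: acquired in February
    ]),
})

# Cohort = first month in which the customer was active within each business unit.
activity["cohort_month"] = (
    activity.groupby(["business_unit", "bc_id"])["activity_month"].transform("min")
)

# Months elapsed since the cohort month (0 = acquisition month).
activity["period"] = (
    (activity["activity_month"].dt.year - activity["cohort_month"].dt.year) * 12
    + (activity["activity_month"].dt.month - activity["cohort_month"].dt.month)
)

# Distinct active customers per (business unit, cohort month, months since acquisition).
cohort_counts = (
    activity.groupby(["business_unit", "cohort_month", "period"])["bc_id"]
    .nunique()
    .unstack(fill_value=0)
)

# Retention rate = active customers in period N divided by the cohort size at period 0.
retention = cohort_counts.div(cohort_counts[0], axis=0)
print(retention.round(2))
```

A pivoted table like `retention` is the kind of aggregate a BI worksheet could consume for a cohort retention heatmap in the monthly report.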
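The write-up names Airflow as the orchestration layer but does not share pipeline code. Below is a minimal sketch of what a monthly rebuild-then-refresh DAG might look like; the DAG id, task names, and placeholder callables are hypothetical, not the team's actual pipeline.

```python
# Minimal Airflow sketch: rebuild the BC fact table, then refresh the report extract.
# Everything named here (dag_id, task ids, callables) is an illustrative assumption.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def build_bc_fact_table(**_):
    """Placeholder for the SQL-based fact aggregation that feeds the SSOT data mart."""
    print("Aggregating BC revenue facts into the unified data mart...")


def refresh_report_extract(**_):
    """Placeholder for pushing the refreshed aggregates to the reporting layer."""
    print("Refreshing the Monthly BC Report extract...")


with DAG(
    dag_id="monthly_bc_report",      # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@monthly",             # one run per reporting month
    catchup=False,
) as dag:
    build_facts = PythonOperator(
        task_id="build_bc_fact_table",
        python_callable=build_bc_fact_table,
    )
    refresh_extract = PythonOperator(
        task_id="refresh_report_extract",
        python_callable=refresh_report_extract,
    )

    build_facts >> refresh_extract   # refresh only after the fact table is rebuilt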

toss

In an era where everyone does research,

In an era where AI moderators and non-researchers handle the bulk of data collection, the role of the UX researcher has shifted from technical specialist to strategic guide. The core value of the researcher now lies in "UX Leadership": the ability to frame problems, align team perspectives, and define the fundamental identity of a product. By bridging the gap between business goals and user needs, researchers ensure that products solve real problems rather than just chasing metrics or technical feasibility.

### Setting the Framework in the Idea Phase

When starting a new project, a researcher's primary task is to establish the "boundaries of the puzzle" by shifting the team's focus from business impact to user value.

* **Case - AI Signal:** For a service that interprets stock market events using AI, the team initially focused on business metrics like retention and news consumption.
* **Avoiding "Metric Traps":** A researcher intervenes to prevent fatigue-inducing UX (e.g., excessive notifications to boost CTR) by defining the "North Star" as the specific problem the user is trying to solve.
* **The Checklist:** Once the user problem and value are defined, they serve as a persistent checklist for every design iteration and action item.

### Aligning Team Direction for Product Improvements

When a product already exists but needs improvement, team members often hold scattered, subjective opinions on what to fix. The researcher structures these thoughts into a cohesive direction.

* **Case - Stock Market Calendar:** While the team suggested UI changes like "it doesn't look like a calendar," the researcher refocused the effort on the user's ultimate goal: making better investment decisions.
* **Defining Success Criteria:** The team agreed on a "Good Usage" standard based on three stages: Awareness (recognizing issues) → Understanding (why it matters) → Preparation (adjusting investment plans).
* **Identifying Obstacles:** By identifying specific friction points, such as the lack of information hierarchy or the difficulty of interpreting complex indicators, the researcher moves the project from "simple UI cleanup" to "essential tool development."

### Redefining Product Identity During Stagnation

When a product's growth stalls, the issue often isn't a specific UI bug but a fundamental mismatch between the product's identity and its environment.

* **Case - Toss Securities PC:** Despite being functional, the PC version struggled because it initially tried to copy the "mobile simplicity" of the app.
* **Contextual Analysis:** Research revealed that while mobile users value speed and portability, PC users require an environment for deep analysis, multi-window comparisons, and deliberate decision-making.
* **Consensus through Synthesis:** The researcher integrates data, user interviews, and market trends into workshops to help the team decide where the product should "live" in the market. This process creates team-wide alignment on a new strategic direction rather than just fixing features.

The modern UX researcher must move beyond "crafting the tool" (interviewing and data gathering) and toward "UX Leadership." True expertise involves maintaining a broad view of the industry and product ecosystem, structuring team discussions to reach consensus, and ensuring that every product decision is rooted in a clear understanding of the user's context and goals.

kakao

How the POPM program became

Kakao developed its internal POPM (Product Owner/Product Manager) training program by treating the curriculum itself as an evolving product rather than a static lecture series. By applying agile methods such as data-driven prioritization and iterative versioning, the program moved from a generic pilot to a structured framework that aligns teams through a shared language of problem-solving. This approach demonstrates that internal capability building is most effective when managed with the same rigor and experimentation used in software development.

## Strategic Motivation for POPM Training

* Addressed the inherent ambiguity of the PO/PM role, where non-visible tasks often make it difficult for practitioners to define their own growth or impact.
* Sought to resolve the disconnect between strategic problem definition (PO) and tactical execution (PM) within Kakao's teams.
* Prioritized the creation of a "common language" that lets cross-functional team members define problems, analyze metrics, and design experiments under a unified structure.

## Iterative Design and Versioning

* The program transitioned through multiple "versions," starting with an 8-session pilot that covered the entire lifecycle from bottleneck exploration to execution review.
* Based on participant feedback about high fatigue and low efficiency in long presentations, the curriculum was condensed into five core modules: Strategy, Metrics, Experiment, Design, and Execution.
* The instructional design shifted from "delivering information" to "designing a rhythm," using a "one slide, one question, one example" rule to maintain engagement.

## Data-Driven Program Refinement

* Applied a "product metaphor" to education by calculating "Opportunity Scores" from a matrix of importance versus satisfaction for each session (see the sketch after this summary).
* Identified "Data/Metrics" as the highest priority for redesign because it scored high on importance but low on satisfaction, indicating a structural gap in the teaching method.
* Refined the "features" of the training by redesigning worksheets to focus on execution routines and converting mandatory practice tasks into selective, flexible modules.

## Structural Insights for Organizational Growth

* Focused on accumulating "structure" rather than just training individuals, ensuring that even as participants change, the framework for defining problems remains consistent within the organization.
* Designed practice sessions to function as "thinking structures" rather than "answer-seeking" exercises, encouraging teams to bring their training insights directly into actual team meetings.
* Prioritized scalability and simplicity in the curriculum so the structure can be adopted across departments with varying product needs.

To build effective internal capabilities, organizations should treat training as a product that requires constant maintenance and versioning. Instead of focusing on one-off lectures, leaders should design structural "rhythms" and feedback loops that allow the curriculum to evolve based on the actual pain points of practitioners.
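The post mentions computing "Opportunity Scores" from importance and satisfaction ratings but does not publish its exact formula or survey numbers. One common formulation, importance plus the unmet gap between importance and satisfaction, is sketched below in Python; the session ratings are invented purely to illustrate why a high-importance, low-satisfaction module such as Data/Metrics would rise to the top of the redesign queue.

```python
# Minimal sketch of opportunity scoring for curriculum sessions.
# Formula and ratings are illustrative assumptions, not Kakao's published method or data.
sessions = {
    # session: (importance, satisfaction) on a 1-10 scale (made-up values)
    "Strategy":   (7.5, 7.0),
    "Metrics":    (9.0, 5.5),   # high importance, low satisfaction -> biggest gap
    "Experiment": (8.0, 7.0),
    "Design":     (6.5, 6.5),
    "Execution":  (7.0, 6.0),
}

def opportunity_score(importance: float, satisfaction: float) -> float:
    """Opportunity = importance plus the unmet gap between importance and satisfaction."""
    return importance + max(importance - satisfaction, 0.0)

ranked = sorted(
    ((name, opportunity_score(imp, sat)) for name, (imp, sat) in sessions.items()),
    key=lambda pair: pair[1],
    reverse=True,
)

for name, score in ranked:
    print(f"{name:<10} opportunity = {score:.1f}")
# Sessions with the highest score (largest unmet need) are redesigned first.
```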

kakao

Were we solving the real

The POPM (Product Owner/Product Manager) training course at Kakao focuses on restructuring existing professional knowledge into a cohesive framework for solving real-world business problems. Rather than simply delivering new information, the program emphasizes aligning strategy with execution, transforming "strategy" from a vague concept into a practical set of decision-making criteria. The ultimate goal is to move teams away from a "release-only" mindset toward a cycle of continuous hypothesis verification and learning.

### Strategic Thinking and Metric Modeling

* **Strategic Decision Criteria**: Strategy is redefined as the standard for team judgment, using frameworks like MECE, MVP, and priority-setting models to align daily tasks with long-term goals.
* **Metrics as a Problem-Solving Language**: Key indicators such as funnel, retention, cohort, and LTV are treated not just as data points but as a language used to define and reveal underlying product issues.
* **Context-Based Design**: UX design is approached through "context-based logic" rather than intuition, encouraging teams to ask which specific design fits the current user journey.

### Systematic Experimentation and A/B Testing

* **The MASS Framework**: Experiments are designed and evaluated on being Measurable, Attributable, Sensitive, and having a Short-term cycle.
* **Failure Analysis Routines**: The curriculum emphasizes establishing a routine for interpreting failed experiments, ensuring that every test contributes to the team's institutional knowledge.
* **Incremental Testing**: Encourages a culture of "starting small," giving teams the confidence to run experiments without requiring massive resource allocation.

### Building Repeatable Execution Loops

* **Metric-Based Retrospectives**: Teams transition from simply finishing a release to a structured loop of "Problem Definition → Hypothesis → Metric → Verification → Retrospective" (a minimal verification sketch follows this summary).
* **Formalizing Problem Definitions**: Using templates to formally document the problem, expected behavior, and success metrics ensures that the entire team, not just the PO, understands the "why" behind every task.
* **Operational Rhythms**: Teams are adopting fixed weekly or bi-weekly cycles for sharing insights and adjusting priorities, turning data-driven execution into a natural habit.

The most critical takeaway for product teams is to constantly ask: "Is the work we are doing right now actually a solution to a defined problem, or are we just busy releasing features?" Success lies in moving beyond the sense of accomplishment from a launch and establishing a repeatable rhythm that validates whether those efforts truly move the needle.
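The summary stays at the framework level, so as a rough illustration of the "Metric → Verification" step in the loop above, here is a minimal two-proportion z-test in Python. The experiment, metric, counts, and the 0.05 threshold are illustrative assumptions, not material from the course.

```python
# Minimal sketch of verifying an A/B test hypothesis with a two-sided two-proportion z-test.
# The metric ("checkout conversion"), counts, and decision threshold are invented examples.
from math import sqrt, erfc

def two_proportion_ztest(conv_a: int, n_a: int, conv_b: int, n_b: int) -> tuple[float, float]:
    """Return (z statistic, two-sided p-value) comparing conversion rates of B against A."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = erfc(abs(z) / sqrt(2))   # two-sided tail probability under N(0, 1)
    return z, p_value

# Hypothesis: the redesigned flow (B) improves checkout conversion over the control (A).
z, p = two_proportion_ztest(conv_a=480, n_a=10_000, conv_b=540, n_b=10_000)
verdict = "ship and document the learning" if p < 0.05 else "log as a learning, refine the hypothesis"
print(f"z = {z:.2f}, p = {p:.3f} -> {verdict}")
```

Whatever the statistical machinery, the point of the loop is that both outcomes feed the retrospective: a non-significant result is recorded as institutional knowledge rather than discarded.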