I designed and led Market Tribe, a digital learning system that replaced fragmented training processes and reduced moderation errors by 35% on average across Southeast Asia.
ByteDance's e-commerce moderation teams across Southeast Asian markets relied on five disconnected tools to keep 12,000+ moderators aligned on rapidly changing platform policies. Updates were delivered through live training sessions that consumed significant time for both trainers and moderators, with no reliable way to track knowledge retention.
How do you keep 12,000 people across 6 countries consistently aligned on complex, constantly evolving policies, while cutting training time in half?
As project owner, I designed and led Market Tribe through three phases over two years, progressively replacing manual training with a scalable, automated digital learning system.
Mapped the existing training workflow end to end. Identified that fragmentation across five overlapping tools, not content quality, was the core problem. Overlapping systems caused duplication, inconsistent delivery, and blind spots in tracking.
Created a centralized e-learning system using Articulate Storyline 360 with SCORM packaging. Structured content into categories: urgent policy updates, QA trend alerts, clarifications, and deprecated case reminders. Each module was short, visual, and followed by a knowledge test.
Built automated notification workflows using internal messaging and collaboration platforms. Daily reminders and per-update alerts pushed directly to teams. Completion tracking replaced manual spreadsheets with real-time LMS dashboards.
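The reminder logic can be sketched in a few lines. This is a minimal illustration, not the production workflow: the actual system ran on internal messaging platforms, and the learner roster and module names here are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Learner:
    name: str
    completed: set  # module IDs this learner has finished

def pending_reminders(learners, required_modules):
    """Return (name, missing_modules) pairs for anyone with incomplete
    required modules -- the payload a daily reminder job would push to
    each team's chat channel."""
    reminders = []
    for learner in learners:
        missing = sorted(required_modules - learner.completed)
        if missing:
            reminders.append((learner.name, missing))
    return reminders

# Hypothetical roster and module IDs for illustration.
required = {"policy-update-012", "qa-trend-007"}
team = [
    Learner("Ana", {"policy-update-012", "qa-trend-007"}),
    Learner("Bo", {"policy-update-012"}),
]
print(pending_reminders(team, required))  # only Bo still has a module outstanding
```

The same completion data that drives the reminders also feeds the dashboards, which is what let real-time tracking replace manual spreadsheets.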
Launched a self-study model for minor and medium policy updates, reducing reliance on live sessions. Training time per cycle fell by 50%, with knowledge retention verified through embedded quizzes.
In the final phase, I evaluated three LMS platforms across 15+ criteria covering automation, content integration, analytics, confidentiality, and setup time. Recommended and executed migration to two integrated platforms, sunsetting three legacy tools entirely. Setup time per update dropped from 15–25 minutes to 5–10 minutes.
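The comparison framework boiled down to a weighted scoring matrix. The sketch below shows the idea with invented criteria weights and ratings; the real matrix covered 15+ criteria and used the team's own weightings.

```python
# Hypothetical weights summing to 1.0 -- illustrative only.
WEIGHTS = {
    "automation": 0.30,
    "analytics": 0.25,
    "confidentiality": 0.25,
    "setup_time": 0.20,
}

def weighted_score(ratings, weights=WEIGHTS):
    """Collapse per-criterion ratings (1-5) into one weighted score."""
    return sum(weights[c] * ratings[c] for c in weights)

# Hypothetical candidate ratings for illustration.
candidates = {
    "platform_a": {"automation": 5, "analytics": 4, "confidentiality": 4, "setup_time": 3},
    "platform_b": {"automation": 3, "analytics": 3, "confidentiality": 5, "setup_time": 4},
}

ranked = sorted(candidates, key=lambda p: weighted_score(candidates[p]), reverse=True)
print(ranked)
```

Making the weights explicit is what turned a one-off decision into a reusable template: future evaluations only needed new ratings, not a new process.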
After Market Tribe launched, weekly moderation error volumes dropped across multiple policy areas, with reductions ranging from 17% to 84% within weeks of deployment.
Policy names have been substituted with generic categories to protect confidential operational data. Actual metrics are accurately represented.
| Dimension | Before | After |
|---|---|---|
| Training delivery | Live sessions only, time-intensive per cycle | Self-paced microlearning, 50% shorter |
| Tools | 5 overlapping systems, fragmented | 2 integrated platforms with unified analytics |
| Update speed | Weekly batch updates with delays between sites | Same-day delivery to all sites simultaneously |
| Completion tracking | Manual spreadsheets, inconsistent | Real-time dashboards with automated reminders |
| Knowledge verification | No systematic testing | Embedded quizzes after every update |
| Setup time | 15 to 25 minutes per update | 5 to 10 minutes per update |
| Error rates | High volumes across multiple policies | 17% to 84% reduction within weeks |
Training content was never the bottleneck. Delivery and tracking were. Moderators were willing and able to learn. They needed the right content in the right format at the right time, with accountability built in.
Platform selection also mattered more than I initially expected. Evaluating three platforms across 15+ criteria before recommending the migration saved months of rework. The comparison framework I built became a template the broader team adopted for future decisions.
Finally, phased rollout was essential. Piloting in Indonesia first, measuring results, then expanding to all 6 markets meant each rollout was backed by real data rather than assumptions.
I am available in Germany from 27 April 2026 and open to L&D, enablement, and instructional design roles.