1. Job Tracker, from insight to v1
Through interviews and surveys, we found that job searching is messy, fragmented, and emotionally draining.
Users juggle job boards, notes, drafts, emails, and deadlines. Job Tracker was built to make progress visible and reduce admin work,
starting with a simple Kanban-based MVP.
v1 shipped as a Kanban board where users created job cards and moved them through stages.
AI features came later as we validated the workflow and identified repetitive writing as a major friction point.
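The v1 workflow described above can be sketched as a minimal data model. This is an illustrative sketch only; the stage names, class names, and fields are hypothetical, not the actual implementation:

```python
from dataclasses import dataclass, field

# Assumed pipeline stages for illustration; the real board's columns may differ.
STAGES = ["Saved", "Applied", "Interviewing", "Offer", "Rejected"]

@dataclass
class JobCard:
    """A single job application tracked on the board."""
    title: str
    company: str
    stage: str = "Saved"
    notes: list[str] = field(default_factory=list)

    def move_to(self, stage: str) -> None:
        # Moving a card is just a stage transition with validation.
        if stage not in STAGES:
            raise ValueError(f"unknown stage: {stage}")
        self.stage = stage

@dataclass
class Board:
    """Kanban board: cards grouped into columns by their current stage."""
    cards: list[JobCard] = field(default_factory=list)

    def column(self, stage: str) -> list[JobCard]:
        return [c for c in self.cards if c.stage == stage]
```

The core idea is that a card's position on the board is derived from a single `stage` field, which keeps moves cheap and makes progress trivially visible per column.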
Impact
Adoption ramped gradually but grew steadily as the workflow matured. Users reported the tracker made job searching feel more manageable by keeping applications, notes, and next steps in one place. We shipped and iterated the core board, then layered AI features to remove repetitive writing and prep work.
2. AI tools inside the job application workflow
We introduced AI to reduce repetitive writing and help users move faster.
Users could generate application content for different scenarios such as cover letters, application emails, follow-up emails,
thank-you notes, and polite replies.
Because model quality varied at the time, we designed flows that prompted review and made it clear when user attention was required.
We also evaluated scalability and costs. Early experimentation suggested the product upside outweighed the cost of usage.
Monetisation model
We let everyone try every AI feature for free, but limited usage based on token cost, typically 1 to 3 generations per feature. After the trial, the feature was gated behind Premium with a clear upgrade prompt. The strongest demand showed up around live interview preparation, salary calculation and negotiation, and cover letter generation.
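The trial-then-gate model above can be sketched as a simple per-user, per-feature counter. The feature keys, limit values, and class names here are hypothetical placeholders chosen for illustration:

```python
from collections import defaultdict

# Assumed per-feature trial limits, tuned roughly to token cost (1-3 free generations).
FREE_LIMITS = {
    "cover_letter": 1,
    "application_email": 2,
    "follow_up_email": 3,
    "thank_you_note": 3,
}

class UsageGate:
    """Tracks free-trial AI generations and gates further use behind Premium."""

    def __init__(self) -> None:
        # (user_id, feature) -> generations consumed so far
        self.used: dict[tuple[str, str], int] = defaultdict(int)

    def can_generate(self, user_id: str, feature: str, is_premium: bool) -> bool:
        if is_premium:
            return True  # Premium users are never gated.
        return self.used[(user_id, feature)] < FREE_LIMITS.get(feature, 0)

    def record(self, user_id: str, feature: str) -> None:
        # Call after each successful generation to consume one trial credit.
        self.used[(user_id, feature)] += 1
```

When `can_generate` returns `False` for a free user, the product would surface the upgrade prompt instead of running the model, which keeps token spend bounded while still letting everyone sample each feature.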
3. Novocareer, learning as a retention loop
Novocareer expanded the product portfolio into structured career learning.
The goal was to give users a reason to return beyond document creation, with courses and progress tracking that support longer-term career development.
The platform launched with multiple courses and later added AI-generated courses and stronger progress tracking.
I owned product direction and delivery while course content was created by a dedicated colleague.
Impact
Adoption was slower than expected in absolute user numbers, but course ratings and qualitative feedback were very positive. The content added clear value for existing subscribers and strengthened the retention story by giving people a reason to return between document edits.
4. Innovation leadership, alignment, and tradeoffs
As Head of Innovation, I drove product direction and execution in a small team.
I owned PRDs, acceptance criteria, analytics planning, instrumentation, release notes, experiments, QA workflows, and stakeholder alignment,
delegating work when it was the smartest way to increase speed and quality.
A key tradeoff was choosing speed and focus over completeness.
We shipped with fewer secondary features and deprioritised mobile support early, based on observed user behaviour.
On AI, I advocated for deeper automation across onboarding and application workflows, with UX guardrails that kept users responsible for final quality.
Leadership chose a more cautious rollout to manage quality and risk. This difference in direction shaped what we delivered and how fast we pushed automation.
Key lesson
Distribution matters. Go-to-market was not formally owned, which made adoption harder. I learned to treat GTM commitments as first-class delivery work with clear owners and measurable goals.