Transforming Trainual into an AI-Driven Product Experience

Industry

SaaS, LMS

Client

Trainual, Inc.

Year

2026

Role

UX / Product Design Lead

Trainual, AI

01 - Challenge

Trainual needed to evolve from a structured knowledge platform into an AI-powered product experience by embedding a fully integrated assistant directly into the application to support content creation, editing, search, and contextual guidance. The initiative came with real constraints: a compressed timeline driven by competitive market pressure, the need to layer AI onto a system that was not originally built for it, ongoing LLM experimentation happening in parallel with design, and the challenge of maintaining user trust while introducing generative capabilities at scale. The problem was not simply adding a chat interface. It was embedding intelligence into established workflows in a way that felt native, scalable, and trustworthy... and doing it fast.

02 - Approach

As Lead Product Designer on a team of 6 designers, I owned UX strategy, interaction architecture, and the foundational AI system layer for this initiative. I partnered directly with engineering and product leadership to shape model behavior, define guardrails, and ensure the assistant felt coherent within the existing product ecosystem. I structured the work around three principles: defining clear interaction models for AI within existing workflows, building a dedicated AI component library and visual language from scratch, and designing modular UI patterns for chat, inline editing, prompt states, and generation states that could scale across the product without requiring one-off solutions. I also introduced AI-assisted design tooling into the workflow itself, using Figma Make, Replit, and the Figma MCP connected to our component library. This allowed the team to move significantly faster without sacrificing system integrity.

03 - Process

I began by building a dedicated AI component library and visual language that aligned with our existing design system. This included modular UI patterns for chat, inline editing, generation states, and trust indicators. To accelerate iteration, I leveraged both traditional workflows and AI-assisted tooling, including Figma Make and Replit, and connected the Figma MCP to our component library to maintain design-to-code alignment. We tested multiple LLM configurations, refined system prompts, and adjusted UX flows based on real customer usage. The most complex challenge was embedding AI into legacy workflows without disrupting familiarity. We integrated intelligence into existing surfaces rather than forcing new interaction paradigms.
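The "generation states" mentioned above can be sketched as a single shared state model that every AI surface (chat, inline editing, search) consumes. This is a hypothetical illustration of the pattern, not Trainual's actual implementation; all type and event names here are assumptions.

```typescript
// Hypothetical sketch of a shared generation-state model.
// Names (GenerationState, GenerationEvent, transition) are illustrative.

type GenerationState =
  | { status: "idle" }
  | { status: "prompting"; draft: string }
  | { status: "generating" }
  | { status: "complete"; output: string }
  | { status: "error"; message: string; retryable: boolean };

type GenerationEvent =
  | { type: "SUBMIT" }
  | { type: "DONE"; output: string }
  | { type: "FAIL"; message: string; retryable: boolean };

// One reducer-style transition function keeps every AI surface consistent,
// instead of each feature inventing its own loading/error handling.
function transition(
  state: GenerationState,
  event: GenerationEvent
): GenerationState {
  switch (event.type) {
    case "SUBMIT":
      // Only a composed prompt can kick off generation.
      return state.status === "prompting" ? { status: "generating" } : state;
    case "DONE":
      return state.status === "generating"
        ? { status: "complete", output: event.output }
        : state;
    case "FAIL":
      return {
        status: "error",
        message: event.message,
        retryable: event.retryable,
      };
  }
}
```

Modeling the states as one discriminated union is what makes the pattern modular: a new AI surface only needs to render the five statuses, and trust indicators or retry affordances attach to the `error` and `complete` variants in one place.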

04 - Results

AI-assisted content creation reached a 15% conversion rate at launch. ARR growth was tied directly to AI feature adoption, contributing to record company profitability during the transition. Customer retention improved as users made AI workflows part of their regular content creation process. The AI component library and interaction patterns I established became the standard system across all AI features, eliminating the need to design from scratch for subsequent releases.

