case study

An onboarding experience that empowers iterative learning design

Designing a new project creation experience that encourages users to be the human-in-the-loop by ensuring that every AI-powered interaction has a manual counterpart.

Role
Product Designer

Team
Founder
1 engineer

Skills
Product design
User research
Product strategy

product

Lazuli is an AI-powered instructional design tool for interactive, evidence-based learning

Lazuli is a collaborative learning design tool built for instructional designers, educators, and L&D teams. Lazuli empowers users to author interactive lessons and assessments, publish to any LMS, and gather evidence to continuously improve learning outcomes for students.

Iterate on learning content in days, not weeks, with Lazuli

user research

One-click course generation doesn't fit the reality of our users' jobs

In user interviews, we found that while one-click course generation excited users, it didn't fit the reality of our target audience's jobs.

01 Feedback

“It’s very rare for us to design a brand new course. The majority of use cases would be redevelopments where we already know what we want the course to look like.”

02 Observation

Learning designers rarely start with a blank canvas; they bring baggage. Source materials, organizational requirements, and existing processes all shape where a project begins.

The revelation: our onboarding flow forced users down a single funnel that was only useful for greenfield projects. When users realized that Lazuli didn't work for the realities of their job, they returned to their well-worn tools.

Users might start a new Lazuli project at any point in the design process.

team experiment

We ran a team experiment to redevelop a course and uncovered an unhealthy habit

We iterate quickly at LDA. While QAing, our team had developed a habit of taking the path of least resistance through the product. We were focused on squashing bugs, not stepping into our users' shoes.

For the next week, I facilitated an experiment in which each team member would attempt to redevelop a real course in Lazuli (what we called "Lazulifying" an existing course).

A Linear task that I shared with the team

At the end of the week, we shared our findings in a FigJam. A scorecard full of thumbs-down emojis might seem disappointing, but it was simply a conversation starter. The real value was that each team member brought annotated screenshots of areas we could improve, which would inform our game plan for the next release.

Our scorecard for course redevelopment

Annotated screenshots of opportunities

principles

I distilled the results of our research into four principles that would guide our next sprint

Make first, reflect later
Planning is optional
AI is assistive, not exclusive
No implicit correctness

design

"Start anywhere" — Allowing learning designers to seamlessly enter Lazuli at any point in their workflow

Users start a new project from a welcoming blank canvas

Upload source files anywhere

Import an existing course at any point

Import existing skills directly into your Lazuli project for the AI to reference

what I learned

AI products should support our journey, not rush us to our destination

A few learnings I will carry into my next project:

  1. Build for nuance. It's rare that any of us get to indulge in a truly greenfield project.

  2. If a user feels out of control in your product, they will quickly return to their tried-and-true tools (often a spreadsheet). They won't risk their deadlines for you until they know they're in safe hands.

  3. Adoption of an AI product is laced with emotion. To dispel anxieties, build in human controls: manual interactions, pauses for permission, and language that provides transparency into the AI black box.