Designing an AI-powered conversational learning experience
The tl;dr version
I defined and designed a learning experience aimed at introducing users to AI, teaching them about its capabilities, and showing how they could apply those capabilities in their day-to-day work. In collaboration with a product manager, I defined requirements based on user experience principles specific to conversational interactions, and designed multiple concepts, including flows and interaction mechanisms, to support the learning goals. My product partner and I then worked with a group of talented software engineers to develop an internal-facing live proof of concept for testing.
I’ve organized the following screens and deliverables to showcase my user interface and visual design skills. If you’re looking to evaluate the aesthetic quality of my design work, these are for you.
Screens and Deliverables
Screens showing the initial design concept (Concept A), intended to facilitate interactions between learners and the AI-powered bot.
This was the original design concept, in which users interact conversationally with the AI as it guides them through various use cases via narrative-led scenarios, delivering explanations and clarifications through meta-commentary.
The following screen shows the interface’s initial state. The bot greets learners on their first session and introduces itself. This is intended to frame learners’ minds around the chat-based, conversational nature of the experience.
The following screen shows the bot responding to an off-topic question; we ensure that it does not respond to or act on requests that are not aligned with the main objective of the learning experience.
This is meant to reinforce the experience’s educational purpose and ensure learners follow and progress through our predetermined conversation flow.
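For illustration, here’s a minimal sketch of how this kind of topical guardrail could work; the names (classifyIntent, REDIRECT_MESSAGE, generateBotResponse) and the keyword check are my simplifying assumptions, not the team’s actual implementation, which could just as easily rely on instructions in the system prompt.

```ts
// Minimal sketch of a topical guardrail for the learning bot. The names
// (classifyIntent, REDIRECT_MESSAGE, generateBotResponse) and the keyword
// check are illustrative assumptions, not the actual implementation.

type Intent = "on-topic" | "off-topic";

const REDIRECT_MESSAGE =
  "Let's stay focused on learning about AI. Try one of the suggested prompts below.";

// Placeholder classifier: in practice this could be a lightweight model
// call or an instruction baked into the system prompt.
async function classifyIntent(message: string): Promise<Intent> {
  const learningKeywords = ["prompt", "summarize", "rewrite", "ai", "model"];
  const isOnTopic = learningKeywords.some((kw) =>
    message.toLowerCase().includes(kw)
  );
  return isOnTopic ? "on-topic" : "off-topic";
}

// Stub for the AI call so the sketch is self-contained.
async function generateBotResponse(message: string): Promise<string> {
  return `(AI response to: ${message})`;
}

async function handleLearnerMessage(message: string): Promise<string> {
  if ((await classifyIntent(message)) === "off-topic") {
    // Decline politely and steer the learner back to the guided flow.
    return REDIRECT_MESSAGE;
  }
  return generateBotResponse(message);
}
```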
An annotated screen describes the rationale behind where and how UI elements are placed; the main content is vertically centered on the screen, with plenty of open space to anchor users’ attention on the chat elements.
This screen shows the UI in full context, including a loading indicator (three dots in the lower-right corner) that appears while the AI is processing information.
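As a sketch of the underlying behavior, assuming a React front end (the framework and the component and prop names are my assumptions, not the team’s actual stack), the indicator simply tracks whether a message is in flight:

```tsx
// Minimal sketch of the processing indicator, assuming a React front end.
// The component and prop names are illustrative, not the team's actual code.
import { useState } from "react";

type ChatInputProps = {
  // Assumed callback that sends the learner's message and resolves once
  // the AI's reply has been received and rendered.
  onSend: (text: string) => Promise<void>;
};

export function ChatInput({ onSend }: ChatInputProps) {
  const [text, setText] = useState("");
  const [isProcessing, setIsProcessing] = useState(false);

  async function handleSubmit() {
    if (!text.trim()) return;
    setIsProcessing(true); // show the three-dot indicator
    try {
      await onSend(text);
      setText("");
    } finally {
      setIsProcessing(false); // hide it once the reply arrives
    }
  }

  return (
    <div>
      <input
        value={text}
        onChange={(e) => setText(e.target.value)}
        disabled={isProcessing}
        placeholder="Message the bot…"
      />
      <button onClick={handleSubmit} disabled={isProcessing}>
        Send
      </button>
      {/* Loading indicator shown only while the AI is working */}
      {isProcessing && <span aria-live="polite">• • •</span>}
    </div>
  );
}
```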
The bot characterizes its own responses, provides explanations, and describes ways to phrase (or rephrase) instructions to adjust its output.
In this concept, the AI is both the tool and the instructor. It responds to the messages and provides useful tips to help learners understand how their messages influence its output.
Screens showing the second concept (Concept B), intended to facilitate interactions between learners, the AI-powered bot, and an author who is represented directly in the conversation.
In this concept, the session starts with the author and the bot introducing themselves and describing their respective roles to set user expectations.
Suggested prompts carry a star symbol to denote which messages will be interpreted and responded to by the AI itself.
Most stakeholders responded favorably to this concept, and to any concept where an author or guide was explicitly represented; the method of instruction delivery it implied seemed to resonate with them.
Screens showing the third concept (Concept C), where the author guides the conversation from outside the chat window.
In this concept, the AI behaves like any other tool: it waits for instructions to perform specific tasks, and those instructions are delivered by the author persona, who guides the interaction from outside the chat window.
I developed this design concept in response to stakeholder feedback about the perceived lack of clarity on the relationship between the bot and the guide in the previous concept. Stakeholders also expressed wanting to provide learners with an “inflection point” and a “moment of reflection” and they felt that the other concepts did not provide that.
Stakeholders were more interested in exploring this concept as a product than any of the others I defined, specifically because of its split-view layout; I believe its familiarity is what resonated with them.
Why do I keep mentioning stakeholders?
Although my preferred approach when developing design concepts or product solutions is to test them directly with target users, in the organizational and cultural environment where I devised these concepts, stakeholders expect solutions to align with their individual preferences on layout, design, and product direction.
For example, stakeholders have a strong preference for split-view layouts and information-dense user interfaces, and they believe that users react positively, and are more task-efficient, when there’s “a lot more to see on the screen.”
This is a difficult challenge to overcome as an individual contributor, especially in an organization with a top-down decision-making framework. Although I’ve always advocated for user-centric design methods, stakeholders have not always been receptive to or supportive of them.
Skeleton placeholders work as the feedback mechanism, communicating to learners that the AI is actively processing information.
Users follow guided instructions in a linear sequence until they complete the introductory exercises. Dialog options are represented as clickable tiles with a dashed outline.
A disabled text input field shows a message when learners hover over it, stating that they must complete the walkthrough before they can begin writing their own prompts.
The intention behind this is to ensure that all learners review the instructional guidance that we’ll prepare to equip them with a foundational understanding of Generative AI and its concepts.
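A minimal sketch of that gating behavior, again assuming a React front end; the walkthroughComplete prop and the hint copy are illustrative assumptions:

```tsx
// Minimal sketch of gating free-form input behind the walkthrough, assuming
// a React front end. The walkthroughComplete prop and hint copy are
// illustrative assumptions, not the actual implementation.
export function PromptInput({ walkthroughComplete }: { walkthroughComplete: boolean }) {
  const hint = "Complete the walkthrough before writing your own prompts.";

  return (
    // Wrapping span carries the hover message so it still shows while the
    // input itself is disabled.
    <span title={walkthroughComplete ? undefined : hint}>
      <input
        placeholder="Write your own prompt…"
        // Disabled until the learner finishes the guided, linear sequence
        disabled={!walkthroughComplete}
      />
    </span>
  );
}
```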
This screen shows the author (guide) informing learners that they can now experiment with writing their own prompts.
…and now, to give you some context
The following tells the story of the project itself, the historical context behind what initiated it, and how I collaborated with my product counterparts to define, design, and deliver a product experience aligned with business objectives.
It also highlights the specific design decisions I made to support the desired learning outcomes and high-level product goals.
I’ve organized the following artifacts and described my overall design process to showcase my analytical thinking, critical thinking, and problem-solving skills. If you’d like to understand how I make decisions, this is for you 😄
This was a stakeholder-initiated project; our goal was to develop a product that meets the needs of enterprises looking to upskill their workforce on GenAI.
Executive stakeholders across marketing and sales were looking to offer a product solution for enterprises that sought to provide members of their organization with foundational AI knowledge and skills.
The target audience was non-coders across departments such as HR, legal, finance, operations, marketing, manufacturing, and customer service, including project managers, designers, administrators, and executives.
Proto-Persona (Analytical Ana)
The following describes the presumed behavioral and attitudinal characteristics of the target audience for this learning experience, based on anecdotal evidence and domain exposure. These are things I assumed to be true about our audience that may relate directly to their goals and motivations for learning how to use AI, and that I believed would inform product and design decisions. They are educated assumptions, not validated empirical data, and not what I’d typically reference when making design decisions.
Why a conversational learning experience? Because it reflects the real-world use of GenAI tools like ChatGPT and Gemini.
Based on research, an effective learning experience is engaging, relevant to the learner, and interactive; it provides opportunities for application and promotes active participation.
I believed that a learning experience which mimics the real-world interactions users have with GenAI tools (like ChatGPT) would be compelling, and that it would enable learners to build accurate mental models they could draw on when putting their knowledge into practice. In short, it is:
- Accessible and practical: a chat-based, conversational format encourages active engagement, which I believed would help learners better retain the concepts.
- Personalized and representative of its real-world usage context: teaching about GenAI through a chatbot provides a practical, hands-on experience that mirrors the interactions learners would have with real AI tools, and a conversational approach allows for adaptive feedback, which makes the learning experience more personal.
Defining requirements for a chat-based AI-powered conversational learning experience
To identify requirements for a conversational interaction, I held a design exercise (a live workshop) with my product partner where we mapped out the desired flow of conversation between an artificially intelligent chatbot (the system) and learners.
Together we devised the supporting system behaviors that would create the desired user experience and support our users’ goals.
I learned about the conversational design exercise from Strategic Writing for UX by Torrey Podmajersky. Although this was our first time defining and designing a conversational learning experience, we successfully identified relevant requirements, which we analyzed to capture dependencies and implications. As part of the exercise we took turns assuming the roles of the learner and the system, then alternated while enacting various scenarios to identify and capture key considerations and experience requirements. Note: when discussing requirements for complex systems, I prefer to visualize concepts with stakeholders, typically as live sketches on a whiteboard.
Defining tasks, usage and context scenarios
I defined user and bot conversation flows based on the goals and desired outcomes, aligned with the high-level design approach.
Mapping the conversation flow based on product goals and user experience principles.
I created a flow diagram to convey the ideal task flow, which we collaboratively analyzed to identify functional and data requirements.
The following screen shows a portion of that flow where a feedback loop occurs as a user learns how to write effective prompts through experimentation.
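To make the loop concrete, here’s a minimal sketch of that portion of the flow expressed as a small state machine; the state and event names are illustrative assumptions, not the actual flow specification:

```ts
// Minimal sketch of the guided flow as a small state machine, including the
// prompt-experimentation feedback loop from the diagram. State and event
// names are illustrative, not the actual flow specification.

type FlowState =
  | "intro"
  | "guided-walkthrough"
  | "try-a-prompt"
  | "review-feedback"
  | "complete";

type FlowEvent =
  | { type: "NEXT" }
  | { type: "PROMPT_SUBMITTED"; effective: boolean };

function nextState(state: FlowState, event: FlowEvent): FlowState {
  switch (state) {
    case "intro":
      return event.type === "NEXT" ? "guided-walkthrough" : state;
    case "guided-walkthrough":
      return event.type === "NEXT" ? "try-a-prompt" : state;
    case "try-a-prompt":
      if (event.type !== "PROMPT_SUBMITTED") return state;
      // Feedback loop: the bot evaluates the prompt and either lets the
      // learner progress or routes them through feedback to refine it.
      return event.effective ? "complete" : "review-feedback";
    case "review-feedback":
      // After reviewing the feedback, the learner tries another prompt.
      return event.type === "NEXT" ? "try-a-prompt" : state;
    case "complete":
      return state;
  }
}

// Example: an ineffective first attempt loops back through feedback.
// nextState("try-a-prompt", { type: "PROMPT_SUBMITTED", effective: false })
//   -> "review-feedback"
```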
This learning experience is currently in active development
After several rounds of stakeholder feedback, we scoped an MVP release for concept testing, and I’m currently overseeing the development process, clarifying requirements and providing supporting deliverables as needed.
To my knowledge, no similar product experience for learning about GenAI currently exists in the market, and I’m optimistically anticipating a successful user-facing public release sometime in 2025 👍