How I optimized hands-on learning in interactive coding environments for usability
Since 2021, I’ve been embedded in one of O’Reilly’s dedicated product teams focused on meeting the needs of learners, teachers, and businesses that create training on open source technologies for their users, employees, and customers.
Hands-on learning is how people apply theoretical knowledge and skills, confirming they are truly learning and achieving their goals. The Interactive team at O’Reilly (my team) ensures customers have the best tooling and mechanisms to create hands-on learning experiences.
Below: A content (lab) detail and overview screen in high-fidelity
The following case study covers some of the most impactful outcomes I delivered, the historical context behind the chain of decisions, and the work I did to ensure a useful, intuitive, and compelling user experience while optimizing for usability.
I defined and designed the interface and user interactions to enable hands-on programming for software engineers.
Learners use the interface to take interactive lessons (Labs) to improve their understanding of and proficiency in coding, while experimenting with new technologies or programming languages.
This makes the usability of the coding environment’s interface a crucial component of their experience with the learning material and instructions.
It enables users to get hands-on coding practice.
I lead the definition and design of all related user-facing features and enhancements.
I work directly with a dedicated cross-functional team which includes 3 product managers, 7+ senior engineers, and a senior project manager.
My tasks and responsibilities (What I do):
- Lead all research efforts to understand user needs, elicit requirements, and identify areas of opportunity to improve the overall user experience.
- Analyze usage and behavioral data to understand how (and why) learners engage in interactive learning, and help shape the solutions they need to succeed.
- Design the flows, interactions, and layout that facilitate hands-on learning, helping users achieve their learning goals while optimizing for usability.
- Support product management to define, monitor, and report on data-informed feedback loops for each product initiative or feature release.
Designing key elements, flows, features and interactions:
While I design to enable discovery and define the browsing flows specific to interactive content, I focus primarily on the in-content learning experience, where users spend 96% of their time while on the platform.
The following outlines the key flows and components of the overall user experience, highlighting key design decisions I made and the features and capabilities I led from definition through implementation to improve learning material visibility, discoverability, and usability.
Individual content units are represented as cards in the browsing and discovery flows, including landing pages and search.
This is what a user sees and evaluates to understand the subject matter of any individual content item.
- By design, cards feature technology logos, which aid users in scanning to identify relevant or related content more easily (images are cognitively processed more quickly than text).
- The layouts adapt to accommodate varying content lengths. Because consistently displaying full titles is part of our content strategy, text is never truncated; it wraps to a new line when horizontal space is limited.
- Card heights extend dynamically to allow wrapping titles, preserving the layout’s integrity and preventing content overlap.
- The presentation is primarily text-dominant; this reduces our over-reliance on imagery and custom assets and limits the cognitive load caused by excessive imagery.
When cards are presented together in a list, each matches the height of the tallest card in its row.
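To illustrate, here is a minimal sketch of how this card behavior could be built in React with TypeScript; the component names and props are illustrative, not our production implementation. Flexbox’s stretch alignment gives equal heights per row while letting each row grow to fit its tallest wrapped title.

```tsx
// A sketch of the card layout behavior described above. Names are illustrative.
import React from "react";

const rowStyle: React.CSSProperties = {
  display: "flex",
  gap: 16,               // spacing on the 4px grid
  alignItems: "stretch", // each card stretches to match the tallest in its row
};

const cardStyle: React.CSSProperties = {
  flex: "1 1 0",              // equal widths; height grows with wrapped content
  padding: 16,
  overflowWrap: "break-word", // titles wrap to new lines rather than truncate
};

interface CardProps {
  title: string;    // always displayed in full, never truncated
  logoUrl?: string; // technology logo aids visual scanning
}

function ContentCard({ title, logoUrl }: CardProps) {
  return (
    <div style={cardStyle}>
      {logoUrl && <img src={logoUrl} alt="" height={32} />}
      <h3 style={{ marginTop: 8, marginBottom: 0 }}>{title}</h3>
    </div>
  );
}

function CardRow({ cards }: { cards: CardProps[] }) {
  // A flex container gives every card the row's height, while the row
  // itself grows to fit the tallest card.
  return (
    <div style={rowStyle}>
      {cards.map((card) => (
        <ContentCard key={card.title} {...card} />
      ))}
    </div>
  );
}
```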
Each browsing page organizes and lists all available content units for browsing and evaluation.
Users browse to understand which lessons are available to them. I designed these pages to minimize visual noise; an aesthetically minimal design allows the content to stand out and enables better scanning.
I designed the browsing page to optimize for content findability and discoverability; topics are listed alphabetically in a sidebar and represented as selection controls. Popular topics and topics of interest are featured at the top of the results as selectable tiles for quick access.
An aesthetically minimal interface allows the text-dominant content to stand out. Spacing is applied intentionally on a 4px scale to create vertical and horizontal rhythm.
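A spacing scale like the following keeps every value on the 4px grid; the token names and example usages are hypothetical, shown only to make the idea concrete.

```ts
// A hedged sketch of a 4px-based spacing scale; token names are hypothetical.
const SPACE = {
  xs: 4,  // e.g. tight gaps between a logo and its label
  sm: 8,
  md: 16, // e.g. padding inside cards
  lg: 24,
  xl: 32, // e.g. separation between major page regions
} as const;
// Restricting every margin and padding to these multiples of 4px keeps
// vertical and horizontal rhythm consistent across the browsing pages.
```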
By design, the cards representing sandboxes (preset coding environments) prominently feature technology logos. I sourced the most up-to-date logo for each programming tool or technology from official developer documentation sites and platforms.
Labs include authored step-by-step instructions, so their presentation is more text-dominant, while sandboxes are preset coding environments and are more image-dominant.
The in-content views are optimized for usability first and foremost, as they are where learners spend 96% of their time on the platform.
The core of the hands-on learning experience involves reading instructions and performing code-related tasks using an IDE (integrated development environment), a terminal, or both.
- I introduced a timer to allow users to keep track of their session’s duration when I discovered that we impose usage limits on labs to reduce the cost of allocating their resources (see the timer sketch after this list).
- The instructions are displayed in a fixed sidebar, and I ensure each element is optimized for legibility, readability, and accessibility, meeting both authors’ and learners’ needs.
- The sidebar includes interactive elements for copying code to a text editor, executing code commands, and showing solutions, each designed to share a cohesive visual identity and optimized for usability.
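As referenced above, here is a minimal sketch of how such a session timer could work, assuming the platform supplies the session’s allotted duration; the function and callback names are illustrative, not the production code.

```ts
// A sketch of a countdown timer for time-limited lab sessions.
function startSessionTimer(
  durationSeconds: number,
  onTick: (secondsRemaining: number) => void,
  onExpire: () => void,
): () => void {
  let remaining = durationSeconds;
  onTick(remaining); // render the initial time immediately
  const id = setInterval(() => {
    remaining -= 1;
    onTick(remaining);
    if (remaining <= 0) {
      clearInterval(id);
      onExpire(); // e.g. transition the lab to its expired state
    }
  }, 1000);
  return () => clearInterval(id); // cancel function for cleanup on exit
}
```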
Each element’s appearance is considered in context. The overall visual identity adapts O'Reilly's brand identity to design conventions from the developer documentation of popular front-end frameworks (Material UI) and functionally similar developer tools.
Managing technical complexity, uncovering edge cases and making compromises while optimizing for usability
I often receive feedback from multiple stakeholders; sometimes it is conflicting or very prescriptive. Everyone has their own view of what makes an ideal learning experience and, by extension, its UI: what content is available, and how it is formatted and expressed.
I have to balance all of that, managing stakeholder expectations while educating and advocating for a positive user experience for learners, and making trade-offs where applicable to optimize for performance.
Communicating system status
Because lab use is time-restricted and labs are technically live software (relying on a steady internet connection to remain available), they disconnect when the time expires or when network issues disturb the connection.
I defined and designed the logic for communicating state changes after convincing stakeholders of its importance to the core learning experience: the lack of state indication frustrated learners whose labs would become unresponsive while they were actively writing code.
Unanticipated technical constraints
Identifying specific technical constraints is a challenge as they usually become evident during development. In many cases, I define an ideal experience first, then work with engineering to resolve edge cases as they emerge.
I initially wanted to use distinct states to disambiguate errors in the loading process (which occasionally cause a lab to fail to connect properly) from errors that disconnect a running lab; however, our system did not provide that level of granular visibility into each component impacting lab status.
“We can’t do that” — the ultimate software developer trump card
For status indication, this meant we could surface only one state covering both failed loads and disconnects.
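A simplified status model under that constraint might look like the following sketch; the state names and messages are illustrative, not our production values.

```ts
// A sketch of the simplified status model: one generic failed state covers
// both load errors and mid-session disconnects, since the system could not
// distinguish them.
type LabStatus =
  | "connecting"   // environment is being provisioned or loading
  | "connected"    // lab is live and interactive
  | "expired"      // the usage time limit was reached
  | "disconnected" // any failure: a failed load or a dropped connection

function statusMessage(status: LabStatus): string {
  switch (status) {
    case "connecting":
      return "Starting your lab environment…";
    case "connected":
      return "Connected";
    case "expired":
      return "Your session time has expired.";
    case "disconnected":
      return "The lab lost its connection. Try reloading the page.";
  }
}
```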
Technical communication barriers and misalignment
Typically, code is the source of truth; however, due to specific nuances of our implementation model, UI elements inherit styles from multiple front-end frameworks simultaneously, which, despite our best attempts, remain difficult to override in some cases.
This frustrates software engineers during the QA phase (which I lead), when elements are not in the desired state despite their code reflecting the correct values.
We have to balance design intent and code; while the right values may be applied to the right properties, they don’t always render correctly and often need adjustment as a result.
Working with software engineers and managing individual working styles, workflows and preferences
While some can implement a design with pixel-perfect precision by referring to a clickable prototype, others are only successful if each and every nuance of a design is specified in detail, sometimes repeatedly…
Leading quality testing and enabling the conversion of over 1,000 labs to the new UI with confidence
As part of the design and development process, UI-impacting changes are reviewed and tested for quality prior to user-facing release, and I lead the initial rounds of the QA effort, ensuring parity with design intent from an interaction and visual design perspective.
While we could have reviewed each lab individually to ensure it was functional, met standards, and was technologically up to date, each change impacted over 1,000 labs, and we faced the urgency of delivering on a stakeholder-expected timeline, so we sought to implement a mechanism for identifying problematic labs post-implementation.
I designed flows and interactions for submitting in-product feedback as a mechanism for identifying problematic labs.
O’Reilly maintains a large number of labs which have been built by several authors over many years.
We deliberated at length on strategies for identifying problematic labs and ultimately defined the following simplified flow as an MVP.
We would introduce a feedback submission form in tandem with converting all remaining labs to the new UI. More content would be optimized for usability, and users could report issues with problematic labs for us to triage.
Because labs are subject to obsolescence as their underlying technologies are constantly updated, a problematic lab causes a negative user experience, and usability issues can exacerbate it.
The form featured two clear and distinct options for users to select from; we would prioritize any reported issue where users selected “The lab is not working”, a label I chose to compensate for potential ambiguities traditionally associated with feedback.
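For illustration, the report submitted by the form could be modeled roughly as follows; the field names, the second option’s wording, and the endpoint are assumptions, not our actual API.

```ts
// A sketch of the two-option feedback report. Names are illustrative.
type FeedbackReason =
  | "lab_not_working"  // maps to the prioritized "The lab is not working" option
  | "other_feedback";  // the second option's exact wording is hypothetical

interface FeedbackReport {
  labId: string;
  reason: FeedbackReason;
  comment?: string; // optional free-text detail for triage
}

async function submitFeedback(report: FeedbackReport): Promise<void> {
  await fetch("/api/lab-feedback", { // hypothetical endpoint
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(report),
  });
}
```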
We have received 100+ individual feedback reports from users since its release, the majority of which relate to system performance and stability.
It was clear that although the UI is effective, system-related performance and stability issues, along with a lack of data persistence, were the primary factors negatively impacting our KPIs.
With the interface now optimized for usability, users could follow the authors’ instructions to perform code-related tasks and progress effectively.
User feedback, or rather the lack of critical usability-related feedback, proved the success of the interface’s design from a usability standpoint.
User outcome: the design is sound and optimized for usability; however, system-related performance and stability issues linger.
I proactively identified and addressed potential usability issues as we progressed through each phase of development (seven in total, each a two-week sprint), collaborating with the product managers to scope and prioritize design work for implementation according to impact and technical feasibility.
To date, most of the feedback we’ve received has related to system performance and functional stability. The fact that we have received no usability-related feedback indicates the success of my design efforts, and although it’s yet to be publicly recognized, I rejoice in the realization that when everything works as expected…
To maximize impact, I prioritized performance, stability, and efficiency when enhancing the learning experience.
Despite the investment in “modernizing” the UI throughout the years and the usability improvements I’ve implemented surrounding it, our KPIs remain stable. They did not lead to significant changes in our key metrics (e.g. session duration, session frequency).
My research and experience in this area indicate that the target audience for a programming learning interface is willing to tolerate minor cosmetic or aesthetic issues; they are, however, unwilling to tolerate bugs, functional instability, network instability, latency (including input lag), major data loss, performance issues, and cognitive excise. I’ve documented and reported these specific attributes (KPI-impacting qualities) for the team to prioritize as the product evolves, so it can achieve meaningful business impact over time.