Chat-based tutoring: decreasing friction to maximize effective lesson time

Of the products I’ve worked on at Chegg, my favorite, and the one I’m most proud of, has been chat-based tutoring. Though it’s a fairly unassuming product that at first glance appears to be little more than a web-based chat client, what the student is interacting with is a complex service involving several humans and automated processes that make it all appear simple and approachable. The end result is that students can get specialized help with a specific problem, from another human, in real time, within a couple of minutes of their initial request, and without running afoul of academic honor codes.

The team

I worked with a lead researcher to explore solutions, and roped in the PM and our front-end lead to observe UX research sessions. I was the sole designer on this team and worked with everyone to define possible solutions, then produced all prototypes used for testing, and assisted with research.

The problem

After our product launch, we quickly realized a business need to add time limits for chat lessons. While most sessions were less than 45 minutes, a few outliers were taking multiple hours and causing resourcing issues. So, we decided to limit lessons to one hour. In most cases this is a generous amount of time, but with the introduction of the limit, we needed to do everything we could to make sure the product touchpoint (the chat client) was staying out of the way of the student’s conversation with the tutor, and not adding any needless friction.

One major point of friction in the experience was the “scratchpad,” the main supplementary tool we provided to the student in the chat client. It was essentially a drawing canvas that they could use to work through a math problem, draw diagrams, or share images. Though we planned to eventually introduce more tools, we found in our MVP research that the scratchpad was the tool most applicable to the widest range of academic subjects, from math to sciences to humanities.

The problem was that our scratchpad tool didn’t match students’ mental model of how this sort of thing is supposed to work. Almost across the board, students assumed it was a shared whiteboard space that would be visible to both parties in real time. In reality, the scratchpad was a private space. To share content from the scratchpad, the student had to select it using a “screen clip” tool and send it into the text chat. Only then would the content be shared with the tutor. The whole flow of opening the scratchpad, drawing something, and then sharing the image looked like this:

As far as usability problems go, this was annoying but not catastrophic. Remember that this product is actually a service, and the chat client is only a touchpoint. In most cases, the tutor would simply tell the student how to send their scratchpad content into the chat, and then the lesson would continue. However, we found that in certain subjects that make heavy use of visuals or non-typable content (like equations), tutors were spending a significant portion of the lesson time explaining how our tool worked.

The constraints

As with many complicated products, several components of our MVP, including the scratchpad, had made use of third-party libraries. We had limited engineering resources for hacking on them, so we were looking for smaller tweaks that would solve the most pressing points of friction without redesigning the tool from the ground up.

Ideation + RITE testing

It was obvious to most of the team that there was some low-hanging fruit we could tackle to see some improvement. But would it be enough? We worked in an extremely scrappy and agile environment in those days. Schedules were tight, and we didn’t have a lot of engineering support to conduct research. We needed to be sure that the solutions we implemented would be effective. You know… all the familiar “agile UX” challenges.

Fortunately, we were able to combine our ideation with some evaluative research. I’ve said it in other posts, but this is my favorite way to do usability testing.

Round 1

We tested what seemed to us the most obvious low-hanging fruit:

  • Opened the scratchpad when the page loaded, instead of requiring the user to discover it.
  • Removed the “how to use the scratchpad” modal (our analytics and user research had shown that people closed it as soon as they saw it, so it wasn’t providing any clarity).
  • Added a “graph paper” background to the scratchpad, to more clearly delineate it from the rest of the chat client.
  • Removed the “screen clip” tool from the toolbar, and replaced it with a large CTA that appeared as soon as something had been drawn on the scratchpad, which launched the same tool.
  • Inverted the selection color of the screen clip tool. Formerly, the selected area had been dark gray, and the unselected area had no color applied to it, which increased cognitive load when first learning how to use the tool. The redesigned version included a default selection that the user would adjust before sending.
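
The fourth change, showing the “Send to chat” CTA only once something has been drawn, boils down to a tiny bit of state logic. Here’s a hypothetical sketch (the names and the pure-function shape are mine for illustration, not the production code):

```typescript
// Hypothetical sketch: the "Send to chat" CTA replaces the old toolbar
// "screen clip" icon, and stays hidden until the first stroke lands.
type ScratchpadState = {
  hasContent: boolean;   // true once anything has been drawn
  showSendCta: boolean;  // controls the large "Send to chat" button
};

function initialState(): ScratchpadState {
  return { hasContent: false, showSendCta: false };
}

// Called on every completed stroke (e.g. pointerup on the canvas).
function onStrokeCommitted(state: ScratchpadState): ScratchpadState {
  return { ...state, hasContent: true, showSendCta: true };
}
```

The point of keeping this as a pure state transition is that the CTA’s visibility follows directly from whether the canvas has content, rather than from a separate onboarding step.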

You can see the fully interactive prototype here. For the purposes of this test, you’ll have to wait until the tutor asks you to draw a normal curve before the scratchpad is enabled.

Opening the scratchpad on pageload and giving it the graph paper background was a success. Students instantly recognized that the left area of the screen was for typing, and the right area of the screen was for drawing; no onboarding content necessary.

The redesigned screen clip tool was only partially successful, however. Even though the dynamic “Send to chat” CTA was much more visually salient than the old icon in the toolbar, students’ assumption that the scratchpad was a real-time whiteboard was so strong that they overlooked the button. When their attention was called to it, they understood, and the inverted selection color of the tool was more helpful than the previous version.

So, we had made progress, but we weren’t there yet.

Round 2

Keeping in mind that we were still attempting to make the most minimal possible changes, we built on our insights from Round 1 to try to make things a bit more obvious. This time, when the user’s cursor entered the drawing area, we used the information box from our design system to show this message:

This didn’t work. Similar to the old “how to use the scratchpad” modal, our participants instantly closed the information box so they could focus on the scratchpad area. We had hoped that because the information box looked more like the rest of the UI, it would trigger less of a “close the pop-up” response, but alas, it was not to be.

At this point, I began to think we might need to make a larger change than we would like. It seemed like we had applied so many tweaks and adjustments to the current system, yet the “realtime whiteboard” concept was so powerfully conveyed by the layout of the screen that our efforts just weren’t having an effect.

Before we tried something else, though, I wanted to be certain that simply attempting to draw our users’ attention to the “send to chat” CTA would not meet our goal.

Round 2.5

So, in a mini-round, I added a little bit of motion to make the CTA even more noticeable. Now, understand, this was mainly for experimental purposes. In general I think that while motion can enhance usability, we shouldn’t rely on it for meaning. Not to mention, if you’re building it correctly, your product will respect a user’s accessibility settings, which may eliminate motion. But the point of this test was to see if there was any way we could make the current system work. It was a bit ridiculous:

You can see the fully interactive prototype for Round 2.5 here.
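As an aside on the accessibility point above: in a web client, respecting the user’s motion settings comes down to checking the `prefers-reduced-motion` media query. A minimal, hypothetical sketch (the function name is mine, and the query function is injected so the logic can run outside a browser):

```typescript
// Hypothetical sketch: gate the attention-grabbing CTA animation behind
// the user's OS-level reduced-motion preference.
type MediaQuery = (query: string) => { matches: boolean };

function shouldAnimateCta(matchMedia?: MediaQuery): boolean {
  // No matchMedia available (e.g. server render): default to no motion,
  // since motion is an enhancement, never a carrier of meaning.
  if (!matchMedia) return false;
  return !matchMedia("(prefers-reduced-motion: reduce)").matches;
}
```

In the browser you would pass `window.matchMedia.bind(window)`; anywhere `matches` is true for the reduced-motion query, the animation simply doesn’t run.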

Fortunately, this round showed that the motion not only didn’t draw attention to the CTA, it possibly made it even less noticeable. Perhaps because motion in advertisements is so ubiquitous nowadays, our participants’ eyes skipped right over the entire section of the scratchpad with the “send to chat” CTA.

One of the many nice things about the RITE method is that it’s extremely engaging. Team members are able to sit in on sessions and, in a very short timeframe, see design iterations succeed or fail. Throughout this research, I was constantly grabbing our PM and our eng lead and taking them to watch the sessions. So by this time, while we were all hoping that we could achieve our goals with minimal changes to the existing scratchpad, everyone was on the same page that our minimal changes weren’t going to cut it.

Round 3

So, for the third round, we tried something that, from a user’s standpoint, might seem like a pretty radical change. On the engineering side, what we tested would require a bit more work, but would still allow us to use the same third-party library and many interactions we had already built.

Rather than keeping the same page layout with the scratchpad embedded on the page, we completely removed it. Now, to send a drawing, a user would click on the “drawing pencil” icon inside the text input box itself. This would open a “New drawing” modal overlaying the chat client, making it clear to the user that they couldn’t continue the conversation until they sent the drawing:

See the fully interactive prototype here.
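The gating behavior described above, with the chat input disabled while the drawing modal is open, amounts to another small piece of state logic. A hypothetical sketch (names are mine, not the production implementation):

```typescript
// Hypothetical sketch of the Round 3 flow: the "New drawing" modal
// blocks the chat input until the drawing is sent (or discarded).
type ChatState = { modalOpen: boolean; inputEnabled: boolean };

const idle: ChatState = { modalOpen: false, inputEnabled: true };

// Clicking the "drawing pencil" icon in the text input box.
function openDrawingModal(_: ChatState): ChatState {
  return { modalOpen: true, inputEnabled: false };
}

// Sending (or discarding) the drawing returns to the conversation.
function closeDrawingModal(_: ChatState): ChatState {
  return { modalOpen: false, inputEnabled: true };
}
```

Because the two states are mutually exclusive, there’s no ambiguous in-between moment where a user might expect the tutor to be watching them draw.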

This completely changed the way our participants thought about the scratchpad. We didn’t need any new onboarding content to explain how to use it, and we were able to completely remove the screen clipping tool.

When we rolled it out, the instances of tutors needing to explain how the scratchpad worked dropped to zero, and the effective lesson time available to students increased. As an added bonus, the new scratchpad form factor was more compatible with our responsive web transition, as well as the “embedded” version of our chat client, which is a story for a later entry.