Friday, January 8, 2016

Teaching usability of open source software

In the fall semester, I taught an online class on the usability of open source software (CSCI 4609 Processes, Programming, and Languages: Usability of Open Source Software). This was not the first time I helped others learn about usability testing in open source software, having mentored GNOME usability testing for both Outreachy and the Outreach Program for Women, but it was the first time I taught a for-credit class.

I'd like to share some reflections on this class.

CSCI 4609 was an elective in the computer science program at the University of Minnesota Morris, where I was the Director of Information Technology. (I have since moved on to a new CIO position in St. Paul, Minnesota.) The class was taught entirely online.

I was pleased that ten students signed up for the elective. This may seem small, but it is a significant number for a campus of some 1,900 students and a small computer science department. The same number of students also signed up for other electives that semester, including a course on databases.

I organized the class similarly to the usability projects I mentor for Outreachy. Over thirteen weeks, students learned about open source software and usability testing. Most weeks included two assignments: summarizing several assigned articles, and exercising their knowledge of that week's topic. Later in the semester, students moderated two in-person usability tests; the first was a "dry run" for the final project.

For some students, this course was their first introduction to open source software. So I began the course with an introduction to free software and open source software. What is open source software? How is open source software different from proprietary/commercial software? I provided links to several articles on free software and open source software, including the Free Software Definition and the Open Source Definition.

The second week introduced the topic of usability. What does usability mean? How is usability different from user experience (UX)? Also that week, students examined how usability applies to open source software. Why is usability an issue in open source software? Is this any different from proprietary/commercial software?

In the following weeks, I helped students "ramp up" to usability testing. We learned about the process of usability testing, from defining Personas and drafting Use Scenarios to identifying Scenario Tasks. Each week's assignment challenged students to learn about the topics through several online articles, and then to apply what they learned by writing their own Personas, Scenarios, and Tasks. Throughout, students reviewed and commented on each other's reflections, and I provided guidance and helped students to build their own understanding of the usability testing process.
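To see how these pieces nest, here is a minimal sketch in Python, using the LibreOffice persona and a few of the tasks that appear later in the course. The class names and wording are illustrative, not part of any formal usability framework:

```python
from dataclasses import dataclass, field

@dataclass
class ScenarioTask:
    # The concrete instruction handed to a tester, one at a time.
    instruction: str

@dataclass
class UseScenario:
    # A realistic situation in which the persona uses the software.
    description: str
    tasks: list = field(default_factory=list)

@dataclass
class Persona:
    # A fictional but representative user of the software.
    name: str
    description: str
    scenarios: list = field(default_factory=list)

student = Persona(
    name="First-year student",
    description="18-year-old liberal arts major, using LibreOffice for the first time",
    scenarios=[
        UseScenario(
            description="Formatting a class paper to match the instructor's requirements",
            tasks=[
                ScenarioTask("Indent each paragraph in the document."),
                ScenarioTask("Double space the document."),
                ScenarioTask("Check the word count."),
                ScenarioTask("Save the document as a PDF."),
            ],
        )
    ],
)
```

The ordering matters: Tasks are derived from Scenarios, and Scenarios from Personas, so each test instruction can be traced back to a plausible user need.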

Before starting their own usability test, students also learned to do a user interface analysis. What makes up a user interface? How are these UI elements used? Do they have specific purposes? Are any common between platforms? We referenced the GNOME Human Interface Guidelines, as well as other freely available references on user interface design. Understanding the user interface elements is an important step to ensure that a usability test properly exercises the UI elements you are interested in. A usability test isn't very useful if it only hits one or two menus.

Students then moderated their own usability test mini-project. Almost everyone had used LibreOffice or OpenOffice before, so we used LibreOffice as our program to test. To prepare for the usability test, students went through the standard process: they identified sample Personas of possible users, documented several Use Scenarios for each Persona, and created Scenario Tasks that addressed (most of) the Personas and Scenarios for LibreOffice.

The usability test mini-project was a very straightforward test. Our sample Persona was an 18-year-old university student (liberal arts major) using LibreOffice for the first time. Our Scenario Tasks (the tasks that form the usability test) exercised only basic functionality. This was not a "deep dive" into Microsoft Word fidelity or exercising "power user" menus; these tasks were about a college student working on a class paper.

I was really pleased with how well the students did in preparing for the usability test mini-project. They clearly put a lot of thought into the process, as demonstrated by the completeness of the Personas and Scenarios we ended up with. The ten scenario tasks for the mini-project asked testers to Indent each paragraph in a document that was provided to them, Double space the document, Switch the order of paragraphs in the document, Check the word count, and Save as PDF. Testers then had to Create a new document, Add a title and center it, Add a page number/header, Save the document in LibreOffice format, and Save as Word 2010 format.

For the mini-project, each student moderated a usability test with one tester, all using the same scenario tasks. This meant we had ten data points. From the combined data, we generated a heat map to display our results:

In a heat map, you code the difficulty the tester experienced during each task. Each column represents a different tester, and each row shows a scenario task, summarized in a few words. If the tester was able to do the task smoothly, without difficulty, the cell is green. Minor difficulty is coded in yellow; increasing difficulty is represented in orange and red. If the tester was unable to complete the task at all, the cell is black, and white shows a task that was skipped during the usability test.
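This coding scheme can be sketched in a few lines of Python. The difficulty values and the results grid below are invented for illustration; they are not the class's actual data:

```python
# Difficulty codes used in the heat map, from easiest to hardest.
# 0 = skipped (white), 5 = unable to complete (black).
COLORS = {0: "white", 1: "green", 2: "yellow", 3: "orange", 4: "red", 5: "black"}

# Rows are scenario tasks; columns are testers. Values use the codes above.
# These numbers are made up for illustration only.
results = {
    "Indent paragraphs": [1, 1, 2, 1, 1],
    "Double space":      [2, 3, 1, 2, 4],
    "Save as PDF":       [1, 1, 1, 5, 1],
}

def row_colors(codes):
    """Map one task's difficulty codes to heat-map cell colors."""
    return [COLORS[c] for c in codes]

def hardest_tasks(results):
    """Rank tasks by total difficulty across testers, hardest first."""
    return sorted(results, key=lambda task: sum(results[task]), reverse=True)

for task in hardest_tasks(results):
    print(f"{task:20s} {' '.join(row_colors(results[task]))}")
```

Ranking the rows this way makes the point of the heat map visible at a glance: the "hottest" rows are the tasks where the design most needs attention.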

I used this heat map in a discussion a few weeks ago on LibreOffice's design changes.

The usability test mini-project was a "dry run" for the students. I wanted them to make mistakes here, and to learn from those mistakes. One tester was enough for the students to learn about what it takes to be a successful moderator. They also learned the challenges of moderating your own usability test: You can't give hints. You need to be prepared. You need to present each Scenario Task to the tester one at a time.

In the weeks that followed, over the rest of the semester, students worked on their own usability test as part of their final project. We started by deciding which open source software program each student wanted to study in their usability test final project. A few students wanted to look at other parts of LibreOffice, but most chose other open source software projects, including web browsers, voice chat software, and text editors. One enterprising student chose to examine the Ubuntu desktop (this can be a big project, so we narrowed it down to focus on only a few features and utilities).

Students then identified the possible users for their selected open source software program, documented a few sample Personas, and wrote several Use Scenarios for each Persona. From this background material, students created their Scenario Tasks. Along the way, students commented on each other's work, and I added my own coaching.

Over two weeks, each student moderated their own usability test of the software, using at least five testers.

To wrap up the course, students wrote a final paper about their usability test, using the format of an article for a journal or trade magazine. To lend some realism to their final paper, I provided students with a template borrowed from a professional journal, although it was somewhat simplified to better suit an undergraduate project.
Over the semester, students learned about open source software and usability testing. This was essentially the same outline as the usability projects I mentor in Outreachy, although with more practice at each step of the process. In Outreachy, I help interns learn about usability testing, then we jump right into a usability test of GNOME. Perhaps the pedagogy used in the course provides a better structure for learning about usability testing. More importantly, the course outline provided for a "dry run" usability test where students could make mistakes without affecting their final project. I may consider adjusting future Outreachy projects to include a similar mini-project.

The next round of Outreachy begins soon! I haven't decided if I'll have time to mentor this round, since my new role is much more intensive than my previous one. I'll have to make that decision by February 9, 2016. Applications for Outreachy will open on February 9 and the deadline for applying will be March 22. The internship dates will be from May 23 to August 23.

If you are interested in applying for a usability project in the next Outreachy, leave a comment here or send me an email. It would help me to gauge interest as I decide my availability.
image: University of Minnesota Morris
