Friday, January 15, 2016

Possible usability testing topics

You may have seen Allan's recent post about Settings design updates. I wanted to connect this back to my offer to mentor usability testing in the next GNOME Outreachy.

Interns who are interested in usability testing can suggest other areas of GNOME for study, but I recommend we focus on the updated features in Allan's note. If we can work with a test build of GNOME with the new features, I would like to see a usability test of:
Can testers quickly navigate to the correct system settings, and change the settings to a new appropriate value?

Can testers adjust volume settings for media, alerts, and other sound events?

Can testers adjust display modes for mirroring two screens, joining two screens, or displaying to a projector?

Can testers set up a new printer, or adjust settings for an existing printer? One possible use scenario would be changing the driver to get better printed output.

Can testers configure a new wireless network?

If you or someone you know is interested in applying for a usability project in the next Outreachy, leave a comment here or send me an email at jhall - at - gnome - dot - org. It would help me to gauge interest as I decide my availability.
image: Outreachy

Monday, January 11, 2016

Interested in usability testing? Let me know!

In case you missed this note at the end of my (very long) post about Teaching open source usability:

The next round of Outreachy begins soon! I haven't decided if I'll have time to mentor this round, since my new role is much more intensive than in my previous organization. I'll have to make that decision by February 9, 2016. Applications for Outreachy will open on February 9 and the deadline for applying will be March 22. The internship dates will be from May 23 to August 23.

If you or someone you know is interested in applying for a usability project in the next Outreachy, leave a comment here or send me an email. It would help me to gauge interest as I decide my availability.
image: Outreachy

Friday, January 8, 2016

Teaching usability of open source software

In the fall semester, I taught an online class on the usability of open source software (CSCI 4609 Processes, Programming, and Languages: Usability of Open Source Software). This was not the first time I had helped others learn about usability testing in open source software, having mentored GNOME usability testing for both Outreachy and the Outreach Program for Women, but it was the first time I taught a for-credit class.

I'd like to share some reflections on this class.
CSCI 4609 was an elective in the computer science program at the University of Minnesota Morris, where I was the Director of Information Technology. (I have since moved on to a new CIO position in St. Paul, Minnesota.) The class was taught entirely online.

I was pleased that ten students signed up for the elective. This may seem small, but it is a significant number for a campus of some 1,900 students and a small computer science department. The same number of students also signed up for other electives that semester, including a course on databases.

I organized the class similarly to the usability projects I mentor for Outreachy. Over thirteen weeks, students learned about open source software and usability testing. Most weeks included two assignments: summarizing several assigned articles, and exercising their knowledge of that week's topic. Later in the semester, students moderated two in-person usability tests; the first was a "dry run" for the final project.

For some students, this course was their first introduction to open source software. So I began the course with an introduction to free software and open source software. What is open source software? How is open source software different from proprietary/commercial software? I provided links to several articles on free software and open source software, including the Free Software Definition and the Open Source Definition.

The second week introduced the topic of usability. What does usability mean? How is usability different from user experience (UX)? Also in this week, students examined how usability applies to open source software. Why is usability an issue in open source software? Is this any different from proprietary/commercial software?

In the following weeks, I helped students "ramp up" to usability testing. We learned about the process of usability testing, from defining Personas and drafting Use Scenarios to identifying Scenario Tasks. Each week's assignment challenged students to learn about the topics through several online articles, and then to apply what they learned by writing their own Personas, Scenarios, and Tasks. Throughout, students reviewed and commented on each other's reflections, and I provided guidance and helped students to build their own understanding of the usability testing process.
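The Persona → Use Scenario → Scenario Task hierarchy described above can be sketched as simple data structures. This is purely illustrative; the names and sample content are invented examples, not material from the class:

```python
from dataclasses import dataclass, field

# Illustrative sketch of the usability-test planning artifacts:
# a Persona has several Use Scenarios, and each Scenario breaks
# down into concrete Scenario Tasks given to testers.

@dataclass
class ScenarioTask:
    description: str          # the concrete task a tester is asked to do

@dataclass
class UseScenario:
    summary: str
    tasks: list = field(default_factory=list)

@dataclass
class Persona:
    name: str
    background: str
    scenarios: list = field(default_factory=list)

# Invented example persona, loosely modeled on the kind used in class
student = Persona(
    name="Taylor",
    background="18-year-old liberal arts major, first-time LibreOffice user",
)
paper = UseScenario(summary="Write a class paper")
paper.tasks.append(ScenarioTask("Indent each paragraph in the document"))
paper.tasks.append(ScenarioTask("Save the document as a PDF"))
student.scenarios.append(paper)

print(f"{student.name}: {len(paper.tasks)} tasks in '{paper.summary}'")
```

Writing the artifacts down in this order (persona first, tasks last) mirrors the process the students followed: the tasks only make sense once you know who the user is and what they are trying to accomplish.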

Before starting their own usability test, students also learned to do a user interface analysis. What makes up a user interface? How are these UI elements used? Do they have specific purposes? Are any common between platforms? We referenced the GNOME Human Interface Guidelines, as well as other freely available references on user interface design. Understanding the user interface elements is an important step to ensure that a usability test properly exercises the UI elements you are interested in. A usability test isn't very useful if it only hits one or two menus.

Students then moderated their own usability test mini-project. Almost everyone had used LibreOffice or OpenOffice before, so we used LibreOffice as our program to test. To prepare for the usability test, students went through the standard process: they identified sample Personas of possible users, documented several Use Scenarios for each Persona, and created Scenario Tasks that addressed (most of) the Personas and Scenarios for LibreOffice.

The usability test mini-project was a very straightforward test. Our sample Persona was an 18-year-old university student (liberal arts major) using LibreOffice for the first time. Our Scenario Tasks (the tasks that form the usability test) exercised only basic functionality. This was not a "deep dive" into Microsoft Word fidelity or exercising "power user" menus; these tasks were about a college student working on a class paper.

I was really pleased how well the students did in preparing for the usability test mini-project. They clearly put a lot of thought into the process, as demonstrated by the completeness of the Personas and Scenarios we ended up with. The ten scenario tasks for the mini-project asked testers to Indent each paragraph in a document that was provided to them, Double space the document, Switch the order of paragraphs in the document, Check the word count, and Save as PDF. Testers then had to Create a new document, Add a title and center it, Add a page number/header, Save the document in LibreOffice format, and also Save as Word 2010 format.

For the mini-project, each student moderated a usability test with one tester, all using the same scenario tasks. This meant we had ten data points. From the combined data, we generated a heat map to display our results:
In a heat map, you code the difficulty the tester experienced during each task. Each column represents a different tester, and each row represents a scenario task, summarized here in a few words. If the tester was able to do the task smoothly, without difficulty, the cell is green. Minor difficulty is coded in yellow. Increasing difficulty is represented in orange and red. If the tester was unable to complete the task at all, the cell is black. (White shows a task that was skipped during the usability test.)

I used this heat map in a discussion a few weeks ago on LibreOffice's design changes.

The usability test mini-project was a "dry run" for the students. I wanted them to make mistakes here, and to learn from those mistakes. One tester was enough for the students to learn about what it takes to be a successful moderator. They also learned the challenges of moderating your own usability test: You can't give hints. You need to be prepared. You need to give testers each Scenario Task one at a time.

In the weeks that followed, over the rest of the semester, students worked on their own usability test as part of their final project. We started by examining what open source software program they wanted to study in their usability test final project. A few students wanted to look at other parts of LibreOffice, but most chose other open source software projects including web browsers, voice chat software, and text editors. One enterprising student chose to examine the Ubuntu desktop (this can be a big project, so we narrowed it down to focus on only a few features and utilities).

Students then identified the possible users for their selected open source software program, documented a few sample Personas, and wrote several Use Scenarios for each Persona. From this background material, students created their Scenario Tasks. Along the way, students commented on each other's work, and I added my own coaching.

Over two weeks, each student moderated their own usability test of the software, using at least five testers.

To wrap up the course, students wrote a final paper about their usability test, using the format of an article for a journal or trade magazine. To lend some realism to their final paper, I provided students with a template borrowed from a professional journal, although it was somewhat simplified to better suit an undergraduate project.

Over the semester, students learned about open source software and usability testing. This was essentially the same outline as the usability projects I mentor in Outreachy, although with more deliberate practice of the process. In Outreachy, I help interns learn about usability testing, then we jump right into a usability test of GNOME. Perhaps the pedagogy used in the course provides a better structure for learning about usability testing. More importantly, the course outline provided for a "dry run" usability test where students could make mistakes without affecting their final project. I may consider adjusting future Outreachy projects to include a similar mini-project.

The next round of Outreachy begins soon! I haven't decided if I'll have time to mentor this round, since my new role is much more intensive than in my previous organization. I'll have to make that decision by February 9, 2016. Applications for Outreachy will open on February 9 and the deadline for applying will be March 22. The internship dates will be from May 23 to August 23.

If you are interested in applying for a usability project in the next Outreachy, leave a comment here or send me an email. It would help me to gauge interest as I decide my availability.
image: University of Minnesota Morris

Changing jobs

I wanted to share a brief update that I have changed jobs. As of December 2015, I am the new Chief Information Officer for Ramsey County, Minnesota. Ramsey is the second-largest county (by population) in the State of Minnesota, and includes the state capital, St. Paul.

I leave behind a wonderful IT organization at the University of Minnesota Morris, where I served as IT Director and IT Executive for the last five and a half years. Before that, I was at the University of Minnesota Twin Cities in Minneapolis, where I led Enterprise Operations and Infrastructure as part of the systemwide Office of Information Technology. Overall, I was with the University of Minnesota system for some 17 years, most of my 20 years total in Information Technology.

I still plan to continue my work in open source software. So that means I will keep posting items about the usability of open source software. And I'll continue working on my other open source software projects, most notably the FreeDOS Project.

But I do find my new role is much more intensive than in my previous organization, so I don't have much free time to check personal email. So if you try to contact me and I don't get back to you right away, please understand that it may take me a few days to reply. I usually set aside time to do email over weekends.
image: Ramsey County website

Monday, January 4, 2016

Eye tracking in usability tests

A few weeks ago, I read an article about a Python-based open source eye tracking tool.

Eye tracking can be an important tool in usability testing. When we conduct a usability test, we usually ask participants to speak aloud whatever they are thinking during the test. For example, if the tester is looking for a Print button, we encourage the tester to say "I'm looking for a Print button." Using the "speak aloud" method allows the moderator or observer to take notes on what happened while the tester was trying to complete each scenario task.

This works well as long as testers are willing to talk out loud and give a "stream of consciousness" narration. Some testers do this better than others; some prefer not to do it at all. But without that input, we don't know why a tester had problems completing a task. Was the tester looking for a menu instead of an icon on the tool bar? Where was the tester looking on the screen for the solution? If we know the answers to these questions, we can better understand how users approach the software. In turn, the designers and developers can modify the interface to make the software easier to use.

That's why I wish eye tracking were more easily available. And with PyGaze, it looks like this may finally be within reach of open source usability testing! PyGaze is a software toolbox that, among other things, provides eye tracking. You can learn more about PyGaze, including samples of heat maps, fixation maps, and scan paths, at the page that describes PyGaze Analyzer.

Here's a sample image from the PyGaze website, showing an eye tracking session for a website. "Figure 7 shows that our volunteer first looked at the pictures on the documentation site of OpenSesame (an open-source graphical experiment builder that’s becoming popular in the social sciences), and then started to read the text."

As we do more usability testing with GNOME via Outreachy, I hope our future interns can get PyGaze working so we can examine eye tracking along with our other usability data.
image: Alper Çuğun