Cloud Computing UX

Studying and Improving the Usability of Scientific Infrastructure
A screenshot of a load average graph shown to testbed users.

The Project:

It is often said that software developers design for themselves. Scientific infrastructure is no different. To break that pattern and improve the usability of scholarly technology, my collaborators at the University of Utah (Jason Wiese, Kazi Sinthia Kabir, & Tamanna Motahar) and I are doing user research for a cloud computing testbed. The testbed allows users to construct clouds of their own, arranging nodes in whatever configuration is necessary and directly controlling the hardware they use. Through interviews and contextual inquiries, we are learning how researchers leverage the testbed, what their pain points are, and how the testbed can be better aligned with scholarly workflows.

Our research is ongoing. A current focus is optimizing time spent on the testbed by encouraging swift return of nodes and reducing the likelihood of data loss from missed export deadlines. We believe there is an opportunity to simultaneously meet the goals of the testbed developers (e.g., increasing access to nodes), reduce these pain points, and facilitate open science and reproducible work through automation.

My Role:

I joined the project as a postdoc for Dr. Wiese in fall 2022 and have been conducting interviews and observation sessions along with graduate students Kazi Sinthia Kabir and Tamanna Motahar. I brought sociological expertise to the otherwise UX-focused project, prompting a publication submission to CSCW '23 (and subsequent acceptance with minor revisions!). In that manuscript, we frame use of the cloud computing testbed as a collaborative activity in which users optimize node allocation both for themselves and for the user base as a whole. Using that lens, we recognized opportunities for design interventions that increase users' collective understanding of time on the testbed and encourage quick node return.

As our research has progressed, I have begun meeting regularly with the development team to convey our insights and better appreciate their goals. We are currently evaluating several design interventions through surveys and interviews and hope to implement changes soon.

Simultaneously, we are exploring the artifact evaluation process, hoping to identify ways that infrastructure like CloudLab can better support reproducible research. This research is ongoing.

Keep in Touch

I look forward to hearing from you.
hannah.cohoon@utah.edu
School of Computing, University of Utah: MEB 4154