Tasktop Integration Hub suffers from an interesting, almost paradoxical problem: the users who benefit most from our product often have no idea that it exists. They've never seen its user interface (UI), never done any configuration, and probably couldn't even name our product if asked. When our product functions as it should, there's no reason for users to know about it. The "users" of our product simply reap the benefits of the automated flow of information from tool to tool: improved communication, fewer silos, and less time wasted on data entry.
I'm reminded of the U.S. government shutdown in early 2019. During that time, many people gained a new appreciation for our park rangers – a role we rarely stop to think about. It was only when those park rangers stopped working during the government shutdown that we truly grew to understand their enormous impact. The lack of rangers led to park damage, closed bathrooms, overflowing garbage cans, and theft of natural resources (Joshua Tree is projected to take 200-300 years to recover from the shutdown!). Similarly, most users who benefit from Tasktop only become aware of it when it stops working. And even then, they rarely see or interact with our product's UI.
An interesting byproduct of this phenomenon for a Product organization is that it makes user research and usability testing much harder to carry out. When designing the UI of our product, we must focus on one specific user: the tool admin who configures and maintains the integrations. Though Tasktop benefits all the specialists using the tools that Hub integrates, our user experience (UX) only directly touches a small set of users at an organization (often just one person!), the admin configuring those integrations. Our product is an enterprise business tool designed for a specific business domain, in contrast to consumer-facing products built for huge numbers of diverse users.
Because Hub is primarily an on-premises (desktop) application, we aren't able to glean insights from usage data the way we could with a cloud application. Without a way to track anonymous click-path analytics across our on-prem users, usability testing has become our central method of user research for shaping product design.
When we kicked off our user research program, our goal was to generate objective data to guide our UX decisions. In a usability testing session, a subject attempts to complete a task by working through a wireframe our UX team has created. By assessing whether users successfully complete the task (along with additional metrics such as number of clicks and time on task), we can identify which areas of a feature most need focus and UX improvement. These sessions enable us to collect data illuminating a wide range of user experiences to support the decisions we make regarding feature design.
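To make that concrete, here is a minimal sketch (in Python, and not Tasktop's actual tooling; every name in it is hypothetical) of how metrics from a batch of sessions might be rolled up to flag the tasks most in need of attention:

```python
# Hypothetical illustration of aggregating usability-session metrics.
from dataclasses import dataclass
from statistics import mean

@dataclass
class Session:
    subject: str      # test subject identifier
    task: str         # task attempted in the wireframe
    completed: bool   # did the subject finish the task?
    clicks: int       # number of clicks taken
    seconds: float    # time on task

def summarize(sessions: list[Session], task: str) -> dict:
    """Completion rate, average clicks, and average time for one task."""
    runs = [s for s in sessions if s.task == task]
    return {
        "task": task,
        "completion_rate": mean(1.0 if s.completed else 0.0 for s in runs),
        "avg_clicks": mean(s.clicks for s in runs),
        "avg_seconds": mean(s.seconds for s in runs),
    }

sessions = [
    Session("subject-1", "create-integration", True, 14, 95.0),
    Session("subject-2", "create-integration", False, 31, 210.0),
    Session("subject-3", "create-integration", True, 12, 80.0),
]
print(summarize(sessions, "create-integration"))
# Tasks with low completion rates or unusually high click counts are
# the areas most in need of UX attention.
```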
While designing this program, we had to account for our limited set of users. Since our external user base was small compared to large, consumer-facing products such as iTunes or Netflix, we had to get creative to build a roster of usability testing subjects. To build momentum, we kicked off the program using internal stakeholders as our test subjects. We sent out a survey to Tasktop employees to gather data on areas of expertise, experience levels with various components of our product, and interest in participating in usability testing. Each time we conduct a session, we refer back to those survey results to find candidates who represent a range of user types and experience levels.
Using Tasktop employees as usability testing subjects allowed us to gain meaningful insights into product pain points and the ease of use of our designs. We were able to combine quantitative and qualitative research with anecdotal evidence and gut feelings to motivate product design decisions. Usability testing helped remove assumptions and gave us greater confidence in our design choices. Working with internal users also allowed us to experiment with and formalize our user research processes – now a sophisticated program that makes use of testing scripts, team retrospectives, result templates, and iterative testing.
Keep your eyes peeled for next week's blog posts, in which my colleagues Zhen Wang and Larry Burks add further color to our UX process, explaining how you can reduce risk and waste to improve the user experience of your software products.
Are you a tool admin looking for a simple way to integrate the tools in your software delivery value stream? Or maybe one of the specialists involved in planning, building and delivering enterprise software who's fed up with pesky manual overhead? A highly personalized demo will show you how our sophisticated toolchain integration automates the flow of product-critical information between teams and tools to make your job dramatically easier and help you accelerate the value delivery of your products.