Partner-enabled data configuration for enterprise

PROBLEM: Field configuration was custom and clunky

Captricity was originally designed to digitize data from simple signup forms and surveys. As our machine learning capabilities grew and our customer use cases became more complex, it became clear that the implementation team’s process hacks were not enough and that the core product needed to be overhauled to reach the level of maturity required for serving enterprise customers.

Along with complex enterprise use cases, the company was also newly embarking on implementation partnerships. However, the lack of in-product process and functionality prevented the level of involvement both Captricity and partners desired.

I had spent nine months studying how the entire product was actually used by our implementation team, along with the processes and hacks they had patched together to pull off each new, more complex use case. Three of those months were spent documenting their process for the explicit purpose of prioritizing the changes needed for partner enablement.

The foundational issue I uncovered was the lack of a customer-centric setup approach. The legacy of the company’s start was wreaking havoc on the present day. Instead of starting projects with a complete understanding of the customer’s goal (digitized data), the implementation team was distracted by the complexity of conforming to the non-human side of setup: templatizing forms so they could be successfully processed within an ML black box for sorting and digitization.

The requirements gathering process was disjointed and inefficient, causing multiple rounds of rework for each field. All configuration was managed in a collection of spreadsheets outside of, and disconnected from, the product. All data enrichments were built in a custom, unrepeatable way for each use case, and the lack of standard enrichments limited the ML contribution during the digitization process. And because the focus and framing weren’t customer-centric, the value of complex setup tasks was never made apparent to customers. When rework of setup tasks made setup take longer, customers were confused and frustrated. This caused them to become disenchanted with Captricity before we could prove value in production.

HYPOTHESIS: Upfront requirements gathering and out of the box enrichments

It became clear to me that if setup was going to work for Captricity’s internal users (the implementation team), partners, and customers, the focus needed to shift. The data needed to become the primary, guiding focus.

An artifact I had created during my research was an end-to-end map of the implementation team’s setup process flow. This was the first time their process had been accurately recorded, and the resulting artifact yielded many insights.

The biggest takeaway was that tasks related to field configuration were done by different roles at different times within setup. This caused a lot of unnecessary rework because, although disjointed, these tasks were inextricably interdependent. I believed that if all requirements for a given data set were defined together up front, a unified strategy for field configuration could be defined and efficiently executed, avoiding the need for major course corrections mid-setup.

An architectural shift in field management was something I had been advocating for since my first week at the company. I knew other parts of the product vision depended on it, but it wasn’t flashy enough to get prioritized on its own. It was gratifying to finally be able to prioritize my plans now that they were couched in business value: partner enablement and easier adoption of the new CaseCorrect tool.

The goals for this project shaped up as follows:

FIELD CONFIGURATION: Finding a way to make field management finally happen

The first new tool I needed to design had to be a central place to easily manage hundreds of fields. For the first phase of this functionality, this became the Field Configuration Spreadsheet. The decision to use Google Sheets instead of building this inside the product was hard for engineering to buy into. Google Sheets had a lot of limitations, and we discovered even more as the project progressed.

However, my confidence that this strategic decision was worth it won out. The product, in its state of immaturity, could not house this tool yet. Reworking account and user permissions at the architectural level was a large project that the PM could not justify at the time. I advocated for the Google Sheet, and even though it wasn’t ideal long term, it brought partners into this part of the process many months sooner than if we had pushed the entire project back to build a more conventional, in-product solution. The business value made this compromise worth it.

TEMPLATE DEFINITION: A modular, intuitive, and fast UI

Although we already had a version of this tool, my new vision for it focused on a single task: assigning fields to the template. All configuration of the field itself happened elsewhere, so this task was modular enough for any internal or external user to complete in isolation, without needing to understand the larger, more complex context of the other pieces of setup.

In talking through my design with engineers, we came to the decision that the best plan of action would be to rebuild the tool from scratch using the most up-to-date framework. I conducted a handful of iterative co-design sessions with the implementation team so that they could start to adhere to my new vision for their process as early as possible. This proved to be helpful by the time the tool launched.

I worked with the PM to pare down the initial MVP to a size that fit within her scope. The final result allowed the implementation team to be faster than ever and created an intuitive experience for partners and customers to autonomously take on.

USER FEEDBACK: Eliminating ambiguity around a new process

For the Field Configuration Spreadsheet, I knew adoptability would be an issue for the initial rollout. I had suspected this early on and was able to demonstrate it to the PM and engineers through user testing. With resources being pulled to other parts of the project, a prescribed organizational feature would not make the cut. However, when related pains were inevitably revealed post-release, it was not a shock to the PM, and she was able to quickly provide short- and mid-term guidance.

Even though we had deliberately decided not to address that issue within the design, it was important to make the cause and effect of configuration within the spreadsheet as obvious to users as possible. Through testing, I was able to update microcopy and column ordering so that users’ actions would align better with actual outcomes.

In most of these sessions, I invited one or more of the engineers on the team to participate and observe. I find this a helpful way to ensure a user-led agile development process and to give engineers a clear sense of the value their work will deliver once completed.

For the Template Definition tool, existing users were familiar with the old tool, which added a unique layer of considerations for successful adoption of the update. The biggest takeaway for me was how the tool’s usage differed between setup and production. Although it is most heavily used during setup, ongoing tinkering is common in production. I had initially optimized the design for setup, but it became clear that this flow would require annoying extra clicks every time the tool was referenced in production. I reworked the flow so that setup functionality remained easily accessible, but secondary. This made the Template Definition tool, for the first time, a true complement to the end-to-end flow.

REFLECTION: Formalizing beta testing to expand culture of usability testing

With an engineering team unaccustomed to much follow-through after a release, and a PM who had never carried a project past initial launch into adoption, there were a few rough edges after launch that I would like to prevent next time.

Engineering leadership embedded a formal beta testing program into the agile process. I saw this as an opportunity to advance the culture of usability testing throughout development, and I worked with the principal engineer leading the effort to optimize it for all stakeholders. Through this process, engineers came to understand the value of user feedback at different stages of development (local testing versus staging, etc.), and PMs and internal users gained a safer space for critique, so fewer issues slipped into production unnoticed.

The bumpiness of this launch made it clear that I need to formalize a migration process alongside PMs to support users through a swifter, more successful adoption. Product maturity is happening alongside scrum-team maturity, and it’s gratifying to see how insights in one area improve the other.

Other Work

Human feedback loop for ML-based digitization

Mobile functionality for an end to end experience