Captricity’s core product automated insurance workflows by using machine learning to digitize handwritten data from paper forms. However, it was not yet an end-to-end solution: a major workflow gap caused duplicate effort and a large portion of digitized data to be thrown out. We needed a way to preserve partial data for repair so that more cases could be accepted downstream.
The result is REVIEW, an isolated, UX-friendly interface in which data entry clerks log in and review digitization results. They can edit values by applying the human judgment that cannot be hardcoded into Captricity’s models. With this tool, more cases are accepted with a fraction of the manual effort.
As the company's sole designer, I vetted and championed the problem with a junior PM and owned the end-to-end UX of this project. I worked side by side with a scrum team of six application engineers and consulted with two machine learning engineers throughout the project.
Inefficient workflow creates a bad customer experience:
- If a data set contained even one invalid value in a single field, the entire case had to be kicked out. Captricity’s throughput was low, and there was no solution in place to repair the partial data.
- These rejected cases were sent back to the customer for their staff to key in manually, creating duplicate effort. Workflows were so inefficient that customers were often sent new blank forms to restart the process from scratch. This poor customer experience prolonged the application process and caused potential customers to drop off.
Testing and User-Centered Development
Consistent feedback for iterative design:
- At the project launch, a small team flew to the customer site for a kickoff. There, I shadowed the current workflow, validated a list of assumptions, and introduced future users to early mockups in person, familiarizing them with the project before our remote sessions. This trip let me start the project with a comprehensive understanding of the problem in its greater context.
- For the rest of the project I conducted regular remote feedback sessions with end users. Insights from these sessions let me adapt the product to the most efficient user flow, shaving minutes off processing time.
User journey mapping:
One of the biggest concerns for this project was whether our small, young team could launch a complete V1 product within a three-month deadline. The user journey map I created provided the following benefits, visible throughout the project:
- Context. By understanding the users and their challenges and goals, engineers were able to pursue solutions that were more closely relevant to the user.
- Focus. I was able to prevent scope creep within a sprint because engineers had visibility into every step of the project. Expectations around iterative design were set upfront, so engineers bought into the agile development process, trusting that every task had been allocated a strategic time to be worked on.
- Intentional milestones for user testing. By focusing on each consecutive step of the user flow, we enabled incremental user testing throughout development. For example, the simple dashboard screen was built first, while the tool’s complex interactions were slated for the final sprints. This maximized iterations without causing rework, and I was able to refine the final designs to reflect multiple rounds of user testing.
Final Design and Resolution
Captricity’s machine learning technologies are advanced because of our foundational understanding that human input is required to achieve the best results. REVIEW gives users a place to review and repair invalid forms, providing the following value:
- Enable ML contribution: REVIEW is the missing piece of an ML-only digitization path. Accuracy could match the level we were currently offering, but with a much faster turnaround time at a more competitive price point. Human feedback captured in REVIEW can also train our OCR engines to increase accuracy.
- Reduce cycle time: forms marked invalid by Captricity had to be reviewed a second time. By preserving partially digitized data, a REVIEW interface would eliminate the duplicate effort of manually re-keying entire forms, reducing the cycle time from application submission to policy issuance.
- Increase straight-through processing: the REVIEW interface would deliver more forms downstream in a single cycle, kicking fewer forms out and preventing duplicate work by associates or customers.
- Decrease final error percentage: by creating a more supportive union between automated digitization and human review, REVIEW users would be able to catch errors more strategically before they created risk later in the pipeline.
Launching an alpha product was a critical strategic initiative for Captricity, and the REVIEW launch at MetLife would not have been possible without the user-centered design focus I employed. My contributions went beyond delivering a functional and intuitive UI: I created a transparent, open culture within the scrum team to drive everyone toward efficiency, iterated on the design to fit the available scope, and engaged users diligently for feedback while filtering it enough to create a great UX without pushing back the timeline.
From a marketing and sales perspective, the alpha went well. Within the first few months, MetLife saw 35% higher throughput volume - a 50% increase over the initial success metric we had agreed upon. They renewed and agreed to a press release.
Since the release I have improved the design for higher-complexity use cases. I’ve been heavily involved in the transition process as customers move away from their monolithic workflows and adopt REVIEW’s smart, lean UX. Standing by the current state of the product and shaping its future directly with customer feedback has been a delightful challenge.