DS cover.png

DocuSign 

Foundational research on process reporting, a critical new feature on DocuSign's contract lifecycle management platform

UX Research

Duration

12 weeks

June - Sept 2022

Role

UX Researcher

Tools & Methods

Figma

Miro

Zoom (Remote moderated testing)

User interviews

Inferential statistics

Affinity mapping

Jobs-to-be-Done (JTBD)

This project was completed during my UX research internship at DocuSign in summer 2022.

I led the research project from start to finish, including developing and running the research study, and analyzing and synthesizing study results, while collaborating with a cross-functional team of designers, project managers, and researchers.

Overview:

Prior to the November 2022 Release 3, DocuSign's basic contract lifecycle management platform (CLM Essentials) lacked contract process reporting, a key feature that many competitors offer in their base-level products and that users expect as standard. This gap was resulting in lost sales and increased churn.
Goals: 
  • Determine how to most effectively help CLM customers understand contract process efficiency
  • Successfully implement process reporting into CLM Essentials in the upcoming release
 
What I did: Semi-structured interviews, co-design, usability testing

Results:
  • Data and visual restructuring and optimization 
  • Successful feature launch
  • Greater understanding of customer needs and expectations

Defining Contract Process Reporting

Contract process reporting (or process reporting) encompasses the data and reports that help users analyze the efficiency of the processes a contract goes through. This includes data and metrics on different stages and steps, such as negotiation, review, approval, editing, and sending for signature.

Timeline

Timeline.png

Project Background:

By speaking with stakeholders and working together with the cross-functional team, I gathered and evaluated the current understanding of process reporting from customer, business, and technical perspectives. Using this information, I was able to set realistic objectives and goals for the research project and its outcomes.

Why Process Reporting?

Expectation vs Reality: 

  • There is currently no reporting in CLM Essentials

  • CLM Essentials customers expect reporting as a basic CLM functionality

    •  Currently, CLM customers have no visibility into the efficiency of their workflow processes and have no way to demonstrate the ROI of CLM.

 

Losing to the Competition:

  • Competitor systems do offer reporting of this nature 

    • Leading to lost sales and increased churn.

 

Road to 2022 Release 3 and beyond:

  • Initial addition of basic reporting for Release 3:

    • Breakdown of average duration to complete all processes/document types 

    • Breakdown of average duration spent in each stage of a process

  • While basic reporting was being built out for the initial release, this research project ran simultaneously to determine a longer-term approach for reporting in CLM Essentials.

Previous Research

By studying previous research, I learned that customers:

  1. Want process reporting

  2. Want to be able to identify bottlenecks in the contract process

  3. Want evidence to show where contracts are and are not getting held up

  4. Want to optimize productivity

background.png

Understanding Customers

JTBD Hierarchy 

I used the jobs-to-be-done (JTBD) framework being developed at DocuSign at the time to better understand these customers and their needs related to process reporting, in the context of both the larger DocuSign and customer ecosystems.

 

JTBD Diagram.png

Job breakdown of target customers:

Target customer demographic: Admin roles at small & mid-size companies

JTBD Core Job: Optimize agreement process

  • Micro Job 1: Identify bottlenecks & Optimize agreement workflow/process

  • Micro Job 2: Assess performance & Identify individual bottlenecks

  • Micro Job 3: Evaluate ROI of tool

General Goals

I determined two main goals for the study related to the general understanding of process reporting within CLM Essentials:

  1. Evaluate proposed design concepts for process reporting 

  2. Understand if the designs are sufficient in supporting the known jobs of these users: Identifying bottlenecks; Optimizing the workflow process; Assessing individual and team performance; Evaluating the ROI of the product

Business Goal

Through the greater understanding gained by this research, DocuSign hoped to implement a much-needed critical feature. Thus, the overall business goal of this research was that:

 

100% of CLM Essentials accounts will utilize the process reporting feature.

Study Design:

The design of this study relied on a careful balance between rich, generative research and addressing particularly vital points and questions of interest. Each part of the study served a specific purpose and ensured appropriate methodology for addressing key aspects of the experience and gathering reliable data.

Research Questions

To address our main objectives, I developed several more specific research questions. They fall into three main categories: 

  1. Assessing if users can perform the key jobs related to process reporting: Identify bottlenecks; Assess performance; Optimize process efficiency; Evaluate ROI.

    • Confirm that the data types we are presenting are the most critical for users; Identify any missing critical data 

    • Assess whether the proposed locations for the information are acceptable for users 

    • Assess whether the presentation of information is the best format for displaying process data

    • Assess how consumable the data is in the way it’s displayed and visualized

    • Assess how users need and expect to be able to interact with the data visualizations

  2. Learning about the context around process reporting data

    • What triggers the need to see it? How frequent are those triggers?

    • What actions will users take based on the information?

  3. Determining what other data, reporting, and visualization needs our users have beyond the concepts tested

    • In what contexts and scenarios are these needed?

    • What do competitor products currently offer?

 

Detailed sub-questions in each category served as a checklist to reference as I developed and ran the study.

Methodology

I formulated the study methodology with the following criteria:

  1. Address all 3 research question categories

  2. Generate both qualitative and quantitative data

  3. Remain open-ended enough for organic discovery

  4. Fall within a 60-minute study timeframe to avoid participant fatigue

  5. Be appropriate for a remote setting

Participants

Participants were recruited through partnerships with CLM Essentials account managers and by contacting participants from previous studies who fit our target demographic.

I was able to recruit 8 participants who were:

  • Mostly from SMB and mid-market companies, with a couple of representatives from larger companies

  • 4 Current CLM Essentials customers in admin roles 

    • 3 Live

    • 1 Onboarding

  • 4 Non-customers in admin roles (previous participants in a separate DocuSign admin study)

Structure

60-minute, three-part, 1:1 remote moderated sessions via Zoom

 

study flow.png

I administered and ran all study sessions, while another researcher on the call observed and took notes during most sessions.

  • We paid attention to both explicit feedback and implicit signals, such as tone of voice, body language, and actions performed even when not acknowledged verbally

Since not all participants were of the exact same demographic, the study plan left room for flexibility, tailoring details of the structure, tasks, and questions asked on a case-by-case basis.

 

Introduction

  • Briefing about the study overview and how to use the prototypes

  • Reminders that we would not be offended by any feedback and that honest thoughts and opinions are most valuable

  • Short discussion asking about how they currently use DocuSign at their company and their responsibilities related to process reporting.

    • Important for putting their feedback and decisions throughout the study in context

    • Often referenced these points later during probing questions

 

Part 1: Co-design

Main focus: Learning about the context around process reporting

This brief co-design activity utilized a static Figma mockup of the current CLM Essentials interface. This allowed us to see where participants might intuitively and naturally go to find information on process reporting, and why, without any additional intervention or bias. Since this required familiarity with the platform, this part was only done with participants who were current CLM Essentials customers.

 

"Imagine you’re looking for this feature in CLM Essentials, where would you go on this screen to do it? Try to describe where on the page you would go and what you would expect to see and do when navigating to this feature. What path and actions would you take from here to find data about your processes?"

 

Follow-up questions focused on why they would take certain paths and actions:

  • Why in this particular location?

  • Why a certain interaction?

  • Does this choice make another interaction or task easier?

  • How does this choice fit in with the rest of the workflow?

method.png
Screen Shot 2022-08-29 at 11.49.11 AM (1).png

Current CLM Essentials prototype

Part 2: Scenarios & tasks (Usability testing)

Main focus: Assessing if users can perform the key jobs, learning about context around process reporting

 

  • Participants were given a series of hypothetical scenarios related to process reporting

  • Participants navigated a semi-functional prototype incorporating initial design ideas for the implementation of process reporting

  • Scenarios simulated a general end-to-end experience of identifying a bottleneck in a contract process: finding and confirming a contract, addressing it, and monitoring it

 

Participants were asked to think aloud while they walked me through this process so that I knew why they were taking certain actions. 

 

"I want you to navigate to where you might go to see the data that you need in order to learn about approval time for Standard NDAs. Remember to please think out loud as you look around the pages and navigate."

prototype progression copy.png

Process reporting prototype

Part 3: Data type scoring

Main focus: Determining what other data needs users have, learning about context around process reporting

  • Participants were presented with the eight main data types that we were planning to include in process reporting

  • Participants were asked to score each of these eight data types on a scale of 1 to 5 on a Miro board

    • 1 = not useful at all and 5 = extremely useful for their core responsibilities

    • There could be as many or as little placed in each score

  • Discussion of each placement and reasoning

  • Participants had opportunity to add and discuss any missing data types

This helped us understand whether we were including the right kinds of data and whether any additional data was missing, in addition to gathering quantitative measurements of data type needs and wants.

 

"Take a look at all of these 8 options and rate the usefulness of these data types for your core responsibilities and tasks at your job."

Screen Shot 2023-01-02 at 5.37.34 PM.png

Data scoring Miro board

Post-task questions

Each of the 3 main parts of the study was followed by a set of post-task questions to gather metrics on, and dig deeper into, how participants felt about elements of the task.

 

Example of post-task questions for part 2:

On a scale of 1 to 5, with 1 being strongly disagree and 5 being strongly agree, how would you respond to the following statements:

  1. It was easy to find the information I was looking for using the DocuSign process reporting feature

    • Why?

    • Is there anywhere else you would have expected to or would like to find this information?

  2. It was easy to interpret the data that was presented to me in the DocuSign process reporting feature

    • Why?

    • Are there any other ways the data should be presented? Do the visualizations and text content both make sense?

 

On a scale of 1-5, with 1 being low and 5 being high, how well did this experience help you analyze contract processes and identify bottlenecks?

  • Why?

  • What specific things in the experience or product make you feel that way?

  • Are there any other products, experiences, or methods that help you optimize your contract processes? OR: you mentioned that you are/were doing XYZ to monitor and/or optimize your contract processes…

    • If yes: How does this help you with that? How does this experience compare to what you are currently doing?

 

Conclusion & wrap up

Along with being thanked and debriefed, participants were given the opportunity to share anything we did not get to cover earlier, comment on any notable aspects of the study (positive or negative), and ask any questions.

See full study plan here

Analysis & Synthesis:

Because of the large amount of rich qualitative data generated through this research, careful and thorough analysis and synthesis were vital to interpreting and making use of it. During this process, I included and often touched base with my cross-functional partners to ensure that the conclusions I was drawing were sensible and relevant, and to get their unique perspectives on emerging insights.

Coding & Analyzing Data

Analysis process:

  1. Carefully rewatch study session recordings

  2. Fill in any missing information in notetaker files while watching recordings and referencing transcript

  3. Extract all possibly interesting, useful, and notable quotes, notes, and observations from study sessions

  4. Organize extracted notes on Miro board; sort by participant and study part

  5. Group extracted notes and insights by research question

  6. Synthesize and group insights by themes within each research question

While extracting and synthesizing insights, I took into account each participant's company size, structure, and industry; their specific responsibilities; whether or not they were a current DocuSign customer; and any additional context they provided during the study session.

Insights1.png

Notes organized by participant and study part

Insights2.png

Notes and insights organized by research question (mid-way through analysis)

Data scoring results

No significant difference among the average scores for all 8 data types (a sketch of how such comparisons might be run follows this list):

  • All scored between 4 and 4.5 on average (1 = not useful at all, 5 = extremely useful)

  • Confirms current inclusion of critical data types

  • No significant difference between the most useful data types for customers vs non-customers
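Since inferential statistics is listed among the methods, here is a minimal, illustrative Python sketch of how ratings like these might be compared: a Friedman test across the 8 data types and a Mann-Whitney U test between customers and non-customers. The scores below are random placeholders, not study data, the grouping of participants is assumed, and the use of scipy/numpy is an assumption; the study's actual analysis is not shown in this case study.

```python
# Illustrative sketch only: hypothetical placeholder scores, not study data.
import numpy as np
from scipy.stats import friedmanchisquare, mannwhitneyu

# Rows = 8 participants, columns = 8 data types; values are 1-5 usefulness
# ratings like those collected on the Miro board.
rng = np.random.default_rng(0)
ratings = rng.integers(3, 6, size=(8, 8))  # random stand-in, values 3-5

# Friedman test: non-parametric repeated-measures comparison across the
# 8 data types (every participant rated every type).
stat, p = friedmanchisquare(*ratings.T)
print(f"Friedman chi-square = {stat:.2f}, p = {p:.3f}")

# Customers vs non-customers (assumed here to be the first and last 4
# participants): compare each participant's mean usefulness rating.
per_participant = ratings.mean(axis=1)
u, p_group = mannwhitneyu(per_participant[:4], per_participant[4:],
                          alternative="two-sided")
print(f"Mann-Whitney U = {u:.1f}, p = {p_group:.3f}")
```

With only 8 participants, tests like these have little power, so a non-significant result is best read as weak, directional support for keeping all 8 data types rather than as proof that they are equally useful.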

 

Additional data types to consider adding:

  • Generally applicable:

    • Incomplete envelopes

    • Renewal and expiration dates

    • Average time to sign by deal size (overall value)

    • Tracking by clauses and terms​

  • More company/industry specific:

    • # of contracts that needed redlines

    • Bid vs actual award amount

High-Level Findings

Because of the sheer volume of insights and notes extracted from the data, it was a considerable effort to distill them down to consumable findings. Naturally, not all insights were relevant and/or important enough to consider in the final report. By consulting with the cross-functional team and using my knowledge of DocuSign's priorities, I extracted and categorized the insights with appropriate and useful levels of detail, variety, and relevance.

Overall, all participants saw great value in DocuSign's current vision for process reporting in CLM Essentials. People want and need this type of reporting.

“This would be a vast improvement because I am not doing anything of this nature at this level of detail right now.”  -P2, current customer

“I really like this tool, though! I am really glad you are adding it! I need to talk to my boss!” -P4, non-customer

Positive aspects of the current experience mostly centered around:

  1. Vast improvement over the manual, homegrown, informal process reporting methods often used by small companies

  2. Clean UI and visual language​

  3. Confirmation of the need for regular process reporting insights

    • Most participants would access reporting with some regularity, depending on company needs and size

    • Ad hoc investigations and checks​

Critical insights and findings from the study mostly fell into 4 main categories:

  1. Difficulty finding location of process reporting information

    • Proposed locations for process reporting were not intuitive or consistently discoverable​

  2. Suboptimal presentation of data

    • Current presentation of data was confusing, misleading, and too specific

    • Participants mentioned needing more ways to see changes and trends at a glance

    • Interactions and clickability of visualizations did not consistently match existing mental models

 

  3. Data types lacking

    • Trend data

      • Participants were looking for more trend and comparison data rather than just the raw numbers currently presented

      • More basic customization needed (e.g., time period)

      • Used for benchmarking and for comparing metrics before and after changes are made

      • Appropriate data visualizations needed

    • Individual data

      • All participants wanted more individual-focused data than provided

      • The individual is especially relevant for smaller companies, where a team or department may be one or two people

  4. Confusing definition of "reporting"

    • Mismatch between company and customer definitions

Screen Shot 2022-08-29 at 11.49.23 AM.png
Screen Shot 2022-08-29 at 11.51.14 AM (2).png
Screen Shot 2022-08-29 at 11.50.27 AM.png

It was unclear what these cards and graphs represented and which elements were interactive

Raw numbers and bar graphs presented are less useful than seeing trends and change over time

"Completed by" column was the extent for individual-based information provided

High-Level Recommendations

While the full report expanded on more detailed findings and recommendations, these high-level recommendations speak to the findings outlined above. When making them, I kept in mind current design and technological requirements and restrictions, feasibility for the upcoming release versus long-term considerations, and relevance to stakeholders.

Weigh learnability and natural instinct & intuition when considering location of process reporting

  • The location that participants thought was the most intuitive was already being used for another function. Perhaps find a location that is learnable with proper training. 

Match clickability throughout process reporting

  • Match existing models and expectations to avoid confusion and roadblocks for users

Leverage "At a Glance" insights (summarized data that is featured on the main process reporting page)

  • Be thoughtful in the data presented in this section

  • Consider presenting insights that would be most immediately useful and consumable by the user (trend data)

  • Consider light visuals & symbols/iconography to convey information quickly

 

 

Screen Shot 2022-08-29 at 11.51.26 AM (1).png

Prototyped "at a glance" section -- More trend metrics can be featured here

Screen Shot 2023-05-17 at 2.41.16 PM.png

Feature trend data

  • Custom date ranges

  • Appropriate visualizations - line graphs

Feature individual data

  • Chart visualization options in addition to current table data view

  • Additional individual data: # and type of tasks/processes assigned to user

    • Helps gauge workloads and staffing needs 

All-encompassing reporting

  • Blur distinction between “process” reporting and “agreement/document” reporting

  • Default to showing all data, with ability to toggle

  • Options to look at both alongside each other

 

Separated "reporting" types should be consolidated

Final Deliverables

I presented my research in two separate settings:

  1. Formal readout (60-minute presentation)

    • Complete and detailed report of all aspects and elements of the project, including background information, methodology, findings, and recommendations​

    • Attendees included my managers, members of the cross-functional team, directors of product management for CLM platforms, and account managers for CLM Essentials

    • The 91-slide presentation and accompanying documentation serve as the complete research report and are included in DocuSign's research archives

  2. Internship capstone (15-minute presentation)

    • Summary of the research project and overall internship experience

    • Presented at the weekly product experience team meeting (~95 attendees)

Outcomes:

In addition to directly improving the process reporting feature, this research project built on and influenced internal research and cross-functional processes.

Product/External

Main outcomes:

  • Data and visual restructuring and optimization 

    • Immediately implemented in the first release
      • Increased clarity of data presentation

      • Greater customization (e.g., time periods)

      • Expanded individual insights

      • Consistency of interaction patterns

    • For future implementation

      • Trend data and visualizations​

  • Successful feature launch​

Process/Internal

  • Pioneering open collaboration in research and product experience

    • Positive feedback: The cross-functional team and peripheral stakeholders appreciated being in the loop, learning about research, and contributing to it

    • Proven benefits: 

      • Increased alignment between associated teams

        • Increased product alignment & understanding

        • Resolved internal conflicts

  • Greater understanding of customer needs and expectations

    • Updates to JTBD and persona cards​

    • Matching internal and external language and expectations

Final notes & reflection

These 12 weeks were invaluable for my growth as a researcher, UX practitioner, and industry professional. The experience further confirmed that I made the right decision to transition from academic to industry research; I felt proud of, rewarded by, and energized by the work that I did, and immensely enjoyed learning from, contributing to, and working with the communities that I became a part of.

Creating and executing a full research project within a limited timeframe, and seeing the feature informed by my research successfully ship, was a milestone for my UX career. Seeing tangible results and outcomes from research efforts, and experiencing this at a large scale for the first time, was rewarding and empowering.

Successes and growth:

  • Became comfortable participating in and facilitating productive cross-functional conversations, critiques, and feedback sessions

  • Successfully built on skills in qualitative data analysis, synthesis, and communication, including drawing actionable insights, and creating effective presentations

  • Gained confidence with moderating study sessions and maintaining natural flow throughout a session while effectively addressing all research questions

 

Lessons for the future:

  • Allow more time for refining storytelling and presentation

    • Refining content prioritization for reports and readouts

    • Dedicate more time to digesting and discussing findings with others before formally presenting

  • Plan more proactively for team meetings and discussions

 
