Process Samples

Accessing qualitative data from users

Analytics

  • Using the Adobe Analytics framework

  • Created the Business Requirements document

  • Created the Solution Design Reference documents


Focus Group Facilitation


Guide


Live


Analysis




Usability Test Moderation

Objectives

Adoption

  • Identify adoption barriers and opportunities

  • Evaluate the volume of account and position information users need

  • Evaluate users' willingness to continue using Accounts

 

Usability

  • Evaluate the processes of reviewing a specific user's current account balance, electronic delivery preference, Registered account plan details, client portfolio, and the journal trailers of a specific transaction, in comparison with the legacy systems

  • Evaluate the extent to which the processes function as expected

  • Evaluate how frequently users experience friction

  • Evaluate how intuitive the navigation is

  • Evaluate how frequently the information displayed is complete

  • Evaluate how frequently the information displayed is organized well

  • Evaluate the clarity of instructions and labels

  • Identify improvements to Accounts

  • Identify tasks related to finding client and account information

 

Satisfaction

  • Evaluate overall satisfaction with Accounts in comparison with the legacy systems

  • Evaluate preference for Accounts over the legacy systems

  • Evaluate how user processes compare with the legacy platform

  • Evaluate task efficiency when using Accounts


Participants

Usability studies typically aim to recruit between 5 and 15 participants, depending on the complexity and scope of the product or service being tested. This is often referred to as the "rule of five": recruiting at least 5 participants is generally believed to surface most usability issues while providing a sufficient sample size for analysis.
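The "rule of five" is often justified with the Nielsen-Landauer problem-discovery model, in which the share of usability problems found by n participants is 1 − (1 − p)^n, where p is the probability that a single participant encounters a given problem (roughly 0.31 in Nielsen's classic data). A minimal sketch, assuming that model:

```python
def problems_found(n: int, p: float = 0.31) -> float:
    """Expected proportion of usability problems uncovered by n participants,
    per the Nielsen-Landauer model: 1 - (1 - p)^n."""
    return 1 - (1 - p) ** n

for n in (1, 5, 15):
    print(f"{n:2d} participants -> {problems_found(n):.0%} of problems")
```

With p = 0.31, five participants uncover roughly 84% of problems, which is why small samples are considered sufficient for iterative testing; the 15-participant upper bound approaches full coverage.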


Task/Scenario Definition

Script

 

Time | Scenario | Script/Prompts

24:00

We are moving right along. Thank you for your insights.

Wonderful. We have finished the actual test and have a few questions to ask you before we finish. You've been a terrific participant, and we hope the process has been interesting for you as well.

Thank you again for your patience and your willingness to participate in this usability test. Is there anything else you'd like to share before we end this session?

I hope we speak again soon.

25:00 | Post-test questions

Before you leave, I'd like to ask you a few final questions.

First, let's evaluate your satisfaction with the Accounts capabilities in comparison to the existing platforms (NetRep, My Accounts). Again, I will read a statement to you and ask you to complete it by indicating your level of satisfaction on a scale of 1 through 5.

·       The first statement is: My satisfaction with the overall capabilities is… on a scale of 1 through 5, where 1 is very dissatisfied and 5 is very satisfied.

·       The second statement is: My satisfaction with Accounts vs. NetRep is… on the same scale.

·       The third statement is: My satisfaction with Accounts vs. My Accounts is… on the same scale.

 





Pre-test questions

1.        How frequently do you look up client or account information?

·      Daily

·      Weekly

·      Monthly

·      Rarely

·      Never

 

2.        How often do you access the following systems to retrieve information?

             | Daily | Weekly | Monthly | Rarely | Never
NetRep       |       |        |         |        |
OSS          |       |        |         |        |
My Accounts  |       |        |         |        |


3.        What are your top five tasks in finding client and account information using these systems?

1.      

2.      

3.      

4.      

5.      


In-test questions

Review the current account balance for a specific user after the latest transaction

1.        On a scale of 1 to 5, to what extent did the process function as you expected?

2.        On a scale of 1 to 5, how often did you experience friction in the process?

3.        On a scale of 1 to 5, how intuitive is the navigation?

4.        On a scale of 1 to 5, how complete was the information displayed?

5.        On a scale of 1 to 5, how well organized was the information displayed?

6.        On a scale of 1 to 5, how clear were the instructions and labels?

7.        On a scale of 1 to 5, how does this process compare with the legacy platform?


 

Post-test questions

 

1.        On a scale of 1 to 5, how would you describe your overall satisfaction with the Accounts capabilities in comparison to the legacy platforms (NetRep, OSS, My Accounts)?


What is the primary reason for your score?

 

2.        On a scale of 1 to 5, how would you rate Accounts compared with NetRep?


What is the primary reason for your score?

 

3.        On a scale of 1 to 5, how would you rate Accounts compared with OSS?


What is the primary reason for your score?

 

4.        On a scale of 1 to 5, how would you rate Accounts compared with My Accounts?


What is the primary reason for your score?

 

5.        What percentage of client and account information do you anticipate you will access with the new platform?


 

6.        On a scale of 1 to 5, how would you describe your task efficiency using Accounts in comparison to the legacy platforms?


7.        On a scale of 1 to 5, how likely are you to continue to use Accounts?


8.        Describe any barriers you anticipate in continuing to use Accounts.

Details:  ______________________________________________________________________________

9.        What are your top 3 needs that are addressed with Accounts?

1

2

3

10. Any suggestions to improve Accounts?

__________________________________________________________________________________________________
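Responses to the 1-to-5 rating questions above are typically aggregated per question before reporting. A minimal sketch of that aggregation, using hypothetical response data (the numbers below are illustrative, not real study results):

```python
from statistics import mean, median

# Hypothetical responses (1 = very dissatisfied, 5 = very satisfied)
# for two of the post-test questions.
responses = {
    "Overall satisfaction": [4, 5, 3, 4, 4],
    "Accounts vs. NetRep":  [3, 4, 2, 4, 3],
}

for question, scores in responses.items():
    top2 = sum(s >= 4 for s in scores) / len(scores)  # top-2-box share (4s and 5s)
    print(f"{question}: mean={mean(scores):.1f}, "
          f"median={median(scores)}, top-2-box={top2:.0%}")
```

Reporting a top-2-box share alongside the mean is a common way to make small-sample Likert results easier to compare across questions and platforms.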

 

Analysis (Miro)





Research Planning


Product Readiness

For the customer segments we expect to adopt at GA, we have identified and managed any detractors to that goal.

•       Adoption level: production-like adoption and functionality, with detractors identified

•       Service levels met: scalability, both technical and operational

•       Limited issues reported: reliability


Operational Readiness

Prepare the support and service teams to be operational in a production-like environment for GA.

•       Clear plans for the support and service teams, with engagement to validate we are ready for GA

•       Seamless setup that can scale for GA

•       Limited issues raised from clients or operational partners

 

Research Questions


Research Question | Performance Indicators

1. What segments of firms are ready to adopt the current version of the new Compass, and what are the critical capabilities needed to onboard the remaining firms?

•       Evaluate adoption level (usage [CX & R2], task efficiency and completion, performance, retention, satisfaction with using 2 systems and individual modules)

•       Evaluate technical service levels (feature upgrade workflow, user access request workflow)

•       Evaluate reliability (issues)

•       Evaluate efficacy of support partner teams (release notes, documentation, instruction, problem resolution)

2. How effectively are the support and service teams able to respond to requests triggered by the impact of the change management associated with the launch of the new Compass experience?

•       Evaluate operational service levels (migration and user mgmt: capacity, timing and so on…)

•       Evaluate plans for support & service (efficacy of migration, onboarding process, Product Support's ability to respond to user inquiries, issue reporting, changed operations for Trade Desk & MFDO)

•       Evaluate setup for scaling for each of the operational partners

•       Evaluate issues (clients, Trade Desk, MFDO)

Research Activities



 

 

Analytics

•       Users with daily product experience

•       Captures module access/use, page access, page selection navigations, account access, types of MF transactions, bulk upload, bulk downloads, types of service, search terms, filters and selections, types of browser use, frequency of uptime/downtime, frequency of bugs and issues

WalkMe surveys

•       Users with daily product experience

•       Questions using rating scales, Likert scales, and matrices with comment fields for additional information

Intermittent Internal Surveys

•       Internal participants

•       Questions using rating scales, Likert scales, matrices, and comment fields for additional information

Problem resolution satisfaction

•       External participants who raise issues

•       Questions using rating scales, Likert scales, matrices, and comment fields for additional information

Final survey

•       External participants/users

•       Questions using rating scales, Likert scales, matrices, and comment fields for additional information

Focus groups

•       SPOCs (single points of contact)

•       Facilitated groups focusing on critical issues, transition issues, adoption barriers, detractors, inter-team coordination, and issue resolution

 

Reporting


Cadence | Content

Throughout Pilot — Weekly

•       Analytic-driven reporting on usage by firm

•       WalkMe reports at the end of weeks 1, 2, and 3

•       Reports available the week following execution

•       Template-based reporting including summaries of survey results: Product Readiness; evaluate task completion

Intermittent Internal Surveys — Monthly

•       Reports available the week following execution

End of Pilot — Report available two weeks following the end of the Pilot

•       Final report including trends and a summary of survey questions, focus group data, and testimonials:

•       Evaluate Product Readiness

•       Evaluate Operational Readiness

