Package onboarding UX
PROJECT UNDER
NocNoc.com
DURATION
3 weeks
BACKGROUND
Based on service buyer requirements, a convenient service package typically includes a starting price, minimum area, initial scope of work, and examples of the installer's work.
These details are essential for easy comparison between the NocNoc platform and external installers. This led our service installer team to validate the following assumptions:
- Active installers unfamiliar with upfront pricing can set their service package on the system.
- Active installers in every category can provide necessary information for buyer comparison.
THE IMPACT OF PROBLEM
External platform installers typically offer pricing on a case-by-case basis after conducting site surveys or receiving customer-provided site information. Our challenge lies in encouraging installers to list all their services with the necessary details for buyers.
From a business perspective, this is uncommon in other home service marketplaces or retail platforms in Thailand. However, if we find a suitable solution and implement it, we could be the first innovator and gain a competitive edge in the market. Here are the initial buyer insights we gathered.
Initial buyer’s insight:
WHAT DO WE NEED TO KNOW
- Can we adopt a universal solution for all categories on the NocNoc platform?
- What is the installers' pricing mindset?
- Are there any obstacles to pre-setting service prices under NocNoc's current categorization?
- How should we structure package onboarding to align with customer needs?
- What essential information do installers need to specify in advance?
DISCOVERY PROCESS BASED ON MVP1
Desktop research:
We initiated the process by conducting desktop research on competitors and posting job listings on installer Facebook pages across five categories: wood flooring, painting, tile, curtain/wallpaper, and ceiling/light wall.
Categorize attributes:
We organized all attributes in each category using affinity mapping and prioritized data from both customer and installer perspectives to identify must-have attributes.
Finally, to draft the service-information input fields based on the installer's mental model, we classified services into three types: A) service only, B) product with service, and C) customized service, for which it is difficult to estimate a price in advance.
UT PLANNING
- Use the assumed mental model to craft the user flow.
- Transform essential attributes into a prototype testing format.
- Opt for a standardized form solution for initial testing, maintaining consistency across all categories with the aid of tutorials, labels, and placeholders.
- Conduct pre-testing with internal team and selected installers to refine functional prototype prior to official testing.
MEASUREMENT
Since this was primarily a functionality test rather than a usability test, we used Jotform to collect essential information from installers within the NocNoc system.
After distributing the Jotform to recruited installers, we analyzed the data to measure outcomes. Subsequently, we conducted interviews to gather feedback, insights, and validate related assumptions.
RESULT
How do we measure success?
- 75% of installers set up their service prices accurately and within the expected time frame.
- 55% of installers provided service information details as expected.
In conclusion, the testing process revealed that the generic version of the form failed to prompt most installers, particularly those in group B, to input data consistently.
The causes include:
- Unclear copywriting, hindering installer comprehension and consistency.
- Varied perspectives among installers regarding job details.
- Difficulty in setting up service packages due to unfamiliarity, especially with product details.
- Installers performed only moderately, falling short of expectations.
- Lack of visualization leading to doubts about the final look, highlighting the necessity of a preview function.
- Absence of tutorials, which could help installers with low digital literacy complete the form successfully.
Moving forward, further development of the generic form requires additional tools, such as guidelines or tutorials, to ensure consistent package detail input by installers.
DESIGN PROCESS
To initiate the design process, I outlined the design concept and strategy for the entire onboarding journey based on the current steps for both installers and admins in Service 2.0.
Then, I identified visible pain points to establish product goals, design strategies, and solutions. The main objective was to create a comprehensive service package.
AFFINITY MAPPING
I analyzed data from all form versions in both the 1.0 and 2.0 systems, including buyer requirements, to determine the information needed for the upcoming version.
COMPETITIVE + UI SOLUTION RESEARCH
Next, I researched case studies on form solutions to identify their pros, cons, and suitability based on our baseline information and goals.
WIREFRAME
Here's a quick wireframe collage of the package onboarding process.
- Upon approval, installers access an onboarding checklist.
- Initiating their first package creation prompts an info page detailing package value, benefits, and structure.
- Subsequent steps lead them through a form segmented into 4 parts, with guidelines provided.
- Previewing shows the package from the buyer's perspective.
- Completing and sending yields success confirmation and next steps.
NEXT STEP
We aimed to collaborate with the operations team to establish guidelines and example versions for installers based on package value and customer requirements. Subsequently, we planned a second round of user testing to validate our assumptions and guide further improvements.