
Savings Pot

Create a savings pot section for neobanks interested in adding this feature to their retail and hybrid banking offerings.

OVERVIEW

I worked as a UX Researcher on the UX team at CREALOGIX across different projects. Together with Ignacio Puerto, I founded the company's Research Lab.


In this case study, I will share with you the solution we developed for a Savings Pot section for a digital engagement platform that enables the acceleration of digital products and services for financial institutions. As in the previous case study, I cannot share the final design version of the Savings Pot section due to signed confidentiality agreements.


RESPONSIBILITIES:

  • Conducted research to investigate whether various neobanks offer a Savings Pot area and how they deliver this service, taking into account previous research done by the company on this topic.

  • Selected the UX research tool best suited to the company's current needs.

  • Trained the UX team to work with Useberry, the chosen UX research tool.

  • Decided on the research methods needed to answer the established objectives.

  • Created the guides for conducting the usability tests and the online survey (i.e., the System Usability Scale and branding questions).

  • Recruited participants using Prolific, a flexible tool for online research.

  • Reviewed the high-fidelity prototype created by the main UX designer and adapted it for usability testing.

  • Conducted remote usability testing and online survey using Useberry.

  • Analyzed the results of the usability tests and survey questions and presented these results to the UX team and stakeholders.

  • Presented key insights & recommendations to the lead UX designer.


CLIENT

CREALOGIX: A global fintech organization focused on creating digital leaders by empowering financial institutions since 1996.

YEAR

March 2022 – April 2022

SKILLS

Desk research, Brainstorming, Online recruitment, Remote Usability Testing, Online Survey, User flows, Stakeholder presentations

TOOLS

FigJam, Figma, Useberry, Prolific, Microsoft Teams, Microsoft PowerPoint

TEAM

Design members:

·  Main UX Designer (Mauricio Alpizar)

·  UX team lead (Horaci Polanco)

Research members:

·  UX Researchers (Ignacio Puerto & Mónica Emch)

CHALLENGE

Find out how we can help users save more money through their digital banking platform.

SOLUTION

Create a high-fidelity prototype in Figma for potential retail and hybrid banking users.

END USER

Retail & hybrid banking users who want to save money through their neobank.

PHASE I: DISCOVERY

This section provides a brief description of the target audience for this particular digital platform and the research conducted to address the task.



WHO IS THE TARGET AUDIENCE?

All bank users interested in saving money through their banking app (i.e., retail and hybrid banking users).



RESEARCH (EVALUATIVE METHOD)

I conducted remote online usability tests and an online survey, and researched various neobanks and platforms that offer savings services.


  • Together with Ignacio Puerto, I organized a team meeting to define the main challenges and gather the main ideas. We discussed the appropriate research methods to conduct our investigation.


  • Researched which usability testing tool could best fit CREALOGIX's current needs (i.e., a specific price range, audio and video recording of the user, screen recording, remote testing, and availability in different European languages such as English, German and Spanish).


  • Presented the UX research tool that best matched the company's specific needs and criteria (Useberry) to the team and trained them on how to use it.


  • Created and launched remote usability tests and an online survey with 6 users recruited through Prolific, applying screening criteria I defined based on our target users. Each session took approximately 25 minutes. Before launch, we ran a pilot test to check that everything was in order.


  • The UX research team analyzed each participant's video, audio and screen recordings. We also obtained a quantitative usability value per participant and on average (i.e., the System Usability Scale) and analyzed the branding questions. We then ran an Affinity Wall to extract the key insights & design recommendations.


  • Created a Microsoft PowerPoint presentation with the insights & design recommendations and presented it to the UX team & participating stakeholders.

BENCHMARKING

I researched which UX research tools are state of the art and offer usability testing. We also had personal conversations with the developers or digital managers of some of these tools to ask various questions that arose during the research.



I looked at their services, and Useberry was the best tool for us because of its amazing cost-benefit ratio! We opted for the Pro plan ($67/month, billed annually), which allows unlimited participant responses and unlimited simultaneous projects.


Useberry supports many different UX methods, not just usability testing (e.g., tree testing, preference tests, card sorting, 5-second tests, first-click tests, and open- and closed-question surveys). It integrates with Prolific to recruit high-quality panelists, so you can launch a study and get answers within 2 hours!


PILOT TESTING

Before presenting any prototype to participants, it is important to run a short pilot test to check that everything is in order. In this picture you can see me checking that the Figma prototype is responsive and Ignacio writing down the changes that need to be made before launch. Teamwork :)

[Photos: pilot-testing session]

ANALYSIS OF USABILITY STUDY

[Screenshots from Useberry: task results, user flow, and heat map]

Each participant had to solve 6 tasks (one general and five specific). In the left figure you can see whether they were able to solve each task, how long it took them, and their misclick rate. We also asked them whether they understood the concept of a "savings pot".


We also wanted to examine the main path users choose to reach the "Savings Pot" section (i.e., the user flow). 4 out of 6 users went through the app's bottom menu, as you can see in the middle image.


Heat maps allowed us to examine the areas where users clicked to complete a particular task. In the right image you can see that the red areas were clicked most often by the participants.


Analysis of the heat maps, user flows, screen recordings, and audio and video recordings of each participant led us to findings in four areas: Features & Stories, Rules, Access Point to the Savings Pot, and General Flow.
83.3% of users understood the different rules for saving, although they did not know how the rainy-day rule works. 66.6% of users wanted to skip the onboarding stories because they did not feel they were worth their time.



83% of users said they would improve the location and relevance of the Pause button (for each memory rule), as they had difficulty finding it.



66.6% of users accessed the Savings Pot section first through the menu rather than via the Savings Pot section on the Dashboard.




83.3% of users intuitively went to edit the savings pot via the image of the pot rather than via the "Settings" section, although there is a learning curve. Overall, users understood the idea of the savings pot.

PHASE II: DESIGN CHANGES

Once the key insights were obtained, Ignacio Puerto and I proposed design changes based on them. We presented these to the lead UX designer in a 2-hour session (you can see our work in the images below). He accepted our design recommendations and incorporated them into the Savings Pot section.

[Screenshots: proposed design changes and working session]

We also included a simple survey, the System Usability Scale (SUS). It is a reliable tool for measuring the usability of a product. The SUS is calculated from responses to a ten-item questionnaire with five response options (from "strongly disagree" to "strongly agree"). The score is a value between 0 and 100, with higher values representing better usability.
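For readers who want to reproduce the scoring, here is a minimal sketch of the standard SUS scoring rule in Python (the example responses are purely illustrative, not our participants' data): odd-numbered items contribute their response minus 1, even-numbered items contribute 5 minus their response, and the 0–40 raw total is multiplied by 2.5.

```python
# Minimal sketch of standard SUS scoring; the example responses below are hypothetical.
def sus_score(responses):
    """Compute one participant's SUS score from ten 1-5 Likert responses."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS expects ten responses, each between 1 and 5")
    total = 0
    for item, r in enumerate(responses, start=1):
        # Odd-numbered items are positively worded, even-numbered items negatively worded.
        total += (r - 1) if item % 2 == 1 else (5 - r)
    return total * 2.5  # scale the 0-40 raw sum to the 0-100 SUS range

# One hypothetical participant's answers to items 1-10:
print(sus_score([5, 1, 5, 2, 4, 1, 5, 1, 5, 2]))  # 92.5
```

Averaging the per-participant scores computed this way gives the study-level SUS value reported below.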


All 6 users answered this survey after completing the tasks in the banking mobile app. The average score was 93, which means the app received a very good usability rating. SUS scores can also be compared across different stages of the app to see how usability changes over time.

PHASE III: DELIVERY & REFLECTIONS

As in the previous case study, we asked the different members of the UX team to be involved in the whole research process, which also promoted user-centered practices and teamwork. We ran 2 sessions with the team: one in which we showed the chosen UX research tool, and another in which we presented the whole research process, including:


  • Objectives, 

  • The research method chosen and why, 

  • Selected user flow and heat maps extracted from Useberry, 

  • System Usability Scale results and answers to branding questions,

  • Key findings and design recommendations,

  • Implications of the research


The lead UX designer decided to change his design for the Savings Pot section based on the research results. The team realized the power of remote usability testing tools and was ready to use them on other projects.


At the end of any investigation, it is important to talk about what went well and what needs to be improved.


What went well:

  • We successfully conducted our first remote usability testing using a new UX research tool for the team.

  • We moved closer together as a team.

  • We showed how UX Research has a real impact on the product we develop for people.


Challenges:


1.) We waited way too long to buy the Useberry license because of internal bureaucracy.

      Solution: Find the key stakeholder yourself and talk to them directly instead of waiting for others to act.


2.) We could not see one participant's screen because they did not share it.

     Solution: State clearly at the beginning of the study that participants cannot take part, or receive financial compensation, without sharing their screen.


3.) On one screen of the prototype, participants had to click on a specific area for the prototype to respond.

    Solution: We must never force a particular path on the user.
