This piece seeks to succinctly document the redesign process for the OneService case reporting application from August 2016 to October 2016.
The Ministry of National Development (MND) is a Singapore government ministry that deals with both national and town affairs. In its push for greater operational efficiency and enhanced community participation, the Municipal Services Office, a branch of the MND, developed the OneService portal: a one-stop point of access to information such as events and issues occurring in one's neighbourhood. The product exists as a mobile application (focused on case reporting) and a website (focused more on community events). The ministry then engaged National University of Singapore students in a User-Centered Design Methodologies class to redesign the product, and students could choose to work on either the application or the website.
My team and I elected to focus on the case reporting application, given the increasing trend towards mobile-first user habits, which makes this project more relevant to the real-world context. The conceptual bedrock of user-centered design underpins this project, and is best encapsulated in the diagram below:
We were informed that users of this application already exist: there are people actively reporting cases (the question, of course, is who these people are). We came to understand that they are mostly males aged above 40, much like my dad (and in fact, I can imagine him reporting an issue he has spotted in his neighbourhood!). With that, we embarked on a series of key steps that are part and parcel of the design process.
1. Qualitative Research
Semi-structured interviews were carried out with about 8 people, half of whom fell within the existing user demographic, while the other half belonged to a younger demographic (roughly 22–40 years of age). In a sense, the younger demographic served as a secondary target audience: they may be latent users, but they are key in influencing mainstream adoption of the product. A set of questions was crafted, and interviewees were also asked to engage in think-aloud protocols so that we could understand the current user experience with the existing OneService application (they were asked to quantitatively rate their experience as well). The old application looked something like this:
Several critical questions were raised that pointed to areas for improvement in the current user experience. They included (but are not limited to):
- Can I report a case on the fly — or must I create an account before being able to do so?
- What if there are overlapping categories, for example a rat in a drain (falling under both “Pests” and “Drains & Sewers”)? Which category does it fall under? I do not intuitively look to select the “Others” category though…
- What is the difference between “any additional information” and “any other details”? (refer to Case Reporting page)
- Will the relevant organisation promptly get back to me should I make a case submission? If the answer is no, what is the point of making a report in the first place?
The interviews were transcribed, and we consolidated the findings. Thereafter, the Rose, Thorn, Bud technique was employed to surface positive aspects of the user experience, salient pain points, and opportunities that could be leveraged for the product’s redesign. It was messily divergent, but of course a necessary part of the design process.
The thorns, or the pain points, thus concern issues such as:
a. First-time users experience a lack of perceived control during usage. The element of intuitiveness (i.e. design efficiency) is missing, given the relatively complex and cluttered interface presented to them. It is important to note that the more active users are the older, slightly less tech-savvy ones (and even the tech-savvy youths expressed usage difficulties due to the clutter).
b. Poor category design: there are several redundant categories, and the set is neither mutually exclusive nor exhaustive. Most users do not envision reporting abandoned trolleys (a category found in the menu), for example. And as mentioned, relying on ‘Others’ as an independent, catch-all category is ineffective because it is not the option a user will typically reach for first; instead, they will look at a specific category that has been provided (e.g. ‘Animals’), realise that it cannot accommodate their case at hand, and only then head over to ‘Others’. This entails extra steps and design inefficiency.
c. Making a report takes too much time, and it should be quick and simple, particularly as many interviewees mentioned that they tend to encounter cases while en route to some other location. The encounter is often accidental and unintentional, which calls for a design that caters to efficiency and spontaneity.
d. Interestingly, interviewees belonging to the younger demographic are less inclined to make a report. Purportedly, they are subject to a greater effect of social influence: they express an increased propensity to report a case when they see that others are doing the same. Social proofing will thus need to kick in to enhance stickiness. However, it does not currently exist, as each individual can only track his or her own case reports, without access to the reports made by others.
e. A sense of uncertainty prevails with regard to the degree of action (or inaction) that the relevant authorities might take on the case reports lodged by an individual. Interviewees feel demotivated to engage with the application at all if there is no tangible feedback from the authorities, which is perfectly understandable in this context.
We took note of these issues, and then moved on to the next step.
2. Persona-building
We then created two personas for this project: our beloved 53-year-old Jack, who cares for his community, and the cool 32-year-old Emily, who tends to possess a more aloof disposition.
These personas represent composite archetypes of our prospective users, and we also determined certain key characteristics (e.g. how tech-savvy are they? how patient are they? how much do they care about their family?). As we crafted these persona stories, we kept a keen eye on user traits, goals, and needs. We then drilled down to two primary goals that we wanted to address at the end of the day:
- To be able to quickly make a case report with ease.
- To understand what is going on in one’s neighbourhood, so as to be able to notify or warn one’s loved ones for their well-being.
The need, ultimately, is to be able to confidently ascertain that their families, the people they care for, are safe and sound. The product thus aims to fulfil this need by keeping residents updated about town happenings while encouraging a core community spirit that works towards a better home.
3. Storyboards
Storyboards were then sketched to depict different use-case scenarios for the personas we had created. Simple paper wireframes were also sketched to depict a revised app UI; these were preliminary in nature and subject to many more iterations. The storyboards and wireframes were combined to form state transition diagrams explaining the two main user flows we designed: 1) reporting/submitting a case, and 2) tracking cases reported by the community.
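To give a flavour of what such a diagram captures, here is a minimal sketch (in TypeScript, with state names that are my own shorthand rather than the exact labels we used) of the case-submission flow expressed as states and allowed transitions:

```typescript
// Minimal sketch of the case-submission flow as a state machine.
// The state names below are illustrative shorthand, not the app's actual screens.
type ReportState =
  | "Landing"
  | "SubmitNewCase"
  | "AddDetails"
  | "Confirmation";

// Allowed transitions, mirroring the edges of a state transition diagram.
const TRANSITIONS: Record<ReportState, ReportState[]> = {
  Landing: ["SubmitNewCase"],
  SubmitNewCase: ["AddDetails"],
  AddDetails: ["Confirmation"],
  Confirmation: ["Landing"],
};

// A move between screens is valid only if the diagram lists it as an edge.
function canTransition(from: ReportState, to: ReportState): boolean {
  return TRANSITIONS[from].includes(to);
}
```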
4. Design
The final design requirements stood as such:
a. An instant case reporting function that facilitates quick and easy reporting without requiring the user to create an account (for the purposes of making a report).
b. Reduction of visual clutter to ease cognitive processing and improve learnability while achieving design consistency (this entailed the removal of categories).
c. Inclusion of a mobile number as an alternative channel for the organisation to contact the individual.
d. Creation of a community case tracker.
After re-evaluating the paper prototypes and iterating on them, we then moved on to the digital, hi-fi screens. They can be found below:
As can be seen above, the instant reporting function is available for the user to choose. Upon electing to make an instant report, the user is brought to the ‘Submit New Case’ page.
Overall, the amount of clutter was drastically reduced. Users can take multiple photos of the case they encountered; input a description (the system detects keywords and semantically classifies the case so that it can be directed to the relevant authorities); and enter their mobile number. As the location is automatically detected, it takes the user just three simple steps to submit a case. Upon tapping on the submit button:
As may be noticed, we also incorporated a greater element of feedback in the form of expectation management, by explicitly showing the window of time the user can expect to wait before hearing from the organisation. Essentially, we want to sustain a communicative touchpoint with the user even after he or she has submitted a case. A user’s experience hinges critically on what happens post-submission: whether it is satisfactory largely depends on how responsive the organisation is to the feedback, and how accepting the user is of any plausible delay.
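As a brief technical aside on the keyword detection mentioned above: a minimal sketch of such keyword-based routing might look like the snippet below. The category names and keyword lists here are purely illustrative assumptions, not the actual OneService taxonomy or implementation.

```typescript
// Illustrative keyword-based case classification; categories and keywords are assumed.
type Category = "Pests" | "Drains & Sewers" | "Roads" | "Others";

const KEYWORDS: Record<Category, string[]> = {
  "Pests": ["rat", "cockroach", "mosquito"],
  "Drains & Sewers": ["drain", "sewer", "choked"],
  "Roads": ["pothole", "pavement", "street light"],
  "Others": [],
};

// Pick the category whose keywords appear most often in the description;
// fall back to "Others" when nothing matches.
function classify(description: string): Category {
  const text = description.toLowerCase();
  let best: Category = "Others";
  let bestScore = 0;
  for (const category of Object.keys(KEYWORDS) as Category[]) {
    const score = KEYWORDS[category].filter((word) => text.includes(word)).length;
    if (score > bestScore) {
      best = category;
      bestScore = score;
    }
  }
  return best;
}

// "a rat in the choked drain" matches both pest and drain keywords; the drain
// keywords win here (two matches versus one), which is exactly the overlapping-
// category problem our interviewees raised, so a real classifier would need a
// multi-label or tie-breaking policy.
console.log(classify("There is a rat in the choked drain near my block"));
```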
To explore the application further, the redesigned landing page stands as such:
We wanted to make it clean and uncluttered, and to make all of the available functions visible within the application, to the point that we could eliminate the hamburger menu.
As can be seen, a new ‘Community Tracker’ containing cases submitted by residents of a particular neighbourhood has been created. This strives to augment the original ‘Case Map’ in the old application.
A map view and a list view were thus created, as some interviewees expressed difficulty in relying solely on a map to navigate different locations and spot cases. The list view defaults to all types of cases, across all of Singapore. The user can narrow down these options via the filter icon at the top right-hand corner of the List View (above). Tapping on the icon brings the user to the screen on the left below (‘Order By’ being the default active option):
The user is then free to toggle the ordering options (by recency or popularity) however he or she deems fit, and then move on to filter cases by location. Upon doing so and tapping the tick icon, the user is brought to a filtered feed of cases within a specific location, ordered in reverse chronological fashion and spanning up to 7 days’ worth of cases:
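For what it is worth, the filter-and-sort behaviour behind this feed is conceptually simple. Here is a rough sketch, assuming a hypothetical CommunityCase shape and a client that already holds the list of cases (the field names are my own, not the actual data model):

```typescript
// Hypothetical shape of a community-tracker case; field names are assumed.
interface CommunityCase {
  id: string;
  neighbourhood: string;
  submittedAt: Date;
  sameIssueCount: number; // taps on "I'm facing the same issue"
}

const SEVEN_DAYS_MS = 7 * 24 * 60 * 60 * 1000;

// Keep only cases from the chosen neighbourhood submitted within the last 7 days,
// then order them newest-first, or by popularity when the user toggles that option.
function buildFeed(
  cases: CommunityCase[],
  neighbourhood: string,
  orderBy: "recency" | "popularity" = "recency"
): CommunityCase[] {
  const cutoff = Date.now() - SEVEN_DAYS_MS;
  return cases
    .filter((c) => c.neighbourhood === neighbourhood && c.submittedAt.getTime() >= cutoff)
    .sort((a, b) =>
      orderBy === "recency"
        ? b.submittedAt.getTime() - a.submittedAt.getTime()
        : b.sameIssueCount - a.sameIssueCount
    );
}
```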
Tapping on an individual case panel then yields the details of that case:
The page contains a bookmark option, an image, a description of the case, the location at which it occurred, the submission date and time, and an option for the user to indicate ‘I’m facing the same issue’. Tapping on it presents a thank-you message. This option serves as a crucial data point for the relevant organisation: the more people indicate that they are facing an identical or similar issue, the more severe it is, and the higher this case should sit on the priority list.
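On that last point, the severity signal translates naturally into a priority ordering on the agency side. A small sketch, reusing the hypothetical CommunityCase shape from the earlier snippet:

```typescript
// Illustrative agency-side prioritisation: cases with more "I'm facing the same
// issue" taps bubble to the top, with recency as the tie-breaker.
function prioritise(cases: CommunityCase[]): CommunityCase[] {
  return [...cases].sort(
    (a, b) =>
      b.sameIssueCount - a.sameIssueCount ||
      b.submittedAt.getTime() - a.submittedAt.getTime()
  );
}
```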
5. Evaluation
The screens above were subject to evaluation as well. We put our initial designs on Flinto, and subsequently carried out guerrilla user testing with a sample of about 10 people (who fall within the two demographic ranges of our personas). We listened, rigorously sought feedback, and took note of unfulfilled opportunities that our design could have tapped into vis-à-vis certain user goals and needs. It was then a matter of regrouping, analysing, and prioritising our evaluation findings. Who said what? What common threads ran through each conversation? Who expressed a greater propensity to use the application, and hence would be the likely end-users? Did they understand how to interact with the product? What difficulties did they face? …And so on.
Evaluation definitely proved to be useful in enabling us to further refine our design story, better validate some assumptions that we had been making, and take one step closer towards theoretical saturation to achieve an optimal level of understanding about the relationship between user and product.
6. Areas for improvement
Of course, there are certainly areas for improvement and limitations to this project:
a. UI: I believe that the product’s visual design could be improved upon. Having evolved to become a pixel perfectionist and someone who is pushing for more aesthetically pleasing and creative designs, I’d say that my perspective has changed since I first designed this. First, I think that spatial alignment between objects could be improved upon. Second, elements such as colour and size could be played with to provide a different and potentially refreshing visual (and user) experience. Third, objects could also be manipulated and positioned differently on the interface to augment the entire UX.
b. Saturation: The time constraints of this project (in the context of a busy school curriculum) made it slightly challenging to achieve actual saturation in terms of data collection and analysis — our gut told us that we could have ventured further to evaluate and iterate. Still, it was the best that we could come up with within this time frame, but we acknowledge that there is definite room for further discovery and improvement.
And…yes! That’s that. This project allowed me to see how we should embrace the strategically iterative nature of design, while maintaining a constant, obsessive focus on the user. I also realised how important it is to test your design hypotheses with no one other than the user. Getting your hands dirty out of the building (as Steve Blank put it) is a sure-fire way of getting closer and closer to verifying the problem as well as the proposed solution. It’s not rocket science, I suppose.
Well, I hope you enjoyed this rather long read once again. As usual, I’m open to constructive feedback/comments :)