PINATA (Pointing Interaction Notifications and AdapTAtions) is an Adaptive User Interface (AUI) tool, built as a Chrome extension, that helps individuals who have difficulty accurately controlling a mouse or other pointing device perform pointing tasks effectively. It enables users to discover their accessibility issues and offers feedback in the form of customizable notifications. It further allows users to adapt their interface to reduce the effect of pointing issues, helping them complete their tasks with ease.
I worked with my advisors on a semester-long research project that provided hands-on experience with the design and development of the PINATA Chrome extension, through the ideation and implementation of features that give users more flexibility and control over their assessment and notifications.
Dr. Foad Hamidi, Dr. Amy Hurst
Pen and Paper - Sketching, Balsamiq - Wireframing, Adobe XD - Mockups and Prototyping
Feature ideation, wireframing, mockup design, prototyping, Chrome extension development
Phase 1 of the project was aimed at user research, problem research, and the development of PINATA's Alpha version to address the pointing issues identified among its users. In Phase 2, I focused on analyzing the user data, collating various performance metrics, and identifying problems with the Alpha version to determine the features to be shipped in the Beta version. The analysis yielded the following results about the users, the performance metrics, and the existing product.
More than 50 performance metrics resulted from the Phase 1 research, forming a foundation for the pointing assessment of individuals. The visualizations used for assessment feedback, and the zoom adaptation offered as a remedy for pointing problems, were kept the same, as users reported no problems on that front. Shown below is a snapshot of the different types of metrics and their units. Highlighted in grey are the important ones that eventually shipped as features in the Beta version.
Shown below, through the interfaces, are the Positives (things that worked) and the Negatives (things that didn't work) in the designs, as described by users of the Alpha version.
After gaining insights into the users, their goals, their pointing problems, and the Alpha version's feedback, the Beta version aimed to: a) add more metrics (Click Timing and Pause Timing) along with the Slip/Click ratio, b) redesign the interface and architecture for modularity and scalability, and c) collect more data for research purposes.
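The exact definitions of these metrics are not spelled out here, so the following is only a hypothetical sketch of how a Slip/Click ratio and an average Click Timing could be computed from a log of paired mousedown/mouseup events. The event shape, field names, and the assumption that a "slip" is a press whose release lands outside the original target are all illustrative, not PINATA's actual implementation.

```javascript
// Assumed event record: { type: "click-pair", downTarget, upTarget, downTime, upTime }

// A "slip" here is assumed to be a mousedown whose matching mouseup
// lands on a different element than the one originally pressed.
function slipClickRatio(events) {
  let clicks = 0;
  let slips = 0;
  for (const e of events) {
    if (e.type !== "click-pair") continue; // ignore moves, scrolls, etc.
    clicks += 1;
    if (e.downTarget !== e.upTarget) slips += 1;
  }
  return clicks === 0 ? 0 : slips / clicks;
}

// Click Timing: assumed to be the mousedown-to-mouseup interval, in ms,
// averaged over all recorded clicks.
function averageClickTiming(events) {
  const durations = events
    .filter((e) => e.type === "click-pair")
    .map((e) => e.upTime - e.downTime);
  if (durations.length === 0) return 0;
  return durations.reduce((a, b) => a + b, 0) / durations.length;
}
```

Keeping each metric a pure function over the event log is one way to get the modularity goal above: new metrics can be added without touching the capture or notification code.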
The ideation phase explored various task flows and interfaces to enable users to quickly and easily select their performance metrics and the relevant notification visualizations. Initially, I started by separating the metrics, visualizations, and adaptations for modularity, but soon realized that the metrics and visualizations must go hand in hand, with a visual cue connecting them. A summary of the subscribed metrics was also thought to be helpful in letting the user easily anticipate the notifications. Further, scalability in design was needed more in the Notifications Manager than in the overall product flow, which could make adding new metrics easier for future developers.
Shown below are the three design iterations. The primary changes in the final version include upfront navigation on the Home Page, a redesigned Notifications Manager page, and a new Summary Page.
The listener script on the client side continuously captures the user's mouse movement data and stores part of it in Chrome storage, along with the notification preferences set by the user. It then assesses the actual pointing behavior and compares it against the preferences to generate notifications and adaptations. The notifications are displayed in the user's current window, while the data is transferred from the background script to the server via REST calls. The server writes the data to a CSV file for logging purposes.
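The assessment step above can be sketched as a pure function that compares measured metric values against the user's subscribed preferences and returns the notifications to raise. The preference shape (`metric`, `threshold`, `message`) is a hypothetical structure for illustration; the Chrome storage and REST plumbing is reduced to comments.

```javascript
// Sketch of the assess-and-compare step: given measured metric values
// and the user's notification preferences, decide which notifications
// to raise. Field names and thresholds are illustrative assumptions.
function assess(measured, preferences) {
  const notifications = [];
  for (const pref of preferences) {
    // Each preference subscribes to one metric, e.g.
    // { metric: "slipClickRatio", threshold: 0.3, message: "Try zooming in" }
    const value = measured[pref.metric];
    if (value !== undefined && value > pref.threshold) {
      notifications.push({ metric: pref.metric, value, message: pref.message });
    }
  }
  return notifications;
}

// In the extension, the result would be rendered in the current window,
// while the raw data is sent from the background script to the server
// (which appends it to a CSV file), e.g. roughly:
//   fetch(serverUrl + "/log", { method: "POST", body: JSON.stringify(data) });
```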
Since the project focused primarily on the design and development of PINATA, one round of unit and system testing has been done on the tool so far. Going forward, the tool will undergo rounds of user testing to measure its effectiveness and efficiency. A few evaluation metrics planned for user testing: difficulty (captured on a scale) in clicking various UI elements, task effectiveness, task time, task error rate, confusion points, and user feedback. Based on the testing results, PINATA will undergo further design and development iterations to fix problems and enhance the user experience.
● Learned designing for younger and older adults under constraints like the size of UI elements, number of interactions, colors, measures of intuitiveness, etc.
● Learned the basics of Adaptive User Interfaces (AUI) tools
● Design: Creating a scalable, intuitive mapping of metrics to visualizations within limited screen real estate
● Data storage: Storing large amounts of data without violating the Chrome extension's storage size and write constraints
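One common way to handle the storage constraint above, sketched here as an assumption rather than PINATA's actual approach, is to buffer high-frequency pointer events in memory and flush them in batches through a pluggable write function, so each mousemove does not trigger its own storage write. In the extension the write function could wrap `chrome.storage.local.set`; here it is an in-memory stub.

```javascript
// Hypothetical batching helper: buffers events and writes them in
// batches via a caller-supplied write function, reducing the number
// of storage operations for high-frequency mouse events.
class BatchedLogger {
  constructor(write, batchSize = 100) {
    this.write = write;         // (batch) => void, e.g. a storage call
    this.batchSize = batchSize; // events per write
    this.buffer = [];
  }
  log(event) {
    this.buffer.push(event);
    if (this.buffer.length >= this.batchSize) this.flush();
  }
  flush() {
    if (this.buffer.length === 0) return;
    this.write(this.buffer.splice(0)); // hand off and clear the buffer
  }
}
```

A final `flush()` on page unload (or a periodic timer) covers whatever is left in the buffer.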