Introduction
At OpenEarth, we are driven to keep learning and improving, so we recently conducted user testing of our OpenClimate platform to assess its user experience, navigability, and usability. Our main goal was to improve on existing platforms and ensure that users can easily navigate and use OpenClimate. In this blog post, we discuss our testing methodology, the main results, the challenges users face when searching for climate information, and the conclusions we drew from the process.
Testing Goals
Our primary objectives were:
1. To achieve an improved user experience compared to other similar platforms.
2. To ensure that most users (>50%) find the platform easy to navigate and use.
3. To identify areas where we can further improve the platform.
Testing Focus
We focused on two main aspects during the testing:
1. The "explore" experience: We aimed to test the navigability, usability, and complexity of the Explore flow, which begins at the landing page and ends when the user reaches a city or company.
2. Presentation of information: The core of our product is the way we present information to the user. Given the complexity of the data used, our top priorities were making it simple, readable, and useful.
Methodology
We used moderated remote testing: a short interview followed by a guided task-completion sequence. We asked users about their persona, their relationship to climate data, and their previous experience with similar platforms. Users were then asked to complete a series of tasks that exercised different aspects of the platform.
Main Results
Some key findings from the testing include:
1. Most users use a variety of sources to search for climate information, such as CDP, Climate Action Tracker, and Google Scholar.
2. Users face several issues when searching for climate information, which we will discuss in more detail below.
3. 80% of users found the platform simple and meaningful, but all users suggested adding more context or information.
4. Users encountered some issues with navigation and reading the information on the platform, primarily due to text size, phrasing, and lack of context.
User Challenges When Searching for Climate Information
Our testing revealed that users face several challenges when searching for climate information:
1. Data availability: Users often struggle to find relevant and up-to-date climate data. Many platforms lack comprehensive datasets or fail to update their information regularly.
2. Trust: Users are concerned about the reliability and accuracy of the data they find. Climate information often comes from various sources with different methodologies, leading to skepticism about data quality.
3. Methodological inconsistencies: Users encounter difficulties in comparing data across different sources, as these sources may use different methodologies or reporting standards.
4. Difficulty in comparing data: Users need to compare data from different regions, sectors, or actors, but often face challenges due to inconsistent data formats or lack of standardized reporting.
5. Accessibility: Climate data is often scattered across multiple platforms, making it difficult for users to find and access the information they need.
6. Harmonization: Users find it challenging to combine data from different sources into a coherent and meaningful analysis.
7. Accuracy: Data accuracy is a significant concern for users, as errors or inconsistencies in data can lead to incorrect conclusions or misguided actions.
Conclusions
The platform achieved an average success score of 60% for usability and navigability. While no users encountered blockers that prevented them from completing tasks, 80% ran into non-blocking issues while using the platform. Commonly requested features included downloadable data, actor comparison, and source citation.
Recommendations based on the testing results include increasing font sizes, making the purpose and value of each feature clearer to users, and facilitating data analysis.
Final Thoughts
A 60% overall score is a good result, showing that we are on the right track while also highlighting the challenges we need to overcome. It is essential to be data-driven in our design and continuously test our product with real users to improve the platform and better serve their needs.
In addressing the challenges users face when searching for climate information, the OpenClimate platform can focus on:
1. Ensuring data availability: Continuously update the platform with relevant and comprehensive datasets to provide users with the most recent and accurate climate data.
2. Building trust: Clearly cite the sources of the data used and provide explanations about the methodologies employed, to enhance transparency and user confidence in the data.
3. Standardizing methodologies: Whenever possible, harmonize data methodologies and reporting standards to facilitate comparison across different sources.
4. Enhancing data comparison capabilities: Develop features that allow users to easily compare data from different regions, sectors, or actors.
5. Improving accessibility: Create a centralized platform where users can easily find and access the climate information they need.
6. Supporting data harmonization: Develop tools that enable users to combine data from different sources into a coherent and meaningful analysis (see the sketch after this list).
7. Ensuring accuracy: Implement rigorous data quality checks and validation processes to minimize errors or inconsistencies in the data.
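To make recommendations 2, 3, and 6 a little more concrete, here is a minimal sketch of what source-aware harmonization could look like. It is not taken from the OpenClimate codebase: the column names, source labels, and unit conversions are hypothetical, and the point is simply that every harmonized record keeps its provenance attached so users can judge the data for themselves.

```python
import pandas as pd

# Hypothetical emissions records from two sources that use different
# column names and units (purely illustrative numbers).
source_a = pd.DataFrame({
    "actor": ["City A", "City B"],
    "year": [2020, 2020],
    "emissions_tCO2e": [1_200_000, 850_000],
})
source_b = pd.DataFrame({
    "city": ["City A", "City C"],
    "year": [2020, 2020],
    "emissions_ktCO2e": [1250.0, 2100.0],  # kilotonnes rather than tonnes
})

# Step 1: map each source onto a shared schema and a common unit.
harmonized_a = source_a.rename(columns={"emissions_tCO2e": "emissions_tonnes"})
harmonized_a["source"] = "Source A (hypothetical inventory)"

harmonized_b = source_b.rename(columns={"city": "actor"})
harmonized_b["emissions_tonnes"] = harmonized_b.pop("emissions_ktCO2e") * 1_000
harmonized_b["source"] = "Source B (hypothetical registry)"

# Step 2: combine the records, keeping the source attached to every row
# so provenance stays visible instead of being flattened away.
combined = pd.concat([harmonized_a, harmonized_b], ignore_index=True)

# Step 3: a simple comparison view (same actor and year, different sources)
# that surfaces discrepancies between sources rather than hiding them.
comparison = combined.pivot_table(
    index=["actor", "year"],
    columns="source",
    values="emissions_tonnes",
)
print(comparison)
```

A real implementation would involve much more than a few pandas calls, but the principle is the same: harmonize to a shared schema, keep the citation with the number, and let users compare sources side by side.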
By addressing these challenges and incorporating user feedback, the OpenClimate platform can continue to evolve and provide a valuable resource for those seeking reliable and accessible climate information.
If you want to know more or give feedback about the platform, visit www.openearth.org or send an email to ux@openearth.org.