Blackboard Data
Overview
One of our missions at Blackboard is to help users interpret, apply, and deliver insights to the right people, at the right time, to inspire and initiate action. Blackboard Data, our new data and analytics platform, supports that mission by letting users view insights from other Blackboard SaaS tools alongside their Analytics for Learn content, using the Pyramid BI tool included with Analytics for Learn.
However, when this product was kicked off, analytics experiences and views differed across our portfolio of products. We were challenged to consolidate those views into a single data reporting experience while also eliminating significant technical debt on our side.
Company
Blackboard, Inc.
Role
UXD Manager,
UX Contributor
Portfolio
Data & Analytics
Planning
When I joined Blackboard in 2017, this was a completely new product and initiative. I joined to help lead the effort with a small team of designers and an eager group of product managers and developers. At the time of conception we had market analysis, but we lacked a cross-team understanding of our target users and vision. I worked with stakeholders to align on the current state, develop a research plan, and define a strategic, experience-based roadmap from our findings. We knew we needed a product that was a) scalable, to meet the market needs of different institutions, and b) modular, to provide distinct value for individual products while amplifying that value when the entire suite is used for cross-product insights.
Our first delivery target was our annual conference, BbWorld (Blackboard World), which takes place every July. By then we hoped to have a proof of concept and a working prototype to test in our UX Lab and demo to clients.
*After the first BbWorld, we iterated and brought the product to more conferences over the years for feedback as it was built out.
Research
Since this was new territory for Blackboard, we knew we needed a fairly large generative research project to understand who our users are, how they work, and what they need. Our design research leveraged qualitative methods to explore how academic faculty use (or don't use) data and analytics, from the perspectives of analysts, novice faculty, and C-suite administrators. Throughout this research program, we sought to understand how faculty navigate their academic journey through the use of data and analytics.
Participants came from public and private universities in the United States that offer undergraduate education, including institutions that primarily serve transfer and online students. Our sample was small and not randomized, and it included many participants from underserved groups.
Research Objectives
Gather information and background on users to better understand the personas using analytics in education technology
Get a better idea of how learning analytics is understood and which tools users rely on
Determine user expectations and processes for acquiring data/reports
Gather answers to analytics-specific questions, e.g., what are users trying to measure?
Identify areas of confusion and frustration
Discover anything missing that the user finds important
Design Thinking Workshops
What we did: We facilitated workshops during Analytics Symposiums and our internal Teaching & Learning conferences in all of our regions to better understand how our users currently incorporate analytics into their roles. We talked with them about the tools they use, gathered feedback on current reports, and identified their key pain points.
Who we talked to: 25-30 Analytics Symposium attendees & various individual roles (Service Manager, Head of Office for Digital Learning, Developer, Advisor for LA PoC, Systems Engineer)
Generative Research
What we did: We performed 5 remote inquiries at 4 different universities. Each was a recorded, 90-minute conversation covering a wide range of topics, from data architecture and consolidation to report generation and analytics education. We also sent out an analytics questionnaire to answer specific questions about how participants use analytics, which tools they rely on, and what struggles they face in their day-to-day work.
Who we talked to: Director for Educational Technology at Wartburg Seminary, Adaptive Learning and Teaching Analyst at Charles Sturt University, Manager of Technologies at Charles Darwin University, Data Architect at Gonzaga University, Director of Institutional Research at Gonzaga University, and 9 other individuals with varying roles who took the survey.
User Understanding
As research was underway, we continuously built upon our knowledge of the users, who naturally fell along a spectrum from novice to expert. We mapped out their typical responsibilities, motivations, pain points, and professional job titles. We discovered that it was less about a user's journey through a data product and more about the questions they were asking of their data and all the hands that data passes through. So we mapped out the lifecycle of a data question to better see all the touchpoints, products, and issues involved.
Concepting & Iterations
Based on our knowledge of the users and the core insights that surfaced, we entered the phase of big ideas and concept design, where we pushed to think about the pain points and the opportunities we had to enhance our users' workflows. Throughout brainstorming, we kept the central opportunity in mind and explored ideas to solve this problem:
How can we exhibit the way our tools and offerings fit together and support the “next generation learning environment?”
We tested 3 different concepts, each informed by 1-2 of the core insights and resolving several pain points for each user type. After testing, we carried forward beneficial features from each concept, with the overall framework stemming from "The Hub" concept below.
All Inclusive
Supportive, Informative, Transparent
Choose Your Adventure
Assistive, Informative, Control
The Hub
Collaborative, Unrestricted, Adaptable
In Progress Work
The Hub concept was used to build the basic framework of the MVP for Blackboard Data. We are still working through features and sprints, continuously updating the design, testing with users, and innovating on ways to surface data insights where they are applicable. We have also started to focus on embedding insights across all of our products to meet users where they work and help them make decisions faster.
Outcomes
Agile & UX Design Process
As this was a newly structured team, we had plenty of growing pains, but we came together to understand what everyone needed and expected in order to do their best work. By combining agile methodologies with design thinking frameworks, we were able to get moving quickly.
Client Research & Connection
Through in-depth research efforts with clients around the globe, we connected with users who were eager to stay in touch and help us make the product the best it could be. We continue to reach out to them for user testing and meet with them at conferences.
Blackboard Data Vision
Having conducted the research needed to build the framework, we were able to break the design vision into an experience-based roadmap that lays out plans and keeps the product scalable for the next 2-3 years. We update it as needed when ongoing research surfaces new findings.