How to Develop a Mastery Dashboard That Works

Posted September 30, 2019 | by Scott Ellis, CEO, MasteryTrack

“How to Develop a Mastery Dashboard that Works” by Scott Ellis was originally published on gettingsmart.com.

A key challenge in mastery learning is how to organize and display data about student learning progress. A web-based data dashboard is a common and reasonable approach to this task, and it scales in a way that a Google spreadsheet does not. But how should the dashboard be structured? And what kind of underlying data architecture makes this possible? In developing MasteryTrack we confronted these issues and sought a scalable approach that would work for students and teachers as well as principals and parents. We learned many lessons along the way, and these, together with the details of our ultimate solution, may be helpful for others grappling with the thorny issue of organizing and displaying data to enable mastery learning.

5 Mastery Dashboard Design Tips

1. Grain size. Early in our efforts, we struggled with what “grain size” of objectives to include in the dashboard and how to organize them. After experimenting with a few different structures we ultimately implemented a five-level hierarchy to describe learning content in any course:

  1. Subject (math)
  2. Course (Late Elementary Math II)
  3. Unit (decimals)
  4. Concept (basic operations)
  5. Objective (multiply decimals up to hundredths)

We initially had an additional level between Unit and Concept called Topic because we thought we might need a sixth level in the hierarchy, but we eventually found that it was never used. Five levels have been sufficient for a range of courses and course “types” (e.g., math, computer science, SEL, world languages, etc.). All courses use Subject, Course, Unit, and Objective, and some (but not all) use Concept as well. We have also found that the structure can vary for different sections of the same course—some have enough levels of detail that they need the Concept level while others can be accurately displayed without it.
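
To make the hierarchy concrete, here is a minimal sketch of how the five levels might be modeled in code. The type and field names are our own illustration rather than MasteryTrack's actual schema, and the Concept level is optional to reflect that some courses display accurately without it.

```typescript
// Illustrative model of the five-level content hierarchy.
// Type and field names are hypothetical, not MasteryTrack's schema.

interface Objective {
  id: string;
  description: string; // e.g., "multiply decimals up to hundredths"
}

interface Concept {
  id: string;
  name: string; // e.g., "basic operations"
  objectives: Objective[];
}

interface Unit {
  id: string;
  name: string; // e.g., "decimals"
  // Some courses need the Concept level; others attach
  // objectives directly to the unit.
  concepts?: Concept[];
  objectives?: Objective[];
}

interface Course {
  id: string;
  name: string; // e.g., "Late Elementary Math II"
  units: Unit[];
}

interface Subject {
  id: string;
  name: string; // e.g., "math"
  courses: Course[];
}
```

Making Concept optional at the Unit level is one way to let different courses include or omit that level without changing the overall architecture.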

2. Dashboard types. We created two types of dashboards: the overview dashboard and the objective dashboard. We found that teachers wanted to see a high-level view of student mastery status across an entire course and all its units (the overview dashboard), and then they wanted to be able to dive into the status for every objective within a unit (the objective dashboard). These two views enable users to quickly and easily understand where students are in their learning.
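
As a rough illustration of how the two views relate, the sketch below rolls per-objective statuses up into the single per-unit figure an overview dashboard might display. The status values and the rollup rule (fraction of objectives mastered) are our assumptions for illustration, not MasteryTrack's actual logic.

```typescript
// Hypothetical mastery status for a single objective; these three
// values are an assumption for illustration.
type MasteryStatus = "not-started" | "in-progress" | "mastered";

// One student's statuses, keyed by objective id. This is the grain
// the objective dashboard displays directly.
type ObjectiveStatuses = Record<string, MasteryStatus>;

// Roll objective-level data up to one value per unit for the
// overview dashboard: the fraction of the unit's objectives mastered.
function unitMasteryFraction(
  objectiveIds: string[],
  statuses: ObjectiveStatuses
): number {
  if (objectiveIds.length === 0) return 0;
  const mastered = objectiveIds.filter(
    (id) => statuses[id] === "mastered"
  ).length;
  return mastered / objectiveIds.length;
}
```

The overview dashboard would then show one such value per student per unit, and clicking into a unit would reveal the underlying statuses on the objective dashboard.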

3. Structures of courses. Course structures vary significantly by subject area and grade level, so in MasteryTrack we have implemented several different types. Some dashboards are designed to cover roughly a year of learning content. This is common in some math courses (e.g., Algebra, Geometry, Algebra II) and high school science courses. This may simply be an artifact of the existing time-based system, but it might actually be a reasonable long-term way to structure mastery-based content that is feasible for teachers and students. Learning trajectories like Algebra/Geometry/Algebra II are well understood, and structures like Chemistry/Honors Chemistry/AP Chemistry may provide an established mastery-based course architecture. So although mastery learning generally moves away from grade levels, we have found one-year courses helpful in some cases.

Many subjects are moving towards a grade-band structure. This is easy to display in MasteryTrack and enables teachers to see data for students at widely different places in their learning: those who are far ahead as well as those who are earlier in their progression. This has worked well for structuring courses like SEL, Computer Science, Elementary Math, and Social Studies, and could easily apply to others. One downside of this approach is that if too many units or objectives are visible on the screen, the dashboard becomes more cumbersome and less useful. A “course” that spans multiple grade levels may also have large sections that go unused simply because students have not yet reached that content.

We have developed a few courses that include several years of content. This has been particularly true in languages: it is the structure for Spanish Interpersonal Oral and Mandarin Chinese Writing. In Spanish Interpersonal Oral we initially designed a dashboard that included all content from beginner through advanced-intermediate, which covers several years of learning. This course has 10 units; the early units have one to six learning objectives each, while the advanced units have 14. This has been useful for teachers who want to see the full learning trajectory for students. Recently, however, teachers have asked us to restructure the Spanish content into separate courses for earlier learners, since so much of the advanced content is not relevant to them.

4. The number of objectives. A final dashboard architecture issue is the number of units and the number of objectives. The overview dashboard can become cumbersome if it has so many units that the user must scroll far to the right to see everything. In these cases, it can be helpful to split the course in two, or to consolidate units so the full content fits in one view. In the objective dashboards, if a particular unit has too many objectives (more than 15-20), it becomes hard to read and requires horizontal scrolling that users may find undesirable. We found that the best solution in these cases is to divide the unit in two or more so each has a more manageable number of objectives. On the other hand, we have encountered units with only one or two objectives. While this is feasible, it is a bit cumbersome for the user, so it can be more efficient to consolidate multiple units so each has at least four to five objectives.
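
These sizing rules of thumb are easy to check mechanically. The sketch below flags units that exceed the 15-20 objective ceiling or fall below the four-to-five objective floor; the thresholds come from our experience described above, while the checker itself is purely illustrative.

```typescript
// Flag units whose objective counts fall outside the rules of thumb.
// Thresholds mirror the guidance above; the checker is illustrative.
const MAX_OBJECTIVES = 15; // beyond this, consider splitting the unit
const MIN_OBJECTIVES = 4;  // below this, consider consolidating units

interface UnitSize {
  name: string;
  objectiveCount: number;
}

function sizingAdvice(units: UnitSize[]): string[] {
  return units.flatMap((unit) => {
    if (unit.objectiveCount > MAX_OBJECTIVES) {
      return [`${unit.name}: ${unit.objectiveCount} objectives; consider splitting into multiple units.`];
    }
    if (unit.objectiveCount < MIN_OBJECTIVES) {
      return [`${unit.name}: only ${unit.objectiveCount} objective(s); consider consolidating with a neighboring unit.`];
    }
    return [];
  });
}
```

Running a check like this whenever content is restructured helps keep every unit within the range that stays readable in one view.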

5. Our best advice? Don’t over-engineer your dashboard. If you’re building your own dashboard, initiate two or three trials with a simple prototype and commit to continuous iteration so you can start to identify the right foundation in your context. The risk of building a solution on the wrong architecture is too high to justify trying to create a complete tracking system without getting user feedback along the way.
