
We have conducted a number of pilot projects with individual instructors, programs, departments, and schools to create data dashboards and reports that deliver actionable, timely information. Through this iterative process, we have learned a great deal about how to refine key questions and answer them through learning analytics. We have grouped these questions into three levels: course, program/department, and school. This distinction makes it easier to navigate the available reports.

Course-level pilot project

Glenn Katz is a Lecturer in the Department of Civil & Environmental Engineering (CEE) at Stanford University specializing in Architectural Design Studio, Building Information Modeling (BIM), and Parametric Design. We worked with him and his TA, Cynthia Brosque, to transform three of his hybrid master's degree classes into fully online courses. Taking the CEE courses remotely gave students more flexibility, especially during the summer when they were away from campus or had internships.

"​In this transformation getting insights from the data was an important factor, we kept improving the course based on data generated by students taking the classes." said Glenn. 

We collaborated with the CEE teaching team to design custom learning analytics dashboards that provided insights into student online activity, the frequency of participation in course discussions, and assignment submission trends during the quarter. The patterns we identified helped guide course enhancement and redesign efforts. "This use of technology was no longer about efficiency in delivering the same material, but a better learning experience for the students," said Glenn.
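As a rough illustration of the kind of metrics such a dashboard can surface, here is a minimal sketch that summarizes discussion participation and submission timing from tabular activity exports. The file names and columns (user_id, posted_at, submitted_at, due_at, assignment_id) are placeholders for whatever Canvas export or data feed is actually available, not the pilot's real pipeline.

```python
import pandas as pd

# Hypothetical exports of discussion posts and assignment submissions.
posts = pd.read_csv("discussion_posts.csv", parse_dates=["posted_at"])
subs = pd.read_csv("submissions.csv", parse_dates=["submitted_at", "due_at"])

# Discussion participation: posts per student per week of the quarter.
posts["week"] = posts["posted_at"].dt.to_period("W")
participation = (
    posts.groupby(["week", "user_id"])
    .size()
    .rename("posts")
    .groupby("week")
    .agg(["mean", "count"])  # mean posts per active student, number of active students
)

# Submission trends: how early (or late) submissions arrive relative to the due date.
subs["days_before_due"] = (subs["due_at"] - subs["submitted_at"]).dt.days
submission_trend = subs.groupby("assignment_id")["days_before_due"].describe()

print(participation)
print(submission_trend)
```

Weekly participation and per-assignment submission timing are exactly the sort of quarter-long trends an instructor can scan to see where engagement dips.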

Our work with Glenn started with an analysis of his CEE 220A course data. At the start of the project, Glenn needed a way to identify nuances in student performance so he could recognize and address the learning roadblocks students encountered. The Learning Analytics team worked with him to analyze and revise elements of the course that would keep students engaged in an asynchronous environment, and to provide information showing where students were, and were not, engaging with the course.

Together, we used analytics to better understand which learning tools, activities, and materials students were interacting with. We used Canvas tools to outline student expectations and objectives, and used discussions to gauge student participation. This enabled Glenn to target topics that students may have missed, highlight the importance of certain lessons, and have more direct and targeted conversations with his students. It also helped him create material that he knew would be impactful for his students.

Glenn also created videos and infographics to help students better understand the complex topics of his course, and he looked at course trends to determine which topics students revisited the most and where they may have struggled with key concepts. He found the analytics and feedback valuable for tailoring the course to better fit students' needs. A unique advantage of moving online was that the analytics opened a whole new and unexpected window into how students work with the content, insights that were never available in a traditional classroom setting. Good use of data empowered Glenn to make changes he would not otherwise have been able to make. Here is Glenn's experience in 2 minutes!

Department-level pilot project

Advisors, programs, and departments benefit from student-generated data. The data provides a holistic view of student activity and course participation that can serve as a tool to support students and help them plan for success. Departments can get an overview of how each course is functioning through a report that covers all courses, identify students who are at risk of falling behind, and spot patterns that could reveal best practices or potential issues. For example, if more than a few students make repeated attempts to submit files, that might indicate a problem with uploading large files. Departments can use these dashboards to identify such issues, address them in a timely manner, and notify all faculty. Reports like these can enhance teaching and improve students' learning experience.
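The repeated-upload example above can be expressed as a simple check. The sketch below is hypothetical: the input table and its columns (course_id, user_id, assignment_id, attempt) and the thresholds are assumptions about what a department-level export might contain, not a description of the actual dashboards.

```python
import pandas as pd

# Assumed department-wide export of submission attempts across all courses.
subs = pd.read_csv("department_submissions.csv")

# Highest attempt number per student per assignment.
max_attempts = (
    subs.groupby(["course_id", "user_id", "assignment_id"])["attempt"]
    .max()
    .reset_index()
)

# Per course, count students who needed more than 3 attempts on some assignment.
struggling = (
    max_attempts[max_attempts["attempt"] > 3]
    .groupby("course_id")["user_id"]
    .nunique()
    .rename("students_with_repeated_attempts")
)

# Courses where several students hit the threshold may signal a systemic issue
# (for example, trouble uploading large files) rather than individual problems.
print(struggling[struggling >= 5].sort_values(ascending=False))
```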

School-level pilot project

Josh Weiss is the Director of Digital Learning Solutions at the Stanford Graduate School of Education (GSE). In this role, he supports professors and researchers with courses and digital learning experiences. His team of technologists, designers, and media specialists facilitates and deploys research-based practices as well as exploratory technologies to develop more meaningful learning experiences for students at the GSE.

His group frequently combines qualitative and quantitative analysis to target next steps with core support. To get a finer sense of how instructors were using Canvas for effective teaching, he teamed up with Kathy Mirzaei, Associate Director of Data and Analytics, and her team to build a custom data dashboard in Tableau. The goal of the dashboard was not only to show the frequency of student contributions (and other classic engagement metrics), but also to analyze the use of high-engagement versus low-engagement tools in Canvas and how that use varied by class and by quarter.
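To make the high- versus low-engagement comparison concrete, here is an illustrative sketch (not the GSE team's actual Tableau pipeline) that tags each Canvas tool with an engagement category and aggregates usage by course and quarter. The export format, column names, and the particular tool classification are assumptions for illustration only.

```python
import pandas as pd

# Assumed export with columns: course_id, quarter, tool, count.
events = pd.read_csv("canvas_tool_usage.csv")

# Example split: tools where students produce something vs. tools they mostly consume.
HIGH_ENGAGEMENT = {"discussions", "quizzes", "assignments", "collaborations"}
events["engagement"] = events["tool"].map(
    lambda t: "high" if t in HIGH_ENGAGEMENT else "low"
)

usage = events.pivot_table(
    index=["course_id", "quarter"],
    columns="engagement",
    values="count",
    aggfunc="sum",
    fill_value=0,
).reindex(columns=["high", "low"], fill_value=0)

# Share of activity happening in high-engagement tools, per course and quarter.
usage["high_share"] = usage["high"] / (usage["high"] + usage["low"])
print(usage.sort_values("high_share", ascending=False).head(10))
```

Ranking courses by the share of activity in high-engagement tools, quarter over quarter, is one way such a dashboard can surface classes worth a closer qualitative look.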

This data yielded some significant insights and guided new strategies for Josh and his team. First, the team was able to surface innovative, exemplary, and intentional teaching practices in classrooms that up to that point had not been captured or shared out. By identifying these exemplary cases, the support team at the GSE was able to approach standout instructional teams and de-silo their practices through extensive interviews and documentation, which over time accumulated into a bank of practices on the GSE IT Teaching Resources website. More than 20 interviews and accounts were collected during remote learning, and the site has continued to grow after remote instruction ended, carried by the momentum that the dashboard-driven conversations generated. The website continues to be accessed by instructors from around the university as they mine these accounts of intentional teaching for strategies and techniques to use in their own classrooms.

Another interesting insight emerged around aggregate practices. Over time, it became apparent that with remote instruction some instructors were leveling up their asynchronous practices and leveraging multiple remote teaching modalities to serve a diverse set of student needs. As this wave of innovative instruction continued through 2020 and 2021, Josh and his team were able to recruit instructors for panels and share-out sessions among instructors and HAs, giving voice to unheralded efforts and de-siloing new and worthwhile practices happening at Stanford.

To complement this, the team also put together a series of workshops addressing gaps in instruction that had surfaced during the share-out sessions and were validated by data from the Tableau dashboards. In this way, strengths were surfaced and weak points revealed, so the team could address overall needs in remote instruction.

Ready to start your learning analytics journey? Submit your request