Earlier this week my Ithaka S+R colleagues and I published “Student Data in the Digital Era: An Overview of Current Practices,” in which we review how institutions of higher education are currently using student data, and some of the practical and ethical challenges they face in doing so. As we conducted research for this report, part of our Responsible Use of Student Data in Higher Education project with Stanford University, we heard recurring concerns about the growing role of for-profit vendors in learning analytics. These third-party vendors, the argument goes, operate without the ethical obligations to students that institutions have, and design their products at a remove from the spaces where learning happens.

Because of these concerns, several prominent leaders in the field have argued for academic institutions to take a leading role in learning analytics. Just last May, George Siemens, who has long advocated for open analytics, stressed the importance of educators being “active participants in the sociotechnical and economic spaces” driving change, and warned against the consequences of third-party vendors with black-box algorithms taking too much control over the field. Similarly, in an article published in the Chronicle of Higher Education earlier this year, Stanford’s Candace Thille advocated for colleges to take more control over learning analytics and adaptive learning, and encouraged institutions to incentivize research and innovation with student data.

These are laudable goals, but institutions face many challenges when it comes to realizing this vision. Below are four strategies that higher education leaders and other stakeholders should pursue if colleges and universities are to successfully take the lead in the development of learning analytics:

Collaborate across differentially resourced institutions: As Michael Meotti pointedly noted in a recent article on higher education’s role in learning analytics, community colleges and broad-access public institutions, which serve the majority of at-risk students, usually do not have the research or technical capacity to develop in-house learning analytics solutions. When these institutions do use analytics tools, they typically rely on third-party platforms. And even if institutions like Stanford or Carnegie Mellon were to take the lead in developing learning analytics solutions for higher education writ large, questions remain about whether models developed at these schools would be appropriate for all institutional contexts.

In order to address this issue, institutions of various types should commit to collaboration and openness. Several initiatives, such as the PAR Framework, the Open Academic Analytics Initiative, and the Simon Initiative Learning Engineering Ecosystem offer some models for this, and have allowed institutions to increase their technical capacity through common and shared resources. As learning analytics grows, there will be a need for increased infrastructure and incentives to support open analytics models and collaboration across institutions.

Create incentives for research and application based on student data: While colleges may be the ideal laboratories for research into student learning, there is not yet a widespread sense of obligation towards a scholarly approach to teaching and learning. Nor is there well-established infrastructure for translating this research into institutionalized changes or interventions. Institutions like Stanford and the University of Michigan have found innovative ways to incentivize and incubate learning science research, and Colorado Technical University, University of Central Florida, and University of Maryland University College are currently sharing data and best practices to support adaptive learning and learning analytics initiatives. But examples like these remain the exception rather than the rule. If colleges are to take the lead in learning analytics development, incentives and opportunities for researchers to engage in this sort of work in institutional settings must become more widespread.

Create tools that instructors and students can easily understand: Much of the argument for colleges taking the lead in learning analytics development is based on the notion that those closest to student learning should maintain control over it. By this logic, instructors and students—as well as academic researchers—should be able to easily understand the data and the algorithms that inform teaching and learning, so that they can be collaborators, rather than mere data points or facilitators, in the analytics process. This means that analytics tools should be built with interfaces that allow for easy visualization and input, assisting rather than determining decision making, and that institutions should operate with transparency as they collect and use student data.

Establish ethical guidelines: Implicit in Thille’s and Siemens’s urgings are assumptions that institutions have an obligation to use student data, and that they can do so in a more ethical and effective way than can third-party vendors. Yet few institutions have articulated ethical principles to guide their use of student data for learning analytics, or to frame it within larger institutional goals. Colleges and universities are indeed better positioned to manage issues like transparency, consent, privacy, and bias than are third-party vendors, but, in order to effectively leverage this position, they must clearly articulate guidelines that define their approach.

For-profit vendors bring resources, talent, and new ideas into the learning analytics space, and have a crucial role to play in an ethical, institution-led ecosystem of development. If institutions are to take the lead in this ecosystem, they must collaborate—with other institutions and with vendors—to create open models, incentivize innovations in learning sciences, build tools that are accessible to multiple campus stakeholders, and establish clear ethical principles to define responsible institutional use.