Pedagogy and Governance: What Responsible AI Adoption Requires
Most institutions are asking how AI can improve learning. That is the right question. But it is not the only one.
The second question is less visible and often more consequential: What has the institution agreed to before the tool ever reaches a classroom?
Responsible AI adoption in education requires two things. Both are necessary. Neither is sufficient alone.
The first is pedagogy. Does the tool strengthen the student-teacher-content relationship, or weaken it? Does it support the cognitive work that produces learning, or displace it? Is it introduced through deliberate instructional design, or simply added to practice without evidence of its effects? These questions deserve serious attention. Brookings, UNESCO, and UNICEF have devoted substantial effort to them, and their findings point in the same direction: AI can enrich learning under specific conditions, but those conditions require intentional design, not passive adoption.
The second is governance. Who controls the environment in which pedagogy now occurs? What data is being collected, under what terms, stored where, and used for what purposes? Does the institution retain the capacity to change direction, or has it ceded that capacity to a vendor without realizing it? Who inside the institution is accountable for these decisions?
These questions receive far less attention — yet they often determine whether pedagogical promises can be realized at all.
The Gap Is Documented
The imbalance is visible in the research literature itself. A 2025 bibliometric analysis published in IEEE Access, "AI Governance in the Context of the EU AI Act," found that education is the single most-studied AI application domain in its dataset. Yet governance-focused research within that education literature remains a small fraction of the whole, dominated by studies of learning outcomes, tool use, and self-regulated learning. The authors identify a significant lag between AI technological advancement and the development of policy and regulation as a defining feature of the current landscape. Pedagogy at scale. Governance at the margins.
The same imbalance shows up at the institutional level. The EDUCAUSE 2025 AI Landscape Study found that only 39% of higher education institutions have AI-related acceptable use policies — up from 23% the year before. An acceptable use policy is not procurement governance. It does not address vendor data rights, contract portability, stewardship accountability, or what the institution agreed to give away in exchange for access. The pedagogical conversation is advancing. The governance infrastructure it depends on is not keeping pace.
Why the Two Cannot Be Separated
This is not an argument that governance matters more than pedagogy. It is an argument that pedagogical questions become harder — sometimes impossible — to answer when governance conditions are weak.
An institution cannot fully evaluate whether a tool serves learning if it cannot inspect how the system works, access meaningful data about its effects, or change course without major disruption. It cannot protect the student-teacher relationship if student interactions are becoming commercial training data. It cannot claim responsible innovation if responsibility has been outsourced to terms of service no one meaningfully reviewed.
Pedagogy and governance are not two separate conversations. They are two dimensions of the same institutional decision, and the most consequential of those decisions are often made before anyone logs in.
Four Criteria That Hold Both Together
The following criteria do not replace the pedagogy conversation. They are the institutional conditions that make it answerable. Each has a pedagogical dimension and a governance dimension. Neither can be ignored without undermining the other.
Capacity vs. Dependency
What can your institution still do if this tool disappears?
Sustainable adoption builds durable institutional expertise alongside tool use — in data interpretation, in pedagogical design, in technical governance — rather than transferring those capabilities to a vendor in exchange for access. When training requires the vendor's trainers, when system knowledge lives only in external documentation, when faculty cannot adapt or interrogate the tool without vendor mediation, the adoption has created dependency where it should have built capacity.
Pedagogically, dependency produces the outcome the best AI-in-education research warns against: institutions unable to exercise the professional judgment necessary to deploy AI well because that judgment has been outsourced. A responsible adoption leaves the institution stronger, not more reliant. Sustainable adoption is measured not by what a tool can do, but by what an institution can still do if the tool disappears.
Reciprocity vs. Extraction
What does the vendor receive from this relationship — and does the institution receive commensurate value in return?
Vendors receive more than licensing fees. They receive behavioral signals generated by students navigating a platform, response patterns from thousands of tutoring interactions, pedagogical content created by faculty inside a vendor-controlled architecture. The pedagogical activity of the institution simultaneously generates commercial value for the vendor. Contracts that grant vendors rights to use institutional or student-generated content for AI model training without explicit governance review represent a form of extraction the field has only begun to name.
Educational activity should not become unpaid research and development. Reciprocity is the minimum governance condition for a relationship that supports, rather than exploits, the educational work it ostensibly exists to serve.
Infrastructure vs. Platforms
Does your institution govern the systems its education depends on — or has it quietly ceded that capacity to a commercial platform?
Infrastructure is governed. Platforms are used. When core educational functions — course delivery, assessment, student data management, learning analytics — depend on a commercial platform, governance of those functions has migrated to the vendor. The institution uses the system. The vendor controls it. The pedagogical decisions the institution believes it is making are being shaped by design choices it did not make and may not be able to see.
Data portability provisions, interoperability requirements, and lock-in cost assessments determine whether an institution retains the capacity to change direction. Without them, convenience today becomes lock-in tomorrow — and under-resourced institutions, with less negotiating leverage and less technical capacity to assess what they are agreeing to, bear this risk most acutely.
Stewardship vs. Consumption
Are student and faculty data treated as institutional assets to protect — or as inputs to be processed?
This criterion connects governance structure directly to pedagogical trust. The student-teacher relationship that serious AI research identifies as central to learning depends on conditions of safety, privacy, and institutional accountability. Named data stewardship roles — not vendor contacts — are the minimum governance infrastructure for responsible adoption. That means named human beings inside the institution who know what is being collected, under what terms, stored where, and accessible to whom.
When that accountability lives with a vendor's customer success team, the data is not being protected. It is being managed on the vendor's behalf. Stewardship requires organizational design, not just organizational values — and it is the governance precondition for the trust conditions that learning requires.
What Responsible Adoption Requires
Responsible AI adoption does not begin with the demo. It begins with institutional readiness — aligning pedagogy, governance, procurement, and accountability before deployment begins.
The field's pedagogical conversation is asking the right questions. The governance infrastructure it depends on has not been built at the same pace, in the same institutions, with the same urgency. These four criteria are where that work begins: in the contract, before the signature, while there is still something to negotiate.
The Sustainable Learning Framework's Digital Stewardship pillar was built around this integration. Governance is not the obstacle to good pedagogy. It is the precondition for it.
Resources
On the Research Literature Gap
A 2025 bibliometric analysis published in IEEE Access, "AI Governance in the Context of the EU AI Act," examined AI governance research across application domains. Education emerged as the most-studied category, yet governance-focused research within that literature remained a small fraction of the whole — dominated by pedagogy questions. AI Governance in the Context of the EU AI Act — IEEE Access
On the Scale of the Institutional Gap
The EDUCAUSE 2025 AI Landscape Study documents the state of institutional AI policy adoption in higher education, including the finding that only 39% of institutions have AI-related acceptable use policies. EDUCAUSE 2025 AI Landscape Study
On AI Contract Governance and Institutional Risk
The AI Now Institute's annual reports analyze how AI procurement contracts distribute risk between institutions and vendors — the conceptual foundation for the Reciprocity vs. Extraction and Infrastructure vs. Platforms criteria. AI Now Institute
On Student Data Privacy Governance
The Student Data Privacy Consortium maintains model contract language, vendor assessment tools, and state-level policy guidance for institutions navigating AI procurement. Student Data Privacy Consortium
On Institutional Technology Governance
CoSN's Trusted Learning Environment program provides a framework for data stewardship roles, contract provisions, and equity assessments relevant to the Capacity vs. Dependency and Infrastructure vs. Platforms criteria. CoSN Trusted Learning Environment
The Sustainable Learning Framework
The Digital Stewardship pillar — including Provide Secure Learning Spaces, Promote Digital Fluency, and Use Purpose-Designed Platforms — grounds the governance criteria discussed in this piece. Sustainable Learning Framework