Subcommittee on Technological Change and Artificial Intelligence
Issue Summary and Core Questions, November 2025
Frank Pasquale, David Rand, Phoebe Sengers (chair), Marcus Smolka, Abraham Stroock, Julia Thom-Levy
Over recent decades, digital technologies have profoundly altered practices across the university: in education, research and scholarship, student life, and general operations. AI technologies are now being rolled out across these contexts at a much faster pace, and sometimes more disruptively, than those earlier technologies. We can no longer assume that students will arrive on campus with strong foundations in core skills (e.g., writing, reading, mathematics, critical thinking, sustained deep thought), nor that the traditional experience of university education will provide these skills without special attention. At the same time, which skills are considered core may be in flux.
One dimension to consider is the rapid development and deployment of generative AI (i.e., algorithms that produce text, images, and other outputs by statistical prediction of likelihood of the next sequential output). These algorithms have grown astoundingly in their capacity to produce outputs that mimic human intellectual work, raising questions about what knowledge and skills humans need and what can be better outsourced to machines. This situation also prompts questions about when and how it is appropriate and desirable to integrate AI into the varied activities of the university, while suggesting opportunities for innovative scholarship, improved pedagogy, and better operations.
Cornell already has strong AI initiatives across research, teaching, and operations.1 These are coordinating the university’s immediate response to AI in a principled, strategic fashion. Instead of duplicating these efforts, our task is to consider AI at a longer scale and in the context of other past and possible future technological changes that impinge on the university’s missions.
At this scale, the effects of AI are as yet unknown. For example, it is difficult to assess the longer-term impact of AI on work and on the skills workers will need, although dire predictions of widespread job loss are likely overstated.2 Neither is it easy to anticipate the transformative possibilities of these technologies for learning, scholarship, and engagement. At the scale of decades, we cannot accurately anticipate the impact of other future technologies. What we do know is that we will benefit from fostering resilience in the institution and in our students in the face of such unpredictable change. The effects of such technologies are in any case not a given; they depend on ongoing technological, economic, and policy decisions that the university can inform through its own teaching, research, and public engagement.
Thinking more broadly about technological change
Generative AI is part of a longer lineage of information technologies (e.g., the internet, the World Wide Web, social media, data science, and machine learning) that affect how people understand what knowledge is, where they go to be informed, and the role of institutions (or lack thereof) in producing knowledge. These technologies have also participated in changing how work is done at the university, how our students understand themselves, and how they relate to the world around them. They have enabled extraordinary advances in education, research, and democratization of knowledge. But they are also implicated in trends that undermine the university’s truth-seeking and educational missions, including political polarization, online bullying, the deliberate spread of falsehoods, and a culture of trolling.3 Universities participate in the controversies that result, as explored by our Trust Subcommittee.
Students have been profoundly affected by these technological changes and accompanying cultural shifts. While there is disagreement about the degree to which this is caused by technology, it is the case that students arrive on university campuses with significantly different outlooks, attitudes, and skills than prior generations.4 Students’ social interactions have been reshaped through always-on connectivity and pressure to manage online reputations. They have far more access to information, informal learning opportunities, and networks beyond the university. At the same time, mental health issues like anxiety have increased dramatically5 and paths to career success have become unsettled. For better or for worse, online political interactions shape how students understand themselves as citizens. The cognitive skills students arrive with are also changing. For example, faculty report problems with basic reading skills, focused attention, and the ability to tolerate intellectual and emotional discomfort. These changes extend beyond the classroom to all facets of student life. They also affect faculty and staff. Cornell must grapple with these changes head-on.
Looking forward
Undergraduate, graduate, and professional education: We need to talk explicitly about how cognitive skills are shifting with the ubiquitous adoption of digital technology and the use of AI tools, and ask how we want to cultivate minds in this era. Our choices related to technology must be grounded in clarity about the core skills and experiences the university must foster, while recognizing that our ways of teaching these skills may need to shift with the arrival of new technologies. What aspects of specifically human intelligence and skill should we emphasize in a world with pervasive access to AI? What capacities may be compromised by AI use and need to be amplified? These might include emotional intelligence, as well as long-standing humanistic values like judgment, critical thinking, and aesthetic development.
Research and scholarship: Generative AI is both a topic of scholarship and a tool that can be used throughout the research process. Its use raises myriad questions about methodology, ethics, and what it means to know. There are opportunities for Cornell to achieve first-mover advantage in emerging research areas. At the same time, research heavily reliant on AI may be directional, potentially driving researchers to converge on overcrowded areas of technology and knowledge and obscuring the degree to which AI may provide answers without understanding. Cornell’s research program will need to strike a delicate balance, perhaps embracing new AI-based technologies while retaining and supporting researchers with diverse perspectives on integrating AI into the production of scholarship. A unique strength of Cornell in this landscape is the breadth of disciplines we offer and our low interdisciplinary barriers, which allow us to approach AI and other technologies from a distinctive vantage point: synthesizing social, humanistic, scientific, technical, and creative perspectives. We are simultaneously developing AI, critiquing AI, and using AI, and we have the capacity to cross-pollinate these efforts. For example, we can be leaders in sound research methodology in generative AI; we can also develop forms of AI that foster more effective use and enable better governance.
Public impact and community engagement: While universities are not isolated from the rest of the economy, non-profit institutions of higher education have a different incentive structure than private industry. At Cornell, it is possible to imagine designing, deploying, organizing, utilizing, and teaching about technologies differently. We would benefit from finding more ways to shift from being a passive recipient of technological change to being an active driver, in ways that center university and public priorities rather than those arising from the political economy of tech innovation. We can leverage our robust outreach and extension arms to engage the public in considering the consequences of technology and the needs it could address. Such leadership can enable transformative public impact. For example, we can contribute to technology ethics, policy, and law, and develop technologies in the public interest and in domains underserved by industry. Such contributions align with the public responsibilities of universities, as explored by our University-Government Relations Subcommittee.
Core Questions
- How can universities, through their curricula, departmental structures, and central administrative resources, be organized to allow them to adapt quickly and respond effectively to a rapidly changing landscape?
- What core skills and experiences should the university focus on in an era of rapid technological change and an unpredictably evolving workforce? Where should we alter our methods in research and teaching to ensure that these skills and experiences are indeed central?
- How and where can Cornell serve as a living laboratory for and benefit from research innovations in AI? Should we, and how can we, leave protected space for modes of teaching, learning, and research that are not infused with AI?
- What unique role can universities play in fostering technology development and use that benefit society?
Footnotes
1. See ai.cornell.edu. See also Cornell’s reports on Generative AI in Research and on Generative AI for Education and Pedagogy.
2. Autor, David. The Labor Market Impacts of Technological Change: From Unbridled Enthusiasm to Qualified Optimism to Vast Uncertainty. Working Paper no. w30074, National Bureau of Economic Research, 30 May 2022, https://doi.org/10.3386/w30074. Frey, Carl Benedikt, and Michael Osborne. “Generative AI and the Future of Work: A Reappraisal.” Brown Journal of World Affairs, vol. 30, 2023, p. 161.
3. Barberá, Pablo. “Social Media, Echo Chambers, and Political Polarization.” Social Media and Democracy, edited by Joshua A. Tucker and Nathaniel Persily, Cambridge University Press, 2020, pp. 34–55. Nagle, Angela. Kill All Normies: Online Culture Wars from 4chan and Tumblr to Trump and the Alt-Right. Zer0 Books, 2018.
4. Twenge, Jean M. iGen. Atria Books, 2017.
5. Tan, Gabriel X. D., et al. “Prevalence of Anxiety in College and University Students: An Umbrella Review.” Journal of Affective Disorders Reports, vol. 14, Dec. 2023, p. 100658, https://doi.org/10.1016/j.jadr.2023.100658.