Artificial Intelligence & Society Research Area

College of Information Science faculty are leading researchers in artificial intelligence and society. Their work spans computer-assisted language learning; language, identity and communication; misinformation, disinformation and fake information; narrative design; AI literacy; and AI's broader impact on society.


Faculty

Select Current & Recent Research

Current and recent funded faculty research in this area includes but is not limited to the following projects:

Next-Generation Teams
PI: Adarsh Pyarelal (The University of Arizona)
Co-PIs: Clayton Morrison, Kobus Barnard, Winslow Burleson (The University of Arizona)
Key Personnel: Payal Khosla (The University of Arizona)
Collaborators: Evan Carter (Army Research Laboratory)
Funding: U.S. Army Contracting Command, $882,546
Project Dates: September 1, 2025 – August 31, 2026
Website: wiki.lab.pyarelal.xyz/books/projects/page/next-generation-teams
Summary:
The Next-Generation Teams project will conduct experiments to study the structured communication protocols used by workplace and other teams that include AI agents, the interventions AI agents can employ to correct deviations from those protocols, and the effects of AI on team performance, coordination, creativity and plan recognition.


DASS: A Framework for Smart Contract Wills
PI: Clayton Morrison
Funding: National Science Foundation, $748,328
Project Dates: October 2022 – September 2026
Publications:
“A Framework to Retrieve Relevant Laws for Will Execution,” Proceedings of the Natural Legal Language Processing Workshop, 2025
“Classify First, and Then Extract: Prompt Chaining Techniques for Information Extraction,” The 6th Natural Legal Language Processing Workshop, 2024
“Information Extraction from Legal Wills: How Well Does GPT-4 Do?” Findings of EMNLP, 2023
“Validity Assessment of Legal Will Statements as Natural Language Inference,” Findings of the Association for Computational Linguistics, 2022
Website: ml4ai.github.io/nli4wills-corpus
Summary: Researchers are developing accountable software by combining legal expertise with techniques from natural language processing and formal methods for verifiable software generation and execution. By accountable, the researchers mean software that will (1) verifiably follow all applicable rules and laws, (2) adapt appropriately to changes in laws, and (3) be explainable, so that non-programmers can understand what the code does and how it relates to their intentions. The project is a collaboration among the University of Arizona's College of Information Science, James E. Rogers College of Law and Department of Computer Science.