What Are the Best Practices for Establishing an AI Governance Plan? InfoSci Interim Dean Catherine Brooks Provides Insight
A recent article in Reworked explores the growing importance of information governance as organizations integrate generative artificial intelligence into their operations. In “Best Practices for Establishing an AI Governance Plan,” Reworked notes that while AI tools promise innovation and efficiency, they also introduce risks, including data mismanagement, regulatory non-compliance and unclear accountability. Experts agree that a strong governance plan is essential to navigate this evolving landscape effectively.
One of those experts is Catherine Brooks, professor and interim dean of the College of Information Science. In the article, she warns against underestimating the challenges posed by emerging AI tools. “In the face of emerging AI tools, the key is going to be to think ahead of ourselves as best we can in order to mitigate unintended negative impacts that are already emerging,” she says. Brooks stresses the need to balance the benefits of AI with the responsibility of managing its risks.
The article highlights five best practices to establish a robust AI governance foundation. One critical element is transparency, a principle Brooks emphasizes when she suggests that “organizations should consider labeling AI-assisted materials and being open about how AI is used in knowledge management processes.” This practice, she adds, helps ensure accountability and may soon align with regulatory requirements.
Training is another key focus. Brooks advocates for a workforce equipped with both analytic and creative skills to maximize the benefits of AI while adhering to ethical practices. The goal is “to train a workforce with analytic skills and with creative tools for managing, preserving and making information useful and reliable for other people,” she notes. “We need to avoid the use of AI tools ‘too much’ while being open and purposefully training people to use AI tools in ways that are beneficial and useful to them.”
Reworked bills itself as “the world’s leading community of employee experience, digital workplace and talent management professionals.” Accordingly, the article identifies practical steps organizations can take when preparing governance plans, including establishing governance frameworks, standardizing and cleaning up data, and ensuring traceability in AI systems. Brooks points out that effective governance also requires clear communication about how AI tools will be used, as well as ongoing education to foster ethical practices.
“In the end,” she tells the Reworked community, “human creativity remains important and cannot be replicated by AI tools, no matter how refined those tools may become.”
As businesses increasingly turn to AI to enhance knowledge sharing and innovation, Brooks and other experts stress the importance of tying technology adoption to thoughtful, strategic governance. Given AI’s potential to transform industries, those who prioritize transparency, accountability and workforce readiness will be best positioned to harness its promise while minimizing risks.
Read the full article at Reworked.
Meet Interim Dean Brooks or explore the interdisciplinary research and faculty of the College of Information Science.