Summary Review of AI Guidance:
- AI tools reviewed and supported by CU should not use highly confidential data.
- Take care with personal and sensitive information when using AI tools.
- Do not integrate unapproved AI tools with university systems (e.g., Otter.ai in Zoom meetings, Gmail plug-ins).
- Coordinate with UIS staff on any AI usage to protect you and CU.
The following tools have been approved for CU System Administration use with university data.
You are responsible for the data you enter into a tool and the output from AI tools. Do not use highly confidential data while using AI tools.
Copilot for web (previously known as Bing Chat) is an AI-powered web chat that can generate content, images and answers to questions, potentially improving productivity. It is approved only for use through a university Microsoft account, and users should use discretion when sharing personal or organizational data.
- CU Anschutz provides guidelines for using Copilot Chat securely. Highly confidential data should NOT be entered into Copilot Chat. Also make sure you are using your university account so that any information added during the session is protected and then purged when the session ends.
- How to access: The Microsoft Copilot Chat web app can be accessed from Microsoft Edge or Microsoft Bing. Make sure you are logged into your university account and then either:
- Launch your browser and click the Copilot icon in the upper right-hand corner; or,
- Go to copilot.microsoft.com and log in to begin.
Copilot for Microsoft 365 is an AI-powered assistant designed to enhance productivity and streamline workflows within the Microsoft 365 suite. By leveraging advanced AI and machine learning technologies, Copilot helps users quickly generate content, analyze data, summarize documents, learn new skills and write code.
- Copilot for Microsoft 365 is approved for use with public and confidential data (not highly confidential data). Users must still be logged in to their CU account to ensure proper data protection, and must abide by university policy and relevant state and federal law regarding the protection of data and information systems.
- How to access: Available for license purchase by contacting the UIS Service Desk.
Zoom AI Companion is an intelligent assistant integrated into the Zoom platform, designed to enhance virtual meetings and collaboration. This AI tool can help staff and faculty generate meeting summaries, transcribe conversations, and highlight key points. It can also provide actionable insights, suggest next steps, and assist with scheduling and follow-ups.
Adobe Firefly is a suite of generative AI tools integrated into Adobe's Creative Cloud, designed to enhance creativity and streamline the creative process. Firefly leverages advanced AI to generate images, graphics and other visual content from textual descriptions, allowing users to quickly and easily bring their creative visions to life.
- How to access: Requires purchase of an Adobe Creative Cloud license.
Salesforce Einstein is a set of AI tools that use machine learning, natural language processing and other techniques to analyze data and automate tasks.
Google Gemini (previously known as Google Bard) is an AI chat service that generates text, translates languages and provides creative content. It was found not to be secure for university data.
ChatGPT generates text for answering questions, providing explanations, engaging in conversations, translating languages and more. It was found not to be secure for university data.
CU Policies and Procedures
CU has a variety of policies and procedures regarding information technology, information security, data and procurement that may apply to the use of AI tools. Rather than writing separate policies for each technology, CU endeavors to develop policies that apply to a wide range of technologies; these policies also apply to AI.
Universitywide policies
CU System Administration policies
State Regulations
Under current review: Senate Bill 24-205 AI Decision Making Tools
The new law addresses the risk of algorithmic discrimination when AI tools are used in decision-making processes, such as higher education enrollment. It also requires that end users be informed when they are interacting with an AI system.
Enforcement begins Feb. 1, 2026.
UIS anticipates that more state, national and international AI regulations will emerge.
Campus Resources
CU Anschutz: Artificial Intelligence
CU Anschutz: AI Opportunities, Risks and Dangers
CU Boulder: Artificial Intelligence
CU Boulder: Guiding Principles for Generative AI in Support of Marketing and Communications
CU Boulder: AI Limitations and Considerations — AI Hallucinations and Bias
CU Boulder Libraries: Generative AI LibGuide
CU Denver: Artificial Intelligence Tools
UCCS: Artificial Intelligence
Other Resources
Generative AI in a Nutshell — YouTube video
Educause: A Generative AI Primer
Educause: The Basics of AI in Higher Education
Microsoft: Empowering responsible AI practices
US Office of Science and Technology Policy: Blueprint for an AI Bill of Rights