The university is actively assessing and exploring the impact of artificial intelligence (AI) on teaching and learning, research and scholarship, administrative processes, and other functions. Although AI is not new and has been a part of our lives for decades, its growing capabilities are transforming the way we think about our daily work.
AI is an umbrella term for technologies that enable machines to perform tasks that typically require human intelligence, such as reasoning, problem-solving, learning from experience, and adapting to new situations. It encompasses several main areas:
On campus, our information technology and security teams have been exploring AI's potential and have approved the following AI tools for university use:
Additional information is available on the AI Tool Comparison webpage.
Faculty and staff can harness the power of AI in various ways to streamline processes and improve productivity. One of the most effective ways to use these approved generative AI tools is to craft clear, concise input prompts that guide the model's responses, a practice known as prompt engineering.
Decide what you want to ask and provide context for why you're asking. Instead of asking "Tell me about AI," ask "What are the key applications of AI in healthcare, and how are they improving patient outcomes?" Or pick a topic and ask for step-by-step explanations: "What are the top five things I should know when managing a large project?" or "What is agile project management, and how does it differ from waterfall?" Providing context, and asking whether the model needs more information, will produce better results.
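The guidance above can be sketched as a small script that assembles a structured prompt from its parts (context, a specific question, and a desired format). The `build_prompt` helper below is purely illustrative and not part of any approved university tool:

```python
def build_prompt(topic, context, question, output_format=None):
    """Assemble a structured prompt: state the context, ask a specific
    question, and optionally request a response format."""
    parts = [f"Context: {context}"]
    parts.append(f"Question about {topic}: {question}")
    if output_format:
        parts.append(f"Format the answer as {output_format}.")
    return "\n".join(parts)

# A vague prompt is likely to get a generic answer:
vague = "Tell me about AI"

# An engineered prompt adds context and a specific, answerable question:
specific = build_prompt(
    topic="AI in healthcare",
    context="I am preparing a briefing for university administrators.",
    question=("What are the key applications of AI in healthcare, "
              "and how are they improving patient outcomes?"),
    output_format="five short bullet points",
)
print(specific)
```

The same pattern works when typing directly into Copilot Chat: lead with context, ask one specific question, and say how you want the answer formatted.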
When using Copilot Chat, the start screen where you enter your prompt will look like the following:
DALL-E has been integrated into Copilot, enhancing its capabilities with advanced image generation. This integration empowers Copilot users to create custom visuals, streamlining content creation for academic and administrative purposes. Prompt engineering is just as important when asking Copilot to generate a DALL-E image.
The AI landscape is rapidly evolving, with new tools being introduced frequently. Our Information Security and IT Compliance (ISIC) teams are continuously reviewing AI considerations. While AI is a powerful tool, it requires careful planning and ongoing monitoring. It is crucial to balance the benefits of AI with the need to minimize risks.
Most of these AI tools are currently approved for use with public and confidential data. At this time, only Copilot for Microsoft 365 is approved for use with highly confidential data. The University of Colorado has adopted data classification types including highly confidential information, confidential information, and public information. Review the CU Data Classification Table for more information about data types.
Be cautious when using AI tools, ensuring that personal, sensitive, and university data are not uploaded or shared with unvetted AI systems, as they may not be secure. How an AI tool or assistant processes and uses the data that is input into it is a key factor in determining its security. Refer to our AI Tool Comparison webpage for more information.
In addition to using university-approved AI tools with public or confidential data, all users must follow and abide by university policy and relevant state and federal laws regarding the protection of data and information systems. These include, but are not limited to:
Please note: For Microsoft AI tools, though Microsoft assures that your data and university data won't be used to train its AI models, it is still advisable to exercise caution. When using Copilot Chat, be sure you are logged in to your university account and confirm that the "protected shield" is visible in the prompt area. Additional information about securely using AI is available on the ISIC webpages: Microsoft Copilot products, Zoom AI Companion features, and AI Security and Compliance.
Request an Applications Assessment for assistance vetting an AI tool before acquiring it, particularly if the AI is intended for clinical purposes or will use highly confidential data, such as data protected under FERPA or HIPAA.