AI at CU Anschutz

University-approved tools, guidelines and best practices

The university is actively assessing and exploring the impact of artificial intelligence (AI) on teaching and learning, research and scholarship, administrative processes, and other functions. Although AI is not new and has been a part of our lives for decades, its potential usefulness is transforming the way we think about our daily work.

AI is a catch-all term for technology that helps machines do things that usually require human intelligence, such as reasoning, problem-solving, learning from experience, and adapting to new situations. It breaks down into a few main areas:

  • Machine Learning (ML): ML involves computers learning from data and improving their performance over time. They can identify patterns, make predictions, and sort information based on past observations. For instance, machine learning algorithms help identify spam emails by analyzing patterns in past messages (a brief code sketch illustrating this follows the list). 
  • Deep Learning (DL): DL uses networks of artificial "neurons" to understand complex data relationships. It excels at tasks like recognizing images, understanding human language, and even creating human-like speech. These models can, for example, assist doctors in diagnosing medical conditions by analyzing medical images. 
  • Generative AI: This area of AI focuses on creating new content, such as images, text or music. A good example is DALL-E, which can create unique images from descriptions you give it.
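
As a concrete illustration of the machine-learning bullet above, the short Python sketch below trains a tiny spam classifier on a few made-up messages using scikit-learn. The example messages, labels, and library choice are assumptions for illustration only, not a university-endorsed tool or workflow.

# Illustrative only: a tiny spam classifier showing how ML "learns from data."
# The example messages and labels are invented for demonstration purposes.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

messages = [
    "Win a free prize now",              # spam
    "Meeting moved to 3 pm today",       # not spam
    "Claim your reward, click here",     # spam
    "Draft agenda attached for review",  # not spam
]
labels = ["spam", "not spam", "spam", "not spam"]

# Convert the text into word-count features, then fit a Naive Bayes model.
vectorizer = CountVectorizer()
features = vectorizer.fit_transform(messages)
model = MultinomialNB()
model.fit(features, labels)

# The model predicts labels for new, unseen messages based on learned patterns.
print(model.predict(vectorizer.transform(["Free reward waiting, click now"])))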

On campus, our information technology and security teams have been exploring AI's potential and have approved the following AI tools for university use:

  • Microsoft Copilot for the Web: Microsoft Copilot lets you chat with an AI agent in the Edge browser or on Bing that can answer your questions, generate content, or help you with tasks using public online data. Think of it as a smart and friendly chatbot that can talk to you about anything you want.
  • Copilot for Microsoft 365: Copilot for Microsoft 365 is an AI-powered assistant designed to enhance productivity and streamline workflows within the Microsoft 365 suite. By leveraging advanced AI and machine learning technologies, Copilot helps users quickly generate content, analyze data, summarize documents, learn new skills and write code.
  • Zoom AI Companion: Zoom AI Companion is an intelligent assistant integrated into the Zoom platform, designed to enhance virtual meetings and collaboration. This AI-powered tool can help staff and faculty by generating meeting summaries, transcribing conversations, and highlighting key points in real time. It can also provide actionable insights, suggest next steps and assist with scheduling and follow-ups.
  • Adobe Firefly: Adobe Firefly is a suite of generative AI tools integrated into Adobe's Creative Cloud, designed to enhance creativity and streamline the creative process. Firefly leverages advanced AI to generate images, graphics, and other visual content from textual descriptions, allowing users to quickly and easily bring their creative visions to life.

Additional information is available on the AI Tool Comparison webpage.

 

Best Practices When Using AI

  • Gain a basic understanding of how AI works, including its capabilities and limitations, using the tools approved for use at the university.
  • Clearly define what you want to achieve with AI. Whether it’s automating tasks, improving decision-making, or enhancing user experiences, having clear objectives will guide your AI implementation.
  • The use of AI should be clearly documented and disclosed. Be sure to cite your AI usage clearly, including any written prompts you incorporate.
  • Always evaluate results as responses may not always be accurate. Verify information using credible sources to ensure accuracy.
  • Be alert to potential bias; a fairness assessment can help ensure equitable results. Evaluate whether an AI model treats all individuals equitably by ensuring data quality, defining fairness criteria, and identifying protected attributes such as race or gender (a small illustrative check appears after this list).
  • Prioritize ethical guidelines, fairness, transparency, and compliance with privacy laws.
  • Ensuring data protection is our top priority. Most of these tools are currently approved for use with public and confidential data, as defined in the university’s data classification system. Copilot for Microsoft 365 is the only approved Copilot AI tool for use with highly confidential data.
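
The bias and fairness point above can be made concrete with a small check. The Python sketch below compares positive-outcome rates across groups defined by a protected attribute, a simple demographic-parity style comparison; the column names, sample data, and the four-fifths threshold are assumptions for illustration, not a prescribed university method.

# Illustrative fairness check: compare positive-outcome rates across groups.
# Column names ("group", "approved"), the sample data, and the 0.8 threshold
# are assumptions made for this example.
import pandas as pd

results = pd.DataFrame({
    "group":    ["A", "A", "A", "B", "B", "B"],
    "approved": [1,   1,   0,   1,   0,   0],
})

rates = results.groupby("group")["approved"].mean()
print(rates)

# A common rule of thumb (the "four-fifths" guideline): flag for review if any
# group's rate falls below 80% of the highest group's rate.
if rates.min() < 0.8 * rates.max():
    print("Potential disparity detected; review data quality and fairness criteria.")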

 

Getting Started

Faculty and staff can harness the power of AI in various ways to streamline processes and improve productivity. One of the most effective ways to use these approved generative AI tools is to write clear and concise prompts, a practice called prompt engineering: crafting effective input prompts to guide the model's responses.

Decide what you want to ask and provide context for why you're asking. Instead of asking "Tell me about AI," you can ask, "What are the key applications of AI in healthcare, and how are they improving patient outcomes?" Or you could pick a topic and ask for step-by-step explanations, such as "What are the top five things I should know when managing a large project?" or "What is agile project management, and how does it differ from waterfall?" Providing context and asking whether more information is needed will produce better results.
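
One way to keep prompts clear and consistent is to assemble them from a few labeled parts (context, task, desired format) before pasting them into a chat tool. The Python sketch below is a hypothetical helper illustrating that structure; the function name and fields are illustrative assumptions, not part of any approved tool.

# Hypothetical helper illustrating one way to structure a prompt with context,
# a specific task, and a desired output format.
def build_prompt(context: str, task: str, output_format: str) -> str:
    return (
        f"Context: {context}\n"
        f"Task: {task}\n"
        f"Desired format: {output_format}\n"
        "Please ask me for any missing information before answering."
    )

print(build_prompt(
    context="I manage a five-person team starting a year-long research project.",
    task="List the top five things I should know when managing a large project.",
    output_format="A numbered list with one sentence of explanation per item.",
))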

When using Copilot Chat, the start screen where you will enter your prompt looks like the following:

 

[Screenshot: Copilot Chat start screen]

DALL-E has been integrated into Copilot, enhancing its capabilities with advanced image generation. This integration empowers Copilot users to create custom visuals, streamlining content creation for academic and administrative purposes. Prompt engineering is just as important when asking AI to create a DALL-E image.

For example, the following specific instructions were used to generate the four sample images below: Create a highly detailed, realistic image of a modern research campus. Include state-of-the-art buildings with large glass windows, a central plaza with green spaces and benches, students walking and discussing projects, and researchers in lab coats working in a high-tech laboratory visible through the windows. The scene should have a vibrant, sunny atmosphere.

 

[Four sample images generated with DALL-E from the prompt above]

Guidelines for Safely Using AI

The AI landscape is rapidly evolving, with new tools being introduced frequently. Our Information Security and IT Compliance (ISIC) teams are continuously reviewing AI considerations. While AI is a powerful tool, it requires careful planning and ongoing monitoring. It is crucial to balance the benefits of AI with the need to minimize risks.


Data Privacy

Most of these AI tools are currently approved for use with public and confidential data. At this time, only Copilot for Microsoft 365 is approved for use with highly confidential data. The University of Colorado has adopted data classification types including highly confidential information, confidential information and public information. Review the CU Data Classification Table for more information about data types. 

Be cautious when using AI tools, ensuring that personal, sensitive and university data are not uploaded or shared with unvetted AI systems, as they may not be secure. How an AI tool or assistant processes and uses the data that is input into it is a key factor in determining its security. Refer to our AI Tool Comparison webpage for more information.

In addition to using university-approved AI tools with public or confidential data, all users must follow and abide by university policy and relevant state and federal law regarding protecting data and information systems.

Please note: For Microsoft AI tools, although Microsoft states that your data and university data won't be used to train its AI models, it is still advisable to exercise caution. When using Copilot Chat, be sure you are logged in to your university account and confirm that the "protected" shield icon is visible in the prompt area. Additional information about securely using AI is available on the ISIC webpages: Microsoft Copilot products, Zoom AI Companion features, and AI Security and Compliance.

Request an Applications Assessment for assistance vetting AI prior to acquiring a technology, particularly if the AI is intended for clinical purposes or will use highly confidential data, such as FERPA- or HIPAA-protected data.
