Copilot Chat FAQs
General
Now that I can attach files in Copilot Chat, what are the file limits?
If you use Copilot Chat without signing in to your Microsoft 365 educational account, the file size limit is 1 MB per chat.
If you sign in to Copilot Chat with your Ohio State username and password, the file size limit is 512 MB per chat.
Should I be using AI?
New AI tools are released daily, and university technology teams evaluate these tools to ensure they are accessible, secure and effective. Faculty and staff who plan to enter institutional data classified above S1 into an approved AI tool must ensure they are logged in with their university credentials. Learn more about institutional data classifications as they relate to AI in the Security and Privacy Statement on Artificial Intelligence.
Students should use GenAI tools for coursework only with the explicit permission of each instructor, in the ways allowed by that instructor. For an overview of all available AI tools, visit Ohio State’s AI Tools webpage.
What does Enterprise data protection mean?
For more about the differences and your protections, see Microsoft's documentation on Enterprise data protection in Microsoft 365 Copilot and Microsoft Copilot.
- We secure your data: We help protect your data with encryption at rest and in transit, rigorous physical security controls, and data isolation between tenants.
- Your data is private: Microsoft will not use your data except as you instruct. Our commitments to privacy include support for GDPR, ISO/IEC 27018, and our Data Protection Addendum.
- Your access controls and policies apply to Copilot: Copilot respects your identity model and permissions, inherits your sensitivity labels, applies your retention policies, supports audit of interactions, and follows your administrative settings.
- You're protected against AI security and copyright risks: We help safeguard against AI-focused risks such as harmful content and prompt injections. For content copyright concerns, we provide protected material detection and our Customer Copyright Commitment.
- Your data isn’t used to train foundation models: Microsoft Copilot uses the user’s context to create relevant responses. Microsoft 365 Copilot also uses Microsoft Graph data. Consistent with our other Copilot offers, prompts, responses, and data accessed through Microsoft Graph aren't used to train foundation models.
- Employees using Copilot must sign in to https://copilot.microsoft.com using their work account.
- For updates, see the OTDI blog post Learn to Use New Microsoft Copilot Features | Office of Technology and Digital Innovation.
What is AI Hallucination?
AI "hallucination" is the term used when a generative AI tool, like Copilot, produces information that sounds plausible but is factually incorrect or entirely made up. It is not the same as a human hallucination; the AI is not "seeing" or "imagining" anything in a conscious way. Generative AI models are designed to produce a fluent, complete-sounding answer rather than to verify facts, so Copilot can sometimes confidently present a fabrication as fact. That is why you must take a "use but verify" approach with any AI-generated content: it is up to the user to confirm the accuracy of the results.
Microsoft has provided information to assist in the ethical and transparent use of AI.
Transparency Note for Copilot - This describes the limitations of Copilot Chat
Transparency Note for Microsoft 365 Copilot (Copilot Full License)
Specific Limitations of Microsoft 365 Copilot (Copilot Full License)
What is the difference between Copilot and Microsoft 365 Copilot?
Microsoft Copilot is a digital assistant that uses AI to help you with your work. It is available in two versions: Copilot and Microsoft 365 Copilot.
Copilot (formerly known as Copilot with data protection) is a chatbot that uses public online data to provide you with information. It does not have access to your organization’s data or content within Microsoft 365, such as documents in OneDrive, SharePoint, emails, or other data in Microsoft 365.
Microsoft 365 Copilot (see Microsoft 365 Copilot | Administrative Resource Center) is a paid version of Copilot that uses four technologies to generate its responses: the web, large language models (LLMs), your data in Microsoft 365 services, and full integration into Microsoft 365 applications, such as Word or PowerPoint.
Why can't I see Copilot Chat inside of my Microsoft Applications?
To use Microsoft 365 Copilot features in desktop applications like Word, Excel, PowerPoint, Outlook, and OneNote, users must have Microsoft 365 Apps installed and be on either the Current Channel or the Monthly Enterprise Channel. These update channels ensure access to the latest features, including Copilot Chat integration. If you are a MITS customer, you are on the Current Channel or the Monthly Enterprise Channel. Some non-MITS units have installed Microsoft 365 using device-based activation or a volume license key; in that case, you will not see Copilot within the desktop applications.
In addition, users must be signed into their Microsoft 365 desktop apps using their organizational OSU campus login (lastname.#@osu.edu or lastname.#@buckeyemail.osu.edu). Copilot is supported only on primary mailboxes hosted in Exchange Online, not on shared, delegate, or archive mailboxes. For full functionality, you should have an active OneDrive account, as some Copilot features rely on OneDrive for file access and management.