About AI at Marshall
Marshall University’s Visionary AI Initiative
Marshall University is advancing a coordinated and responsible approach to artificial intelligence through a structured AI governance framework. Established in 2023 with the formation of the Presidential Task Force on AI, this work brings together faculty, staff, students, and leadership to explore how AI can support teaching, learning, research, and university operations.
Building on this foundation, the university released the AI Strategy Plan on a Page in early 2025 and transitioned to a formal governance model consisting of an AI Steering Committee, an AI Enablement Sub‑Committee, and an Academic AI Integration Sub‑Committee. Together, these groups guide the strategic, operational, and academic use of AI, ensuring alignment with institutional priorities, ethical standards, and student success.
This framework positions Marshall to use AI thoughtfully and effectively in support of its mission and future‑ready vision.
University-Approved AI Tools
At Marshall University, the use of Artificial Intelligence (AI) tools must align with University standards for data security, privacy, and academic integrity. If you are uncertain whether an AI tool is safe or compliant, do not share University or personal data.
Faculty & Staff: University IT and Purchasing procedures must be followed before adopting any AI tool for use with university data.
Students: Generative AI offers real benefits but also carries risks. You may not use generative AI in any way that would violate the Student Code of Conduct.
Data Classification
Marshall University’s Information Security Policy uses a data classification system to ensure appropriate handling of information based on its sensitivity and regulatory requirements, including its use with AI tools. The three main categories are:
- Public Data – Information intended for public dissemination, where unauthorized disclosure poses little or no risk. (e.g., press releases)
- Confidential (or Private) Data – Information that, if disclosed, could pose a moderate risk to the University or individuals. (e.g., internal communications)
- Restricted Data – Highly sensitive information where unauthorized disclosure could cause significant harm to the University or individuals. (e.g., Personally Identifiable Information)
| Name | Type | Availability | Cost | Request | Allowed Data Types |
|---|---|---|---|---|---|
| Adobe AI Assistant | PDF-focused AI assistant | Faculty, Staff, Students | Free | N/A | Public, Private, FERPA Data Approved |
| Adobe Firefly | AI image generator | Faculty, Staff, Students | Free | N/A | Public |
| Adobe PDF Spaces | Custom PDF AI agent tool | Faculty, Staff, Students | Free | N/A | Public, Private, FERPA Data Approved |
| Blackboard AI Assistant | LMS AI design assistant | Faculty, Staff | Free | N/A | Public, Private |
| Microsoft 365 Copilot Chat | AI assistant | Faculty, Staff, Students | Free | N/A | Public, Private |
| Microsoft 365 Copilot Pro | Fully integrated AI assistant | Faculty, Staff | $360 annually | Request Here | Public, Private |
| Microsoft Teams Premium License | AI assistant integrated into Teams | Faculty, Staff | $25 annually | Request Here | Public, Private |