New policy effective 01/31/2025
CHSU values the independent development of students' foundational science knowledge, critical thinking, clinical skills, and self-reflection. CHSU also recognizes the potential benefits of artificial intelligence (AI) technology in assisting student development and supports the responsible and safe use of AI in the educational program. However, the University is also aware of the inherent risks associated with AI, including, without limitation, bias, limited contextual understanding, plagiarism, ethical issues, inaccurate information, and threats to academic integrity. This policy defines the responsible and ethical use of generative AI tools by CHSU students, identifies CHSU requirements for the use of AI, and provides guidance to students on how to responsibly incorporate AI tools into their academic work.
Students may use AI tools to enhance learning consistent with specific conditions set by CHSU. While AI can aid in generating ideas, researching topics, or supporting study, it should never replace a student’s own understanding, analysis, or critical thinking. AI usage must align with academic integrity standards and be properly cited.
CHSU Requirements and Student Guidelines and Responsibilities for Using AI
Academic Integrity and Personal Responsibility
- Understanding Faculty Expectations for AI Use
Faculty have the authority to define acceptable AI use in the syllabus for specific assignments and courses, especially for assignments that require independent thought and analysis. This includes providing explicit instructions to students on when AI may or may not be used. Students must adhere to these instructions. For assignments that allow AI, students should understand how and when these tools may be incorporated.
- Responsible Use and Citation of AI
If an AI tool is used to assist with an assignment, students must provide appropriate citation following CHSU's citation guidelines. This maintains transparency and academic honesty. Any content produced with AI should be treated as a source requiring acknowledgment.
- Ownership of AI-Generated Content
Students are fully responsible for the accuracy, reliability, and ethical quality of their work product, including any content generated by AI. If AI-produced information is inaccurate, biased, or otherwise inappropriate, the student is accountable for its inclusion in their work. Final submissions should reflect the student’s genuine understanding and critical evaluation of the subject.
Ethical Use of AI in Clinical Settings
- AI Use in Clinical Documentation
Unless expressly allowed at clinical sites, students are prohibited from using AI tools to generate or complete any clinical documentation, including histories and physicals (H&Ps) and other patient records. All clinical notes and records must be directly authored by the student to ensure accuracy and authenticity, as they reflect genuine clinical observations and decisions.
- Patient Privacy and Data Security
Students must not input protected health information (PHI) or any patient-related details into AI tools, in compliance with the Health Insurance Portability and Accountability Act (HIPAA) and other laws and regulations concerning privacy and PHI. Maintaining confidentiality and safeguarding sensitive information is essential in clinical settings.
Copyright and Intellectual Property Compliance
Students must not copy course materials, exam questions, or proprietary content into AI tools. Unless prohibited by the instructor in the course syllabus, students may use AI tools to create flashcards, practice questions, and lecture summaries from lecture PowerPoint materials. These restrictions protect CHSU's intellectual property rights and uphold course confidentiality.
Violations and Consequences
- Academic Misconduct Related to AI Use
Misuse of AI tools, such as obtaining unauthorized assistance or failing to cite AI-generated content, will be treated as a violation of CHSU's Academic Freedom, Intellectual Honesty and Academic Integrity Policy. Sanctions for misuse can include academic penalties and may lead to further disciplinary action for severe or repeated violations.
- Reporting and Resolution of Violations
Suspected violations related to AI misuse should be reported consistent with CHSU’s established Academic Integrity Conduct Process. Students found in violation of this policy may face academic consequences aligned with CHSU’s Code of Conduct.
Definitions
- Generative AI Tool: Any software that produces text, images, or other content in response to user prompts (e.g., ChatGPT, Bard, DALL·E).
- AI Detection Tool: Software used to identify whether a student’s submission contains AI-generated content.