FAQ - ChatGPT and other AI Tools
Last Updated: Feb 03, 2025, 10:31 AM
Frequently Asked Questions
How Do ChatGPT, Claude 2, Bing Chat, and Other LLMs Work?
ChatGPT, Claude 2, Bing Chat, and other LLMs (large language models) operate by using neural network architectures trained on vast amounts of text data. They predict the next word in a sequence based on context, and repeating that prediction step over and over allows them to generate coherent sentences and paragraphs. From their training data, these models learn patterns of grammar, context, and semantics, enabling them to respond to user prompts with relevant, human-like text. Producing quality outputs still requires knowledge and skill in prompt engineering.
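For readers who want a more concrete picture of "predict the next word, then repeat," the sketch below is a minimal, purely illustrative Python example. It is not how ChatGPT or any other LLM is actually implemented (real models use neural networks that condition on the full conversation and are trained on enormous datasets); the tiny hand-written word table and the function names here are invented solely to show the iterative next-word loop.

```python
import random
from typing import Optional

# A toy, hand-built "model": for each context word, the possible next words
# and their relative weights. Real LLMs learn these relationships from
# billions of training examples; this table is purely illustrative.
next_word_weights = {
    "the":     {"student": 3, "exam": 2, "essay": 1},
    "student": {"writes": 2, "submits": 1},
    "writes":  {"the": 2, "an": 1},
    "submits": {"the": 1},
    "an":      {"essay": 1},
    "essay":   {"today": 1},
    "exam":    {"today": 1},
    "today":   {},
}

def predict_next(word: str) -> Optional[str]:
    """Sample the next word in proportion to its weight, standing in for the
    probability distribution an LLM produces at each step."""
    choices = next_word_weights.get(word, {})
    if not choices:
        return None  # no continuation available: stop generating
    words = list(choices)
    weights = list(choices.values())
    return random.choices(words, weights=weights, k=1)[0]

def generate(prompt: str, max_words: int = 10) -> str:
    """Repeat next-word prediction, feeding each new word back in as context."""
    words = prompt.split()
    for _ in range(max_words):
        nxt = predict_next(words[-1])
        if nxt is None:
            break
        words.append(nxt)
    return " ".join(words)

print(generate("the"))  # e.g. "the student writes the essay today"
```

Running generate() several times can produce different sentences because each next word is sampled from a weighted distribution, which is also one reason LLM output varies from one attempt to the next.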
Are SIU Students Using AI Tools like ChatGPT?
According to a survey conducted by BestColleges, 43% of college students have had experience using AI tools like ChatGPT, and half of those acknowledge turning to those tools to work on assignments or exams. However, over half of college students (51%) believe using artificial intelligence (AI) tools like ChatGPT to complete assignments and exams is cheating. The survey also found that 48% of students agreed that “it is possible to use AI in an ethical way to help complete my assignments and exams,” more than twice the percentage (21%) who disagreed. (Source: Nietzel, 2023, Forbes.com)
Keep in mind that students will likely be expected to use LLMs in the workplace. Consider your learning objectives and outcomes first, then create policies and assignments that align with them. In some cases, integrating AI into your pedagogy—rather than resisting it—may improve learning outcomes.
Are LLMs the only AI tools being used by SIU Students?
No. Many students—and even their instructors—are using tools like DALL-E, a generative AI model developed by OpenAI that can create realistic images and art from a natural-language description, and Adobe AI to generate images. Film students may be using the AI tools in DaVinci Resolve. Architecture students could be using Midjourney, AutoCAD, or Revit (if they have an internship that sponsors it). Science students may be using Microsoft Copilot or ChatGPT to help analyze quantitative data. The list of potential uses of AI is long.
How can I be Policy Proactive with Students?
Students are accustomed to seeing rules and policies in course syllabi. We recommend putting these directly in D2L on the course’s homepage, welcome widget, or module 1 content area. As of publication, SIU does not have an official policy on the use of AI. However, the rules governing academic dishonesty would apply to uses of AI if you have established a policy. See Academic Misconduct | Student Rights and Responsibilities | SIU for more information. You can also find helpful resources and guides on the CTE’s website: Artificial Intelligence | Center For Teaching Excellence | SIU.
Can Turnitin Detect AI?
SIU has a subscription to Turnitin, which is available in D2L assignments (see the image below); our current subscription does not include AI detection. Opinions about the efficacy of Turnitin and other AI detectors are mixed. They may detect AI-generated text, but they are not always reliable: these detectors can produce false positives, and text polished with tools like Grammarly is often misread as AI-generated. Additionally, there is no database of AI-generated content to compare submissions against, and there are documented cases of human-written text flagged as AI and vice versa. Relying solely on these tools is problematic. Instead, instructors should trust their instincts, considering the tone and content of submissions and comparing them to previous writing samples from the student. Remaining open-minded is essential, however, as some students may naturally write in a style resembling AI output.
[Image: Manage Turnitin in D2L]
What About Online Exams?
Thanks to the support of CTE, Extended Campus, and OIT, we have a top-level subscription to Respondus Monitor, which now has AI detection built in. See the LockDown Browser and Respondus Monitor Instructor Guide – OIT Knowledge Base.
How Can I Detect or Deter AI Use without Technology?
Detecting the use of tools like ChatGPT in academic environments can be challenging because the generated text is often coherent and human-like. However, the following strategies can help.
- Specificity and Context: ChatGPT and other LLMs often produce generic answers when given vague prompts. If you find answers that are overly verbose or don’t precisely address the specific context of the question, it might be a clue.
- Inconsistencies: While ChatGPT can generate coherent text, the context might shift over longer answers. Look for inconsistencies or shifts in a topic.
- Unique Assignments: Design assignments or questions that are unique and can’t be easily answered by a generic response. This could involve linking to specific class discussions, course materials, or recent events.
- Oral Examinations: If feasible, consider oral examinations or discussions. It’s much harder to use external tools in real-time spoken scenarios.
- Socratic Discussion: Stimulate critical thinking through discussion and questions instead of one-way information delivery.
- Meta-knowledge Questions: Ask questions about class-specific discussions, experiences, or unique materials. AI won’t have access to in-class experiences.
- Pattern Detection: Over time, you might notice certain phrasings or styles that are characteristic of AI-generated content. Keeping an eye out for these patterns can be helpful. [Are you detecting them here? :-)]
- Educate About Ethics: Instead of just setting rules, educate students about the importance of academic honesty, the value of genuine learning, and the potential consequences of dishonesty.
- Open Conversations: Allow students to discuss and share their methods of research and study. Sometimes, students might reveal or discuss amongst peers the tools they use, including AI.
Where Can I Go for Questions or More Assistance?
The CTE is an excellent source of information. Connect with one of our Instructional Designers or attend one of our many workshops on the topic: Training and Workshops | Center For Teaching Excellence | SIU
However, local approaches to AI policy may be more helpful. We recommend that you do the following.
- Foremost >> Feedback from Students: Oftentimes, students themselves can provide insights into popular tools and the reasons they may or may not turn to them.
- Consult with Colleagues: Your fellow faculty members may have faced similar challenges and can offer insights or solutions they’ve found effective.
- College-level Meetings: Raise the issue in directors’ meetings for a broader perspective and to understand departmental stances.
- Departmental Discussions: Open a dialogue during department meetings to discuss collective strategies or shared experiences.
- CTE Workshops: The Center for Teaching Excellence (CTE) at SIU offers various workshops. Check their training calendar at CTE Training Calendar to find relevant sessions or request a custom workshop on this topic.
I LOVE AI! How Can I Be More Reflexive About These Tools?
SIU is a place for everyone. Please be thoughtful about your use of AI tools, like ChatGPT, from an Accessibility, Diversity, Equity, and Inclusion (ADEI) perspective as well as a socio-political perspective. Here are just a few things to consider.
- Assistive Technologies: Ensure that any tech solutions proposed are compatible with assistive technologies, like screen readers or speech recognition tools.
- Cultural Bias: AI models can reproduce cultural biases present in their training data. It’s essential to educate students about these biases and encourage critical consumption of AI output.
- Representation: Ensure that the use of AI tools doesn’t diminish the representation of diverse voices and perspectives in academic submissions.
- Equal Opportunity: If AI tools are allowed or integrated into the curriculum, ensure every student has an equal opportunity to access and use them.
- Training: Offer training sessions for students unfamiliar with these tools so that they’re not at a disadvantage.
- Inclusive Dialogue: Before making decisions about using AI in coursework, consider getting feedback from diverse students to understand different perspectives.
- Language Barriers: For many (international) students, English isn’t their first language. AI tools might inadvertently become crutches if students feel the tools capture their intended meaning better than they can themselves. Students may also use these tools to correct writing style, grammar, and punctuation, which could be considered an equity issue. They may also struggle to prompt the AI effectively if they cannot fully express themselves in English.
- Privacy Concerns: Using some tools might raise concerns about data privacy and surveillance, especially if students are required to sign up for a freemium service.
- Economic Implications: The “pay-to-use” model of some advanced AI tools might economically disadvantage some students.
- Critical Thinking: Ensure that reliance on AI doesn’t stifle critical thinking, especially when discussing socio-political topics. Encourage students to challenge AI-generated content and think deeply about the sources of information.
- Originality and Authenticity: Emphasize the importance of original thought and caution against over-reliance on AI, which might dilute individual perspectives and voices. Remind students that AI is built on internet and digital data; it represents only a small fraction of the human knowledge in existence.
- Fairness: In any rules or guidelines set, ensure fairness so that no specific group is disadvantaged.
Final Thoughts
LLMs and other AI models are trained largely on internet data. Not all disciplinary knowledge, or knowledge of the world, exists in those datasets. Personal experience, cultural experience, observation, oral histories, books that have not been digitized, and so on are excellent sources of knowledge as well. When we use AI tools, we are drawing on only a small part of the expertise available to humans.