
Artificial Intelligence


Generative Artificial Intelligence (AI) – Guidance on Use and Applicable Policies

Since the emergence of commercially available Large Language Models (LLMs) in 2022, the field of AI has been an ever-changing landscape with significant impacts on higher education. Because of this dynamic environment, it is difficult to encode formal policy around AI: the next generation of developments could fundamentally alter its capabilities. Even so, it is vital that educational institutions respond meaningfully to AI in order to advance educational opportunities, ensure data security, and maintain academic integrity. This page serves as a central repository of resources on AI at Nevada State, including relevant policies, training materials, and best practices surrounding AI.

Relevant Policies

 

  • Academic Standards, Academic Misconduct, & Student Misconduct. Defines misuse of AI in the context of overall academic standards and outlines disciplinary procedures. Relevant policies: SA 6: Academic Standards; SA 5.1: Student Code of Conduct*.
  • Data and Privacy. AI services may or may not be secure. Procedures for the proper handling of sensitive information with AI services are defined in the context of overall data security. Relevant policies: FERPA: Family Educational Rights and Privacy Act; ITS 5.1.1: Data Security; ITS 5.1.2: Information Security Plan.
  • Purchasing of AI Services. AI services almost always have an associated contract that must be reviewed by the Nevada State Contracts Group. Purchasing AI services follows the procedures outlined in the Purchasing Manual found in the Nevada State Dropbox Public Folders. Relevant policy: Purchasing and Procurement Card Manual.
  • AI Attribution & AI Faculty Conduct. Defines proper citation of AI use in research and scholarship and establishes mechanisms for addressing misconduct. Relevant policies: AA 19.1: Research Misconduct Policy; NSHE Code Title 2, Ch. 6: Rules & Disciplinary Procedures for Faculty.
  • AI and Disability. Some AI tools, including speech-to-text and transcription, document readers, and translators, fall under the umbrella of disability accommodations. Relevant policies: DRC 5.1: Disability Accommodations Offered by NSU; IT 1: Information & Communication Technology Accessibility.
  • Incident Reporting. Violations of the above policies or other grievances related to the improper use of AI can be reported on the Nevada State reporting portal. Relevant policies: SA 2.1: Student Complaint Policy; DRC 10: Disability Services Grievance Policy.

* Currently undergoing a major revision.

Developing an AI Policy for Your Classroom

AI tools such as ChatGPT, Gemini, Grok, Grammarly, and other GenAI technologies can be valuable resources for learning when used appropriately. Instructors should carefully consider the role AI will play in their classroom. Instructors are encouraged to develop a clear syllabus statement addressing AI and to discuss AI use with students early in the course.

Developing an AI syllabus statement

Best practices for establishing a strong AI syllabus statement include:

  • Define AI tools clearly
  • Clarify permitted and prohibited uses of AI
  • Establish expectations for AI attribution
  • Discuss academic integrity and clarify consequences

Please reference the template and example AI syllabus statements provided.

Discussing AI tools in class

In addition to going over your course syllabus statement, holding an in-class discussion on AI use during the first week of class can help set the tone for the rest of the course. By establishing expectations up front, students can be more confident that they are using AI appropriately and can understand why proper AI use is important.

Using AI Responsibly

Students

While AI tools can help you be a better learner and speed up tasks, you should take care to acknowledge when and how you use them, as required by your instructor. Misuse of AI is a form of plagiarism and can result in disciplinary action. When AI is used to supplement learning rather than replace it, it can be a very effective tool. Always reference the AI policy in your course syllabus, including any required AI attribution, before using AI on any coursework. Potential uses of AI in the classroom are shown below:

  • Homework Help and Tutoring. Stuck on a homework problem? AI can provide feedback on your work and step-by-step solutions. Tips: always try a problem on your own before asking an AI for help so you develop key skills, and be sure to also use office hours and other instructor-provided resources, which may better align with instructor expectations. Pitfalls: overreliance on AI for homework problems can undermine learning; when exams come around, you may not have committed essential concepts to muscle memory.
  • Note-Taking and Summarization. AI can summarize lectures, readings, and class discussions. Tips: physically writing or typing notes yourself is a learning best practice, as it helps you make connections with course content. AI note-taking can help when this is not possible, but don't let it replace your own notes and summaries. Pitfalls: you should be attending lectures, not the AI. While summaries can save time in the short term, you will not master material if you are not paying attention in class yourself.
  • Idea Generation and Structuring for Papers and Projects. AI helps brainstorm ideas, outline arguments, or organize thoughts and writing. Tips: brainstorm together with AI rather than outsourcing idea generation completely. AI can be very helpful when you don't know where to start, but you should iteratively refine ideas by giving it feedback, as you would with a human collaborator, to arrive at a final design or solution. Pitfalls: AI outputs can often be bland and uncreative, so be critical when using them, especially for creative tasks. AI can be particularly dangerous for novices or amateurs in a field, as they lack a point of reference for what is good and what is bad.
  • Research Assistance. AI finds relevant sources and generates citations. Tips: use AI together with other secondary sources, like Wikipedia references, to identify important sources in a given field. AI can be useful as a first, but not a final, resource for identifying sources. Pitfalls: current GenAI models are known to have weak search capabilities, often providing surface-level sources or even making up sources. Always check all sources provided by an AI. Many research projects will require more niche sources that won't appear in AI search results, including archival documents and research papers.
  • Writing Assistance or Editing. AI helps with grammar, style, and content generation. Tips: AI can be helpful as an on-the-go editor to catch spelling and grammar mistakes or handle rephrasing tasks. Take care when relying on it to generate content, which is best reserved for routine tasks rather than creative ones. It is best to write together with AI tools, modifying outputs as needed rather than blindly copying and pasting. Pitfalls: writing is a skill that can take years to develop, so get in the practice of writing yourself. Overly relying on AI for writing tasks can result in bland and trite writing, and you lose your unique voice.
  • Language Translation. AI translates text and speech for non-native speakers. Tips: AI can make resources available that would otherwise be inaccessible due to language barriers. If you use translated material, check the accuracy of the translation yourself or with a native speaker, and include instructions in the prompt on how literal a translation you want and how to handle transcription errors. Pitfalls: AI can struggle with idiomatic phrases and transcription errors. It has no guarantee of accuracy and may silently drop or invent text; it will not warn you that text is missing or that it is making educated guesses.

Note that you are responsible for the work you turn in. If content generated by an AI tool is inaccurate or doesn't fit the criteria of a given assignment, you are held accountable for those mistakes. A good rule of thumb is to carefully check all work generated by an AI, including verifying solutions, checking facts and sources, and editing outputs. The American Association of Colleges and Universities, in collaboration with Elon University, has developed a Student Guide to Artificial Intelligence. This comprehensive guide can help you effectively and responsibly approach AI on campus.

Faculty

While GenAI tools can do some fairly sophisticated things, they also make fairly basic mistakes that can have catastrophic outcomes. Not only is AI error-prone, its prose can feel lifeless and bland. While AI models have grown in capability over the last few years, it is likely that the LLM-based models most AIs are built on will plateau in performance as training datasets are exhausted. If that's the case, what is AI good for? When used strategically for specific kinds of tasks, AI can save you time on many of the tasks you carry out as an instructor and administrator. Think of AI as a very smart and well-read graduate student: while it can write you an essay or compose a research outline, it needs guidance and correction to produce a workable product. While AI may not be teaching courses or conducting experiments on its own, it can save you time in your own teaching and research.

The best advice for faculty and administrators using AI is to test it out on your own. Benchmark its outputs on tasks of varying difficulty to identify what it can and can't do for you; you can quickly get a sense of its limitations and strengths. Most use cases will involve iterative prompts and modification of outputs rather than directly copying and pasting text from an LLM. The CTLE has developed a set of guidelines for working with GenAI that is regularly updated. Potential use cases for working with AI as a faculty member or administrator are listed below.

  • Brainstorming. AI can help you move past writing blocks. Need a creative title for a paper? Starting a new project but unsure where to begin? Need a set of examples to demonstrate a concept? Trying to hunt down a book or source based on a quote you only sort of remember? GenAI is a great way to get the creative juices flowing by collaboratively brainstorming with an AI partner. Tips: brainstorming with an AI partner is most effective when you already have some expertise in the area and clear constraints or starting points to get the conversation started. Be precise, and ask the AI to generate more options than you need; if you don't like some, try the others or mix and match components from different outputs. If the results aren't on target, give additional context and generate another batch. Pitfalls: when your prompt is too vague, you will get bland and trite results. More often than not, you won't directly use the result of an AI brainstorming session, but the process of iterating with the AI will help you move in the right direction.
  • Peer Review and Editing. AI can provide an additional pair of eyes on manuscripts and other documents. Tips: AI is arguably more useful as an editor than as a writer. While using AI as a peer reviewer lets you maintain responsibility for content and accuracy, GenAI tools can help streamline the editing process. This can go beyond mere grammar, spelling, and rephrasing: with appropriately tweaked prompts, you can have an AI check documents for adherence to a rubric, help design a statistically sound set of experiments, or give feedback from the perspective of an expert in the field or a lay audience. The key is a clear prompt; be as specific as possible about what you are looking for. Pitfalls: as always, be very careful about what you provide to the AI. Check the data sharing policy of the AI service and remove any content from manuscripts that you wouldn't want added to an LLM's training dataset. Consider providing only small chunks of text rather than the entire document.
  • Writing Homework and Quiz Questions. AI can help with specific tasks such as writing multiple choice questions. Tips: AI can help busy instructors by cutting down on the time spent on simple, repetitive tasks in course design, for example, coming up with wrong answers for a multiple choice exam or developing checks for understanding from a reading assignment. Pitfalls: do not delegate all creative responsibility for homework assignments to AI, as this can produce bland and generic results. Limit AI use to relatively contained and repetitive tasks such as multiple choice questions and answers or variations on a theme. Use AI as a brainstorming tool to increase creativity in course design rather than remove it, and have the AI generate multiple options that you can modify or mix and match.
  • Assistance Filling Out Administrative Forms. AI can streamline standardized forms based on your background and needs. Tips: filling out forms is a tedious if necessary part of navigating higher education and bureaucracies in general. Forms you may need to complete include grant applications, IRB applications, budget documents, and nominations and letters of recommendation. Some forms require you to summarize or rephrase material you have written elsewhere; others may cover material you are unfamiliar with, or it may be your first time completing such a form. AI can help with both, but be targeted in what you ask it to do; give it more than "fill out this form for me." The more specific and narrow the task, the more successful the outcome. Pitfalls: you are 100% responsible for all content you use to complete forms or submit paperwork. Read, check, and modify all content to make sure it addresses everything that is asked for. Reading bureaucratic forms can be just as tedious as filling them out, so don't make someone else's life miserable by doing a poor job. Be accountable for all work you submit.
  • Spicing Up Your Slide Deck. Many AI tools, including image generators, can enhance PowerPoints and other presentations; there are even presentation-specific tools such as beautiful.ai. Tips: when used thoughtfully, AI can make your slides more effective at conveying information to your audience, whether students, employees, or administrators. AI is best used to help develop a theme throughout your slide deck; use images to create a cohesive narrative and fill text-heavy slides. Pitfalls: be careful not to overload a presentation with irrelevant or gratuitous images, which can quickly become tiresome or distract from your message. In general, do not use AI-generated diagrams, graphs, or content-heavy images, as they can be inaccurate.

Guidelines for Sharing Information with AI

What sort of information should not be shared with AI tools?
  • Personally identifiable information (PII). Examples: names, addresses, phone numbers, social security numbers, etc. Risk: could lead to identity theft or unauthorized access.
  • Financial information. Examples: bank account information, credit card numbers, tax information. Risk: could lead to financial fraud.
  • Medical and health information. Examples: patient records, diagnoses, prescriptions, medical history. Risk: could lead to non-compliance with HIPAA regulations.
  • Student records and academic data. Examples: grades, transcripts, disciplinary records, admission information. Risk: could lead to non-compliance with FERPA regulations.
  • Sensitive or proprietary information. Examples: proprietary research, trade secrets, classified documents. Risk: AI tools may train on this information and potentially expose it to competitors or unauthorized users.
  • Intellectual property. Examples: creative works such as art or writing. Risk: AI tools may use your work for further training without explicit permission, repurpose it in ways you don't intend, or expose it to others without consent.
What information is safe to share with AI?
  • Anonymized data (data stripped of PII)
  • Information that is already publicly available and is non-sensitive
  • Information that is acceptable to be made public
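As an illustration of what stripping PII can look like in practice, the sketch below (a hypothetical helper, not an official Nevada State tool) uses simple regular expressions to redact a few common PII patterns before text is shared with an AI service. The patterns and the `redact` function are assumptions for demonstration; real anonymization requires human review or a dedicated tool, since regexes alone will miss many forms of PII.

```python
import re

# Illustrative patterns only -- not exhaustive. Names, for example,
# cannot be reliably caught by regular expressions.
PATTERNS = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PHONE": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def redact(text: str) -> str:
    """Replace each matched PII pattern with a [LABEL] placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

print(redact("Contact jane.doe@example.edu or 702-555-0123 about SSN 123-45-6789."))
# -> Contact [EMAIL] or [PHONE] about SSN [SSN].
```

Even with a helper like this, the safest default remains the guidance above: share only data that is already anonymized, public, or acceptable to make public.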
Are there exceptions to sharing sensitive data with AI?

If the AI platform provides controlled access that can guarantee data will not be shared with other users or be utilized for AI training outside of your tenant, it may be ok to share sensitive data with it.

AI Literacy and Training

Promoting effective and ethical use of AI at Nevada State by students, faculty, and staff is essential. Important components of AI literacy include:

  • Understand how common AI models (e.g., LLMs) work.
  • Implement GenAI tools responsibly.
    • Understand appropriate and inappropriate uses of AI.
    • Understand key policies around AI use including access, security, and attribution.
  • Recognize bias, stereotypes and misinformation and how they can have academic and social consequences.
  • Assess various GenAI tools to identify appropriate use cases.

AI Training and Resources at Nevada State

Nevada State has a number of resources for students, faculty, and staff on learning about AI tools. Whether or not you plan to use AI in your role, becoming AI literate is essential for professors, students, and staff alike. AI technology is adapting quickly and is here to stay. Understanding AI tools will help us determine how to use them – or not use them – appropriately, ethically, and responsibly.

Students

  • Library LibGuides.* The Marydean Library has a useful introduction to evaluating resources in their Introduction to Research series. More AI-specific content is forthcoming.
  • FYE 101 Traveling Modules.* The Nevada State first-year seminar course includes a unit on Academic Integrity covering the responsible use of AI. This course content is available in a Canvas shell of traveling modules accessible to students and instructors.

Faculty and Staff

  • CTLE Canvas Shell. The CTLE has ongoing workshops on teaching and learning each semester including topics on AI. Most sessions are recorded and available through the Canvas shell including:
    • Faculty Panel on AI
    • Student Panel on AI
    • AI Comes to Canvas

*Currently under development.

Additional Resources

  • AAC&U Student Guide to AI. An Elon University and AAC&U collaboration that provides extensive AI resources, guidelines, and training tools for students and educators.
  • MIT AI Hub. Explore ways generative AI can augment teaching and learning, from the MIT Sloan School of Management.
  • Artificial Intelligence Teaching Guide. Open-source modules and lessons provided by the Stanford University Teaching Commons.
  • Coursera. Search for free courses on AI provided by a range of organizations and educators.

AI Attribution

Proper citation of AI varies by discipline. Provided below are both the bibliography entry and the in-text citation format for several common styles.

APA Style

Bib: OpenAI. (2023). ChatGPT (Feb 13 version) [Large language model]. https://chat.openai.com

In-Text: (OpenAI, 2023)

Chicago Style

Footnote: Text generated by ChatGPT, March 31, 2023, OpenAI, https://chat.openai.com.

MLA Style

Bib: “Text of prompt” prompt. ChatGPT, Day Month version, OpenAI, Day Month Year, chat.openai.com

In-Text: “Text of prompt”

Find further details at Purdue University’s AI Citation Guide. Instructors can also establish their own expectations on proper citation of AI in the classroom. For example, ChatGPT allows users to generate a shareable link to past conversations which can be submitted with assignments.

Publisher Policies

Many publishers have established their own policies on the use of AI in scholarly publications. Be sure to review publisher guidelines on AI use before submission.

Definitions

Artificial Intelligence (AI): A branch of computer science focused on creating systems that can perform tasks typically requiring human intelligence, such as reasoning, perception, and decision-making.

AI Literacy: Understanding how to use, critically evaluate, communicate about and with, and collaborate effectively with generative AI (GenAI) in a manner that is ethical and socially and academically responsible.

Machine Learning (ML). A subset of AI where algorithms learn from data to make predictions or decisions from patterns in the data without being explicitly programmed for each specific task.  

Neural Networks. Computational models inspired by the structure and function of the human brain, consisting of layers of interconnected nodes (neurons) that process input data to generate predictions. More layers allow neural networks to exhibit more abstract behaviors.
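To make "layers of interconnected nodes" concrete, here is a minimal sketch of a two-layer feed-forward network in plain Python. The weights and layer sizes are invented for illustration; real networks learn their weights from data and use far more neurons.

```python
import math

def sigmoid(x: float) -> float:
    # Squashes any real number into (0, 1); a common neuron nonlinearity.
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weights, biases):
    # One dense layer: each neuron takes a weighted sum of all inputs,
    # adds its bias, and applies the nonlinearity.
    return [
        sigmoid(sum(w * x for w, x in zip(neuron_w, inputs)) + b)
        for neuron_w, b in zip(weights, biases)
    ]

# 2 inputs -> 2 hidden neurons -> 1 output, with hand-picked weights.
hidden = layer([1.0, 0.0], weights=[[2.0, -1.0], [-1.5, 1.0]], biases=[0.0, 0.5])
output = layer(hidden, weights=[[1.0, -1.0]], biases=[0.0])
print(round(output[0], 3))  # a single prediction between 0 and 1
```

Stacking more such layers is what "deep learning" (defined below) refers to: the composition of many simple weighted sums can represent increasingly abstract patterns.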

Deep Learning. A subset of machine learning that uses artificial neural networks, particularly deep neural networks with many layers, to model complex patterns and relationships in data. 

Generative Artificial Intelligence (GenAI). A type of AI that creates new content, such as text, images, music, or videos, based on patterns learned from existing data. Examples include ChatGPT for text generation and DALL-E for images. 

Large Language Model (LLM). A category of deep learning models trained on massive text datasets to understand, generate, and manipulate human language. 

Diffusion Models. A class of generative AI models that create images by iteratively refining noise, mimicking the process of reversing a physical diffusion process. Used in AI generation tools like Stable Diffusion and DALL-E. 

Agentic AI. AI systems that operate autonomously with the ability to plan, make decisions, and take action in the world to achieve goals, often with minimal human oversight. 

Prompt Engineering. The practice of designing and refining prompts to effectively interact with generative AI models like ChatGPT for desired outputs.

AI Hallucinations. Instances where an AI model produces inaccurate, misleading, or false information as if it were factual. Hallucinations are common due to biased or incomplete training data. Examples include citing non-existent sources, providing broken links, or inventing facts or data.