Student Guide to Generative AI
Generative Artificial Intelligence (GenAI) technologies present significant opportunities to enhance learning, research, and professional practice. However, these tools also pose academic, ethical, and employability risks if engaged with uncritically or inappropriately.
AI has, of course, been around for some time and we often use it without realising. However, it is becoming an increasingly important part of modern society and has begun to revolutionise the way we work, communicate, and learn.
This guide provides an overview of some of the strengths and limitations of Generative AI, as well as considering ethical, privacy and other factors to help support you in using this technology effectively and in a way that aligns with Queen Mary's regulations and values.
What is Generative AI and how does it work?
Generative AI tools create content based on patterns, generating reasonably accurate, plausible-sounding (or looking) responses to prompts on a wide range of topics. However, GenAI does not ‘think’ or create original ideas.
Limitations of Generative AI
- Generative AI tools are unable to access information specific to your particular course or module at Queen Mary, nor do they have access to journals and resources that are available to you via Queen Mary Library Services, so they will only make use of openly available information.
- Generative AI can produce ‘hallucinations’: fabricated data or information that sounds convincing but is not based on reality or evidence. It can also produce fake references or citations.
Using AI critically
When engaging with AI:
- Interrogate outputs – Ask: How reliable is this response? What is it based on?
- Corroborate – Validate claims against scholarly and other reputable sources (which you can reference).
- Contextualise – Ensure the prompts you use are adapted to your disciplinary, cultural, and methodological contexts.
- Identify gaps – Which perspectives are missing, and why might they be absent? How does this align or conflict with established literature in your field?
- Incorporate – Your OWN interpretation, analysis, synthesis, and evaluation must remain at the core of your work. Generative AI can complement, but is not a substitute for, academic thought.
The attributes gained by studying at university include your ability to critically interrogate information, generate original insights, and act with integrity. These are qualities that will help you to succeed in a world where AI is increasingly pervasive.
Academic integrity
Any use of Generative AI must align with Queen Mary’s Academic Integrity and Misconduct Policy. AI use may be acceptable in some scenarios, but not all. For more detail, see the FAQs below.
Ethical issues
- There is concern about the environmental impacts of AI, particularly with regard to waste, energy, and water usage.
- Generative AI systems are informed by existing texts or other datasets and they, like other algorithm-based tools such as internet search engines, reproduce the biases contained within the source material.
- There are important considerations around data privacy. What are AI tools doing with the data you input? Are they using this to train their models? Are you uploading someone else’s work (e.g. lecture notes or published articles) without permission?
- Issues around AI-generated content go beyond text. The growth in ‘deep fake’ images and videos requires you to develop your information literacy skills to evaluate the authenticity of all sources.
Linking AI use to Queen Mary graduate attributes
The Queen Mary Graduate Attributes have been created to equip QM students to become active global citizens. Knowing how to use AI appropriately is now an integral part of these attributes. Some examples of where AI knowledge is useful include:
Applying it to AI use: Understanding AI’s capabilities, limitations, and ethical implications.
An example: Identifying potential bias in an AI-generated literature summary and adjusting the scope of your research accordingly.
Applying it to AI use: Making honest and informed decisions about whether GenAI supports, rather than replaces, your own work.
An example: Only using AI when it adds genuine value and helps with your learning or development.
Applying it to AI use: Using GenAI in line with QM guidelines, thereby contributing to the shared culture of academic integrity, which is an integral value of your degree.
An example: Acknowledging and referencing any GenAI assistance in your assessment after checking whether or how AI can be used in that particular assignment.
Applying it to AI use: Evaluating and working critically with Generative AI outputs.
An example: Critically evaluating output from GenAI tools and ensuring it is backed up with reliable sources and your own interpretation.
Frequently asked questions
So, can you use Generative AI in your assignments at university? See the FAQs below for more information:
Some schools have their own specific guidance, so the first thing to do is check with your school or institute.
Some scenarios when it is acceptable to use AI tools to help with learning or preparation include:
- exploring general ideas on a topic
- helping you to understand a challenging concept or reading more clearly by generating a plain English explanation or providing examples
- identifying search terms and keywords for your search strategy using the Library databases
- using AI to revise by preparing summaries of your own notes
- producing practice questions or revision schedules to prepare for exams
Think of AI as a critical friend, to discuss, advise, question, prepare, or work through problems. Just as you wouldn’t use your friend as a source in your essay, or get them to do your work for you, the same is true of AI.
Some students are already being instructed to use AI for specific assignments, with tasks such as creating an AI-generated report and then critiquing it. Being asked to do this does not, however, mean you can use AI throughout your course.
You need to take a critical approach to AI output. Is it accurate? Does it tell the full story? What are the gaps, inconsistencies, or irrelevant pieces of information? What happens if you ask the question again? Look at similarities, differences and patterns in the outputs and think about what they could tell you.
No. Use multiple sources. Don’t just rely on whatever is generated by an AI tool. AI tools can’t always access the most current information and can only use openly available materials, meaning that to get information from many journals and books you still need to go via the Queen Mary University Library. To learn more about finding information, see the ‘How to research’ section on the Academic Skills Centre on QMplus.
Be very specific with your prompts. Experiment with different ways of asking AI for information, including asking follow up questions.
What personal information are you sharing with the company behind the tool? Will the company own or be able to share anything you submit? Does the tool use any data you input to train their models? What biases are demonstrated by the tool? Are there any ethical concerns about the tool that you’re thinking about using? If someone is making the tool available apparently for free, what’s in it for them?
Yes. It is important to explain where you got your information from. Debates are still ongoing about how to cite generative AI tools, but we have put together a Queen Mary guide on how to reference AI within our Referencing Hub, including information on how to do so using different referencing systems.
AI has already been here for a long time, with features such as predictive text, and more tools are incorporating generative AI so seamlessly that you may not realise you’re using it. This brings us back to the key question – always ask yourself, honestly, ‘Am I sure this is my own work I’m submitting?’
AI has long been incorporated into tools such as text to speech software. Students with disabilities should be assured that any assistive software and other technology that they utilise to access teaching and learning is permissible, even though it may use artificial intelligence.
New students at Queen Mary are enrolled on a QMplus resource on AI for student learning and research, which aims to introduce you to AI, what it can do, and how you can use it constructively and ethically to support your learning and research. Returning students can self-enrol if they wish to take the course.
Postgraduate researchers (PGRs)
For research students, there is specific guidance provided by the Doctoral College: AI Guidance for PGRs
This guide has been created in partnership with Queen Mary Students’ Union and colleagues from across Queen Mary, including the Centre for Excellence in AI in Education. It is a companion to the Queen Mary staff guide to the use of Generative AI.
Updated 12/9/25