This publication was able to use easily accessible, free artificial intelligence (AI) tools to generate realistic but completely fictional, high-quality Leaving Cert research projects within 30 seconds.

It comes as guidelines released this week around new Leaving Cert projects, set to be worth 40% of a student’s overall marks in a subject, allow for the use of AI.
These projects, called additional assessment components (AACs), are due to be rolled out from next September in stages, beginning with biology, chemistry, physics and business.
The guidelines published this week state that students must reference when AI has been used to generate material, in the same way a book, newspaper, magazine or online source would be referenced.
The document goes on to highlight how plagiarism is a serious offence and includes the use of “material generated using artificial intelligence (AI) software or AI applications”.
“Direct copying of material from any source without proper acknowledgement is not permitted and may incur penalties, up to and including the withholding of related results.”
However, teachers have raised serious concerns, warning that the high percentage of marks available for the AACs, coupled with guidelines that do little to counteract the use of virtually undetectable AI, will encourage students to engage in unethical behaviour.
AI technology is now advanced enough to easily generate realistic data for such projects, according to teacher Humphrey Jones, chair of the Irish Science Teachers’ Association (ISTA).
“Imagine what this technology will be like in two years' time when the first batch of sixth years are doing AACs.”
Using a suggested prompt put together by Mr Jones, this publication asked free online AI tools such as ChatGPT and Perplexity to generate a hypothetical project for a Leaving Cert student, alongside a set of fictional results.

In less than 30 seconds, the tools generated research questions and background knowledge in response to the question, while simultaneously designing a hypothetical experiment.
As prompted, the tools also generated a believable but completely fictional scientific method followed by a student, alongside a realistic timeline for the experiment.
AI was also easily able to discuss the fictional yet realistic results, draw a logical conclusion, and generate a well-rounded reflection on the project.
In business, where students will be asked to develop a research project using SMART objectives as their AAC, AI issued five potential projects when asked.
These suggestions included looking at the impact of social media marketing on a local, small business or at market research on consumer preferences for eco-friendly products.
When further prompted, AI devised a set of fictional reports, including a survey of 100 customers, a breakdown of fictional sales across four fictional businesses, and the impact social media had on those sales.
In each subject, this publication asked the AI tools to repeat the process with new research topics. Each time, as prompted, the tools quickly and clearly produced new sets of research projects.

Mr Jones pointed out that these projects were generated with minimal prompts.
“I could have sat down and refined that prompt several times: can you expand the scientific knowledge there, can you add some data that may be out of line with what's expected, can you rewrite it with some spelling and grammar errors you would expect from a 17-year-old.
"The vast majority of students will be doing genuine projects, of course they will be, and their teachers will be supporting them along the way, but the burden of responsibility is put on to the teacher to authenticate the work, I just don’t think it's possible to do that completely.”
He also raised concerns about what happens if students do reference their use of AI in a project, as the guidelines do not make this clear.
“If they do reference it, does it mean they will lose marks? Does it mean that they’ll be paying more attention to disparity between exam scores and project scores? If you don’t reference it, and you are found to have used AI, will your exam be cancelled?”
Guidelines around the use of AI in the classroom have not been issued in Ireland, Mr Jones added.
“If you compare us to the UK, where the Joint Council for Qualifications issued guidelines for teachers and AI back in February 2023, and updated the document back in April 2024, we’ve had no documentation or guidelines whatsoever on AI.”
The only place where the State Examinations Commission has commented on AI is in the guidelines for non-examined components.
“What you can see in the documents published this week is students being asked to reference their use of AI properly. If you compare that to the UK guidelines, they also say to reference properly, but they also highlight exactly what misuse of AI is; it's defined very, very clearly. They state very clearly that if a student is found to have used AI, then they will not receive marks for that section.
"That’s what’s missing from the [AAC] guidelines; Where can pupils use AI, where is it not appropriate to use, and what happens if they use it inappropriately?”
There is also now a huge burden on teachers to verify the work students do for AACs, some of which will be at home or outside the classroom, Mr Jones pointed out.
"I think that’s unfair because we’re not police. Our job is to teach, not to focus solely on making sure the students work is authentic.”
The goal of the AACs is to allow students to engage with the scientific process, he added.
“To come up with their own research ideas, to carry out background research on the science underpinning their scientific questions, to gather data and analyse that data. The biggest problem I have with AI is that AI can do all that for them.”
Allocating 40% of a student’s mark to the AAC is “disproportionate”, he added.
“It's disproportionate for the hours but also for the skills we want to show. If it's open to abuse, which it clearly is now, the fact that it’s worth 40% puts a real dampener on it.”
No two AIs are equal, he added.
“Not every student has access to AI or digital devices. You can imagine a student who’s maybe in a fee-paying school, whose parents are computer scientists; they’ve got access to AI, maybe much more advanced, paid subscription models.
"They are going to be even better than the free, options we did ourselves. They will produce even more realistic results, better constructed texts. We’re effectively writing off 40%.”
AI doesn’t just relate to essays anymore, according to English teacher Conor Murphy.
Mr Murphy is also chair of the Drama, Film and Theatre Teachers' Association; drama, film and theatre studies is a new subject set to be introduced at Senior Cycle.
“It can do the lighting design for them, it’ll be able to write a script for them, it’ll be able to edit a film for them. It’ll be able to do everything if they really wanted it to.
"Imagine what it’ll be like in 10 years' time?”
Most students will not use AI, he believes, but they face a lot of pressure when it comes to the 'points-race' for college courses.
“As soon as CAO points pressure kicks in, it's different for the kids and it's different for the parents. Without the CAO, AI will still be an issue, but with the CAO, that’s where the real pressure comes from.
“Inevitably, you’ll get a couple of kids [who will resort to using it]. It'll be the easiest thing; they’ll be sitting at home, and they’ll think ‘oh god, I need a story, I need a script’ and they’ll think ‘here, I’ll just throw it into AI’ in all kinds of innocence, and as a last resort.
"That’ll lead to the next thing and suddenly the whole thing is done for them, and they’ll just jiggle it around and it won’t be their work.”
A spokesperson for the Department of Education said AACs are being designed so that the associated teaching and learning takes place across the whole of the two years of a Leaving Certificate course.
"Regular, comprehensive engagement with each student’s work on their AAC will enable teachers to confidently and legitimately authenticate any work being submitted for assessment, and ensure that any instances of plagiarism (including the misuse of AI tools) will be combated.
"AAC guidelines were developed in close cooperation with the State Examinations Commission (SEC) and teacher focus groups, the spokesperson said. It is important to note that this is not the first time that the use of AI has been referred to in regard to Leaving Certificate assessment, they added.
"Since the 2023 examinations, the State Examinations Commission (SEC) includes an instruction in relation to material generated by artificial intelligence (AI) software in its documentation. The requirement for any material generated by AI software to be appropriately referenced is in line with the approach taken currently by the SEC."
Some 29 of 41 Leaving Certificate subjects now include an AAC, the spokesperson said.
"These assessments vary by subject, including oral and aural exams, practical performances, coursework, digital submissions, and project work completed and marked in schools by visiting examiners. The new guidelines explain the steps students follow during the two-year course, with teachers closely monitoring and authenticating their work.
"Including AI generated material without quoting it as the work of AI software will be considered plagiarism, which can result in the forfeit of all marks for the coursework component. Where any material generated by AI software is included in a coursework submission and is properly quoted or referenced, no credit will be awarded for any of that material itself. Credit can only be awarded for the effective use of this material in the support or development of the candidate’s own work."