Middle East Research Journal of Humanities and Social Sciences | Volume: 5 | Issue-05 | Pages: 159-166
Assessing Undergraduates' Perceptions of Use of AI Generative Tools for Academic Activities in Adeyemi Federal University of Education, Ondo, Ondo State, Nigeria
Akinbobola, Akinyemi Olufunminiyi, Omojuwa, Ojuri Sunday, Olorunfemi, Adekemi Anthonia
Published: Oct. 3, 2025
DOI: https://doi.org/10.36348/merjhss.2025.v05i05.003
Abstract
The increasing integration of Artificial Intelligence (AI) generative tools in higher education has transformed academic activities, presenting both opportunities and challenges. This study assessed undergraduates’ perceptions of the use of AI generative tools for academic activities at Adeyemi Federal University of Education, Ondo, Nigeria. Five research questions guided the study, focusing on common AI generative tools, appropriate and inappropriate uses, benefits, and challenges. A survey research design was adopted, and 250 students were selected from five faculties using the simple random sampling technique. Data were collected using a researchers’ self-constructed questionnaire with an internal consistency of 0.79 (Cronbach’s alpha) and analysed using descriptive statistics of frequency count, percentage and mean. Findings revealed that students widely use AI tools, with Quillbot, Grammarly, Meta AI and ChatGPT being the most common. Students acknowledged appropriate uses such as drafting, editing, and research assistance, while also demonstrating high acceptance of inappropriate uses, including examination malpractice and data falsification. AI was perceived as beneficial in enhancing collaboration, accessibility, and writing proficiency, but posed challenges related to overreliance, biases, and ethical concerns. The study recommended, among others, the implementation of institutional policies, AI literacy programmes, faculty guidance, and stricter measures against AI misuse to ensure responsible AI adoption in academic settings.

INTRODUCTION

Artificial Intelligence (AI) generative tools have become increasingly popular in various fields, including academia. These tools are capable of producing content that mimics human creativity, such as writing essays and generating artwork. Generative artificial intelligence (Gen-AI) has emerged as a transformative tool in research and education; however, perceptions of its use are mixed (Wahab, 2024). The emergence of Gen-AI has significantly transformed various sectors, including higher education. Across global academic institutions, the integration of Gen-AI tools is rapidly increasing, offering new opportunities for content generation, personalised learning experiences, and knowledge management. However, this growing adoption has also sparked debates concerning its appropriate and inappropriate uses within academic settings (Rudolph et al., 2023). The implications of Gen-AI for educational integrity, authorship, and the role of educators remain critical areas of discussion among scholars, policymakers, and educators (Ullah et al., 2024).

 

In the Nigerian higher education context, institutions such as Adeyemi Federal University of Education, Ondo, are gradually encountering the impact of Gen-AI technologies. While Gen-AI presents promising opportunities for enhancing teaching and learning, concerns have emerged about academic dishonesty, over-reliance on AI-generated content, ethical considerations, and the traditional roles of educators. The integration of Gen-AI into higher education settings has ignited debates surrounding issues such as authorship and the impact on academic standards. The effectiveness of Gen-AI in supporting students' academic activities depends largely on their awareness and perceptions of its appropriate and inappropriate uses (Chan, 2023). Despite the global discourse on Gen-AI in education, there remains a paucity of research on how Nigerian undergraduates perceive its use in academic activities, particularly in teacher education institutions such as Adeyemi Federal University of Education, Ondo.

 

Previous studies have explored the application of AI in education, with significant attention given to institutions in technologically advanced regions such as the United Kingdom, Hong Kong, and India (Chan & Hu, 2023). These studies have highlighted key themes, including the potential of Gen-AI to revolutionise learning processes, concerns about its impact on critical thinking skills, and institutional responses to its integration. However, there is a limited understanding of how students in Nigerian universities, who operate within unique socio-cultural and technological constraints, perceive Gen-AI and its application in their academic work.

 

Given the rapid advancement of AI technologies and their increasing presence in higher education, there is a need for empirical research to assess students' perceptions of the appropriate and inappropriate uses of Gen-AI tools in Nigerian universities. Understanding these perceptions will help educators and policymakers develop guidelines that ensure responsible AI use while maximising its benefits in educational settings. Academic integrity is defined as honesty in academic work and taking responsibility for one’s actions (East & Donnelly, 2012). The digital age has complicated this concept, requiring a reassessment of what constitutes plagiarism and misconduct (Evering & Moorman, 2012). Studies by Williams (2007) and Howard and Davies (2009) highlight how digital tools increase the risk of misrepresenting ideas. The rise of essay mills, peer-to-peer sharing, and online paraphrasing tools has expanded these concerns (Awdry, 2020); Awdry terms this "assignment outsourcing": students obtaining work from external sources. These services challenge universities, as traditional plagiarism detection methods often fail (Rogerson & McCarthy, 2017).

 

Furthermore, Generation Z students, accustomed to technology, have shorter attention spans and prefer quick access to key points (Poláková & Klímová, 2019; Szymkowiak et al., 2021). Their impatience, coupled with an environment where plagiarism is increasingly normalised (Flom et al., 2021; Brimble, 2016), fosters academic dishonesty. The emergence of AI tools like ChatGPT, downloaded by over a million users in its first week (Stokel-Walker, 2022), has further heightened concerns. Cotton et al. (2023) recognise its risks but also highlight potential benefits, such as aiding collaboration and assessment design. Similarly, Javaid et al. (2023) suggest AI can serve as a virtual teaching assistant, while Yu (2023) advocates for AI literacy over outright bans.

 

Universities struggle to integrate generative AI into curricula and integrity policies. Research suggests policies should be student-centred, avoiding legal jargon and involving students in their development (Sefcik et al., 2019; Pitt et al., 2020). However, studies on student perspectives regarding AI and academic integrity remain scarce, despite the pressing need for adaptable policies that evolve alongside emerging technologies.

 

The introduction of AI generative tools such as ChatGPT, DALL-E, and GitHub Copilot has transformed the way academic tasks are tackled. These tools have great potential benefits, but they also raise worries about possible misuse. A balanced awareness of their proper and inappropriate applications is critical for promoting academic integrity and effective learning. AI tools can help undergraduate students understand complex topics, summarise material, and generate project ideas. For example, they can reformat notes, provide paraphrased explanations, and help with language translation in multilingual research. When used effectively, these capabilities boost productivity and creativity. This study seeks to fill this gap by investigating undergraduates' perceptions of Gen-AI at Adeyemi Federal University of Education, Ondo, with the aim of informing institutional policies and instructional strategies for the effective and ethical use of AI in academic activities.

 

Research Questions

To guide this study, the following research questions were formulated:

  1. What are the common AI generative tools used for academic activities?
  2. What are the appropriate ways of using AI generative tools for academic activities?
  3. What are the inappropriate ways of using AI generative tools for academic activities?
  4. What are the benefits of using AI generative tools in academic activities?
  5. What are the challenges of using AI generative tools for academic activities?

 

METHODOLOGY

The study adopted the survey research design. The population for the study consisted of all full-time undergraduates presently studying at Adeyemi Federal University of Education, Ondo. The simple random sampling technique was used to select two hundred and fifty (250) students from the five (5) faculties in the University, namely: Faculty of Education, Faculty of Arts, Faculty of Management and Social Sciences, Faculty of Vocational and Technical Education, and Faculty of Science. Fifty students from each faculty were randomly selected for the study. The research instrument used for data collection was a researchers’ self-constructed questionnaire with six sections. Section A collected the demographic data of the respondents, such as gender, age, department and level. Section B sought respondents’ opinions on the commonly used AI generative tools for academic activities. Section C measured respondents’ perceptions of the appropriate ways of using AI generative tools for academic activities, while Section D sought respondents’ perceptions of the inappropriate ways of using them. Sections E and F sought respondents’ opinions on the benefits of using AI generative tools in academic activities and the challenges of using them, respectively. Each of Sections B to F contained 5 items rated on a four-point Likert-type scale of Strongly Agree, Agree, Disagree, and Strongly Disagree.
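The drawing procedure described above (50 students selected at random from each of the five faculties, for a total of 250) can be sketched as follows. The faculty rosters and their sizes here are entirely hypothetical, invented only to illustrate the selection step; the real study drew from actual student lists.

```python
import random

random.seed(1)  # fixed seed so the illustration is reproducible

# Hypothetical rosters of student IDs per faculty (sizes are invented).
faculties = {
    "Education": [f"EDU{i:03d}" for i in range(1, 601)],
    "Arts": [f"ART{i:03d}" for i in range(1, 401)],
    "Management and Social Sciences": [f"MSS{i:03d}" for i in range(1, 501)],
    "Vocational and Technical Education": [f"VTE{i:03d}" for i in range(1, 301)],
    "Science": [f"SCI{i:03d}" for i in range(1, 701)],
}

# Simple random sample of 50 students, without replacement, per faculty.
selected = {name: random.sample(roster, 50) for name, roster in faculties.items()}

total = sum(len(group) for group in selected.values())
print(total)  # 250
```

Sampling a fixed quota from each faculty in this way is, strictly speaking, stratified random sampling with the faculties as strata, which matches the "50 students in each faculty" description.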

 

The instrument was validated for face and content validity by experts in the fields of test and measurement and educational technology. A trial test was also carried out on the instrument outside the study’s location to determine its reliability. The trial data were analysed using Cronbach’s alpha, yielding a reliability coefficient of 0.79, which was considered acceptable for the study. Data collected were analysed using descriptive statistics of frequency count, percentage, mean and standard deviation.
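The Cronbach's alpha reliability check named above can be sketched minimally. The formula is alpha = k/(k-1) * (1 - sum of item variances / variance of total scores); the response matrix below is invented for illustration and is not the study's trial data.

```python
from statistics import pvariance

def cronbach_alpha(responses):
    """Cronbach's alpha for a respondents-by-items score matrix.

    responses: one row per respondent, each row a list of item scores
    (e.g. 4 = Strongly Agree ... 1 = Strongly Disagree).
    """
    k = len(responses[0])                     # number of items
    items = list(zip(*responses))             # transpose: one tuple per item
    item_vars = sum(pvariance(item) for item in items)
    totals = [sum(row) for row in responses]  # each respondent's total score
    return k / (k - 1) * (1 - item_vars / pvariance(totals))

# Invented 5-respondent, 5-item example (NOT the study's data):
sample = [
    [4, 4, 3, 4, 3],
    [3, 3, 3, 2, 3],
    [4, 3, 4, 4, 4],
    [2, 2, 1, 2, 2],
    [3, 4, 3, 3, 3],
]
print(round(cronbach_alpha(sample), 2))  # 0.92
```

Values around 0.7 and above are conventionally taken as acceptable internal consistency, which is why the reported 0.79 was judged adequate.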

 

RESULTS

Research Question One: What are the common AI generative tools used for academic activities?

 

 

Table 1: Common AI generative tools used for academic activities

S/N | ITEMS | SA (%) | A (%) | D (%) | SD (%) | Mean | Remarks
1 | I make use of Quillbot AI for summarizing my project work | 90 (36) | 108 (43.2) | 34 (13.6) | 18 (7.2) | 3.15 | Accepted
2 | I make use of Grammarly AI to check plagiarism and grammar accuracy for academic work | 92 (36.8) | 98 (39.2) | 40 (16) | 20 (8) | 3.07 | Accepted
3 | I make use of Google Bard AI writing tools for research on any complex assignments | 104 (41.6) | 74 (29.6) | 48 (19.2) | 24 (9.6) | 2.91 | Accepted
4 | I make use of ChatGPT to generate ideas on my academic work | 70 (28) | 96 (38.4) | 46 (18.4) | 38 (15.2) | 2.90 | Accepted
5 | I make use of Meta AI for academic work | 86 (34.4) | 102 (40.8) | 38 (15.2) | 24 (9.6) | 3.06 | Accepted
Average Mean: 3.02

Table 1 above shows the respondents’ responses on the common AI generative tools used for academic activities. The average mean score of 3.02 suggests that students generally accept the use of AI generative tools in their academic activities. The most frequently used tool appears to be Quillbot AI (mean = 3.15), likely due to its effectiveness in summarisation. Grammarly AI (mean = 3.07) and Meta AI (mean = 3.06) are also highly utilised, indicating that students prioritise plagiarism detection and grammatical accuracy. Although Google Bard AI and ChatGPT received slightly lower mean scores (2.91 and 2.90, respectively), they are still widely used for research and idea generation. The findings indicate that AI generative tools play a significant role in students' academic activities, particularly in summarisation, plagiarism checking, and writing assistance. Their acceptance suggests that AI tools are becoming essential in academic work, highlighting the need for institutions to provide guidance on their appropriate and ethical use.
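The item means and "Accepted" remarks reported in Tables 1 to 5 follow the usual four-point Likert computation. A minimal sketch, assuming the conventional weights SA = 4 down to SD = 1 and the 2.5 scale midpoint as the acceptance cut-off (the paper does not state these explicitly, though means near 3.0 are consistent with them); the response counts below are illustrative, not taken from any table.

```python
def likert_mean(sa, a, d, sd):
    """Weighted mean of a four-point Likert item from its response counts.

    Assumed weights (not stated in the paper): SA=4, A=3, D=2, SD=1.
    """
    n = sa + a + d + sd
    return (4 * sa + 3 * a + 2 * d + 1 * sd) / n

def remark(mean, midpoint=2.5):
    # Items whose mean reaches the scale midpoint are treated as "Accepted".
    return "Accepted" if mean >= midpoint else "Rejected"

# Illustrative counts for n = 250 respondents (not an actual table row):
m = likert_mean(100, 100, 30, 20)
print(round(m, 2), remark(m))  # 3.12 Accepted
```

The percentages in each cell are simply the count divided by 250 and multiplied by 100, e.g. 90 of 250 respondents is 36%.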

 

Research Question Two: What are the appropriate ways of using AI generative tools for academic activities?

 

 

Table 2: The appropriate ways of using AI generative tools for academic activities

S/N | ITEMS | SA (%) | A (%) | D (%) | SD (%) | Mean | Remarks
6 | Students should use AI generative tools to assist with research paper writing, such as suggesting outlines and citations | 86 (34.4) | 106 (42.4) | 38 (15.2) | 20 (8) | 3.11 | Accepted
7 | Students should use AI generative tools to help with drafting and editing their written work, such as suggesting alternative phrases and sentences | 98 (39.2) | 108 (43.2) | 24 (9.6) | 20 (8) | 3.18 | Accepted
8 | Students should use AI generative tools to create their own study materials, such as flashcards and concept maps | 82 (32.8) | 92 (36.8) | 47 (18.8) | 29 (11.6) | 2.95 | Accepted
9 | Students should use AI generative tools to collaborate with peers and receive feedback on their work, such as using AI-powered peer review tools | 110 (44) | 88 (35.2) | 24 (9.6) | 28 (11.2) | 3.03 | Accepted
10 | Students should use AI generative tools to support their accessibility and accommodation needs, such as using AI-powered text-to-speech tools | 99 (39.6) | 86 (34.4) | 41 (16.4) | 24 (9.6) | 2.99 | Accepted
Average Mean: 3.05

Table 2 presents students’ perceptions of the appropriate ways to use AI generative tools for academic activities. The average mean score of 3.05 shows that students generally accept AI generative tools for academic purposes. Table 2 indicates that AI generative tools are widely accepted for various academic activities. The highest level of approval was for using AI generative tools for drafting and editing, with a mean score of 3.18, as most students agreed that AI could enhance writing by suggesting alternative phrases and sentences. Similarly, AI’s role in research paper writing, particularly for outlining and citation support, was also well received, with a mean score of 3.11. AI-powered peer collaboration and feedback tools were also accepted, with a 3.03 mean score, showing students’ willingness to integrate AI into group work and review processes. The use of AI generative tools for study material creation, such as flashcards and concept maps, had a slightly lower mean of 2.95, suggesting some hesitation but still an overall positive reception. AI tools for accessibility and accommodation, like text-to-speech applications, were also considered beneficial, with a 2.99 mean score. Despite some resistance in each category, all proposed AI applications had mean scores above 2.5, confirming their general acceptance for academic use. Students also recognise AI as a useful tool for academic tasks, particularly in writing assistance, editing, collaboration, and accessibility. However, institutions should establish clear guidelines to promote responsible AI use while ensuring ethical academic practices.

 

Research Question Three: What are the inappropriate ways of using AI generative tools for academic activities?

 

 

Table 3: The inappropriate ways of using AI generative tools for academic activities

S/N | ITEMS | SA (%) | A (%) | D (%) | SD (%) | Mean | Remarks
11 | Submitting AI-generated work as original student work without proper citation or disclosure is acceptable | 124 (49.6) | 92 (36.8) | 18 (7.2) | 16 (6.4) | 3.17 | Accepted
12 | Using AI generative tools to falsify or manipulate research data is a legitimate practice | 70 (28) | 118 (47.2) | 50 (20) | 12 (4.8) | 3.18 | Accepted
13 | Relying solely on AI generative tools to complete academic assignments without any human effort or oversight is acceptable | 96 (38.4) | 80 (32) | 54 (21.6) | 20 (8) | 2.94 | Accepted
14 | Using AI generative tools to cheat during examinations, quizzes, or other assessments is an inappropriate use of these tools | 86 (34.4) | 126 (50.4) | 18 (7.2) | 20 (8) | 3.27 | Accepted
15 | Ignoring the limitations and potential biases of AI generative tools, and presenting their output as fact without critical evaluation, is an inappropriate use of these tools | 88 (35.2) | 102 (40.8) | 40 (16) | 20 (8) | 3.09 | Accepted
Average Mean: 3.13

Table 3 above presents students' perceptions of improper AI usage in academic activities. The findings indicate that students generally accept several inappropriate uses of AI generative tools in academic activities, with an average mean of 3.13. The highest approval was for using AI to cheat during examinations and assessments (3.27), despite its clear ethical implications. Similarly, a high mean score of 3.18 suggests that students see falsifying or manipulating research data using AI as a legitimate practice. Many students also agreed that submitting AI-generated work without citation or disclosure (3.17) and relying entirely on AI for assignments without human effort (2.94) were acceptable. Additionally, a mean score of 3.09 indicates that students do not widely recognise the risks of AI biases and limitations, often presenting AI-generated content as fact without critical evaluation. Overall, the results highlight a concerning trend of students accepting unethical AI use, underscoring the need for stricter academic guidelines and awareness of responsible AI practices.

 

 

Research Question Four: What are the benefits of using AI generative tools in academic activities?

 

 

Table 4: Benefits of using AI generative tools in academic activities

S/N | ITEMS | SA (%) | A (%) | D (%) | SD (%) | Mean | Remarks
16 | Using AI generative tools can significantly improve the efficiency and productivity of academic writing | 110 (44) | 88 (35.2) | 24 (9.6) | 28 (11.2) | 3.03 | Accepted
17 | AI generative tools can help students overcome writer’s block and generate new ideas for research papers | 99 (39.6) | 86 (34.4) | 41 (16.4) | 24 (9.6) | 2.99 | Accepted
18 | The use of AI generative tools can enhance the quality of academic writing by providing grammar, syntax, and style suggestions | 82 (32.8) | 92 (36.8) | 47 (18.8) | 29 (11.6) | 2.95 | Accepted
19 | The use of AI generative tools can assist students with disabilities, such as dyslexia or visual impairments, by providing alternative formats for reading and writing | 126 (50.4) | 86 (34.4) | 18 (7.2) | 20 (8) | 3.11 | Accepted
20 | The use of AI generative tools can facilitate collaboration among students and researchers by providing a platform for sharing and building on ideas | 98 (39.2) | 108 (43.2) | 24 (9.6) | 20 (8) | 3.18 | Accepted
Average Mean: 3.05

Table 4 above shows students’ perceptions of the benefits of using AI generative tools for academic activities. The table indicates a generally positive perception of the benefits of AI generative tools in academic activities, with an average mean of 3.05, confirming their acceptance. The highest approval (3.18) was for AI’s role in facilitating collaboration, as students acknowledged its ability to enhance idea-sharing and teamwork. Similarly, AI was widely accepted as a tool for assisting students with disabilities (3.11), highlighting its role in improving accessibility. Students also recognised AI’s ability to enhance efficiency and productivity in academic writing (3.03) and help overcome writer’s block (2.99). Additionally, AI’s role in improving grammar, syntax, and style received a mean score of 2.95, indicating that while useful, some students may not fully rely on it for writing refinement. Overall, the results suggest that AI generative tools are valued for their ability to streamline academic work, foster collaboration, and support diverse learning needs, reinforcing their growing importance in education.

 

Research Question Five: What are the challenges of using AI generative tools for academic activities?

 

 

Table 5: Challenges of using AI generative tools for academic activities

S/N | ITEMS | SA (%) | A (%) | D (%) | SD (%) | Mean | Remarks
21 | The lack of transparency and explainability in AI generative tools can make it difficult to evaluate the credibility of generated content | 70 (28) | 118 (47.2) | 50 (20) | 12 (4.8) | 3.18 | Accepted
22 | The potential for AI generative tools to perpetuate biases and inaccuracies in academic writing is a significant concern | 124 (49.6) | 92 (36.8) | 18 (7.2) | 16 (6.4) | 3.17 | Accepted
23 | The reliance on AI generative tools can hinder the development of critical thinking and writing skills in students | 86 (34.4) | 126 (50.4) | 18 (7.2) | 20 (8) | 3.27 | Accepted
24 | The integration of AI generative tools into academic workflows can be hindered by technical issues, such as compatibility problems and software updates | 88 (35.2) | 102 (40.8) | 40 (16) | 20 (8) | 3.09 | Accepted
25 | The use of AI generative tools raises concerns about authorship, ownership and intellectual property rights in academic publishing | 96 (38.4) | 80 (32) | 54 (21.6) | 20 (8) | 2.94 | Accepted
Average Mean: 3.13

Table 5 highlights several challenges associated with using AI generative tools in academic activities, with an average mean of 3.13, indicating broad recognition of these concerns. The most pressing issue, with the highest mean score (3.27), is the fear that overreliance on AI may hinder students' critical thinking and writing skills. Concerns about the lack of transparency and explainability (3.18) and about biases and inaccuracies in AI-generated content (3.17) were also widely acknowledged, suggesting that students find it difficult to evaluate AI-generated material for credibility. Additionally, technical issues, such as software compatibility and updates (3.09), were seen as obstacles to AI adoption in academic workflows. Another notable concern (2.94) is the impact of AI on authorship, ownership, and intellectual property rights, raising ethical and legal questions in academic publishing. Overall, while AI generative tools offer benefits, students are aware of significant challenges, particularly in terms of academic integrity, skill development, and content reliability, necessitating careful and responsible usage.

 

DISCUSSION OF FINDINGS

Common AI Generative Tools Used for Academic Activities

The findings of this study revealed that students widely use AI generative tools for academic activities. The most frequently utilised tool is Quillbot AI, followed by Grammarly and Meta AI, highlighting the students’ prioritisation of summarisation, plagiarism detection, grammatical accuracy and text generation. This aligns with the findings of Alshahrani and Ward (2022), who reported that students frequently use AI tools for refining their writing quality and improving academic integrity. Additionally, while Google Bard AI and ChatGPT were slightly less popular, their usage for research and idea generation remains significant, corroborating studies by Susnjak (2023) on the increasing reliance on conversational AI for academic support. These results underscore the necessity for academic institutions to establish frameworks for the ethical and effective use of AI in educational settings.

 

Appropriate Uses of AI Generative Tools in Academic Activities

Students largely accepted various appropriate uses of AI generative tools. The highest level of approval was for AI’s role in drafting and editing, as students recognised its ability to enhance writing by suggesting alternative phrases and sentences. This is consistent with the findings of Dürr and Wagner (2023), who highlighted AI’s role in improving academic writing by aiding in text coherence and clarity. Similarly, the use of AI for research paper writing and peer collaboration received strong support, reflecting previous studies (Zawacki-Richter et al., 2019) that emphasised AI’s potential in collaborative learning environments. AI’s use for study material creation and accessibility support was also acknowledged, demonstrating its versatility in academic contexts. However, despite some hesitation in certain areas, the overall acceptance suggests a growing integration of AI into higher education.

 

Inappropriate Uses of AI Generative Tools in Academic Activities

Alarmingly, the study revealed that students also accepted several inappropriate uses of AI generative tools. The highest acceptance was for using AI to cheat during examinations and assessments, followed by falsifying or manipulating research data and submitting AI-generated work without proper citation. This finding supports concerns raised by Cotton et al. (2023), who noted the increasing risk of academic dishonesty associated with AI. Moreover, the willingness to rely solely on AI for assignments and the failure to critically evaluate AI-generated content further highlight ethical concerns, aligning with the observations of Kasneci et al. (2023) regarding the risk of overreliance on AI in education. These findings underscore the urgent need for universities to implement stricter policies and awareness campaigns on the responsible use of AI.

 

Benefits of AI Generative Tools in Academic Activities

Students acknowledged multiple benefits of AI generative tools. The most highly rated benefit was AI’s role in facilitating collaboration, which is in line with the study by Lu et al. (2021), who found that AI tools enhance teamwork and knowledge sharing among students. Additionally, AI’s support for students with disabilities and its ability to enhance academic writing productivity received significant recognition, corroborating research by Lai and Bower (2019) on AI-driven accessibility features. Other benefits included AI’s role in overcoming writer’s block and improving grammar and syntax. These findings demonstrate the transformative impact of AI in education, reinforcing its potential to enhance learning experiences when used ethically.

 

Challenges of Using AI Generative Tools in Academic Activities

The study also identified significant challenges associated with AI generative tools. The most concerning issue was the potential hindrance to students' critical thinking and writing skills due to overreliance on AI. This aligns with the findings of Dwivedi et al. (2023), who warned that excessive AI usage may lead to diminished cognitive engagement. Concerns about biases and inaccuracies and the lack of transparency in AI-generated content were also prominent, reflecting prior research by Bender et al. (2021) on the ethical implications of AI-generated misinformation. Technical issues and concerns over authorship and intellectual property rights further complicate AI’s role in academia, supporting the arguments of Floridi and Cowls (2019) regarding the legal and ethical uncertainties of AI in education. These findings highlight the necessity for digital literacy initiatives to equip students with the skills to critically assess and responsibly use AI-generated content.

 

CONCLUSION

The findings of this study provide valuable insights into the role of AI generative tools in academic activities, highlighting both their benefits and challenges. While AI tools are widely accepted for their ability to enhance academic writing, collaboration, and accessibility, concerns regarding ethical misuse and academic integrity persist. Given these findings, it is imperative for educational institutions to develop clear guidelines and policies to regulate AI use while promoting ethical academic practices. Future research should explore strategies for mitigating the risks associated with AI while maximising its potential to support learning and research.

 

Recommendations

Based on the findings, the following recommendations were made:

  1. Universities should establish clear policies on the ethical use of AI in academic activities, emphasising proper citation, responsible usage, and academic integrity.
  2. AI literacy programmes should be introduced to educate students on both the benefits and risks of AI tools, ensuring they develop critical thinking and ethical awareness.
  3. Faculty should guide students on how to effectively use AI for research, drafting, and editing without compromising originality or academic honesty.
  4. Institutions should implement stricter policies against AI misuse, including using plagiarism detection tools and promoting awareness campaigns on ethical AI practices.
  5. Universities should invest in AI-powered accessibility tools for students with disabilities and ensure that AI integration in academic workflows is seamless and user-friendly.
  6. AI tool developers and academic institutions should collaborate to improve transparency and mitigate biases in AI-generated content through continuous evaluation and updates.

 

By implementing these recommendations, academic institutions can foster responsible AI adoption while maintaining ethical and high-quality educational standards.

 

REFERENCES

  • Alshahrani, M., & Ward, R. (2022). The use of AI-driven writing tools in higher education: Implications for academic integrity. Journal of Educational Technology, 19(4), 245-260.
  • Awdry, R (2020) Assignment outsourcing: Moving beyond contract cheating. Assess Evaluation High Education, 46(2), 220–235. https://doi.org/10.1080/02602938.2020.1765311
  • Bender, E. M., Gebru, T., McMillan-Major, A., & Shmitchell, S. (2021). On the dangers of stochastic parrots: Can language models be too big? Proceedings of the 2021 ACM conference on fairness, accountability, and transparency, 610-623.
  • Brimble, M. (2016). Why students cheat: An exploration of the motivators of student academic dishonesty in higher education. In Bretag, T. (ed.), Handbook of academic integrity (pp. 365–382). https://doi.org/10.1007/978-981-287-098-8_ 58
  • Chan, C. K. Y. (2023). A comprehensive AI policy education framework for university teaching and learning. International Journal of Educational Technology in Higher Education20(1), 38.
  • Chan, C. K. Y., & Hu, W. (2023). Students’ voices on generative AI: Perceptions, benefits, and challenges in higher education. International Journal of Educational Technology in Higher Education20(1), 43.
  • Cotton, D. R, Cotton, P. A, & Shipway, J. R (2023). Chatting and cheating: Ensuring academic integrity in the era of ChatGPT. Innovations in Education and Teaching International, 1–12. 13th March,https://doi.org/10.1080/14703297.2023. 2190148
  • Dürr, S., & Wagner, C. (2023). AI-assisted academic writing: Enhancing quality and efficiency in higher education. AI & Society, 38(2), 409-425.
  • Dwivedi, Y. K., Hughes, L., Baabdullah, A. M., & Rana, N. P. (2023). AI in education: Boon or bane for student learning? Computers & Education, 200, 104580.
  • East, J, & Donnelly, L (2012). Taking responsibility for academic integrity: A collaborative teaching and learning design. Journal of University Teaching Learn Practice, 9(3), 6–17. https://doi.org/10.53761/1.9.3.2
  • Evering, L. C, & Moorman, G., (2012). Rethinking plagiarism in the digital age. Journal of Adolescence Adult Literacy, 56(1):35–44. https://doi.org/ 10.1002/JAAL.00100
  • Flom, J., Green, K., & Wallace, S. (2021). To cheat or not to cheat? An investigation into the ethical behaviours of generation Z. Act Learn High Educ., 24(2), 155–168. https://doi.org/10.1177/14697874211016147
  • Floridi, L., & Cowls, J. (2019). The ethics of AI: Balancing benefits and risks. Philosophy & Technology, 32(1), 1-23.
  • Howard, R. M., & Davies, L. J. (2009). Plagiarism in the internet age. Educational Leadership, 66(6), 64–67. http://www.ascd.org/publications/educational-leadership/mar09/vol66/num06/Plagiarism-in-the-Internet-Age.aspx. Accessed 12 May, 2023.
  • Javaid, M., Haleem, A., Singh, R. P., Khan, S., & Khan, I. H. (2023). Unlocking the opportunities through ChatGPT tool towards ameliorating the education system. BenchCouncil Transactions on Benchmarks, Standards and Evaluations, 3(2), 1–12. https://doi.org/10.1016/j.tbench.2023.100115
  • Kasneci, E., Klee, S., Kasneci, G., & Seegerer, P. (2023). AI in education: A critical review of ethical challenges and future directions. Computers in Human Behavior, 141, 107725.
  • Lai, K. W., & Bower, M. (2019). How AI is transforming learning and teaching. Education and Information Technologies, 24(3), 1-19.
  • Lu, H., Li, J., Wang, J., & Chen, H. (2021). AI-enabled collaborative learning: A review of applications and impacts. Interactive Learning Environments, 29(1), 1-17.
  • Pitt, P., Dullaghan, K., & Sutherland-Smith, W. (2020). ‘Mess, stress and trauma’: Students’ experiences of formal contract cheating processes. Assessment & Evaluation in Higher Education, 46(1), 659–672. https://doi.org/10.1080/02602938.2020.1787332
  • Poláková, P., & Klímová, B. (2019). Mobile technology and Generation Z in the English language classroom: A preliminary study. Education Sciences, 9(3), 203. https://doi.org/10.3390/educsci9030203
  • Rogerson, A., & McCarthy, G. (2017). Using internet based paraphrasing tools: original work, patch writing or facilitated plagiarism? International Journal of Educational Integrity, 13(2), 1–5. https://doi.org/10.1007/s40979-016-0013-y
  • Rudolph, A., Petropoulou, M., Winter, W., & Bošnjak, Ž. (2023). Multi-messenger model for the prompt emission from GRB 221009A. The Astrophysical Journal Letters, 944(2), L34.
  • Sefcik, L., Striepe, M., & Yorke, J. (2019). Mapping the landscape of academic integrity education programs: What approaches are effective? Assessment & Evaluation in Higher Education, 45(1), 30–43. https://doi.org/10.1080/02602938.2019.1604942
  • Stokel-Walker, C. (2022). AI bot ChatGPT writes smart essays – should professors worry? Nature, 9 December. https://www.nature.com/articles/d41586-022-04397-7
  • Susnjak, T. (2023). The role of conversational AI in academic research: Challenges and opportunities. Journal of Educational Research, 118(2), 187-202.
  • Szymkowiak, A., Melovic, B., Dabic, M., Jeganathan, K., & Kundi, G. S. (2021). Information technology and Gen Z: The role of teachers, the internet, and technology in the education of young people. Technology in Society, 65, 101565. https://doi.org/10.1016/j.techsoc.2021.101565
  • Ullah, M., Umair, M., Sohag, K., Mariev, O., Khan, M. A., & Sohail, H. M. (2024). The connection between disaggregate energy use and export sophistication: New insights from OECD with robust panel estimations. Energy, 306, 132282.
  • Wahab, A. (2024). Two decades of causal layered analysis: A bibliometric analysis and review (2000–2022). World Futures Review, 16(3), 220–243.
  • Williams, B. T. (2007). Trust, betrayal and authorship: Plagiarism and how we perceive students. Journal of Adolescent & Adult Literacy, 51(4), 350–354. https://doi.org/10.1598/JAAL.51.4.6
  • Yu, H. (2023). Reflection on whether ChatGPT should be banned by academia from the perspective of education and teaching. Frontiers in Psychology, 14, 1181712. https://doi.org/10.3389/fpsyg.2023.1181712
  • Zawacki-Richter, O., Marín, V. I., Bond, M., & Gouverneur, F. (2019). Systematic review of research on AI in higher education. International Journal of Educational Technology in Higher Education, 16(1), 1-27.
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
© Copyright Kuwait Scholars Publisher. All Rights Reserved.