<doi_batch xmlns="http://www.crossref.org/schema/4.4.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" version="4.4.0"><head><doi_batch_id>2a0dd09c-bcfc-48e1-86fd-0e8b536a44ee</doi_batch_id><timestamp>20251027072441345</timestamp><depositor><depositor_name>naun:naun</depositor_name><email_address>mdt@crossref.org</email_address></depositor><registrant>MDT Deposit</registrant></head><body><journal><journal_metadata language="en"><full_title>International Journal of Education and Information Technologies</full_title><issn media_type="electronic">2074-1316</issn><archive_locations><archive name="Portico"/></archive_locations><doi_data><doi>10.46300/9109</doi><resource>http://www.naun.org/cms.action?id=3037</resource></doi_data></journal_metadata><journal_issue><publication_date media_type="online"><month>1</month><day>31</day><year>2025</year></publication_date><publication_date media_type="print"><month>1</month><day>31</day><year>2025</year></publication_date><journal_volume><volume>19</volume><doi_data><doi>10.46300/9109.2025.19</doi><resource>https://npublications.com/journals/educationinformation/2025.php</resource></doi_data></journal_volume></journal_issue><journal_article language="en"><titles><title>An EEG and Deep Learning-based Detection Method for Learning Concentration</title></titles><contributors><person_name sequence="first" contributor_role="author"><given_name>Yuyun</given_name><surname>Kang</surname><affiliation>Linyi University, Lanshan, Linyi, Shandong, 276000 China</affiliation></person_name><person_name sequence="additional" contributor_role="author"><given_name>Baiyang</given_name><surname>Wang</surname><affiliation>Shandong University, Jimo, Qingdao, Shandong, 266237 China</affiliation></person_name><person_name sequence="additional" contributor_role="author"><given_name>Chengyue</given_name><surname>Hu</surname><affiliation>Linyi University, Lanshan, Linyi, Shandong, 276000 China</affiliation></person_name><person_name 
sequence="additional" contributor_role="author"><given_name>Xiangyue</given_name><surname>Zhang</surname><affiliation>Linyi University, Lanshan, Linyi, Shandong, 276000 China</affiliation></person_name><person_name sequence="additional" contributor_role="author"><given_name>Guifang</given_name><surname>Feng</surname><affiliation>Linyi University, Lanshan, Linyi, Shandong, 276000 China</affiliation></person_name></contributors><jats:abstract xmlns:jats="http://www.ncbi.nlm.nih.gov/JATS1"><jats:p>Concentration has a significant impact on learning effectiveness, making it crucial to study attention. However, there is little research on the quantitative measurement of learning attention. Electroencephalography (EEG) signals can reflect the brain’s attention during learning; therefore, this paper proposes a learning concentration detection method based on EEG. Firstly, a portable, wearable single-channel EEG acquisition device is used to collect the brain’s EEG signals during the learning process. Secondly, the single-channel EEG signals are converted into images to evaluate learning concentration, thereby transforming the concentration detection problem into an image recognition task. Thirdly, convolutional neural network models—AlexNet, ResNet, and the Visual Geometry Group (VGG) network—are applied to detect the converted images. 
Finally, an experiment was conducted, and the results show that the detection accuracy rate reaches 93.23%, demonstrating that the proposed method can effectively evaluate learning concentration.</jats:p></jats:abstract><publication_date media_type="online"><month>10</month><day>27</day><year>2025</year></publication_date><publication_date media_type="print"><month>10</month><day>27</day><year>2025</year></publication_date><pages><first_page>160</first_page><last_page>169</last_page></pages><publisher_item><item_number item_number_type="article_number">17</item_number></publisher_item><ai:program xmlns:ai="http://www.crossref.org/AccessIndicators.xsd" name="AccessIndicators"><ai:free_to_read start_date="2025-10-27"/><ai:license_ref applies_to="am" start_date="2025-10-27">https://npublications.com/journals/educationinformation/2025/a362008-017(2025).pdf</ai:license_ref></ai:program><archive_locations><archive name="Portico"/></archive_locations><doi_data><doi>10.46300/9109.2025.19.17</doi><resource>https://npublications.com/journals/educationinformation/2025/a362008-017(2025).pdf</resource></doi_data><citation_list><citation key="ref0"><unstructured_citation>S. Dong, S. Wang, L. P. Ma, “A Study on the Attention-Driving Mechanism of Online Learning for Working Students Based on Grounded Theory,” Voc. Edu. For., vol. 40, no. 2, pp. 90-98, 2024.</unstructured_citation></citation><citation key="ref1"><unstructured_citation>M. Y. Lei, “A computer assessment study of attention based on learning ability,” Nanjing Normal University, 2018.</unstructured_citation></citation><citation key="ref2"><doi>10.1146/annurev-psych-040723-012736</doi><unstructured_citation>N. Cowan, C. Bao, B. M. Bishop-Chrzanowski, A. N. Costa, N. R. Greene, D. Guitard, C. Li, M. L. Musich, Z. E. Ünal, “The relation between attention and memory,” Annu. Rev. Psychol., vol. 75, pp. 183-214, 2024, doi: 10.1146/annurev-psych-040723-012736.
</unstructured_citation></citation><citation key="ref3"><doi>10.59946/jfki.2023.250</doi><unstructured_citation>T. Aminoto, D. Agustina, “The effect of sensorimotor games on increasing learning concentration of export training participants at ASI Training Center Bekasi City, West Java,” Jur. Fisi. dan Kes. Ind., vol. 3, no. 2, pp. 168-175, 2023.</unstructured_citation></citation><citation key="ref4"><doi>10.25217/jcd.v3i2.3814</doi><unstructured_citation>Pujiarto, Y. N. Dwi, “Factor analysis of teachers' skills in directing silent sitting given emotion regulation and concentration of early childhood children,” Jour. of Chil. Dev., vol. 3, no. 2, pp. 59-68, 2023, doi: 10.25217/jcd.v3i2.3814.</unstructured_citation></citation><citation key="ref5"><doi>10.1177/20592043231214085</doi><unstructured_citation>V. Julia, O. Miitta, S. Helmi, S. Suvi, “Melody for the Mind: Enhancing Mood, Motivation, Concentration, and Learning through Music Listening in the Classroom,” Mus. &amp; Sci., vol. 6, pp. 1-13, 2023, doi: 10.1177/20592043231214085.</unstructured_citation></citation><citation key="ref6"><doi>10.1016/j.stueduc.2022.101128</doi><unstructured_citation>A. Yau, M. Yeung, C. Lee, “A co-orientation analysis of teachers’ and students’ perceptions of online teaching and learning in Hong Kong higher education during the COVID-19 pandemic,” Stu. in Edu. Eva., vol. 72, p. 101128, 2022, doi: 10.1016/j.stueduc.2022.101128.</unstructured_citation></citation><citation key="ref7"><doi>10.1016/j.stueduc.2023.101250</doi><unstructured_citation>A. Lee, “Supporting students’ generation of feedback in large-scale online course with artificial intelligence-enabled evaluation,” Stu. in Edu. Eva., vol. 77, p. 101250, 2023, doi: 10.1016/j.stueduc.2023.101250.</unstructured_citation></citation><citation key="ref8"><unstructured_citation>X. L. Shao, Suwarsi, I. Pornman, L. F. Asmarani, Muflih, N. R. Listyana, S.
Damayanti, “The relationship between learning concentration and understanding level through the online learning process,” Jour. Kep. Res. Yog., vol. 10, no. 2, pp. 89-93, 2023, doi: 10.35842/jkry.v10i2.741.</unstructured_citation></citation><citation key="ref9"><unstructured_citation>S. Li, Q. H. Zheng, J. L. Du, S. Wang, “The Relationship between Attention Investment Characteristics in Online Learning and Learning Completion Rate: Analysis based on clickstream data,” Chin. Edu. Tech., vol. 2, pp. 105-112, 2021. https://zdjy.cbpt.cnki.net/portal/journal/portal/client/paper/69aca066d9ab8077236ddc3ff4f2873d</unstructured_citation></citation><citation key="ref10"><doi>10.1371/journal.pone.0299018</doi><unstructured_citation>J. Wang, Y. Yu, “Machine learning approach to student performance prediction of online learning,” PLOS ONE, vol. 20, no. 1, p. e0299018, 2025, doi: 10.1371/journal.pone.0299018.</unstructured_citation></citation><citation key="ref11"><doi>10.6007/ijarbss/v13-i3/16471</doi><unstructured_citation>T. H. Siok, M. S. Sim, N. H. Rahmat, “Motivation to learn online: An analysis from McClelland's Theory of Needs,” Int. J. Aca. Res. in Busi. Soc. Sci., vol. 13, no. 3, pp. 215-234, 2023, doi: 10.6007/IJARBSS/v13-i3/16471.</unstructured_citation></citation><citation key="ref12"><doi>10.1007/s10055-022-00689-5</doi><unstructured_citation>Y. Lin, Y. F. Lan, S. Wang, “A method for evaluating the learning concentration in head-mounted virtual reality interaction,” Virt. Rea., vol. 27, pp. 863-885, 2023, doi: 10.1007/s10055-022-00689-5.</unstructured_citation></citation><citation key="ref13"><doi>10.1101/2021.04.13.439732</doi><unstructured_citation>S. Q. Cai, E. Z. Su, L. H. Xie, H. Z. Li, “EEG-based auditory attention detection via frequency and channel neural attention,” IEEE Trans. Human-Mach. Syst., vol. 52, no. 2, pp. 256-266, 2022, doi: 10.1109/THMS.2021.3125283.
</unstructured_citation></citation><citation key="ref14"><doi>10.1038/s41598-021-94876-0</doi><unstructured_citation>M. Geravanchizadeh, H. Roushan, “Dynamic selective auditory attention detection using RNN and reinforcement learning,” Sci. Rep., vol. 11, no. 1, p. 15497, 2021, doi: 10.1038/s41598-021-94876-0.</unstructured_citation></citation><citation key="ref15"><doi>10.3390/jcm7120510</doi><unstructured_citation>M. Slimani, H. Znazen, N. L. Bragazzi, M. S. Zguira, D. Tod, “The effect of mental fatigue on cognitive and aerobic performance in adolescent active endurance athletes: insights from a randomized counterbalanced, cross-over trial,” J. Clin. Med., vol. 7, no. 12, p. 510, 2018, doi: 10.3390/jcm7120510.</unstructured_citation></citation><citation key="ref16"><doi>10.3389/fnins.2023.1152394</doi><unstructured_citation>C. Porcaro, K. Avanaki, O. Arias-Carrion, M. Morup, “Editorial: Combined EEG in research and diagnostics: Novel perspectives and improvements,” Front. Neurosci., vol. 17, p. 1152394, 2023, doi: 10.3389/fnins.2023.1152394.</unstructured_citation></citation><citation key="ref17"><doi>10.3390/s22134939</doi><unstructured_citation>D. L. Ouyang, Y. F. Yuan, G. F. Li, Z. Z. Guo, “The Effect of Time Window Length on EEG-Based Emotion Recognition,” Sensors, vol. 22, no. 13, p. 4939, 2022, doi: 10.3390/s22134939.</unstructured_citation></citation><citation key="ref18"><doi>10.1007/s10916-019-1345-y</doi><unstructured_citation>B. Ay, O. Yildirim, M. Talo, U. B. Baloglu, G. Aydin, S. D. Puthankattil, U. R. Acharya, “Automated Depression Detection Using Deep Representation and Sequence Learning with EEG Signals,” J. Med. Syst., vol. 43, no. 7, pp. 1-12, 2019, doi: 10.1007/s10916-019-1345-y.</unstructured_citation></citation><citation key="ref19"><doi>10.3389/fpsyt.2021.659006</doi><unstructured_citation>K. Gagnon, C. Bolduc, L. Bastien, R. Godbout, “REM Sleep EEG Activity and Clinical Correlates in Adults With Autism,” Front.
Psychiatry, vol. 12, p. 659006, 2021, doi: 10.3389/fpsyt.2021.659006. </unstructured_citation></citation><citation key="ref20"><doi>10.3389/fncom.2021.758212</doi><unstructured_citation>H. R. Liu, Y. Zhang, Y. J. Li, X. Y. Kong, “Review on Emotion Recognition Based on Electroencephalography,” Front. Comput. Neurosci., vol. 15, p. 84, 2021, doi: 10.3389/fncom.2021.758212. </unstructured_citation></citation><citation key="ref21"><doi>10.1109/taffc.2020.3013711</doi><unstructured_citation>X. B. Du, C. X. Ma, G. H. Zhang, J. Y. Li, Y. K. Lai, G. Z. Zhao, X. M. Deng, Y. J. Liu, H. A. Wang, “An Efficient LSTM Network for Emotion Recognition From Multichannel EEG Signals,” IEEE Trans. Affect. Comput., vol. 13, no. 3, pp. 1528-1540, 2022, doi: 10.1109/TAFFC.2020.3013711. </unstructured_citation></citation><citation key="ref22"><doi>10.1016/j.future.2021.01.010</doi><unstructured_citation>Y. S. Liu, G. F. Fu, “Emotion Recognition by Deeply Learned Multi-Channel Textual and EEG Features,” Futur. Gener. Comput. Syst., vol. 119, pp. 1-6, 2021, doi: 10.1016/j.future.2021.01.010. </unstructured_citation></citation><citation key="ref23"><doi>10.1016/j.bspc.2022.103547</doi><unstructured_citation>A. S. Rajpoot, M. R. Panicker, “Subject Independent Emotion Recognition Using EEG Signals Employing Attention Driven Neural Networks,” Biomed. Signal Process. Control, vol. 75, p. 103547, 2022, doi: 10.1016/j.bspc.2022.103547. </unstructured_citation></citation><citation key="ref24"><doi>10.1109/jbhi.2022.3198688</doi><unstructured_citation>L. Feng, C. Cheng, M. Y. Zhao, H. Y. Deng, Y. Zhang, “EEG-Based Emotion Recognition Using Spatial-Temporal Graph Convolutional LSTM With Attention Mechanism,” IEEE J. Biomed. Health Inform., vol. 26, no. 11, pp. 5406-5417, 2022, doi: 10.1109/JBHI.2022.3198688. </unstructured_citation></citation><citation key="ref25"><doi>10.1007/s13042-021-01414-5</doi><unstructured_citation>Q. Gao, Y. Yang, Q. J. Kang, Z. K. Tian, Y. 
Song, “EEG-Based Emotion Recognition with Feature Fusion Networks,” Int. J. Mach. Learn. Cybern., vol. 13, no. 2, pp. 421-429, 2022, doi: 10.1007/s13042-021-01414-5.</unstructured_citation></citation><citation key="ref26"><doi>10.1109/tcbb.2016.2616395</doi><unstructured_citation>B. Hu, X. W. Li, S. T. Sun, M. Ratcliffe, “Attention Recognition in EEG-Based Affective Learning Research Using CFS plus KNN Algorithm,” IEEE/ACM Trans. Comput. Biol. Bioinform., vol. 15, no. 1, pp. 38-45, 2018, doi: 10.1109/TCBB.2016.2616395.</unstructured_citation></citation><citation key="ref27"><doi>10.1088/1741-2552/ab909f</doi><unstructured_citation>L. W. Ko, O. Komarov, W. K. Lai, W. G. Liang, T. P. Jung, “Eyeblink Recognition Improves Fatigue Prediction from Single-Channel Forehead EEG in a Realistic Sustained Attention Task,” J. Neural Eng., vol. 17, no. 3, p. 036015, 2020, doi: 10.1088/1741-2552/ab909f.</unstructured_citation></citation><citation key="ref28"><doi>10.3390/s22145252</doi><unstructured_citation>X. L. Zhu, W. T. Rong, L. Zhao, Z. L. He, Q. L. Yang, J. Y. Sun, G. D. Liu, “EEG Emotion Classification Network Based on Attention Fusion of Multi-Channel Band Features,” Sensors, vol. 22, no. 14, p. 5252, 2022, doi: 10.3390/s22145252.</unstructured_citation></citation><citation key="ref29"><doi>10.1007/s13534-022-00232-0</doi><unstructured_citation>M. Kim, S. Yoo, C. Kim, “Miniaturization for Wearable EEG Systems: Recording Hardware and Data Processing,” Biomed. Eng. Lett., vol. 12, no. 3, pp. 239-250, 2022, doi: 10.1007/s13534-022-00232-0.</unstructured_citation></citation><citation key="ref30"><doi>10.1097/wnp.0000000000000736</doi><unstructured_citation>M. C. Ng, J. Jing, M. B. Westover, “A Primer on EEG Spectrograms,” J. Clin. Neurophysiol., vol. 39, no. 3, pp. 177-183, 2022, doi: 10.1097/WNP.0000000000000736.</unstructured_citation></citation><citation key="ref31"><doi>10.1093/bib/bbw068</doi><unstructured_citation>S. Min, B. Lee, S.
Yoon, “Deep Learning in Bioinformatics,” Brief. Bioinform., vol. 18, no. 5, pp. 851-869, 2017, doi: 10.1093/bib/bbw068.</unstructured_citation></citation></citation_list></journal_article></journal></body></doi_batch>