Learning Evaluation Methods in University based on Data Mining

AUTHORS

Golshah Abawajy, Charles Sturt University, Wagga Wagga, Australia

ABSTRACT

One of the current frontiers in learning evaluation is whether, and how, powerful digital technologies can be used to analyze digital data. This paper proposes a learning evaluation method based on big data and constructs a new standard for the development of evaluation tools: metrolytic standards. By combining standards from the field of learning analytics with methods commonly used in educational measurement, these standards provide a framework for ensuring the reliability and validity of educational evaluations. Metrolytic standards include quality requirements for the reliability, validity, accuracy, and interpretability of a test; such requirements are usually applied only to high-stakes, large-scale assessments such as PISA, the SAT, or the GMAT. Applying metrolytic standards rests on a solid understanding of evaluation and its role in learning, draws on the strengths of learning analytics, artificial intelligence, and measurement science, and offers a new option for researchers working at the frontier of evaluation.
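As a concrete illustration of one quality requirement the abstract names, reliability is commonly estimated in educational measurement with Cronbach's alpha (internal consistency across test items). The sketch below is a minimal, self-contained example of that standard statistic; it is not a method taken from this paper, and the toy score data are invented for illustration.

```python
# Hypothetical sketch: estimating test reliability with Cronbach's alpha,
# one of the standard internal-consistency measures from measurement science.
from statistics import variance


def cronbach_alpha(item_scores):
    """item_scores: list of items, each a list of per-student scores."""
    k = len(item_scores)                 # number of items on the test
    n = len(item_scores[0])              # number of students
    # Sum of the sample variances of each individual item.
    item_var = sum(variance(item) for item in item_scores)
    # Variance of each student's total score across all items.
    totals = [sum(item[s] for item in item_scores) for s in range(n)]
    total_var = variance(totals)
    return (k / (k - 1)) * (1 - item_var / total_var)


# Invented toy data: 3 items scored 1-5 for 5 students.
items = [
    [2, 4, 3, 5, 1],
    [3, 4, 2, 5, 2],
    [2, 5, 3, 4, 1],
]
print(round(cronbach_alpha(items), 3))  # → 0.932
```

Values above roughly 0.8 are conventionally read as acceptable reliability for high-stakes use, which is the sense in which metrolytic standards would impose a threshold on an evaluation tool.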

 

KEYWORDS

Digital technology, Artificial intelligence, Learning evaluation tools, Learning evaluation methods

CITATION

  • APA:
    Abawajy, G. (2019). Learning Evaluation Methods in University based on Data Mining. Asia-Pacific Journal of Educational Management Research, 4(3), 21-32. doi:10.21742/AJEMR.2019.4.3.03
  • Harvard:
    Abawajy, G. (2019). "Learning Evaluation Methods in University based on Data Mining". Asia-Pacific Journal of Educational Management Research, 4(3), pp. 21-32. doi:10.21742/AJEMR.2019.4.3.03
  • IEEE:
    [1] G. Abawajy, "Learning Evaluation Methods in University based on Data Mining". Asia-Pacific Journal of Educational Management Research, vol. 4, no. 3, pp. 21-32, Dec. 2019
  • MLA:
    Abawajy, Golshah. "Learning Evaluation Methods in University based on Data Mining". Asia-Pacific Journal of Educational Management Research, vol. 4, no. 3, Dec. 2019, pp. 21-32, doi:10.21742/AJEMR.2019.4.3.03

ISSUE INFO

  • Volume 4, No. 3, 2019
  • ISSN(p): 2207-5380
  • ISSN(e): 2207-290X
  • Published: Dec. 2019
