Peer assessment is a widely used mechanism for assessing students' learning outcomes, such as written assignments and oral reports. Studies have indicated that peer assessment brings many benefits to student learning, e.g., improving learning performance and strengthening problem-solving skills. However, little research has used peer assessment to support the design and evaluation of digital game-based learning (DGBL). On the other hand, DGBL is widespread in educational settings, and a great deal of research has indicated that DGBL can improve students' learning outcomes, increase their learning motivation, and so on. However, each individual has a different background, needs, and learning preferences, so it is doubtful whether a single design approach for DGBL can suit every learner. Thus, there is a need to consider human factors. Among various human factors, this study focuses on cognitive styles, which refer to the ways in which learners process and organize information. Accordingly, cognitive styles may drive the design and evaluation of DGBL.
However, little research has examined the impact of cognitive styles on the design and evaluation of DGBL in the context of peer assessment. To fill this gap, this study examined the differences in design and evaluation among different cognitive style combinations. To achieve a comprehensive understanding, this study addressed ten research questions: (1) How does the combination of cognitive styles affect the scores that game designers obtain during peer assessment, in terms of game elements? (2) How do cognitive styles affect the scores that evaluators give during peer assessment, in terms of game elements? (3) How do cognitive styles affect the scores that evaluators give to each cognitive style combination, in terms of game elements? (4) How does the combination of cognitive styles affect the scores that game designers obtain during peer assessment, in terms of game proposals? (5) How do cognitive styles affect the scores that evaluators give during peer assessment, in terms of game proposals? (6) How do cognitive styles affect the scores that evaluators give to each cognitive style combination, in terms of game proposals? (7) How does the combination of cognitive styles affect the scores that game designers obtain during peer assessment, in terms of game implementation? (8) How do cognitive styles affect the scores that evaluators give during peer assessment, in terms of game implementation? (9) How do cognitive styles affect the scores that evaluators give to each cognitive style combination, in terms of game implementation? (10) How does the combination of cognitive styles affect the relationship between scores for game elements and those for game interface design?
To answer the above questions, the subjects of this study played both the roles of designer and evaluator. Regarding the role of designer, all participants were divided into three groups according to their cognitive styles, i.e., field-independent & field-independent (FI&FI), field-independent & field-dependent (FI&FD), and field-dependent & field-dependent (FD&FD). Regarding the role of evaluator, there were field-independent (FI) and field-dependent (FD) evaluators. All participants went through the three stages of the experiment. Stage one focused on the implementation of game elements, while stages two and three emphasized the improvement of the user interface. More specifically, in stage one all participants needed to incorporate three game elements, i.e., playfulness, novelty, and functionality, into their game designs. In stage two, all participants made a proposal for improving the user interface based on Nielsen's heuristics, and in stage three they implemented what they had proposed.
The results of this study include the scores that designers obtained and those that evaluators gave at each stage. Regarding the implementation of game elements, the scores that FI&FI groups obtained were higher than those of FD&FD groups, across playfulness, novelty, and functionality alike. On the other hand, the scores that FI evaluators gave to FI&FI and FD&FD groups were higher than those given by FD evaluators, again across playfulness, novelty, and functionality. Regarding the stage of the game interface proposal, the scores that FI&FI groups obtained were higher than those of FD&FD groups across H1-H10. On the other hand, the scores that FI evaluators gave to FI&FI groups were higher than those given by FD evaluators in terms of H2, H3, H4, H5, H7, and H8. Furthermore, the scores that FI evaluators gave to FD&FD groups were higher than those given by FD evaluators across H1-H10. Regarding the stage of the game interface implementation, the scores that FI&FI groups obtained were higher than those of FD&FD groups across H3-H9. On the other hand, the scores that FI evaluators gave to FI&FI groups were higher than those given by FD evaluators across H2-H8. Furthermore, the scores that FI evaluators gave to FD&FD groups were higher than those given by FD evaluators in terms of H3, H4, H6, H7, and H8.
The above results provide an understanding of how different cognitive style combinations react to game elements and Nielsen's ten heuristics during the process of game design and evaluation. In addition, evaluators with different cognitive styles also mark differently, regardless of whether game elements or the game interface are being assessed. In other words, cognitive styles play a key role. Such results can guide future researchers in how to undertake peer assessment from a cognitive style perspective.
Keywords: peer assessment, digital game-based learning, cognitive styles, Nielsen's heuristics.
Wu, Y. Y. (1987). An investigation of individual differences in cognitive abilities and cognitive styles [認知能力與認知型態個別差異現象之探討]. Educational Review [教育學刊], 7, 51-98.
Agbonifo, O. C., & Ofueu, S. (2018). A digital game-based quadratic factorisation learning system using tic-tac-toe. Nigerian Journal of Technology, 37(2), 463-469.
Barzilai, S., & Blau, I. (2014). Scaffolding game-based learning: Impact on learning achievements, perceived learning, and game experiences. Computers & Education, 70, 65-79.
Carvalho, L. R. D., Évora, Y. D. M., & Zem-Mascarenhas, S. H. (2016). Assessment of the usability of a digital learning technology prototype for monitoring intracranial pressure. Revista Latino-Americana de Enfermagem, 24, 4-6.
Chang, B., Chen, S. Y., & Jhan, S. N. (2015). The influences of an interactive group-based videogame: cognitive styles vs. prior ability. Computers & Education, 88, 399-407.
Cheng, W., & Warren, M. (2005). Peer assessment of language proficiency. Language Testing, 22(1), 93-121.
Chen, S. Y., & Macredie, R. D. (2002). Cognitive styles and hypermedia navigation: Development of a learning model. Journal of the American Society for Information Science and Technology, 53(1), 3-15.
Chen, S. Y., & Macredie, R. D. (2004). Cognitive modeling of student learning in web-based instructional programs. International Journal of Human-Computer Interaction, 17(3), 375-402.
Chen, S. Y., & Macredie, R. D. (2005). The assessment of usability of electronic shopping: A heuristic evaluation. International Journal of Information Management, 25(6), 516-532.
Chen, S. Y., & Macredie, R. D. (2010). Web-based interaction: A review of three important human factors. International Journal of Information Management, 30(5), 379-387.
Chiang, T. H., Yang, S. J., & Hwang, G. J. (2014). Students′ online interactive patterns in augmented reality-based inquiry activities. Computers & Education, 78, 97-108.
Chou, C., & Lin, H. (1997). Navigation Maps in a Computer-networked Hypertext Learning System. Paper presented at the Annual Meeting of the Association for Educational Communications and Technology, Albuquerque, NM.
Falchikov, N. (2001). Learning Together: Peer Tutoring in Higher Education. London, UK: RoutledgeFalmer.
Fatemi, A. H., Vahedi, V. S., & Seyyedrezaie, Z. S. (2014). The effects of top-down/bottom-up processing and field-dependent/field-independent cognitive style on Iranian EFL learners′ reading comprehension. Theory and Practice in Language Studies, 4(4), 686-693.
Ford, N., & Chen, S. Y. (2001). Matching/Mismatching revisited: an empirical study of learning and teaching styles. British Journal of Educational Technology, 32(1), 5-22.
Ford, N., Wilson, T., Foster, A., Ellis, D., & Spink, A. (2002). Information seeking and mediated searching. Part 4. Cognitive styles in information seeking. Journal of the American Society for Information Science and Technology, 53(9), 728-735.
Goodenough, D. R. (1976). The role of individual differences in field dependence as a factor in learning and memory. Psychological Bulletin, 83(4), 675-694.
Guo, P. J., & Reinecke, K. (2014). Demographic differences in how students navigate through MOOCs. In Proceedings of the First ACM Conference on Learning @ Scale, Atlanta, Georgia (pp. 21-30). http://dx.doi.org/10.1145/2556325.2566247.
Hao, Y., Hong, J. C., Jong, J. T., Hwang, M. Y., Su, C. Y., & Yang, J. S. (2010). Non‐native Chinese language learners′ attitudes towards online vision‐based motion games. British Journal of Educational Technology, 41(6), 1043-1053.
Hsieh, Y. Z., Su, M. C., Chen, S. Y., & Chen, G. D. (2015). The development of a robot-based learning companion: a user-centered design approach. Interactive Learning Environments, 23(3), 356-372.
Hong, J. C., Hwang, M. Y., Tam, K. P., Lai, Y. H., & Liu, L. C. (2012). Effects of cognitive style on digital jigsaw puzzle performance: A GridWare analysis. Computers in Human Behavior, 28(3), 920-928.
Hwang, G. J., Hung, C. M., & Chen, N. S. (2014). Improving learning achievements, motivations and problem-solving skills through a peer assessment-based game development approach. Educational Technology Research and Development, 62(2), 129-145.
Lai, C. L., & Hwang, G. J. (2015). An interactive peer-assessment criteria development approach to improving students′ art design performance using handheld devices. Computers & Education, 85, 149-159.
Liu, M., & Reed, W. M. (1995). The effect of hypermedia assisted instruction on second language learning. Journal of Educational Computing Research, 12(2), 159-175.
Lu, C. H., Hong, J. C., & Huang, P. H. (2007). The Effects of Individual Characteristics on Children's Problem Solving Performances in the Context of Game-based Learning. Paper presented at Culture, Knowledge and Understanding, National Institute of Education, Singapore (pp. 9-15).
McDaniel, R., & Kenny, R. (2013). Evaluating the relationship between cognitive style and pre-service teachers’ preconceived notions about adopting console video games for use in future classrooms. International Journal of Game-Based Learning (IJGBL), 3(2), 55-76.
Messick, S. (1976). Individuality in Learning. San Francisco, CA: Jossey-Bass Publishers.
Nielsen, J. (1994). Heuristic Evaluation. In J. Nielsen and R. L. Mack (Eds.), Usability Inspection Methods (pp.25-64), New York, NY: John Wiley & Sons.
Orsmond, P., Merry, S., & Reiling, K. (1996). The importance of marking criteria in the use of peer assessment. Assessment & Evaluation in Higher Education, 21(3), 239-250.
Pask, G. (1976). Styles and strategies of learning. British Journal of Educational Psychology, 46(2), 128-148.
Pask, G. (1979). Final report of SSRC Research programme HR 2708. Richmond (Surrey): System Research Ltd.
Prensky, M. (2001). The games generations: How learners have changed. In Prensky, M. (Ed.), Digital game-based Learning (pp. 8-10). New York, NY: McGraw-Hill.
Puegphrom, P., & Chiramanee, T. (2011). The Effectiveness of Implementing Peer Assessment on Students’ Writing Proficiency. Proceedings of the 3rd International Conference on Humanities and Social Sciences, Prince of Songkla University.
Purchase, H. C. (2000). Learning about interface design through peer assessment. Assessment & Evaluation in Higher Education, 25(4), 341-352.
Qian, M., & Clark, K. R. (2016). Game-based Learning and 21st century skills: A review of recent research. Computers in Human Behavior, 63, 50-58.
Raptis, G. E., Fidas, C. A., & Avouris, N. M. (2016). Do field dependence-independence differences of game players affect performance and behaviour in cultural heritage games? In Proceedings of the 2016 Annual Symposium on Computer-Human Interaction in Play (pp. 38-43), Austin, Texas, USA.
Robertson, J., & Howells, C. (2008). Computer game design: Opportunities for successful learning. Computers & Education, 50(2), 559-578.
Riding, R. J., & Sadler‐Smith, E. (1997). Cognitive style and learning strategies: Some implications for training design. International Journal of Training and Development, 1(3), 199-208.
Riding, R., & Rayner, S. (1998). Cognitive Styles and Learning Strategies. London, UK: David Fulton Publishers.
Shneiderman, B. (1998). Eight golden rules for interface design. In Designing the User Interface: Strategies for Effective Human-Computer Interaction (3rd ed.). Reading, MA: Addison-Wesley.
Sitthiworachart, J. & Joy, M. (2003). Deepening computer programming skills by using web-based peer assessment. In Proceedings of the 4th Annual Conference of the LTSN Centre for Information and Computer Sciences.
Sluijsmans, D. (2002). Student Involvement in Assessment: The Training of Peer Assessment Skills. Doctoral thesis, Open University of the Netherlands, Heerlen. Retrieved from http://dspace.ou.nl/bitstream/1820/1034/1/dissertation%20Sluijsmans%20%202002.pdf.
Sweller, J. (1988). Cognitive load during problem solving: Effects on learning. Cognitive Science, 12(2), 257-285.
Topping, K. (1998). Peer assessment between students in colleges and universities. Review of Educational Research, 68(3), 249-276.
Wang, X. M., Hwang, G. J., Liang, Z. Y., & Wang, H. Y. (2017). Enhancing students’ computer programming performances, critical thinking awareness and attitudes towards programming: An online peer-assessment attempt. Journal of Educational Technology & Society, 20(4), 58-68.
Witkin, H. A., Dyk, R., Faterson, H. F., Goodenough, D. R., & Karp, S. A. (1962). Psychological Differentiation: Studies of development. Oxford, UK: Wiley.
Witkin, H. A., & Moore, C. A. (1974). Cognitive Style and the Teaching Learning Process. Paper presented at the 59th annual meeting of the American Educational Research Association, Chicago, IL.
Witkin, H. A., Moore, C. A., Goodenough, D. R., & Cox, P. W. (1977). Field dependent and field-independent cognitive styles and their educational implications. Review of Educational Research, 47(1), 1-64.
Wu, H. (2018). The Effects of Field Independent/Field Dependent Cognitive Styles on Incidental Vocabulary Acquisition under Reading Task. Theory and Practice in Language Studies, 8(7), 813-822.
Yeh, Y.-T., Hung, H.-T., & Hsu, Y.-J. (2017). Digital game-based learning for improving students' academic achievement, learning motivation, and willingness to communicate in an English course. In 2017 6th IIAI International Congress on Advanced Applied Informatics (IIAI-AAI) (pp. 560-563). doi: 10.1109/IIAI-AAI.2017.40.
Yen, L., Chen, C. M., & Huang, H. B. (2016). Effects of Mobile Game-Based English Vocabulary Learning APP on Learners’ Perceptions and Learning Performance: A Case Study of Taiwanese EFL Learners. In ICEL2016-Proceedings of the 11th International Conference on e-Learning: ICEl2016 (p. 255). Academic Conferences and publishing limited.
Yen, J. C., & Lee, C. Y. (2011). Exploring problem solving patterns and their impact on learning achievement in a blended learning environment. Computers & Education, 56(1), 138-145.
Yinjaroen, P., & Chiramanee, T. (2011). Peer Assessment of Oral English Proficiency. Paper presented at The 3rd International Conference on Humanities and Social Sciences. Hat Yai, Songkhla, Thailand. Retrieved from http://tar.thailis.or.th/bitstream/123456789/660/1/001.pdf
Papastergiou, M. (2009). Digital game-based learning in high school computer science education: Impact on educational effectiveness and student motivation. Computers & Education, 52(1), 1-12.
Zin, N. A. M., Jaafar, A., & Yue, W. S. (2009). Digital game-based learning (DGBL) model and development methodology for teaching history. WSEAS Transactions on Computers, 8(2), 322-333.