Gamification in Employee Selection (2019 PDF)
Document Details
Uploaded by HeartfeltGnome388
Athens University of Economics and Business
Konstantina Georgiou, Athanasios Gouras, Ioannis Nikolaou
Tags
Related
- Mapping the Implementation Framework PDF
- PalawanPay Gamified v4 (2023) PDF
- Gamified Flashcards on Blockchain Technology PDF
- Introduction to Gamification and Agile Performance Management PDF
- Gamification in Branded Sports Apps PDF
- Exploring the Relationship of a Gamified Assessment with Performance PDF
Summary
This is an academic article about gamification in employee selection. The authors explore the construct validity of a gamified assessment method based on the situational judgment test (SJT) methodology. The study examines the applicability of game elements to traditional assessment methods for evaluating candidates' soft skills.
Full Transcript
Received: 5 April 2018 | Revised: 22 March 2019 | Accepted: 25 March 2019 DOI: 10.1111/ijsa.12240 F E AT U R E A R T I C L E Gamification in employee selection: The development of a gamified assessment Konstantina Georgiou | Athanasios Gouras | Ioannis Nikolaou...
Received: 5 April 2018 | Revised: 22 March 2019 | Accepted: 25 March 2019 DOI: 10.1111/ijsa.12240 F E AT U R E A R T I C L E Gamification in employee selection: The development of a gamified assessment Konstantina Georgiou | Athanasios Gouras | Ioannis Nikolaou Department of Management Science and Technology, School of Business, Athens Abstract University of Economics and Business, Gamification has attracted increased attention among organizations and human re‐ Athens, Greece source professionals recently, as a novel and promising concept for attracting and Correspondence selecting prospective employees. In the current study, we explore the construct va‐ Konstantina Georgiou, Department of Management Science and Technology, lidity of a new gamified assessment method in employee selection that we developed School of Business, Athens University of following the situational judgement test (SJT) methodology. Our findings support the Economics and Business, 76, Patission Str., Athens GR10434, Greece. applicability of game elements into a traditional form of assessment built to assess Email: [email protected] candidates' soft skills. Specifically, our study contributes to research on gamification and employee selection exploring the construct validity of a gamified assessment method indicating that the psychometric properties of SJTs and their transformation into a gamified assessment are a suitable avenue for future research and practice in this field. KEYWORDS employee selection, gamified assessment method, situational judgement test 1 | I NTRO D U C TI O N gamification in the recruitment and selection process. Therefore, a question arises. Should researchers and professionals in Work/ New technologies, such as gamification, game‐based assessments, Organizational Psychology and Human Resource Management be and serious games, have recently attracted increased attention interested in the use and effectiveness of gamified selection meth‐ in the field of talent identification (Chamorro‐Premuzic, Akhtar, ods? Gamified selection methods might improve hiring decisions. Winsborough, & Sherman, 2017). Serious games are games de‐ For example, traditional methods used in employee selection make signed and used for a primary goal other than entertainment two inferential leaps; one about applicants' rating on multiple‐choice (Michael & Chen, 2005). In turn, gamification refers to the incorpo‐ items measuring traits and competencies and the extent to which ration of game elements into nongaming activities in any context, they possess these traits or competencies, and another between such as the workplace, giving birth to game‐based assessments, competencies and applicants' actual job performance, which gami‐ which can be classified according to the level of game character‐ fied selection methods may not make (Fetzer, McNamara, & Geimer, istics they employ from gamified assessments, such as multimedia 2017). Playing online gamified assessments might simulate situations situational judgement test (SJTs) to different styles of games, such where individuals' intentions and behaviors are shown. Depending as Candy Crush and Flight Simulator (Hawkes, Cek, & Handler, on the type of the game design and elements used in assessments, 2017). applicants' attention might be drifted from the fact that they are Gamification has been applied to employee selection settings in evaluated, showcasing thus their true behaviors and, as a result, re‐ order to make assessment methods more game‐like improving thus duce faking and/or social desirability biases (Armstrong, Landers, & applicant reactions and possibly increase the prediction of job per‐ Collmus, 2016b). Therefore, gamified selection methods might re‐ formance (Armstrong, Ferrell, Collmus, & Landers, 2016a). However, duce the traditional methods' inferential leaps, improving thus the no published studies to date have established the effectiveness of prediction of job performance. Int J Select Assess. 2019;27:91–103. wileyonlinelibrary.com/journal/ijsa © 2019 John Wiley & Sons Ltd | 91 14682389, 2019, 2, Downloaded from https://onlinelibrary.wiley.com/doi/10.1111/ijsa.12240 by Erasmus University Rotterdam Universiteitsbibliotheek, Wiley Online Library on [13/01/2025]. See the Terms and Conditions (https://onlinelibrary.wiley.com/terms-and-conditions) on Wiley Online Library for rules of use; OA articles are governed by the applicable Creative Commons License 92 | GEORGIOU et al. Since recent studies have dictated the applicability of SJTs in al., 2016b). Furthermore, Armstrong, Ferrell et al. (2016a), recently high fidelity modes, such as video, multimedia, and interactive for‐ clarified that “a gamified assessment is not a stand‐alone game, but mats (Lievens & Sackett, 2006a) and gamified contexts (Armstrong, it is instead an existing form of assessment that has been enhanced Landers et al., 2016b), the purpose of our research is to explore with the addition of game elements” (p. 672). This pertains that gam‐ the development and construct validity of a SJT assessment that ified assessments reflect an advanced level of existing typical types has been subsequently converted into a gamified assessment. of selection methods, a meta‐method that incrementally reinforces Specifically, we used gamification to gamify a previous form of as‐ the possibilities of increased job performance prediction (Lievens, sessment that we initially developed (SJT). To achieve our goal, we Peeters, & Schollaert, 2008). conducted two studies; the development and construct validity of a Recently, different types of gamified assessment methods were SJT (Study 1), and the replication of the results to a gamified version developed by various specialized companies, such as Owiwi and of the SJT and its cross‐validation (Study 2). Pymetrics, where others have focused on developing game‐based assessments (e.g., Artic Shores and cut‐e) attracting increased in‐ terest and use among organizations globally (Nikolaou, Georgiou, 2 | G A M I FI C ATI O N I N E M PLOY E E Bauer, & Truxillo, 2019). These gamified assessments might assess S E LEC TI O N an applicant's cognitive ability or judgment regarding a situation encountered in the workplace. However, gamification types in em‐ Similarly to work sample and multimedia assessment tools, gamified ployee selection vary and can include various elements, either nar‐ selection methods assess applicants' knowledge, skills, abilities, and rative, such as additional text to an online questionnaire, till highly other characteristics (KSAOs), which are supported to predict job interactive game elements, such as avatars and digital rewards performance (e.g., Lievens & De Soete, 2012; Schmidt & Hunter, (Armstrong, Ferrell et al., 2016a). For example, gamified assessments 1998). Moreover, the use of gamified selection methods might lead might include virtual worlds sharing characteristics akin to work set‐ to increased engagement and positive perceptions of the organiza‐ tings and avatars representing employees in order to assess candi‐ tion signaling that it is at the cutting edge of technology offering dates' skills and elicit job relevant behaviors (Laumer, Eckhardt, & competitive advantage in the war of talent (Fetzer et al., 2017). Chow Weitzel, 2012). Nevertheless, more research is needed to test the and Chapman (2013) have recently claimed that gamification can be effectiveness of gamified assessment methods and establish valid used effectively in the recruitment process to attract a large number and robust theoretical underpinnings to confirm their applicability in of candidates, improve organizational image and attractiveness and, human resource management and employee selection settings. as a result, positively affect applicants' job pursuit behaviors toward On the other hand, research has already supported that SJTs an organization. Game elements might also improve the selection predict job‐related behaviors above cognitive ability and personal‐ process, since it is more difficult for test‐takers to fake the assess‐ ity tests (Lievens et al., 2008). SJTs tend to determine behavioral ment, as desirable behaviors may be less obvious to individuals play‐ tendencies, assessing how an individual will behave in a certain ing the game, and as a result, improve prediction of job performance situation and are assumed to measure job and situational knowl‐ and hiring decisions (Armstrong, Landers et al., 2016b). This could be edge (Motowidlo, Dunnette, & Carter, 1990; Motowidlo, Hooper, especially the case for traditional selection methods, such as person‐ & Jackson, 2006). Additionally, several scholars have concluded ality tests, which are prone to faking undermining thus their predic‐ that SJTs can tap into a variety of constructs—ranging from prob‐ tive validity (Murphy & Dzieweczynski, 2005). The gamification of lem solving and decision‐making to interpersonal skills and they are selection methods is also likely to improve performance prediction able to measure multiple constructs at the same time (e.g., Christian, by impeding information distortion and providing better quality in‐ Edwards, & Bradley, 2010). Also recent research (Krumm et al., 2015; formation about the test‐takers (Armstrong, Landers et al., 2016b). Lievens & Motowidlo, 2016) indicated that more general domain However, this is not an inherent quality of gamification but it largely knowledge can be assessed by SJTs depending on the content of depends on the type of gamification; candidates might be less likely situations developed, leaving space to researchers and practitioners to identify the correct or desirable answer and distort their answer, to better capture general soft skills' performance and increase the either intentionally to inflate their scores or unintentionally to be targeted audience administration. socially desirable (Richman, Kiesler, Weisband, & Drasgow, 1999). Moreover, video technology has been successfully applied to Gamified assessment methods have also the potential to ex‐ SJTs increasing their effectiveness (e.g., Olson‐Buchanan, Drasgow, tract information about candidates' behavior more accurately com‐ Weekley, & Ployhart, 2006). To be more specific, the increased pared to personality inventories (Armstrong, Landers et al., 2016b). fidelity of presenting the situations in video format might lead to Specifically, contrary to personality questionnaires, they do not in‐ higher predictive validity whereas increased realism might result in clude self‐reported data. Instead of asking participants to indicate favorable applicant reactions (Lievens & Sackett, 2006b). Oostrom, their agreement to various statements, they can assess gameplay Born, Serlie, and Molen (2010) supported that an open‐ended web‐ behaviors to measure candidates' skills. These gameplay behaviors cam SJT, utilizing a webcam instead of a static video recorder to can be job related, and as a result, they might predict future work be‐ capture the responses of participants, predicts job placement suc‐ havior more accurately than questionnaires (Armstrong, Landers et cess. Rockstuhl, Ang, Ng, Lievens, and Dyne (2015) endeavored to 14682389, 2019, 2, Downloaded from https://onlinelibrary.wiley.com/doi/10.1111/ijsa.12240 by Erasmus University Rotterdam Universiteitsbibliotheek, Wiley Online Library on [13/01/2025]. See the Terms and Conditions (https://onlinelibrary.wiley.com/terms-and-conditions) on Wiley Online Library for rules of use; OA articles are governed by the applicable Creative Commons License GEORGIOU et al. | 93 predict task performance and interpersonal OCB by expanding the or psychometric tests. Moreover, many authors claim that the diffi‐ traditional SJT paradigm to multimedia, implementing it across dif‐ culty in transferring and assessing soft skills, compared to hard skills ferent cultural samples. In both cases, additional game elements in (e.g., technical or business‐related knowledge and skills), results in SJTs, such as a webcamera and video‐based vignettes, respectively, increased waste of time and money for organizations (e.g., Laker & contributed to better prediction of performance providing support Powell, 2011), accounting for our focus on soft skills and need to in this practice as a promising method for personnel selection. More use an assessment that may provide better quality information about recently, Lievens (2017) suggested that webcam SJTs seem to be candidates' behavior on the job. For example, Kubisiak, Stewart, a promising approach in understanding intra‐individual variability Thornbury, and Moye (2014) employed self‐report surveys to assess in controlled settings, by combining the procedural knowledge and willingness to learn and a gamified simulation to assess ability to the expressed behavior. It is also suggested that incorporating game learn, concluding that a gamified assessment can be used to assess elements into an existing practice in HR might have higher return predictor constructs in a selection context, where survey method‐ on investment for an organization than developing a whole new ology may not be adequate. Similarly, since resilience, ability, flexi‐ digital game (Landers, 2014). Considering the psychometric qual‐ bility, and decision‐making do not address intentions but behaviors, ities of SJTs (McDaniel, Hartman, Whetzel, & Grubb, 2007) along a gamified assessment might be better employed to measure these with the performance results when integrated with multimedia and important attributes among job applicants. game elements, the gamification of SJTs seems to be an appropriate Subsequently, we chose the type of gamification to employ. method to follow. Armstrong, Ferrell et al. (2016a, p. 672), recently There is still limited research in human resources management and emphasized the role of gamification as “especially valuable to prac‐ work/organizational psychology literature on gamification in selec‐ titioners in an era moving toward business‐to‐consumer (B2C) as‐ tion and assessment but there are recommendations for researchers sessment models” which is highly applicable for our research. Taking to approach gamified assessment addressing which game elements the above into consideration, we have chosen SJTs as the most ap‐ might affect and in what way the assessment outcomes (Armstrong, propriate methodology to develop and then, convert it into a new Ferrell et al., 2016a). Drawing from the taxonomy of gamification gamified assessment. To establish the effectiveness of the gamified elements for use in educational contexts (Dicheva, Dichev, Agre, selection method, we will initially explore the construct validity of & Angelova, 2015), we gamified the SJT assessment in respect to a new SJT and the replication of the results to a gamified version of the following gamification design principles: engagement, feedback, the test. progress, freedom of choice, and storytelling. Although there are fundamental differences between game‐based learning and gami‐ fied assessments, as the objective in learning is to motivate not to 3 | G A M I FI E D A S S E S S M E NT measure, the common gamification principles of game‐based learn‐ D E V E LO PM E NT ing might also be appropriate for selection (Hawkes et al., 2017). Dicheva et al. (2015) reviewed previous studies on the application Our aim was to gamify an assessment method that would support of gamification in education and mapped the context of application organizations to map out prospective employees' soft skills. We first and game elements used. The game elements are conceptualized as need to identify the most common core competencies and skills organ‐ the gamification design principles with the game mechanics that are izations often seek from their employees, especially when recruiting in typically used to implement them. For example, the game mecha‐ graduate trainee and entry‐level positions. For example, adaptability, nisms that are used for the principles of engagement and feedback flexibility, learning agility, knowledge breadth, and multicultural per‐ might be avatars (e.g., Deterding, Björk, Nacke, Dixon, & Lawley, spective have often been described as key competencies for employ‐ 2013), immediate rewards instead of vague long‐term benefits (e.g., ability across several stakeholder groups (e.g., Gray, 2016, “The digital Zichermann & Cunningham, 2011), and immediate or cycles of feed‐ future of work”, 2017; Robles, 2012). Moreover, among the most com‐ back (e.g., Nah, Zeng, Telaprolu, Ayyappa, & Eschenbrenner, 2014). mon skills that individuals may use in several job positions are decision‐ In addition, the progress principle is achieved by using a progress bar making, flexibility, and ability to work under pressure, whereas on the or points and levels (e.g., Zichermann & Cunningham, 2011), while other hand, employers often face difficulty in locating young graduates storytelling by using avatars (e.g., Nah et al., 2014), visual and voice possessing soft skills, such as resilience and teamwork (Clarke, 2016). overs. Finally, among the most common gamification design prin‐ Following an extensive search of the literature and research on gradu‐ ciples in educational settings, is freedom of choice (Dicheva et al., ate employability, we selected four of the skills that seemed to become 2015), which in a gamified assessment, may relate to how players more and more relevant in today's demanding work environments (re‐ interact with the game as well as to other choices players may make, silience, adaptability, flexibility, and decision‐making) to form initially for example, whether they can skip a level, leave the game at any the SJT and subsequently the gamified assessment's dimensions. time, save it and return later, and so on (Hawkes et al., 2017). We believe that these skills, which have been identified as key We gamified the SJT assessment by using game mechanics that transferable soft skills integral to graduate employability (Andrews & serve those principles. For example, in the beginning of the assess‐ Higson, 2008), are more suitable to be assessed through a gamified ment, test‐takers select a play hero/avatar. Every crew member assessment than traditional selection methods, such as interviewing appearing in the gamified assessment has a backstory. The story 14682389, 2019, 2, Downloaded from https://onlinelibrary.wiley.com/doi/10.1111/ijsa.12240 by Erasmus University Rotterdam Universiteitsbibliotheek, Wiley Online Library on [13/01/2025]. See the Terms and Conditions (https://onlinelibrary.wiley.com/terms-and-conditions) on Wiley Online Library for rules of use; OA articles are governed by the applicable Creative Commons License 94 | GEORGIOU et al. follows the journey of play heroes in four islands, one for each soft 4.2 | Measures skill assessed. Storytelling/narration takes place using visual and voice overs while playing the “game.” We employed narration and 4.2.1 | SJT measurement fantasy in the gamified assessment to bring in engagement, mean‐ Twenty‐five scenarios accompanied with four response options ing and clear calls to action, showing test‐takers how to get on a describing (a) Resilience, (b) Adaptability, (c) Flexibility, and (d) path, in other words, to respond to scenarios. We could have used Decision‐Making situations have been developed. Each scenario is narration reflecting the real word but this could possibly have not accompanied by a scoring key, indicating the correct, wrong, and the emotional advantages of fantasy and adventurous stories that neutral alternatives. The participant should indicate which alterna‐ keep people engaged. On the other hand, according to Malone and tive serves as correct and wrong in each situation. Every correct Lepper (1987), fantasy is described as one of the key reasons users choice gave +1 point to the test taker and −1 for the wrong choice, appreciate in a game and as one of the most important features of 0 points were given to the other two options. Each participant re‐ games raising a player's imagination. ceived four separate scores, one for each scale, which derives from There is also a visual progression bar showing the progress in summing up the individual scenario scores. A sample scenario of the the assessment as well as story troubleshooting mechanisms and SJT is presented in the Appendix. voice overs to remind users what the interface does and how to In order to explore the construct validity of the SJT measure, play the “game.” There is also a world map showing the islands the assessing the four constructs, we used the following measures. players progress through. Rewards given to test‐takers are intrin‐ sic, by successfully completing the missions/solving the scenarios, 4.2.1.1 | Resilience and extrinsic, by receiving a report including feedback on player's We used the Resilience Scale of Wagnild and Young (1993) which competencies following the completion of the “game.” Test tak‐ contains 25 items, all of which are measured on a 7‐point scale from ers have freedom of choice as well as they choose their avatar, 1 (strongly disagree) to 7 (strongly agree). An example item is: “When they can skip the narrative and can leave the “game” anytime and I make plans I follow through with them.” The alpha reliability of the continue from where they left off. Finally, a fine balance is kept scale was 0.89. between assessment and game mechanics to make it as fun and engaging as possible but without alienating nongamers from the 4.2.1.2 | Adaptability experience and discriminate against them by making it fair and pro‐ We used the scale developed by Martin, Nejad, Colmar, and Liem viding equal opportunities to all. In an adventure story setting, it (2012) consisting of nine items. Each item is measured on a 1 might be more difficult to ascertain the context of the question (“strongly disagree”) to 7 (“strongly agree”) scale. An example item is: making the candidate think twice and to respond with a more rep‐ “I am able to think through a number of possible options to assist me resentative answer, while candidates' interest in the assessment in a new situation.” The alpha reliability of the scale was 0.89. might be increased. 4.2.1.3 | Flexibility Flexibility was measured using the HEXACO Personality Inventory 4 | M E TH O D (Lee & Ashton, 2004), which contains 10 items measured on a 5‐ point scale, from 1 (“strongly disagree”) to 5 (“strongly agree”) scale. 4.1 | Samples An example item is: “I react strongly to criticism.” The alpha reliabil‐ Initially, 20 experienced HR professionals in employee selec‐ ity of the scale was 0.74. tion and assessment from various hierarchical levels (directors, managers, and recruiters), based in Athens, Greece, were inter‐ 4.2.1.4 | Decision‐making viewed during the development phase of the SJT. Also, seven For the assessment of decision‐making skills, researchers adopted HR professionals served as experts to determine the scoring Mincemoyer and Perkin's (2003) measure which assesses factors, key of the new SJT. For face validation purposes, another group such as “define the problem; generate alternatives; check risks and of eight HR practitioners completed the SJT. Additionally, 321 consequences of choices; select an alternative; and evaluate the business schools' students and graduates (61% female) with a decision.” The response category for each question was a 5‐point mean age of 26.5 years old (SD: 5.4 years) and educational level: Likert‐type scale (1 = never to 5 = always) designed to determine 42% bachelor's degree, 41% master's degree, were employed as frequency of use. An example item is: “I easily identify my problem.” the construct validity and confirmatory factor analysis sample. The alpha reliability of the scale was 0.77. For conducting the replication of the gamified SJT, we gathered 410 employees or job seekers (46% female) on top of the 321 4.3 | Procedure test takers of the previous step, with an average age of 29 years old (SD: 7.4 years), 72% of which had a bachelor's or master's We developed a SJT assessing the four competencies (resilience, degree. adaptability, flexibility and decision‐making) following the guidelines 14682389, 2019, 2, Downloaded from https://onlinelibrary.wiley.com/doi/10.1111/ijsa.12240 by Erasmus University Rotterdam Universiteitsbibliotheek, Wiley Online Library on [13/01/2025]. See the Terms and Conditions (https://onlinelibrary.wiley.com/terms-and-conditions) on Wiley Online Library for rules of use; OA articles are governed by the applicable Creative Commons License GEORGIOU et al. | 95 suggested by Motowidlo et al. (1990). The content of SJT's situa‐ facets as dependent variables, controlling for age and gender. The tions and response options was first developed followed by an itera‐ results presented in Table 1 provide evidence of convergent and dis‐ tive procedure of face validation and construct validity assessment. criminant validity of the SJT. At this stage, the SJT's scenarios along with measures of Resilience, More specifically, the resilience SJT facet is related to the resil‐ Adaptability, Flexibility, and Decision‐Making have been administered ience scale at a significant but mediocre level (β = 0.350, p < 0.01), to business school students and graduates. Then, the SJT's scenarios and so do the decision‐making scale (β = 0.104, p < 0.05) and flexibil‐ were converted into adventure scenarios around a common story by ity (β = −0.140, p < 0.05). On the other hand, the SJT flexibility mea‐ a English‐speaking professional writer. The professional writer con‐ surement regression coefficients are statistically significant with the verted the four competencies into “islands of adventure,” then, au‐ HEXACO personality inventory measuring flexibility (β = 0.366, p < thors thoroughly examined the content of the converted scenarios 0.01) and adaptability scale (β = 0.166, p < 0.05). SJT Adaptability is to ensure correspondence. A sample scenario of the gamified assess‐ related only to the adaptability scale (β = 0.166, p < 0.01) and the SJT ment is presented in the Appendix, where the players wonder around Decision‐Making facet with decision‐making (β = 0.389, p < 0.01), the islands responding to how they would most likely and least likely flexibility (β = −0.114, p < 0.05) and resilience (β = 0.202, p < 0.01) behave to particular instances. The mission of the gamified assess‐ scales, respectively. Some of the facets of SJT are cross‐correlated ment is to respond to all situations/scenarios by indicating what is with other measurements; however, the magnitude is low sufficing most likely and least likely to do. Having established a robust SJT evidence for discriminant validity. To further establish convergent measurement and a gamified equivalent, we proceeded to first ensure validity on the same sample (N = 321), we conducted CFA (Bentler, the construct validity of the gamified SJT and second to verify the lack 2004) with maximum likelihood estimation and robust statistics to of systematic variance between the two different modes of testing address nonnormality of data and fit indexes, as recommended by (SJT and gamified SJT). Therefore, we administered the gamified SJT Hu and Bentler (1999). More specifically, a value of >0.90 for com‐ to employees and job seekers for validation purposes. As a result, a parative fit index (CFI) and normed fit index (NFI) and a value of fully functional gamified selection and assessment approach has been