
Introduction

Virtual reality (VR) bears the potential to disrupt today's education (Allcoat and Von Mühlenen 2018; Checa and Bustillo 2020; Radianti et al. 2020). It allows students to experience and engage in virtual scenarios as if they were there (Jerald 2016; Rupp et al. 2019). VR allows educators to deliver learning content in a more immersive way than slides, texts, or 2D videos. A promising VR application for students in higher education is to increase awareness of possible biases in potential future workplace situations (Innocenti et al. 2012; Salmanowitz 2018), such as judging applicants’ competences in a recruitment interview. The main task of the rater in a recruitment interview is to judge the competences of an applicant objectively (Dipboye 2017). Research on situations other than job interviews has shown that people are influenced by their environment (Cialdini 2001). One important aspect is social influence (see Cialdini and Goldstein 2004, for an overview), which often occurs through nonverbal behavior (Guyer et al. 2019). To our knowledge, no previous study has investigated whether the nonverbal behavior of interviewers influences observers who must rate the competences of a job applicant in a recruitment interview. To close this gap, we conducted a study in which participants observed a recruitment interview and rated three competences of the interviewed person (customer care, behavior in a team, and sales skill). We used a 2 × 2 design with the nonverbal behavior (positive vs. negative) of the interviewer and display type (VR-cardboard vs. 2D screen display) as between-subjects factors. In the remainder of the introduction, we present the educational background of this study and summarize previous research on job interviews, social influence through nonverbal behavior, and immersive VR versus 2D displays before explaining the current study and the expected results.

Educational background of this study

Traditional classroom-based lecturing uses slides, texts, and 2D videos. The use of VR can increase engagement and enjoyment, promote self-learning, and enable multisensory learning (Papanastasiou et al. 2019). This is particularly useful for theories and concepts that are related to perception, action, and decision making in real-life situations (for further potentials of VR-use in education see, e.g., Freina and Ott 2015; Martín-Gutiérrez et al. 2017; Jensen and Konradsen 2018; Markowitz et al. 2018; Radianti et al. 2020; Wu et al. 2020). We applied a total of four interactive VR modules in a lecture on general psychology for first-semester undergraduate psychology students. With the first VR module, the students were introduced to VR technology by interactively mixing colored lights in a dark room and viewing different colored objects in a virtual environment. In a session two weeks later, we introduced two subsequent VR modules to demonstrate biases of distance perceptions under various environmental conditions (e.g., daylight vs. night; detailed vs. limited visual context). The current study evaluates the fourth VR module, which was deployed again two weeks later. Its goal was to raise awareness of social influence bias by nonverbal behavior in a job recruitment interview. This module combines the topics of cognitive interpretations of perceived visual and auditive stimuli with a topic relevant to the students’ future careers (occupational, organizational, and business psychology students). Furthermore, the VR module set up for the study served multiple purposes in the class-based lecture. The students’ ratings of the evaluated competences were summarized as graphs in real time and used to discuss rating biases and their implications. Additionally, the disclosure and explanation of the study design introduced the students to key elements of experimental designs. Finally, students received first-hand experience using VR in a non-game application.

Job recruitment interviews and nonverbal behavior

Job recruitment interviews continue to be one of the most widely used methods for evaluating the fit between potential applicants’ competences and job requirements (Arvey and Campion 1982; Schmidt and Hunter 1998; Ryan et al. 1999; Salgado et al. 2004; Macan 2009). In fact, “the interview is the most-used procedure in personnel selection across all countries. Job applicants expect to be interviewed; managers expect to interview some or all candidates” (Viswesvaran and Ones 2018, p. 460). For example, a survey conducted among human resources (HR) personnel in the German-speaking part of Switzerland revealed that 99% of the respondents included a form of interview in their selection processes, while 71% of these interviews were at least semistructured (König et al. 2010, p. 23). A meta-analysis by Hausknecht et al. (2004) revealed that candidates also perceive interviews favorably compared to alternative selection methods (see also Dipboye et al. 2012). However, job recruitment interviews differ in several aspects. One important aspect is standardization across candidates regarding the questions asked and the evaluation (Campion et al. 1997). A structured interview enhances reliability and predictive validity (e.g., Campion et al. 1988; McDaniel et al. 1994; Schmidt and Hunter 1998). Therefore, structured interviews are often the preferred choice in practice. Interview formats with two or more interviewers/observers (also called panel interviews) enhance the accuracy of the assessment (Huffcutt et al. 2013). Moreover, applicants perceive interviews with more than one interviewer as fairer (Blackman 2017, p. 187).

An essential part of communication and human interaction is nonverbal (see Bonaccio et al. 2016, for an overview in the workplace context). Nonverbal behavior is an effective means of social influence: it affects recipients’ attitudes, behaviors, and beliefs (e.g., Guyer et al. 2019). Nonverbal cues are mostly emitted unintentionally and unconsciously. Visually observable positive nonverbal behaviors are related, for example, to eye gaze, nodding of the head in agreement, and smiling, while negative nonverbal cues can be expressed with a furrowed brow, stern look, and tightening of the mouth (Butler and Geis 1990, p. 49; Owen et al. 2012, p. 366). Whereas many studies have analyzed the effects of applicants’ nonverbal behavior on interviewers’ ratings (e.g., Parsons and Liden 1984; Gifford et al. 1985; DeGroot and Gooty 2009; Huffcutt et al. 2011, for a more advanced model), only a few studies have investigated the effects of interviewers’ behavior on the applicant (e.g., Liden et al. 1993). To our knowledge, no study has investigated whether social influence through the nonverbal behavior of interviewers influences fellow interviewers or raters.

Display types and perceived immersion

VR enables users to immerse themselves in and experience virtual content as if they were there (Jerald 2016; Lanier 2017; Slater 2018). VR offers new ways of studying human perception and behavior (e.g., Blascovich et al. 2002; Pan and Hamilton 2018). Highly realistic VR content includes 360-degree videos, also called spherical or immersive videos (e.g., Violante et al. 2019). This type of content has limited natural interaction capabilities typically associated with immersive VR. However, 360-degree videos allow users to look around at all angles from the location where the 360-degree camera captured the footage. In addition to experiencing 360-degree videos on an immersive VR headset (e.g., VR-cardboard), users can view the content on a 2D screen display (e.g., notebook or tablet display). Instead of simply turning the head naturally to change perspective in VR, the viewing angle is controlled on a 2D screen display using the computer mouse (notebook computer) or a touch screen (tablet computer). Immersive VR has the advantage that visual distractions from the user’s real environment are not perceived. The user’s visual focus remains on the virtual world.

Only a few studies have compared immersive VR and traditional 2D screen displays. For example, Schutte and Stilinović (2017) found that VR evoked significantly more empathy than 2D screen displays when participants watched a documentary about a girl in a refugee camp. Niu et al. (2019) compared subjective emotions evoked by short film sequences in immersive VR and 2D screen displays and found that immersive VR caused significantly higher arousal (intensity of the emotion) than the same sequences viewed on a 2D screen display. Research comparing VR to real-world experiences has shown the high ecological validity of VR. For example, watching a 360-degree video of a scenic landscape in immersive VR evoked similar emotions as seeing it in reality (Chirico and Gaggioli 2019). Kinateder and Warren (2016) studied socially influenced human behavior in evacuation scenarios. Their findings likewise showed similar behaviors in immersive VR and in real life, further supporting the ecological validity of immersive VR. Other studies compared job interviews conducted as videoconferences on 2D screen displays and conducted face-to-face. For example, Sears et al. (2013) and Blacksmith et al. (2016) found that applicants received better interview ratings in face-to-face interviews. These authors argue that technology use limits the perception of interpersonal skills, such as effectively reading nonverbal cues. Overall, studies suggest that nonverbal cues are less effective in 2D screen video-mediated communications than in the face-to-face equivalent (e.g., Doherty-Sneddon et al. 1997; Bos et al. 2002; Proost et al. 2020).

One reason for the better performance of VR compared to 2D screen displays could be the degree of immersion (e.g., Kumar et al. 2017). From a technical perspective, immersion in VR can be described as “the extent to which the computer displays are capable of delivering an inclusive, extensive, surrounding and vivid illusion of reality to the senses of a human participant” (Slater and Wilbur 1997, p. 604ff). However, more recent research characterizes immersion as a psychological state that “people attain when they are intensely involved in an activity and are experiencing a high level of enjoyment and fulfillment” (Cheng et al. 2015, p. 234ff). Based on game immersion theory, Brown and Cairns (2004) operationalized immersion on three levels (engagement, engrossment, and total immersion). Each level has specific barriers that players need to overcome before reaching the next stage. Cheng et al. (2015) refined this concept for video games and learning applications and shaped subfactors for each of the three levels. They described engagement as the first stage of immersion: users are interested in the topic and style of the application. The second stage is engrossment, in which users focus on the application’s content and gradually lose awareness of their environment and self-perception. Total immersion represents the third and final stage: users experience feelings of presence (a sense of actually being there) and detach entirely from reality (Cheng et al. 2015, p. 235). Existing questionnaires measuring aspects of immersion for VR were typically designed to evaluate fully interactive VR applications (e.g., Sutcliffe et al. 2005) in immersive VR only (see also Suh and Prophet 2018; Schwind et al. 2019, for overviews of different scales). Only a few studies have evaluated aspects of immersion based on game immersion theory using VR or comparing VR to other media. 
However, a few recent publications have successfully applied Georgiou and Kyza’s (2017) augmented reality immersion (ARI) questionnaire (Voit et al. 2019) or parts of it (Gallup et al. 2019; Tennant et al. 2020). Thus, in the current study, we slightly adapted and applied the ARI questionnaire (see Method for details).

Current study

We investigated whether the nonverbal behavior of an interviewer influences the competence ratings of observers in a recruitment interview using 360-degree videos. We used a 2 × 2 design with the nonverbal behavior (positive vs. negative) of the interviewer and display type (VR-cardboard vs. 2D screen display) as between-subjects factors. Participants acted as interview observers who had to rate the applicant’s competences (behavior in a team, customer care, sales skill) in the recruitment interview. To assess perceived immersion depending on the display type (VR-cardboard vs. 2D screen), participants completed the ARI questionnaire (Georgiou and Kyza 2017). Additionally, they rated their overall satisfaction with the experience.

Research questions and expected results

Based on the literature reviewed above, we defined our research questions and hypotheses: First, does the nonverbal behavior of interviewers influence the competence ratings of observers in recruitment interviews? We expected that positive nonverbal behavior of the interviewer would lead to higher competence ratings than negative nonverbal behavior (and vice versa; e.g., Cialdini and Goldstein 2004; Cialdini 2007). Second, does display type (VR-cardboard vs. 2D screen display) influence the competence ratings of observers in recruitment interviews? Previous studies reported higher performance ratings for face-to-face interviews compared to technology-mediated interviews (Sears et al. 2013; Blacksmith et al. 2016). However, to our knowledge, no previous study has examined whether display type (VR-cardboard vs. 2D screen display) influences the competence ratings of observers in recruitment interviews. Therefore, we did not formulate a hypothesis for the second research question. Third, is there an interaction between nonverbal behavior and display type on the competence ratings of observers in recruitment interviews? Because research on this question is also scarce, we did not formulate a hypothesis for this third research question. Fourth, does VR result in higher immersion than 2D screen displays? The first two stages of immersion (engagement, engrossment) focus mainly on content- and application-related topics (Brown and Cairns 2004; Cheng et al. 2015), which were identical for the immersive VR and 2D screen display conditions. Thus, we expected to find higher perceived immersion on the subscales of flow and presence (both related to total immersion) for VR because the technology is capable of making users experience a feeling of being there and detaching from reality (Slater and Wilbur 1997).
Fifth, can the ARI questionnaire (Georgiou and Kyza 2017) be applied to compare 360-degree videos displayed with VR versus displaying them on 2D screens? We expected that applying the ARI scales would differentiate between immersive VR and 2D screen display (construct validity) on the relevant subscales of total immersion and that the ARI scales would show good reliability.

Method

Participants

One hundred and seventeen first-semester psychology students participated simultaneously in the experiment during the General Psychology I lecture. Participation was voluntary and took place during the scheduled lecture time. All participants gave informed consent before starting the experiment. Seven students did not complete the questionnaire due to technical issues during the experiment (e.g., smartphone incompatibility with the VR-cardboard, low battery); we therefore excluded their data. The final sample consisted of 110 participants (74% female; age: M = 24.1 years, SD = 4.6). All had previously experienced the VR-cardboard in the same lecture two weeks before the experiment, completing one module on color and two modules on distance perception. The study complied with the American Psychological Association Code of Ethics and was conducted in accordance with the European General Data Protection Regulation.

Experimental design

We used a 2 × 2 design, with the interviewer’s nonverbal behavior (positive vs. negative) and display type (VR-cardboard vs. 2D screen display) as between-subjects factors. The participants rated three competences of the applicant (behavior in a team, customer care, and sales skill) based on 360-degree videos of a job recruitment interview. Finally, the participants completed the ARI questionnaire (Georgiou and Kyza 2017) on perceived immersion and answered one item on overall satisfaction with the experience.

Procedure

The experiment was conducted in a lecture. We welcomed the students and explained that they would be observers of a job interview to provide competence ratings of a job applicant. We told the participants that they would be randomly distributed to the VR-cardboard or the 2D screen condition and that they could experience the experiment later at home using both display types if they wished to do so. All participants were asked to remain seated at their table in the classroom during the experiment. We explained to the participants that they could stop the experiment anytime and should stop the experiment immediately if they felt any discomfort (this was important for the immersive VR conditions because, in rare cases, participants can feel adverse effects such as disorientation, oculomotor issues, or visually induced motion sickness; e.g., Lubeck et al. 2015; Kim et al. 2018). To our knowledge, no participant did so; all participants without technical difficulties completed the experiment. Six persons were available on-site to help in case of technical issues. After the instructions, the participants logged into a web-based survey and education platform (developed by Waimanoo 2020) using their notebook or tablet computer and an anonymous ID. The platform randomly distributed the participants to four groups: 2D screen display with positive nonverbal behavior, 2D screen display with negative nonverbal behavior, VR-cardboard with positive nonverbal behavior, and VR-cardboard with negative nonverbal behavior. All participants were then asked to connect their headphones to the device (notebook, tablet, or smartphone) and adjust the audio volume to a comfortable level based on an audio sample. Participants in the immersive VR condition then mounted their smartphone on their VR-cardboard (all students had received a Google cardboard V2, Google 2020a, in the previous lecture two weeks earlier). The experiment started for all students simultaneously (Fig. 1).

Fig. 1

Photograph of the experiment in the classroom lecture


In the VR-cardboard conditions, participants could change the viewpoint by moving their head. Interaction elements, such as buttons, were activated by either maintaining the look-at position for three seconds (also called “fusing”; with a cursor animation; Google 2020b) or clicking a physical button positioned at the top right-hand side of the VR-cardboard. To ensure easy usability for both display types and keep the interaction modalities comparable, in the 2D screen display condition, participants could move their viewpoint with a mouse or touchscreen (click-and-drag). They could select buttons with a screen-centered cursor. A click on the left mouse button, a tap on the screen (tablets), or the keyboard’s enter key was used to activate a button.

Once ready, a click on an “OK” button started the experiment with the first 360-degree video (introduction video; Fig. 2a), welcoming the applicant into the room and introducing the interview setting and the persons involved. Then, three video sequences were displayed (one for each competence; Fig. 2b, for an example); the order of these sequences was counterbalanced across participants using Latin square (Jones and Kenward 2015, p. 140). After each interview sequence, the video faded out again, and instructions appeared to rate the corresponding competence (Fig. 2c, for an example). Once answered, a click on an “OK” button started the next video sequence until all three competences were rated. A brief closing video represented the end of the interview setting. Participants in the VR-cardboard condition were asked to remove the VR-cardboard and continue on their notebook or tablet computer. Subsequently, all participants completed the ARI questionnaire (Georgiou and Kyza 2017) and rated an extra item on the overall satisfaction with the experience on their notebook or tablet computer.
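The Latin-square counterbalancing of the three video sequences can be sketched as follows. This is a minimal illustration of the technique, not the study's actual implementation; the sequence labels and the round-robin participant assignment are our own assumptions:

```python
def latin_square(items):
    """Build a cyclic Latin square: every item appears exactly once in
    each row (one presentation order) and once at each serial position
    across rows, so position effects are balanced over participants."""
    n = len(items)
    return [[items[(row + col) % n] for col in range(n)] for row in range(n)]

# Hypothetical labels for the three competence video sequences
sequences = ["behavior_in_a_team", "customer_care", "sales_skill"]
orders = latin_square(sequences)

def order_for(participant_id):
    """Round-robin assignment of participants to the three orders."""
    return orders[participant_id % len(orders)]

for pid in range(3):
    print(pid, order_for(pid))
```

Note that a single cyclic square balances serial position but not first-order carryover effects; designs that must also balance which sequence follows which typically combine a square with its mirror image (Jones and Kenward 2015).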

Fig. 2

Screenshots of the 360-degree videos used in the study. a Picture of the introduction video showing the job applicant entering the room. b Example picture of the interview. c Question and ten-point Likert scale for rating the competence behavior in a team


Materials

As visual and auditory stimuli, multiple 360-degree videos were created in cooperation with Waimanoo (2020) to represent realistic video sequences of a recruiting interview. The 360-degree camera captured the videos from the natural viewing position of the rater, that is, the actual participant (Fig. 2). We kept the camera position fixed across all clips to reduce the risk of simulator sickness (e.g., Lubeck et al. 2015). To keep the participants’ attention on the relevant actions, we kept the job interview setting’s surroundings simple (Fig. 3), with a minimum of visual distractions (Pirker et al. 2020). The interview setting was a generic office environment consisting of a desk, a fictional company logo, a plant, and a round table with the three protagonists: the interviewer, the applicant, and the rater. From the rater’s perspective, this setup allowed the participants to observe the interviewer and applicant simultaneously, even with the limited horizontal field of view of a VR-cardboard (between 60° and 100°, depending on the specific smartphone used; Powell et al. 2016).

Fig. 3

Layout of the interview setting. The interviewer, the job applicant, and the rater were located around a table. The 360-degree camera was placed at the position of the rater’s chair. The room’s setting was kept simple, with a minimum of visual distractors


We captured the video footage with a dual-lens GoPro Fusion camera and then stitched the content to monoscopic, spherical 360-degree videos (for a general overview of typical capturing workflows, see Feurstein 2019). An Olympus LS-P4 recorded the audio. The separate video and audio materials were merged and edited (for illustration, see Fig. 4) using Adobe Premiere Pro version 13.1.3. Furthermore, we aligned the center of the “primary action” (Pirker et al. 2020, p. 323) to the viewer’s forward view. To minimize potential technical issues with the Wi-Fi bandwidth in the classroom and the limited cache on the students’ devices, we lowered the resolution to 1920 × 960 pixels and set the target bitrate to 5 Mbps, a compromise between good image quality and reliable playback on the variety of student devices used during the experiment.

Fig. 4

Editing process of the 360-degree video footage. All scenes were recorded multiple times. The content was then assembled and edited using Adobe Premiere Pro’s masking functions (red frame; depending on the nonverbal behavior). In the final videos, only the visual and auditive footage of the interviewer differed between the positive and negative nonverbal behavior conditions


We implemented the scenario of a job applicant applying for a salesperson position in an electronic retail store. We chose this scenario because first-semester students in higher education have typically had experiences with salespersons in such stores; thus, we assumed they could imagine an appropriate job requirement profile. In addition to technical knowledge, salespersons should also exhibit competences in other categories, such as working well in a team, interacting with customers, and sales skills (for a meta-analysis of the determinants of salespersons’ performance, see Churchill et al. 1985; for a study analyzing situational interviews for a sales-related job, Weekley and Gier 1987; for a detailed job analysis for sales-related jobs, Phillips 1992; for a study evaluating the skill set of sales-related jobs, Rentz et al. 2002). Accordingly, we chose one specific competence for each of these three categories: behavior in a team, customer care, and sales skill.

The first 360-degree video served as an introduction (the same video for all test conditions) and aimed to familiarize the rater with the virtual scene and build rapport in the interview setting (e.g., Barrick et al. 2010; Swider et al. 2016). The interviewer (female; professional theater actress) welcomed the participant and explained the interview procedure. Then, a door buzzer sounded, the door opened, and the interviewer welcomed the applicant (male; professional theater actor). After they both sat down at a table, the interviewer briefed the applicant and the rater on the procedure. This introduction video ended after 1 min 32 s. The 360-degree content blurred out, and written text appeared with instructions for the participant to focus on the upcoming competence rating. A click on the “OK” button started the video sequence for the first competence rating. For each of the three competences (each video sequence was 42 s), the interviewer asked one of the following questions.

Behavior in a team: For this competence, the interviewer asked a situational question derived from Campion et al. (1994, p. 999): “Suppose you had an idea for a change in work procedure to enhance quality, but there was a problem in that some members of your work team were against any type of change. What would you do in this situation?”.

Customer care: This question was derived from an example noted in Weekley and Gier (1987, p. 485): “A customer comes into the store to pick up a watch he had left for repair. The repair was supposed to have been completed a week ago, but the watch is not yet back from the repair shop. The customer becomes very angry. How would you handle this situation?” For a better fit in the chosen scenario, we replaced the term “watch” with “smartphone.”

Sales skill: The interviewer’s question posed for this competence was derived from Campion et al. (1997, p. 668): “Suppose you were giving a sales presentation and a difficult technical question arose that you could not answer. What would you do?”.

The applicant’s reply to the question was the same in all four conditions within every competence. We defined and evaluated the replies to each competence question with HR experts. The applicant’s replies corresponded to answers rated as average and contained no salient cues related to high or low performers. For each competence, we edited two versions (positive nonverbal behavior of the interviewer, negative nonverbal behavior of the interviewer) using Adobe Premiere Pro’s masking functions. Only the interviewer’s visual and audio footage differed between the two versions, while all other content remained the same. The interviewer responded to the applicant’s reply by giving nonverbal cues only. For the positive nonverbal behavior condition, the interviewer showed facial expressions (for an overview on facial expressions, see Cohn et al. 2007), such as cheek raising, head up, and head down, along with other nonverbal visual cues of pleasure (for studies applying these cues, see Butler and Geis 1990; Owen et al. 2012), such as friendly eye gaze to the rater, nodding, and authentic smiles (see Fig. 4, top right). For the negative condition, the interviewer showed facial expressions such as brow lowering, chin raising, and lip tightening, along with other nonverbal cues of displeasure, such as giving stern looks to the rater, frowning, or humming quietly (see Fig. 4, bottom right). To keep the credibility of the interviewer as perceived by the observer consistent (Williamson et al. 2013), we maintained the intensity and number of nonverbal indications across all conditions (Borkenau and Liebler 1995; Larsen and Shackelford 1996; for examples, see Fig. 5 in Appendix).

In the closing video (length: 30 s), the interviewer said goodbye to the applicant, who left the room. The interviewer told the participant (rater) to continue on his/her notebook or tablet computer for the final questionnaire. The total length of the 360-degree videos per participant was 4 min 18 s. For the immersive VR conditions, the total time wearing a VR-cardboard was under 8 min (avoiding potential adverse effects on the participants’ ocular system; see, e.g., Turnbull and Phillips 2017; Elias et al. 2019), including reading instructions and rating the competences.

All video sequences were evaluated by three experts (one professor and two research scientists) working in the field of personnel assessment. They unanimously confirmed that the videos corresponded to a realistic interview setting, the interviewer asked realistic questions, the applicant’s replies were short but realistic, and that the nonverbal behavior of the interviewer was clearly identifiable as either positive or negative. The experts also mentioned that professional interviewers would often pose follow-up questions and that their facial expressions would be less pronounced.

Measures

Competence ratings: We asked the participants to rate the applicant’s competences (behavior in a team, customer care, sales skill) directly after the presentation of the specific video sequence on a ten-point Likert scale ranging from “low” (1) to “high” (10) (see Fig. 2c for an example).

Immersion questionnaire: We applied the ARI questionnaire (Georgiou and Kyza 2017) to measure perceived immersion aspects based on game immersion theory (Brown and Cairns 2004; Cheng et al. 2015). Cronbach’s \(\alpha\)-values reported by Georgiou and Kyza (2017) are shown in Table 1. The model consists of three main factors, each with two subscales: engagement is composed of usability and interest, engrossment consists of emotional attachment and focus of attention, and total immersion includes flow and presence.
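Cronbach's \(\alpha\) for such a scale is computed as \(\alpha = \frac{k}{k-1}\left(1 - \frac{\sum \text{item variances}}{\text{variance of total scores}}\right)\) for \(k\) items. A minimal sketch of this calculation, using toy data of our own rather than the study's responses:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for an internal-consistency estimate.
    items: one list of scores per item, all of length n_respondents.
    Uses sample variances (ddof = 1), as is conventional."""
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Total score per respondent across all items
    totals = [sum(item[r] for item in items) for r in range(n)]
    return (k / (k - 1)) * (1 - sum(var(it) for it in items) / var(totals))

# Toy example: two perfectly parallel items yield alpha = 1.0
print(cronbach_alpha([[1, 2, 3, 4], [2, 3, 4, 5]]))
```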

Table 1 Cronbach’s \(\alpha\) Values of the ARI Scales and Subscales Reported by Georgiou and Kyza (2017, p. 33)


We used the items of the ARI questionnaire (Georgiou and Kyza 2017) and replaced the term “AR” with “VR” in the four items mentioning AR. A native English and German speaker translated the items into German (initial translation; see, e.g., Beaton et al. 2000). The items were then translated back to English by another native speaker of both languages (back-translation), followed by a review by the authors. Table 6 in Appendix contains the English and German items (the latter were used in the current study). We used a seven-point Likert scale ranging from “totally disagree” (1) to “totally agree” (7), identical to Georgiou and Kyza (2017).

Overall satisfaction: To gather an overall impression of whether the participants liked the experience, we added one extra item at the end of the survey: “How did you like the VR module [with VR-cardboard/on the 2D screen display]?” The text indicated in square brackets changed depending on the display type condition. A seven-point Likert scale ranging from “very bad” (1) to “very good” (7) was applied.

Analysis

For psychometric and descriptive statistics, analyses of variance (ANOVAs), and post hoc comparisons, we used the free and open statistical software JAMOVI version 1.2.27.0 (The Jamovi Project 2020) with alpha set to 0.05. We report the effect sizes of ANOVAs using η2p (partial eta-squared) and interpret η2p-effect sizes of .01, .06, and .14 as small, medium, and large effects, respectively (Cohen 1988, p. 368). Significant interaction effects of the ANOVAs were further investigated with post hoc tests using Holm–Bonferroni corrections (Holm 1979).
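The two conventions just described can be sketched in code: partial eta-squared computed from the ANOVA sums of squares, and the Holm–Bonferroni step-down correction for the post hoc tests. This is an illustrative sketch with made-up p-values, not the study's analysis script:

```python
def partial_eta_squared(ss_effect, ss_error):
    """eta_p^2 = SS_effect / (SS_effect + SS_error); by Cohen's (1988)
    benchmarks, .01 / .06 / .14 are small / medium / large effects."""
    return ss_effect / (ss_effect + ss_error)

def holm_bonferroni(pvalues, alpha=0.05):
    """Holm (1979) step-down procedure. Compare the i-th smallest
    p-value (0-based) to alpha / (m - i); stop rejecting at the first
    failure. Returns reject/retain decisions in the input order."""
    m = len(pvalues)
    reject = [False] * m
    for rank, i in enumerate(sorted(range(m), key=lambda i: pvalues[i])):
        if pvalues[i] <= alpha / (m - rank):
            reject[i] = True
        else:
            break
    return reject

# Made-up p-values for three post hoc comparisons
print(holm_bonferroni([0.010, 0.040, 0.030]))  # only 0.010 survives
print(partial_eta_squared(6.0, 94.0))          # a medium-sized effect
```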

Results

All analyses refer to the same data set; thus, the participants per condition remained the same for all analyses. No outliers were detected or removed.

Effects of nonverbal behavior and display type on competence ratings

Table 2 shows the number of participants per condition, means, standard deviations, and results of two-way ANOVA statistics (according to the American Psychological Association 2020) for the competence ratings as dependent variables.

Table 2 Number of participants, means, standard deviations, and two-way ANOVA statistics for evaluation of competences


We conducted two-way between-subject ANOVAs to explore the impact of nonverbal behavior and display type on the three competence ratings (Table 2). We found significant main effects of the interviewer’s nonverbal behavior on the competence ratings of the job applicant for behavior in a team and customer care. For the competence behavior in a team, participants who experienced positive nonverbal behavior gave higher ratings (M = 5.78, SD = 1.49) than those who experienced negative nonverbal behavior (M = 5.00, SD = 2.08). The competence customer care showed a similar pattern: participants in the positive nonverbal behavior conditions (M = 5.72, SD = 1.87) rated the applicant higher than those in the negative nonverbal behavior conditions (M = 5.02, SD = 1.77). These ratings did not depend on the display type: we found no significant main effect of display type and no significant interaction of nonverbal behavior with display type. For the competence sales skill, neither the main effects nor the interaction reached statistical significance. Considering the ten-point Likert scale used for the three competence ratings (behavior in a team, customer care, sales skill), the group means (ranging from M = 4.44 to M = 5.83; Table 2) indicate average ratings with no signs of ceiling or floor effects.

Effects of nonverbal behavior and display type on perceived immersion

Table 3 shows the psychometric properties of the immersion scales we used based on the ARI questionnaire of Georgiou and Kyza (2017). We found very good reliability (DeVellis 2016, p. 136) for all scales, with Cronbach’s \(\alpha\) ranging from .79 to .91.
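The reliability coefficient reported here can be illustrated with a short sketch: Cronbach’s α is k/(k − 1) · (1 − Σ item variances / variance of the summed scale). The item scores below are invented for illustration; this is not the study’s data:

```python
# Illustrative sketch (assumed data): Cronbach's alpha for one subscale.
from statistics import variance  # sample variance (n - 1 denominator)

def cronbach_alpha(items):
    """items: list of per-item score lists, one inner list per questionnaire item."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]   # each respondent's scale sum
    item_var = sum(variance(scores) for scores in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

# Three hypothetical 7-point items answered by five respondents
items = [
    [5, 6, 4, 7, 3],
    [5, 7, 4, 6, 3],
    [6, 6, 5, 7, 4],
]
print(round(cronbach_alpha(items), 2))
```

Because the three hypothetical items co-vary strongly, α comes out high; uncorrelated items would drive it toward zero.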

Table 3 Psychometric properties for ARI scales and subscales


We conducted two-way between-subject ANOVAs to explore the impact of nonverbal behavior and display type on the immersion scales (see Table 4). The analysis of the engagement subscale interest showed a significant interaction between nonverbal behavior and display type. Post hoc analysis revealed significantly higher interest ratings in the negative nonverbal behavior condition experienced with the VR-cardboard than in the negative nonverbal behavior condition viewed on the 2D screen display, t(106) = 2.83, p < 0.05.
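The Holm–Bonferroni step-down procedure used for the post hoc comparisons can be sketched as follows: sort the p-values ascending, compare the i-th smallest against α/(m − i), and stop rejecting at the first failure. The p-values below are invented for illustration:

```python
# Hedged sketch of the Holm-Bonferroni step-down correction (Holm 1979).
def holm_bonferroni(pvals, alpha=0.05):
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])   # indices, smallest p first
    reject = [False] * m
    for rank, i in enumerate(order):
        if pvals[i] <= alpha / (m - rank):
            reject[i] = True
        else:
            break              # step-down: all larger p-values also fail
    return reject

# Four hypothetical comparisons; only the smallest survives the correction
print(holm_bonferroni([0.005, 0.04, 0.03, 0.20]))
```

Note that 0.04 and 0.03 would pass an uncorrected 0.05 threshold but fail the stricter step-down thresholds, which is exactly the family-wise error control the correction provides.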

Table 4 Means, standard deviations, and two-way ANOVA statistics for the ARI subscales


The ANOVAs also yielded a significant main effect of the display type on both subscales of total immersion: flow (VR-cardboard: M = 4.21, SD = 1.61; 2D screen display: M = 3.43, SD = 1.46) and presence (VR-cardboard: M = 3.53, SD = 1.53; 2D screen display: M = 2.67, SD = 1.31). The VR-cardboard condition led to higher ratings of both subscales compared to the 2D screen display condition.

Effects of nonverbal behavior and display type on overall satisfaction with the experience

We further analyzed overall satisfaction with a two-way between-subject ANOVA (see Table 5). The ANOVA suggested a significant main effect of the display type (VR-cardboard: M = 5.37, SD = 1.46; 2D screen display: M = 4.68, SD = 1.43). Thus, experiencing 360-degree videos with a VR-cardboard led to higher overall satisfaction than viewing the videos on a 2D screen display.

Table 5 Means, standard deviations, and two-way ANOVA statistics for overall satisfaction


Discussion

We studied whether an interviewer’s nonverbal behavior (positive vs. negative) impacts the competence ratings of observers in a job recruitment interview using 360-degree videos experienced with two different display types (VR-cardboard vs. 2D screen display). First-semester psychology students (n = 110) in the role of raters observed a recruitment interview and assessed three competences (behavior in a team, customer care, and sales skill) of an applicant. We found that positive nonverbal behavior of the interviewer resulted in higher ratings for two of three competences (behavior in a team and customer care) compared to negative nonverbal behavior by the interviewer. This effect was similar for 360-degree videos viewed with immersive VR and on 2D screen displays; we found no effect of display type on the three competence ratings. Our data did not show an interaction of nonverbal behavior and display type on the competence ratings. However, 360-degree videos resulted in higher immersion when experienced with VR than when viewed on 2D screen displays with regard to flow and presence. Thus, the two relevant ARI subscales of total immersion (flow and presence) differentiated between VR and the 2D screen display (construct validity). All applied immersion scales showed good reliability. In the following sections, we discuss each key result in the context of the research questions, hypotheses, and relevant literature.

Effects of nonverbal behavior on competence ratings

The interviewer’s nonverbal behavior influenced observers’ competence ratings in the job recruitment interview. Overall, the results support our first hypothesis on social influence through nonverbal behavior. The participants, acting as interview observers, rated the job applicant’s competences behavior in a team and customer care significantly higher when the interviewer reacted with positive nonverbal behavior to the job applicant’s replies than when the interviewer’s nonverbal behavior was negative. This effect of social influence can be explained by social proof (also called informational influence; Cialdini and Goldstein 2004; Cialdini 2007), a form of seeking accuracy for judgments in uncertain situations (Wood 2000). These results have highly relevant practical implications. To increase the validity of interview ratings (e.g., Huffcutt et al. 2013; Blackman 2017; and for further practical reasons), recruitment interviews often include multiple persons from the organization’s side. However, these persons may unintentionally influence others and may themselves be socially influenced without noticing, which is by no means intended in job interviews with multiple interviewers or raters. Notably, our results show no significant difference between positive and negative nonverbal behavior for the competence sales skill. A possible explanation is that the participants felt more competent in judging this specific competence because it can be considered part of everyday experience whether a salesperson’s reaction leads to real sales success. We will return to this point in the section on limitations and further research.

From an educational perspective, our study has shown that 360-degree videos are useful to increase students’ awareness regarding the effects of social influence through nonverbal behavior in job application interviews. With the integration of didactical aspects (e.g., Fowler 2015; with focus on active learning: e.g., Theobald et al. 2020), such as the subsequent real-time visualization of the experiment’s data and class discussion, this procedure can lead to the desired reflections by students. On the other hand, these achievements can only be seen as an initial step to reduce the risk of biases such as these occurring in job interviews. Before individuals act as interviewers or observers in real recruitment interviews, thorough training on interviewing and rating applicants’ competences is highly recommended (Alcañiz et al. 2018, for leadership training in VR; Doll 2018, for training on conducting interviews). Our study suggests that such training would benefit from increasing awareness of possible social influence effects through nonverbal behavior that could result in biases when judging job applicants’ competence.

Effects of display type (VR-cardboard vs. 2D screen display) on competence ratings

The effect of social influence through nonverbal behavior was similar for 360-degree videos viewed with the VR-cardboard and on 2D screen displays. To our knowledge, no previous research has investigated the effects of nonverbal behavior on observers of job recruitment interviews. Therefore, we did not formulate a hypothesis on whether the effect of social influence depends on the display type used to experience 360-degree videos. Our results suggest that the display type (VR-cardboard vs. 2D screen display) does not significantly impact competence ratings. Further, we did not formulate a hypothesis for interaction effects of nonverbal behavior and display type on competence ratings at job interviews. The ANOVAs did not yield an interaction effect for any of the three competence ratings. These findings may imply that using 2D screen displays for awareness-raising experiences (or training) on social influence biases could be sufficient in practice. Currently, the use of immersive VR in large higher education courses still demands extra effort. Although the use of low-cost VR-cardboards, combined with the use of students’ smartphones, does not necessarily result in high hardware acquisition costs, the complete implementation requires more effort. Securing full technical functionality and compatibility on all device types (e.g., different screen sizes, operating systems, browser types, and versions) demands additional testing and, thus, potentially more development resources. The alternative, purchasing an adequate set of homogeneous stand-alone, untethered VR devices, is costly and requires additional cleaning measures to meet hygiene standards. In contrast, viewing (360-degree) videos on notebook/tablet computers can be achieved relatively quickly. However, immersive VR in a class-based scenario has several benefits compared to 2D screen displays. First, immersive VR automatically blocks out potential visual distractors (e.g., neighboring students). 
Second, as higher education prepares students for future needs at work (e.g., twenty-first-century skills), such non-game applications are feasible ways to familiarize students with VR’s emerging trends and technology. Third, the overall satisfaction ratings showed that the students liked using immersive VR (with VR-cardboard) significantly better than 2D screen displays for viewing 360-degree video content. Finally, technology improves rapidly, and VR solutions become more affordable every year.

Effects of nonverbal behavior and display type on perceived immersion

The 360-degree videos led to higher perceived immersion when experienced in VR than when viewed on 2D screen displays with regard to flow and presence. Game immersion theory (Brown and Cairns 2004) suggests that players need to overcome specific barriers on one immersion level before reaching the next stage (for the learning context, see also Cheng et al. 2015). The first stage (engagement) and the second stage (engrossment) of perceived immersion focus more on the content (e.g., the story) and the application’s use (e.g., interaction modes). These were kept nearly identical for the VR-cardboard and 2D screen display conditions. As expected, the subscale ratings on the first two stages did not differ with regard to the display type. Moreover, the subscale means ranged from M = 4.46 to M = 6.00 (see Table 4); given the seven-point Likert scale, these values are high, indicating that both stages were achieved. The analysis of the subscales of total immersion (presence and flow) showed that both were significantly higher for the VR-cardboard condition than for the 2D screen display condition. These findings align with our expectations, as immersive VR provides a higher perception of total immersion (subscales presence and flow) than 2D screen displays.

An interesting finding emerged in the interaction between nonverbal behavior and display type on the subscale interest. The VR-cardboard group with negative nonverbal behavior perceived the experience with significantly higher interest than the group with positive nonverbal behavior. One possible explanation is the evocation of emotions through nonverbal behavior or social influence. To our knowledge, however, no previous research has examined social influence that triggers emotions (i.e., their nature, valence, and intensity), which then influence aspects of perceived immersion. We will return to this point in the section on limitations and further research.

Adapted ARI questionnaire

As mentioned in the Introduction and Method sections, we slightly adapted the ARI questionnaire (Georgiou and Kyza 2017) for use with 360-degree videos presented via a VR-cardboard and a 2D screen display. Our results suggest that the adapted ARI questionnaire can be used to reliably measure different aspects of immersion when viewing 360-degree videos with a VR-cardboard and on a 2D screen display. The psychometric properties of the adapted ARI questionnaire showed very good reliability with our data (Cronbach’s \(\alpha\)-values ranging from .79 to .91). Remarkably, with the exception of the subscales usability and focus of attention, the reliabilities in our sample were even higher than the values reported for the original ARI questionnaire, which was used in another context (Georgiou and Kyza 2017).

Limitations and further research

As mentioned in the Method section, all video sequences were evaluated by three experts working in the field of personnel assessment. They unanimously confirmed that the interview setting, the interviewer’s questions, and the applicant’s answers were realistic, but noted that professional interviewers would show less pronounced emotional expressions. Our study highlights the importance of conducting job assessment interviews in a benevolent manner without pronounced facial expressions. We tested lay raters (first-semester psychology students). It is unclear whether our findings of social influence bias through nonverbal behavior would also hold for professional raters in job recruitment interviews. It would be interesting to investigate this aspect in a future study and compare students’ ratings with those of experienced job interviewers. Furthermore, our data relied on the judgment of three competences based on situational questions only. A more comprehensive range of job-relevant aspects (e.g., Lambert et al. 2014, for sales-related jobs) and the inclusion of behavior description questions (e.g., Taylor and Small 2002; Huffcutt et al. 2004; Klehe and Latham 2006; Culbertson et al. 2017; Hartwell et al. 2019) would allow further conclusions about which types of competences or questions are especially likely to socially influence raters. Our results are very promising regarding the application of a slightly adapted ARI questionnaire (Table 6 in the Appendix) to measure the three levels of perceived immersion (as defined by Brown and Cairns 2004). However, further research with more participants would be beneficial to validate the model structure of game immersion theory (Cheng et al. 2015) in more detail. Finally, the observed interaction of nonverbal behavior and display type on the immersion subscale interest opens further research questions. 
Why and to what extent do immersive VR experiences with negative nonverbal behavior compared to positive nonverbal behavior arouse interest, but not when displayed on a 2D screen? A future study analyzing viewers’ emotions (e.g., Diemer et al. 2015; Chirico and Gaggioli 2019; Niu et al. 2019) induced by nonverbal behavior (positive vs. negative) could more precisely reveal the reasons for the observed interaction effect on interest.

Conclusion

Recruitment interviews are a widely used method for recruiting new personnel and often include multiple persons from the hiring organization. This study evaluated whether an interviewer’s nonverbal behavior (positive vs. negative) impacts observers’ competence ratings of an applicant using 360-degree videos on two different display types (VR-cardboard vs. 2D screen display). The participants observed interview sequences in the role of a rater and assessed three competences (behavior in a team, customer care, and sales skill) of the applicant. The interviewer’s nonverbal behavior influenced these ratings on two out of three evaluated competences (behavior in a team and customer care). Overall, these findings confirm a social influence bias and are therefore highly relevant for practice, particularly because interviews with multiple interviewers/raters are generally considered valid. Additionally, our study showed that 360-degree videos of recruitment interviews can be used for awareness-raising experiences on social influence through nonverbal behavior. Interestingly, the effect of social influence was independent of the display type: watching the 360-degree videos with immersive VR and on 2D screen displays resulted in similar ratings of competence. This raises the question of whether it is worth the extra effort of using immersive VR in a classroom to demonstrate the effect of social influence through nonverbal behavior. However, the participants in the immersive VR condition rated the overall satisfaction significantly higher than the participants in the 2D screen display condition. Furthermore, the participants perceived a higher level of flow and presence (defined as the third level of immersion, total immersion) when experiencing the 360-degree videos in immersive VR compared to 2D screen displays. This finding is in line with the underlying game immersion theory. 
The first and second levels of immersion (engagement with the subscales of usability and interest; engrossment with the subscales of emotional attachment and focus of attention) did not differ between the display types because these constructs relate mainly to the content and the implementation (which were similar for both display conditions). Thus, the evaluated scales of the ARI questionnaire (Georgiou and Kyza 2017, with the term “AR” modified to “VR”; see Table 6 in the Appendix) demonstrated construct validity and high reliability. Accordingly, our results suggest that the ARI questionnaire is an appropriate instrument to rate different aspects of immersion when assessing 360-degree videos using immersive VR and 2D screen displays.

References

  • Alcañiz M, Parra E, Chicchi Giglioli IA (2018) Virtual reality as an emerging methodology for leadership assessment and training. Front Psychol 9:1658. //doi.org/10.3389/fpsyg.2018.01658

    Article  Google Scholar 

  • Allcoat D, von Mühlenen A (2018) Learning in virtual reality: effects on performance, emotion and engagement. Res Learn Technol 26:1–13. //doi.org/10.25304/rlt.v26.2140

    Article  Google Scholar 

  • American Psychological Association (2020) Publication manual of the American Psychology Association. American Psychological Association, Washington, DC

    Google Scholar 

  • Arvey RD, Campion JE (1982) The employment interview: a summary and review of recent research. Pers Psychol 35:281–322. //doi.org/10.1111/j.1744-6570.1982.tb02197.x

    Article  Google Scholar 

  • Barrick MR, Swider BW, Stewart GL (2010) Initial evaluations in the interview: relationships with subsequent interviewer evaluations and employment offers. J Appl Psychol 95:1163–1172. //doi.org/10.1037/a0019918

    Article  Google Scholar 

  • Beaton DE, Bombardier C, Guillemin F, Ferraz MB (2000) Guidelines for the process of cross-cultural adaptation of self-report measures. Spine (Phila Pa 1976) 25:3186–3191. //doi.org/10.1097/00007632-200012150-00014

    Article  Google Scholar 

  • Blackman M (2017) Using interviewing in selection. In: Goldstein H, Pulakos E, Passmore J, Semedo C (eds) The wiley blackwell handbook of the psychology of recruitment, selection and employee retention. John Wiley, Chichester, pp 182–201

    Chapter  Google Scholar 

  • Blacksmith N, Willford J, Behrend T (2016) Technology in the employment interview: a meta-analysis and future research agenda. Pers Assess Decis 2:12–20. //doi.org/10.25035/pad.2016.002

    Article  Google Scholar 

  • Blascovich J, Loomis J, Beall AC, Swinth KR, Hoyt CL, Bailenson JN (2002) Immersive virtual environment technology as a methodological tool for social psychology. Psychol Inq 13:103–124. //doi.org/10.1207/S15327965PLI1302_01

    Article  Google Scholar 

  • Bonaccio S, O’Reilly J, O’Sullivan SL, Chiocchio F (2016) Nonverbal behavior and communication in the workplace: a review and an agenda for research. J Manag 42:1044–1074. //doi.org/10.1177/0149206315621146

    Article  Google Scholar 

  • Borkenau P, Liebler A (1995) Observable attributesas manifestations and cues of personality and intelligence. J Pers 63:1–25. //doi.org/10.1111/j.1467-6494.1995.tb00799.x

    Article  Google Scholar 

  • Bos N, Olson J, Gergle D, Olson G, Wright Z (2002) Effects of four computer-mediated communications channels on trust development. In: Proceedings of the SIGCHI conference on human factors in computing systems. Association for Computing Machinery, Minneapolis, MN, pp 135–140

  • Brown E, Cairns P (2004) A grounded investigation of game immersion. In: Conference on human factors in computing systems. Association for Computing Machinery, Vienna, Austria, pp 1297–1300

  • Butler D, Geis F (1990) Nonverbal affect responses to male and female leaders: implications for leadership evaluations. J Pers Soc Psychol 58:48–59. //doi.org/10.1037/0022-3514.58.1.48

    Article  Google Scholar 

  • Campion M, Pursell E, Brown B (1988) Structured interviewing: raising the psychometric properties of the employment interview. Pers Psychol 41:25–42. //doi.org/10.1111/j.1744-6570.1988.tb00630.x

    Article  Google Scholar 

  • Campion M, Campion J, Hudson J Jr (1994) Structured interviewing: a note on incremental validity and alternative question types. J Appl Psychol 79:998–1002. //doi.org/10.1037/0021-9010.79.6.998

    Article  Google Scholar 

  • Campion M, Palmer D, Campion J (1997) A review of structure in the selection interview. Pers Psychol 50:655–702. //doi.org/10.1111/j.1744-6570.1997.tb00709.x

    Article  Google Scholar 

  • Checa D, Bustillo A (2020) A review of immersive virtual reality serious games to enhance learning and training. Multimed Tools Appl 79:5501–5527. //doi.org/10.1007/s11042-019-08348-9

    Article  Google Scholar 

  • Cheng MT, She HC, Annetta LA (2015) Game immersion experience: its hierarchical structure and impact on game-based science learning. J Comput Assist Learn 31:232–253. //doi.org/10.1111/jcal.12066

    Article  Google Scholar 

  • Chirico A, Gaggioli A (2019) When virtual feels real: comparing emotional responses and presence in virtual and natural environments. Cyberpsychology Behav Soc Netw 22:220–226. //doi.org/10.1089/cyber.2018.0393

    Article  Google Scholar 

  • Churchill GA, Ford NM, Hartley SW, Walker OC (1985) The determinants of salesperson performance: a meta-analysis. J Mark Res 22:103–118. //doi.org/10.2307/3151357

    Article  Google Scholar 

  • Cialdini R (2001) Influence: science and practice. Allyn & Bacon, Boston

    Google Scholar 

  • Cialdini R (2007) Influence: the psychology of persuasion. Collins, New York

    MATH  Google Scholar 

  • Cialdini RB, Goldstein NJ (2004) Social influence: compliance and conformity. Annu Rev Psychol 55:591–621. //doi.org/10.1146/annurev.psych.55.090902.142015

    Article  Google Scholar 

  • Cohen J (1988) Statistical power analysis for the behavioral sciences. Taylor and Francis, Hoboken

    MATH  Google Scholar 

  • Cohn J, Ambadar Z, Ekman P (2007) Observer-based measurement of facial expression with the facial action coding system. In: Coan J, Allen J (eds) The handbook of emotion elicitation and assessment. Oxford University Press, Oxford, pp 203–221

    Google Scholar 

  • Culbertson SS, Weyhrauch WS, Huffcutt AI (2017) A tale of two formats: direct comparison of matching situational and behavior description interview questions. Hum Resour Manag Rev 27:167–177. //doi.org/10.1016/j.hrmr.2016.09.009

    Article  Google Scholar 

  • DeGroot T, Gooty J (2009) Can nonverbal cues be used to make meaningful personality attributions in employment interviews? J Bus Psychol 24:179–192. //doi.org/10.1007/s10869-009-9098-0

    Article  Google Scholar 

  • DeVellis R (2016) Scale development: theory and applications. Sage, Los Angeles

    Google Scholar 

  • Diemer J, Alpers GW, Peperkorn HM, Shiban Y, Mühlberger A (2015) The impact of perception and presence on emotional reactions: a review of research in virtual reality. Front Psychol 6:26. //doi.org/10.3389/fpsyg.2015.00026

    Article  Google Scholar 

  • Dipboye R (2017) The selection/recruitment interview: core processes and contexts. In: Evers A, Anderson N, Voskuijl O (eds) The Blackwell handbook of personnel selection. Blackwell, Malden, pp 119–142

    Chapter  Google Scholar 

  • Dipboye R, Macan T, Shahani-Denning C (2012) The selection interview from the interviewer and applicant perspectives: can’t have one without the other. In: Schmitt N (ed) The Oxford handbook of personnel assessment and selection. Oxford University Press, Oxford, pp 323–352

    Google Scholar 

  • Doherty-Sneddon G, Anderson A, O’Malley C, Langton S, Garrod S, Bruce V (1997) Face-to-face and video-mediated communication: a comparison of dialogue structure and task performance. J Exp Psychol Appl 3:105–125. //doi.org/10.1037/1076-898X.3.2.105

    Article  Google Scholar 

  • Doll JL (2018) Structured interviews: developing interviewing skills in human resource management courses. Manag Teach Rev 3:46–61. //doi.org/10.1177/2379298117722520

    Article  Google Scholar 

  • Elias ZM, Batumalai UM, Azmi ANH (2019) Virtual reality games on accommodation and convergence. Appl Ergon 81:102879. //doi.org/10.1016/j.apergo.2019.102879

    Article  Google Scholar 

  • Feurstein M (2019) Towards an integration of 360-degree video in higher education. Workflow, challenges and scenarios. In: Schiffner D (ed) Proceedings of DeLFI workshops 2018 co-located with 16th e-Learning conference of the German computer society (DeLFI 2018). CEUR WS, Frankfurt, Germany, pp 1–12

  • Fowler C (2015) Virtual reality and learning: where is the pedagogy? Br J Educ Technol 46:412–422. //doi.org/10.1111/bjet.12135

    Article  Google Scholar 

  • Freina L, Ott M (2015) A literature review on immersive virtual reality in education: state of the art and perspectives. In: The 11th international scientific conference e-Learning and software for education. ”Carol I” National Defence University Publishing House, Bucharest, Romania, pp 214–220

  • Gallup AC, Vasilyev D, Anderson N, Kingstone A (2019) Contagious yawning in virtual reality is affected by actual, but not simulated, social presence. Sci Rep 9:294. //doi.org/10.1038/s41598-018-36570-2

    Article  Google Scholar 

  • Georgiou Y, Kyza EA (2017) The development and validation of the ARI questionnaire: an instrument for measuring immersion in location-based augmented reality settings. Int J Hum Comput Stud 98:24–37. //doi.org/10.1016/j.ijhcs.2016.09.014

    Article  Google Scholar 

  • Gifford R, Ng C, Wilkinson M (1985) Nonverbal cues in the employment interview: links between applicant qualities and interviewer judgments. J Appl Psychol 70:729–736. //doi.org/10.1037/0021-9010.70.4.729

    Article  Google Scholar 

  • Google (2020a) Google cardboard. //arvr.google.com/cardboard. Accessed 31 Aug 2020

  • Google (2020b) Interactive patterns – controls. //designguidelines.withgoogle.com/cardboard/interactive-patterns/controls.html. Accessed 31 Aug 2020

  • Guyer JJ, Briñol P, Petty RE, Horcajo J (2019) Nonverbal behavior of persuasive sources: a multiple process analysis. J Nonverbal Behav 43:203–231. //doi.org/10.1007/s10919-018-00291-x

    Article  Google Scholar 

  • Hartwell CJ, Johnson CD, Posthuma RA (2019) Are we asking the right questions? Predictive validity comparison of four structured interview question types. J Bus Res 100:122–129. //doi.org/10.1016/j.jbusres.2019.03.026

    Article  Google Scholar 

  • Hausknecht JP, Day DV, Thomas SC (2004) Applicant reactions to selection procedures: an updated model and meta-analysis. Pers Psychol 57:639–683. //doi.org/10.1111/j.1744-6570.2004.00003.x

    Article  Google Scholar 

  • Holm S (1979) A simple sequentially rejective multiple test procedure. Scand J Stat 6:65–70

    MathSciNet  MATH  Google Scholar 

  • Huffcutt AI, Conway JM, Roth PL, Klehe UC (2004) The impact of job complexity and study design on situational and behavior description interview validity. Int J Sel Assess 12:262–273. //doi.org/10.1111/j.0965-075X.2004.280_1.x

    Article  Google Scholar 

  • Huffcutt AI, Van Iddekinge CH, Roth PL (2011) Understanding applicant behavior in employment interviews: a theoretical model of interviewee performance. Hum Resour Manag Rev 21:353–367. //doi.org/10.1016/j.hrmr.2011.05.003

    Article  Google Scholar 

  • Huffcutt AI, Culbertson SS, Weyhrauch WS (2013) Employment interview reliability: new meta-analytic estimates by structure and format. Int J Sel Assess 21:264–276. //doi.org/10.1111/ijsa.12036

    Article  Google Scholar 

  • Innocenti A, Molinari F, Müller C (2012) Assessing risk perception and well-being at workplaces: a game based approach. In: 2012 18th international ICE conference on engineering, technology and innovation. IEEE, Munich, Germany, pp 1–8

  • Jensen L, Konradsen F (2018) A review of the use of virtual reality head-mounted displays in education and training. Educ Inf Technol 23:1515–1529. //doi.org/10.1007/s10639-017-9676-0

    Article  Google Scholar 

  • Jerald J (2016) The VR book: human-centered design for virtual reality. ACM, New York

    Google Scholar 

  • Jones B, Kenward M (2015) Design and analysis of cross-over trails. CRC Press, Boca Raton

    MATH  Google Scholar 

  • Kim HK, Park J, Choi Y, Choe M (2018) Virtual reality sickness questionnaire (VRSQ): motion sickness measurement index in a virtual reality environment. Appl Ergon 69:66–73. //doi.org/10.1016/j.apergo.2017.12.016

    Article  Google Scholar 

  • Kinateder M, Warren WH (2016) Social influence on evacuation behavior in real and virtual environments. Front Robot AI 3:1–8. //doi.org/10.3389/frobt.2016.00043

    Article  Google Scholar 

  • Klehe UC, Latham G (2006) What would you do—Really or ideally? Constructs underlying the behavior description interview and the situational interview in predicting typical versus maximum performance. Hum Perform 19:357–382. //doi.org/10.1207/s15327043hup1904_3

    Article  Google Scholar 

  • König CJ, Klehe UC, Berchtold M, Kleinmann M (2010) Reasons for being selective when choosing personnel selection procedures. Int J Sel Assess 18:17–27. //doi.org/10.1111/j.1468-2389.2010.00485.x

    Article  Google Scholar 

  • Kumar R, Sharma G, Kumar L, Chandra S (2017) Effect of immersion (2D vs. 3D) on attention through virtual reality. Int J Sci Res 6:204–207

    Google Scholar 

  • Lambert B, Plank RE, Reid DA, Fleming D (2014) A competency model for entry level business-to-business services salespeople. Serv Mark Q 35:84–103. //doi.org/10.1080/15332969.2014.856746

    Article  Google Scholar 

  • Lanier J (2017) Dawn of the new everything. Bodley Head, London

    Google Scholar 

  • Larsen RJ, Shackelford TK (1996) Gaze avoidance: personality and social judgments of people who avoid direct face-to-face contact. Pers Individ Differ 21:907–917. //doi.org/10.1016/S0191-8869(96)00148-1

    Article  Google Scholar 

  • Liden RC, Martin CL, Parsons CK (1993) Interviewer and applicant behaviors in employment interviews. Acad Manag J 36:372–386. https://doi.org/10.5465/256527

  • Lubeck AJA, Bos JE, Stins JF (2015) Motion in images is essential to cause motion sickness symptoms, but not to increase postural sway. Displays 38:55–61. https://doi.org/10.1016/j.displa.2015.03.001

  • Macan T (2009) The employment interview: a review of current studies and directions for future research. Hum Resour Manag Rev 19:203–218. https://doi.org/10.1016/j.hrmr.2009.03.006

  • Markowitz DM, Laha R, Perone BP, Pea RD, Bailenson JN (2018) Immersive virtual reality field trips facilitate learning about climate change. Front Psychol 9:2364. https://doi.org/10.3389/fpsyg.2018.02364

  • Martín-Gutiérrez J, Mora C, Añorbe-Díaz B, González-Marrero A (2017) Virtual technologies trends in education. Eurasia J Math Sci Technol Educ 13:469–486. https://doi.org/10.12973/eurasia.2017.00626a

  • McDaniel M, Whetzel D, Schmidt F, Maurer S (1994) The validity of employment interviews: a comprehensive review and meta-analysis. J Appl Psychol 79:599–616. https://doi.org/10.1037/0021-9010.79.4.599

  • Niu Y, Wang D, Wang Z, Sun F, Yue K, Zheng N (2019) User experience evaluation in virtual reality based on subjective feelings and physiological signals. J Imaging Sci Technol 63:1–11. https://doi.org/10.2352/J.ImagingSci.Technol.2019.63.6.060413

  • Owen DJ, Slep AM, Heyman RE (2012) The effect of praise, positive nonverbal response, reprimand, and negative nonverbal response on child compliance: a systematic review. Clin Child Fam Psychol Rev 15:364–385. https://doi.org/10.1007/s10567-012-0120-0

  • Pan X, Hamilton AFC (2018) Why and how to use virtual reality to study human social interaction: the challenges of exploring a new research landscape. Br J Psychol 109:395–417. https://doi.org/10.1111/bjop.12290

  • Papanastasiou G, Drigas A, Skianis C, Lytras M, Papanastasiou E (2019) Virtual and augmented reality effects on K-12, higher and tertiary education students’ twenty-first century skills. Virtual Real 23:425–436. https://doi.org/10.1007/s10055-018-0363-2

  • Parsons C, Liden R (1984) Interviewer perceptions of applicant qualifications: a multivariate field study of demographic characteristics and nonverbal cues. J Appl Psychol 69:557–568. https://doi.org/10.1037/0021-9010.69.4.557

  • Phillips JF (1992) Predicting sales skills. J Bus Psychol 7:151–160. https://doi.org/10.1007/BF01013925

  • Pirker J, Lesjak I, Kopf J, Kainz A, Dini A (2020) Immersive learning in real VR. In: Magnor M, Sorkine-Hornung A (eds) Real VR – Immersive digital reality. Springer Nature, Cham, pp 321–336

  • Powell W, Powell V, Brown P, Cook M, Uddin J (2016) Getting around in google cardboard – Exploring navigation preferences with low-cost mobile VR. In: 2016 IEEE 2nd workshop on everyday virtual reality (WEVR). IEEE, Greenville, SC, pp 5–8

  • Proost K, Germeys F, Vanderstukken A (2020) Applicants’ pre-test reactions towards video interviews: the role of expected chances to demonstrate potential and to use nonverbal cues. Eur J Work Organ Psychol 30:265–273. https://doi.org/10.1080/1359432X.2020.1817975

  • Radianti J, Majchrzak TA, Fromm J, Wohlgenannt I (2020) A systematic review of immersive virtual reality applications for higher education: design elements, lessons learned, and research agenda. Comput Educ 147:103778. https://doi.org/10.1016/j.compedu.2019.103778

  • Rentz JO, Shepherd CD, Tashchian A, Dabholkar PA, Ladd RT (2002) A measure of selling skill: scale development and validation. J Pers Sell Sales Manag 22:13–21

  • Rupp MA, Odette KL, Kozachuk J, Michaelis JR, Smither JA, McConnell DS (2019) Investigating learning outcomes and subjective experiences in 360-degree videos. Comput Educ 128:256–268. https://doi.org/10.1016/j.compedu.2018.09.015

  • Ryan AM, McFarland L, Baron H, Page R (1999) An international look at selection practices: nation and culture as explanations for variability in practice. Pers Psychol 52:359–392. https://doi.org/10.1111/j.1744-6570.1999.tb00165.x

  • Salgado J, Viswesvaran C, Ones D (2004) Predictors used for personnel selection: an overview of constructs, methods and techniques. In: Anderson N, Ones D, Sinangil H, Viswesvaran C (eds) Handbook of industrial, work and organizational psychology. Sage, London, pp 375–377

  • Salmanowitz N (2018) The impact of virtual reality on implicit racial bias and mock legal decisions. J Law Biosci 5:174–203. https://doi.org/10.1093/jlb/lsy005

  • Schmidt F, Hunter J (1998) The validity and utility of selection methods in personnel psychology: practical and theoretical implications of 85 years of research findings. Psychol Bull 124:262–274. https://doi.org/10.1037/0033-2909.124.2.262

  • Schutte NS, Stilinović EJ (2017) Facilitating empathy through virtual reality. Motiv Emot 41:708–712. https://doi.org/10.1007/s11031-017-9641-7

  • Schwind V, Knierim P, Haas N, Henze N (2019) Using presence questionnaires in virtual reality. In: Proceedings of the 2019 CHI conference on human factors in computing systems. Association for Computing Machinery, Glasgow, UK, pp 1–12

  • Sears GJ, Zhang H, Wiesner WH, Hackett RD, Yuan Y (2013) A comparative assessment of videoconference and face-to-face employment interviews. Manag Decis 51:1733–1752. https://doi.org/10.1108/MD-09-2012-0642

  • Slater M (2018) Immersion and the illusion of presence in virtual reality. Br J Psychol 109:431–433. https://doi.org/10.1111/bjop.12305

  • Slater M, Wilbur S (1997) A framework for immersive virtual environments (FIVE): speculations on the role of presence in virtual environments. Presence Teleoperators Virtual Environ 6:603–616. https://doi.org/10.1162/pres.1997.6.6.603

  • Suh A, Prophet J (2018) The state of immersive technology research: a literature analysis. Comput Hum Behav 86:77–90. https://doi.org/10.1016/j.chb.2018.04.019

  • Sutcliffe A, Gault B, Shin JE (2005) Presence, memory and interaction in virtual environments. Int J Hum Comput Stud 62:307–327. https://doi.org/10.1016/j.ijhcs.2004.11.010

  • Swider B, Barrick M, Harris T (2016) Initial impressions: what they are, what they are not, and how they influence structured interview outcomes. J Appl Psychol 101:625–638. https://doi.org/10.1037/apl0000077

  • Taylor PJ, Small B (2002) Asking applicants what they would do versus what they did do: a meta-analytic comparison of situational and past behaviour employment interview questions. J Occup Organ Psychol 75:277–294. https://doi.org/10.1348/096317902320369712

  • Tennant M, Youssef GJ, McGillivray J, Clark TJ, McMillan L, McCarthy MC (2020) Exploring the use of immersive virtual reality to enhance psychological well-being in pediatric oncology: a pilot randomized controlled trial. Eur J Oncol Nurs 48:101804. https://doi.org/10.1016/j.ejon.2020.101804

  • The Jamovi project (2020) jamovi (Version 1.2.27.0) [Computer software]. https://www.jamovi.org. Accessed 31 Aug 2020

  • Theobald EJ, Hill MJ, Tran E et al (2020) Active learning narrows achievement gaps for underrepresented students in undergraduate science, technology, engineering, and math. Proc Natl Acad Sci 117:6476. https://doi.org/10.1073/pnas.1916903117

  • Turnbull PRK, Phillips JR (2017) Ocular effects of virtual reality headset wear in young adults. Sci Rep 7:16172. https://doi.org/10.1038/s41598-017-16320-6

  • Violante MG, Vezzetti E, Piazzolla P (2019) Interactive virtual technologies in engineering education: why not 360° videos? Int J Interact Des Manuf 13:729–742. https://doi.org/10.1007/s12008-019-00553-y

  • Viswesvaran C, Ones D (2018) Non-test methods and techniques used in employee selection. In: Ones D, Anderson N, Viswesvaran C, Sinangil H (eds) Handbook of industrial, work & organizational psychology. Sage, Los Angeles, pp 451–473

  • Voit A, Mayer S, Schwind V, Henze N (2019) Online, VR, AR, Lab, and In-Situ: comparison of research methods to evaluate smart artifacts. In: Proceedings of the 2019 CHI conference on human factors in computing systems. Association for Computing Machinery, Glasgow, Scotland, UK, pp 1–12

  • Waimanoo (2020) Survey and education platform. www.waimanoo.ch. Accessed 31 Aug 2020

  • Weekley J, Gier J (1987) Reliability and validity of the situational interview for a sales position. J Appl Psychol 72:484–487. https://doi.org/10.1037/0021-9010.72.3.484

  • Williamson P, Weber N, Robertson MT (2013) The effect of expertise on memory conformity: a test of informational influence. Behav Sci Law 31:607–623. https://doi.org/10.1002/bsl.2094

  • Wood W (2000) Attitude change: persuasion and social influence. Annu Rev Psychol 51:539–570. https://doi.org/10.1146/annurev.psych.51.1.539

  • Wu B, Yu X, Gu X (2020) Effectiveness of immersive virtual reality using head-mounted displays on learning performance: a meta-analysis. Br J Educ Technol 51:1991–2005. https://doi.org/10.1111/bjet.13023
