Job interviews are a predominant part of any hiring process: they are used to evaluate an applicant's knowledge, skills, abilities, and behavior in order to select the person best suited for the job. Recruiters form their opinions on the basis of both the verbal and the nonverbal communication of an interviewee. Our behavior and communication in daily life are cross-modal in nature: facial expressions, hand gestures, and body postures are closely linked to speech and thereby enrich the vocal content. Nonverbal communication plays an important role in the gap between what we say and what we actually mean to say; it carries information, conveyed in parallel with our speech, that can reveal social constructs of a person as diverse as personality, state of mind, or job-interview outcome. In this paper, we present an automated, predictive expert-system framework for the computational analysis of HR job interviews. The system analyzes the facial expressions, language, and prosodic details of interviewees and thereby quantifies their verbal and nonverbal behavior. It predicts a rating of the interviewee's overall performance and of each behavioral trait, and hence predicts their personality and hireability.
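To make the described pipeline concrete, below is a minimal Python sketch of the kind of early-fusion architecture the abstract outlines: per-modality feature vectors (facial, prosodic, lexical) are concatenated and a regressor maps them to an interview rating. The feature dimensions, the synthetic data, and the choice of a Ridge regressor are illustrative assumptions for this sketch, not the authors' actual method.

    import numpy as np
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Synthetic stand-ins for per-interview feature vectors; in a real system
    # these would come from modality-specific extractors (all dimensions assumed).
    n = 200
    facial   = rng.normal(size=(n, 17))   # e.g., facial-expression descriptors per interview
    prosodic = rng.normal(size=(n, 12))   # e.g., pitch/energy/speaking-rate statistics
    lexical  = rng.normal(size=(n, 50))   # e.g., word-category counts from transcripts

    # Early fusion: concatenate the modality features into one vector per interview.
    X = np.hstack([facial, prosodic, lexical])
    y = rng.uniform(1, 7, size=n)         # synthetic overall ratings on a 1-7 scale

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # A simple linear regressor as a stand-in for the rating predictor.
    model = Ridge(alpha=1.0).fit(X_train, y_train)
    print("held-out R^2: %.3f" % model.score(X_test, y_test))

In a full system, the synthetic arrays would be replaced by real extractor outputs, and one regressor could be trained per behavioral trait in addition to the overall-performance rating.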


    Title:

    Discover Cross-Modal Human Behavior Analysis


    Contributors:


    Publication date:

    2018-03-01


    Format / extent:

    6197014 bytes


    Media type:

    Conference paper


    Format:

    Electronic resource


    Language:

    English



    Similar titles:

    Discover Engineering

    Pavlock, Kate M. | NTRS | 2012


    Discover Engineering

    K. M. Pavlock | NTIS | 2012



    Deep Classification-driven Domain Adaptation for Cross-Modal Driver Behavior Recognition

    Reiss, Simon / Roitberg, Alina / Haurilet, Monica et al. | IEEE | 2020


    Can you discover Kraitchik's number?

    British Library Online Contents | 2004