Job application requiring emotion recognition from AI generated facial expressions

Yes, this was after I told them I was autistic (they said they would apply a 'score adjuster' afterwards). Yes, it produced the predictable result: I was terrible at it. Here is an example screenshot.

The job had nothing to do with recognising emotions; it was for a technical role, not a people-focused one. It just seems ridiculous that they can do this so openly. As I feel like naming and shaming today: this was for a WSP job, and the psychometric testing was by Arctic Shores.

Parents
  • Seems to be the equivalent of saying you are an equal opportunities employer, then instigating a high-jump test, with the proviso that the test results will be 'adjusted' for any wheelchair users. Flagrant discrimination on the grounds of disability. I would at least send all the details to the NAS. It might help if this sort of thing is brought to the attention of a wider audience. It would not filter out a sociopath, or narcissist, or thief, or fraudster etc. etc.

