When Artificial Intelligence isn’t Intelligent

"AI" and "Artificial Intelligence" are terms used everywhere now when talking about technology. AI can be described as any technology that perceives its environment and takes actions based on the data it has and receives.

There is little doubt that AI is going to revolutionize the world. But not quite yet.

Artificial Intelligence is, at best, in its infancy. AI does not currently have the data it requires to make correct decisions 100% of the time, and it has no emotional intelligence or discretion whatsoever.

This lack of data is why auto-proctoring services, and the organizations using them, have had to explain why people of colour are not recognized by their systems, are delayed, and/or are barred from starting events altogether. If AI is going to be used to make decisions relating to access and/or identity confirmation, the data libraries it relies on cannot be based on predominantly white participants.

The lack of emotional intelligence or discretion in AI can also cause significant stress for learners and increased administration for organizations.

Think of the following situations:

- Learners with disabilities who use their eyes differently than the AI expects

- Learners whose small children or siblings may come into the screen view

- A portrait hanging on the wall that causes AI auto-proctoring to flag multiple participants

- Learners flagged for not being in camera view because a hand is on or near their face

- Learners with darker skin tones flagged as not in view because the AI does not recognize their face as a face

All of these false positives create anxiety for learners and result in additional administration for instructors and managers.

In fact, various privacy regulators have sided with the UK and its GDPR legislation, which gives people the right not to be subject to solely automated decisions.

It is for these reasons that Integrity Advocate does not allow AI to make decisions relating to learner identity or access control. Instead, we have chosen to continually develop our technology while keeping human reviewers in the decision-making process.

Simply stated, the demand for our technology and service comes from end users, regulators, and leading organizations that always prioritize the privacy protection of learners.

Get in touch today to learn more about our best-in-class online participation monitoring & proctoring software solution.