A video that shows an automatic bathroom soap dispenser failing to detect the hand of a dark-skinned man has gone viral and raised questions about racism in technology, as well as the lack of diversity in the industry that creates it.

The now-viral video was uploaded to Twitter on Wednesday by Chukwuemeka Afigbo, Facebook's head of platform partnerships in the Middle East and Africa.

He tweeted: 'If you have ever had a problem grasping the importance of diversity in tech and its impact on society, watch this video.'



WHY IT ONLY WORKS FOR WHITE HANDS: The soap dispenser likely uses an infrared sensor to detect a hand and release soap. These sensors have a history of failing to detect darker skin tones because of the way they are designed. They send out invisible light from an infrared LED bulb and work when a hand reflects light back to the sensor. Darker skin can absorb the light rather than bounce it back, which means no soap will be released.

The video begins with a white man waving his hand under the dispenser and instantly getting soap on his first try.

Then, a darker skinned man waves his hand under the dispenser in various directions for over ten seconds, with soap never being released.

It's unclear if this is Afigbo himself.

To demonstrate that skin color is the reason, he then waves a white paper towel under the dispenser and is instantly granted soap.

The tweet has been shared more than 93,000 times, and the video has more than 1.86 million views.

The tweet also spurred over 1,800 comments, many of which are citing this as just another example of lack of diversity in tech.

'The point is these technical issues resulted from a lack of concern for all end users. If someone cared about the end users, would've used,' commented one user

Others, however, argued that this take on the situation goes too far and that the soap dispenser's ability to function has nothing to do with race or diversity.

Many tied it into the current racial tensions in the US, writing this isn't 'a society problem' and people are 'just looking for a reason to fight.'

Throughout the comments, the two sides debated.

The soap dispenser appears to be this one from Shenzhen Yuekun Technology, a Chinese manufacturer.

It retails for as low as $15 each when purchased in bulk and is advertised as a 'touchless' disinfectant dispenser.

'So many people justifying this and showcasing just how deeply embedded racism is. Y'all think it's a *just* a tech prob. PEOPLE CREATE TECH,' commented another

'it's not that this exact thing is the problem, it's that a million tiny things like this exist. and that having more poc in dev would solve,' wrote a user

A user shared a photo depicting another scenario in which technology failed to detect darker skin, writing 'reminds me of this failed beta test'

HOW A ROBOT BECAME RACIST: Princeton University conducted a word association task with the algorithm GloVe, an unsupervised AI that uses online text to understand human language. The team gave the AI words like 'flowers' and 'insects' to pair with other words defined as 'pleasant' or 'unpleasant', like 'family' or 'crash' - which it did successfully. The algorithm was then given a list of white-sounding names, like Emily and Matt, and black-sounding ones, such as Ebony and Jamal, and prompted to do the same word association. The AI linked the white-sounding names with 'pleasant' and the black-sounding names with 'unpleasant'. The findings suggest the datasets used to train AI are polluted with prejudices and assumptions, and these technologies are adopting those ideas.
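The word-association test described in the box can be sketched with toy vectors. The embeddings below are invented purely for illustration; the actual Princeton study used real pretrained GloVe vectors trained on billions of words of online text:

```python
import math

# Toy 3-dimensional "embeddings" invented for illustration only --
# the real study used pretrained GloVe vectors, not these made-up numbers.
vectors = {
    "flowers":    [0.9, 0.1, 0.0],
    "insects":    [0.1, 0.9, 0.0],
    "pleasant":   [0.8, 0.2, 0.1],
    "unpleasant": [0.2, 0.8, 0.1],
}

def cosine(a, b):
    """Cosine similarity: how closely two vectors point the same way."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def association(word):
    """Positive = word sits closer to 'pleasant'; negative = closer to 'unpleasant'."""
    return (cosine(vectors[word], vectors["pleasant"])
            - cosine(vectors[word], vectors["unpleasant"]))

print(association("flowers"))  # positive in this toy setup
print(association("insects"))  # negative in this toy setup
```

In the study, the same kind of similarity comparison was run with lists of names instead of 'flowers' and 'insects', and the learned embeddings placed white-sounding names nearer the 'pleasant' words.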

DailyMail.com has reached out to the manufacturer for comment.

According to the product's specs, it uses an infrared sensor to detect a hand and release soap.

No manufacturers of infrared sensors were available for comment, but it's known these sensors have a history of failing to detect darker skin tones because of the way they are designed.

These types of sensors function by measuring infrared (IR) light radiating from objects in their field of view.

Essentially, the soap dispenser sends out invisible light from an infrared LED bulb and works when a hand reflects light back to the sensor.

Darker skin can cause the light to absorb rather than bounce back, which means no soap will be released.
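The failure mode described above boils down to a simple threshold check. This is an illustrative sketch, not the dispenser's actual firmware, and every number in it is a made-up assumption:

```python
# Illustrative sketch of a threshold-based IR proximity trigger --
# NOT the dispenser's real firmware. All values are invented assumptions.

EMITTED_IR = 100.0        # arbitrary units of IR light sent out by the LED
TRIGGER_THRESHOLD = 25.0  # minimum reflected light needed to dispense soap

def dispenses(reflectance: float) -> bool:
    """reflectance: fraction (0..1) of IR light the surface bounces back."""
    reflected = EMITTED_IR * reflectance
    return reflected >= TRIGGER_THRESHOLD

# Hypothetical reflectance values chosen only to show the failure mode:
print(dispenses(0.60))  # highly reflective surface (e.g. white paper towel)
print(dispenses(0.15))  # surface that absorbs most of the IR light
```

A surface that absorbs most of the emitted light never pushes the reflected signal over the threshold, so the trigger condition is never met.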

'If the reflective object actually absorbs that light instead, then the sensor will never trigger because not enough light gets to it,' Richard Whitney, VP of product at Particle, told Mic in 2015 in reference to another viral video of a 'racist soap dispenser.'

Other types of technology have been called racist as well, including artificial intelligence.


When one commenter said 'it's not racism,' another replied, 'The point was, if they had hired a POC the tech would have been designed correctly in the first place'

When one user called those arguing this is related to the lack of diversity in tech 'naive,' another pointed out how this is a known problem that is true for facial recognition software as well

In many cases in which technology doesn't work for dark skin, it's because it wasn't designed with the need to detect darker skin tones in mind.

Such was the case with the world's first beauty contest judged by AI, in which the computer program didn't choose a single person of color among the nearly 50 winners.

The company admitted to the Observer: 'the quality control system that we built might have excluded several images where the background and the color of the face did not facilitate for proper analysis.'

Earlier this year, an artificial intelligence tool that has revolutionized the ability of computers to interpret language was shown to exhibit racial and gender biases.

Joanna Bryson, a computer scientist at the University of Bath and a co-author of the research, told The Guardian that AI has the potential to reinforce existing biases.

'A lot of people are saying this is showing that AI is prejudiced,' she said.

'No. This is showing we’re prejudiced and that AI is learning it.'

The tech industry overall has been under fire for discriminatory practices regarding gender and race throughout 2017.