Humans aren't always great at identifying a person's gender based on visual cues, and a new study suggests that computers may be even worse.

A group of researchers from the University of Colorado Boulder tested the four biggest commercially available facial recognition software providers for potential race and gender biases.

The researchers found the systems misclassified trans men up to 38 percent of the time and had no options for nonbinary people, meaning they were misclassified 100 percent of the time by default.


Researchers from the University of Colorado Boulder used 2,450 images to test the accuracy of the four largest facial recognition software providers

'These systems don't know any other language but male or female, so for many gender identities it is not possible for them to be correct,' researcher Jed Brubaker told CU Boulder Today.

The facial recognition software was much more accurate when evaluating cisgender people, accurately identifying cisgender women 98.3 percent of the time and cisgender men 97.6 percent of the time.

The study was based on 2,450 images of faces collected from Instagram, each of which had a self-reported gender identity indicated by the poster in the form of a hashtag.

The images were then split into seven groups according to those hashtags: #woman, #man, #transwoman, #transman, #agender, #genderqueer and #nonbinary.

Facial recognition software correctly identifies cisgender women 98.3 percent of the time

Most major facial recognition software firms do not even acknowledge nonbinary gender identities, meaning nonbinary people are misclassified 100 percent of the time

Those images were then sent to each of the four largest facial recognition software providers: Amazon, IBM, Microsoft, and Clarifai.

One of the most common uses of facial recognition is location-specific advertising for goods and services that are conventionally promoted along gendered lines.

'When you walk down the street you might look at someone and presume that you know what their gender is, but that is a really quaint idea from the '90s and it is not what the world is like anymore,' said Brubaker.

'As our vision and our cultural understanding of what gender is has evolved, the algorithms driving our technological future have not.'

Advertisers are eager to use facial recognition software to draw people into nearby shops based on gender presumptions

Earlier this year, Google attempted to compensate for shortcomings in its own facial recognition software by sending out street teams to collect data from people of color.

The program drew criticism when it was found that the teams were misrepresenting the project, telling people they were playing a selfie game rather than disclosing that their personal data was being added to a database.

The company was also criticized for specifically targeting homeless people.