A study conducted by Cornell University and University of Washington professors and researchers concludes that filters used in dating apps reinforce societally influenced biases and division.

Cornell professors Solon Barocas and Karen Levy, Cornell research initiative coordinator Jessie Taft, and UW law scholar Jevan Hutson drew on theories of "queer and social justice" as they pertain to human-computer interaction (HCI) to analyze the ways in which race, gender, and disability filters on dating apps, or "intimate platforms," reinforce stereotypes, biases, and social divisions.

The study proposed that changes must be made both in the design and marketing of dating apps to diminish user bias and discrimination and foster "social justice and equality."

Barocas, Levy, Taft, and Hutson suggest that romantic and sexual choices are products of our "cultural environments." Resistance to this notion, the study claims, "rests on a fundamental misapprehension about both the nature of desire and the degree to which platforms can avoid exercising influence over our preferences in partners."

The researchers highlight three strategies that dating services can implement in their designs to address the impact their practices have on "intimate discrimination."

The first strategy involves tweaking filter, search, and sort tools. The researchers suggest that instead of letting users filter for what they believe they want, services can remove features that promote stereotypes and intentionally surface diverse results in users' searches.

Barocas, Levy, Taft, and Hutson also suggest implementing search tools that filter by criteria that don't reinforce social biases, such as political views, relationship history, education, and smoking and drinking preferences.

The second strategy concerns search algorithms. The researchers recommend that "instead of using algorithms that choose the 'safest' possible outcome, matches could be calculated with significantly more enthusiasm for diversity."

Barocas, Levy, Taft, and Hutson also recommend implementing community guidelines that warn against discriminatory and disrespectful behavior. Services can do this, the researchers say, by keeping users informed about the biases certain groups face and encouraging users to re-evaluate their own biases.

Lastly, the study's authors contend that services cannot remain neutral, as they either allow or disallow certain information to be used in their filters: “Claims of neutrality from platforms ignore the inevitability of their role in shaping interpersonal interactions that can lead to systemic disadvantage."

The study also listed three additional suggestions for how dating apps should intervene in the user experience.

“A platform may view users’ preferences as their own, while still rejecting the idea that users from minority groups must face denigrating messages as a condition of navigating the platform,” Barocas, Levy, Taft, and Hutson write.

They advise dating apps to bear in mind the negative effects that interventions in intimate life have had in the past, referencing "anti-miscegenation laws, anti-sodomy laws, conversion therapy regimes, or forced sterilizations of persons with disabilities."

Discussing when it is and is not appropriate to intervene in such intimate matters, the researchers suggest it would be imprudent for designers to forbid Jewish users from seeking to match only with other Jewish users, or to bar gay users from searching for partners of the same sex and sexual orientation.

Follow the author of this article on Twitter @knelson1776