Type a search term into Google’s ubiquitous blue box, and there’s a good chance you’ll reflexively scroll down to the autocomplete suggestions. While you may not think much about those suggestions, you probably think even less about the ones you never see.

Here’s why you should start: For about half a decade, bisexual advocates have been fighting to get Google Inc. (NASDAQ:GOOG) to untangle the word “bisexual” from its complex predictive algorithm. Currently, if you begin typing that word into a search box, the autocomplete suggestions disappear, just as they would for any number of offensive slurs or four-letter expletives. This is not the case for other LGBT terms -- “transgender,” for instance, or “lesbian” -- which produce an array of relevant recommendations with no apparent restrictions. BiNet USA, the country’s oldest bisexual rights organization, says that longstanding disparity is indicative of mainstream attitudes toward the bisexual community, which often struggles for recognition and acceptance even among its LGBT peers. “What it does is invisibilize our community,” Faith Cheltenham, the group’s president, told International Business Times.

This may seem, at first glance, like a trivial complaint. Autocomplete is not a necessity, after all, and no one is prohibited from searching for -- and finding -- bisexual-related content via Google. But Cheltenham said filtering the word on autocomplete has diminished users’ ability to find accurate information about bisexuality and bisexual advocacy groups. Consider that autocomplete reflex, and it’s easy to imagine what you miss when some suggestions are never presented. And that’s a serious problem, Cheltenham said, for an already-marginalized community dogged by misconceptions and misinformation.

“It’s a great loss of page rank for all of our organizations,” Cheltenham said. “It just makes it really hard for bisexual people to find relevant resources, and they need relevant resources. I have links to people who have said, ‘I was thinking about killing myself today, and I went on Google and couldn’t find anything. And so I’m going to die.’”


It was in 2009, Cheltenham said, that she first contacted Google’s help desk to make the company aware of the issue. Google responded that it was a bug and promised a fix. “Sometimes perfectly good search terms can trip up our algorithms that decide whether to show instant results,” a Google spokesperson said at the time. “This can happen when our automatic filters detect a strong correlation on the (unfiltered) Internet between those terms and pornography.”

Months went by with no change until finally, in September 2012, Cheltenham announced that Google had lifted the ban. But the problem wasn’t entirely solved. Today certain suggestions do appear for the word “bisexual,” but only after you type the word in full and begin a second term to follow it (“bisexual organizations,” for instance). For the most part, typing the term on its own still produces zero suggestions -- almost two years after Google supposedly lifted the ban.

Sometime last summer, Cheltenham got tired of waiting. She teamed up with Sarah Prager, an activist and entrepreneur who developed a mobile app called Quist, which shares information about LGBT history. Prager is no stranger to bisexual censorship. When she first launched her app on Apple’s App Store, she received a message stating that one word in her product description had been flagged as inappropriate -- which could result in the app’s removal. That word was “bisexual.” In short, Prager was told she couldn’t use the “B” to describe an LGBT app. Confounded and angry, Prager launched a petition on the website Change.org, urging Apple Inc. (NASDAQ:AAPL) to lift the ban. The company acted swiftly. “Within 24 hours we had over 1,000 signatures, and Apple called me and said the issue had been corrected,” Prager said in a phone interview.


When Cheltenham told Prager that she was having the same problem with Google, the pair decided that a similar petition might get Google’s attention. Prager launched one in August 2013. The petition grew slowly but steadily for several months, until last week, when it was shared on a Tumblr page and went “sort of viral,” Prager said. At last check, it had just reached 10,000 signatures.

The Machines Have Already Won

Unfortunately, the fix Cheltenham and Prager are hoping for may not be coming anytime soon. A spokesperson for Google told IBTimes that the company is well aware of the issue and has been working to address it, but it’s not simply a matter of flipping a switch and unblocking a single term. Google’s autocomplete suggestions are served up via a complex algorithm, one that must make heads or tails of billions of search terms every day. Along the way, it is programmed to filter out terms and phrases that might be deemed inappropriate or might be associated with inappropriate searches.

Google doesn’t share detailed information about the secret sauce of its algorithms, but it says it can’t promise that a particular term will always produce predictive suggestions. That means, at least to some extent, we are at the mercy of the machine. “These predicted searches are produced automatically based on a number of factors including the popularity of search terms,” the spokesperson said.
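To make the mechanics concrete: Google’s actual system is proprietary, but the behavior the article describes -- no suggestions while the word itself is being typed, suggestions reappearing once a second term begins -- is what you would see from a suggestion engine that suppresses output whenever the typed prefix matches a denylisted term. The following is a toy sketch under that assumption; the query list and denylist are hypothetical.

```python
# Illustrative toy model of a denylist-filtered autocomplete engine.
# Not Google's algorithm -- the queries and denylist are hypothetical.
POPULAR_QUERIES = [
    "bisexual organizations",
    "bisexual support groups",
    "transgender rights",
    "lesbian history",
]
DENYLIST = {"bisexual"}  # hypothetical filtered term


def suggest(prefix: str) -> list[str]:
    """Return autocomplete suggestions for a prefix, but show nothing
    while the prefix is still a partial (or complete) denylisted word
    on its own -- mirroring the behavior described in the article."""
    p = prefix.lower().strip()
    if not p:
        return []
    if any(term.startswith(p) for term in DENYLIST):
        return []  # suppress all suggestions mid-word
    return [q for q in POPULAR_QUERIES if q.startswith(p)]
```

Under this model, `suggest("bisexual")` returns an empty list, while `suggest("bisexual o")` returns matching queries again -- the filter only releases its grip once the user types past the blocked word, which is exactly the two-years-later behavior Cheltenham describes.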

Cheltenham said that’s not good enough. She said despite Google’s image as an LGBT-friendly company, it has been largely uncooperative in working with the bisexual community in the same way that it does with gay and lesbian groups. “The reality is they haven’t had a relationship with the bi community,” Cheltenham said. “Every suggestion or ask has been turned down.”


Given the pervasiveness of autocomplete, it’s somewhat surprising to discover just how divisive the feature can be. Many disagreements over Google’s autocomplete suggestions have wound up in court. A Japanese man claimed the feature associated him with crimes he didn’t commit. The former first lady of Germany sued Google because autocomplete linked her name to prostitution. An Italian businessman sued -- and won -- because he believed autocomplete was calling him a con man. The list goes on.

Jeff Hermes, director of the Digital Media Law Project at Harvard University, who has followed those lawsuits, said he believes similar autocomplete court battles would be “doomed to failure” in the United States, where Google has strong First Amendment protections on its side. “Fundamentally, it’s Google’s search engine and they can do what they want with it,” he told IBTimes. “They can impose whatever filter they want on particular results, so a legal claim there is unlikely to be successful.”

Nevertheless, Hermes said, it does raise concerns when a complaint involves filtering that is aimed not at making autocomplete more useful but at preventing certain kinds of information from being presented. He called BiNet USA’s criticism a “totally valid point,” one compounded by Google’s power and prevalence in our lives. “Just because there isn’t a legal responsibility, doesn’t mean there isn’t a social or moral responsibility,” he said.

Prager, who has been watching her petition closely as it has gained supporters over the last few days, said that moral responsibility goes deeper than a throttled search request. It extends, she said, to a broader society in which bisexual individuals don’t fit the simpler narrative often preferred by gays and straights alike. If that’s going to change, it’s going to start with corrective information -- but that means we’ll need to be able to find it. “The whole rhetoric of the mainstream culture already makes bisexuality invisible,” she said. “Efforts to become more visible should at least be given a fair chance to find more readers.”

Got a news tip? Email me. Follow me on Twitter @christopherzara.