Google is sometimes regarded as the 'perfect' search engine. It is certainly the most popular one on the web, largely because of the perception that it indexes the world's knowledge in a straightforward, accurate and unbiased way.

However, the algorithms and processes that make up Google's search results are designed and built by humans - who have their own prejudices, experiences and biases.

This issue was discussed in a February TEDx talk by Swedish journalist and lecturer Andreas Ekström, titled 'The myth of the unbiased search result'.

In the talk, Ekström mentions two separate incidents of 'Googlebombing' - the practice of manipulating Google's algorithms to get a piece of content to the top of the search results for a given topic, usually with an agenda in mind.

The first he mentions occurred in 2009, shortly after Barack Obama took office. A group started a racist campaign to push a distorted image of First Lady Michelle Obama, doctored to make her look like a monkey, to the top of the image search results for 'Michelle Obama'.

By captioning and tagging the picture with 'Michelle Obama', renaming its file to match, and publishing it on a number of blogs and social media sites, the group tricked Google's algorithm into treating it as a popular and accurate image, pushing it to the top.
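To see why this kind of metadata-stuffing works, consider a deliberately naive toy ranker. This is an illustrative sketch only: the field names, scoring rule and weights below are assumptions for the sake of the example, not Google's actual algorithm, which is far more sophisticated and not public.

```python
# Toy sketch of a metadata-based image ranker, to illustrate how a
# coordinated campaign can game text signals. Purely hypothetical:
# the fields and weights are invented for this example.

def relevance_score(image, query):
    """Score an image for a query using only text metadata signals."""
    query = query.lower()
    score = 0.0
    if query in image["filename"].lower():
        score += 1.0          # filename matches the query
    if query in image["caption"].lower():
        score += 1.0          # caption matches the query
    # one point per tag containing the query
    score += sum(1.0 for tag in image["tags"] if query in tag.lower())
    # 'popularity': pages linking to the image boost its score
    score += 0.1 * image["inbound_links"]
    return score

# A campaign stuffs every metadata field with the target query and links
# to the image from many blogs, inflating all the signals at once.
genuine = {"filename": "portrait.jpg", "caption": "official portrait",
           "tags": ["portrait"], "inbound_links": 5}
gamed = {"filename": "michelle obama.jpg", "caption": "michelle obama",
         "tags": ["michelle obama", "michelle obama photo"],
         "inbound_links": 40}

print(relevance_score(genuine, "michelle obama"))  # 0.5
print(relevance_score(gamed, "michelle obama"))    # 8.0
```

Under this toy rule the manipulated image easily outranks the genuine one, which is the essence of a Googlebomb: every signal the ranker trusts can be fabricated at scale.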

As a result, for a few weeks in 2009, this racist image appeared at the top of the Google search results for 'Michelle Obama'.

The campaign then ended and the picture would have drifted away from the top spot over time, but Google made the decision to step in, changing their settings and manually removing the inaccurate and racist image from their results.

A couple of years later, a similar thing happened in the wake of an attack in Norway in which far-right terrorist Anders Behring Breivik murdered 77 people in Oslo and on the island of Utøya in separate bomb and gun attacks.

It was the deadliest attack in Norway since the Second World War, and people quickly noticed that Breivik had left a carefully laid digital trail: blog posts and emailed manifestos that laid out his racist and Islamophobic views.

In pictures: Concept illustrations of the memorial to Norway's Utøya victims, artwork by Swedish artist Jonas Dahlberg, winner of the '22 July Memorial sites' competition

In response to Breivik's efforts to spread his message online, a Swedish developer and search engine expert named Nikke Lindqvist launched a campaign, urging people to upload pictures of dog dirt to blogs and social media and tag them with Breivik's name.

Like the Obama campaign, it worked - and in the weeks following the attack, those Googling Breivik were confronted with hundreds of pictures of dog dirt.

Although different in motivation, the two campaigns worked in exactly the same way - but in the second, Google didn't step in, and the inaccurate Breivik images stayed at the top of the search results for much longer.

Few would argue that Google was wrong to end the Obama campaign or to let the Breivik one run its course, but the two incidents shed light on the fact that behind a multi-billion dollar tech company as large and faceless as Google, there are people deciding what we see when we search.

And at a time when Google has such a poor record on gender and ethnic diversity, and other companies struggle to address the imbalance (as IBM did when it attempted to get women into tech by encouraging them to 'Hack a Hairdryer'), this fact becomes more pressing.

Out of all of Google's technical staff worldwide, only 18 per cent are women. When looking at ethnicities, two per cent of tech staff are Hispanic, and one per cent are black.

In light of Ekström's talk, many have questioned whether Google's practice of drawing the people who build and design its algorithms from a small, homogeneous pool could lead to unperceived biases, simply because the range of experiences and points of view at Google is narrower than it could be.

Speaking to The Independent, Ekström said: "It seems very obvious to anyone in tech I've spoken to that a diverse group will be more suited to sustainably solving a problem than a group of clones."

"We've come a long way in a short time when it comes to this issue, and I think we've got to the point where nobody really questions anymore the idea that your background and your experience is so profound that you will approach your professional challenges differently depending on your experience of life so far."

Google does sometimes step in to fix controversial search results, as it did after an incident with Google Maps this year, in which a search for a racist epithet took users to the White House.

Google's algorithm curates what its creators think we want to see - by increasing diversity amongst its staff, Google could more accurately provide its users with what they want, and possibly stop manipulation like this from happening in the first place.

This isn't necessarily Google's fault, however - it needs to hire the best people for the job, and those studying IT-related courses at top universities are largely a fairly homogeneous group.

"Who goes to the most prestigious colleges? Who is deciding to major in what subject 15 or 20 years ago? It's going to take some time, but Google's made some incredible progress in this regard in the last decade," Ekström said.

In his view, fixing this problem needs a serious long-term approach from the company's HR department, but also a greater understanding of the difference between experience and skill.

"If we decide that your programming abilities are the most important thing then great, but what other qualities do you need? What other skillsets are required?" he says.

In his view, Google needs to ask what kinds of experience it should be adding to its teams.

Google has taken big steps to increase workplace diversity - but it's still got a long way to go (Getty)

"When you do that, it's easier next time around to say 'We're going to recruit a woman,' or 'We're going to recruit a person of colour', because the group we have at the moment is fairly similar and we need more points of view."

When Google looks at a group that is lacking a certain skillset, he said, it should look to strengthen that group by bringing in different experiences and making it more diverse.

Google, and every other tech company, is already taking steps to do this in an effort to serve its users better.