Google returning a page about “abortion” to a search for “murder” is valid.
The creator of the webpage used keywords like "murder" in their article, which reflects their own biased view of abortion. Google's search engine simply connected those keywords with the search and returned a webpage that contained the keyword "murder" in the results.
Google bombing, the practice of heavily linking unrelated topics, is what created the connection between abortion and murder; the cited examples from 2007 and 2011 linked the two. I agree that it is hard to see how murder links to abortion, since the two involve very different intentions.
Google is its own company and is allowed to curate its search results however it deems fit through its own algorithm. Its algorithm is precise and returns exactly what it is designed to return.
Google's algorithm draws on information it obtains from other sites. Because Google is foremost a search engine, it is not at fault for falsely linked information simply because its algorithm displays the most-searched websites.
Since Google returning the Wikipedia page on "abortion" for searches of the word "murder" was probably due to Google bombing (being fed false data), we can't say the result was valid. After all, it was based on incorrect data.
Google may have other intentions... Google bombing is known for heavily linking two very different topics. Nothing is wrong with the system; the people doing the linking may just be biased.
Google's algorithm assigns each webpage a relevance score based on how long the webpage has existed and the number of other webpages linking to the page in question. Because abortion is such a popular topic, Google returned a website with a high score, which was not an error.
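The scoring this comment describes can be sketched in a few lines. This is a toy illustration of the idea (page longevity plus inbound-link count), not Google's actual ranking algorithm; the page names, numbers, and weights are all hypothetical.

```python
# Toy relevance scoring: rank a page by its age and inbound-link count.
# A simplified sketch of the comment's description, not Google's real
# ranking; all weights and figures below are invented for illustration.

def relevance_score(age_days, inbound_links, age_weight=0.3, link_weight=0.7):
    """Combine page longevity and inbound links into a single score."""
    return age_weight * age_days + link_weight * inbound_links

pages = {
    "wikipedia.org/wiki/Abortion": relevance_score(age_days=6000, inbound_links=50000),
    "new-blog.example/post": relevance_score(age_days=30, inbound_links=12),
}

# The long-lived, heavily linked page wins regardless of query context.
best = max(pages, key=pages.get)
```

Note that nothing in this score looks at the query itself, which is one way a popular page can surface for a loosely related search.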
Google's results are based on successful, long-running algorithms trained on large datasets. These algorithms produce valid results that successfully increase clicks, even if people do not believe them.
The use of Google bombing makes the result invalid: it does not come from accurate data or searches; instead, it is an intentional attempt to "beat the system."
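The mechanics of a Google bomb can be sketched with a toy anchor-text index: a page rises for a query word when many other pages link to it using that word as the link text, so a coordinated campaign only needs to plant links. The link data below is hypothetical, not a real crawl.

```python
# Toy anchor-text index: a page ranks for a query word when many pages
# link to it with that word as the link text -- the mechanism behind
# classic "Google bombing." All link data here is made up.
from collections import defaultdict

# (anchor_text, target_page) pairs from a hypothetical crawl; the
# repeated entries represent a coordinated linking campaign.
links = [
    ("murder", "wikipedia.org/wiki/Abortion"),
    ("murder", "wikipedia.org/wiki/Abortion"),
    ("murder", "wikipedia.org/wiki/Abortion"),
    ("murder", "crime-statistics.example"),
    ("abortion", "wikipedia.org/wiki/Abortion"),
]

anchor_index = defaultdict(lambda: defaultdict(int))
for text, target in links:
    anchor_index[text][target] += 1

# The campaign's target now tops the results for the chosen anchor word.
top_for_murder = max(anchor_index["murder"], key=anchor_index["murder"].get)
```

The index itself behaves exactly as designed; it is the planted input that produces the skewed result.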
Google's ranking algorithm may be misinterpreting the context of how abortion and murder are related or their algorithm may have been deliberately manipulated to show this result.
We are combining a word associated with terrorism, "bombing," with a relation between two words, "murder" and "abortion." To some people they share a valid relation. By calling the people who endorse this word association "bombers," this article is no better than Google at acting unbiased.
Google associating abortion with murder goes against the definition and surrounding context of the word and adds a connotation that is not necessarily agreed upon by all parties.
Multiple languages complicate the algorithm, since "abortion" might translate to a synonym of "murder" in another language; regardless, the algorithm should be smart enough to understand the context of a word.
This is an error because the system was fed false data about the correlation of the words "murder" and "abortion." The words were not being used in comparison to each other. (njb317)
By measuring only the frequency with which the word "abortion" was used on the Wikipedia page for abortion, Google's algorithm ignored the context in which the word was used. Just because the word "murder" was mentioned does not mean the page equated it with abortion, making the search result misleading.
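The context problem described above is easy to reproduce with a toy frequency ranker: it counts how often the query word appears, with no sense of how it is used. The snippet text below is invented for illustration.

```python
# Toy term-frequency matching: a page is "relevant" to a query if the
# query word appears often, regardless of context. The page text below
# is a hypothetical snippet, not the actual Wikipedia article.
from collections import Counter

page_text = (
    "Opponents argue abortion is murder; supporters reply that "
    "abortion is not murder under the law. The murder statute differs."
)

def tf_score(query, text):
    """Count raw occurrences of the query word (context is ignored)."""
    words = [w.strip(".,;").lower() for w in text.split()]
    return Counter(words)[query.lower()]

# "murder" appears three times, so a pure frequency ranker would surface
# this abortion page for the query "murder" -- even though the page only
# debates the equation rather than asserting it.
score = tf_score("murder", page_text)
```

A ranker like this cannot distinguish "abortion is murder" from "abortion is not murder," which is exactly the misleading behavior the comment points out.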
If that result is the consequence of users manipulating Google's PageRank algorithm for political purposes, Google has a responsibility to counter those efforts.
An algorithm is not invalid just because it is not programmed to filter biases and opinions. The algorithm is using associations between search queries; searches for abortion and murder may be highly correlated.
"Wikipedia article for ‘Abortion’ as the 2nd most relevant result. Believed to have been planned and executed by a group of anti-abortion protesters, this bomb was designed to make a political statement surrounding the abortion debate in 2012 Presidential Election."
What if Google's algorithm just decided that abortion was relevant to murder because it saw a lot of arguments about whether or not abortion is murder, and concluded that they were related subjects?
It is possible for Google to create a bias-free platform because it is a huge company and almost everyone with access to the internet uses its search engine.
However, it is Google's responsibility to help create a bias-free platform (con). We saw this problem in the latest presidential election. The platform should be even-handed for all topics and controversies.
Google is such a large company, with hundreds of sections of regulation, that one would think its algorithms would at least not prioritize certain related sources over others.
It is hard for Google to constantly monitor everything that comes up in its linked sources. Because its algorithms already involve so much filtering, the word "murder" is simply a term commonly linked to abortion.
Google's list gives results based on potential clicks; therefore, the result is valid because a significant number of clicks occur when this result comes up.
The algorithm Google presented worked perfectly; it was the users' input that caused the algorithm to pull up abortion pages when the term "murder" is searched.
Abortion coming up from a query for "murder" is a valid result. User activity and posts about the two topics cause them to be linked. Although it's an uncomfortable topic, the algorithm used by Google is simply learning from user activity.
The users are the ones who promote a "disturbing Google response." The algorithm, however, will never lead to an error; it simply counts the number of outside links to the website.
The Google algorithm makes the best guess for a search based on the information it is given by users. If it makes a racist guess, it is the users' fault for giving it racist data.