Search neutrality
{{Network neutrality}}
{{short description|Principle that search engines' results should be based solely on relevance}}
'''Search neutrality''' is the principle that search engines should have no editorial policies other than that their results be comprehensive, impartial and based solely on relevance.{{cite news | url=https://www.nytimes.com/2009/12/28/opinion/28raff.html | title=Search, but You May Not Find | work=The New York Times | year=2009 | access-date=March 3, 2011}} This means that when a user enters a query, the engine should return the most relevant results found in the provider's domain (those sites of which the engine has knowledge), without manipulating the order of the results (except to rank them by relevance), excluding results, or otherwise skewing the results toward a particular bias.
Search neutrality is related to network neutrality in that they both aim to keep any one organization from limiting or altering a user's access to services on the Internet. Search neutrality aims to keep the organic search results (results returned because of their relevance to the search terms, as opposed to results sponsored by advertising) of a search engine free from any manipulation, while network neutrality aims to keep those who provide and govern access to the Internet from limiting the availability of resources to access any given content.
==Background==
The term "search neutrality" in context of the internet appears as early as March 2009 in an academic paper by the Polish-American mathematician Andrew Odlyzko titled, "Network Neutrality, Search Neutrality, and the Never-ending Conflict between Efficiency and Fairness in Markets".{{cite journal|last1=Odlyzko|first1=Andrew|title=Network Neutrality, Search Neutrality, and the Never-ending Conflict between Efficiency and Fairness in Markets|journal=Review of Network Economics|date=March 2009|pages=40–60|url=https://www.bsi.umn.edu/~odlyzko/doc/rne81.pdf|access-date=4 July 2017}}{{Dead link|date=August 2023 |bot=InternetArchiveBot |fix-attempted=yes }} In this paper, Odlykzo predicts that if net neutrality were to be accepted as a legal or regulatory principle, then the questions surrounding search neutrality would be the next controversies. Indeed, in December 2009 the New York Times published an opinion letter by Foundem co-founder and lead complainant in an anti-trust complaint against Google, Adam Raff, which likely brought the term to the broader public. According to Raff in his opinion letter, search neutrality ought to be "the principle that search engines should have no editorial policies other than that their results be comprehensive, impartial and based solely on relevance". On October 11, 2009, Adam and his wife Shivaun launched SearchNeutrality.org, an initiative dedicated to promoting investigations against Google's search engine practices.{{cite web|url=http://www.searchneutrality.org/about|website=searchneutrality.org|title=About SearchNeutrality.org|access-date=4 July 2017|url-status=bot: unknown|archive-url=https://web.archive.org/web/20160804021604/http://www.searchneutrality.org/about|archive-date=4 August 2016}} There, the Raffs note that they chose to frame their issue with Google as "search neutrality" in order to benefit from the focus and interest on net neutrality.
In contrast to net neutrality, there is less consensus on answers to questions such as "what is search neutrality?" or "what are appropriate legislative or regulatory principles to protect search neutrality?". The idea that neutrality means equal treatment, regardless of the content, comes from debates on net neutrality.{{cite journal|title=Some Skepticism about Search Neutrality|journal=The Next Digital Decade: Essays on the Future of the Internet|last1=Grimmelmann|first1=James|date=17 January 2011 |pages=435–461|ssrn = 1742444}} Neutrality in search is complicated by the fact that search engines, by design and in implementation, are not intended to be neutral or impartial. Rather, search engines and other information retrieval applications are designed to collect and store information (indexing), receive a query from a user, search for and filter relevant information based on that query (searching/filtering), and then present the user with only a subset of those results, ranked from most relevant to least relevant (ranking). "Relevance" is itself a form of bias, used to favor some results and to order the results that are favored. Relevance is defined within the search engine so that users are satisfied with the results, and it is therefore subject to users' preferences. Because relevance is so subjective, putting search neutrality into practice has proven contentious.
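A minimal sketch of this indexing/filtering/ranking pipeline follows. The documents, query, and term-frequency scoring rule are invented for illustration and do not represent any real engine's algorithm; the point is that "relevance" is an engineered scoring choice rather than a neutral given.

<syntaxhighlight lang="python">
# Toy index/filter/rank pipeline. The scoring rule (summed term frequency) is an
# illustrative assumption, not any real search engine's method.
from collections import Counter

documents = {
    "site-a": "price comparison for cameras and camera lenses",
    "site-b": "camera reviews, camera news, camera tips",
    "site-c": "gardening tools and outdoor furniture",
}

# Indexing: store a term-frequency table for each document.
index = {doc_id: Counter(text.split()) for doc_id, text in documents.items()}

def search(query, top_k=2):
    """Filter documents that match the query and rank them by a toy relevance score."""
    terms = query.split()
    scores = {}
    for doc_id, counts in index.items():
        score = sum(counts[t] for t in terms)  # "relevance" := summed term frequency
        if score > 0:                          # filtering: drop non-matching documents
            scores[doc_id] = score
    # Ranking: highest-scoring documents first; only a subset is shown to the user.
    return sorted(scores.items(), key=lambda item: item[1], reverse=True)[:top_k]

print(search("camera"))  # [('site-b', 3), ('site-a', 1)]
</syntaxhighlight>

Changing the scoring rule (for example, weighting freshness, popularity, or personalization signals) changes which results a user sees, which is the sense in which every ranking embodies some form of bias.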
Search neutrality became a concern after search engines, most notably Google, were accused of search bias by other companies.{{cite journal|last1=Lao|first1=Marina|title="Neutral" Search As A Basis for Antitrust Action?|journal=Harvard Journal of Law & Technology Occasional Paper Series|date=July 2013|pages=1–12|url=http://jolt.law.harvard.edu/antitrust/articles/Lao.pdf|access-date=19 November 2014|archive-date=5 September 2015|archive-url=https://web.archive.org/web/20150905112907/http://jolt.law.harvard.edu/antitrust/articles/Lao.pdf|url-status=dead}} Competitors and other companies claim that search engines systematically favor some sites (and some kinds of sites) over others in their lists of results, disrupting the objective results users believe they are getting.{{cite journal|last1=Herman|first1=Tavani|editor1-last=Zalta|editor1-first=Edward N.|title=Search Engines and Ethics|journal=The Stanford Encyclopedia of Philosophy|date=2014|url=http://plato.stanford.edu/archives/spr2014/entries/ethics-search/|access-date=20 November 2014}}
The call for search neutrality goes beyond traditional search engines. Sites like Amazon.com and Facebook have also been accused of skewing results.{{cite web|last1=Shavin|first1=Naomi|title=Are Google and Amazon the next threat to net neutrality?|url=https://www.forbes.com/sites/naomishavin/2014/07/02/are-google-and-amazon-the-next-threat-to-net-neutrality/|website=Forbes.com|access-date=19 November 2014}} Amazon's search results are influenced by companies that pay to rank higher in its results, while Facebook has filtered its news feed to conduct social experiments.
"Vertical search" spam penalties
In order to find information on the Web, most users make use of search engines, which crawl the web, index it, and show a list of results ordered by relevance. The use of search engines to access information through the web has become a key factor for online businesses, which depend on the flow of users visiting their pages.{{cite web | url=http://www.marketingtoday.com/emarketing/0305/b2b_importance_sem.htm | title=Search Engine Marketing | publisher=Marketing Today | year=2005 | access-date=March 2, 2011 | archive-date=February 25, 2011 | archive-url=https://web.archive.org/web/20110225213515/http://www.marketingtoday.com/emarketing/0305/b2b_importance_sem.htm | url-status=dead }} One of these companies is Foundem, which provides a "vertical search" service comparing products available on online markets in the U.K. Many people see such "vertical search" sites as spam.{{cite web|title=Antitrust nemesis accuses Google of 'WMD program'|url=https://www.theregister.co.uk/2011/09/01/foundem_accuses_google_of_wmd_program_against_vertical_search_rivals/|work=The Register|publisher=The Register|access-date=2 August 2012|author=Cade Metz|date=1 September 2011}} Beginning in 2006 and for the three and a half years that followed, Foundem's traffic and business dropped significantly due to what it asserts was a penalty deliberately applied by Google.{{cite web | url=http://www.searchneutrality.org/foundem-google-story| title=Foundem's Google Story| publisher=www.searchneutrality.org| archive-url=https://web.archive.org/web/20100127070002/http://www.searchneutrality.org/foundem-google-story |archive-date=27 January 2010 | year=2009 }} It is unclear, however, whether the decline Foundem attributes to a penalty was in fact self-inflicted through its use of iframe HTML tags to embed content from other websites. At the time the penalties were allegedly imposed, it was unclear whether web crawlers would crawl beyond the main page of a website that used iframe tags without extra modifications. The former SEO director of OMD UK, Jaamit Durrani, among others, offered this alternative explanation, stating that “Two of the major issues that Foundem had in summer was content in iFrames and content requiring javascript to load – both of which I looked at in August, and they were definitely in place. Both are huge barriers to search visibility in my book. They have been fixed somewhere between then and the lifting of the supposed ‘penalty’. I don't think that's a coincidence.”{{cite web|title=Foundem vs Google redux: it was a penalty! And search neutrality is at stake, dammit!|url=https://econsultancy.com/blog/5183-foundem-vs-google-redux-it-was-a-penalty-and-search-neutrality-is-at-stake-dammit/|author=Chris Lake|date=5 January 2010|access-date=July 3, 2017|archive-date=31 July 2017|archive-url=https://web.archive.org/web/20170731093158/https://econsultancy.com/blog/5183-foundem-vs-google-redux-it-was-a-penalty-and-search-neutrality-is-at-stake-dammit|url-status=dead}}
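The crawl-visibility issue raised in this alternative explanation can be illustrated with a simplified, hypothetical sketch. The page content and text extractor below are invented for the example and do not represent Foundem's actual site or Google's crawler: a crawler that only parses the top-level HTML, without following iframes or executing JavaScript, sees almost none of the page's substantive content.

<syntaxhighlight lang="python">
# Hypothetical page whose product listings live in an iframe or are injected by JavaScript.
from html.parser import HTMLParser

page_html = """
<html><body>
  <h1>Example price-comparison page</h1>
  <iframe src="https://example.com/actual-product-listings"></iframe>
  <script>loadResultsViaJavaScript();</script>
</body></html>
"""

class TextExtractor(HTMLParser):
    """Collects only the text a non-JavaScript crawler would see in the top-level HTML."""
    def __init__(self):
        super().__init__()
        self.text = []
        self._skip = False

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "iframe"):
            self._skip = True   # embedded or scripted content is not part of this document

    def handle_endtag(self, tag):
        if tag in ("script", "iframe"):
            self._skip = False

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.text.append(data.strip())

parser = TextExtractor()
parser.feed(page_html)
print(parser.text)  # ['Example price-comparison page'] -- the listings themselves are invisible
</syntaxhighlight>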
Most of Foundem's accusations claim that Google deliberately applies penalties to other vertical search engines because they represent competition.{{cite web | url= http://googlepublicpolicy.blogspot.com/2010/02/committed-to-competing-fairly.html | title=Committed to competing fairly| year=2010 | access-date=March 2, 2011}} Foundem is backed by a Microsoft proxy group, the Initiative for a Competitive Online Marketplace (ICOMP).{{cite web|title=Texas Conducting Antitrust Review of Google|url=https://www.pcworld.com/article/204857/texas_conducting_antitrust_review_of_google.html|work=PC World|publisher=IDG Consumer and SMB|access-date=2 August 2012|author=Nancy Gohring|date=4 September 2010}}
===Foundem's case chronology===
The following table details Foundem's chronology of events as found on their website:{{cite web | url= http://www.searchneutrality.org/uncategorized/foundem-is-not-a-microsoft-puppet | title=The Chronology | publisher=www.searchneutrality.org | year=2010 | access-date=March 3, 2011}}
class="wikitable" | |
Date | Event |
---|---|
June 2006 | Foundem's Google search penalty begins. Foundem starts an arduous campaign to have the penalty lifted. |
August 2006 | Foundem's AdWord penalty begins. Foundem starts an arduous campaign to have the penalty lifted. |
August 2007 | Teleconference with Google AdWords Quality Team representative. |
September 2007 | Foundem is "whitelisted" for AdWords (i.e. Google manually grants Foundem immunity from its AdWords penalty). |
January 2009 | Foundem starts "public" campaign to raise awareness of this new breed of penalty and manual whitelisting. |
April 2009 | First meeting with ICOMP. |
October 2009 | Teleconference with Google Search Quality Team representative, beginning a detailed dialogue between Foundem and Google. |
December 2009 | Foundem is "whitelisted" for Google natural search (i.e. Google manually grants Foundem immunity from its search penalty). |
===Other cases===
Google's large market share (85%) has made it a target for search neutrality litigation via antitrust laws.{{cite web | url=http://www.searchneutrality.org/ | title=Background to EU Formal Investigation | publisher=foundem | date=November 30, 2010 | access-date=February 13, 2011}} In February 2010, after other companies had joined Foundem's cause (eJustice.fr, and Microsoft's Ciao! from Bing), also claiming to have been unfairly penalized by Google, Google published an article on its Public Policy blog expressing its concern for fair competition.
===The FTC's investigation into allegations of search bias===
After two years of looking into claims that Google “manipulated its search algorithms to harm vertical websites and unfairly promote its own competing vertical properties,” the Federal Trade Commission (FTC) voted unanimously to end the antitrust portion of its investigation without filing a formal complaint against Google.{{cite web|title=Google Agrees to Change Its Business Practices to Resolve FTC Competition Concerns In the Markets for Devices Like Smart Phones, Games and Tablets, and in Online Search|url=http://www.ftc.gov/news-events/press-releases/2013/01/google-agrees-change-its-business-practices-resolve-ftc|website=FTC.gov|publisher=Federal Trade Commission|access-date=20 November 2014|date=July 3, 2013}} The FTC concluded that Google's “practice of favoring its own content in the presentation of search results” did not violate U.S. antitrust laws. The FTC further determined that even though competitors might be negatively impacted by Google's changing algorithms, Google did not change its algorithms to hurt competitors but rather as a product improvement to benefit consumers.
==Arguments==
There are a number of arguments for and against search neutrality.
===Pros===
- Those who advocate search neutrality argue that results would be biased not towards sites with more advertising, but towards the sites most relevant to the user.{{cite web | url=http://www.concurringopinions.com/archives/2011/02/search-neutrality-as-disclosure-and-auditing.html | title=Search Neutrality as Disclosure and Auditing | publisher=Concurring Opinions | date=February 19, 2011 | access-date=March 3, 2011}}
- Search neutrality encourages sites to offer higher-quality content rather than to pay for a higher ranking in organic results.{{cn|date=December 2023}}
- Restrains search engines from supporting only their best advertisers.
- Search engines would allow traffic to sites that depend on visitors, keeping their results comprehensive, impartial, and based solely on relevance.{{cite news | url=https://www.nytimes.com/2009/12/28/opinion/28raff.html?_r=1 | title=Search, but You May Not Find | publisher=Bucknell.edu | date=December 27, 2009 | access-date=February 13, 2010}}{{cite book | last=Grimmelmann | first=James | chapter-url=http://works.bepress.com/cgi/viewcontent.cgi?article=1034&context=james_grimmelmann | title=The Next Digital Decade: Essays on the Future of the Internet | chapter=Some Skepticism About Search Neutrality | publisher=TechFreedom | year=2010}}
- Allows search results to be ordered by an organized, objective, automatic algorithm, while disallowing underhanded manipulation of rankings on an individual basis.
- Personalized search results might suppress information that disagrees with users' worldviews, isolating them in their own cultural or ideological "filter bubbles".{{cite news|first1= Lynn | last1= Parramore|title= The Filter Bubble|work= The Atlantic|quote= Since Dec. 4, 2009, Google has been personalized for everyone. So when I had two friends this spring Google "BP," one of them got a set of links that was about investment opportunities in BP. The other one got information about the oil spill....|date= October 10, 2010|url= https://www.theatlantic.com/daily-dish/archive/2010/10/the-filter-bubble/181427/|access-date= April 20, 2011}}
===Cons===
- Forcing search engines to treat all websites equally would remove the biased view of the Internet they provide, yet a biased view of the Internet is exactly what search users are seeking. By performing a search, the user seeks what that search engine perceives as the "best" result for their query. Enforced search neutrality would, essentially, remove this bias. Users continually return to a specific search engine because they find its "biased" or "subjective" results to fit their needs.
- Search neutrality has the potential to cause search engines to become stagnant. If site A is first on a SERP (search engine results page) one month and tenth the next, search neutrality advocates cry "foul play", but in reality it is often the page's loss of popularity, relevance, or quality content that has caused the move. The case against Google brought forth by the owners of Foundem illustrates this phenomenon, and regulation could limit a search engine's ability to adjust rankings based on its own metrics.
- Proponents of search neutrality desire transparency in search engines' ranking algorithms. Requiring transparent algorithms raises two concerns. First, these algorithms are the companies' private intellectual property and should not be forced into the open; this would be similar to forcing a soda manufacturer to publish its recipes. Second, opening the algorithm would allow spammers to exploit and directly target how the algorithm functions, letting them circumvent the metrics that keep spammed websites off the top of a SERP (a toy illustration of this concern follows this list).
- Removing a search engine's ability to directly manipulate rankings limits its ability to penalize dishonest websites that use black hat techniques to improve their rankings. Any site that finds a way to circumvent the algorithm would benefit from the engine's inability to manually decrease its ranking, allowing a spam site to hold a high ranking for an extended period of time.
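The exploitation concern in the transparency item above can be shown with a toy example. The scoring rule below is invented for illustration and is far simpler than any real engine's algorithm; once the exact rule is public, a spammer can target it directly.

<syntaxhighlight lang="python">
# Invented "fully disclosed" ranking rule: count occurrences of the query terms.
def disclosed_score(page_text: str, query: str) -> int:
    words = page_text.lower().split()
    return sum(words.count(term) for term in query.lower().split())

legitimate = "independent camera reviews with detailed test results"
spammed = "camera " * 50 + "buy now"   # keyword stuffing aimed at the known rule

print(disclosed_score(legitimate, "camera reviews"))  # 2
print(disclosed_score(spammed, "camera reviews"))     # 50 -- the spam page outranks it
</syntaxhighlight>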
==Related issues==
According to the Net Neutrality Institute, as of 2018 Google's "Universal Search" system{{cite web|title=Google 2.0: Google Universal Search|url=http://searchengineland.com/google-20-google-universal-search-11232|work=Search Engine Land|publisher=Third Door Media, Inc|access-date=2 August 2012|author=Danny Sullivan|date=16 May 2007}} uses by far the least neutral search engine practices; following the implementation of Universal Search, websites such as MapQuest experienced a massive decline in web traffic. This decline has been attributed to Google linking to its own services rather than to services offered by external websites.{{cite web | url=http://www.google.com/intl/en/press/pressrel/universalsearch_20070516.html | title=Google Begins Move to Universal Search | date=May 17, 2007| access-date=February 13, 2011}}{{cite web | url=http://searchengineland.com/google-maps-gaining-on-market-leader-mapquest-13103 | title=Google Maps Gaining On Market Leader Mapquest | publisher=Search Engine Land | date=January 10, 2008 | access-date=February 13, 2011}} Despite these claims, Microsoft's Bing displays Microsoft content in first place more than twice as often as Google shows Google content in first place, indicating that, insofar as there is any "bias", Google is less biased than its principal competitor.{{cite web|title=Defining and Measuring Search Bias: Some Preliminary Evidence|url=http://www.laweconcenter.org/images/articles/definingmeasuring.pdf|publisher=International Center for Law and Economics|access-date=2 August 2012|author=Joshua D. Wright|date=3 November 2011}}
==References==
{{Reflist}}
{{DEFAULTSORT:Search Neutrality}}