Why are the algorithms used by mainstream search engines becoming a privacy threat? Do algorithms “filter” the Internet based on user patterns?
The story so far: Algorithms play a crucial role for search engines as they process millions of web searches every day. With the ever-increasing amount of information available on the Internet, search algorithms are becoming increasingly complex, raising privacy and other issues and attracting the attention of regulators. Last month, the UK’s digital watchdog said it would take a closer look at algorithms, seeking opinions on the benefits and risks of how sites and apps use algorithms, as well as contributions on auditing algorithms, the current landscape and the role of regulators.
How do search algorithms work?
An algorithm, essentially, is a series of instructions. It can be used to perform a calculation, find answers to a question, or solve a problem. Search engines use a number of algorithms to perform different functions before displaying results relevant to an individual’s search query.
Tech giant Alphabet Inc.’s Google, whose flagship product is the Google search engine, is the dominant player in the search market. Its search engine delivers results to consumers using its ranking systems, which are composed of a large set of algorithms that sort web pages in its search index to surface the most appropriate results in record time. Its search algorithms consider multiple factors, including the words and phrases in a user’s query, the relevance and usability of pages, the expertise of sources, and the user’s location and settings, according to the firm.
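As a toy illustration only (the weights, field names, and scoring formula below are invented for this sketch; real ranking systems are vastly more complex and proprietary), a multi-factor scoring function combining query-term matches, a source-quality signal, and locality might look like:

```python
# Hypothetical multi-factor ranking sketch. The weights (0.6 / 0.3 / 0.1)
# and page fields are assumptions, not any real search engine's design.

def score(page, query_terms, user_location):
    # Term match: fraction of query terms appearing in the page text
    words = page["text"].lower().split()
    term_match = sum(t in words for t in query_terms) / len(query_terms)
    # Locality signal: boost pages from the user's region
    locality = 1.0 if page["region"] == user_location else 0.5
    # Weighted combination with a source-quality score
    return 0.6 * term_match + 0.3 * page["quality"] + 0.1 * locality

pages = [
    {"text": "pizza history", "quality": 0.4, "region": "IN"},
    {"text": "best pizza recipes", "quality": 0.9, "region": "US"},
]
ranked = sorted(pages, key=lambda p: score(p, ["pizza", "recipes"], "US"),
                reverse=True)
print(ranked[0]["text"])  # the page matching both query terms ranks first
```

Even this tiny example shows why two users can see different orderings: changing `user_location` alone shifts the scores.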
While Google captures a large share of the general search market, there are alternative search engines such as Microsoft Bing and DuckDuckGo available to users. The latter, a privacy-focused search engine, says it does not collect or share users’ personal information.
In January, market leader Google generated 61.4% of all core search queries in the United States, according to data firm Statista. During the same period, Microsoft sites handled a quarter of all search queries in the United States.
As the algorithms used to deliver results vary from search engine to search engine, the same query can return different results on different engines. Moreover, different users would rarely see identical results, even when searching for the same thing, because the algorithms take into account several factors, such as the user’s location.
How are they developed?
Algorithms are often built from historical data and for specific functions. Once developed, they undergo frequent updates from their developers to improve the quality of search results presented to users. Most major search engine providers also leverage machine learning to automatically improve their users’ search experience, essentially by identifying patterns in previous decisions to make future ones.
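A minimal sketch of "identifying patterns in previous decisions": one of the simplest signals an engine can learn from is which result users clicked for a query in the past. The log entries and URLs below are hypothetical, and real systems use far more sophisticated machine-learned models than a click counter.

```python
from collections import Counter

# Hypothetical click log: (query, url_clicked) pairs from past sessions
click_log = [
    ("pizza", "recipes.example"),
    ("pizza", "recipes.example"),
    ("pizza", "history.example"),
]

# Count past clicks for this query, then boost the more-clicked result
clicks = Counter(url for q, url in click_log if q == "pizza")
results = ["history.example", "recipes.example"]
reranked = sorted(results, key=lambda u: clicks[u], reverse=True)
print(reranked)  # the result users clicked more often moves to the top
```

The same feedback loop is what privacy critics point to later in this piece: the behavioural data that improves rankings is also the data that profiles the user.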
Over the years, Google has developed search algorithms and constantly updated them, with major updates such as Panda, Penguin, Hummingbird, RankBrain, Medic, Pigeon and Payday intended to improve certain functions or solve some problems. In March, it introduced another update to improve the search engine’s ability to identify high-quality product reviews.
Search engines exert enormous control over the sites that consumers can find. Any changes or updates to their algorithms could also mean that traffic is diverted from certain sites and businesses, which could negatively affect their revenue.
What are the worries?
The search giant’s trackers have reportedly been found on a majority of the top million websites, according to a DuckDuckGo blog post. “That means they don’t just track what you’re looking for, [but] they also track the websites you visit and use all of your data for advertisements that follow you across the internet,” it added.
According to a study by the Council of Europe, the use of profiling data, including data collected by search algorithms and search engines, directly affects an individual’s right to informational self-determination. Most of Google’s revenue comes from advertisements, such as those it shows consumers in response to a search query.
DuckDuckGo, in addition to providing an alternative to Google’s search engine, offers mobile apps and desktop browser extensions to protect user privacy while browsing the web. The privacy-focused firm, in a blog post, said editorialised results, informed by the personal information Google has about people (like their search, browsing and purchase history), put them in a “filter bubble” based on what Google’s algorithms think they are most likely to click on.
What is the current state of these algorithms?
These search algorithms can be used to personalize services in ways that are hard to detect, leading to search results that can be manipulated to reduce choice or artificially alter consumer perception.
Additionally, companies can also use these algorithms to change the way they rank products on websites, prioritizing their own products and excluding competitors. Some of these concerns have caught the attention of regulators, and as a result, these search algorithms have come under intense scrutiny.
The European Commission has fined Google €2.42 billion for abusing its dominant market position as a search engine by giving an illegal advantage to another Google product, its comparison shopping service.
Furthermore, as part of the Commission’s proposal for the Digital Services Act, transparency measures for online platforms on a range of issues, including the algorithms used to recommend content or products to users, are set to come into force.
“The majority of algorithms used by private online businesses are currently subject to little or no regulatory oversight,” the UK Competition and Markets Authority said in a statement, adding that “further oversight and action are needed” from regulators.