Google saves user data to better determine search context, and factors including recent search history can tweak results.
What’s the difference between President Donald Trump and a takeout sandwich?
For one, how Google treats a search for each. Type “best sandwich shops” into the search engine, and chances are you’ll be presented with top-rated spots in your current zip code. Search “Donald Trump,” and more generally applicable content — recent news hits, the president’s Twitter account and the White House website — receives top billing.
Though we have an idea of how Google prioritizes different sets of search results, the algorithms themselves are a mystery. When it comes to personalized search, though, we do know one thing: The company saves user data, including recent search history, to potentially tweak results based on each individual. For your lunchtime query, if you type only “sub” into the search bar, Google may use your search history to determine you’re talking about a sandwich and not a submarine.
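The kind of history-based disambiguation described above can be illustrated with a toy sketch. To be clear, Google’s actual ranking signals and algorithms are not public; the function, sense lists and scoring below are all hypothetical, invented purely to show the idea of matching an ambiguous query against recent searches.

```python
# Toy sketch of history-based query disambiguation -- purely illustrative.
# Google's real signals and weighting are proprietary and far more complex.

def disambiguate(query, history, senses):
    """Pick the sense of an ambiguous query that best matches recent history."""
    scores = {}
    for sense, related_terms in senses.items():
        # Count how many recent searches mention a term tied to this sense.
        scores[sense] = sum(
            any(term in past for term in related_terms) for past in history
        )
    # With no history signal, max() falls back to the first listed sense.
    return max(scores, key=scores.get)

senses = {
    "sandwich": ["deli", "lunch", "panini", "sandwich"],
    "submarine": ["navy", "sonar", "submarine"],
}
history = ["best deli near me", "panini press reviews", "lunch spots downtown"]

print(disambiguate("sub", history, senses))  # -> sandwich
```

Here the three lunch-related past searches outvote the (empty) submarine signal, so “sub” resolves to the sandwich sense.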
No matter whether you’re seeking out the president or a panini, there’s a chance your past search history could inform the order of results. If you’re, say, a CNN loyalist, Google may use your search history to bump up that publication’s Trump content a couple of spots, whereas staunch readers of Fox News may see that publication featured slightly more prominently on the results page. By and large, however, searches should yield similar results for any user — and Google will never personalize results to be more liberal- or conservative-oriented, said Danny Sullivan, Google’s public liaison for Search. The company does not even have a way of sorting results into such categories, he said.
It’s important to note that personalization is often conflated with localization, but the search engine giant maintains that they’re two separate entities. For your lunchtime query, spots within walking distance of your office may show up first, but those results should be the same for everyone in your area.
“Personalization is when search results change on something that is related to only you and no one else,” Sullivan told Entrepreneur. “Localization is when something changes based on the location where a search is happening, which isn’t unique to an individual because it’s shared by a group.”
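Sullivan’s distinction can be sketched in code: a location boost applies identically to everyone searching from the same place, while a history boost depends on one user alone. The ranker, field names and weights below are assumptions made up for illustration, not a description of how Google actually scores results.

```python
# Illustrative sketch of localization vs. personalization -- not Google's code.

def rank(results, user_city=None, user_history=()):
    """Order results by base relevance plus location and history boosts."""
    def score(r):
        s = r["base_score"]
        # Localization: everyone searching from the same city gets this boost.
        if user_city and r.get("city") == user_city:
            s += 10
        # Personalization: this boost depends on one user's own history.
        if any(r["source"] in past for past in user_history):
            s += 1
        return s
    return sorted(results, key=score, reverse=True)

results = [
    {"source": "fox", "base_score": 5, "city": None},
    {"source": "cnn", "base_score": 5, "city": None},
    {"source": "local-news", "base_score": 3, "city": "Austin"},
]

# Every user in Austin sees the local result lifted to the top...
anyone_in_austin = rank(results, user_city="Austin")
# ...but only the CNN reader's own history reorders the national outlets.
cnn_reader = rank(results, user_city="Austin",
                  user_history=["cnn election coverage"])
```

The shared city boost is large and identical for the whole group; the per-user history boost is a small nudge, mirroring the article’s point that personalization tweaks ordering “a couple of spots” rather than rewriting the page.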
According to Google, around 2 percent of search queries are treated with some level of personalization. And though the company supports personalization, it condemns bias, as CEO Sundar Pichai wrote in an email to employees on Friday.
“We do not bias our products to favor any political agenda,” he wrote. “The trust our users place in us is our greatest asset, and we must always protect it. If any Googler ever undermines that trust, we will hold them accountable.”
Pichai’s comments follow a recent report in The Wall Street Journal that brought to light an internal Google discussion after President Trump’s 2017 travel ban. The ban restricted travel to the U.S. from Iran, Iraq, Libya, Somalia, Sudan, Syria and Yemen for 90 days, and in the internal email thread, employees discussed an idea to surface pro-immigration results in search. According to Google, the discussion was simply a brainstorming session and none of the ideas were ever implemented. But the disclosure will likely fuel critics’ complaints, especially as the 2020 election approaches, that large tech companies such as Google suppress conservative perspectives online.
“We build products for people of every background and belief, and we have strong policies to ensure that our products remain free of bias,” Pichai wrote.
It’s a vital conversation to have, as prominent tech companies hold the power to influence public opinion on a mammoth scale. According to peer-reviewed research published in 2015, a search engine’s algorithm may have the ability to shift undecided voters’ voting preferences by 20 percent or more — and up to 80 percent in some demographic groups — without the voters’ knowledge.
But in Google’s case — as far as personalized search results go — that doesn’t seem to be a pressing issue.
Although as much as 11.7 percent of search engine results may show differences due to personalization, according to a 2013 paper by Northeastern University’s Algorithm Auditing Research Group, researchers were surprised to find that past searches and browsing history did not seem to inform results in a significant way. They found that a user’s location — as well as the status of being signed into a Google account — had the most impact on results. (Note that in this paper, researchers included localization in their definition of personalization.)
Though Google works to avoid top-level bias, there’s a more overarching idea to consider here: Personalization in any capacity can contribute to ground-level bias — or a “filter bubble” on the consumer level. The term was coined by internet activist Eli Pariser around 2010, and countering that very idea — an “echo chamber” of sorts that reinforces an individual’s own way of thinking — is something many tech companies continue to struggle with.
“While personalization provides obvious benefits for users, it also opens up the possibility that certain information may be unintentionally hidden from users,” the Northeastern University researchers wrote.
Whether you’re concerned with data security or want to avoid potential tweaks to results, here’s your starter guide for turning off Google’s personalization features on your computer.
To prevent saved searches and browsing history:
Go to Activity Controls, then pause each feature, such as Web & App Activity, Location History, Device Information, YouTube Search History and YouTube Watch History.
To delete account activity:
Go to Delete Activity, then select the date ranges for which you’d like to delete your account activity.
To control ad personalization:
Go to Ad Settings, then turn off the Ad Personalization toggle.
If you’re aiming to turn off personalization on your Android, iPhone or iPad:
Go here. Then, in each of the four sections on the right, select your device and follow the instructions.