google-search-results-have-more-human-help-than-you-think,-report-finds

google it —

Google is sometimes hands-on under the hood, and investigators want to know more.


Mountain View, Calif.—May 21, 2018: Exterior view of a Googleplex building, the corporate headquarters of Google and parent company Alphabet.

Google, and its parent company Alphabet, has its metaphorical fingers in a hundred different lucrative pies. To untold millions of users, though, “to Google” something has become a synonym for “search,” the company’s original business—a business that is now under investigation as more details about its inner workings come to light.

A coalition of attorneys general investigating Google’s practices is expanding its probe to include the company’s search business, CNBC reports, citing people familiar with the matter.

Attorneys general for almost every state teamed up in September to launch a joint antitrust probe into Google. The investigation is being led by Texas Attorney General Ken Paxton, who said last month that the probe would first focus on the company’s advertising business, which continues to dominate the online advertising sector.

Paxton said at the time, however, that he’d willingly take the investigation in new directions if circumstances called for it, telling the Washington Post, “If we end up learning things that lead us in other directions, we’ll certainly bring those back to the states and talk about whether we expand into other areas.”

Why search?

Google’s decades-long dominance in the search market may not be quite as organic as the company has implied, according to The Wall Street Journal, which published a lengthy report today delving into the way Google’s black-box search process actually works.

Google’s increasingly hands-on approach to search results, which has taken a sharp upturn since 2016, “marks a shift from its founding philosophy of ‘organizing the world’s information’ to one that is far more active in deciding how that information should appear,” the WSJ writes.

Some of that manipulation comes from very human hands, sources told the paper in more than 100 interviews. Employees and contractors have “evaluated” search results for effectiveness and quality, among other factors, and promoted certain results to the top of the virtual heap as a result.

One former contractor the WSJ spoke with described down-voting any search results that read like a “how-to manual” for queries relating to suicide until the National Suicide Prevention Lifeline came up as the top result. According to the contractor, Google soon after put out a message to the contracting firm that the Lifeline should be marked as the top result for all searches relating to suicide so that the company algorithms would adjust to consider it the top result.
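The workflow the contractor describes amounts to a post-ranking override: pin a designated result to the top regardless of its organic position. A minimal sketch, assuming a simple list of result URLs (the function and the data are illustrative, not Google's implementation):

```python
def pin_result(results, pinned_url):
    """Move pinned_url to the front of a ranked result list if present;
    otherwise insert it at the top. Purely illustrative."""
    rest = [r for r in results if r != pinned_url]
    return [pinned_url] + rest

# Hypothetical organic ranking for a suicide-related query
organic = ["forum.example.com/thread",
           "wikipedia.org/entry",
           "suicidepreventionlifeline.org"]
ranked = pin_result(organic, "suicidepreventionlifeline.org")
```

The point of the sketch is that once raters agree on the "right" top result, enforcing it is a trivial post-processing step layered over whatever the algorithm produced.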

Or in another instance, sources told the WSJ, employees made a conscious choice for how to handle anti-vax messaging:

One of the first hot-button issues surfaced in 2015, according to people familiar with the matter, when some employees complained that a search for “how do vaccines cause autism” delivered misinformation through sites that oppose vaccinations.

At least one employee defended the result, writing that Google should “let the algorithms decide” what shows up, according to one person familiar with the matter. Instead, the people said, Google made a change so that the first result is a site called howdovaccinescauseautism.com—which states on its home page in large black letters, “They f—ing don’t.” (The phrase has become a meme within Google.)

The algorithms governing Google’s auto-complete and suggestion functions are also heavily subject to review, the sources said. Google says publicly it doesn’t allow for predictions related to “harassment, bullying, threats, inappropriate sexualization, or predictions that expose private or sensitive information,” and that policy’s not new. The engineer who created the auto-complete function in 2004 gave an example using Britney Spears, who at the time was making more headlines for her marriages than for her music.

The engineer “didn’t want a piece of human anatomy or the description of a sex act to appear when someone started typing the singer’s name,” as the paper describes it. The unfiltered search results were “kind of horrible,” he added.

The company has since maintained an internal blacklist of terms that are not allowed to appear in autocomplete, organic search, or Google News, the sources told the WSJ, even though company leadership has said publicly, including to Congress, that the company does not use blacklists or whitelists to influence its results.
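A denylist of this kind is straightforward to model: candidate predictions are filtered out before they reach the user if they contain a blocked term. A toy sketch (the terms and function names here are hypothetical):

```python
# Hypothetical blocked terms; a real policy list would be far larger
DENYLIST = {"badterm", "blocked-phrase"}

def filter_predictions(predictions):
    """Drop any autocomplete prediction containing a denylisted term.
    A toy sketch of policy-based filtering, not Google's system."""
    return [p for p in predictions
            if not any(term in p.lower() for term in DENYLIST)]

suggestions = ["singer tour dates", "singer badterm rumor", "singer new album"]
clean = filter_predictions(suggestions)
```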

The modern blacklist reportedly includes not only spam sites, which get de-indexed from search, but also the type of misinformation sites that are endemic to Facebook (or, for that matter, Google’s own YouTube).

Why antitrust?

Google relying on human intervention and endless algorithm tweaks, as the WSJ describes, isn’t an antitrust violation in itself. When the company uses its trove of data from one operation to make choices that may harm competitors to its other operations, though, that can draw attention.

All that human intervention and algorithmic tweaking also affects advertising and business results, according to the WSJ. Those tweaks “favor big businesses over smaller ones,” the paper writes, “contrary to [Google’s] public position that it never takes that type of action.”

The largest advertisers, including eBay, have received “direct advice” on how to improve their search results after seeing traffic from organic search drop, sources told the paper. Smaller businesses, however, have not been so lucky, being left instead to try to figure out the systems either bringing them traffic or denying them traffic on their own.

Links to Google’s own features and properties also take up an increasingly large percentage of the search results page, the WSJ notes. For example, if you search for one of today’s chart-toppers, such as Beyoncé, you’re greeted with three large Google modules that take up more than half the screen real estate:

Most of the results on the page are Google modules (highlighted in red).

More than half of Google searches are now reportedly “no-click” searches, where individuals look only at the page of results and use the snippets on it rather than clicking through to any of the sources from which Google is drawing that information. That kind of use of data, among others, could be considered harmful to competition, since the company is using data collected from competitors to keep users from going to those competitors.

Google, for its part, disputed the WSJ’s findings throughout, telling the paper, “We do today what we have done all along, provide relevant results from the most reliable sources available.”

google-chrome-experiment-crashes-browser-tabs,-impacts-companies-worldwide

A Google Chrome experiment has gone horribly wrong this week and ended up crashing browsers on thousands, if not more, enterprise networks for nearly two days.

The issue first appeared on Wednesday, November 13. It didn’t impact all Chrome users, but only Chrome browsers running on Windows Server “terminal server” setups — a very common setup in enterprise networks.

Complaints flooded Google

According to hundreds of reports, users said that Chrome tabs were going blank, all of a sudden, in what’s called a “White Screen of Death” (WSOD) error.

The issue was no joke. System administrators at many companies reported that hundreds, even thousands, of employees couldn’t use Chrome to access the internet, as the active browser tab kept going blank while they worked.

In tightly controlled enterprise environments, many employees didn’t have the option to change browsers and were left unable to do their jobs. Similarly, system administrators couldn’t just replace Chrome with another browser right away.

“This has had a huge impact for all our Call Center agents and not being able to chat with our members,” someone with a Costco email address said in a bug report. “We spent the last day and a half trying to figure this out.”

“Our organization with multiple large retail brands had 1000 call center agents and many IT people affected for 2 days. This had a very large financial impact,” said another user.

“Like many others, this has had significant impact on our organization with our entire Operations (over 500 employees) working in a RDS environment with Google Chrome as the primary browser,” said another system administrator.

“4000 impacted in my environment. Working on trying to fix it for 12 hours,” said another.

“Medium sized call center for a local medical office lost a day and a half of work for 40-60 employees,” added another.

“Same issue experienced, hundreds of users impacted – hours spent attempting to isolate the cause,” said another user.

Hundreds of complaints poured in via Google’s support forum, Chrome bug tracker, and Reddit [1, 2]. One impacted sysadmin told ZDNet that they initially mistook the Chrome blank tabs as a sign of malware and reacted accordingly, starting network-wide security audits.

Google ships a fix

In time, the root cause of the bug was found and traced back to a feature called “WebContents Occlusion.”

According to a Google Chrome design document, this is an experimental feature that suspends Chrome tabs when users move other app windows on top of Chrome, treating the active Chrome tab as a background tab.


The feature, meant to improve Chrome’s resource usage when not in active use, had been under testing in Chrome Canary and Chrome Beta releases all year.
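Conceptually, the feature treats a tab as a background tab when its window is completely covered by another window. A deliberately simplified model of that occlusion check, with rectangles as (left, top, right, bottom) tuples (this is an illustration, not Chrome's actual logic):

```python
def covers(a, b):
    """True if rectangle a fully contains rectangle b."""
    return a[0] <= b[0] and a[1] <= b[1] and a[2] >= b[2] and a[3] >= b[3]

def is_occluded(chrome_rect, other_window_rects):
    """Treat the tab as occluded (eligible for suspension) when any
    other window completely covers the browser window. Simplified:
    real occlusion tracking also handles partial coverage by
    multiple windows, transparency, and minimized state."""
    return any(covers(w, chrome_rect) for w in other_window_rects)

chrome = (100, 100, 800, 600)
occluded = is_occluded(chrome, [(0, 0, 1920, 1080)])    # maximized window on top
visible = is_occluded(chrome, [(500, 500, 900, 700)])   # only partial overlap
```

On terminal-server setups, where many user sessions share one machine and window state is virtualized, it is easy to see how a heuristic like this could misfire.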

However, this week, Google decided to test it in the main Stable release, so it could get more feedback on how it behaved.

To say it behaved badly would be an understatement.

“The experiment/flag has been on in beta for ~5 months,” said David Bienvenu, a Google Chrome engineer. “It was turned on for stable (e.g., M77, M78) via an experiment that was pushed to released Chrome Tuesday morning.”

“Prior to that, it had been on for about 1% of M77 and M78 users for a month with no reports of issues, unfortunately,” he added.

However, when the experiment rolled out to a broader audience that included Windows users on terminal server setups, an unexpected bug surfaced: instead of suspending Chrome tabs when users switched to another app, it unloaded them entirely, leaving a blank page behind.

Users could refresh the Chrome tab to access their sites again, but in some cases, this also meant they lost previous work.

The Chrome team said they pushed a new Chrome configuration file to all Chrome users and disabled the experiment.

Chrome engineers operate a system called Finch that lets them push updated Chrome settings to active installs, such as enabling or disabling experimental flags.
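A server-driven kill switch of this sort can be sketched as a remote-config merge: the client applies whatever experiment states the server last pushed, which is what let Google disable the misbehaving experiment without shipping a new binary. The config format and flag names below are assumptions for illustration, not Finch's actual protocol:

```python
import json

def apply_remote_config(config_json, local_flags):
    """Merge server-pushed experiment states into local flag state,
    letting the server remotely disable a misbehaving experiment.
    Illustrative only; Finch's real format is not public."""
    server = json.loads(config_json)
    merged = dict(local_flags)
    merged.update(server.get("experiments", {}))
    return merged

local = {"web-contents-occlusion": True}          # experiment enabled locally
pushed = '{"experiments": {"web-contents-occlusion": false}}'  # server kill switch
flags = apply_remote_config(pushed, local)
```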

If the fix has not reached all impacted users, and they still have problems, they can disable the following two experimental flags by hand:

chrome://flags/#web-contents-occlusion

chrome://flags/#calculate-native-win-occlusion


An alternative fix is to start Google Chrome with the following command-line argument: --disable-backgrounding-occluded-windows

Fix prompts more criticism

However, fixing the problem actually made system administrators even angrier. Many didn’t know that Chrome engineers could run experiments on their tightly-controlled Chrome installations, let alone that Google engineers could just ship changes to everyone’s browsers without any prior approval.

“Do you see the impact you created for thousands of us without any warning or explanation? We are not your test subjects,” said an angry sysadmin. “We are running professional services for multi million dollar programs. Do you understand how many hours of resources were wasted by your ‘experiment’?”

“How many tens of thousands of dollars has this oops cost everyone? This is starting to look like a pretty massive mistake on Googles part,” added another disgruntled sysadmin.

“We take great care in rolling our changes out in a very controlled manner to avoid this type of scenario and we spent the better part of yesterday trying to determine if an internal change had occurred in our environment without our knowledge. We did not realize this type of event could occur on Chrome unbeknownst to us. We are already discussing alternative options, none of them are great, but this is untenable,” said another, hinting at a browser change across their organization.

Although it lasted just two days, this incident is shaping up to be one of the Chrome team’s biggest bungles. Many impacted users demanded an official apology from Google, and judging by the financial impact it may have had on some companies, they are entitled to one.

amazon-tops-google-in-q3-smart-speaker-market-report

Market research firm Canalys reported that Amazon shipped three times the number of smart speaker/display units as Google in the third quarter of 2019. According to the company, Google was responsible for 3.5 million units compared with Amazon’s 10.4 million; Alibaba was second with 3.9 million.

Nearly 30 million units shipped in Q3. The third quarter saw shipments of 28.6 million smart speakers and displays overall, compared with 26.1 million in Q2 2019 and 19.7 million in Q3 2018, according to Canalys. Privacy concerns don’t seem to have weakened consumer demand.
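Those figures work out to roughly 10% growth quarter over quarter and 45% year over year:

```python
# Shipment figures in million units, per Canalys
q3_2019, q2_2019, q3_2018 = 28.6, 26.1, 19.7

qoq = (q3_2019 / q2_2019 - 1) * 100  # quarter-over-quarter growth, percent
yoy = (q3_2019 / q3_2018 - 1) * 100  # year-over-year growth, percent
```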

Canalys attributes Amazon’s success to the strength of the e-commerce giant’s direct channel, Prime Day sales and other promotions, as well as the company’s Echo trade-in program. Google sells directly and through traditional retailers and other channel partners, but its direct sales have proven no match for Amazon’s.

Display category grew 500%. Canalys pointed out that the “smart display category grew 500% globally to reach 6.3 million units in Q3 2019.” The firm said the Echo Show 5 (a smart display) “contributed significantly” to Amazon’s Q3 success. Smart displays’ share of overall shipments reached 20% for the first time in Q3, indicating increasing traction for the devices.

Google has been seeking to use smart displays, especially the Nest Hub Max, as a competitive advantage vs. Amazon. So far it doesn’t seem to be working.

There are numerous estimates circulating in the market about the total number of smart speakers. Research firms put the number of devices in U.S. homes above 100 million. Canalys projected that there would be 225 million smart speaker/display devices globally by 2020.

Why we should care. Given the Q3 numbers, we can expect a pretty robust holiday quarter for these smart devices. Though smart speakers and displays have yet to yield many benefits for marketers, they likely will over time, becoming an important channel. That’s especially true of smart displays, whose touch screens give brands and retailers more marketing and advertising options than audio-only speakers.



About The Author

Greg Sterling is a Contributing Editor at Search Engine Land. He writes about the connections between digital and offline commerce. He previously held leadership roles at LSA, The Kelsey Group and TechTV. Follow him on Twitter or find him on LinkedIn.



google-open-sources-cardboard-sdk-to-keep-it-alive

Long before Google introduced Daydream and subsequently left it dead in the water, the company created the Cardboard platform. You can use the carton headsets as an ultra-low-budget entry to VR to this day, and they’re compatible with almost any regularly shaped phone on the market. Google has now open-sourced the underlying VR SDK which will allow interested developers to create their own VR experiences on Cardboard viewers and improve and enhance the project as they see fit.

Google says that it still wants to contribute to the project and plans to release a Unity SDK package, but it hasn’t actively developed the Google VR SDK for some time already. Still, it sees “consistent usage around entertainment and education experiences,” so it didn’t want to shut down the platform altogether. Google states that “an open source model will enable the community to continue to improve Cardboard support and expand its capabilities, for example adding support for new smartphone display configurations and Cardboard viewers as they become available.”

Open-sourcing the project to keep it alive is a better move than just shuttering it altogether, and it’s in line with Google’s vision for the platform, which has always been open and accessible. The VR headset’s hardware specifications were open-sourced long ago, allowing third-party manufacturers to create their own Cardboards. Similarly, the open-source VR view platform that enabled VR and Cardboard experiences on the web has been around since 2016.

google-doesn’t-have-an-‘ideal’-page-speed

“[Optimizing for site speed] will never go to a point where you just have a score that you optimize for and be done with it,” said Google Webmaster Trends Analyst Martin Splitt on the October 30 edition of #AskGoogleWebmasters. Splitt joined fellow webmaster trends analyst John Mueller to field four questions on the topic of site speed, tools and metrics.

Ideal page speed. “What is the ideal page speed of any content for better ranking on SERP?” asked Twitter user @rskthakur1988.

“Basically, we are categorizing pages more or less as ‘really good’ and ‘pretty bad,’ so there’s not really a threshold in between,” said Splitt, advising that site owners should just focus on making their sites fast for users instead of fixating on an ideal page speed.

In terms of actual speed metrics, Google tries to calculate the theoretical speed of a page using lab data as well as real field data from users (similar to Chrome User Experience Report data), Mueller explained.

The best speed tool. “I wonder, if a website’s mobile speed using the Test My Site tool is good and GTmetrix report scores are high, how important are high Google PageSpeed Insights scores for SEO?” asked Twitter user @olgatsimaraki.

“In general, these tools measure things in slightly different ways,” said Mueller. “So, what I usually recommend is taking these different tools, getting the data that you get back from that and using them to discover low-hanging fruit on your web pages — so, things you can easily improve to really give your page a speed bump.”

The aforementioned tools are also meant for different audiences. “Test My Site is pretty high-level, so everyone understands roughly what’s going on there, whereas GTmetrix is a lot more technical and PageSpeed Insights is kind of in the middle of that, so depending on who you are catering to — who you are trying to give this report to, to get things fixed — you might use one or the other,” said Splitt.

The best page speed metric. “What is the best metric(s) to look at when deciding if page speed is ‘good’ or not? Why/why not should we focus on metrics like FCP/FMP instead of scores given by tools like PageSpeed Insights?” asked Twitter user @drewmarlier.

FCP, which stands for first contentful paint, measures the time from navigation to when the first text or image is painted. FMP, or first meaningful paint, measures the time it takes for the main content of a page to become visible.

“It’s the typical ‘it depends’ answer,” said Splitt. “If you have just a website where people are reading your content and not interacting as much, then I think first meaningful paint or first contentful paint is probably more important than first input delay or time to interactive. But if it’s a really interactive web application, where you really want people to immediately jump in and do something, then probably that metric is more important.”

“The problem with the scores is they are oversimplifying things,” said Splitt, advising that instead of focusing on a score, “use the specific insights that different tools give you to figure out where you have to improve or what isn’t going so well.”

Imperfect speed metrics. “I am testing an almost empty page on #devtools Audits (v5.1.0) it usually gives minimum results which 0.8ms for everything and 20ms for FID but sometimes it gives worse results in TTI, FCI and FID. Same page, same code. Why?” asked Twitter user @ocurcelik66.

The acronyms above refer to the following:

  • FID – First input delay: the time between when a user first interacts with your site (i.e., when they click on something) and when the browser is able to respond to that interaction.
  • TTI – Time to interactive: the amount of time it takes a page to become fully interactive.
  • FCI – First CPU idle: the amount of time before there’s no longer any JavaScript or other work that needs to be done by the CPU.
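Under simplified definitions, these metrics fall out of a page's event timeline. A toy sketch with made-up timing data (real tools such as Lighthouse use far more elaborate heuristics, which is partly why their scores disagree):

```python
def first_contentful_paint(events):
    """Timestamp (ms) of the first text/image paint event in a
    (timestamp, kind) event list. Simplified definition."""
    return min(t for t, kind in events if kind == "paint")

def first_input_delay(input_time, busy_periods):
    """Delay (ms) before the main thread can respond to the first
    input: if the input lands inside a long task, the user waits
    until that task ends. Simplified definition."""
    for start, end in busy_periods:
        if start <= input_time < end:
            return end - input_time
    return 0

# Made-up timeline: (timestamp ms, event kind) and main-thread busy windows
events = [(120, "script"), (340, "paint"), (900, "paint")]
busy = [(0, 300), (1000, 1250)]

fcp = first_contentful_paint(events)   # first paint at 340 ms
fid = first_input_delay(1100, busy)    # input at 1100 ms hits a long task
```

Even in this toy, the "right" metric depends on the page: a content site cares about the paint time, a messenger-style app about the input delay, which is exactly Splitt's point below.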

“First things first, these measurements aren’t perfect,” Splitt prefaced, adding that there will always be some noise in the measurements.

“Don’t get too hung up on these metrics specifically. If you see that there’s a perceptible problem and there’s actually an issue that your site stays working on the main thread and doing CPU work for a minute or 20 seconds, that’s what you want to investigate. If it’s 20 milliseconds, it’s probably fine,” said Splitt.

There’s no simple answer. “You can’t break down speed into one simple number — it is a bunch of factors,” said Splitt.

“If I’m painting really quickly, but then my app is all about interaction — it’s a messenger — so I show everything, I show the message history, but if I try to answer the message I just got, and it takes me 20 seconds until I actually can tap on the input field and start typing, is that fast? Not really. But, is it so important that I can use the contact form on the bottom of a blog post within the first 10 seconds? Not necessarily, is it? So, how would you put that into a number? You don’t.”

In the example above, Splitt highlighted the importance of selecting the speed metric that most accurately reflects how speed influences your user experience. Naturally, different types of content will require varying levels of interaction by the user, which is why certain metrics are more relevant than others.

Why we should care. Overemphasizing a particular metric, or even a specific speed score, may not be the best use of your resources as Google itself does not categorize speed in such a specific manner.

Knowing what you’re measuring will allow you to select an appropriate metric to reference and tool to use so that you can improve your site’s speed in ways that will improve user experience, as opposed to pumping up a metric that doesn’t have meaningful implications for the way users interact with your pages. As with all metrics, context matters.

For the latest coverage on site speed, bookmark our SEO: Site Speed section.



About The Author

George Nguyen is an Associate Editor at Third Door Media. His background is in content marketing, journalism, and storytelling.

google-to-stop-indexing-flash-content

Google announced it will stop indexing and ranking Flash content in its search engine. This means Google will no longer process content within Flash SWF files, either on websites designed fully in Flash or web pages that have portions of the page in Flash.

What is Flash. Flash was introduced in 1996 (by Macromedia, which Adobe later acquired) as a way of producing richer content on the web and on computers. It was a very popular web publishing platform in the late ’90s, but as time went on, fewer and fewer browsers continued to support it.

What is changing. Back in 2008, Google first began crawling Flash files, and a year later it got more sophisticated in how it indexed those SWF files. But it never really ranked content within Flash files all that well.

Google’s announcement. Google said, “Google Search will stop supporting Flash later this year.” Specifically, “in Web pages that contain Flash content, Google Search will ignore the Flash content,” and “Google Search will stop indexing standalone SWF files.” That means Google won’t index or rank content within Flash websites or Flash elements on a web page.

The impact. Google said, “most users and websites won’t see any impact from this change.” Apple never supported Flash on the iPhone, and the company is often credited with killing the technology. As we said above, fewer and fewer browsers support Flash. Google said “Flash is disabled by default in Chrome (starting in version 76), Microsoft Edge, and FireFox 69.”

Alternatives. Google said you should look toward HTML5 and other modern web technologies, because Flash is something it will no longer index.

Why we care. If your website is fully designed in Flash, or parts of its content are in Flash, and you depend on Google search traffic, you should seriously consider updating your website and dropping Flash going forward.



About The Author

Barry Schwartz is Search Engine Land’s News Editor and owns RustyBrick, a NY based web consulting firm. He also runs Search Engine Roundtable, a popular search blog on SEM topics.

welcome-bert:-google’s-latest-search-algorithm-to-better-understand-natural-language

Google is making the largest change to its search system since the company introduced RankBrain almost five years ago. The company said the update will change the results that rank for 1 in 10 queries.

Rolling out. BERT started rolling out this week and will be fully live shortly. It is rolling out for English language queries now and will expand to other languages in the future.

Featured Snippets. This will also impact featured snippets. Google said BERT is being used globally, in all languages, on featured snippets.

What is BERT? It is Google’s neural network-based technique for natural language processing (NLP) pre-training. BERT stands for Bidirectional Encoder Representations from Transformers.

It was open-sourced last year and written about in more detail on the Google AI blog. In short, BERT can help computers understand language a bit more like humans do.

When is BERT used? Google said BERT helps better understand the nuances and context of words in searches and better match those queries with more relevant results. It is also used for featured snippets, as described above.

In one example, Google said, with a search for “2019 brazil traveler to usa need a visa,” the word “to” and its relationship to the other words in the query are important for understanding the meaning. Previously, Google wouldn’t understand the importance of this connection and would return results about U.S. citizens traveling to Brazil. “With BERT, Search is able to grasp this nuance and know that the very common word “to” actually matters a lot here, and we can provide a much more relevant result for this query,” Google explained.
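The nuance is easy to demonstrate: an order-insensitive (bag-of-words) model literally cannot tell the two directions of travel apart, while a sequence-aware model like BERT at least sees different token orders. A toy comparison:

```python
from collections import Counter

q1 = "2019 brazil traveler to usa need a visa"
q2 = "2019 usa traveler to brazil need a visa"  # opposite direction of travel

# An order-insensitive model sees two identical queries:
bags_equal = Counter(q1.split()) == Counter(q2.split())

# A sequence-aware model sees different token orders:
orders_equal = q1.split() == q2.split()
```

Here `bags_equal` is true and `orders_equal` is false: the information distinguishing the two queries lives entirely in word order and in how "to" relates its neighbors, which is the kind of context BERT's bidirectional encoding captures.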

Note: The examples below are for illustrative purposes and may not work in the live search results.

In another example, for a search for “do estheticians stand a lot at work,” Google said it previously would have matched the term “stand-alone” with the word “stand” used in the query. Google’s BERT models can “understand that ‘stand’ is related to the concept of the physical demands of a job, and displays a more useful response,” Google said.

In another example, Google can understand a query more like a human would, showing a more relevant result for the search “Can you get medicine for someone pharmacy.”

Featured snippet example. Here is an example of Google showing a more relevant featured snippet for the query “Parking on a hill with no curb”. In the past, a query like this would confuse Google’s systems. Google said, “We placed too much importance on the word “curb” and ignored the word “no”, not understanding how critical that word was to appropriately responding to this query. So we’d return results for parking on a hill with a curb.”

RankBrain is not dead. RankBrain, launched in 2015, was Google’s first artificial intelligence method for understanding queries. It looks at both queries and the content of web pages in Google’s index to better understand the meanings of words. BERT does not replace RankBrain; it is an additional method for understanding content and queries, additive to Google’s ranking system. RankBrain can and will still be used for some queries, but when Google thinks a query can be better understood with the help of BERT, Google will use that. In fact, a single query can use multiple methods, including BERT, to be understood.

How so? Google explained that there are a lot of ways it can understand what the language in your query means and how it relates to content on the web. For example, if you misspell something, Google’s spelling systems can help find the right word to get you what you need. And if you use a word that’s a synonym for the actual word in relevant documents, Google can match those. BERT is another signal Google uses to understand language. Depending on what you search for, any one or a combination of these signals could be used to understand your query and provide a relevant result.

Can you optimize for BERT? It is unlikely. Google has told us SEOs can’t really optimize for RankBrain, and the same holds here. BERT does mean Google is getting better at understanding natural language, though, so just write content for users, like you always do. This is Google’s effort to better understand searchers’ queries and match them to more relevant results.

Why we care. We care not only because Google called this change “the biggest leap forward in the past five years, and one of the biggest leaps forward in the history of Search,” but also because 10% of all queries have been impacted by this update. That is a big change. We did see unconfirmed reports of algorithm updates mid-week and earlier this week, which may be related to this change.

We’d recommend checking your search traffic sometime next week to see how much your site was impacted by this change. If it was, drill deeper into which landing pages were impacted, and for which queries. You may find that those pages weren’t converting, and that the search traffic Google sent them didn’t end up actually being useful.

We will be watching this closely and you can expect more content from us on BERT in the future.




google-chrome-will-block-mixed-content-—-are-you-ready-for-it?

Recently, Google Chrome announced that it will soon start blocking mixed content, also known as insecure content, on web pages.

The feature will be rolled out gradually, starting in December 2019. This should give website owners enough time to check for mixed content errors and fix them before the block goes live.

Failing to do so will cause a poor user experience, loss of traffic, and loss of sales.

In this guide, we will explain Google Chrome’s mixed content blocking and how you can be well prepared for it.

Getting ready for Google Chrome’s mixed content block


What is Mixed Content?

Mixed content is a term used to describe non-HTTPS content loading on an HTTPS website.

HTTPS indicates that a website is using an SSL certificate to deliver content. This technology secures websites by encrypting the data transferred between the website and a user’s browser.

Google, Microsoft, WordPress.org, WPBeginner, and many other organizations are pushing HTTPS as the standard protocol for websites.

They have been very successful in their efforts. According to Google, “Chrome users now spend over 90% of their browsing time on HTTPS on all major platforms.”

However, many websites still serve some insecure content (mixed content) on HTTPS pages. Google aims to improve this situation by giving website owners a nudge in the right direction.
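A first pass at checking your own pages for mixed content can be as simple as scanning the HTML for http:// subresource URLs. The regex-based sketch below is deliberately rough (browser devtools and dedicated crawlers do this far more reliably, and will also catch URLs built by scripts):

```python
import re

def find_mixed_content(html):
    """Return http:// URLs referenced in src/href attributes of an
    HTTPS page: candidates for mixed-content warnings. A rough
    sketch only; it misses dynamically constructed URLs."""
    return re.findall(r'(?:src|href)=["\'](http://[^"\']+)["\']', html)

page = '''
<img src="http://example.com/logo.png">
<link rel="stylesheet" href="https://example.com/style.css">
<script src="http://example.com/app.js"></script>
'''
insecure = find_mixed_content(page)
```

Note that the https:// stylesheet is not flagged; only the two resources still loaded over plain HTTP are.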

Why Does Google Chrome Want to Block Mixed Content?

Google Chrome already blocks mixed content, but it’s limited to certain content types like JavaScript and iFrame resources.

Blocked mixed content on a web page

From December 2019, Google Chrome will begin blocking other mixed content resources such as images, audio, video, cookies, and other web resources.

An insecure HTTP file on a secure HTTPs webpage can still be used by hackers to manipulate users, install malware, and hijack a website. This jeopardizes your website security as well as the safety of your website visitors.

It also creates a bad user experience as Google Chrome cannot indicate whether a page is completely secure or insecure.

What Will Happen if a Website is Showing Mixed Content?

Google Chrome has announced a gradual plan to implement mixed content blocking. It will be implemented in three steps spanning the next three releases of Google Chrome.

Step 1

Starting from December 2019 (Chrome 79), Chrome will add a new option to the ‘Site Settings’ menu. Users will be able to unblock the mixed content that Google Chrome already blocks, including JavaScript and iframe resources.

If a user opts out for a website, then Google Chrome will serve mixed content on that site, but it will replace the padlock icon with the insecure icon.

Step 2

Starting from January 2020 (Chrome 80), Google Chrome will begin auto-upgrading HTTP video and audio file URLs to HTTPS. If they fail to load over HTTPS, Chrome will automatically block those files.

It will still allow images to load over HTTP, but the padlock icon will change to the ‘Not Secure’ icon if a website serves images over HTTP.

Step 3

From February 2020 (Chrome 81), Google Chrome will begin auto-upgrading HTTP images to load over HTTPS. If they fail to load over HTTPS, those images will be blocked as well.
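Taken together, the three steps implement an upgrade-or-block policy for media and images. A rough Python sketch of that logic follows; the real browser code is far more involved, and `https_loads_ok` here is just a stand-in for an actual network fetch:

```python
def resolve_resource(url, https_loads_ok):
    """Upgrade http:// URLs to https://; block the resource if the HTTPS load fails.

    https_loads_ok is a callable reporting whether the upgraded URL actually loads.
    """
    if url.startswith("http://"):
        upgraded = "https://" + url[len("http://"):]
        if https_loads_ok(upgraded):
            return upgraded  # auto-upgraded to HTTPS
        return None          # blocked: no HTTPS version available
    return url               # already secure, left untouched

# Pretend only example.com serves its files over HTTPS.
ok = lambda u: "example.com" in u
print(resolve_resource("http://example.com/cat.jpg", ok))  # https://example.com/cat.jpg
print(resolve_resource("http://old-cdn.net/cat.jpg", ok))  # None (blocked)
```

The practical takeaway: a resource only survives the block if an HTTPS version of it actually exists, which is why fixing the URLs on your own site matters.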

Basically, if your website has any mixed content resources that are not upgraded to HTTPS, then users will see the Not Secure icon in their browser’s address bar.

This will create a poor user experience for them. It will also affect your brand reputation and business.

No need to panic though. You can easily prepare your website to fix all mixed content errors.

How to Prepare Your WordPress Website for Google Chrome’s Mixed Content Block

Google Chrome is the most popular browser in the world among both mobile and desktop users.

Leaving your website with incomplete HTTPS implementation or no HTTPS at all will result in loss of traffic, sales, and overall revenue.

Here is what you need to do to prepare your website for these changes.

Move Your Website to HTTPS

If your website is still using HTTP, then Google Chrome will already be showing a ‘Not Secure’ icon when users visit your website.

Not Secure HTTP website

It is about time to finally move your website to HTTPS.

We know that changes like these can be a bit intimidating for beginners. Some site owners postpone the move because of cost, but that is no longer an issue since you can easily get a free SSL certificate for your website.

Other website owners delay it because they think it will be a complicated process and could break their website.

That’s why we have created a step by step guide to easily move your WordPress site from HTTP to HTTPS.

We will walk you through every step and show you how to get that secure padlock icon next to your website address in all browsers.

Finding Mixed Content on an HTTPS Website

If you already have an HTTPS-enabled website, then here is how you will find mixed content on your site.

The first indication of mixed content issues will be visible in Google Chrome’s address bar when you visit your website.

If Google Chrome has blocked a script on your website, then you will see the scripts-blocked shield icon on the right side of the address bar.

Blocked mixed content on a web page

Google Chrome has already blocked the insecure content, which is why the padlock icon on the left side of the address bar does not change.

The second indication that you should look for is the info icon. This icon will replace the padlock if the page you are viewing has mixed content that Google Chrome has not blocked.

Unblocked mixed content

Clicking on the icon will show the notice that ‘Your connection to this site is not fully secure’.

Usually, this content includes images, cookies, audio, or video files. Chrome does not block those files at the moment and that’s why it shows this notice.

If your site has both icons, then this means your site is loading multiple types of mixed content files using HTTP.

Next, you need to find out which files are loaded using insecure HTTP URLs. To do that, right-click anywhere on your website and select ‘Inspect’ from the browser menu.

Console tool in Inspect view showing mixed content errors and warnings

Switch to the ‘Console’ tab in the Inspect window to view page load errors. Look for ‘Mixed content:’ errors and warnings to find out which files are blocked and which files are loaded over HTTP URLs.
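If you prefer to scan a page outside the browser, here is a rough, stdlib-only Python approximation of what the Console reports. It parses static HTML for sub-resource attributes that start with `http://`. This scanner is an illustrative helper, not a plugin feature: it misses resources added by JavaScript or CSS, and because it also checks `href`, it can over-report plain links:

```python
from html.parser import HTMLParser

class MixedContentScanner(HTMLParser):
    # Attributes that can trigger a fetch (href covers <link rel="stylesheet">;
    # plain <a href> links are not mixed content, so this over-reports slightly).
    ATTRS = {"src", "href", "data"}

    def __init__(self):
        super().__init__()
        self.insecure = []  # (tag, url) pairs loaded over plain HTTP

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in self.ATTRS and value and value.startswith("http://"):
                self.insecure.append((tag, value))

html = """
<img src="http://example.com/old-logo.png">
<script src="https://example.com/app.js"></script>
"""
scanner = MixedContentScanner()
scanner.feed(html)
print(scanner.insecure)  # [('img', 'http://example.com/old-logo.png')]
```

Feed it the HTML source of any page on your site and the list will show you which URLs need fixing before Chrome starts blocking them.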

Fixing Mixed Content Errors in WordPress

There are two easy methods that you can use to fix mixed content warnings and errors on your WordPress website.

Method 1. Fix Mixed Content Errors and Warnings Using a Plugin

This method is easier and recommended for beginners. We will use a plugin that finds and replaces HTTP URLs with HTTPS on the fly, before the page is sent to the user’s browser.

The downside is that it adds a few milliseconds to your page load time, though this is barely noticeable.

First, you need to install and activate the SSL Insecure Content Fixer plugin. For more details, see our step by step guide on how to install a WordPress plugin.

Upon activation, go to Settings » SSL Insecure Content page to configure the plugin settings.

Secure Content Fixer plugin settings

Select the ‘Simple’ option and then click on the ‘Save changes’ button to store your settings.

Visit your website to check whether the mixed content warnings and errors are gone.

For more detailed instructions, see our article on how to fix mixed content error in WordPress.

Method 2. Manually Fix Mixed Content Issues in WordPress

This method can get a bit complicated for beginners. Basically, you’ll find the insecure URLs across your website and replace them with secure URLs.

We will still use a plugin to find insecure HTTP URLs on your website. However, you’ll be able to deactivate the plugin once you have changed the URLs, so this will not impact your page speed like the first option.

Let’s get started.

First, you need to install and activate the Better Search and Replace plugin.

Upon activation, you need to visit Tools » Better Search Replace page.

Under the ‘Search’ field, you need to add your website URL with http. After that, add your website URL with https under the ‘Replace’ field.

Better search and replace plugin settings

Click on Run Search/Replace button to continue.

The plugin will now run and find all instances of your website URLs starting with http and replace them with their https versions.
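Conceptually, the replace step is a simple string substitution over your content. A minimal Python sketch of the idea (the `upgrade_site_urls` helper is hypothetical, and real search-replace plugins also handle PHP-serialized data, which a naive replace like this would corrupt):

```python
def upgrade_site_urls(content, domain):
    """Rewrite absolute http:// URLs for your own domain to https://."""
    return content.replace(f"http://{domain}", f"https://{domain}")

post = '<img src="http://example.com/uploads/photo.jpg" alt="photo">'
print(upgrade_site_urls(post, "example.com"))
# <img src="https://example.com/uploads/photo.jpg" alt="photo">
```

Limiting the replacement to your own domain is deliberate: third-party HTTP URLs may not have HTTPS versions at all, and blindly rewriting them would break those resources.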

The plugin works on your WordPress database, so it will only change URLs for your content areas.

If the mixed content resources are loaded by your WordPress theme or plugin, then you will need to inform the theme or plugin developer, so they can release a fix for that.

For more details, see our complete beginner’s guide to fixing common SSL/HTTPS issues in WordPress.

We hope this article answered your questions regarding Google Chrome’s mixed content block and helped you get ready for it. You may also want to see our guide on how to use Google Search Console to grow your website traffic, and the important marketing data you must track on all WordPress sites.

If you liked this article, then please subscribe to our YouTube Channel for WordPress video tutorials. You can also find us on Twitter and Facebook.

Google Announces They Have Achieved “Quantum Supremacy”

Today, Google announced the results of their quantum supremacy experiment in a blog post and a Nature article. First, a quick note on what quantum supremacy is: the idea that a quantum computer can quickly solve problems that classical computers either cannot solve or would take decades or centuries to solve. Google claims to have achieved this using a 54-qubit quantum computer:

Our machine performed the target computation in 200 seconds, and from measurements in our experiment we determined that it would take the world’s fastest supercomputer 10,000 years to produce a similar output.

You may find it helpful to watch Google’s 5-minute explanation of quantum computing and quantum supremacy (see also Nature’s explainer video).

IBM has pushed back on Google’s claim, arguing that their classical supercomputer can solve the same problem in far less than 10,000 years.

We argue that an ideal simulation of the same task can be performed on a classical system in 2.5 days and with far greater fidelity. This is in fact a conservative, worst-case estimate, and we expect that with additional refinements the classical cost of the simulation can be further reduced.

IBM argues that because the original meaning of the term “quantum supremacy,” as proposed by John Preskill in 2012, was the point at which quantum computers can do things that classical computers can’t, this threshold has not been met.

One of the fears of quantum supremacy being achieved is that quantum computing could be used to easily crack the encryption currently used anywhere you use a password or to keep communications private, although it seems like we still have some time before this happens.

“The problem their machine solves with astounding speed has been very carefully chosen just for the purpose of demonstrating the quantum computer’s superiority,” Preskill says. It’s unclear how long it will take quantum computers to become commercially useful; breaking encryption — a theorized use for the technology — remains a distant hope. “That’s still many years out,” says Jonathan Dowling, a professor at Louisiana State University.


A Deeper Look at Google Search Console Speed Reports

Google has added experimental speed reports to Google Search Console. We covered them when Google announced the reports back in May, but very few people had access then. Now, some of us at Search Engine Land have access to these new experimental speed reports.

Larger rollout. Some people on the Search Engine Land team now see this new speed report listed under the “Enhancements” section. It is named “Speed (experimental).” It appears Google has pushed this report out to more Search Console users.

The reports. Since we now have access, we wanted to share our reports from Search Engine Land’s web site.

Broken down by mobile and desktop and the number of slow, moderate and fast URLs.
You can isolate the slow URLs and see tips on how to improve these URLs.
Here is another view showing you the moderate speed URLs and fast URLs.
You can drill into each issue and Google will show you the individual URLs affected.
If you want to show off, Google also shows you the URLs that are super fast. Of course, you can use this to see what you are doing well and apply it to the slower URLs.
You can click on the URL and it shows you more details from PageSpeed Insights.
Here is another example.

Why we care. As we said before, speed is not just important for ranking in Google but also important for your website visitors and for your conversion metrics. Having this data within Google Search Console gives SEOs and webmasters a single place to go and see this information without having to go into the PageSpeed tools. In addition, this report gives you historical data on improvements or possible problems as they get worse over time.



About The Author

Barry Schwartz is Search Engine Land’s News Editor and owns RustyBrick, a NY based web consulting firm. He also runs Search Engine Roundtable, a popular search blog on SEM topics.