Europe’s GDPR took effect in May 2018, but 2019 was the year privacy got real for marketers in the U.S. There was a convergence of legal, technological and cultural factors that forced brands, publishers and tech companies to confront privacy head-on in ways they’d been trying to avoid for years.

CCPA comes into sharp focus

The California Consumer Privacy Act (CCPA) was passed in 2018 and came into sharp focus this year as its January 1, 2020 effective date approached. As we draw closer to that deadline, the IAB, DAA and a host of software companies have introduced “compliance frameworks” and tools to help marketers and publishers address the requirements of the act.

However, there’s still considerable corporate foot-dragging and uncertainty. That’s consistent with what happened with GDPR: many companies operating in Europe are still not fully compliant more than a year and a half later. With CCPA, there won’t be any enforcement actions before July 1, 2020, giving affected marketers some additional time to get in line.

For much of 2018 and early 2019, big tech companies and industry trade groups criticized and fought CCPA, trying unsuccessfully to weaken it with amendments, because of anticipated compliance costs and fear that more limited access to data would harm revenue or disrupt the ads ecosystem. That very much remains to be seen.

The war on third-party cookies and ‘bad ads’

In September, Firefox launched Enhanced Tracking Protection, which included default third-party cookie blocking. Apple updated Safari’s Intelligent Tracking Prevention (ITP) to strengthen its anti-tracking and cookie-blocking capabilities and rules:

  • ITP now downgrades all cross-site request referrer headers to just the page’s origin. Previously, this was only done for cross-site requests to classified domains.
  • ITP will now block all third-party requests from seeing their cookies, regardless of the classification status of the third-party domain, unless the first-party website has already received user interaction.
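The referrer downgrade in the first bullet is easy to picture in code. Below is a minimal sketch of the idea (an illustration of the behavior, not WebKit’s actual implementation; the function name is mine):

```python
from urllib.parse import urlsplit

def downgrade_referrer(referrer: str) -> str:
    """Reduce a full referrer URL to just its origin (scheme://host/),
    dropping the path and query string, as ITP does for cross-site requests."""
    parts = urlsplit(referrer)
    return f"{parts.scheme}://{parts.netloc}/"

# The page path and query string never reach the third party:
print(downgrade_referrer("https://example.com/account/orders?id=42"))
# -> https://example.com/
```

The practical effect is that a third party embedded on many sites can still see which origins it is loaded from, but no longer which specific pages a user visited there.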

Google Chrome, which controls 64% of the global browser market, also expanded third-party cookie blocking, claiming it was doing so in a smarter way (than Apple). And in July, Chrome rolled out ad filtering on a global basis. All ads that fail Better Ads Standards are now potentially blocked.

The rise of ‘surveillance capitalism’

This was also the year when the ominous term “surveillance capitalism” entered the digital lexicon and became mainstream, appearing in books and news articles, culminating in a December 21 NY Times editorial “Total Surveillance Is Not What America Signed Up For.”

China is the leading example of the dark side of digital technology, in the service of domestic surveillance. But in some ways, America isn’t that far behind. And mobile-location tracking is at the center of the debate over privacy and personalization in this country.

Technology companies, which went from being seen primarily as job creators, innovators and purveyors of social good, have been increasingly vilified. Facebook, in particular, stumbled badly in addressing privacy and data scandals it confronted over the past few years, captured in the Netflix documentary “The Great Hack.”

But most technology companies, for reasons that aren’t entirely clear, have failed to educate consumers and the broader market about the value of their services and methodologies. As a result, often sensational journalistic pieces filled the void and helped fuel popular distrust.

Consumers are now highly concerned and even fatalistic about technology and privacy. It’s to the point where 90% of consumers said they would click “do not sell my personal information” under CCPA. We’ll see if that actually happens.

Conclusion: Privacy is your friend

A cultural and legal Rubicon of sorts has been crossed. Privacy will now be a central feature of the user experience going forward. Privacy-conscious consumers will reward companies that are more transparent and shun those that are opaque or manipulative. One could argue the failure of Facebook’s Portal smart display is a byproduct of a lack of trust in the company.

Ethics and trust will also be critical features of a brand’s long-term value. Indeed, there’s early evidence that privacy is becoming a competitive advantage. The way forward for marketers involves a wholehearted embrace of privacy and the creation of genuine value for consumers in exchange for their personal data. There really is no other alternative.

About The Author

Greg Sterling is a Contributing Editor at Search Engine Land. He writes about the connections between digital and offline commerce. He previously held leadership roles at LSA, The Kelsey Group and TechTV. Follow him on Twitter or find him on LinkedIn.


“If you are not paying for it, you’re not the customer; you’re the product…”

It’s obvious and logical. Hidden in plain sight, like “The Purloined Letter.” So much so that you don’t really take it into consideration when designing a website or an app.

We may overlook it because our concern with fonts is usually about copyright and licensing. And that is exactly what sits in plain sight: the fonts are open source! And the text on the About page is inspiring!

Making the web more beautiful, fast, and open through great typography

We believe the best way to bring personality and performance to websites and products is through great design and technology. Our goal is to make that process simple, by offering an intuitive and robust collection of open source designer web fonts.

Google Fonts About page

So… let’s copy-paste them!

Furthermore, these web fonts have been around for almost twenty years, and the majority of websites use them.

But wait… what about privacy? That part is not in plain sight. I think that is what we call a dark pattern.

Well, it is not necessarily a big problem, depending on the requirements of your client. Or what you believe or who you trust. Or your principles about the internet and privacy. I don’t know.

But what I know is that we must be aware of this. And if you are already aware, you must keep it in mind when using web fonts.

There are two links on the Google Fonts About page, one for Terms and the other for Privacy. But they are very general, covering all Google products and services such as Search, accounts and apps.

When you use Google Fonts, the terms that specifically apply are the Google APIs Terms of Service, because you usually embed the fonts with a <link> tag in your HTML, or an @import in your CSS, pointing at one of these URLs: fonts.googleapis.com or fonts.gstatic.com.

The APIs are designed to help you enhance your websites and applications (“API Client(s)”). YOU AGREE THAT GOOGLE MAY MONITOR USE OF THE APIS TO ENSURE QUALITY, IMPROVE GOOGLE PRODUCTS AND SERVICES, AND VERIFY YOUR COMPLIANCE WITH THE TERMS. This monitoring may include Google accessing and using your API Client, for example to identify security issues that could affect Google or its users.

Google APIs Terms of Service

Notice that where it says “to ensure quality, improve Google products and services,” Google Ads, Google AdSense and Google Analytics are all Google products and services. And so is any other product or service for any of its clients or customers. The ones who pay, of course.

And notice also that where it says “to identify security issues,” that is just an example. Nobody is going to complain about monitoring for security, so it’s clever to put the word “security” there.

Google could track the users of your website or app in a similar way to how a pixel-based tracking system works.
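To make that claim concrete: any third-party host, a font CDN included, can record roughly the same fields per request that a tracking pixel yields, with no cookies required. A minimal sketch (field and function names are mine, purely illustrative):

```python
def font_request_record(client_ip: str, headers: dict) -> dict:
    """Data available server-side on every font request, cookie-free."""
    return {
        "ip": client_ip,                          # coarse location, household/device hint
        "page": headers.get("Referer"),           # the page the visitor was reading
        "user_agent": headers.get("User-Agent"),  # browser/OS, a fingerprinting input
    }

record = font_request_record(
    "203.0.113.7",
    {"Referer": "https://example.org/article", "User-Agent": "Mozilla/5.0"},
)
print(record["page"])
# -> https://example.org/article
```

Aggregated across the millions of sites embedding the same fonts, records like these would add up to a cross-site browsing profile, which is precisely how pixel-based tracking works.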

Or not. The problem is that “could.” But the information from all the sites using Google Fonts is too good not to use, right?

So, let’s be aware of that. I became aware of it from a Reddit post.

There are ways around this, of course. The obvious one is to host the web fonts on your own server rather than calling them from fonts.googleapis.com or fonts.gstatic.com. But you should also check the code of any templates or components you use, since they may embed the remote URLs themselves.
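As a sketch of that self-hosting step: after downloading the Google Fonts stylesheet, you need to point its url(...) references at your own server and fetch the font files they named. The snippet below does the rewriting part (a rough illustration; the function name, the /fonts directory and the regex are my assumptions, and downloading is left out):

```python
import re

def localize_font_css(css: str, local_dir: str = "/fonts") -> tuple[str, list[str]]:
    """Rewrite fonts.gstatic.com url() references in a stylesheet to local
    paths; return the rewritten CSS and the remote URLs still to download."""
    remote_urls: list[str] = []

    def to_local(match: re.Match) -> str:
        url = match.group(1)
        remote_urls.append(url)
        filename = url.rsplit("/", 1)[-1]  # keep only the font file name
        return f"url({local_dir}/{filename})"

    rewritten = re.sub(r"url\((https://fonts\.gstatic\.com/[^)]+)\)", to_local, css)
    return rewritten, remote_urls

css = "src: url(https://fonts.gstatic.com/s/roboto/v30/KFOmCnq.woff2) format('woff2');"
local_css, to_download = localize_font_css(css)
print(local_css)
# -> src: url(/fonts/KFOmCnq.woff2) format('woff2');
```

Once the files in to_download are saved under /fonts on your server, visitors’ browsers never contact Google at all.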

For more technical information, check out the Reddit post and this article about fingerprinting by Federico Dossena on his blog.


Handing Google a major victory, the European Union’s highest court ruled Tuesday that the EU’s “right to be forgotten” rules, which let people control what comes up when their names are searched online, do not apply outside the 28-nation bloc.

Over the last five years, people in Europe have had the right to ask Google and other search engines to delete links to outdated or embarrassing information about themselves, even if the information is true. More recently, France’s privacy regulator wanted the rule applied to all of Google’s search engines, even those outside Europe.

But the European Court of Justice declared there is “no obligation under EU law for a search engine operator” to abide by the rule outside the EU.

The court said, however, that a search engine operator must put measures in place to discourage internet users from going outside the EU to find the missing information.

The decision highlights the growing tension between privacy and the public’s right to know, and it underscores the difficulties in enforcing different jurisdictions’ rules when it comes to the borderless internet.

It also illustrates how the internet is regulated more heavily in Europe than in the United States, where authorities are constrained by the public’s 1st Amendment rights to free speech and freedom of the press. The United States has no laws equivalent to Europe’s “right to be forgotten” measure.

Peter Fleischer, Google’s senior privacy counsel, said he welcomed the ruling and added that the Mountain View, Calif., internet search giant has worked hard “to strike a sensible balance between people’s rights of access to information and privacy.”

Those who wanted to see the rule extended beyond the EU argued that on the internet it is easy to switch between national versions of Google’s website — from google.fr to google.com, for example — to find missing information.

Since Google started handling “right to be forgotten” requests in 2014, it has deleted about 1.3 million web links from its search results, or 45% of all requests processed, according to the company’s transparency report.

Takedown requests filed by Europeans are reviewed by Google staff members, based mainly in Ireland, who look into whether the webpage contains sensitive information such as race, religion or sexual orientation; relates to children or crimes committed as a minor; or is about old convictions, acquittals or false accusations.

Last year, Google removed a link to a 1984 German news article about a person’s conviction for hijacking an East German airplane to flee to West Germany because the article was “very old” and related to now-repealed laws against illegal emigration.

Links to pages about a former politician involved in a drug scandal were deleted because they disclosed his home address, and links to information about convictions for rapes, sexual abuse and aiding and abetting terrorism were removed because those offenders had served their sentences.

Google, a division of Alphabet Inc., does not remove such material from all web searches, just when a person’s name is typed in. The material still shows up when other search terms are used.

Google says it may reject a delisting request if the page contains information that is “strongly in the public interest.” That can include material on public figures that relates to the person’s criminal record.

It can also say no if the content consists of government documents or is “journalistic in nature.”

Tuesday’s ruling is final and becomes the benchmark on which courts in the 28-nation bloc must base their decisions relating to such cases.