I think a lot of sites end up suffering from "CDN accumulation": there can quite genuinely be a dozen or more different servers behind a single website, just from all the third-party CSS, JS libraries, analytics and fonts.
The trouble I find with relying on many disparate 3rd party CDNs is the unreliable response times, especially if any of your users are outside of the US. From Australia we often find a site serves locally in under 30ms, and then we sit and wait 300ms per request for all the third-party assets to load.
The first thing I do when looking at optimizing the felt performance of a website is move everything back to the application host, and then add a single CDN if the traffic is likely to be high.
It's interesting because so many web developers over the years were given the advice that spreading assets across a variety of 3rd party CDNs was "faster", and did that intentionally. It may have even been somewhat decent advice for a time, mostly because of the ~6-connections-per-host limit in browsers: more hosts, more parallel requests. Of course parallelizing requests isn't always faster, especially if, as you point out, some of those CDNs underserve certain edges. HTTP/2 and HTTP/3 also change some of the dynamics directly, by massively decreasing connection overhead to the same host, so that even "blocked/serial" requests may finish a lot faster than the time it takes to open a full connection to a different host.

The other rationale was the cache effect: "if everyone uses the same CDN for jQuery, the browser only has to cache it once for everyone". That is no longer true either; to avoid certain privacy issues, browsers have sandboxed their caches per website/host, so even "shared CDNs" no longer guarantee shared caches.
But yeah, post-HTTP/2 and privacy-related CDN sandboxing, especially, it is further and further from good advice.
Not just third-party CDNs. One of the worst examples I have seen is the FlixBus website, which loads JavaScript from like a dozen different CloudFront subdomains, each one gradually adding more functionality to the site (e.g. one subdomain seems to be responsible for autocompleting city names). They seem to have taken the notion of microservices to a whole new level.
One of the slowest parts of a Shopify website is their Cloudflare-backed CDNs (this may have improved; it's been a year since I benchmarked). The app server responds in < 60ms, then the CDNs respond anywhere between 50 and 300ms, which tanked the user experience.
Serving lots of little files, even with HTTP/2, has all the same problems: lots of little requests increase the likelihood of hitting a bottlenecked CDN node. I'm not sure where the idea of serving JS as lots of small modules loaded on the frontend came from, but I've never seen it improve the feel of a website. Single monolithic JS file sites almost always feel better and load faster. It might be better to just inline the small modules into the HTML instead of using a dozen network requests.
Be aware that Google Fonts will adapt the font files and the CSS it serves depending on the browser and OS. Swapping Google Fonts for self-hosted files might result in slightly different rendering. I'm not sure how big the differences are today, but a few years ago they were visible even for some common browser/OS combos.
These days, though, a single woff2 file goes far enough. I have noticed that these auto-optimizing fonts are usually more headache than they're worth. E.g. fonts from Adobe's TypeKit often have a different baseline in Google Chrome than in Safari; it's maddening when it happens, because you can't vertically align the text in buttons in that case.
Even worse than what you mentioned, they sometimes update the fonts and refuse to offer any way to version lock them. Really. Check out this issue for more details: https://github.com/google/fonts/issues/1307. It's all a lot of magic for very little gain.
It depends. For CJK fonts you don't want a single big .woff2; you want to split each font into smaller subsets of characters, which Google Fonts does automatically.
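For reference, the mechanism behind that kind of splitting is CSS unicode-range: the browser only downloads a subset file when the page actually uses characters in its range. A minimal sketch, with made-up font name and paths:

    /* Hypothetical self-hosted CJK font split into subsets; the
       browser fetches a file only if the page uses characters
       from its unicode-range. */
    @font-face {
      font-family: "Example CJK";
      src: url("/fonts/example-cjk-part1.woff2") format("woff2");
      unicode-range: U+4E00-5FFF;
    }
    @font-face {
      font-family: "Example CJK";
      src: url("/fonts/example-cjk-part2.woff2") format("woff2");
      unicode-range: U+6000-9FFF;
    }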
The difference isn't as big as it used to be in terms of file-size optimization, as woff2 is supported by pretty much all browsers and you don't have to serve anything else now. The exception is CJK and Arabic fonts, which have to be split or their woff2 size would be a few MB; this is Progressive Font Enrichment [1].
However, one distinction is font hinting data [2]. Google Fonts serves different files for Windows vs Mac; the Windows variants have hinting included.
This site does seem aware of that and offers CSS options based on what are probably the two most common CSS responses from Google: a "Best Support" option that includes a number of legacy formats, and a "Modern Browsers" option that merely gives the browser a choice between WOFF and WOFF2, depending on whether it supports WOFF2.
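For anyone wondering, the "Modern Browsers" style of CSS is essentially just a src list where the browser takes the first format it supports; a sketch with illustrative names and paths, not the tool's exact output:

    @font-face {
      font-family: "Example Sans";
      font-style: normal;
      font-weight: 400;
      /* WOFF2-capable browsers use the first source and never
         download the WOFF fallback. */
      src: url("/fonts/example-sans-400.woff2") format("woff2"),
           url("/fonts/example-sans-400.woff") format("woff");
    }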
Can you link to the relevant comment? I skimmed the thread and only found mentions of IP being logged (something I already mentioned last comment) and GDPR related lawyering.
That's correct. Perhaps candiddevmike's comment about "browser identification" goes back to where I said, "When you load them off Google Fonts, they come with 'secret sauce' to optimize them for each visitor’s individual browser and device." Perhaps the assumption was that this requires browser identification which, then, somehow is linked to the harvested IP address. I didn't say that specifically, nor do I have the knowledge to say so in any event.
Excellent point, by the way. Under GDPR, embedding something like a Google font is only legal after user consent, simply because they can see your IP address when you connect.
This was confirmed again on Monday by a German court [1] (German link) for the case where they loaded something from Akamai on a university library website. This was specifically deemed problematic since the implicit connection was into the US.
Basically, this means if you operate in Europe you can’t use US cloud providers at all before user consent.
I used this tool many times to download fonts and host them directly for a number of reasons:
- Avoid Google tracking / cookies
- Offline access (software for museums / exhibitions)
- Development even while offline
- Fonts can get updated, and updates sometimes affect letter geometry. I'd like to test before rolling updates out to my users.
However, a word of warning: I found at least one broken font. "Exo 2" has noticeable clipping where two curves overlap. Make sure to check your fonts... especially since, if you have the TTF downloaded from Google Fonts installed on the computer you're testing with, you might not see the text as your users do.
(Possibly Google Fonts is serving different CSS specific to macOS user agents as a current workaround.)
Fontsource (which is similar to this tool but packaged as npm packages and I found mentioned in another thread here) is similarly aware of the issue and suggests the current macOS workaround is to enable it as a Variable Font with slightly different CSS: https://github.com/fontsource/fontsource/issues/243
(They don't enable fonts as variable fonts in their default CSS because of different browser issues with variable fonts.)
Another reason you may want to self host your fonts vs Google Fonts is to support proxy (VPN) users.
Google's gstatic.com service rejects requests from the proxy service I use. This causes websites that use Google-hosted fonts to load with invisible text then show the text after a few seconds. This makes browsing slow. It affects the Flutter & Dart docs.
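One upside of self-hosting here: you also control the invisible-text behavior directly via font-display. A minimal sketch (font name and path are made up):

    @font-face {
      font-family: "Example Sans";
      src: url("/fonts/example-sans.woff2") format("woff2");
      /* "swap" renders fallback text immediately and swaps the
         webfont in when it arrives, instead of leaving the text
         invisible while the font loads. */
      font-display: swap;
    }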
When Google's gstatic.com blocks proxy users, they break sites and apps that use code hosted on it: Google account login, Google Cloud Console, Google Maps, Google Search, search in all Google apps, etc. I'm a paying customer of Google Apps for Business and was unable to use their support site to contact them. To contact Google Support, you must get a Support PIN code. The PIN only lasts an hour, so the one I wrote down last year doesn't work anymore. It seems that there is no way for a proxy user to contact Google Support to report problems logging in. I'm transitioning away from Google products.
Serious question: Why the hell do all modern browsers not come with all of these fonts bundled in, like they do with CA certificates?
Google Fonts and all the free fonts that ship in Debian/Ubuntu are as close to a de facto standard as anything on the internet. Why not just lazy load them on first launch of a browser (or first use on a webpage) and cache them forever?
On use, the font files are cached for a year. Until relatively recently, that cache would’ve been shared across all sites, so it almost worked the way you described.
Now, the cache is per-site.[1] The arguments for partitioning the cache apply to fonts just as much as to other resources.
There have been thoughts[2] and proposals[3] around ways to make shared libraries without running into those issues, but nothing has really materialised AFAIK.
The broad issue is that 'just lazy load on first use' creates privacy problems, and the idea of picking a list of popular resources to cache on launch never really goes anywhere.
For my personal site, i switched from using Open Sans through Google Fonts, to downloading all of the charsets (latin/latin-ext) and varieties (regular/bold) that i needed, setting up the fonts with @font-face and src, and then just serving them out of the same container that i do the rest of the site assets. So essentially now i can run everything i need with a single Docker container. I even did the same for the Material Icons font that my site uses (due to using the Materialize library, because that's what i felt like using). Admittedly doing all of that manually wasn't exactly a fun experience.
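For illustration, the resulting setup is one @font-face rule per weight, roughly like this (paths are made up, not my actual ones):

    /* Self-hosted Open Sans, regular and bold. */
    @font-face {
      font-family: "Open Sans";
      font-style: normal;
      font-weight: 400;
      src: url("/assets/fonts/open-sans-latin-400.woff2") format("woff2");
    }
    @font-face {
      font-family: "Open Sans";
      font-style: normal;
      font-weight: 700;
      src: url("/assets/fonts/open-sans-latin-700.woff2") format("woff2");
    }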
The downside of this is that doing so increases the total page size ~2x for my simple homepage; around 155 KB of the 275 KB is just fonts: https://gtmetrix.com/reports/kronis.dev/EIHsevSv/ Given that, i find it incredibly odd that we as an industry don't seem to worry much about the size of the fonts that we use vs the characters that they actually support. This probably isn't all that relevant with Open Sans in particular, but some of the other popular fonts can result in even larger download sizes, even if you don't need icon fonts or anything of the sort, or when you want to use more than 1 font per page, for example a different one for headers.
But without that being too big of a focus, it feels like at the end of the day one must either just use the built-in web safe fonts, or use bunches of bandwidth, oftentimes for no good reason (in my case, i wanted the typography to be a bit more consistent with my blog, which already uses Open Sans, whereas companies usually care about branding and so on).
One of the biggest things we could do as a community is normalize local installs of major font libraries (say, top 50 of Google Fonts, for instance). It's relatively easy to do, and easy to automate, and could be done with OS or browser updates.
The chicken/egg problem, though, is that we don't want to encourage individual users to install fonts, because that becomes a privacy deanonymization vector (existing trackers in the wild will request random Google Fonts and time roughly how long they take to paint in an offscreen canvas). It really needs to be an OS-wide or browser-wide initiative (like the original web safe fonts initiative that gave us Georgia/Verdana/Comic Sans MS/et al "everywhere").
(Though if an individual wants to do it, Skyfonts is an easy to find app that has a "download the Top X Google Fonts" option that's a quick choose your X and go and can make your browser experience noticeably faster/better without so much bandwidth spent on fonts.)
The Decentraleyes extension supplies local copies of popular CDN resources but it doesn't support Google Fonts yet unfortunately - this sounds like the obvious way to implement it.
What is wrong with using the default system fonts for the majority of applications? I mean, unless you are doing something artistic or whatever, why not go with whatever system font is configured for the user? I could never understand this.
If that is the case, why not advocate for updating the system fonts? It's not like Windows or Mac, for example, are shipping their default fonts for no good reason.
It doesn't allow specifying which OpenType features to keep and which to drop, though. Tabular numbers are useful, and I've also found some use for small caps, for example (see the CSS sketch below).
FontSquirrel also has a similar generator and converter which in theory gives more detailed control over which features to keep, although in practice an attempt to subset Fira Sans or FiraGO produced a font with broken ligatures, so not all that useful for me, either.
In the end I just subsetted and converted the font myself using the Python FontTools.
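For what it's worth, once the subsetted file actually retains those OpenType features, turning them on in CSS is the easy part; a sketch (class names are arbitrary):

    /* Tabular numbers, e.g. for aligned columns of digits
       (the OpenType "tnum" feature). */
    .tabular {
      font-variant-numeric: tabular-nums;
    }

    /* Small caps, where the font carries real ones
       (the OpenType "smcp" feature). */
    .small-caps {
      font-variant-caps: small-caps;
    }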
I wrote https://github.com/mmastrac/webfont-dl for this purpose a few years back (although it was designed more for the command line). It still works, though I probably should change the defaults to be a little more modernized.
Browsers will not just use cached fonts but also locally installed fonts, if they have them. A base64 data URL seems like a waste of bandwidth for fonts to me.
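One way to make that explicit is local() sources in the src list, so the download is skipped entirely when the user already has the font installed; a sketch with made-up names:

    @font-face {
      font-family: "Example Sans";
      /* Try locally installed copies first (full name, then
         PostScript name); only download if neither is found. */
      src: local("Example Sans"),
           local("ExampleSans-Regular"),
           url("/fonts/example-sans.woff2") format("woff2");
    }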
Do you have a source on that? The individual details might vary, but generally speaking the Google Fonts repo is made up of permissively-licensed fonts. [0]