Heads up, buttercup: Last week I said the site was moving to a new IP. That actually didn’t happen until this weekend. Now the https is all set up. The links are updated, and now if you use the forum link in the header you should be taken to the secure version of the site.
While I was at it, I fixed the CSS so that the large first letter of a post wouldn’t get cut off if you began a post with a yellow aside box, which was the case for every post in Bob’s Game of Thrones series.
Technically, you should be able to access this blog via https://www.shamusyoung.com/twentysidedtale. However, I’ve been having mixed results with this. I go to the front page, switch to the secure version, but then after clicking on some links I look up and see I’m back at the vanilla version. But other times not?
WordPress wants a fully-qualified URL for the root of the site. It prepends this to all automatically generated links, which includes a lot of the navigation stuff. I’m assuming this is the root of the problem, but I don’t know. The problem seems pretty inconsistent. It happened constantly last night, but tonight I’m having trouble getting it to happen.
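If the root URL really is the culprit, the usual fix is to point WordPress at the https address explicitly. A sketch, assuming a standard wp-config.php (WP_HOME and WP_SITEURL are real WordPress constants; the exact URL here is just illustrative):

```php
// wp-config.php — make WordPress generate https links everywhere.
// WP_HOME is the address visitors use; WP_SITEURL is where WordPress lives.
define( 'WP_HOME', 'https://www.shamusyoung.com/twentysidedtale' );
define( 'WP_SITEURL', 'https://www.shamusyoung.com/twentysidedtale' );
```

These constants override whatever root URL is stored in the database, so the automatically generated navigation links pick up the https scheme.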
I could change the ENTIRE site to ALWAYS be https, but I don’t know if there are drawbacks or problems to doing this. Would that be slower in some situations? Would that cause hassles? I don’t know what the drawbacks of https are so I’m wary of making sweeping changes without gathering a little feedback first.
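For reference, the sweeping change is usually done with a server-level redirect rather than inside WordPress. A sketch of the standard Apache approach, assuming the site runs on Apache with mod_rewrite enabled:

```apache
# .htaccess — send every http request to its https equivalent.
# R=301 marks the redirect as permanent so browsers remember it.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

With this in place, even old http bookmarks and links land on the secure version automatically.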
Anyway, don’t worry if none of this makes sense to you. This is a small thing impacting an even smaller number of people. The point is, I did a technical thing that changed some stuff that is probably working or whatever.
“large first letter of a post”
…what? I don’t remember ever seeing large first letters in posts… I mean, the capital letters in post headers are larger, but I don’t think that’s what you mean.
That said: I don’t really mind, it all looks fine to me.
And: YAY for HTTPS! Works fine! I tried a few links, and they’re all https for me.
With CSP you can instruct browsers to rewrite all links to https if they loaded the https version of a site: https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Content-Security-Policy/upgrade-insecure-requests
Since you’re worried about breakage: You can set Content-Security-Policy-Report-Only (explained at the bottom of that page), then browser will not actually enforce the rule, but report violations.
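For reference, both of these are one-line response headers. A sketch of what they might look like in an Apache config (assuming mod_headers; the report endpoint path is made up):

```apache
# Enforcing: browsers rewrite http:// subresource requests to https://.
Header always set Content-Security-Policy "upgrade-insecure-requests"

# Dry run: report would-be violations of an https-only policy without
# actually blocking anything. The /csp-reports path is hypothetical.
Header always set Content-Security-Policy-Report-Only "default-src https:; report-uri /csp-reports"
```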
Shamus, the RSS feed still links to the http version. Do you need to change it manually? Or is there a new feed with links to the https version?
Also, while the blog works perfectly with https, the forums do not for some reason.
I noticed this too. If you manually go to the https version of the website, the rss link at the top of the page is an https feed, and links from that feed seem to lead to the https site just fine.
Hmm, not working for me. I manually went to the https version, deleted my current RSS feed, entered the new one from the https page, and it’s still taking me back to the non-secure version of the site. Using Feedly on Chrome, if it matters.
That’s how it happened to me for the comments that were already loaded with http, but the ones made after I entered the https version worked properly.
Ohh, ok. I’m only subscribed to the posts, not the comments, so I’ll just wait for the next post to check it. Thanks.
It’s also possible that the browser is caching the page. From a security point of view one would assume that a browser would fetch the https version if the http one is cached, but who knows, I certainly haven’t tested that.
Another thing to keep in mind is that any URL that starts with // will default to either https:// or http:// depending on the scheme of the page that loads it.
A website should usually not hard-code http:// or https://, but use relative URLs instead.
And in the future browsers will always try https first (and only try http if there is no https version of a site).
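A sketch of what that looks like in markup (the font URL is just an example; the stylesheet path in the second line is made up):

```html
<!-- Protocol-relative: the browser reuses the scheme of the current page,
     so this loads over https on the https version of the site. -->
<link rel="stylesheet" href="//fonts.googleapis.com/css?family=Noto+Sans">

<!-- For same-site assets, a root-relative URL avoids the issue entirely. -->
<link rel="stylesheet" href="/twentysidedtale/style.css">
```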
BTW! If anyone has the site bookmarked you will need to re-bookmark it as Chrome at least treats http and https bookmarks as separate sites.
As does Firefox.
I suspect that every bookmark that included the protocol part will need to be updated.
If you use HTTPS Everywhere, you can add a rule to always use the secure version of the site. (You must do this from the secure version)
Technically, secure web traffic is more work than insecure. I’d still vote for forcing https on for all content.
The only thing I noticed (when I popped into the dev console to check your certificate) was this:
It doesn’t appear to be actually causing any issues, but hey, it’s a thing.
A quick fix for this is to change it to ‘//fonts.googleapis.com/css?family=Belgrano’ and ‘//fonts.googleapis.com/css?family=Yanone+Kaffeesatz’. This will make the browser do an https call if the page is https, or http if the page is http. This is also how Google does it in their own scripts.
“It doesn’t appear to be actually causing any issues,” but it has the potential to, with restrictive firewalls or very secure browser modes.
Also “JQMIGRATE: Migrate is installed, version 1.4.1” is showing up in the console here, wasn’t this debug info removed previously, or was it never removed in the first place?
There really isn’t any harm to just hard coding https://; loading a secure URL from an insecure page is fine.
???????
That font – and all references to it – was removed months ago. I’m looking at the site and the RSS sheet, and I find no text matching “Yanone”. (Or Belgrano, which Roger is also seeing.) The only font I’m requesting is Noto Sans, and that’s done via https.
I can’t imagine how this would be happening unless people were somehow caching a months-old style sheet. I just… what’s going on here?
I don’t see it in the stylesheet my browser loaded, either. Or in the page source (except for in these comment bodies).
The Chrome dev console shows that the error is coming from “shamusyoung.com/promo/index.php”. If I’m reading it right. My webdev knowledge is limited to being pretty decent every two years or so when rebuilding/designing/fixing the company website, and in between those times is absolute shit.
Ah! The promo box. That explains it. Thanks so much. That was driving me crazy.
For the curious: The promo box is the “From the Archives” thing that appears below each post. I should probably just make that a WordPress plugin so it inherits the style sheet and doesn’t need its own.
Hey Shamus, regarding performance:
https://www.webpagetest.org/result/170702_QW_KVB/
https://www.webpagetest.org/result/170702_KE_KW0/
The difference isn’t huge. First Byte takes about 180ms longer, and the total page load/render is about half a second longer.
Also, two of the fonts are fetched over http while the rest are fetched over https, which could account for some of the difference, since the browser needs to make both http and https connections to Google Fonts rather than just one https connection.
There are much larger gains from optimizing image compression.
https://webspeedtest.cloudinary.com/results/170702_KE_KW0
PS! This does not take into account HiDPI screens, in which case higher-res images (scaled down in the browser) give a clearer result.
One neat “trick” is to use larger images with more aggressive compression; this should result in a better image than a lower-res one with less strict compression (the image speed test does not take this into account).
Wow, that’s a lot of savings — although sometimes the suggestion is just to decrease jpeg quality, which may not always be the best idea. Some of the logos and cartoon-y images would probably take up way less space as high-res, non-antialiased PNGs. My experience with line graphs is that they’re smaller at 4 times the edge width and no AA than they are with 4xAA. Of course, once they’re in jpeg format, png will try to losslessly compress the jpeg artifacts, and that kills it.
But really, SVG is my favourite format for that kind of stuff anyway. Is anyone actually using that on the web these days?
Also, webp seems like a pretty decent format.
I actually use SVG where possible. Sometimes the SVG and PNG sizes end up about the same, but the benefit of SVG being vector outweighs that.
Some minor symbols may actually take up less space as SVG than as PNG (like a play or pause button symbol).
To be somewhat HiDPI friendly I tend to make PNGs slightly larger than needed; it also looks nicer if anyone zooms in.
As to decreasing quality: I’ve found that certain “flat” line art or cel-style colored images that you might think would need to be 24-bit, or at the very least 256-color (8-bit) PNG, can actually be reduced to 16 colors (4-bit) without any real visible difference (unless you know where to look). I do this a lot when making images to be used/inlined in manuals.
I use a few “tricks”. For example, I use IrfanView and make sure that color subsampling is turned off when saving JPGs. This avoids some color artifacts (like the colors being a little off or washed out), but the image gets more difficult to compress. Anything from 65% to 85% JPG quality is usually good for my use.
With PNG I test to see how much color reduction I can do in IrfanView; sometimes I can go down to 16 colors, other times 24 or 32. When you see colors go wrong it’s time to dial it up a little again. I make sure to turn off Floyd–Steinberg dithering and instead let it do “best color quality”. This is really slow and you may need to fiddle with retrying different color reductions, but if you get it right it compresses very well (if dithered, compression works very poorly). Then I use PNGGauntlet to go over the PNGs and squeeze out some more byte savings.
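The dithering point is easy to demonstrate: PNG’s compression stage is DEFLATE (the zlib algorithm), which shrinks long runs of identical bytes to almost nothing but can do little with noise. A minimal sketch in Python, simulating dither as per-pixel noise:

```python
import random
import zlib

# A flat, single-color region: one repeated byte value.
flat = bytes([7]) * 65536

# The same region with simulated dithering: each "pixel" nudged by -1/0/+1.
random.seed(0)  # fixed seed so the result is repeatable
dithered = bytes(7 + random.choice((-1, 0, 1)) for _ in range(65536))

# The flat data compresses to a handful of bytes; the dithered data stays
# thousands of times larger because the noise carries real entropy.
print(len(zlib.compress(flat)), len(zlib.compress(dithered)))
```

Real PNG encoding adds row filtering on top of this, but the underlying effect is the same: dither noise defeats the compressor.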
With WebP I try both lossless mode and lossy mode. I usually use around 85% quality in lossy mode; sometimes lossless ends up smaller than lossy. And I usually allow more aggressive quality reduction on the alpha channel than on the pixels themselves, since slightly blurry alpha is rarely noticeable on icons or buttons. For other UI elements the reverse may be true, and you’ll need lossless alpha but can accept lossy color pixels.
For images that are created once but displayed an unimaginable number of times, the effort is worth it.
I do the same thing with artwork for Windows programs and manuals as well. The fewer bytes to download, the less waiting for users.
Yay for limited-palette PNGs!
I do that as well, although it’s sometimes a little overkill for the kind of documents I make (technical docs, reports, papers…)
I’m also deeeeeeply frustrated with the lack and patchiness of support for any vector format in any publishing software. LaTeX used to run on EPS; now I’m reading things like “you can make EPS work if you work around xy…” in a completely serious voice. Open/LibreOffice used to funnel EPS right through to PDFs; now it tries to interpret them (and fails), and the same goes for SVG.
Microsoft effing _invented_ WMF and EMF, but first they started to look different on different machines, and then MS simply stopped supporting them, or any other vector format.
This is crazy and enraging.
So most of the time, I’ll generate SVGs, then export them as EPS or PDF from Inkscape and convert that to PNG without anti-aliasing in GIMP (because the PNG export in Inkscape cannot turn AA off, and neither can GIMP’s SVG import). Or I just use EPS in LaTeX, with the good old TeX->DVI->PDF process.
Really, everyone just needs to get on and support EPS and SVG as passive graphics formats for display and print.
AFAIK, the only major performance downside of HTTPS should be in the initial handshake, which requires some computation on both client and server side for generating the key pairs, as well as a few extra packet exchanges (which make up most of the delay). Then there’s a little CPU overhead during steady state communication since you have to encrypt your messages before sending them and decrypt them when you receive them, but that’s basically negligible.
Don’t half-ass it. Mixed content negates the entire point of encrypted connections. Switch everything over to HTTPS.
Your implementation needs some tweaking to meet current best practice – https://www.ssllabs.com/ssltest/analyze.html?d=shamusyoung.com
OK, here I was thinking all of the GOT episodes started with a third-person aside: “his series analyzes the show…”
But no, it’s supposed to be *This* series.
I like the third person intro better. plz2 reintroduce the bug. :)
This seems relevant (if poorly timed on Ars Technica’s part):
HTTPS Certificate Revocation is broken, and it's time for some new tools
https://arstechnica.com/security/2017/07/https-certificate-revocation-is-broken-and-its-time-for-some-new-tools/
I haven’t actually read it yet – it’s pretty looooooooooooooooooooong…
This is partly why “Let’s Encrypt” puts a 90-day expiration on certs.
I tried switching parts of my org’s website to https, but I ended up with a LOT of hassle from “partially insecure” pages. Lots of auto-generated stylesheets were incorrectly trying to grab non-secure copies of files even when told to pick the https version.
So I eventually gave up trying to diagnose and just switched the whole thing to an auto-redirect to the https version.
Well, good news. My work’s firewall isn’t blocking the site anymore!
Hey Shamus. You sometimes make updates to the site, and write about them, and sometimes complain or comment on how WordPress does things. I was wondering if you ever considered switching to Squarespace or a similar service.
I listen to a lot of podcasts, and they seem to sponsor/advertise on a lot of them, and the people reading the ad copy seem to think they’re a pretty good service (though they could be saying that because they have to).
Thing is, WordPress is free. Squarespace does not seem to be free. So even if it’s 10 times better, it’s still infinity times more expensive.