does the latest firefox really stop browser fingerprinting?


I ask because I’m running the latest version on my MacBook Pro (mid-2015). You can see the version I’m running below.

I have set Firefox’s Enhanced Tracking Protection (under Privacy & Security) to Strict. I’m also running Privacy Badger. Yet when I run the Panopticlick 3.0 test (see above) it shows my Firefox browser “has a nearly-unique fingerprint.” I could try Custom, which lets you pick exactly what gets blocked, but I have no idea which trackers to block.
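To see why Panopticlick can call a browser “nearly unique” even with tracking protection on, here’s a minimal sketch of how fingerprinting works: a script reads a handful of attributes no cookie is needed for, concatenates them, and hashes the result. The attribute names and values below are illustrative stand-ins, not read from any real browser.

```python
import hashlib

def fingerprint(attributes: dict) -> str:
    """Derive a stable identifier by hashing the sorted attribute pairs."""
    canonical = "|".join(f"{k}={v}" for k, v in sorted(attributes.items()))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

# Illustrative attributes a fingerprinting script can read without cookies.
browser_a = {
    "userAgent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.14) Firefox/70.0",
    "screen": "2560x1600x24",
    "timezone": "America/New_York",
    "fonts": "Arial,Helvetica,Menlo,Monaco",
}
# Same machine, one font fewer installed:
browser_b = dict(browser_a, fonts="Arial,Helvetica,Menlo")

print(fingerprint(browser_a) == fingerprint(browser_b))  # prints False
```

The point is that tracker blocking doesn’t help here: any one attribute is common, but the combination of many of them is what makes the hash nearly unique, and a single difference (one font, one screen size) yields a different fingerprint.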

Frankly I’m beginning to wonder if all this “tracking protection” is pretty much bogus no matter who offers it, Mozilla especially.

i’m no longer adding hyperlinks to arcanesciencelab posts

I have given up placing HTML hyperlinks in my postings. I’ve never been a big fan of how the WWW eventually specified hyperlinking. It was originally envisioned as being bi-directional, but instead what we got was the outbound link and the back button, a barely usable solution. The reasons for my dropping hyperlinking are threefold.

  1. The EU link tax. Yes, folks, the EU passed a link tax primarily aimed at news aggregators (specifically Google). As I read the body of the law passed by the Eurocrats, I shake my head in disbelief at their ignorance of how the Web really works. France has been particularly egregious in rushing to implement this tax. If you want to read more than what I’ve written here, search the web for “Just As Everyone Predicted: EU Copyright Directive’s Link Tax Won’t Lead To Google Paying Publishers”. You can do this in your browser by highlighting the text between the quotes, then right-clicking and choosing search.
  2. Link rot. I don’t know how many times I’ve gone back to re-read one of my older award-winning posts, then clicked on some of the outbound links at random. Most of the time I can still reach the link, but often I either reach the site and the page is a 404, or the site itself is completely gone. The older the post, the worse the problem becomes. A very long time ago, on another blog I wrote on, I was all about keeping the links fresh and correct. But I quickly discovered that cleaning up dead links was eating too much of my time. I know the web is dynamic and in a constant, slow state of flux, but I don’t need to keep fixing broken and borked links because something changed on the other end.
  3. Manipulation of the back button via JavaScript. When you click a link you expect to be able to go back to where you started by hitting the back button. More and more sites intercept the back button and block you from leaving (I can still get out by closing the tab). The most current example is Slashdot (/.). I click a link on the front page to go into a story, then try to go back, and instead of the front page I get a crap page telling me to stick around and read more, except it’s all click-bait, nothing I would care to visit directly. I’ve pretty much given up on Slashdot, and on a lot of other sites that didn’t use to pull these kinds of shenanigans. I’m not interested in inadvertently sending a reader to such a site from my pages via hyperlink.
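For anyone who does want to keep links alive rather than abandon them, the link-rot audit in point 2 is easy to automate. Here’s a sketch: pull every `href` out of a post’s HTML with the standard-library parser, then feed each URL to a checker that distinguishes a dead page (404) from a dead site (unreachable). The `check_link` function and the example URLs are illustrative; run it against your own archive.

```python
from html.parser import HTMLParser
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag in a post's HTML."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html: str) -> list:
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

def check_link(url: str, timeout: float = 10.0) -> str:
    """Classify a link: 'ok', an HTTP error code like '404', or 'gone'."""
    try:
        with urlopen(Request(url, method="HEAD"), timeout=timeout) as resp:
            return "ok"
    except HTTPError as err:
        return str(err.code)   # site answered, but the page is missing
    except URLError:
        return "gone"          # the site itself is unreachable

post = ('<p>See <a href="https://example.com/a">this</a> and '
        '<a href="https://example.com/b">that</a>.</p>')
for url in extract_links(post):
    print(url)  # feed each into check_link(url) to audit an old post
```

Even with the audit automated, you still have to decide what to do with each dead link, which is exactly the upkeep I’m opting out of.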

As of today I’m saying to entropy and general human greed, “OK, you win.” Hyperlinking on this site is a thing of the past. And in case you ask, no, I won’t go back through all my older posts and strip the links out. That’s as bad as finding and fixing all the broken ones.