Currently still in Nightly and only on ‘Copy Link’. Still nice progress though.
Great stuff. Firefox swinging the big dick about lately.
Just slappin the table with that fuckin toddler’s leg.
brandishing that baby arm holding an apple
Baby arm holding an apple, or an apple holding a baby arm?
The John Holmes of browsers
firefox removes tracking data
chrome wants to make it possible for websites to refuse to serve you data if you run unapproved software
there is, in fact, a good guy in the browser wars
Reading the chrome bit just made me disgusted lol
“refuse to serve you data”, read as, “refuse to show you ads”, yeah right
I would love for you to expand on what you were trying to say with this comment, in great detail. I would consider it a favor.
Ooh. I hope this can be set as the default for ‘Copy Link’ and then I can just have the tracking when I actually want it.
have the tracking when I actually want it.
So never? I agree defaulting it would be great as long as it doesn’t falsely remove anything.
Not all query parameters are tracking, so the option to copy the actual href of the hyperlink is useful.
Most of the time I appreciate a feature that strips them automatically
Yea, existing extensions get things wrong sometimes. I’m sure this will be the same, especially if sites start changing things to circumvent it. There needs to be an easy way to grab the real URL just in case.
I think I’m against this. Not because it’s the wrong thing to do, but it’s just going to push marketers & such to obscure their tracking URLs into something like
/my-slug/hashed-uid-for-tracking-without-query-param/post
& it may be unsafe or impossible to strip that part of the URL in some cases (think how not all credit card numbers are valid; there’s a built-in check algorithm). The corpos can already do this now, but query params are easier & less fiddly. Despite the large number of add-ons that could already combat this (including a uBlock Origin filter list), there wasn’t enough incentive to start another ad/tracking arms race… but introduce it as a default feature in a major (🤞) browser, & now the corpos take notice instead of being able to wave it off as something a minority of users are doing.
…And I say this as the guy who reminds $WORK chat posters to remove their tracking URLs for the privacy of the group.
Do it for affiliate links too.
Actually curious. What’s wrong with people making money?
Nothing wrong with people making money if they are honest about their shilling and tell you upfront if they are affiliate links and they get a cut if you click on them, or that their product review is sponsored. One of my favorite yt channels is cheaprvliving and Bob will be straight and honest with you about that and I like that.
I do think it’s a little sleazy when creators aren’t honest with you about shilling and making money from affiliate links.
In many cases a lot of sites don’t make it clear that they have a conflict of interest.
You want to push a product on me and you’ll get a cut? Cool, but disclose that.
Nothing, provided it is honest and upfront, with consent from the user. The problem is the vast majority of affiliate links are non-consensual, buried in articles, and in the worst case are the reason the pages even exist: “top ten dishwashers”, “50 gifts to buy your wife for Christmas”, and other clickbait garbage. I doubt most visitors even understand that’s why those pages exist or the financial remuneration their authors get from making these lists.
So it would not be a bad thing if a browser detected an affiliate link and asked whether you want to follow it as-is or strip the affiliate info out, with a checkbox to remember the decision for the site.
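For the well-known parameters that wouldn’t be hard to build. A rough sketch of the detect-and-ask idea (the parameter names here are just common guesses, not any authoritative list, and the actual prompt UI is hand-waved):

```typescript
// Sketch: detect common affiliate parameters and offer a stripped version.
// The parameter names below are illustrative guesses, not a complete list.
const AFFILIATE_PARAMS = new Set(["tag", "ref", "affid", "aff_id"]);

function findAffiliateParams(link: string): string[] {
  const url = new URL(link);
  return [...url.searchParams.keys()].filter((k) =>
    AFFILIATE_PARAMS.has(k.toLowerCase())
  );
}

function stripAffiliateParams(link: string): string {
  const url = new URL(link);
  for (const key of findAffiliateParams(link)) {
    url.searchParams.delete(key);
  }
  return url.toString();
}

// A hypothetical Amazon-style link with a "tag" affiliate parameter:
const link = "https://shop.example.com/dp/B000000000?tag=somecreator-20";
if (findAffiliateParams(link).length > 0) {
  // here the browser would ask "follow as-is or strip?" and remember the answer
  console.log(stripAffiliateParams(link)); // https://shop.example.com/dp/B000000000
}
```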
That’s awesome! I use the Clear URLs extension and it does a great job but it’ll be nice to have this capability baked in.
Hope one day Firefox integrates uBlock Origin by default.
What you want is Librewolf then.
This may be difficult to maintain as some query parameters might be necessary. How will they be sure they’re not stripping essential elements? Won’t this become an arms race to mask tracking elements as “legitimate” looking parameters?
Awesome if they can pull it off, though.
There are common, well-known tracking parameters that Google uses such as the ones starting with “utm_”
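Stripping just those is only a couple of lines, e.g. something like this (a sketch of prefix-based stripping, not Firefox’s actual list or code):

```typescript
// Sketch: drop every query parameter with the "utm_" prefix.
function stripUtmParams(link: string): string {
  const url = new URL(link);
  for (const key of [...url.searchParams.keys()]) {
    if (key.toLowerCase().startsWith("utm_")) url.searchParams.delete(key);
  }
  return url.toString();
}

console.log(
  stripUtmParams("https://example.com/article?id=42&utm_source=newsletter&utm_medium=email")
);
// -> https://example.com/article?id=42
```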
Most of the time sharing utm links isn’t helpful to the origin anyway: if you copy a link from your email it’ll have medium=email, when it should really now be medium=direct.
Anything is better than nothing. Besides, it’s still useful because you can see where the original link was copied from, and you still have the referrer header
I like the ‘:has’ pun in the title too. Supporting that is a real game changer!
Another fine addition would be cutting out redirect trash. If you post a link on some social network, it will sometimes replace it with its own link that goes through an outbound-click tracker, ‘Do you really want to follow this?’ safety pages, or just block you, saying the link seems malicious.
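Unwrapping those is doable when the real destination is still visible in the wrapper URL. A rough sketch (the parameter names are hypothetical; real networks differ, and some only resolve the target server-side, which a client can’t unwrap at all):

```typescript
// Sketch: pull the real destination out of an outbound-click tracker URL.
// "u", "url", and "target" are illustrative guesses at wrapper parameter names.
const WRAPPER_PARAMS = ["u", "url", "target"];

function unwrapRedirect(link: string): string {
  const url = new URL(link);
  for (const name of WRAPPER_PARAMS) {
    const inner = url.searchParams.get(name); // already percent-decoded
    if (inner) {
      try {
        return new URL(inner).toString();
      } catch {
        // value wasn't a full URL; keep looking
      }
    }
  }
  return link; // nothing recognizable to unwrap
}

console.log(unwrapRedirect("https://social.example/outbound?u=https%3A%2F%2Fblog.example%2Fpost"));
// -> https://blog.example/post
```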
If I’m a 3%er, can I choose not to have it?
There is an extension called FastForward which skips ad-link redirectors.
Not sure if that’s what you mean.
Sorry for necroposting but are you the real linux creator
Lmao no
oh okay
This is great because I hate manually removing that stuff before I share a link. And I always have to test the link before clicking send.
How does firefox determine which params are trackers and which params are required data?
deleted by creator
So just a set of strings determined to be used for tracking among a set of hosts? It’s not like I have a better solution, but I feel like making this anti-tracking method encourages more complex tracking params. At some point, I wouldn’t be surprised to see randomly generated query parameter keys which are resolved server side, making this approach impossible.
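i.e. conceptually something like this, as far as I can tell (purely illustrative; not Firefox’s actual list, format, or code):

```typescript
// Sketch of what a strip list might look like: a global list of known tracking
// parameter names plus per-host entries. Names and structure are illustrative.
const GLOBAL_STRIP_LIST = new Set([
  "utm_source",
  "utm_medium",
  "utm_campaign",
  "fbclid",
  "gclid",
]);
const PER_HOST_STRIP_LIST: Record<string, Set<string>> = {
  "shop.example.com": new Set(["tracking_id"]), // hypothetical site-specific entry
};

function stripTracking(link: string): string {
  const url = new URL(link);
  const hostList = PER_HOST_STRIP_LIST[url.hostname] ?? new Set<string>();
  for (const key of [...url.searchParams.keys()]) {
    const k = key.toLowerCase();
    if (GLOBAL_STRIP_LIST.has(k) || hostList.has(k)) {
      url.searchParams.delete(key);
    }
  }
  return url.toString();
}
```

And the moment a site switches to randomly generated keys resolved server-side, a list like this has nothing stable to match on, which is exactly the arms-race worry.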
I completely agree with you, but this is always the problem isn’t it? It’s a cat and mouse game. If the mouse learns techniques to hide from the cat the cat develops better techniques to find the mouse. It’s not a reason for the mouse to stop hiding.
It’s a shame they went for the additional option rather than a persistent setting that would always strip it from a URI.
uBlock can strip it from the URL automatically, though if the website checks this it might block stuff :/
Would need some deeper integration to be able to separate the URL the website sees from the one you see (like the current separation of the DOM).