
The TikTok pixel is not actually a pixel like in the old days. It is not a 1x1 transparent image loaded from their servers. It is executable javascript code. All you have to do to stop 99% of the corporate spying is disable unsafe remote code execution.
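
For illustration, here is the difference (URLs hypothetical, not TikTok's actual embed snippet):

    <!-- Old-school tracking pixel: a 1x1 image fetch; no code runs -->
    <img src="https://tracker.example/pixel.gif?event=pageview" width="1" height="1" alt="">

    <!-- Modern "pixel": a script tag whose payload executes with full access to the page -->
    <script src="https://analytics.example/pixel.js" async></script>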

It's hard to believe I have to say that after the many decades of people getting it drilled into their heads "Do not open random email attachments," but here we are in a dark future where everyone will say that not automatically running untrusted code is stupid and not a real option. It is a real option. And it works.



I really really wish that I could convince Web Developers that not every website needs to be a web app.

I keep bringing up that I don't want JS to execute random code, even if it's sandboxed, it's mostly unnecessary, and I always get the same sort of replies.

Everyone calls me out of touch, I'm downvoted to oblivion, everyone suggests that I'm a unique case and everyone wants JS, they say that they don't want fragmentation and want life to be easier for them.

I get it, their paycheck literally depends on them using JS, and it adds a lot of flexibility.

I'm going to make the additional, controversial, guess that most web-developers don't really know what they're doing either; I would surmise that they lean on frameworks and if those frameworks are ever under threat (from people like me requesting progressive enhancement) then they need to defend the frameworks to defend themselves.


Of all things that can be complained about, JS sandboxing is actually really hardened. I think the issue is the cross site wild west we have today. Cookies, requests etc go all over the place, when it should be isolated to same origin unless specific interactions are needed (and then they should probably be user facing and blockable).

I think the real answer to why this hasn't been toothfully patched yet is ads and the billions of dollars behind it. Not JS developers.
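
For what it's worth, the same-origin isolation described above already has a deployable approximation in the Content-Security-Policy response header; a sketch (directive values illustrative):

    Content-Security-Policy: default-src 'self'; script-src 'self'; connect-src 'self'

With that header set, the browser refuses to load scripts from, or send fetch/XHR requests to, any origin other than the page's own.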


I was talking on a podcast yesterday about a wiki project I’ve been involved in and one of the things that was eye opening for the audience was just how insane those requests can be. We literally pulled up a page that showed a single wiki page making >500 ad server requests.

https://pagexray.fouanalytics.com/q/pathofexile.fandom.com%2...


What in the actual f, that is an insane amount, and really well visualized too! Why? What was the podcast?


It’s all about the history of the wiki and how we went about forking it from Fandom.

https://m.youtube.com/watch?v=kqENNDKd2nw


CDNs were a huge mistake, they are the slow point of many websites and people don't even second guess importing fonts, images and libraries from half a dozen different providers. The pros are grossly overstated and the cons are many.


> Of all things that can be complained about, JS sandboxing is actually really hardened.

It's still very useful for people who want to run arbitrary code on your CPU. Both AMD and Intel have severe speculative-execution CPU flaws (Spectre and Meltdown being the best known) that can be exploited to extract credentials, encryption keys, session keys, etc. from your computer - even information living in other applications or hardware-level VMs.


"toothfully patched"?


A "web app" can be a web app without including crap like this. Policies like these are usually set by sales and management.


Sales and management aren't going anywhere though, so we can expect web apps to continue to be user-hostile and better off avoided by anyone who doesn't want their privacy and security compromised.


I agree with you. As a user, I want web developers to have as little freedom as possible. The freedom they enjoy today implies the potential for abuse, and boy do they abuse it. Tracking and fingerprinting everywhere, anti-patterns everywhere, sites that are painful to use because they use javascript for everything and even the back button doesn't work properly, sites that don't even render at all without javascript enabled.

This actually pushed me to scrape some websites. I essentially reverse engineered the site and created my own custom client for them just to ensure only my code ever runs. I don't have time or energy to do this for every site though...


To be fair, users expect websites to do a lot now. That is because brands pushed so much for interactivity. I do agree with you, but if we take away JS we would still want to enhance the web with interactivity.

I do recall building completely JS-free eCommerce applications, and in all honesty they were lightning fast compared to today's SPAs while still being just as complex as applications. But we can do much better for the user experience of complex applications with a sprinkle of interactivity.


If web developers had freedom, there would be a minimal amount of tracking, fingerprinting, or anti-patterns. If you want to assign blame, you have to look higher up the food chain.


And it’ll get worse with wasm.


Why will wasm make 'it' (security?) worse?


I was thinking that you could embed the third-party code in your wasm file, but you can do that in JS too, so I'm also curious what GP was thinking. Since everything goes through the browser anyway, you could just block requests to certain domains, as we do now.


> Everyone calls me out of touch, I'm downvoted to oblivion, everyone suggests that I'm a unique case and everyone wants JS, they say that they don't want fragmentation and want life to be easier for them.

If they sat down with real users, then they'd know that most people get very frustrated when web apps make their phones slow, which happens relatively often.

If they cared, they'd also be frustrated, because they have the background knowledge to understand just how unnecessary such poor user experiences are most of the time. I've seen simple mailing address forms slow phones down. That doesn't need to happen.

> I really really wish that I could convince Web Developers that not every website needs to be a web app.

I'd say it's more that not every website needs to be written in JavaScript from the bottom up. There are web apps that I use that are tasteful and reserved with their uses of JavaScript, if they even use it at all.

You don't need to make web apps using dynamic code for every little thing, like even building static HTML elements with JavaScript.

And not everything needs to be a SPA, nor does everything need to be built using nine layers of frameworks and abstractions.


Blaming web developers for this is so out of touch... since when are developers in charge of product? It's like blaming a construction worker for the poor architecture of a building. Come on.


If you are asked to make a news website and you reach for multiple JS frameworks and heavy analytics engines, that is not the product team's fault. Take some responsibility.


Terrible argument. You're talking about marginal behavior, not average. Sure, there's a % that will do that, but that applies to everything.


> I keep bringing up that I don't want JS to execute random code, even if it's sandboxed, it's mostly unnecessary, and I always get the same sort of replies.

I mean, having done both front-end and back-end, I gotta say the way a SPA framework lets you re-use front-end code is probably why such frameworks are selling like hot cakes. I legit feel like someone needs to spec out a back-end layout that is generic and implementable in any language, but supports some of the same concepts, like containers and such.


What options would you suggest for web developers instead? How can you develop a convenient user experience without js? Even the most simple things require it, and will likely use a third-party framework.


As someone who blocks JS by default: don't depend on remotely hosted JS. If I go to example.com and I think it seems trustworthy, I might allow scripts to run from that domain and that domain only. If the site doesn't work after that, I'll just close the tab and move on. The simple stuff doesn't need bloated frameworks.

Don't obfuscate your JS either if you can help it. If I'm not sure whether I should trust your site, I'll be looking over your JS to see what it's doing, but I'm not spending more time than it takes to glance over it either.

A convenient user experience, to me, is one that fails gracefully without JS. Displaying text/images shouldn't need JS at all. I'll accept that it won't be as fancy without it, and I won't hold it against you for having JS as a fancy default, but a site should still be mostly functional without JS. Content, at least, should remain accessible.


Displaying pictures is one thing. But how are you going to create a dashboard that updates at short intervals without JavaScript?

Let's say a website serves React from its own domain, so no 3rd party references are used (in which case it would be very easy for them to serve you a modified version of the framework); they'd rightfully want to save bandwidth and minify it! Which makes obfuscation a byproduct.

I understand where you're coming from and I do a fair share of blocking myself, but I also see why things are the way they are if you want to develop an online product.


For some things you really can't help but depend on javascript. I don't really mind it if it's something ambitious enough; it's the display of basic content breaking that bugs me most.

A dashboard may still work if you can just display the current data and add a note that because JS is disabled the page must be refreshed manually to pull updated values. You don't even have to throw in a refresh button since the browser's already got one. If I go to a website without JS enabled I expect things to take a bit more work on my part.
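
Plain HTML can even automate the periodic refresh without any JS; a minimal sketch:

    <!-- The server renders the current values into static HTML; this tag
         makes the browser re-fetch the page every 30 seconds -->
    <meta http-equiv="refresh" content="30">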

Minified JS is a problem I run into a lot. Most of the time, though, I err on the side of caution and just move on. There are so many interesting things worthy of my attention that if a site full of inscrutable javascript is just a random article, blog post, or something shared as a general "hey, look at this cool/fun/impressive site," it can easily be abandoned for something less troublesome.

For sites I actually do need to access for some reason I can take the time to analyze it properly, but most of the time those kinds of sites are already familiar/trustworthy, and I've got other options too like firing up a less restricted browser in a VM.

I certainly don't expect developers to give up using JavaScript for the sake of the few users like me, but the better a site does at keeping content accessible without it the less likely I am to just move on to the next tab.


I was converted once I discovered that amazon.com still works with javascript disabled. You just do things the old fashioned way, form submissions that change the session state on the server and generate the html for the next page. I use "redmine" for project development / issue tracking and that works without javascript too.

The different form input elements do a lot of work for you, and are very platform-agnostic, unlike javascript, which will fail silently as soon as someone is on an out-of-date android/ios version.
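
A minimal sketch of that old-fashioned pattern (paths and field names hypothetical):

    <form action="/cart/add" method="post">
      <input type="hidden" name="item_id" value="B000123">
      <label>Qty: <input type="number" name="qty" value="1" min="1"></label>
      <button type="submit">Add to cart</button>
    </form>
    <!-- The server updates the session and responds with the HTML for
         the next page; no JavaScript anywhere in the flow. -->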


We had lots of convenient user interfaces on the web without JS.

"The most simple things" absolutely do not require it. Simple things can be done in pure HTML. That's what the web has always fundamentally been about.

You should TRY building a site without JS. Just try it! You'll be amazed.


> Even the most simple things require it

The “simple things” like displaying a document don’t need JS. Why is every knowledge base system built on JS when it is literally just showing a document? Or discussion forums, when you’re just reading them?


CSS works just fine without any javascript at all.

Here's a really obvious example: https://nextspaceflight.com/

Every single link on this page requires javascript to work, when they could actually just be proper href links.


Do you have some examples of things that require JS to do?


Upvoting a comment on Hacker News.


Actually, that doesn't require javascript.

Upvoting is done via a link.

https://news.ycombinator.com/vote?id={id}&how=up&auth={auth}



While true, you can use HN just fine with JS disabled.


I could have said chat clients, games, maps or other things that absolutely require JavaScript. I said upvote on HN to illustrate that even the most basic site still needs some JS to create a sane experience. Having the page reload every time you upvote a comment is not a good alternative.
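
The two positions can be reconciled with progressive enhancement: keep the plain vote link, and intercept it only when JS happens to be available. A sketch built on the link format quoted above (auth token is a placeholder, as in the original):

    <a href="/vote?id=42&how=up&auth=TOKEN" class="vote">upvote</a>
    <script>
      // With JS: vote in place, no reload. Without JS: the plain link
      // still works the old-fashioned way via a full page load.
      document.querySelectorAll("a.vote").forEach((a) => {
        a.addEventListener("click", (e) => {
          e.preventDefault();
          fetch(a.href); // fire the same request without navigating
        });
      });
    </script>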


It's been working for me for years just fine without JS. As for chat clients and games, those things absolutely do not belong in the slow pile of abstractions that is a web browser. They should be native applications and they are on my computers.

But you're right about maps. They're extremely well suited to being viewed in a browser with javascript.


What of the most simple of things require JavaScript or a third-party framework?

Modern CSS can handle a lot on the visual side, and modern HTML allows you to do form validation, dynamic lazy-loading and resizing of images, dialogs, collapsible elements, etc without JavaScript.
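
A few of those JS-free features in one place (illustrative):

    <input type="email" required>                     <!-- built-in form validation -->
    <img src="photo.jpg" loading="lazy" alt="photo">  <!-- native lazy-loading -->
    <details>
      <summary>More info</summary>
      Collapsible content, no script required.
    </details>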


When I typed that I didn't mean showing a picture or serving a text file. Anything that's not a static page will probably require JavaScript.


I’m upvoting you. Congrats.


Look, I do run uBO. It doesn't even completely block JS, yet it still routinely breaks very basic pages, and I have to use a lot of webdev and industry knowledge to even try to unbreak them without entirely turning off blocking. So no, for most people who want to use mainstream websites, just turning off JS is not a real option. That this is a dystopic scenario does not change the practical reality.


I never have to turn off uBO to get pages to work. Am I just visiting really boring sites, or is there a default setting I should change to be more strict and break more pages?


I don't know, I guess it could be that turning on the "I know what I'm doing" matrix features made it more strict? It seems like every e-commerce site and most blog/news sites want to host all their images offsite (mostly CDNs I assume) in some weird JS-mediated way. Anything with a captcha breaks. Any purchase checkout process that involves other domains, which is almost all of them, can be counted on to break. If someone can tell me I'm just misunderstanding what's going on, I'd love to spend less time messing with websites' guts and still be reasonably protected.


uBO can break sites, but in my experience doesn't to the degree that you're describing. Perhaps you should clear all settings and re-install the extension once and see if that helps?

Do you maybe have other content blocking extensions running at the same time as uBO that are causing 'clashes'?


For me uBO breaks 3D-secure, so it does in fact break every e-commerce checkout that uses it. I just disable the entire extension before I buy something, but it’s rather awkward.


That “enter just your web password here” thing? Doesn’t seem to for me. Amazon checkout seems to be breaking from something recently, but likely not uBO filters.


I'll probably give this a shot. The only settings changes I can remember are allowing things, but I could be wrong. No other content-blocker extensions, I believe.


Sometimes 'Enhanced Tracking Protection' built into Firefox can break sites too. Click on the shield in the URL bar to toggle.


In that mode you can make it much more strict, such that it loads no js at all by default, and/or loads only first party scripts by default.

I think out of the box with the default blacklist mode very few things are broken, but the trade-off there is that you're running way more code. I do imagine the default list would block this specific nonsense from tiktok, although I haven't verified that assumption.

I block all scripts initially with ubo and slowly allow what I need to make a site function (or just close tab on sites I deem unworthy of the effort). This is certainly more than you could expect from a normal user.
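
For the curious, that hard-mode setup corresponds to a few lines in uBO's "My rules" pane. This is from memory of the documented dynamic-filtering syntax, so verify against the uBO wiki:

    no-scripting: * true
    * * 3p-script block
    * * 3p-frame block

The first switch disables JS everywhere by default; the other two block third-party scripts and frames even on sites where JS is re-enabled.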


I just had this problem today when trying to book a flight at American Airlines. The purchase button would simply not work when uBO was on.


This may be daydreaming too shallowly, but I wonder if basic Javascript UI patterns could be identified by some tool. Something along the lines of a new pane in the Developer Tools that semantically identifies the roles of certain Javascript:

* This block of code keeps these UI elements synced with each other.
* This block of code sends this UI data to this server address.

Something that gives a rough and coarse data-dependency tracing of inputs and outputs. There may be techniques to defeat this, but I think some general patterns would become established; web designs could begin to shift toward compartmentalizing functionality, streamlining such a tool's ability to pare off the insignificant and safe parts of the JS and letting folks like yourself zero in on the interesting and questionable parts.


Even without that tool, HTML continues to evolve and add common “components” that people have used JavaScript to create for decades, which work without (or with minimal) JS. For example, <details> and <summary>, <dialog>, <datalist>, and more.


That’s interesting. So you’re saying that when developers offload typical UI workflows to the HTML standard (workflows now baked into certain HTML tags), there’s less innocuous JS to wade through in the first place? That’s amazing, actually, and very welcome.


That's what LibreJS does.

https://www.gnu.org/software/librejs/

It is not terribly useful.


That is behavior-based malware analysis.


I use uBO with JavaScript on and almost all the filter lists it ships with enabled. Never had any problems with web functionality.

I just tried adding tiktok.com to "My filters", but it seems like ads.tiktok.com is still accessible with that rule -- how can I block all tiktok subdomains simultaneously?
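
For reference, the standard Adblock-style hostname anchor covers a domain and all of its subdomains:

    ||tiktok.com^

That should match ads.tiktok.com, www.tiktok.com, and so on.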


Seems like automatic loading of resources (hosted by third parties) by popular web browsers is beyond question.

I access the www every day using a browser that does not auto-load resources. It would be dishonest to claim it is not useful.

I recall a brief period of time in the early www where web pages could contain "Java applets" and the browser would prompt the user if they wanted to run the code.

Web developers have become so dependent on all this control they have been given over unsuspecting computer users; how would they react to removing or reducing any of these "features"?

The "modern" web browser feels like a Trojan Horse.


Is it really a good idea to automatically send resources to a client that has not explicitly requested them? It is behaviour that takes control away from the client. (Needless to say, this behaviour is open to tremendous abuse and can be wasteful.)

Auto-fetching took this to a new level. (I always disable this behaviour, either within the client or outside it.)

Going even further, HTTP/2 server push has IMO only highlighted the questionable nature of this behaviour. Its justification is "performance", but it still removes control from the client. Will push be removed from Chrome?

https://groups.google.com/a/chromium.org/g/blink-dev/c/K3rYL...


More than feels -- it is a Trojan horse.

More and more of our lives are conducted interacting through the web. The vast majority of the tooling is focused on eye candy.

What is given up completely if we leave javascript? Or, how can we lock down browsers to stop this nonsense without completely disabling it?


Not only that, but for at least the past 8 years, give or take, your cookies and other identifying info can be matched to an offline traceable tender such as a credit card or loyalty program.

8 years ago, I could very much target viewers (no click needed) who had bought XYZ using a credit card, or, on the attribution end, know you saw a specific ad, went into the store, and bought it.

Javascript and browsing behavior added much resolution to that.

Source: Am in the advertising industry and worked on accounts in the Fortune 5.


It's a shame uMatrix is no longer actively supported because it was the silver bullet for this kind of shit.


I think uBlock Origin can do the same things as uMatrix.


Yeah, but uMatrix still works. It's kind of a finished product. It still received a security update since support ended. I use it all the time and am fascinated at the amount of third party scripts so many websites use. I always appreciate a site that loads everything from their own domain, which is rare. As for social media tracking scripts, I block them all. Google/gstatic is the only one that gets a free pass, otherwise too many sites won't work.


uBlock has all the functionality of uMatrix and more, FYI.


Except the easy GUI UX


> It's hard to believe I have to say that after the many decades of people getting it drilled into their heads "Do not open random email attachments" (...)

You need to take a step back and figure out what you're failing to understand before going into these "everyone is a fool" rants.

One of the reasons people don't understand the risks is that, by design, these risks are invisible to the end user. No one knows, not even you, how many servers are being hit when you click on a link.

You're opening emails, you're clicking links, and hundreds of requests are flying out of your browser right under your nose to God knows where. How many of them are requesting useless images you never saw? How many of them are reporting telemetry data on how you're using a website? You do not know. Why are you whining about others not knowing as well?

Privacy is a hard problem because everyone is using a system explicitly designed to transfer information around without any control or supervision. Up until now the best tool we have at our disposal is a set of laws that require companies to disclose and delete data they collect on us.

Blaming the end user for clicking links is victim blaming, and demonstrates a colossal amount of ignorance about the problem domain.


Shouldn't this have been rendered impossible by Apple's and Google's crackdowns on third-party cookies? Whether it's a 1x1 tracking pixel or full-blown Javascript, Apple's ITP protection would seem to render it unable to track people across websites.

The key word is SEEM. I have no idea what to make of all the contradictory news coming out of all the camps. I posted a question on StackOverflow with a significant bounty that expires in 5 hours, and so far no one has even shown up to answer it:

https://stackoverflow.com/questions/73794780/what-exactly-do...


I can shed some light on this, maybe; I don't have a SO account. You're completely right about "SEEM". It does enough that it significantly lowers the resolution of the data and the ability of these systems to infer your behaviors/persona. It will force more of the ad ecosystem toward server-to-server data pass-through: https://developers.google.com/tag-platform/tag-manager/serve...

I mention it in a post above, but data brokers have existed for a decade, and they don't really care about any of this. Your email/phone/credit card + purchase behavior, for instance, is likely sent to 3rd parties (as md5/sha hashed values). It boils down to sample resolution and sample size. Javascript made it really easy for literally every website to collect browse behaviors. ITP makes the skill ceiling / investment needed to collect this data much higher.
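
A sketch of the hashed-identifier matching mentioned above (Node.js; values illustrative):

    const crypto = require("crypto");

    // Identifiers are normalized, then hashed, before being shared
    const email = "jane.doe@example.com".trim().toLowerCase();
    const hashed = crypto.createHash("sha256").update(email).digest("hex");

    // A broker joins `hashed` against another party's hashed list,
    // linking online profiles to offline purchase records.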


How would third parties track you across websites now? Given any skill level?

I can think of only one: convincing websites to add a CNAME to point subdomains to their servers.
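
A sketch of that CNAME arrangement as a DNS record (names hypothetical):

    ; metrics.shop.example looks first-party to the browser,
    ; but resolves to the tracker's servers.
    metrics.shop.example.  300  IN  CNAME  collect.tracker.example.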


I don't work in the ad space, but it's my understanding that that's where things are going.

See here for Facebook, but I'd expect other providers to do the same.

https://developers.facebook.com/docs/marketing-api/conversio...


ITP doesn't conceal the IP & user-agent, which is enough to track someone reliably since most connections still have a persistent-enough IP address. It only breaks down if you're behind NAT (sharing the public IP with lots of other people) and have a very common user-agent such as the one of the latest version of a mainstream browser.

The only way to defeat this is to 1) have all browsers standardize on a single user-agent that never changes going forward, and 2) use per-origin VPNs so that the public IP seen by each origin is different.
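
A minimal sketch of that server-side correlation (a hypothetical Express handler, not any real tracker's code):

    const express = require("express");
    const crypto = require("crypto");
    const app = express();
    app.set("trust proxy", true); // make req.ip reflect the real client IP

    app.get("/pixel.gif", (req, res) => {
      // IP + user-agent alone yield a fingerprint stable enough to
      // correlate visits across sites, with no cookie ever set.
      const fingerprint = crypto
        .createHash("sha256")
        .update(req.ip + "|" + req.headers["user-agent"])
        .digest("hex");
      console.log("seen:", fingerprint);
      res.sendStatus(204);
    });

    app.listen(8080);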


ITP is a Safari feature, and most people using it will have the latest version, so they will blend into the Apple user crowd that way. And Private Relay is pretty much a VPN.


CNAME wouldn’t work because then the cookie is bound to that domain. So the identity doesn’t travel from website to website.


Well, yes and no.

First of all, the subdomain can redirect to google.com and back, loading it in a first-party context, and then redirect to your main domain page. I guess Google here would be a “second party”. https://learn.microsoft.com/en-us/azure/active-directory/dev...

The second way is that the subdomain can load an iframe from google.com; the iframe will send a postMessage to the enclosing page, which will then send a request to the server to set a cookie.

As long as you logged in ONCE to Google (let's say in a popup, or maybe using ITP's click-to-login), it will store the session cookie this way. And after that it can keep a cookie around for 10 years and track this user across all the subdomains where he signed in once.
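
A sketch of that second flow (origins and names hypothetical):

    <!-- Inside the iframe served from www.google.com: -->
    <script>
      parent.postMessage({ uid: "abc123" }, "https://shop.example");
    </script>

    <!-- On the enclosing page at shop.example: -->
    <script>
      window.addEventListener("message", (e) => {
        if (e.origin !== "https://www.google.com") return; // verify sender
        // Forward the id to our own server, which sets a long-lived
        // first-party cookie carrying it.
        fetch("/link?uid=" + encodeURIComponent(e.data.uid));
      });
    </script>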


All you have to do to stop cross-site tracking is to disable third-party cookies in your browser. That's really it. This misfeature should've never been invented in the first place.


> disable unsafe remote code execution

... cue the WHATWG cronies shrieking "but but but but you will BREAK THE WEB"


Non-techie here. Can you explain the difference, please?


A web browser is an application that safely executes untrusted remote code. It sounds like you don’t want to use a web browser.


Is it really safe when a company - such as TikTok - is executing arbitrary code on properties they don’t own or operate, without user consent?

That doesn’t seem safe to me.


Then I'm sure you wouldn't mind running this cryptominer in your browser for me. It's safe.



