this post was submitted on 22 Aug 2024
19 points (100.0% liked)
OK, I get it, you are completely out of the loop here.
You do not grasp the idea of NoScript and other JS-filtering extensions. This is not about server code; all your arguments are baseless here.
By the way, JS refers to JavaScript, not NodeJS.
Anyway, I got your whole company/business talk about "keeping the service available, secure, performant" and the "GDPR [...] bankrupting fine"... yeah, lemmy.world.
I'm a full-stack software developer working in the financial sector, and their statement is factual.
Companies will never want to take on liability that has the potential to bankrupt them. It is in their best interest not to reveal the versions of the libraries they are using, as some versions may have publicly known vulnerabilities, and it would make it incredibly easy for attackers to build an exploit chain if they knew the exact versions in use.
Securing client code is just as important as securing server code: you don't want to expose your users to potential XSS attacks that could change how the page is displayed or, worse, leak their credentials to a third party. If this happened in the EU or some parts of Canada, and it was found that the company reduced its threat model "for the sake of openness", it would likely be fined into bankruptcy or forced to leave the market.
Unfortunately, this is one of those cases where your interests and ethics will never be aligned with those of service owners as they are held to a certain standard by privacy laws and other regulations.
No need to get aggravated; I completely grasp it. You've possibly misunderstood, or not entirely read, my comment if that's your takeaway.
I'm not talking about server code specifically; I'm going through the stages between the source code repo(s) and what your browser ends up receiving when you request a site.
Node.js is relevant here because it runs nearly all major JS bundlers (webpack, Vite, etc.), which produce the code that ultimately runs in the browser on most websites you use. In a mathematical sense, the full set of dependencies for that process is part of the input to the function that outputs the JS bundle(s).
I'm not really sure what you mean with that last part. Anyone hosting something on the internet has to care about this stuff, not just businesses. The GDPR can target individuals just as easily as for-profit companies; it's about the safety of the data, not who holds it. I assume you would not want to go personally bankrupt through deliberate neglect of security? Similarly, if your website doesn't meet the performance NFRs that search engines set, no one will ever find it in search results because it'll be down on page 100. You will not be visiting websites that don't care about this stuff.
Either way, all of that is wider reasoning for the main point, which we're getting away from a bit, so I'll try to summarise as best I can:
Basically, unless you intend your idea to work only on entirely open source websites (which make up a tiny percentage of the web), you're going to have to contend with these JS bundles, which, as I've gone into, is basically an insurmountable task because you don't have the complete set of inputs.
If you do intend it to work only with those completely open source websites, then crack on, I guess. There's still what looks to me like a crazy amount of things to figure out in order to create a filter that won't work with nearly all web traffic, but if that's still worth it to you, then don't let me convince you otherwise.
Edit: typo