The scriptless web

Is there a search engine that will filter out any results that have cookies or js in them? More than filter out, it would base its results on that subset of websites only. I.e., it would NOT pagerank the currently available sites, and then apply the filter, but rather it would apply the filter first, and then pagerank THOSE sites.

Wouldn't the subset of the web consisting of no-js sites essentially be the late-90s/early-2000s web? I'm not talking about js being bad or good, but in a "the medium is the message" kind of way I'm wondering whether the culture of those sites without js is indicative of a higher quality of content for its users. In other words, does no-js imply a certain culture (beyond just "fucking autistic weeb")?

Attached: aesthetic3.gif (640x400, 31.98K)


I'm late to the party. What's gained by making a website use only form submits for input and template rendering for output? What's lost by dumping javascript onto web pages?

infinite security,
negative security

I think I'm still missing something. What exactly causes dropping some jQuery in to manipulate the DOM to negatively impact security? Do you mean it opens the potential for the site to introduce malicious code, or that JavaScript is inherently malicious through built-in malicious code?

Just look at the kind of content and corporations that exist on jsless sites.

wiby.me/
textise.net/

why are you not posting on nanochan user

I'm going to post this here because this probably doesn't deserve its own thread, and it's kinda related to what OP is talking about. I'm working on a personal website at the moment, and it's just a simple static site. I'm using Pure as my CSS framework of choice, and I'd like to use no JS at all.

I want the site to be responsive - no, I'm not one of those 'mobile first' fags, but the reality is that a lot of people will access it through their phone, so the layout needs to scale well at all resolutions. Most of the page already does this. The problem is, the horizontal menu on top gets cut off when I resize the window to look like a smartphone. Is there a solution to this that doesn't require the use of JS?

Why not roll your own CSS? And something like width: 90% would have it fit every resolution.
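Something like this would do it without JS (a sketch only — the `.menu` class is made up, not one of Pure's):

```css
/* Let menu items wrap instead of being clipped, and stack them
   vertically on phone-sized viewports. Class name is illustrative. */
.menu {
  display: flex;
  flex-wrap: wrap; /* items drop to the next line instead of getting cut off */
}

@media (max-width: 480px) {
  .menu {
    flex-direction: column; /* stack the links on narrow screens */
  }
}
```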

Cancer. Write your own, pajeet.

I'll try to do that, thanks. I'm not a programmer or even a 'tech person', that's why I decided to use a framework that takes care of the basics so I can just focus on making the site look 'personal'.

No.

JS was used in the 90's, but the web wasn't oversaturated with it like today, and there were often Lynx-friendly versions of pages, or sometimes even "no frames" versions. But even when the webmaster went full retard and used only JS for navigation, it was still possible to look at the page source and find the HTML links, so you could still navigate those sites.
Not sure why you're lumping cookies in there. Those are another thing altogether, and even Lynx supports cookies (but not JS, frames, CSS).

Attached: Cute Website.png (921x646, 162.87K)

Cancer. Reinvent the wheel again, no-project.

Your system is shit.

Attached: pepe revolution.jpg (776x1002, 159.93K)

What's the point of this post? It's a PERSONAL website, so the CSS should also be PERSONAL. Unless you want your site to look like just every site out there.

never gonna make it

JavaScript is mostly used for things which could be done easily through server-side templating, slowing down your browsing experience. There are legitimate use cases for JS, but they should always involve improving your existing HTML, not replacing it. Look at twitter for a particularly bad example of JS. You don't need client-side scripting to display 140 character messages and maybe a couple images.
Also security/privacy, but those concerns pale in comparison to how fucking slow JS/DOM manipulation is

...

Which could be implemented in what, 20 lines of JS? Auto-updating doesn't excuse the monstrosity that is twitter.com
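Something in this ballpark — a sketch only; the JSON endpoint and the post shape are made-up assumptions, not any real API:

```javascript
// Pure helper: which fetched posts are newer than what we already show?
function newerPosts(existing, fetched) {
  const latest = existing.length ? existing[existing.length - 1].id : 0;
  return fetched.filter((p) => p.id > latest);
}

// Poll the (hypothetical) endpoint and append anything new to the list.
function startAutoUpdate(listEl, url, intervalMs = 10000) {
  const posts = [];
  return setInterval(async () => {
    const fetched = await (await fetch(url)).json();
    for (const p of newerPosts(posts, fetched)) {
      const li = document.createElement("li");
      li.textContent = p.text;
      listEl.appendChild(li);
      posts.push(p);
    }
  }, intervalMs);
}
```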

Auto-update should be optional anyway, at user's choice. Else he can just hit refresh, if he wants to use browser without JS.

that's dumb
you might as well devolve the web to a list of text files with ctrl - f as a search engine.
not everything is autistic, backwards, and slow as fuck in tech as the rest of (((UN*X))) shitters

What you're suggesting is more like gopher than the web. There is a middle ground between your extreme and the other extreme that exists today. Basically your post is a straight up textbook fallacy of the excluded middle.

It's literally an unsolvable problem to determine if a website has scripts on it. Ignoring the fact that the content can change at any time (so it may have scripts the next minute after it gets indexed), which isn't even a theoretical problem, the syntax of HTML/JS/CSS is completely arbitrary and changes every time the big 3/4 browsers update every week.
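The best an indexer could do is a crude best-effort heuristic over whatever HTML it happened to fetch at crawl time, something like:

```javascript
// Heuristic only: looks for obvious signs of scripting in fetched HTML.
// It can be fooled in both directions, which is exactly the point above.
function looksScripted(html) {
  return (
    /<script[\s>]/i.test(html) ||  // <script> tags
    /\son[a-z]+\s*=/i.test(html)   // inline handlers like onclick=
  );
}
```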

You're best off going with something like wiby.me

Absolutely.


It's not a matter of what you want, it's a matter of what the user wants. Outside of the consumer-whore West, the user wants every website to look exactly the same. So for example he can tune his CRT monitor down to a nice 40cd/m^2 at night and read through 20 different websites without some bullshit suddenly showing a bright white screen and making the user uncomfortable. Even something as basic as not changing between 40 different retarded fonts while researching a topic...


shut the fuck up you fucking oblivious muppet. twatter has nothing to do with basic acceptable web design. remember when graceful degradation was a meme? now they don't even try.


your bait is too obvious because you call UNIX shit slow compared to ----the web----.


>>>/HN/

/thread

retard

What if you set your UserAgent string to google? :*)

Found the silicon valley fuckwit

You're actually fucking retarded.

...

...

absolutely freetarded

Try using your own websites on anything that isn't a developer workstation some time. Most people are poor and have shitty computers

I build extremely lean websites that run well on ancient hardware. This "JavaScript is slow" meme is a retarded fucking meme. Bad design is bad design. JavaScript isn't evil.

I'm pretty sure everyone thinks that. And maybe javascript isn't slow, but DOM manipulation certainly is

Alright, my dude. You're smarter than entire development teams at billion dollar enterprises.

Thank you. Please spread my gospel so I can browse the web on my chromebook once again

What websites are you visiting that a Chromebook can't handle? Like what? Modern JS frameworks are way lighter than shit like jQuery. Have you used Mithril? It's crazy lightweight.

I think it does. While there are a healthy number of Zig Forums types doing this, most people who don't have scripts on their page are low-tech types who care more about the content than the presentation.
However, ever since the coming of the GDPR, I've noticed a distinct trend regarding websites that put up cookie warnings: they are all completely worthless. If ever you see a website that demands you check your cookie settings before continuing, this tells you they care about tracking more than both content and presentation. Before the EU started its shenanigans, a website that set tracking cookies could defend itself by saying it didn't know better. These days there is no excuse.

There's an addon I saw called "I don't care about cookies". Presumably, this addon contains a list of all websites with cookie warnings. You could steal the list, and make a new addon that marks or removes every link to one of the listed sites.
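The marking part is trivial once you have the list — a sketch, with a placeholder domain list since I haven't actually extracted the addon's:

```javascript
// Placeholder blocklist; the real one would come from the addon's site list.
const flaggedDomains = new Set(["cookiebanner.example", "tracker.example"]);

// Does this link point at a domain known to nag about cookies?
function isFlagged(href) {
  try {
    return flaggedDomains.has(new URL(href).hostname);
  } catch {
    return false; // relative or malformed URLs are left alone
  }
}
```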

Nice try Eich

Perfect. 10/10
I guess I'm no longer a weeb, but a wiib?

Good thoughts all around.

But why does it? What is it about jslessness that makes the content feel so much more... personal, I guess?

Attached: whatwhyamihere.gif (400x300, 1.92M)

no JS twitter works on my machine, you retarded nigger

The modern web is a platform for GUI applications made with tools unfit for the job. HTML + CSS + JavaScript is a web analogue of GTK+ or Qt. So far I haven't seen a single suggestion to fix the web as a user interface framework; all suggestions usually come down to "LMAO! Just use white pages with text!" You might argue that we're using the web wrong and that every modern site should just be a separate application or something along those lines, but that's a whole other topic.

It doesn't increase the server load, unless you're using dynamic templates for every request (totally retarded idea except for very small traffic sites). The way it's normally done is your tool builds static html from the templates, and you serve the static html from a reverse proxy. Then your app server handles only requests for POST forms and such.
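The build step is nothing fancy — a minimal sketch of the idea, with a made-up `{{name}}` placeholder syntax:

```javascript
// Render a template once at build time; the output is plain static HTML
// that any reverse proxy can serve without touching the app server.
function render(template, data) {
  return template.replace(/\{\{(\w+)\}\}/g, (_, key) => data[key] ?? "");
}

const page = render("<h1>{{title}}</h1><p>{{body}}</p>", {
  title: "Hello",
  body: "Static pages are fast.",
});
// `page` would now be written to disk and served as-is.
```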

It's 7kb.

An SPA that has to make one 300kb request instead of a traditional multi-page application that has to make 100 40kb requests (or more depending on how long the user sticks around)

hmmmm I really wonder which one will have higher infrastructure costs that can (AND SHOULD) be passed onto the user
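For what it's worth, the back-of-envelope totals with those hypothetical numbers:

```javascript
// Using the post's made-up figures: one 300kb bundle vs 100 full page loads.
const spaTotalKb = 300;       // single SPA bundle
const mpaTotalKb = 40 * 100;  // 100 pages at 40kb each
console.log(spaTotalKb, mpaTotalKb); // prints 300 4000
```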

Pretty much this. The niggers behind the big tech companies seem to have forgotten you can just write HTML. You don't have to generate it on every page load in something as slow as javascript. It's actually retarded to the point of being funny when you think about what they're actually doing lol

He doesn't mean small bundle size. He means it eats your fucking CPU and RAM.

Webpages are fucking documents, not """"apps"""".
Why the fuck do you need that?

I dunno. "fucking autistic weeb" pretty much describes me and my no-js blog.

I don't want to hijack the thread with my pajeetness, but wtf even is a web/CSS/HTML framework?

Implying the user visited 100 different pages? And what if they leave after the first visit because they realize your site is a bloated piece of garbage? Then you just wasted everyone's time and bandwidth, didn't you nigger. By all means, cache js and css so that they don't need to be re-served. Don't pretend that bloating your page is saving anyone though.

Keep doing whatever you're doing user

I can't imagine being so stupid as to think that you know better than the billion dollar web industry because "LOL PROGRAMMING BAD, MARKUP GOOD"

dumby dumb retard

The absolute state of (You)

He's right though. A simple html website is safer from hackers, and also doesn't force the user to have bigass bloated browser and botnet hardware to view it.

The absolute state of Zig Forums

...

they get their billions of dollars from shipping ads and analytics, not from making half decent pages. bad bait.

It's the C++ of the web. People poking around a festering, bloated corpse attached to a web browser. Even more people maintaining the few means of making it work. And because all of it is so fucking overgrown and cryptic, the job pays well and maintains said value. It's not better by any fucking means of the word, but it's great at making money, government bureaucracy-style. So, unless you are making money from it, it's objectively bad from your point of view.

There's a reason it's done, and there's a reason none of you have jobs.

Attached: c2e58b6081dd4ebb59f5a4c55720fe1d850e8c1ed6c7a83b8f4d25ab85cc4866.jpg (600x367, 38.74K)

Who are you quoting?

Anyone know a search engine that filters out websites with JS?

This site would be better and more responsive if it was programmed in javascript.
Prove me wrong.

Hahahahahahahaha How The Fuck Is This Question Real Hahahaha Nigga Just Use Lynx Like Nigga Or Links2 -g Haha.

Saved.

The site would be more responsive if it decoupled data processing and presentation. The trick is to be knowledgeable about the protocol, HTTP, and to understand that you should use static site generation.

JS isn't bad unto itself. It's the current state of affairs that is bad, that is:
>(((browsers))) come up with some optimizations that make this amount of JS bearable

)))we((( need a browser that is compatible with JS, but not in a way that endorses its overuse. I.e. it should implement only the very core of js, no bullshit like webrtc or webgl, and definitely no JIT compilation, only interpretation. This way we could enable JS on (((our))) site without having to trust the admin that much.

I use wiby for that, on a suitable browser (it's the perfect search engine for Links and w3m). Results like that tend to be more interesting, including a lot of old, forgotten websites. Those are always interesting to read. Their designs tend to be much better as well. Honestly, all I need is shit to read. Text and maybe some images. You can make it look pretty using html and it will still load instantly because it's old design running on new hardware, so of course it will be fast. Javascript just makes everything look cold and dead as far as I'm concerned, and it's difficult to say why. I have no use for it, it's slow and I despise phones, so I avoid it. Hell, I'd say that repelling phone users is always a good thing, I want them as far away from me as possible.

Well, a lot of these websites belong to people that have been around since then in the first place. You can also find websites that have been up for around 20 years. Those are always fun.

Normalfags are the ones that ruined the internet (and everything else). If anything, autistic weebs are what you should be looking for.

No seriously I hear about people using django and shit, but what is it? Do you write shitty python that spits out html or what?

because websites that have a ton of JS are corporate-to-consumer spam generated by Markov chains

because your IP wasn't placed on a blacklist for no reason, you fucking degenerate.

no website in existence has ever offloaded to the client, you dick fuck. they use JS for "UX enhancements". if they were concerned over performance they'd be generating the response with hardcoded assembly. the shit PLs they use are so slow it doesn't matter whether you substitute some templates locally or remotely. oh yeah and they wouldn't be using XML or JSON or whatever meme-ass format they use this week

you put a bunch of retarded "components" together and find out it's impossible to solve your problem in terms of them. DAE remember when even webshitters admitted that django is only good for implementing a blog?

imagine thinking JS is the only security issue on the web

sure, JS is *a* security concern
but it's not the only security concern

Or the biggest security threat: The US Federal Government.

Sure, all network clients can have exploits. And bigass stuff like Firefox, Chrome, etc. are bound to have tons more than the minimal ones, even without the JS.
And if you ignore all the stupid shit the web has accumulated since the 90's, it becomes realistic to just write your own browser in Ada, Forth, or whatever. That also means you can use any OS and hardware.

Attached: Schneider_CPC6128_with_green_monitor_GT65,_Wikipedia_logo.jpg (2392x3216, 3.16M)

correct, but even not considering security, JS sucks ass and gives cancer to everything it touches

are you retarded?
code execution is a huge deal
holy shit, people on this board are so stupid

not really
fewer people find security issues in small software projects that nobody uses, but that doesn't mean it's safer
it's like the BSD people who get confused and think fewer KNOWN/REPORTED security issues (due to fewer people using and auditing it) means there are fewer OVERALL security issues
these two categories are entirely separate

This isn't a BSD vs. Linux issue. A small codebase is easier for people to actually audit, refactor, or re-implement compared to a huge codebase. If you have any doubt, just look at systemd vs. basically any other init. And you can just write something entirely from scratch with a better/safer design, since the scope is smaller. With something that supports all the (((modern web))) you need a huge team of code shitters, so that doesn't happen. And that's how you end up with only a tiny number of browser "families" like Firefox, Chrome, and Webkit. Then it's easier to write exploits that target the 99% of people who are running those browsers.

JS is a *huge* security concern.

One of the applications of exploiting polyglot files inside the browser specifically is embedding HTML/JS data into a picture file with steganography. You still rely on something to execute code inside the browser itself.
Can't argue there, you can't just point the finger at JS and say that this is the only problem with webshit. But, rather than sitting with thumbs up your ass, starting with some mitigation is better.
How does this relate to browsers and JS specifically? The payload is delivered through MMS and exploits a shared Android library that's responsible for playing multimedia files in any program.

On the whole, I agree that it's not the only security concern. But I didn't see any user say that in the thread either. And, as a user, it's a good starting point for trying to lessen the attack surface (along with obfuscating your browser agent, manipulating canvas data, etc.) Going full autism, I'd rather have a browser that is just a simple restricted HTML/CSS engine (meaning newer retarded tags are not parsed) that has bindings to applications for viewing/playing multimedia and an API for scripting client-side so you can use any language you want.

>The continue button validates your browser via (((reCAPTCHA)))
No thanks

Works fine for me in simple browser without JS.

almost impossible. influencing normal web browsers would be better.

bruh, people on 4chan were tricked into renaming .png files to .hta (people really are that stupid), and they got pwned because they were polyglot files
it was called the cornelia virus and it's why 4chan has captcha in the first place
if you use a unique browser agent, you are easier to track, so it's better to use a generic/common one, not an "obfuscated" one
I'm a web developer and you'd be surprised at how much layout stuff depends on JS for modern frontend development -- like it or not, that's just how it is
media players/viewers have security issues too, you know
here are just a couple examples:
cvedetails.com/vulnerability-list/vendor_id-26/product_id-1032/Microsoft-Windows-Media-Player.html
cvedetails.com/vulnerability-list/vendor_id-6652/Media-Player-Classic.html
cvedetails.com/vulnerability-list/vendor_id-5842/product_id-9978/Videolan-Vlc-Media-Player.html
cvedetails.com/product/497/Adobe-Acrobat-Reader.html?vendor_id=53
cvedetails.com/vulnerability-list/vendor_id-317/Irfanview.html
cvedetails.com/vulnerability-list/vendor_id-49/product_id-10916/Apple-Mac-Os-X-Preview.app.html
you can literally get arbitrary code execution if you make a malicious video file or PDF

While a limited subset of HTML+CSS (no embedded video/audio, no animations, no pseudo-classes, etc.) goes even further beyond, I'd think that merely omitting JS would hugely simplify a prospective browser's DOM (or DOM analogue) and all related interfaces, renderers and all.

and it'd work with like 2% of modern websites

He's not talking about tracking though. He's just talking about obfuscating the browser so that probing it shows something altogether different than the reality, so that any automated exploits that are done on it will use the wrong target, and will thus fail. I don't know if this has any merit though, compared to simply reducing your attack surface to the bare minimum, by using a tiny browser with almost no features.
As for the rest of your post, do you realize you're in a thead discussing the SCRIPTLESS web? We don't give a shit how much JS is in normie websites like the ones you shit out at work. Personally I use Lynx and Links 99% of the time, and thus can't browse typical normie sites and like it that way. I only use Firefox when I need to do online banking or some other administrative task, but never for casual browsing. Not only it's far too dangerous, but I don't even like the modern web at all, and don't want to have anything to do with it.
As for videos, PDFs, and such, nobody forces you to save those, much less open them on the same computer or as the same userid. ARM boards are dirt cheap and it's simple enough to write a script that watches a directory for files placed into it and automtically scp them to the other computer, then deletes the local copy. Now with this simple method, you have minimized the risk, by sending it to a real containment zone that it can't escape from, unlike those VMs or sandboxes that all the cianiggers are shilling constantly.

JS compilation would be better than interpretation but ideally there should be no JS at all. The only proper use of JS I have seen is the automatic refresh on imageboards or the automatic refresh on a site like drudge; every other use isn't cognizant of what http is and what it's meant for.

Practically no loss. The vast, vast majority of web resources I use are usable in a text mode browser already. I'd bet that 8ch works fine on Links though I've never tried. Maybe I'd keep some SBCs or chinkdroids e.g. for banking, shopping, viewing trash websites when the need arises, etc.

what kind of autism are you on, you cock sucking faggot
you just stated that 98% of websites are programmed with JS. while that may be true, it ignores the fact that not even 1% of them actually remotely have a valid reason to use JS

Nice reading comprehension, buddy. I mentioned those things in response to what someone else wrote. If you could actually read, you'd know that.

They mentioned a browser "that has bindings to applications for viewing/playing multimedia"

you sound like a black millennial with a hint of autism. it's not an issue because i don't have a phone, and the subject of this thread is web browsers, not phones

The web doesn't cater to privacy-valuing people and it never will. JS makes frontend and SPA stuff easy, but you and a lot of privacy-oriented people think it's okay to have web design standards from the 90s. Here's the thing: even though this board is an echo chamber of neckbeards who want websites to be as unprofitable as possible, foregoing ads and tracking which make them money, the fact of the matter is that the web isn't going to bend over backwards just to meet the demands of some dude with asperger's syndrome.

Whether you like it or not, people who have websites want to make money, and privacy-centric design is not conducive to profit. If you can come up with a viable freetard business model, then you can solve the issue of privacy. But until then, there will be Cloudflare, recaptcha, and Google Analytics everywhere.


You don't have a phone and you're calling other people autistic? Oh man.

What are you even arguing for?

HTA is a mistake that could only be conceived by Microsoft. But yeah, this shit is why you actually remove IE if you use Windows.
Even being disconnected from webdev, I had to learn through using uMatrix. That's what I meant by full autism. A niche engine that purposefully ignores this, so the functionality can be replicated with client-side scripting later if it gets to that stage.
No software is immune to bugs and most of it is full of holes. What I have in mind is, by separating these parts into separate programs, you can sandbox and restrict the shit out of audio/video players and image viewers, while leaving the browser largely untouched.

Privacy is an issue of business, not just JavaScript. This board isn't an indication of what most people are like. Most people are fine with JS and tracking and all that jazz. They don't know or care.

Privacy isn't just invaded for fun, it's done to make money. Ads and data mining. All businesses want to make money, and most websites are businesses, not just for fun. They have operating costs and all. Unless you're a NEET, life is expensive.

What I'm saying is that, if you can come up with a business model that respects privacy, everything else will fall into place. NoScript won't change the fact that more and more websites require JS in order to work. Freetards consistently fail to come up with privacy-oriented business models.

what the fuck is .hta? some mac shit? why am i executing code because the file extension changed? the tool is stupid, not the user. literally no file manager on linux will execute code by double clicking on a file
4chan had a captcha since at least 2006 you fucking meme. and no that's not the reason. the reason is to prevent automatic posting, just like every other captcha
correct
no, i wouldn't. none of the retarded bullshit webdevs have done since ruby, hell perl/cgi, has surprised me
kill self
correct, meanwhile the entire web _is_ a security issue and physical health issue

are you trolling or just dumb?
wrong (unix-like OSes can infer a file type based on its shebang/content type sniffing, and can figure out what to do with it even based on its contents rather than filetype), but I get what you meant to say, which is that you think it won't execute code from an image file -- but that's also wrong, because there are image viewer exploits