Browser ENGINE Thread

FrameMaker is IMHO the best example, though there are others like Interleaf and 3B2.
Ding ding ding, we have a winner!

Interesting fact: The original web client, WorldWideWeb (later renamed Nexus), was not a web "browser" but an integrated GUI HTML browser/editor, as were libwww and all of W³C's subsequent testbed clients. The intended paradigm was a GUI for both reading and writing webpages (and yes, style sheets did exist, though they were very limited in flexibility), while HTML itself was never intended to be seen by human eyes or written by human hands.

The idea of a read-only "browser" was the accidental result of the supposedly specialized Line Mode Browser, the first CLI client and also the first cross-platform client (Nexus was NeXTSTEP-only). Since the Line Mode Browser, together with the also cross-platform CERN httpd server, was how nearly everyone outside the NeXT bubble first experienced the web, nearly everyone who went on to write clients like Viola and Mosaic assumed the "normal" way to use the web was to write HTML by hand in a text editor and view it in a read-only browser.

CERN/W³C, in retrospect, considered this a critical failure on their part.

Something like an imageboard honestly shouldn't live inside a website at all. A better fit for the web format would resemble a wiki: heavy reliance on powerful automated transclusion, Boolean operations, and the like, to reuse, generate, and organize content based on machine-readable semantic tags.

For something more like an imageboard, a database-centric client API in the vein of USENET would probably be better than a website to begin with, though on top of that transport/storage layer, a markup-based semantic and presentational standard would still beat plaintext.

JavaScript is disabled in the left one. That has nothing to do with browser engines.

The first two are just "my extensions don't work since they fixed the fucked-up security and extension model. I don't want multiprocessing, I just want my extensions to be able to be massive security liabilities and performance roadblocks."
The last one is literally just an unsubstantiated conspiracy theory that makes no sense: Google donates a bunch of money and then allegedly gives Mozilla secret orders to make the browser worse.
You're a fucking retard. You found the worst possible arguments and bought them hook, line, and sinker because they agree with what you already think.

If you hate Firefox, have some relevant, modern, and actually correct reasons instead of taking the first stop on the confirmation-bias engine that is Google, you stupid fucking idiot. It's so fucking obvious you used Google: you linked to a fucking year-old Reddit AMP page, you goddamn retard.

Be careful with resistFingerprinting. It currently reports your window resolution instead of your screen resolution, and since window sizes vary far more than screen sizes, that often gives you a very reliable and specific fingerprint. It does good things with your user agent and some other features, but it's still rough enough around the edges to be considered experimental at the moment, and can currently make your fingerprint more unique rather than less.
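For the curious, here's roughly what a fingerprinting script sees; this is an illustrative sketch, not any particular tracker's code:

[code]
// Illustrative only: the kind of values a fingerprinting script collects.
// With privacy.resistFingerprinting enabled, Firefox spoofs the screen.*
// properties to match the content window instead of the real display,
// so an unusual window size can become a stable identifier on its own.
const fingerprint = [
  screen.width,                   // spoofed to the window width under RFP
  screen.height,                  // spoofed to the window height under RFP
  window.devicePixelRatio,
  navigator.userAgent,            // RFP normalizes this to a generic value
  new Date().getTimezoneOffset(), // RFP forces UTC, so this reads 0
].join("|");
console.log(fingerprint);
[/code]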

CSS is actually essential to the modern web, though. If you want a hyperlarp browser that renders HTML only, that's fine, but CSS is a HUGE benefit: it separates the semantic structure and content of information from its presentation, and presentation becomes rapidly interchangeable, including by users. Not to mention it's already universally deployed: virtually every website on the internet uses CSS.

If anything about the web is worth keeping, it's CSS separating content from presentation.
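To make the "interchangeable, including by users" part concrete, a minimal sketch of what a user style switcher or bookmarklet does; "dark.css" is a made-up stand-in for any alternate stylesheet:

[code]
// Swap the presentation of a page without touching its content.
// Because structure lives in HTML and style lives in CSS, replacing
// one stylesheet restyles everything; the semantic markup is untouched.
const sheet = document.querySelector('link[rel="stylesheet"]');
if (sheet) {
  sheet.href = "dark.css"; // hypothetical alternate stylesheet
}
[/code]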

Good thread. Bumpan


JavaScript works on other websites. I'm not saying it's the engine's fault, but something is wrong with Falkon for me.

Webshit and javascript weenies only exist because C and shell scripting are not powerful enough for websites. On a networked Lisp machine you could just download standalone lisp scripts from a webserver, run them like any other program, and it could do whatever it wanted: no extra weenie languages required. These solutions are obvious to real programmers but UNIX braindamage keeps you weenies from seeing them.

In my compiler class, the instructor stated: "Before Un*x, nobody had a tree-structured file system." I almost shouted "Multics had it, in the late '60s!" but stopped myself just in time. They really had it.

C is no better. Anyone trying to give rational semantics to C is forced to admit that all storage is allocated on entry to a procedure, not on entry to a block with its own local storage, probably to accommodate programmers who use lots of globals and wouldn't know what to do with block structure. But it's much worse than that: because you can jump into the middle of a block, the compiler is forced to allocate everything up front, and I don't know what's supposed to happen when I do. It's used because it looks like it works, and that's the delusion.

))))))))))))) - this is the future and unix weenies can't even compete
LISP when? lisp is so perfect

It doesn't happen because webdevs always try to stretch the technology further than it's meant to go. You can write nice semantic static pages, but people always want to shove in decorative shit that serves no semantic purpose and makes the site less usable, just because it looks fancy.

I'm making a website currently, and my "client" (quotation marks because I'm doing it for free) is very satisfied with my proposal. It's clean, simple, precise, 100% static, responsive, no JavaScript. It's funny: we were looking at some other related websites and he found all that added nonsense distracting and silly, which leads me to believe that bad websites exist not because people want them, but because webdevs want to have fun on the job.

As for writing HTML by hand, I'm using Lisp s-expressions. I can write content as SXML, and at any point I can splice in the result of a function call. It's a very comfy setup, much more compact than any templating language I've seen.
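For anyone who doesn't read Scheme, here's a rough JavaScript analog of the same idea (names made up for illustration, attributes omitted for brevity): markup as nested arrays, with an ordinary function call spliced in wherever dynamic content goes.

[code]
// Markup as plain nested data, rendered to HTML by one small function.
function render(node) {
  if (!Array.isArray(node)) return String(node);
  const [tag, ...children] = node;
  return `<${tag}>${children.map(render).join("")}</${tag}>`;
}

// lastUpdated() stands in for any content-producing function
// you might splice into the page.
const lastUpdated = () => new Date().toISOString().slice(0, 10);

const page =
  ["html",
    ["body",
      ["h1", "My site"],
      ["p", "Last updated: ", lastUpdated()]]];

console.log(render(page));
// => <html><body><h1>My site</h1><p>Last updated: ...</p></body></html>
[/code]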