What is the ideal browser for people on this board, and what language should it be written in...

What is the ideal browser for people on this board, and what language should it be written in? How hard is it to make a browser from scratch, and what skill level is needed for one person or a group to build it?

Attached: Real_Sonic.gif (320x320, 4.91M)

It should be fast, secure, open-source, botnet-free, have a clean codebase, be minimal, yet be able to handle both older websites and newer websites (HTML5, CSS3, JavaScript (at the user's option), and when appropriate, ES6).

and what language it should be written in?
Rust, of course.

As with any programming project, the hard part is coming up with all of the good ideas, running the GitHub and Slack, and creating the artwork/design. After we've gotten that nailed down, it should be pretty easy for a few anons to actually write a browser that meets the above criteria.

I'll set up the GitHub and get started on the logo.

Subjective, but I prefer a fork of Firefox. On non-GNU/Linux systems I prefer Waterfox, and on GNU/Linux I like GNU IceCat.
Currently, I think Firefox is written mostly in C++, although its new CSS rendering engine and probably some other shit will be in Rust.
Using an already-created web rendering engine like WebKitGTK, QtWebEngine, or others like that? Shouldn't be that hard.
Building literally everything from the ground up? Hard as fuck. See Netrunner.

Is there a browser that does not handle old websites well? What could cause that, and are there any examples of browsers failing to handle sites well? Also, for the new websites, aren't they horribly written, or do they just use a heavy amount of JavaScript or Flash, like Amazon? Also, how would you define minimal, since that can mean many things to an individual? On a related note, why would someone even put a botnet in their own browser? By that I mean, what advantage does the developer or company gain from it in the first place?


Would there be any disadvantage to using a rendering engine like you mentioned, or is it fine to use one?
I would like a better-running Firefox that did not get rid of the old-fashioned add-ons, but from the looks of the other forks that may be possible, so I am wondering whether something similar should be built from scratch.

On an unrelated note, how good would a browser have to be for you to pay for it, whether as a one-time purchase, a recurring subscription, or just a donation?

Dillo. Plenty of old websites use previous versions of HTML, and few websites, old or new, actually contain 100% valid markup. As a result, browsers like Firefox and Chrome contain a ton of code that tries to account for all of this malformed markup but still display the page the way the author intended. Dillo doesn't. If your webpage isn't valid markup, there's no guarantee that it will render correctly, and it often doesn't.

So if you want to create a browser that most people will find acceptable, you'll need to ensure that it takes a whole lot of badly written webpages into account. Also old HTML standards.
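
To give a rough idea of what that tolerance means in code, here is a toy sketch in Rust. It only auto-closes forgotten end tags, which is a tiny fraction of what the real HTML5 tree-construction rules handle, and none of the names come from any actual engine.

// Toy sketch of "tag soup" recovery: the author forgot to close <b>,
// and instead of refusing to render, the parser closes it implicitly.
fn recover_unclosed_tags(tokens: &[&str]) -> Vec<String> {
    let mut open: Vec<String> = Vec::new(); // stack of currently open tags
    let mut out: Vec<String> = Vec::new();
    for tok in tokens {
        if let Some(name) = tok.strip_prefix("</") {
            let name = name.trim_end_matches('>');
            // Close anything the author left open before this end tag.
            while let Some(top) = open.pop() {
                out.push(format!("</{}>", top));
                if top == name {
                    break;
                }
            }
        } else if let Some(name) = tok.strip_prefix('<') {
            let name = name.trim_end_matches('>').to_string();
            out.push(format!("<{}>", name));
            open.push(name);
        } else {
            out.push(tok.to_string());
        }
    }
    // Close whatever is still open at the end of input.
    while let Some(top) = open.pop() {
        out.push(format!("</{}>", top));
    }
    out
}

fn main() {
    // The <b> is never closed; a strict parser would reject this outright.
    let fixed = recover_unclosed_tags(&["<p>", "<b>", "hello", "</p>"]);
    println!("{}", fixed.join("")); // <p><b>hello</b></p>
}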

Flash is dying. Lots of modern websites use a lot of js. Whether they're horribly written or not varies and is largely a subjective judgment anyway.

Merriam-Webster sez:
I'm talking about meanings a and c.

Do you consider a billion or more U.S. dollars to be an advantage? Google didn't botnetify Chrom(e/ium) on a lark because they were bored and couldn't figure out anything else to do.

A browser that shall only render text and images; anything else, like remote code execution, is a security issue and will slow it down.
It shall at least have options for anonymized P2P mesh-network navigation, with an integrated firewall like uMatrix to filter the content that may be loaded.
Video links and other content that is normally rendered in a modern browser shall automatically be opened in the appropriate external software (with an option to enable or disable this).
It must be under the GPLv3+.
C, Ada, or one of the Lisp-family languages.


baito
Choose one.


Placebo

It's impossible to do this without a big team.
The moment you add JS to your browser, it automatically becomes an impossible-to-secure, impossible-to-maintain mess over time.

Amazon takes a long time to load and gets buggy, which is why I consider it bad, but I am not sure if it is just a Waterfox thing on my end. Also, is what you are mentioning about the old sites solely an issue with minimalist browsers, or does it affect the more full-featured browsers as well? It would also be nice to know why you prefer minimalist browsers over full-featured ones. I would assume it is about more than just running much better than the bigger browsers, right?


Is it impossible to avoid JavaScript, or at least to minimize its usage?


What browser comes close to fulfilling that description?

Also, two other things I would like to ask everyone: which browser is closest to your ideal browser, and how would this browser fund itself without botnet? By starting a non-profit like Mozilla, or just ads on its website like Pale Moon?

We already have the ideal browser - it's Links in graphics mode.

First of all, forget about the anons saying your browser shouldn't include JavaScript. That's a quick way to the grave. 95% of sites will be unusable. Unless you want a browser only for anons to browse 8ch, JavaScript must be there.


Concept-wise, Otter Browser, but it's a long way from being usable (no addons).


Mozilla has been funded by Google, Yahoo, etc. I don't know, try donations. The only ethical way.

As for the actual browser, old looking UI. None of the shiny, slick shit of FF or Chrome. No dialog boxes, or other stuff popping out of nowhere bothering the user.

Privacy focused. Nothing whatsoever sent out without the user's explicit permission. uMatrix, SSL Enforcer, Decentraleyes built in. Addon support should not be required if the functionality is there by default. User agent would be the newest Firefox (or Chrome) by default. One-button Tor toggle.

Forget minimal. They already exist and they suck. No one uses them.

OR fuck it, try a kickstarter or something? Video games were funded this way, and a browser is much simpler.

Has anyone attempted to kickstart a browser before? Have forked browsers asked for donations?


So how does one make JavaScript good? Another user suggested Rust; is that better than C or C++? Also, do you like Otter Browser for the fact that it retains the old Opera direction, or is there something more to it than that?

Plenty of minimalist browsers parse old/incorrect HTML markup just fine. It's just Dillo (AFAIK) that insists on sticking to the standards. I'm just making the point that people are going to expect a new browser to function like Firefox and Chrome, and that means being able to handle old HTML standards and badly-marked-up pages, which means a lot more coding.

Speed is not my primary consideration in advocating for minimalism, though it's important. I think the browser codebase should be minimalist so that it is easy to maintain and easy to spot security flaws and other bugs in. But the browser should also be fully-featured, including support for js and the latest (X)HTML and CSS specs. Minimal in codebase, maximal in features and user experience. It should be pretty easy to code in Rust. I'm not sure why it's taking Mozilla so long.

Just carefully continue to improve NetSurf.

What do you mean, "make JavaScript good"? Just have JavaScript support, like every normal browser. I think WebKit handles that...

Whatever language you use does not matter. Don't concern yourself with these useless issues. It's just for Zig Forums autists to stroke their dick over.

I don't know how the old Opera looked, but Otter's UI is exactly how it should be. I mean, even something like Tools > Cookies is something that other browsers don't have. Or importing custom styles without a fucking addon.

It already exists.
It's called Links.

Something less shit than firecox or its forks, for starters. I mean, Links is the best, but it doesn't work on all sites. Once you start supporting more sites, your browser becomes a piece of shit.
One with memory safety.

It's not hard at all, it's just tedious, and creating an ideal browser is an unsolvable problem because sites have too much freedom (JS, CSS, complex protocols, etc.). No matter what you do, it's going to be shit. You're literally better off just writing scrapers for every site you use and making a program to aggregate the scraped data.

No it isn't. It's a programming language; there's no real limit to what people can imagine with it.
Simple proof of that is the sandboxing in each browser that keeps getting pwned every year.

The closest one would be Links, but half of what was described in that post is missing from it.

The browser that was described in that post doesn't need that much funding or manpower; two to three people are capable of making it. Simple funding like Krita's would be good, but such a browser would have a limited audience since it wouldn't be compatible with websites that require JS to load.
Why would this be possible?
Because it doesn't require goddamn JS. Once you implement JS, it's a perpetual game of cat and mouse on the security level.
And I'm not even talking about the W3C, which spreads its ass cheeks every time a megacorp wants to reinvent the wheel with backdoors in browsers thanks to JS.


And 99.9% of web problems will be gone.
No it's not mandatory.
Then don't implement JS; it's the core source of fingerprinting/tracking.

HTML only, and have people host their own web servers from their toasters.
I know the second part isn't really part of the web browser, but there is no real reason for anyone to switch web browsers if no one is actually utilizing the technology.

No, you will just hide them.


Modern browsers send many requests on their own, regardless of JS.

As others have pointed out, it depends on what you consider to be a browser. If you want to use an existing browser engine, the task is non-trivial, but it is an attainable goal. If you want to also write your own engines, then forget about it. Not only are the official standards a pure nightmare, but in order to be compatible with most websites your browser must also be able to deal with all sorts of malformed shit.

This is a design problem in my opinion. When a browser sees invalid markup it should not render anything, only display an error and where it lies. Just like a compiler. Unfortunately browser developers were racing to the bottom to see who can be more compatible with shit code, and as a consequence people writing web pages saw that it "worked" and assumed that the code was correct.
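
As a sketch of what I mean, in Rust with made-up types (no real engine works this way): the parser either hands back a document tree, or a single error with its position, and in the error case nothing gets rendered.

struct Dom; // placeholder for a fully parsed document tree

enum RenderResult {
    Page(Dom),                             // valid markup: render the tree
    Reject { offset: usize, msg: String }, // invalid markup: show only this
}

fn parse_strict(markup: &str) -> RenderResult {
    // A real validator would walk the whole document; this toy check only
    // catches one kind of mistake, an unclosed <b> element.
    if let Some(offset) = markup.find("<b>") {
        if !markup.contains("</b>") {
            return RenderResult::Reject { offset, msg: "unclosed <b> element".into() };
        }
    }
    RenderResult::Page(Dom)
}

fn main() {
    match parse_strict("<p><b>hello</p>") {
        RenderResult::Page(_) => println!("render the page"),
        RenderResult::Reject { offset, msg } => println!("error at byte {offset}: {msg}"),
    }
}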


A barebones browser that does nothing but display web pages (supporting CSS and JS of course). Everything else should be a plugin. Yes, even the user interface with things like forward/backwards buttons. Some essential plugins would be shipped with the standard installation, but on a technical level they would not be hard-coded into the browser. Instead the browser would just provide an API that these plugins can hook up to. Something like the design of Neovim, except for browsers.
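
A rough sketch in Rust of what that hook-based core could look like; every name here is invented for illustration and is not any existing browser's API:

// The core only exposes events; plugins subscribe and can veto actions.
enum Event<'a> {
    BeforeRequest { url: &'a str }, // a uMatrix-style plugin could cancel this
    PageLoaded { url: &'a str },
    UserCommand { name: &'a str },  // even back/forward buttons live in a UI plugin
}

enum Verdict {
    Continue,
    Cancel,
}

trait Plugin {
    fn on_event(&mut self, event: &Event) -> Verdict;
}

struct Core {
    plugins: Vec<Box<dyn Plugin>>,
}

impl Core {
    fn dispatch(&mut self, event: Event) -> Verdict {
        for plugin in self.plugins.iter_mut() {
            if let Verdict::Cancel = plugin.on_event(&event) {
                return Verdict::Cancel; // any plugin can block the action
            }
        }
        Verdict::Continue
    }
}

// Example plugin: just logs outgoing requests.
struct Logger;

impl Plugin for Logger {
    fn on_event(&mut self, event: &Event) -> Verdict {
        if let Event::BeforeRequest { url } = event {
            println!("request: {url}");
        }
        Verdict::Continue
    }
}

fn main() {
    let plugins: Vec<Box<dyn Plugin>> = vec![Box::new(Logger)];
    let mut core = Core { plugins };
    core.dispatch(Event::BeforeRequest { url: "https://example.com" });
}

The point is only that back/forward buttons, cookie managers, and content blockers would all talk to the same event surface, roughly the way Neovim plugins talk to its RPC API.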

What you just described is called XUL by Pale Moon. It is essentially a CSS, JS, and HTML interpreter with plugins/platforms for literally anything else. Pozzilla used to use XUL to make the Thunderbird email client, but then pozzed the shit out of XUL. Pale Moon's devs seem to be trying to restore XUL to be usable as a plugin system again, but at the coding/C level and not at a click-and-it-just-works level like with Chrome/modern browser plugins. Actually, now that I think of it, XUL could be the Emacs of C.

Is there a reason for doing that? Also, is there a practical reason why XUL was made obsolete by Firefox? I never did find out the reason for it. On a related note, is the Basilisk browser any good?

This is incorrect. If you don't implement JS support, you cannot execute any of it in any conceivable way, so you completely avoid the problem rather than putting it behind a firewall the way uMatrix does.
uMatrix is basically a firewall; it's not 100% failure-proof. Depending on how the controlled opposition that is the W3C accepts new vague implementations, it can go wrong on many levels without us knowing about it.
I was talking about external entities that use functions in the browser, not the browser itself sending X on its own.
Software in general needs to avoid the BIG BALL OF MUD for as long as possible, and not implementing JS helps with that. Of course JS isn't the only cause; for example, rendering videos in the browser instead of in an appropriate external program is part of what creates the Big Ball of Mud.
The past decade proves you wrong.


C is fast.

Economic/time reasons. They dropped XUL simply because they thought it would take too much time to fix it, and went instead with what Chrome did.

M
O
T
H
R
A

Attached: 1920x1080.png (1920x1080, 1.58M)

So you hide them (by not being able to visit the websites that use JS). But they will still use it, and you will be limited to shitposting on 8ch for the rest of your life - assuming it too does not require JS at some point.

If you consider JS a problem, that's not the way to fix it because eventually it will be required everywhere - so fix THAT, instead of hiding behind a browser that does not support it. Besides, those browsers already exist, so it's pointless to make another.

XUL was never designed to support security or a multiprocessing model. It's impossible to bolt security into XUL or make it do multiprocessing without destroying backwards compatibility of the existing XUL API.

Regex-based configuration.

. { -browser-javascript: off; }
(google|search) { -browser-autocomplete: off; }
^..\.wiki\w+\.org a:visited { font-size: larger; }
myfavoritesite\.com { -browser-screen: full; font-size: large; }
\.com$ { -browser-cookie: off; }
\.com$ input:last-of-type { border-color: red; }

In fact, I imagine it should be possible to hack Firefox to allow changing its existing config directives on such a per-site basis... if not in the practical sense, then in the "it SHOULD be possible" sense.

Also, upon rereading the thread: directives such as

-browser-request: block;
-browser-proxy: myproxysettings;

would also be a thing of course.
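
To make the matching side concrete, here is a rough Rust sketch assuming the regex crate; the rule fields are a crude stand-in for directives like -browser-javascript and -browser-cookie, the third rule is a hypothetical override, and "last match wins" is just one possible policy:

use regex::Regex;

struct Rule {
    pattern: Regex,
    javascript: Option<bool>, // None = this rule does not touch the setting
    cookies: Option<bool>,
}

// Later rules override earlier ones, CSS-style ("last match wins").
fn settings_for(url: &str, rules: &[Rule]) -> (bool, bool) {
    let (mut js, mut cookies) = (true, true); // browser defaults
    for rule in rules.iter().filter(|r| r.pattern.is_match(url)) {
        if let Some(v) = rule.javascript {
            js = v;
        }
        if let Some(v) = rule.cookies {
            cookies = v;
        }
    }
    (js, cookies)
}

fn main() {
    let rules = vec![
        // . { -browser-javascript: off; }
        Rule { pattern: Regex::new(".").unwrap(), javascript: Some(false), cookies: None },
        // \.com$ { -browser-cookie: off; }
        Rule { pattern: Regex::new(r"\.com$").unwrap(), javascript: None, cookies: Some(false) },
        // myfavoritesite\.com { -browser-javascript: on; } (hypothetical override)
        Rule { pattern: Regex::new(r"myfavoritesite\.com").unwrap(), javascript: Some(true), cookies: None },
    ];
    // JS re-enabled for the favorite site, cookies still off for .com: (true, false)
    println!("{:?}", settings_for("https://myfavoritesite.com", &rules));
}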

Cool idea, but the vast majority of people wouldn't use this. I mean, I've even been programming for a long time, and I don't know regex...

Completely programmable and automatable. Allows user to override standard behavior; for example, control what some JS functions return or if they're even defined at all. Targeted at developers.

Your loss. The web is inherently string-based by means of URLs, as opposed to IPs, which regexes are far less suited to.

Programming in what and how long is this "long time"?

Can't you do this already? Won't the following work?

alert = prompt; // fight me

So is keeping XUL not doable? Is it better to make a new XUL that can do both of these things rather than continuing the old one? Also, what makes XUL so vulnerable and unsuited for multiprocessing?

Attached: nein.webm (480x360, 221.98K)

You can keep XUL today, just fork the old XUL implementations and do as you wish. What you won't be able to do is to extend the XUL system so that it has a security model to segregate the different concerns of different applications within Firefox while remaining backwards compatible with old XUL extensions.

In addition to this problem, you will not be able to implement a multi-processing model within XUL while remaining backwards compatible with old XUL extensions. Achieving correct computation with multi-processing is not a trivial problem; single processing is straightforward. Multi-processing requires a much more involved understanding of your computing model, because improper timing between processes can totally screw up the logic of the whole system. The XUL system was only designed to work on the single-processing model; it will break down when you try to use it in a multi-processing context in its current architecture. Once you change the XUL architecture to be multi-processing compliant, it will no longer be backwards compatible with the old XUL extensions, because all of them assume a single-processing model.
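
A toy illustration of the timing problem, using Rust threads in place of real processes; the "extensions" here are invented stand-ins, not actual XUL add-ons:

use std::sync::mpsc;
use std::thread;

fn main() {
    let (tx, rx) = mpsc::channel();
    // Two "extensions" that used to run strictly one after the other
    // now run concurrently.
    for name in ["extension A", "extension B"] {
        let tx = tx.clone();
        thread::spawn(move || {
            tx.send(format!("{name} finished")).unwrap();
        });
    }
    drop(tx); // close the channel so the receive loop ends
    // The arrival order can differ from run to run; any extension logic
    // that silently assumed "A always finishes before B" is now broken.
    for msg in rx {
        println!("{msg}");
    }
}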

But if you accomplished this, you would have the Emacs of C.