Web Browser improvements A thread dedicated to improving web browsers; what would you like to see removed, added, or work differently?
I'll start with a short list of things I'd love to see:
Better caching (and thus privacy)
- Completely ignore the 'no-store' caching directive (cache everything)
- Never send conditional HTTP request headers (e.g., "If-Modified-Since"); do the checking based on HTTP response headers yourself (privacy)
- Previously browsed pages and resources are always available offline, and the user is asked whether the page should be updated or is allowed to connect to the Internet
- Always try to hit the cache without making any connection to the server which had (or has) the resource you are requesting
- A portable, exportable cache storage format which is content-addressed with rich metadata
- Clearly indicate to the user that he is browsing a cached page and how long ago it was cached
- User-defined rules for URL de-duplication (make suggestions based on content with the same content hash but a different URL)
- User-defined cache invalidation (e.g., invalidate CSS every X months)
- Keep track of which resource is dependent on which HTML document, so it is easy to delete web page(s) together with all the resources that are only used in those documents
- A cache whitelist and blacklist based on regexes or something simpler
- Add WARC support
- Add "remote caching" support
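A minimal sketch of what the content-addressed storage point could look like (all names are my own invention, not any real browser's internals): blobs are keyed by the SHA-256 of their content and a separate index maps URLs to (hash, fetch time), so two URLs serving identical bytes share one blob, which also gives you the de-duplication suggestions for free.

```python
import hashlib
import json
import os
import time

class ContentAddressedCache:
    def __init__(self, root):
        self.blob_dir = os.path.join(root, "blobs")
        self.index_path = os.path.join(root, "index.json")
        os.makedirs(self.blob_dir, exist_ok=True)
        self.index = {}  # url -> {"sha256": ..., "fetched_at": ...}

    def put(self, url, body: bytes):
        digest = hashlib.sha256(body).hexdigest()
        path = os.path.join(self.blob_dir, digest)
        if not os.path.exists(path):  # identical content is stored once
            with open(path, "wb") as f:
                f.write(body)
        self.index[url] = {"sha256": digest, "fetched_at": time.time()}
        with open(self.index_path, "w") as f:  # exportable, portable index
            json.dump(self.index, f)
        return digest

    def get(self, url):
        meta = self.index.get(url)
        if meta is None:
            return None
        with open(os.path.join(self.blob_dir, meta["sha256"]), "rb") as f:
            return f.read()

    def duplicates(self):
        # hashes shared by multiple URLs -> candidates for URL de-dup rules
        by_hash = {}
        for url, meta in self.index.items():
            by_hash.setdefault(meta["sha256"], []).append(url)
        return {h: urls for h, urls in by_hash.items() if len(urls) > 1}
```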
Better privacy
- Only send referrers on the same domain (sending no referrers at all can break some crappy sites)
- Always try to connect to the HTTPS version of a domain (if it doesn't support it, maybe optionally store it in an HTTPS blacklist)
- Decrease fingerprintability (Tor Browser does this quite well)
- Install uBlock Origin or uMatrix by default with a sane set of defaults
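The same-domain referrer rule is simple enough to sketch (my own illustration, not any browser's actual logic): attach a Referer header only when source and target share a host, since sending none at all breaks some poorly written sites.

```python
from urllib.parse import urlsplit

def referrer_for(source_url: str, target_url: str):
    """Return the Referer value to send, or None for cross-domain requests."""
    src, dst = urlsplit(source_url), urlsplit(target_url)
    if src.hostname == dst.hostname:
        return source_url  # same domain: send the full referrer
    return None            # cross-domain: send nothing
```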
This makes it sound like an NSA wishlist tbh. Why not add a caching option that stores everything in memory and deletes it every time you turn the page, too? You could do this with pre-Firefox 26 browsers, where you set the cache to delete from memory on the next pageload and never store anything to disk. Why did these options disappear?
Like, it's nice to have the extreme control over the cache you want. But why not an option to have no cache of any sort? It's easier to hack into a Windows computer and steal its browser cache than it is to break SSL encryption (assuming the SSL encryption isn't already backdoored). And Windows 8 through 10 automatically send the browser caches of the major browsers to Microsoft/NSA anyway, so it's a moot point.
Easton King
Don't use Firefox anymore, as its codebase is pozzed to hell and back, and the Tor Browser is based on the bad fruit that is modern Firefox. Use Pale Moon 27, but not any version after 27, because it is based on the Firefox 26 codebase with performance/security improvements. All the newer browsers have CSS2, which backdoors you even with JavaScript off and uBlock installed. Pale Moon 27 does not have the CSS2 backdoor, but Pale Moon 28 is based on Firefox 52, which does have CSS2. Since you compile from source, you can use Pale Moon 27 with the newest libraries it needs and no PulseAudio; ALSA alone works with 27. To get the Tor Browser security features, install addons that change the user agent, platform agent, app agent, JavaScript OSCPU/non-JavaScript OSCPU, and accept headers, and that disable canvas fingerprinting. You can change the reported screen resolution with the Ctrl+ or Ctrl- key combos. You can't control which fonts your operating system reports unless you change them on the OS. And the last thing of note is randomizing the time reported by HTTP referrers, or disabling them outright.
Nicholas Ortiz
Remote --- well, I meant your own local server (or whatever the user wants). That's already an option in every browser.
Kevin Flores
The thing that'd worry me is fingerprintability --- but if Pale Moon uses the same SSL backend (NSS with TLSv1.3 support) then you can imitate Tor Browser with no problem.
Gabriel Moore
List of XUL-based (legacy) extensions. WebExtensions really do suck. You can get a list of XUL-based extensions (it probably contains some WebExtensions too) by downloading the XPI from github.com/JustOff/ca-archive/releases --- extract it like a ZIP archive and look for the SQLite file, which contains metadata about every extension as of mid-2017.
Brayden Rivera
Yes, it supports NSS with TLSv1.3. It uses the same SSL backend libraries --- the OpenSSL/LibreSSL/whatever-SSL implementation you have installed on your system, if you compiled it from source against system libraries.
But not like I was explaining. Most browsers delete the cache on browser close, but that doesn't happen if the cache is written to disk and the computer is powered off without the browser closing, which means it stays on the disk until the next power-up. I mean deleting it as soon as it is in memory, and storing it only in memory. Like, let's say you load 8ch.net and then 8ch.net/tech/index.html; the browser I desire would delete the 8ch.net page from memory, along with all associated data/cookies, as soon as it starts loading 8ch.net/tech/index.html, and it would do no disk writes.
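The evict-on-navigate behaviour being described could be sketched like this (hypothetical, not how any shipping browser actually works): the page you navigate away from is dropped from RAM the moment the next load starts, and nothing ever touches disk.

```python
class EphemeralPageCache:
    """Memory-only page store: navigating evicts the previous page."""

    def __init__(self):
        self.current_url = None
        self.pages = {}  # url -> page bytes, held in memory only

    def navigate(self, url, body: bytes):
        if self.current_url is not None:
            # drop the previous page (and its associated data)
            # before the next page even starts loading
            self.pages.pop(self.current_url, None)
        self.current_url = url
        self.pages[url] = body
```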
No modern browser does that; they always write to disk first and delete afterwards.
Regarding caching, I want it to be persistent. I want to ignore all server-side caching directives and store everything I want to store permanently. One thing I forgot to add:
- Add versioning support for the cache --- so you can browse your own personal archive like on the Wayback Machine
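The versioning idea could be sketched as follows (illustrative only): every fetch appends a (timestamp, body) pair, and lookups can ask for "the page as it was at time T", Wayback-Machine-style.

```python
import bisect

class VersionedCache:
    def __init__(self):
        self.versions = {}  # url -> sorted list of (timestamp, body)

    def store(self, url, body: bytes, timestamp: float):
        self.versions.setdefault(url, []).append((timestamp, body))
        self.versions[url].sort(key=lambda v: v[0])

    def get(self, url, at=None):
        """Latest version by default, or the snapshot at or before `at`."""
        history = self.versions.get(url, [])
        if not history:
            return None
        if at is None:
            return history[-1][1]
        times = [t for t, _ in history]
        i = bisect.bisect_right(times, at)  # last snapshot not newer than `at`
        return history[i - 1][1] if i else None
```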
Justin Cook
You could probably write a XUL addon that does everything you want, OP. If you were using WebExtensions you would be shit outta luck, but with XUL it's possible. Or maybe make it an NPAPI plugin, but IDK, I'm unfamiliar with that stuff.
Jackson King
Right, WebExtensions are extremely limited. So I might consider PaleMoon or some other Firefox-based browser that supports XUL-extensions. Otherwise I'd have to hack the C++ code of modern Firefox (or Tor Browser).
Adam Lewis
There's a "Better Cache" Firefox extension that fits OP's description somewhat, but IDK where to find it; maybe try that archive user posted above.
Nolan Hall
It's not in the database, nor can I find the extension online... I'll keep trying to find it because it seems useful.
I've been looking into Pale Moon, though I don't really trust it for privacy or security. Is Waterfox any good? Waterfox still seems to support XUL-based extensions.
Joshua Collins
They're mutually exclusive, you know.
Levi Cooper
Haha, maybe. I just want to be able to extend it myself --- not necessarily install third-party XUL-based extensions. Otherwise I'd have to fork the browser and add the stuff I need in C++.
Gavin Smith
That's actually a fair point. Waterfox would be your best option.
Julian Allen
The way to extend Firefox is to implement your extensions in C++ and not in the extension system.
Lincoln Sanders
What the fuck Tor Browser ...
gitweb.torproject.org/tor-browser.git/tree/toolkit/components/resistfingerprinting/nsRFPService.h
// Defines regarding spoofed values of Navigator object. These spoofed values
// are returned when 'privacy.resistFingerprinting' is true.
// We decided to give different spoofed values according to the platform. The
// reason is that it is easy to detect the real platform. So there is no benefit
// for hiding the platform: it only brings breakages, like keyboard shortcuts won't
// work in MAC OS if we spoof it as a window platform.
The web should only be an index of links that get opened by external software. It should be lightweight and do only one task: render the received webpage. What it shouldn't be is an independently run script that can perform any number of undocumented actions. The problem with web browsers is that they aren't simple indexes that lead to other sites/sources. They are in fact full-featured "suites" that do many things they shouldn't.
It should be up to the individual to choose what program he wants for email, videos, image viewing, etc. These things shouldn't be implemented in the browser; they should be implemented in a standard way that allows easy access to external programs instead. For obvious reasons, this kind of setup can't really be called a web browser, as it definitely isn't following (((w3c))) standards. not that anyone cares
If you want a site to do more than just show links, there would be a basic markup language and maybe some sort of lighter-weight CSS implementation. For interactivity there are basic POST and GET requests, which have been standard for a long time and never showed any real signs of being invasive or disrespectful of privacy. The problem is when a browser issues these requests on its own whim without the user's knowledge. Everything needs to be done in a way where the user is prompted and can answer.
If you have a problem with prompts, gtfo. A computer should never run anything implied, ever.
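The prompt-before-anything model being argued for could be sketched like this (all names hypothetical): every GET/POST goes through a gate that asks a user-supplied approval callback before anything leaves the machine.

```python
def gated_request(method, url, approve, send):
    """approve(method, url) -> bool decides; send(method, url) performs it.

    Nothing is ever sent unless the user explicitly approves.
    """
    if method not in ("GET", "POST"):
        raise ValueError("only the basic verbs: GET and POST")
    if not approve(method, url):
        return None  # user said no: no request is issued at all
    return send(method, url)
```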
Austin Morgan
Sounds cool, but I think with a blacklist a lot of dynamic sites will break. A whitelist might be nice, especially if the browser automatically detected frequently used sites and offered to cache them. You do have the same issue as Chrome not letting you disable history, though. Also, why do this in the browser when it can be done at the proxy-server level? For brainlet users who can't into proxy servers? Maybe a user-friendly caching proxy is a better idea than a new browser.
What would be cool is if it preemptively downloaded all the links on the site to avoid the delay when you click on them. Webdevs will of course endlessly bitch about muh serb loads, but yolo =DDD. Chrome does this, but it also phones home to Google, which destroys privacy.
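The prefetch idea boils down to: parse the current page, collect every link, and hand the resolved URLs to whatever fetches them into the local cache --- directly, with no reporting to a third party. A sketch of the link-collection half (illustrative only, using Python's stdlib parser):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkCollector(HTMLParser):
    """Collects every <a href> on a page, resolved against the page URL."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(urljoin(self.base_url, href))

def links_to_prefetch(base_url, html):
    parser = LinkCollector(base_url)
    parser.feed(html)
    return parser.links
```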
If a browser came out with this feature I would unironically uninstall all my other browsers and never look back. When, goddammit.
Noah Smith
this is so god damned stupid
Anthony Barnes
I think that's a better and more flexible solution, and I'm working on a hybrid approach. But when you use software such as mitmproxy, you'll be fingerprintable by its SSL implementation. Interesting idea, thanks.
Right. If people want to use a dynamic site they'll still be able to browse the 'normal' version but you'll always have the option of browsing the offline version of the web page.
John Flores
Are you actually retarded or just pretending? The reason shit isn't cached for too long right now is so that websites that DON'T use JavaScript can still be fucking used by normies who don't understand any of this. If you cache shit for days, the normie won't get the static site's updates.
Stop posting. You're not smarter than the whole rest of the fucking world.
Christian Thompson
There's no problem if you make it clear that people are browsing a cached version of the site and give them an option to update the cache (as I've explained before). Also, it could cache your shit completely transparently, so in case you want to browse the page offline or visit a previous version of a certain page (a la the Wayback Machine), you'll be provided with that option. I'm not trying to implement the same shitty caching you know it as. I'm only slightly above average Western European intelligence, so no, I'm not. Start reading before posting.