The situation is no better on the server side, of course. Why write static HTML for your pageful of text when you can have a web framework pulling C++ from a SQL database, compiling it to WebAssembly on the fly, and vomiting it forth to the browser just in time, because someone might want their flying unicorn background to have different-colour sparkles, and pointing those people at a different static location would be just gauche.
No, I’m not bitter, far from it. It all makes work for the working man to do.
Because outsourced labour is cheaper, and because bouncy, flashy, round-cornered rainbow boxes and funny interactive flipping, sliding panels keep normies engaged even if the page has no valuable or meaningful content whatsoever.
Not all websites are like this; maybe you were just visiting some shit websites.
Because people must work, because UBI is way to JAR, because this is all just foreplay, either to space exploration (which will never happen until true internationalization) or to total Darwinian genocide of all non-rich not-a-bee citizens. This isn't about "normies"; it's about keeping people occupied.
Stop using insecure CIAnigger (((interconnected networks))). Use Tor, i2p or virtual intranets like cjdns that respect your freedoms and right to privacy.
The question is whether it's even a good idea to have imageboards run in the browser. Would it not be better to just download imageboard software and run it on your computer? Then you could just have a standard HTML+CSS page with some information about the software and some links.
Agreed. Either that or BBS over ssh. Telnet was a joy, in simpler times.
How about you kill yourself
There are a lot of BBSes around, but most of them are telnet. You could wrap that with SSL, or use ssh too. Gopher is pretty limited, but it's very useful if you just want to provide a "file dump" type site. It has less overhead than HTTP and is easier to deal with than FTP. It allows a small amount of server-side scripting too. A gopher interface to git/svn/cvs repos would make a lot more sense than all the nasty HTML+JS most "open source" sites use these days. That stuff is pretty ironic, since it basically shackles the user to web browsers that need amd64 with gigs of RAM and a GPU in order to run well, whereas gopher runs fine on a 30-year-old computer or the puniest ARM SBC without a GPU blob.
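The "less overhead" claim is easy to see in code. A rough sketch in Python (the host, selector, and menu line below are made-up examples, not any real server): the entire gopher request is a selector followed by CRLF, and a menu reply is just tab-separated lines.

```python
import socket

def gopher_fetch(host, selector="", port=70):
    """The whole request protocol: send the selector plus CRLF,
    then read the reply until the server closes the connection."""
    with socket.create_connection((host, port), timeout=10) as s:
        s.sendall(selector.encode("ascii") + b"\r\n")
        chunks = []
        while True:
            data = s.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks)

def parse_menu_line(line):
    """A menu line is one item-type character, then the display
    string, selector, host, and port separated by tabs."""
    item_type, rest = line[0], line[1:]
    display, selector, host, port = rest.split("\t")[:4]
    return item_type, display, selector, host, int(port)

# Hypothetical menu line of item type 1 (a directory):
item = parse_menu_line("1Some directory\t/stuff\texample.org\t70")
```

That's the whole client; no headers, no content negotiation, no TLS handshake unless you choose to wrap it, which is why it runs fine on ancient hardware.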
I just dislike companies going all in on numale JS frameworks like Angular or React. I feel it all goes against the original vision of hypertext. Just plain old linked documents. Modern websites are nearly impossible to navigate in text browsers too.
Good point. Websites are easier to filter than software.
Though the main problem is that the website/software needs to be filtered in the first place.
For good software that doesn't need filtering, I see no point in running it in the browser; but for botnet software that does need filtering, running it on a website is probably the lesser evil. You're right in that case.
With respect to Emacs, may I remind you that the original version ran on ITS on a PDP-10, whose address space was 1 moby, i.e. 256 thousand 36-bit words (that's a little over 1 Mbyte). It had plenty of space to contain many large files, and the actual program was a not-too-large fraction of that space.

There are many reasons why GNU Emacs is as big as it is while its original ITS counterpart was much smaller:

- C is a horrible language in which to implement such things as a Lisp interpreter and an interactive program. In particular any program that wants to be careful not to crash (and dump core) in the presence of errors has to become bloated because it has to check everywhere. A reasonable condition system would reduce the size of the code.
- Unix is a horrible operating system for which to write an Emacs-like editor because it does not provide adequate support for anything except trivial "Hello world" programs. In particular, there is no standard good way (or even any in many variants) to control your virtual memory sharing properties.
- Unix presents such a poor interaction environment to users (the various shells are pitiful) that GNU Emacs has had to import a lot of the functionality that a minimally adequate "shell" would provide. Many programmers at TLA never directly interact with the shell; GNU Emacs IS their shell, because it is the only adequate choice, and isolates them from the various Unix (and even OS) variants.
This is some really interesting criticism of current technology. I'd like to see how the web would have ended up looking, from a technology standpoint, if Lisp machines had taken over instead of UNIX.
I love how we have all these people from the unix-haters crowd crying about how Lisp machines are dead, but not a single one of you has the technical expertise to fix this by writing a new Lisp OS.
Not bad, but now you have to do everything manually, including copy & pasting quotes, post numbers, etc.
You're probably young and don't remember the 1990s, when all the web designers were using Dreamweaver and Frontpage. Those were the first giant doses of cancer that were injected into the Web.
the web was always shit. it just gradually gets shittier over time, and lots of time has passed. the web is like some stupid corporate idea of a platform, like flash. it has no bearing on anything. literally the only good idea in the web is static HTML with hyperlinks, but even that is ghetto as fuck. like how do you even parse that shit. for starters it would be a great step forward if you had some markup format that was actually machine-parsable, as opposed to some crap with no standard aside from one that nobody follows
this nigga knows some shit
it would have looked like AIDS because LISP is AIDS when you start using libraries from 3000 different people. you really want something with a real type system like SML but for the entire OS
as long as you remember /pol/ is now an edgy GOP forum, r/T_D part deux
So why the fuck doesn't somebody make a modern LISP machine? That one faggot's been sitting on Genera forever, maybe somebody can cattle prod his ass and we can get it released?
For that matter where's my Smalltalk machine?
Implement Xanadu using HTML 4.0 only. Go on, let us see. Back in the '80s and early '90s, people from the warez and demoscene crowds used to share diskmagazines. Every one of them was a unique program, not just a boring txt file or index like gopher. Treat the web as works of art with hypertext capabilities.
Do you write OSes in your spare time? Are you Terry? No matter how simple whatever you're doing is, it's not simple enough if you're not shitting out boilerplate. And unfortunately people don't generally take solitary hobbies as their main hobby. Otherwise we would have a ton of hobby OSes and software projects lying around.
I threw my hat in, but chances are I won't be drawn.
We actually do have a ton of toy OSes. These toys are nothing more than proofs of concept and educational exercises. Most of them never advance beyond that.
Sure, but the main complaint in those days was that it was no longer possible for a *human* to know what the code was doing. We've come a long way.
daily reminder that you faggots actually care more about features than if something is a bloated piece of shit or not, which is why you are posting using a 20 million line kernel, on a billions of transistors cpu, with a millions of line web browser, on a shitty php image board.
Zig Forums status: B L O W N L O W N
T H E H E
F U C K U C K
O U T U T
you didn't have to be so mean user
this is a complete and total straw man
it's like saying you can't care about the environment because you drive a car
People actually do care more about getting to work than they care about the environment. People don't actually give a big enough shit about the world to change how they act. At best they recycle and post about it on facebook.
Don't defend your worthless shitpost, just kill yourself instead.
You are just like those supposed environmentalists who complain about muh pollution all day while they drive around in their private cars and consume hundreds of gallons of water every day with their hot showers. It's all about the bitching and not the cause.
bitch all day, glug down that soy milk, never change
I 100% agree with you.
Lisp machines have a CPU designed to run Lisp, with tag checking built into the hardware to speed up dynamic typing and GC. On different hardware it's slower and needs a lot more code. Remaking a Lisp machine is hard because we're not using the right hardware.
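What "tag checking built in" means can be sketched in software. This is a toy model, not how any particular Lisp machine encoded its words: here the low two bits of a word are assumed to be a type tag, and every primitive operation checks the tag before acting, which is the work the Lisp machine's hardware did for free in parallel with the ALU.

```python
# Toy model of tagged words (assumed 2-bit tags; real machines
# used wider tag fields checked in hardware alongside the ALU).
FIXNUM, POINTER = 0b00, 0b01

def make_fixnum(n):
    return (n << 2) | FIXNUM

def tag_of(word):
    return word & 0b11

def fixnum_value(word):
    # The "tag trap": hardware does this check for free; on stock
    # CPUs it's explicit extra instructions on every operation.
    if tag_of(word) != FIXNUM:
        raise TypeError("wrong type tag")
    return word >> 2

def lisp_add(a, b):
    return make_fixnum(fixnum_value(a) + fixnum_value(b))

assert fixnum_value(lisp_add(make_fixnum(20), make_fixnum(22))) == 42
```

On stock hardware those checks are the "lot more code" above: every add, car, and cdr carries them, unless the compiler can prove types and drop them.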
Once one strips away the cryptology, the issue is control. UNIX is an operating system that offers the promise of ultimate user control (ie: no OS engineer's going to take away from ME!), which was a good thing in its infancy, less good now, where the idiom has caused huge redundancies between software packages. How many B*Tree packages do we NEED? I think that I learned factoring in high school; and that certain file idioms are agreed to in the industry as Good Ideas. So why not support certain common denominators in the OS?

Just because you CAN do something in user programs does not mean it's a terribly good idea to enforce it as policy. If society ran the same way UNIX does, everyone who owned a car would be forced to refine their own gasoline from barrels of crude...
Lisp is too expensive and hardware companies cannot be trusted in this age to create reliable and secure Lisp machines
Because lazy programming.
Welcome to CY+3. Only now can we run functional languages with bearable performance.
That's fearmongering. Lisp machine microcode could be changed by the user.
They're lazy because the systems programmers are lazy. Nobody cares if things work when the OS doesn't work. That way of thinking goes all the way from the kernel to the shell scripts.
You're missing the point. The web has huge redundancy and bloat because it does everything the OS does (or should do). Every programming language has to do the same things from scratch because the OS doesn't do them. Lisp machines have hash tables, closures, bignums, strings, arrays, exceptions, OOP, streams, and GC all there already; it's standard, and every program can use it. You can pass a closure or hash table from one program to another without any kind of serialization or other bullshit. A Lisp machine gives you a systems language with the high-level feel of scripting languages. I don't think everything should use GC, and most OSes that I consider good don't have GC, but on the desktop we're using Python and JS, so we already depend on the GC. Lisp machines just take away the bloat and redundancy that we don't need and make these features more powerful.
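The serialization point can be made concrete in any GC'd language. A toy sketch, with Python standing in for the single Lisp image and pickle standing in for the cross-process serialization a Unix-style design forces on you:

```python
import pickle

# Within one image/address space, "passing" a structure to another
# program is just sharing a reference; mutations are visible to both.
table = {"key": [1, 2, 3]}
alias = table
alias["key"].append(4)
assert table["key"] == [1, 2, 3, 4]

# Across Unix-style process boundaries, the structure must be
# serialized and rebuilt: identity is lost, only a copy survives.
copy = pickle.loads(pickle.dumps(table))
assert copy == table and copy is not table

# And closures generally can't cross the boundary at all:
adder = lambda x: x + 1
try:
    pickle.dumps(adder)
    closure_crossed = True
except Exception:
    closure_crossed = False
assert not closure_crossed
```

Every language runtime on Unix reinvents this marshalling layer; in a single shared image the problem simply doesn't exist.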
Up until recently, we owned everything from the hardware to the microcode to the applications. We could fix anything that broke at any level; we could evolve wonderful new systems. How do we "fix" the X11 releases or the SMTP protocol or SunRPC?? In my opinion, things got the way they are because market forces completely overwhelmed technological forces. Because UNIX was free (or nominally licensed) it came into wide use, first in CS and EE departments and later in the world. To some, moving from MS-DOS or worse, it seemed like a win. To those of us who have been around for a while and are aware of the alternatives, it seemed like a nightmare. We thought it would go away when users came to their senses. We were naive. Sigh. Meanwhile, thanks to BSD, UNIX grew like Topsy, or more like barnacles encrusting a sunken ship. Ultimately, UNIX began to be viewed by decision makers who were not technically competent as a panacea for competing technologies.
Because cucks are coders now.
hahahahaha, you think the commie MIT crew is new?
Blazechan:
- yes
- you can upload multiple files with the file input
- nope, not implemented yet
- yes
- yes
- not implemented yet
- on the form with JS, on a separate page without JS
Can't you make a Lisp machine with an FPGA board?
Choose one and only one
Absolute state of niggerware in current year. Is it so hard to put captcha iframe right under the reply box? oh my. here goes the blazing speed
>would be forced to refine their own gasoline from barrels of crude...
That would be an improvement. It would make the actual car operators masters of their craft and weed out the retards on the roads.
Because that breaks the workflow. The captcha is an extension, it cannot add iframes to the post form without JS.
We had websites made in Flash and Java before js/html5 took over.
Java wasn't used much, and the Flash stuff wasn't nearly as widespread as the JS/HTML5 crap today. If a site used that shit, I could pretty much just write it off and not care. I even used a non-JS browser for my credit union and eBay around 1999-2003. I would turn off JS in Netscape anyway because it crashed so much with it on. But I could pretty much browse most places in Lynx in those days, and it was comfy.
Dreamweaver had some pretty neat ways of generating clean static webpages with templates, though. And Dreamweaver was a 'dream' compared to anything Frontpage rendered.