Why the hell is the GNOME shell written in Javascript?
I thought it was just the extension engine, but no, JavaScript has literally permeated the entire thing. This is completely absurd. No wonder it's so slow.
Why the hell is the GNOME shell written in Javascript?
lol wew laddie
Look, it's well supported and has plugins ok. I can't go back to WindowMaker.
JavaScript is much easier to use than C. Programmer time matters much more to the GNOME application developers than computer time does.
That's funny, because Node.js is written in C and C++.
yes, they've made that painfully obvious to anyone who's tried to use that slow, bloated mess.
enjoy your bloat, plebs.
Who are you quoting?
...
I use GNOME 3, GNOME Shell, and xmonad as my DE.
Explain. The entire codebase is 100% JavaScript? I doubt that.
Is it just Gnome 3?
I've been programming in JavaScript and it is worse than programming in C. You get the worst of both worlds if you program in JavaScript: it's a weakly typed, poorly defined dynamic language. It makes you wish you were programming in the most restrictive ball-and-chain language with absurdly strong static typing. JSON is the only good thing to come out of this mess.
If you don't document and plan out the data structures and processes that your JS application uses, then yes, it is trivial to run into type-mismatch problems. I don't have type-mismatch problems in any of my software, regardless of the language, because I take the effort to formally document the various aspects of my application and I keep the documentation up to date. What's much more difficult in C is the resource accounting and pointer accounting that is inherent in any C application.
I've been coding JS web apps and node.js servers professionally for years now and we never plan out data structures or any of that shit. We almost never run into type-mismatch issues.
Like, how fucking draconian and "enterprise" is your codebase that you constantly run into type issues? When you know where your variables come from and where they go, you don't have an issue.
Sure, but you don't go into C not expecting to manage memory or pointers really. There are many dynamically typed languages that just do a better job than JS when it comes to types.
It's more of a problem when you program with idiots or people who simply don't care. Most of the time JS's behavior is just annoying, not really harmful.
That's not even the worst part, the whole display server and kernel I hear is written in C. What weenie came up with that idea?
Pic related
Well considering Rust and D didn't exist at the time it was a sensible choice.
55% of it is JavaScript, and the rest is C (although I'm guessing most of that is just code from Mutter).
gitlab.gnome.org
please off yourself.
To be fair, Ada and Eiffel existed at the time, as well as Pascal and some Pascal-likes.
Ada took years to get decent compilers, let alone fast ones. X began development shortly after the first validated Ada implementation and as far as I know, there weren't any FOSS Ada compilers back when Linus began working on Linux.
Postdates X and I doubt it had a FOSS compiler when Linux came around either.
Only one of Kernighan's complaints was addressed in time for X. The rest came in Extended Pascal and six million incompatible, compiler-specific extensions, so even if it had a FOSS compiler in time for Linux the kernel would be very closely tied to the compiler, far closer than it's currently tied to GCC.
GNOME uses JavaScript because, even though it sucks, it's better than C. All these weenie scripting languages like JavaScript were created to avoid C or work around some C brain damage (like not having a working type system). A lot of languages were used for systems programming like Lisp, Fortran, Ada, Algol, Pascal, and PL/I, but they did not develop this scripting language philosophy. Instead of scripting, any language with compatible calling conventions was used directly by the programmer. This goes with dynamic linking and the ability to compile extensions and add or remove them while the program is running. UNIX uses scripting languages for extensions instead of dynamic linking.
This appeared in a message requesting papers for the "USENIX SYMPOSIUM ON VERY HIGH LEVEL LANGUAGES (VHLL)":

UNIX has long supported very high level languages: consider awk and the various shells. Often programmers create what are essentially new little languages whenever a problem appears of sufficient complexity to merit a higher level programming interface -- consider sendmail.cf. In recent years many UNIX programmers have been turning to VHLLs for both rapid prototypes and complete applications. They take advantage of these languages' higher level of abstraction to complete projects more rapidly and more easily than they could have using lower-level languages.

So now we understand the food chain of advanced UNIX languages:

level         languages                   analogous organism
-----         ---------                   ------------------
low           assembler (low-level PDP)   amoeba
intermediate  regular expressions         tadpole
high          C (high-level PDP)          monkey
very high     awk, csh, sendmail.cf       UNIX man
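For contrast, here is a minimal sketch of the "extensions via compiled code and dynamic linking" approach described above, as it looks on a modern POSIX system. The plugin filename and the extension_init symbol are made up for illustration; only dlopen/dlsym/dlclose are real calls.

```c
/* Sketch: a host program loading a compiled extension at runtime instead of
 * embedding a scripting language. "./myextension.so" and "extension_init"
 * are made-up names; link the host with -ldl on glibc. */
#include <dlfcn.h>
#include <stdio.h>

int main(void)
{
    /* Open a shared object built from any language with a C-compatible ABI. */
    void *handle = dlopen("./myextension.so", RTLD_NOW);
    if (!handle) {
        fprintf(stderr, "dlopen: %s\n", dlerror());
        return 1;
    }

    /* Look up an entry point and call it directly; no interpreter involved. */
    int (*extension_init)(void) = (int (*)(void))dlsym(handle, "extension_init");
    if (extension_init)
        extension_init();

    /* The extension can be unloaded while the host keeps running. */
    dlclose(handle);
    return 0;
}
```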
I always wondered why it seemed to run like shit
Quite possibly the most unbased thing you've said yet.
Fun fact: Brendan Eich just wanted to put Scheme in a web browser, but his bosses kept telling him to add more features and make it look like Java. He had ten fucking days to get it ready for a Netscape Navigator beta and, surprise surprise, the slapped-together language was shit.
When? Certainly not before ISO 7185.
GNOME Shell isn't written in JavaScript, it's written in C. GNOME Shell Extensions are written in JavaScript, which is a different thing.
just use unity, it's like gnome but not retarded
Canonical doesn't care to maintain Unity any more. Are you aware of the Unity community fork?
GNOME Shell isn't written in JavaScript, it's written in C. GNOME Shell Extensions are written in JavaScript, which is a different thing.
See
No, that would be Cinnamon.
i almost want something with a taskbar, but i just can't find a good place for it on this dual-monitor system. it looks weird if it's only on one monitor, but even weirder if it's on both
This but unironically
Perhaps, but it's true. JavaScript is at least as good as C, in other words, it's better than C. Strings and arrays are better in JavaScript, and it has exception handling, bounds checking, GC, and objects. JavaScript sucks and is worse than most scripting languages, but all the parts of it that suck come from C. The problem is that too many "programmers" have no knowledge of anything besides UNIX languages, so they don't know what a good systems language or a good dynamically typed language looks like. Plenty of low-level systems languages have real strings, for example, and a lot of the time they were added to make use of specific string instructions on the computer. C is incredibly bad when it comes to syntax, semantics, and types. Code shouldn't have to be wrapped in a useless do { ... } while (0) loop to be treated as a single statement, but that is the "correct" way to do it in C. Arrays shouldn't "decay" to pointers, because they're totally different things. Assigning an array is not the same as assigning a pointer in any other language, because one copies all the elements and the other copies an address. Everything is broken, and what would be considered terrible kludges in any other language (even ones as bad as PHP and JavaScript) are taught as how "programmers" (aka UNIX weenies) do things.
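A compilable sketch of the two complaints above, the do { ... } while (0) macro idiom and array-to-pointer decay. Nothing here is GNOME code; it's generic C written only to show the point.

```c
/* Sketch of the two C-isms complained about above. */
#include <stdio.h>

/* Multi-statement macros get wrapped in do { ... } while (0) so they expand
 * to a single statement and behave correctly after an if without braces. */
#define SWAP(a, b) do { int tmp = (a); (a) = (b); (b) = tmp; } while (0)

/* The [10] is decoration: inside the function, 'a' is really an int*. */
static void takes_array(int a[10])
{
    printf("inside:  %zu bytes\n", sizeof a);   /* size of a pointer */
}

int main(void)
{
    int xs[10] = {0};
    printf("outside: %zu bytes\n", sizeof xs);  /* size of ten ints */
    takes_array(xs);                            /* xs "decays" to &xs[0] here */

    int x = 1, y = 2;
    if (x < y)
        SWAP(x, y);     /* compiles as one statement thanks to the wrapper */
    printf("x=%d y=%d\n", x, y);
    return 0;
}
```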
That's just more proof that the bad parts in JavaScript came from C (directly or indirectly). JavaScript without UNIX brain damage would be Scheme with map-like objects and prototypal inheritance.
Just like B, C, C++, sh, awk, Perl, UNIX, HTTP, and so on.
en.wikipedia.org
en.wikipedia.org
Subject: Advisory Locks are Like Advisory Array Bounds

And like "advisory" type systems. You know, the systems that check for problems only when the checks can't possibly fail. (That way, you can optimize better, right?)
Keep whining, but I'm going to keep enjoying the heavily customizable and human-guided experience that GNOME provides.
based
you don't get to make these calls
based
based
based but GNOME is shit
Languages and operating systems are more than a collection of features; what truly matters is how well those features work and fit together. JavaScript suffers from the C++ problem of piling on features regardless of whether they actually fit, and like C++, writing good code in it means restricting yourself to a small subset that shuns most of the fancy features.
Saying a scripting language is better than a janky systems programming language because it has six million poorly implemented features instead of a few is retarded, especially when that scripting language needs a legendarily fat VM to support it.
Again, if you keep using the genetic fallacy to blame Unix and C for every way their descendants screwed up, there's absolutely nothing stopping me from taking it a little further and blaming Multics and PL/I for spawning Unix and C.
Both of which relied on custom extensions of the language, fitting the time-honoured Pascal tradition.
What this guy said
Just a few minutes into Ubuntu, and I was scrabbling for something else. Currently going with KDE.
why can't everything be in C and C++ like everything in Windows?
Linux's own downfall is the fragmentation of user space
I'm sure you mean runtime GC, in which case, kill yourself. Promoting laziness and enormous source complexity in a systems language is not based. To be really honest, I'm sure guys like you have never tried to implement a GC or looked at what something like V8 or OpenJDK's GC(s) has to do to perform even remotely well.
If you were talking about stuff like deterministic build-time GC (Rust, Mercury or Carp), then you'd have a point.
Mainly solving the namespacing problem with retardation.
Worst thing ever invented. When you read procedural code, you just look at a function's input/output and you know what's available and what the side effects are (as long as you don't abuse globals, of course). In OOP, you have to do like the salmon and swim up the inheritance chain before you know what's there.
Why must you open your mouth to say such retarded drivel? Dynamic typing is just there to work around the lack of a good type system; look at how the ML family does it. It speaks for itself that the JS and Python crowds' new fad is adding type information (TypeScript, Mypy).
All your whining about C comes down to its historical baggage, which does make C shitty, but that's still better than having your language's core ideas be bad from the beginning.
What are you on about, anon? Windows has a lot of C# too. In the end, choice is what makes GNU/Linux what it is.
use i3 or dwm like a real man
Fuck you. OpenBox 4 Lyfe
there is not a single good DE on loonix/bsd, so stick to window managers; there are plenty of decent WMs
Unity was halfway decent but Canonical abandoned that just like they abandoned everything else
That's why I hate UNIX so much. Nothing in UNIX works together. Instead of proper APIs, there are commands that output text that have to be parsed by some kludgy shit "language" like awk or sed.
It has nothing to do with features. JavaScript VMs are bloated because the language is full of brain damage. Lua and Scheme are similar to JavaScript but are much smaller (and some would say more powerful). Most UNIX languages are like this, including C, C++, and Java.
No, because it would be dishonest to blame an OS and language that did it right for problems that are not present in that OS and language. PL/I doesn't have array decay, null-terminated "strings", "everything is a tape drive" I/O, or any of that other bullshit. It also has a condition system that can check and recover from a variety of errors, including overflows and array bounds. Multics also has a very different approach to users, access control, error handling, files, and so on. You can blame Multics for the hierarchical file system, the names of some commands, and some terminology, but that's about it. A lot of things from Multics that were added to UNIX years later, like memory-mapped files, were still worse than the originals. If these UNIX weenies made something this bad after using Multics and PL/I, imagine how much it would have sucked if they didn't know about them.
It's like the Pascal community recognizes flaws in their language and fixes them.
I'm not talking about real-time or embedded systems (in which case, Ada would be better), I'm talking about desktops and other home computers (phones and tablets) that use GC anyway.
Objects are a lot more powerful than you think. Look at CLOS or IBM's System Object Model which was designed to support Smalltalk, C++, Common Lisp, Java, etc. objects in one framework.
The ML family doesn't do it.
en.wikipedia.org
C's core ideas were bad from the beginning. Brain damage that was known to be bad when it was made doesn't become acceptable just because it's old. PL/I doesn't have these problems. A lot of languages like Lisp, Pascal, Fortran, and BASIC had problems worth complaining about but most of them were fixed over the years. Some of the problems in Lisp were dynamic scope and not having strings (using symbols and lists of symbols instead). A lot of languages didn't have strings in the 60s but now they do.
Subject: Revisionist weenies

For reasons I'm ashamed to admit, I am taking an "Intro to Un*x" course. (Partly to give me a reason to get back on this list...) Last night the instructor stated "Before Un*x, no file system had a tree structure." I almost screamed out "Bullshit!" but stopped myself just in time.

I knew beforehand this guy definitely wasn't playing with a full deck, but can any of the old-timers on this list please tell me which OS was the first with a tree-structured file system? My guess is Multics, in the late '60s.
Same problem as other OOP implementations: complexity without a real reason. OOP should just have been the concept of instantiation and namespacing, with no inheritance departing from mathematics' model of function input/output and its powerful simplicity.
That's my point. ML is statically typed but with a type system expressive enough to not need dynamism.
C's (and UNIX's, for that matter) core ideas are historical baggage. They were a reaction to the extreme bloat of Multics and PL/I (which took ages just to reach usable performance, or even a fully compliant compiler). A reaction that went too far? Standardized too soon, without enough thought about the future? Perhaps, but still better than what was fundamentally the Enterprise(tm) Java(r) OS of its time.
Of course, now you can spout your drivel because powerful hardware exists, but implying that this applies to Multics' era and that UNIX won because of sheer luck and retardation amongst hackers is stupid.
This is where you don't make any sense: praising stuff like PL/I, Ada, and Multics while implying that bloat comes from UNIX and not from the Enterprise(tm) crowd responsible for your fat idols.
By the way, I use Tcl with C as an extension language; I'm not an OpenBSD-tier masturbating monkey.
Just because something doesn't work exactly the way you want doesn't mean it doesn't work.
And where do you suppose that brain damage originates from if it doesn't come from poorly thought-out and implemented features? Is brain damage some ethereal quality that magically makes Unix languages (in your vague "anything remotely influenced by Unix or C that I don't like" sense) worse in every way?
And yet you blame all the flaws of post-Unix and C operating systems and languages on Unix and C even when their flaws are not present in Unix and C. If you jump to the "but they're a reaction to how much Unix and C sucks" part, Unix and C were also reactions to Multics and PL/I yet you don't blame those two for Unix and C-specific problems.
Except everyone and their mother chose a different way of fixing them, so a lot of Pascal code ends up tied to specific compilers and platforms.
Multics is a very interesting operating system and deserves more study (especially with today's more powerful hardware), but it certainly had its problems. From my research, many of the later speedups came from replacing PL/I code with assembly and Multics' performance issues were severe enough that Honeywell gave several universities free hardware upgrades to save face and keep their customers.
based
unbased
unbased
There are real reasons, but they're not compatible with the UNIX way of doing things. Imagine an OS where everything is an object. Your PNGs and JPEGs are image objects that carry their methods with them. Text files embed fonts and can contain objects inside them. This was before web browsers, which spread UNIX brain damage to the rest of the world.
Are you talking about Haskell? What does that have to do with C and UNIX?
No, they're brain damage. The "core ideas" were known to be bad when they were made. The only thing that changed is that a lot of people today have never seen the right way to do things.
C was standardized in 1989, long after Lisp machines. There were more kinds of hardware at the time, including segmented and tagged architectures. Hardware "research" now is basically a worse way to do something we could do 50 years ago, but with the ability to run C and UNIX. Compare Intel MPX to real array bounds and descriptors.
Java is a UNIX language made by Sun.
None of the software that people complain about is written in PL/I or Ada, or runs on Multics. It's all written in C and C++, or in a language whose interpreter or VM is written in C and C++. The Linux kernel is 7 MB or larger, plus a bloated RAM disk and all this other bullshit. Full PL/I ran on computers with 64 KB of memory. Multics supported tens if not hundreds of users on 1 MB.
Having to generate and parse text, which wastes billions of cycles, really doesn't work. Look at dtoa.c some time to see how much C code is needed to convert floats to strings and back. That code is called thousands of times to produce a table or some XML, just so the output can be piped to more code that parses it back into floats.
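A toy illustration of the round trip being described: one side formats a double as text, the other parses it straight back. This isn't taken from any real pipeline; it only shows where the dtoa/strtod-style work happens for every field that crosses a pipe as text.

```c
/* Toy version of the text round trip: float -> string on the producer side,
 * string -> float on the consumer side, repeated for every field of every
 * row piped between programs. */
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    double value = 0.1 + 0.2;                       /* some computed result */

    char line[64];
    snprintf(line, sizeof line, "%.17g", value);    /* producer: dtoa-style work */

    double parsed = strtod(line, NULL);             /* consumer: strtod-style work */

    printf("sent \"%s\", got back %.17g\n", line, parsed);
    return 0;
}
```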
Brain damage comes from anti-features that have no rational explanation. Arrays are a feature. Array decay is brain damage. Numbers in different bases are a feature. Treating numbers that start with 0 as octal is brain damage. Function types are a feature. C function pointer syntax is brain damage.
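For the record, here is what the leading-zero octal gotcha and C's function-pointer declarator syntax look like in practice. This is generic C for illustration only; the lookup declaration is a made-up example, not from any real codebase.

```c
/* The two "anti-features" named above, in compilable form. */
#include <stdio.h>

/* Declarator syntax: a function named lookup, taking a string, returning a
 * pointer to a function that takes two ints and returns an int. */
int (*lookup(const char *name))(int, int);

int main(void)
{
    int n = 0644;            /* leading zero means octal */
    printf("%d\n", n);       /* prints 420, not 644 */
    return 0;
}
```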
The flaws in C++ and JavaScript come from copying C directly. The flaws in GNU/Linux come from copying UNIX directly. The flaws and bugs in C operating systems come from being written in C.
No, because Multics and PL/I did all of these things correctly. If someone said "Ford cars are too complicated" and made a "car" out of sticks and poured gas in the "tank" and burned his garage down, you can't blame Ford for that.
That's better than not fixing them at all.
Date: Tue, 24 Sep 91 14:15:45 -0700
Subject: Decay

What I find totally incredible is that the general level of systems seems to be lower in almost every respect than it was ten years ago -- the exception being that the machines run ten times faster. (Well, mine doesn't; it's a straight 3600 and thoroughly canine. But everyone else's does.)

I maintain a couple of large mailing lists, one of which has a nine-year history. Nine years ago it ran perfectly on ITS, and then for another five years on OZ. Then we moved it to Reagan (a bolix) which really didn't work well, but was tolerable, but got more and more broken. We've left it there, despite lost mail and infinite maintenance aggravation because *we can't find any machine that we have access to that has more reliable mailing list service*.
That sounds even worse than UNIX braindamage.
Otherwise your post is BASED as usual.
gnome was always slow, you retard.
fuck this place. you stupid mods still haven't fixed the captcha after weeks. it's literally 2 captchas per post, and 100 if you have JS disabled.
gnome has saved so much programmer time that it takes an entire minute for Eye of GNOME to open up and display an image on anything other than a high-end gayming rig
This is hilarious coming from the man who screeches about Unix's text autism being too bloated.
Late in its lifecycle, maybe. I do know that Bristol started out with one and a half megabytes in 1979 and it took several memory upgrades (including a free half-megabyte from Honeywell, nothing to sneeze at back then) to run anything near a hundred without performance tanking.
Languages don't emerge from a vacuum. Array decay exists for partial backwards compatibility with BCPL and B code, and it has some performance implications. Was it worth it? Maybe not, but pretending there's absolutely no rational reason for it is a lie.
That would make Amigafags the pinnacle of brain damage.
Again, this boils down to your refusal to admit flaws come from anything but Unix and C, even in inconsistent "muh features" dumpsterfires.
Let's rephrase your mantra: C++ and JavaScript have no flaws outside of C's influence. GNU/Linux has no flaws where it deviates from UNIX. None of the flaws and bugs in C operating systems come from anything but C. Does this still sound sensible to you?
Congratulations: you completely missed the point. Not only that, in trying to make Unix sound like absolute dogshit you're making Multics' failure sound even more embarrassing. Imagine Ford actually being driven out of business by a fucking stick car because a Ford employee was tired of his car running out of gas before reaching the grocery store.
There is no reason to use Javascript for anything.
There are better, faster and even easier scripting languages like Lua.
Is KDE faster than GNOME?
I think the real question is why are you still using Gnome?
Because WindowMaker isn't well maintained and GNOME has good plugins.
unbased
Alan Kay thinks it's a good idea. Instead of each program including decoders for various file types, there's only one on the computer and it only has to be there if you want to view that type of object.
worrydream.com
That's one of the reasons I keep bringing up PL/I.
BCPL is a different language and didn't run on UNIX. C had a lot of changes that made it incompatible with B, like changing =- to -= and the string terminator from EOT to null.
bell-labs.com
In B, arrays don't decay to pointers, but "vector" variables are pointers themselves that can be assigned. That sucks too, but it destroys your "compatibility" argument.
All bad ones. It makes optimization more difficult and needs more complex compilers to come close to the performance of real arrays. Most of the optimization techniques known since the 70s can't be used. There's no way to assign arrays or use array operations either, and no way to add them to C, even though they were added to Fortran. C weenies want a 4-byte array to be treated differently from an integer that happens to be 4 bytes, but in PL/I, Ada, and other non-UNIX languages, assigning the array does the same operation of copying 4 bytes.
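A small sketch of the array-assignment point: the same four ints can't be assigned as a bare array, but copy fine once wrapped in a struct, which is roughly the whole-array copy other languages give you directly.

```c
/* Arrays are not assignable in C, but the same elements wrapped in a struct
 * copy with a single '='. */
#include <stdio.h>

struct vec4 { int e[4]; };

int main(void)
{
    int a[4] = {1, 2, 3, 4};
    int b[4];
    /* b = a; */                 /* error: an array is not a modifiable lvalue */
    (void)a; (void)b;

    struct vec4 c = { {1, 2, 3, 4} };
    struct vec4 d;
    d = c;                       /* copies all four elements */

    printf("%d\n", d.e[3]);      /* prints 4 */
    return 0;
}
```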
What's rational about "compatibility" with a language with totally different syntax that didn't even run on that OS or computer? Or "compatibility" with an older language that isn't actually compatible?
No, brain damage is when there is no rational explanation for something; in other words, whoever made it must have had brain damage, or it must have come from cosmic rays flipping bits, or something else besides intelligent design. Sacrificing safety for performance is not brain damage if it really does perform better (and that's acceptable for that kind of software). The problem is that UNIX weenies don't actually care about performance. I have never seen anyone change GCC to use length-prefixed strings and compare the speed and memory usage against null-terminated strings.
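Since nobody has run that benchmark, here is only a sketch of what a length-prefixed string could look like. The lstring type and LSTR macro are invented for illustration; they are not from GCC or any real library.

```c
/* Hypothetical length-prefixed string: strlen becomes a field read, and
 * equality can reject on length before touching any bytes. */
#include <stdio.h>
#include <string.h>

struct lstring {
    size_t      len;
    const char *bytes;           /* need not be null-terminated */
};

/* Build an lstring from a string literal (made-up helper). */
#define LSTR(s) ((struct lstring){ sizeof(s) - 1, (s) })

static int lstring_eq(struct lstring a, struct lstring b)
{
    /* O(1) rejection on length; memcmp only runs when the lengths match. */
    return a.len == b.len && memcmp(a.bytes, b.bytes, a.len) == 0;
}

int main(void)
{
    struct lstring hello = LSTR("hello");
    printf("len=%zu, eq=%d\n", hello.len, lstring_eq(hello, LSTR("hellp")));
    return 0;
}
```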
Flaws in languages based on C come from C. Flaws in UNIX clones come from UNIX. C++ and JavaScript have additional flaws that are not present in C, but those come from hacking features from other languages onto C or using C-isms for different semantics. You can't properly add OOP, arrays, generics, exceptions, dynamic linking, or anything else to C without it sucking. Bugs in C operating systems come from C. Nearly all of those BSoDs and exploits in Windows (even though it's not UNIX) came from memory bugs caused by C code. I'm not saying other languages and operating systems have no flaws, but blaming a language or OS that does something right because UNIX weenies copied it wrong doesn't make any sense.
Except weenies don't realize cars are supposed to have an engine and they got other companies to stop including engines too.
> ...
> There's nothing wrong with C as it was originally designed,
> ...

bullshite.

Since when is it acceptable for a language to incorporate two entirely diverse concepts such as setf and cadr into the same operator (=), ...

And what can you say about a language which is largely used for processing strings (how much time does Unix spend comparing characters to zero and adding one to pointers?) but which has no string data type? Can't decide if an array is an aggregate or an address? Doesn't know if strings are constants or variables? Allows them as initializers sometimes but not others?
It's a partial compatibility feature.
The semantics of arrays remained exactly as in B and BCPL: the declarations of iarray and carray create cells dynamically initialized with a value pointing to the first of a sequence of 10 integers and characters respectively. The declarations for ipointer and cpointer omit the size, to assert that no storage should be allocated automatically. Within procedures, the language's interpretation of the pointers was identical to that of the array variables: a pointer declaration created a cell differing from an array declaration only in that the programmer was expected to assign a referent, instead of letting the compiler allocate the space and initialize the cell.

Values stored in the cells bound to array and pointer names were the machine addresses, measured in bytes, of the corresponding storage area. Therefore, indirection through a pointer implied no run-time overhead to scale the pointer from word to byte offset. On the other hand, the machine code for array subscripting and pointer arithmetic now depended on the type of the array or the pointer: to compute iarray[i] or ipointer+i implied scaling the addend i by the size of the object referred to.

These semantics represented an easy transition from B, and I experimented with them for some months. Problems became evident when I tried to extend the type notation, especially to add structured (record) types. Structures, it seemed, should map in an intuitive way onto memory in the machine, but in a structure containing an array, there was no good place to stash the pointer containing the base of the array, nor any convenient way to arrange that it be initialized. For example, the directory entries of early Unix systems might be described in C as

struct { int inumber; char name[14]; };

I wanted the structure not merely to characterize an abstract object but also to describe a collection of bits that might be read from a directory. Where could the compiler hide the pointer to name that the semantics demanded? Even if structures were thought of more abstractly, and the space for pointers could be hidden somehow, how could I handle the technical problem of properly initializing these pointers when allocating a complicated object, perhaps one that specified structures containing arrays containing structures to arbitrary depth?

The solution constituted the crucial jump in the evolutionary chain between typeless BCPL and typed C. It eliminated the materialization of the pointer in storage, and instead caused the creation of the pointer when the array name is mentioned in an expression. The rule, which survives in today's C, is that values of array type are converted, when they appear in expressions, into pointers to the first of the objects making up the array.

This invention enabled most existing B code to continue to work, despite the underlying shift in the language's semantics. The few programs that assigned new values to an array name to adjust its origin—possible in B and BCPL, meaningless in C—were easily repaired. More important, the new language retained a coherent and workable (if unusual) explanation of the semantics of arrays, while opening the way to a more comprehensive type structure.
Guys, guys! Multicsfag is looking into a mirror for the first time!
...so what you're saying is, you want to use an Array type with {pointer, size, type} instead of just pointers. Write that library and use it in your own code.
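In that spirit, a minimal version of a {pointer, size} array with a checked accessor (no runtime type tag; the int_array name and at() helper are made up for the example). Out-of-bounds access aborts instead of scribbling on memory.

```c
/* Minimal bounds-checked array: {pointer, length} plus a checked accessor. */
#include <stdio.h>
#include <stdlib.h>

struct int_array {
    int    *data;
    size_t  len;
};

static int *at(struct int_array a, size_t i)
{
    if (i >= a.len) {                    /* the check a bare C pointer never does */
        fprintf(stderr, "index %zu out of bounds (len %zu)\n", i, a.len);
        abort();
    }
    return &a.data[i];
}

int main(void)
{
    int storage[4] = {1, 2, 3, 4};
    struct int_array a = { storage, 4 };

    *at(a, 2) = 42;
    printf("%d\n", *at(a, 2));
    /* *at(a, 9) = 0; */                 /* would abort instead of corrupting the stack */
    return 0;
}
```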
Are you stupid or something? I'm not even a coder and I know that in C arrays and pointers are the exact same thing. cancer
An array with one object translates to the exact same thing as specifying * in C. More than one object just means extended length. Correct me if I'm wrong, based unix hater. According to cuckexchange
There's got to be a more understandable way to communicate that. Especially if you're multithreading your instructions: what if pointer X that is 8 bytes into array Y changes because an instruction got executed early or something? I don't know, honestly.
Why is Zig Forums always just people complaining about software that other people developed? LEARN TO CODE AND CODE SOMETHING BETTER OR STFU.
Zig Forumsgamer/
Can somebody please write 100,000 lines of code so muh vydia will run on Linux.
Zig Forumsfor_no_reason
Install Gentoo
Nobody uses arch, crunch, archbang, crunchbang, or any of that homebrew shit either. Ubuntu, Debian, RedHat, SUSE, and CentOS make up the majority of Linux installs. But really, the majority is Ubuntu/Debian, hands down.
Zig Forumscry_baby
The operating system that philanthropist coders have donated their codebase to is not good enough, and I want to bitch even though I have contributed nothing to it. I want to complain about something that is free but still better than Windows 10 Pro, which costs $200.
butthurt gnome dev detected
I love gnome and javascript. Why yes, I also love soymilk, why do you ask?