Where were you when Javascript conquered every ecosystem under the sun?

Attached: 1543895313802.png (639x842, 182.82K)

Is it just me, or is that first tweet nonsensical?

swiftonsecurity is a loser dude who pretends to be a girl and said he was a "sysadmin" even though he only worked an IT helpdesk position lmao

what a fucking poser, and he gives his shitty opinions on all things tech, even when he has no fucking idea what he's talking about

like he'll literally look up some shit on wikipedia or just read a preliminary article on a topic and then act like he's the fucking authority on the subject, and for some reason people actually listen to this phony

Wow user. Imagine being this butthurt.

Stay mad kid. He's right.
(((They))) always win. We always lose.

...

:D

You laugh, but this is the state of modern computing.

Attached: electron_DE.png (930x526, 87.31K)

This looks like a hacker terminal from a 1990s movie.

JavaScript is already the new C. Programs are being rewritten in the UNIX language JavaScript, just like they were rewritten from Fortran, Pascal, and other languages into C in the 80s and 90s. The C weenies had to "reinvent" threads and asynchronous programming too, even though they were important features of PL/I, Ada, and other languages many years earlier. C and JavaScript are being used in ways they were never intended for because incompetent "decision makers" believe the availability of "cheap" "programmers" overrides all other concerns. They think 15,600 interchangeable weenies are better than a few good programmers.

Date: Sun, 15 Nov 1992 17:55-0500
Subject: Re: [Re: Eliminate TCP Input Demuxing]

I'm just getting around to responding to this, with an historical note which I cannot resist. Paul's paper is a good, thorough, and competently done analysis, but the conclusion takes me back about 14 years. Are we always destined to reinvent the same stuff every N years?

There is a well known effect in the computer architecture community, which in summary states that all major architectural mistakes must be and have been made at least three times: once in the design of mainframes, once in the design of minicomputers, and once in the design of microcomputers. Perhaps a similar rule applies to operating systems. The same mistakes are made once in mainframe OS's, twice in microcomputer OS's, and N times in Unix (tm) operating systems. What seems surprising and different is that they don't get fixed in Unix. Mostly people don't even realize they ARE mistakes.

And it's made with js

like for example?

VS Code

GNOME extensions.

That was the idea behind it.

In a few years every bank is fucked.

Busy programming in Scheme.

...

holy shit is he me?

You wrote it wrong.
It's Windows One

Get punched Zig Forums

Playing videogames

comfy tbh

literally wat

If I could stop the unix-weenies/lisp/quote poster from smearing the whole board in his crap, I would

Attached: 41864894564.jpg (600x338, 46.81K)

What's the name of that? Can't find it using "electron de"

Web browser which explicitly doesn't support JS when?

Is there no "decimal" numeric type in JS that would be natively suitable for storing money data? If not, why not just use a custom type, such as a struct of two integers (e.g. one for dollars and one for cents)?
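For what it's worth, the reason a plain Number is unsuitable in the first place is that it's an IEEE 754 binary float, which can't represent most decimal fractions exactly:

```javascript
// Binary floating point can't represent 0.1 or 0.2 exactly,
// which is why Number is a poor fit for money.
console.log(0.1 + 0.2);          // 0.30000000000000004
console.log(0.1 + 0.2 === 0.3);  // false

// Small integers ARE exact in a double, so integer cents work fine.
console.log(10 + 20 === 30);     // true
```

Integer cents (or the two-integer struct suggested above) sidestep the problem entirely.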

based

unbased

In JavaScript you'd just want to store it all in cents as an integer, assuming you don't need to handle fractional cents.

If I were to make a data type for money that doesn't deal with fractional cents, here is what I would do. First off, it should use linear types, because money should not go poof and disappear or magically get duplicated; if you make a mistake, the compiler will catch it for you. The data type should store both the type of currency and the amount of it you have. An example type would be USD and an example amount would be 400, which would represent 400 cents, or $4. Next you define the operators +, -, and * for the same type of currency like normal. /, on the other hand, returns a tuple of the quotient and the remainder. Remember that we are using linear types, so you must account for what happens with that remainder or it will create a compile-time error. It might also be a good idea to require the divisor to be nonzero using dependent types: if we can prove the divisor is not zero, we don't actually have to check whether the division will cause a divide-by-zero error, so most of the time you don't have to worry about it. Next you define how to compare money of the same type.

I'd say this would be a pretty good data type for working with money if you didn't have to worry about fractions of cents. Handling those makes it more complicated and is out of scope of what I've described. As a hack, though, you can just define a dollar to be, for example, a million microdollars and then do your calculations in microdollars.
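A rough sketch of that design in plain JavaScript, with the obvious caveat that JS has no linear or dependent types, so "the remainder can't be silently dropped" and "the divisor is nonzero" become runtime checks instead of compile-time guarantees (the Money name and its API here are made up for illustration):

```javascript
// Hypothetical Money type along the lines described above: integer
// minor units (cents) tagged with a currency. JS can't enforce
// linearity, so misuse is caught at runtime, not by a compiler.
class Money {
  constructor(currency, cents) {
    if (!Number.isInteger(cents)) throw new TypeError("cents must be an integer");
    this.currency = currency; // e.g. "USD"
    this.cents = cents;       // e.g. 400 === $4.00
  }
  _sameCurrency(other) {
    if (other.currency !== this.currency) throw new TypeError("currency mismatch");
  }
  add(other) { this._sameCurrency(other); return new Money(this.currency, this.cents + other.cents); }
  sub(other) { this._sameCurrency(other); return new Money(this.currency, this.cents - other.cents); }
  mul(n) {
    if (!Number.isInteger(n)) throw new TypeError("multiplier must be an integer");
    return new Money(this.currency, this.cents * n);
  }
  // Division hands back quotient AND remainder, so the remainder has
  // to be dealt with explicitly instead of vanishing.
  divmod(n) {
    if (!Number.isInteger(n) || n === 0) throw new RangeError("divisor must be a nonzero integer");
    return {
      quotient: new Money(this.currency, Math.trunc(this.cents / n)),
      remainder: new Money(this.currency, this.cents % n),
    };
  }
  equals(other) { this._sameCurrency(other); return this.cents === other.cents; }
}

// Split $4.00 three ways: $1.33 each, with one cent left over.
const four = new Money("USD", 400);
const { quotient, remainder } = four.divmod(3);
console.log(quotient.cents, remainder.cents); // 133 1
```

No money is lost: quotient × 3 + remainder gets you back exactly the original 400 cents, and mixing currencies throws instead of silently adding dollars to euros.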

1000+ year JavaScript Reich. Hail Eich.

Let this happen.

It's Windows I

Ada has discriminants in record types, so IMO theoretically, with the right functions and aspects, you can have arbitrary precision using Long_Long_Integer as your basis type within the record (this makes the record array-like). You can also have a pointer to another record (so it can be linked-list-like too). The beauty is that Ada is strictly typed, so there's no fucking up by mixing types unless you explicitly bypass the typing.
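Incidentally, modern JS can do part of this natively: BigInt (ES2020) gives arbitrary-precision integers, and the engine refuses to silently mix them with regular Numbers, which is at least a faint echo of Ada's strict typing:

```javascript
// BigInt: arbitrary-precision integers, so integer minor units
// aren't capped at Number.MAX_SAFE_INTEGER (2^53 - 1).
const cents = 1234567890123456789n;
console.log(cents * 10n); // 12345678901234567890n

// Mixing BigInt with Number throws instead of silently coercing:
try {
  console.log(1n + 1);
} catch (e) {
  console.log(e.name); // TypeError
}
```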

Imagine having only half of your screen for the on-screen keyboard because you needed to fit a fucking FILE SYSTEM BROWSER in there, to be present at all times, taking up almost the same amount of space. And why do you need such a big clock display? And around 10% of the screen is a fucking GLOBE telling you where you are on the fucking PLANET! Sometimes, I also forget where I am, roughly, on the planet. So nice to have a constant reminder where I am, on a planetary scale.

Fucking meme'd. Also, look at those Copy and Paste buttons, I'd rather have a REAL keyboard, like in those meme tablets with foldable keyboard nowadays.