Why, oh why would anyone use this abomination? How did we even get to this point where it is acceptable to have something like NodeJS exist?
God truly abandoned us.
NodeJS
Other urls found in this thread:
vibed.org
flaviocopes.com
stackoverflow.com
seventeencups.net
dlang.org
twitter.com
D is cianigger language confirmed
PHP didn't pay enough, node pays better.
Thats niggerlicious as terry would say
I find Crystal with Amber is better for this sort of stuff.
how much does node pay?
I used both nodejs and php, and here is the trick - instead of hiring two devs - one for PHP (back end), one for JS (front end) - you hire one dev for both, or even 1.5 devs, and you still come out ahead on money. Makes sense?
The WWW is badly designed - you have to know 4 languages to make a website - html, css, js and php. Some people like how it is and create frameworks like electron, pushing badly designed webshit to your computer, some people don't and they try to fix it. Unfortunately they want to "fix" it with JS. html is bad? - Use JSX. css is bad? - use JS. PHP is bad? - use nodejs. JS is bad? - use even more JS, or typescript!
Instead of using one language since the beginning of the WWW, for example LISP, because it is great for creating domain-specific languages, they made 4 languages and now try to "fix" it.
I have a bad experience with nodejs - as gif rel shows, a lot of folders - to make a simple thing work, you need thousands of modules. For a "simple" app you have to install 500MB of JavaShit or even a browser (with electron).
ok dude. Bet you blame c++ for how shitty boost is too.
typescript almost fixes javascript
stl is a shit standard library, how about that gay fag
Ok but why did they choose a shitty language just to fix it later, instead of choosing a good language the first time?
Don't people both love and hate javascript, thanks to dynamic typing and not having to add a semicolon at the end of an expression?
Strong typing is a meme. People fall for it because they are inept. If you don't have to worry about memory safety, you don't need type safety, the runtime environment is handling that for you.
And stl is a library. A standard library, but a library nonetheless.
That's Lua. In Javascript you have to add a semicolon at the end of an expression.
You don't, actually. That's a convention to make code clearer to read.
actually i kind of agree with you there, whats the point of bothering with memory sizes if you're not managing it yourself almost fully (talking to the kernel to almost automatically map your memory to ram / whatever), fug ecmascript has shown me the light
er no, its optional
t. poo writing code full of bugs
THE JAVASCRIPT PARSER REQUIRES THE SEMICOLON IN ORDER TO CORRECTLY PARSE THE LANGUAGE BECAUSE OF THE STRUCTURE
You're fucking retarded. The parser only tries to guess it and in a lot of situations will guess it wrong, you fucking pajeet.
NIGGERS SHOULDN'T BE ALLOWED ACCESS TO COMPUTERS!
flaviocopes.com
Shut up, retard.
Well both nodejs and firefox js interpreters don't care about semicolons - it won't even report an error. Don't know what ECMA standard says about that though.
Right, especially, when you're looking for bugs. I spent a lot of hours looking for bugs caused by this "convention".
You had difficulties because the semicolons were there, or because they weren't? The convention is to add the semicolons.
Niggers, this doesn't run:
// define a function
var fn = function () {
  //...
} // semicolon missing at this line
// then execute some code inside a closure
(function () {
  //...
})();
This does:
// define a function
var fn = function () {
  //...
}; // semicolon NOT missing at this line
// then execute some code inside a closure
(function () {
  //...
})();
stackoverflow.com
Even the stackoverflow.com crowd has figured that out. You should always place them, not just when they cause trouble, because it is implementation-dependent in which situations they will cause trouble.
understands from experience.
Right, because the runtime tries to guess without the semicolons and might not guess right. It doesn't change the fact that the standard allows you to not place them in your code. Even if you SHOULD place them, the standard doesn't force you. The error in that code is not a missing semicolon.
There is no convention to add or remove semicolons.
There's a language spec, and every fucking version since its inception says to put a semicolon after every expression.
Fucking niggers. If you care about code quality and compatibility, you always place them.
I replied to posts talking about Javascript.
Listen up you fucking schizo:
1: I have been saying that the convention to make code easier to read and write IS TO ADD them.
2: You just said yourself:
My point remains: you CAN write js without semicolons.
So yeah, nah, fuck off.
No, it is. It's one of those situations where you can't omit it. It just won't work without a semicolon.
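Another well-known spot where automatic semicolon insertion bites, sketched minimally (function names are made up for illustration):

```javascript
// Classic ASI pitfall: a newline after `return` makes the parser
// insert a semicolon right there, so the object literal below is
// parsed as a dead block and the function returns undefined.
function broken() {
  return
  { value: 42 };
}

function fixed() {
  return { value: 42 };
}

console.log(broken());      // undefined
console.log(fixed().value); // 42
```

Both versions parse without any error, which is exactly why this class of bug is hard to spot.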
That's just the surface. It becomes even worse once they use JQuery to deform the entire language.
Imagine a world where js cucks aren't even using 'use strict'; smdh fams
Could you stop bitching around please?
I just said it is possible. I'm not coding like that myself, but people do, because nothing stops them.
Huh? What's wrong with that? At least I'm not a LARPer.
Nothing. I wrote that what he wrote is true and that he knows from experience. Lern reading.
Writing to and reading from an unallocated position in a C array or a C++ vector is also possible and likely doesn't even crash your program.
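For comparison, JavaScript's flavor of the same silence - out-of-bounds array access - can be sketched like this (the array itself is just an example):

```javascript
// Out-of-bounds reads don't fail, they yield undefined; out-of-bounds
// writes silently grow the array instead of erroring.
const arr = [1, 2, 3];
console.log(arr[100]);   // undefined, no error
arr[100] = 5;
console.log(arr.length); // 101 - the array grew, with holes in between
```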
Nice thing, next time someone forces me to code in JS, I'll definitely use it. But why does an interpreter let someone write shitty code? Are there any advantages of doing that?
You've probably just summoned Unix hater. And in my opinion this is a shitty design too.
Programming isn't the job of a graphic designer.
But why does an interpreter let someone write shitty code?
For the same reason, this works:
(new DOMParser).parseFromString("", 'text/html');
and outputs:
as DOM Elements. If something doesn't work for webtards it's the browsers fault. Or for the languages fault.
Backwards compatibility. Browsers couldn't make a change that would break existing shit. So instead they created a mode that has to be activated in the code to disable a set of features.
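A minimal sketch of what that opt-in mode changes (the variable names here are invented for the example):

```javascript
'use strict';
// In sloppy mode, assigning to an undeclared name silently creates a
// global variable; with the directive active, the same assignment
// throws a ReferenceError instead.
let caught = false;
try {
  totallyMisspelledName = 1; // never declared anywhere
} catch (e) {
  caught = e instanceof ReferenceError;
}
console.log(caught); // true
```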
I'm sorry, I don't like garbage, C weenies call a feature. Look how clearly pointers are implemented in D, whereas in C it is hard as fuck, and still there exist idiots masturbating to that and calling it a great design. Not agreeing with you doesn't make me a graphic designer.
git gud
Only the subset SafeD actually guarantees memory safety and you can't use pointers at all if you want that memory safety.
Have you ever written a pointer to an array of pointers to function taking parameters two parameters, which types are int and double, returning an array of pointers? In D it is easy to write and easy to read - reading and writing is linear, whereas in C reading and writing is non-linear?
Have you ever written a pointer to an array of pointers to a function taking two parameters, which types are int and double, returning an array of pointers? In D it is easy to write and easy to read - reading and writing is linear, whereas in C reading and writing is non-linear?
Ok now it is correct.
That wasn't the point of my argument. It was just an example, that not agreeing with everything what C does, doesn't make me a nogrammer.
Not even possible.
You can only return a pointer to an array of pointers and I have no idea in which scenario that would even be useful.
You're full of shit.
It's not see . You're just a retarded noncoder.
Javascript was thrown together in ten days to animate dancing monkeys. No one thought it would be used by idiots to write actually important shit.
I would rather have the compiler catch an error before the program even runs, rather than have it crash at some unforeseen point in the future. Or worse, be like Javascript and silently "correct" the problem and continue running with the wrong value.
Are you sure? I've seen very complicated definitions using pointers, arrays and functions. I'll try to define this shit, if it is possible, but it'll take some time because I haven't touched C++ for a long time. Would be glad if you could show me, why this is impossible.
and by (((Brendan Eich))) who co-founded Mozilla and made Brave which replaces ads on the internet with his own.
This doesn't happen if you know how to use the cumpootah tbh. Software is not a magical thing with a will of its own, you know. But I suppose inept developers need to be spoonfed at all times with type safety.
Yes it doesn't work. Neither in C nor in C++ and probably neither in D. You can only return a pointer to an array.
I'm just repeating myself
More like never. As I said before D probably can't do that either. You're just not a programmer.
The reason for types in compiled high level languages which are still somewhat close to the metal isn't some fucking abstraction for incompetents but so that the compiler knows the SIZE of the value and the SIGNEDNESS. Otherwise you don't have signs and have to pretend half of the maximum value is zero if you need negative numbers, and the size is needed or else you would waste 56 bits on every fucking bool.
It's the typeless faggots like you who are so incompetent that they have to use some interpreted language because they can't into how computers work.
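The size/signedness point can actually be seen from inside JS via typed arrays - the one corner of the language that exposes fixed-width machine integers:

```javascript
// 200 doesn't fit in a signed 8-bit integer (range -128..127),
// so it wraps around; in an unsigned byte it fits fine.
const signed = new Int8Array([200]);
console.log(signed[0]);   // -56 (200 - 256)
const unsigned = new Uint8Array([200]);
console.log(unsigned[0]); // 200
```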
To be fair I can do pretty well with C and C++ too. I don't use js and lua because I can't use anything else.
Then you'd know that you can only return a type and not an array because an array is not a type.
You can't into C or C++ and I don't judge you for that. I judge you for pretending and giving retards weird delusions.
What the fuck are you talking about? There's more than one other dude in this thread, you know?
Oops. Mixed it up in my head.
Ok, thanks for explanation. I just invented "a pointer to an array of pointers to a function taking two parameters, which types are int and double, returning an array of pointers" as an example and didn't know it is impossible to return an array, but my point was that declarations in C and C++ can be complicated, for example:
float (*(*b())[])();
void *(*c)(char, int (*)());
void **(*d)(int &, char **(*)(char *, char **));
And D does it better, because declarations are linear.
dlang.org
So despite the fact I didn't know that, because I've barely learned C++, my opinion - that complicated pointer declarations and being able to read or write unallocated positions in an array are not features but bad design - isn't based on a lack of knowledge. I'm just learning, but I can see the flaws of the language.
You can't do this in D either.
Back to your previous example where I can at least understand what you're trying to achieve:
int* (function1*)(int sample1, double sample2)*;
Does this look complicated to you?
Have you ever written a pointer to an array of pointers to function taking parameters two parameters, which types are int and double, returning [a pointer to] an array of pointers [to ints]? In C it is easy to write and easy to read - reading and writing is right to left, whereas in D reading and writing is not right to left?
It compiled. I'm not reading further.
Your shit didn't. Fuck off.
./main.cpp: In function ‘int main()’:
./main.cpp:15:18: error: expected ‘)’ before ‘*’ token
 int* (function1*)(int sample1, double sample2)*;
                 ^
./main.cpp:15:18: error: expected initializer before ‘*’ token
what a bullshit
I'll correct myself:
int* (**function1)(int sample1, double sample2);
I tested it and it works.
Like fucking clockwork.
Yeah embarrassing. It was just too many pointy things for me to handle, cherry boy. At least I brought it fourth in the end:
*forth
Come on user
I don't think you meant me.
I'm the one that contradicts him all the time while saging:
It's better than the alternatives and is actually quite comfy once you get over your autism about not using C for everything. Just stay away from the bloated frameworks at all costs. Express is a damn good tiny server.
This.
Nearly every issue people have is with bullshit from the retarded community shitting out useless modules like leftpad.
IT COMPILED. CHECK YOURSELF.
types of those expressions:
FPA_PFfvEvE
PFPvcPFivEE
PFPPvRiPFPPcS2_S3_EE
typeid(function1).name() returns PPFPiidE
As far as I understand this is
function1 is a pointer to a pointer (to an array) to a function taking two parameters - int, double - and returning a pointer to int
Shouldn't this be PPPFPPPiidE?
Also said, it is not even possible to write, what I wanted.
NodeJS with Express is one of the nicest web servers to program out there. You can understand everything the server does without much issue, unlike bloated shit such as Spring MVC. God how I hate Spring MVC.
Are you retarded? If a program tries to add a string and a number, chances are the programmer made a mistake somewhere along the line. In a dynamically typed language that error will go unnoticed until that code is run. Even worse, while a language like Python will crash and tell you that the types are fucked up, Javascript will just convert the number to a string, concatenate them and go happily along with the string instead of a number, which will then be converted to the wrong number at some point in the future.
(3 + "2") / 2
Now I'm getting 16 instead of 2.5, good luck debugging that when you notice the effect of the wrong result ten layers of abstraction away from where it occurred.
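The coercion chain behind that wrong result, spelled out:

```javascript
// + coerces the number 3 to "3" and concatenates, giving "32";
// / then coerces "32" back to a number, so the wrong value flows
// on silently instead of raising a type error.
const result = (3 + "2") / 2;
console.log(result);      // 16
console.log((3 + 2) / 2); // 2.5 - what was actually intended
```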
#include <array>
template <std::size_t N>
using muh_function_pointer = std::array<int*, N> (*)(int, double);
template <std::size_t N>
using hurr_so_difficult = std::array<muh_function_pointer<N>, N>*;
:^)
It actually is; it just has the ability to decay to a pointer type and can't be used directly in certain situations (i.e. as a return type).
In my area around 100k vs 80k for php
...
HAPAS ARE SUPERIOR TO WHITES
I love Donald Trump! Heil Israel MIGA 2020!!!
Yikes. Where did all this misogyny come from?
Judensheim pls go
This is a slide thread. SAGE AND REPORT
I smell some satanic fuckery here.
The original plan for JavaScript was to "put Scheme in the browser"
LISP faggotry is what got us into this mess in the first place, asshole
More like Sun and Java meddling. The plan was to implement Scheme, but idiots from Sun told idiots from Netscape to implement something Java-like. They took a language with static typing and, because that was shitty for the job, made it handle dynamic typing.
If he implemented scheme, instead of this clusterfuck, everything would be fine.
Also
There's a reason why it isn't actual Scheme. Let's start with the fact that it's called JAVAscript, courtesy of Sun (who also bring you the courtesy of making it have NOTHING in common with Scheme, ruining the web forever)
what you wrote in the code block does not compile:
This is what you wanted with additions to make it possible.
No, function1 is as I wrote:
You don't seem to understand that a pointer to an array is just a pointer to the first element of an array.
Some epic C++11 shitcode that is. Why would you use a fucking container for a simple array, like ever?
I'm not even sure if Bravekike knows Scheme.
Kek, I actually forgot something. The pointer array: (In this case it doesn't even make sense. Pointers are more expensive than just having an array of ints but whatever)
int** (**function1)(int sample1, double sample2);
You were right about one of those Ps but not about the other two Ps because a pointer to an array is the same as a pointer to the first element of an array.
int** (**function1)(int sample1, double sample2);
Should be PPFPPiidE
So much contempt for all of you.
I just started learning crystal today. How does it compare to go? Is concurrency as easy with crystal?
...
What compiler are you using? I'm using gcc 6.3.0 and it compiles.
Yes I know.
You don't have to use the STL. The STL is hot garbage and every C++ programmer knows that.
*with Go
You just learned. A pointer to a pointer (in an array or not doesn't matter) to a function makes 2 pointers not 3 as you claimed.
if you type "gcc a.c" gcc will throw the error (I'm using GCC 7.2.0):
a.c:4:19: error: expected ';', ',' or ')' before '&' token
 void ** (*d) (int &,
                   ^
If it actually compiled, it's a bug in your GCC, because "int &" is a C++ reference and utter bullshit in C.
Calm down. Things like vector aren't that bad, or are they?
semicolons are not optional in js. When you omit them the js parser tries to correct this and guesses where the semicolon goes. Sometimes it guesses right and sometimes it doesn't, so you should always put the semicolon.
Not him but I definitely remember old versions of GCC compiling type &var to be equivalent to type *var
Stop lying.
You wrote int &.
Cool suck my dick faggot
correct, the web is UNIX braindamage (domain names, centralization, files, shitty string encodings, shitty array of programming languages, etc) as a platform, but the idea of a general purpose PL embedded into a document is a bit of LISP braindamage sprinkled on top
Go is a stupidly easy to learn, strongly typed, compiled programming language that can call into C without any major issues, allowing it to perform as both a low-level and a high-level programming language without the need to deal with C++.
I am not sure what Crystal is. Honestly this is the first time I have seen it. Sounds completely different, and has some weird Python-esque syntax while having a lot of the same featureset that both Rust and Go have.
Never change Zig Forums.
He's not me, you idiot. I know Zig Forums is dead, but there are anons besides you and me on this board.