This is NOT Okay
Wtf javascript
wtf C
[code]
int main(void) { char *s = "hello world"; *s = 'H'; }
[/code]
This is NOT Okay
It's cool to shit on a shit language, but at least learn to use it correctly first.
Weakly typed languages are shit; there is nothing more to learn about them.
OP here
C is also weakly typed shit, horrible all around.
This is NOT Okay
This is NOT Okay. C is a gift from GOD (Dennis Ritchie is the only real god. Fuck off with your fake-ass gods)
And where is your god now? Dead, lol.
So? All the other fake gods are dead too. Only difference is that the real GOD (Dennis Ritchie) actually existed and actually created something worthy of being called a Wonder. C is literally perfect, and if you disagree you are an idiot (unless you think Rust is good, in which case you are even worse).
I don't believe in the bible and I don't believe in the greatness of C.
...
So the above image is representative of good language design?
It is representative of a dumbass abusing language behavior.
[code]
> var d = new Date();
undefined
> d.getTime() + 1000
1523451581683
> d.toUTCString() + 1000
'Wed, 11 Apr 2018 12:59:40 GMT1000'
[/code]
Oooh noeees, what will we doooo if a language designed for string manipulation doesn't panic when summing a string and a number, because you probably want to concatenate them?
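For what it's worth, the non-surprising version of OP's snippet does the math on the numeric timestamp before formatting. A minimal sketch (variable names are illustrative):

[code]
// do the arithmetic on milliseconds, not on the formatted string
var d = new Date();
var later = new Date(d.getTime() + 1000); // add 1000 ms as a number
later.toUTCString();                      // a real date string, no "GMT1000"
[/code]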
...
Have fun at peak stupid.
You are just making excuses for not being able to write simple programs in a simple language that any San Fran hipster can.
You are an armchair developer.
Yeah, why do we have errors at all? The interpreter/compiler should just throw in some defined behavior for anything the user could possibly do. You want to call a number as a function, 35()? Let's make that do something.
Any 8-year-old can use Scratch or GameMaker or Visual Basic or whatever the fuck kids learn on. You take the fact that 8-year-olds can use it and conclude "oh huh, must actually not be that bad; I'm sure the code is easy to debug, and if you have trouble working out complex systems in these 'simple' languages you must be an idiot."
lol
...
I agree, why even have the computer do any work for you? Real programmers write everything in assembly, entered one byte at a time into a mainframe. If you want a system capable of detecting errors automatically, you must be an idiot who could not possibly write anything.
Not really, but if you are so butthurt about not being spoonfed at all times that you feel the need to make idiotic threads based solely on your own stupidity to justify your argument, you sure are pathetic.
What kind of idiot would argue about bullshit all day? Not you and me, that's for sure.
I just like to make fun of butthurt posters.
HURR DURR THIS IS NOT OKAY, OKAY???
The concatenation operator should absolutely not be '+'. Sometimes people want to do "100" + "32" == 132, which doesn't work in Zhgrabbjascript.
Choose a different fucking concat operator for fuck's sake. Lua does it right. "100" .. "32" == "10032"
It's a shame Lua does everything else wrong. The table system, my god.
Why are you treating strings as numbers? If you really insist then it isn't that hard to do Number("100") + Number("32")
I don't want to have to write extra Number() calls everywhere. In juhblubberscript, which is weakly typed, people will do things like "100" + "32". Having that extra operator reduces code length and line length, increasing readability, while also making sure the programmer doesn't have to worry about whether the type system will work in his favour.
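If the worry is readability rather than the operator itself, JS already has a way to state intent without a new operator. A sketch, assuming the operands really are numeric strings:

[code]
var a = "100", b = "32";
`${a}${b}`;             // "10032" -- unambiguously string building
Number(a) + Number(b);  // 132    -- unambiguously arithmetic
[/code]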
What's wrong with the table system in Lua?
C can sum a number and a string too
[code]
#include <stdio.h>

void print_string(char *s, int length) {
    for (int i = 0; i < length; i++)
        putchar(s[i]);
}

int main(void) {
    /* the "sum" is just pointer arithmetic on the string */
    print_string("hello world" + 6, 5); /* prints "world" */
}
[/code]
Again, why are your numbers strings?
If you weren't a brainlet, you would know you can just use + to convert a string to a number.
+"100" + +"32" // 132
But as the other user asked, why are your numbers strings?
How about we just let things be what they are. Strings are strings, numbers are numbers, objects are objects.
I've been messing with GTK with C lately. This reminds me of this. Consider this snippet
[code]
GObject *scrolled_window_result;
scrolled_window_result = gtk_builder_get_object(data, "scrolled_window_result");

/* create new result table */
GtkWidget *grid = gtk_grid_new();
gtk_container_add(GTK_CONTAINER(scrolled_window_result), grid);
gtk_widget_show(grid);

/* fill the grid */
for (int i = 0; i /* rest of the snippet was cut off */
[/code]
GTK/bloat
Some of my numbers are strings because:
* they come from reading files
* they come from user input
* they come from parsing a string
etc.
And then if I wanted to write them back to a file or print them to stdout, I would have to write String(x) everywhere. Weak typing has its advantages, but the (severe) disadvantages need to be reduced through fixes such as the one I suggested.
I don't like it. Why would I waste my precious line space on that? Also it looks ugly.
dumb fuck, stick to something a little easier for you
Then shut the fuck up and fuck off. No one likes stupid bitching. Or take it to the people writing the standard.
GTK has also seen fit to create its own data types replicating every C type.
developer.gnome.org
char a='a';
But of course, what if the char type changes one of these days?
:^)
I ran out of arguments? Your argument is literally "I don't like it".
I just called you the little bitch you are.
This is because C has a weak-as-fuck type system where everything has to be void* to be generic.
Ok, let me restate my arguments because you couldn't understand English.
* Increased line length (significant for longer statements)
* Less easy to read
Either you are dumb and have to convert shit all over the place, or you are autistic enough to care about negligible amounts of code.
Well, I assume you have issues with written language in general. Especially because anyone somewhat literate would use "harder" instead of "less easy".
...
Usually you wouldn't convert literals like that.
[code]
local a = 100
local b = 32
local c = a + b   -- 132
local d = a .. b  -- "10032"
[/code]
Why did you even make those two variables to begin with? Are you just pretending to be dumb? I've got to stop taking the bait, tbh.
What? It's just an example. Replace them with foo and bar if you're too triggered.
Weak dynamic typing < strong dynamic typing < weak static typing < strong static typing
...
Yeah, experts who have been writing C for 30 years never make mistakes. We never get massive security holes in important applications because of some trivial error that could have been caught at compile time by any sane language. We should expect absolute perfection from authors of programs in complicated situations.
What if the file is not formatted correctly?
What if the user typed one instead of 1?
Strings are not guaranteed to be numerical.
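Which is the real argument for explicit conversion: a failed parse should be handled, not silently propagated. A hedged sketch (the input value is hypothetical):

[code]
var raw = "one";      // e.g. straight from user input
var n = Number(raw);  // NaN, because "one" is not numeric
if (Number.isNaN(n)) {
  // reject or re-prompt here instead of letting NaN leak into the math
}
[/code]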
Most of this JavaScript bullshit comes from UNIX and UNIX languages like C and awk. What is that bizarre number? It's UNIX time times 1000. When that number was 0, UNIX sucked. When that number overflows, UNIX will still suck.
[code]
What a paragon of reliability. Speaking of time, could it be
that now that it's well past the 20-year mark, Unix is starting
to go a little senile? Why, not too long ago, I had a time server
lose its clock, and all the other machines on the net happily
decided that they, too, were back in 1970. Kind of like your
roommate's great aunt who the police keep bringing back from the
other side of town 'cause she thinks she still lives in the house
where she grew up years and years ago...

"... sometimes, they even boot spontaneously just to let you
know they're in top form..." (retch)
[/code]
This appeared in a message requesting papers for the "USENIX SYMPOSIUM ON VERY HIGH LEVEL LANGUAGES (VHLL)":

[code]
UNIX has long supported very high level languages: consider awk
and the various shells. Often programmers create what are
essentially new little languages whenever a problem appears of
sufficient complexity to merit a higher level programming
interface -- consider sendmail.cf. In recent years many UNIX
programmers have been turning to VHLLs for both rapid prototypes
and complete applications. They take advantage of these
languages' higher level of abstraction to complete projects more
rapidly and more easily than they could have using lower-level
languages.
[/code]

So now we understand the food chain of advanced UNIX languages:

[code]
level         languages                   analogous organism
-----         ---------                   ------------------
low           assembler (low-level PDP)   amoeba
intermediate  regular expressions         tadpole
high          C (high-level PDP)          monkey
very high     awk, csh, sendmail.cf       UNIX man
[/code]
...
t. someone who has never programmed anything before
If the compiler did that, it would have solved a lot of problems. No C, no C++, no UNIX, no Java, and no JavaScript.
[code]
After a little headscratching, I realized that the leading zeros
in my frame numbers were causing cpp to treat them as octal
values. How precious.

But still, if I say "#define FRAME 00009" then "#if FRAME==00009"
should still fire (or it should at least whine at me). Well,
00009==00009 does trigger, but so does 00009==00011.

Huh? Well, some C library thinks that the nine in 00009 isn't
octal, so it interprets it as 9 decimal. And 00011 is a fine
octal rep of 9 decimal. So, both "#if FRAME==00009" and
"#if FRAME==00011" fired and I applied two translate calls to
my object geometry. And(!), it's not that having a decimal digit
makes the whole number decimal. The string 00019 gets interpreted
as 00010 octal plus 9 decimal = 17 decimal. Lovely, not.
[/code]
If you are going to block quote something, at least say what / who it is.
You're obnoxious for quoting this guy so much.
JS is a dynamic language. It's been influenced by every other dynamic language before it, including Lisp. You're getting way too carried away with your Unix boogeyman, m8.
...
Why even have the compiler check variable names for you? A typo is a dumb mistake, and anyone should be reading their code before they deploy it anyway. Preventing dumb basic failures is not the domain of the compiler, after all.
The fact is anyone can sit around all day and do retarded things with pointers.
Variable name typos aren't the same thing at all. The compiler can't compile if you somehow place a random undeclared variable somewhere in your code; what would it do?
I'm sure you could think of at least one thing for it to do. Make it default to a machine-width number with a value of 0, for example. If it's a situation where it could be a pointer, replace it with NULL, etc. Most languages these days set it to an "undefined" or "null" value, but they are usually dynamic.
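For reference, this is roughly what JS does today, and it shows why silent defaults are a trap. A sketch of non-strict ("sloppy mode") behavior:

[code]
function f() {
  oops = 42;        // typo: no declaration, but sloppy mode creates a global
}
f();
console.log(oops);  // 42 -- the misspelling silently became a real variable
[/code]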
This is a bad thing. A language which makes it trivial to fuck up in a way that accesses random memory is a shitty idea.
How is the preprocessor supposed to know what the values of "const" variables are? In Lisp, the equivalent to that is possible, but this is C. Leading zeroes have nothing to do with octal except in the brains of UNIX weenies. Maybe adding one more zero should increase the base by one, so 019 is octal, 0019 is nonal, 00019 is decimal, and so on. If an AT&T employee put that in the C compiler in the 80s and it was part of the C standard, UNIX weenies wouldn't think there was anything wrong with it.
That broken Date in OP's post doesn't come from Lisp. The number is UNIX time. JS was influenced by Lisp, but it still sucks. JS was more influenced by Perl and it shares the Perl philosophy.
[code]
Subject: Re: My First Perl Program

I don't know, seems about par for UNIX for untested, hacked
together software, thrown together on a whim and without
planning to be forced upon the entire universe without warning.
Sounds like sendmail... or finger... or almost any other unix
utility.

> The question is not whether it is a reasonable expectation
> that you get a program right the first time when
> programming in a new language, but whether it is a
> reasonable action to force a large number of strangers to
> use it when you have already decided that it is not a
> reasonable expectation that it will work the first time.
[/code]
...
No matter what you do, it will be trivial unless you sacrifice the program's speed.
Quoting random bullshit in code boxes to give it false importance should result in bans.
...
Either:
A) You get an error and the program crashes
B) You get a NaN
The first is desirable, because summing two strings as numbers if they happen to be numbers, and returning a number, is all sorts of wrong. The second would be even harder to debug than if it just returned "face book".
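To illustrate option B: NaN doesn't fail at the point of the mistake, it flows through every later computation, so the error surfaces far from its cause. A sketch reusing the "face book" example:

[code]
var total = Number("face") + Number("book"); // NaN + NaN
total * 2;  // still NaN
total > 0;  // false -- and so is total < 0, good luck debugging that
[/code]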
Do you want the program to bounds-check arrays?
Unrelated to using raw pointers for everything. Bounds checking every access would be slow. It is better to write your programs in such a way that you never index an array. While not always practical, it usually is; for example, using map and reduce instead of a manual for loop.
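A quick sketch of that point in JS, where the iteration never touches an index:

[code]
var xs = [1, 2, 3, 4];
var doubled = xs.map(function (x) { return x * 2; });          // no index arithmetic
var sum = xs.reduce(function (acc, x) { return acc + x; }, 0); // no bounds to get wrong
[/code]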
But think of all the security flaws that can happen if people are careless with arrays! The fact is, you don't apply the same logic to arrays being unsafe as you do to pointers being unsafe. As long as it's possible to use a pointer, it will be trivial to access some random memory location.
A pointer is like a motorcycle. If you lean too much you will fall off. So don't lean too much, simple as that. (edited quote from Terry)
you lie
So PHP/JS < Python < C < Java? Whatever.
Purely in terms of type system, yes. That alone doesn't make the language better in every regard of course.
Wrong. You always benchmark your shit before you make any grand claims.
blogs.msdn.microsoft.com
www2.cs.arizona.edu
Bounds checking certainly isn't free but the cost is in almost all cases negligible.
I don't wanna try and use that because I know the type of people on this app, always tryna give me a virus.
>PHP/JS < Python < C < Java < Microsoft Visual Basic
Spoken like a true nodev.
Protip:
[code]
const d = (new Date()).valueOf();
d + 1000;
[/code]
You probably also think the difference between the [] operator and raw pointers is negligible!
This is your brain on stupid.
Spotted the redditor.
(new Date((new Date()).valueOf() + 1000));
I'm not sure why anyone would directly add a number to a date but fine.
There is no difference! It's all syntactic sugar.
Small price to pay to avoid bugs caused by implicit type conversion:
[code]
5 + null    // returns 5 because null is converted to 0
"5" + null  // returns "5null" because null is converted to "null"
"5" + 2     // returns "52" because 2 is converted to "2"
"5" - 2     // returns 3 because "5" is converted to 5
"5" * "2"   // returns 10 because "5" and "2" are converted to 5 and 2
[/code]
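The flip side is that the explicit versions stay unambiguous. A minimal sketch of stating the conversion up front:

[code]
Number("5") + 2;       // 7    -- arithmetic declared explicitly
String(5) + String(2); // "52" -- concatenation declared explicitly
"5" === 5;             // false -- strict equality skips coercion entirely
[/code]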
You have to admit that in a language like C where every bit can have critical meaning, having implicit numeric conversions everywhere is fucking stupid. I imagine they would have fixed a lot of things like that by now if not for the header model where there's no separation between new and old code.
Nice tautology, it solves nothing.
It solves a shit ton of things. Don't have to worry about type coercion constantly.
3% difference is not slow? lol
You can use map or whatever. In C, every trivial little operation requires fucking with a pointer; it cannot be escaped.
I'm referring to your statement of "strings are strings, numbers are numbers". That's a tautology. What you probably mean is "what if we actually enforce strong static typing?"
lol no. Static typing is not the same thing as not having type coercion. schema is dynamically typed but it's strict about it.
fucking autocorrect every time *scheme
This is undefined behavior. It can crash if the compiler decides to put the literal in a read-only memory segment.
char* is not read-only memory
Which is why I seem to recall the code snippet will not compile. The compiler will complain about converting a const char* to a char*.
works on my machine