C is awesome and anyone who hates it is a brainlet

C isn't great despite the fact that you can fuck up your entire system with it. C is great because you can fuck up your entire system with it.

To sum it up quite nicely: with great power comes great responsibility (don't make this a spiderman thread). C gives you all the tools you need to make your program truly slim while being just abstract enough to understand it all (try coding complex programs in assembly i fucking dare you).

Anyone complaining about C being unsafe has the same level of intelligence as people who think that owning a gun somehow means your home is more dangerous.

Other languages are great for quick scripts, but for a large program there really isn't anything that can replace C.

Attached: c_programming_language.png (1200x1276, 77.03K)

Other urls found in this thread:

sgi.com/tech/stl/drdobbs-interview.html
stackoverflow.com/questions/8529501/method-overloading-using-varargs-in-java
stackoverflow.com/questions/17728425/varargs-method-modifies-callers-array-instead-of-its-own-copy/17728529#17728529
en.m.wikipedia.org/wiki/Robert_C._Martin
pubs.opengroup.org/onlinepubs/9699919799/.

yeah i sure love writing my own data structures from scratch every time i use c, great language!

You create a thing
Its core is super optimized Assembly
Rest is C++

neat future

>I don't know a lick of C, and yet I am able to run patch and make. Fuck right off.
I don't know a lick of C
I don't know a lick of C
I don't know a lick of C

C is great but not perfect. There are features that could improve it and make writing programs more comfortable without sacrificing the level of control you have, but they will never get implemented because C is not being developed beyond some fixes or whatever.

Syntactic sugar methods, destructors, namespace-related mechanics, no headers, syntax for string/array concatenation, default values for structs/functions, optional function arguments, better variadic arguments, better type inference so you don't have to cast everything all the time (like if you send a struct to a function, you have to send (sometype){x, y, z} instead of just {x, y, z}), deprecating functions/structs without removing them (see: gives a compile warning if used), more condensed for-loop headers...

Yeah ok bud keep LARPing, C is fun but it's neither great nor practical. Go learn D.

Did the bad code monster eat your source code?

I'm the other way around. I'm a C fag and I believe having a gun in my home or your home makes my home more dangerous.

Attached: spiderman-thread.jpg (1600x1200, 295.72K)

C's also great in that it really is the "write once, run anywhere" that was the failed promise of Java. Everything has an FFI to C, and I've even started compiling my C to webassembly as an experiment in replacing javascript. One language for everything is fucking awesome.

When MS said "It's a feature, not a bug" you all mocked it, but when someone exposes C's (or Linux's) flaws then "The user is stupid."

Your cognitive dissonance shows. Newsflash: you're one of those retards you complain about.

Unlike Microsoft, C actually provides value to mankind: it is platform independent and yet low-level like no other language.

...

polnigger C LARPer: the post. i'm just going to post this every time i see one now

t. JS programmer

And instead of posting a similarly portable and low-level language you're wasting everyone's electricity and time with that post.

uhh C99 added variable length arrays n sheeit
fuck you. kill yourself so we never have to deal with the possibility of C being fucked up even more in this way. type inference in C is a terrible idea

C isn't easily portable, and especially not for any LARPer

if you spend more time arguing about what language other people should use than actually making software yourself then die

A very insightful contrarian comment about the nature of C appears in an AlexanderStepanov (C++ STL designer) interview by Al Stevens in DrDobbsJournal (3/1995) (sgi.com/tech/stl/drdobbs-interview.html):"Let's consider now why C is a great language. It is commonly believed that C is a hack which was successful because Unix was written in it. I disagree. Over a long period of time computer architectures evolved, not because of some clever people figuring how to evolve architectures---as a matter of fact, clever people were pushing tagged architectures during that period of time---but because of the demands of different programmers to solve real problems. Computers that were able to deal just with numbers evolved into computers with byte-addressable memory, flat address spaces, and pointers. This was a natural evolution reflecting the growing set of problems that people were solving. C, reflecting the genius of Dennis Ritchie, provided a minimal model of the computer that had evolved over 30 years. C was not a quick hack. As computers evolved to handle all kinds of problems, C, being the minimal model of such a computer, became a very powerful language to solve all kinds of problems in different domains very effectively. This is the secret of C's portability: it is the best representation of an abstract computer that we have. Of course, the abstraction is done over the set of real computers, not some imaginary computational devices. Moreover, people could understand the machine model behind C. It is much easier for an average engineer to understand the machine model behind C than the machine model behind Ada or even Scheme. C succeeded because it was doing the right thing, not because of AT&T promoting it or Unix being written with it." (emphasis added)

Better without code tags (these didn't give boxes like that without JS, before):

A very insightful contrarian comment about the nature of C appears in an AlexanderStepanov (C++ STL designer) interview by Al Stevens in DrDobbsJournal (3/1995) (sgi.com/tech/stl/drdobbs-interview.html):
"Let's consider now why C is a great language. It is commonly believed that C is a hack which was successful because Unix was written in it. I disagree. Over a long period of time computer architectures evolved, not because of some clever people figuring how to evolve architectures---as a matter of fact, clever people were pushing tagged architectures during that period of time---but because of the demands of different programmers to solve real problems. Computers that were able to deal just with numbers evolved into computers with byte-addressable memory, flat address spaces, and pointers. This was a natural evolution reflecting the growing set of problems that people were solving. C, reflecting the genius of Dennis Ritchie, provided a minimal model of the computer that had evolved over 30 years. C was not a quick hack. As computers evolved to handle all kinds of problems, C, being the minimal model of such a computer, became a very powerful language to solve all kinds of problems in different domains very effectively. This is the secret of C's portability: it is the best representation of an abstract computer that we have. Of course, the abstraction is done over the set of real computers, not some imaginary computational devices. Moreover, people could understand the machine model behind C. It is much easier for an average engineer to understand the machine model behind C than the machine model behind Ada or even Scheme. C succeeded because it was doing the right thing, not because of AT&T promoting it or Unix being written with it." (emphasis added)

I said better varargs, because C's are really clumsy. To begin with you have to manually provide as an argument how many items you're sending.

No it isn't. It's fucking retarded to have to cast into something when it's obvious what should happen. For example this code is invalid:
Even though this is not:

Similarly for this function
You will have to do
instead of just

this is solved in c++11
works exactly like you showed in your post.

I don't want C++ though. It's like you're telling me to move to a shitty new house in a nigger neigbourhood because the house has a better toilet like I wanted.

Forth isn't great despite the fact that you can fuck up your entire system with it. Forth is great because you can fuck up your entire system with it.

To sum it up quite nicely: with great power comes great responsibility (don't make this a spiderman thread). Forth gives you all the tools you need to make your program truly slim while being just abstract enough to understand it all (try coding complex programs in assembly i fucking dare you).

Anyone complaining about Forth being unsafe has the same level of intelligence as people who think that owning a gun somehow means your home is more dangerous.

Other languages are great for quick scripts, but for a large program there really isn't anything that can replace Forth.

Elixir/OTP isn't great despite the fact that you can fuck up your entire system with it. Elixir/OTP is great because you can fuck up your entire system with it.

To sum it up quite nicely: with great power comes great responsibility (don't make this a spiderman thread). Elixir/OTP gives you all the tools you need to make your program truly slim while being just abstract enough to understand it all (try coding complex programs in assembly i fucking dare you).

Anyone complaining about Elixir/OTP being unsafe has the same level of intelligence as people who think that owning a gun somehow means your home is more dangerous.

Other languages are great for quick scripts, but for a large program there really isn't anything that can replace Elixir/OTP.

Although I have to correct myself, Erlang/OTP really can't fuck up its entire system, that's the point.

Oracle Java isn't great despite the fact that you can fuck up your entire system with it. Oracle Java is great because you can fuck up your entire system with it.

To sum it up quite nicely: with great power comes great responsibility (don't make this a spiderman thread). Oracle Java gives you all the tools you need to make your program truly slim while being just abstract enough to understand it all (try coding complex programs in assembly i fucking dare you).

Anyone complaining about Oracle Java being unsafe has the same level of intelligence as people who think that owning a gun somehow means your home is more dangerous.

Other languages are great for quick scripts, but for a large program there really isn't anything that can replace Oracle Java.

C doesn't take your freedom to write platform-independent code. What's your complaint, again? That you don't know how to use the language correctly?

go back to C++ and stay there
who the fuck cares? varargs are a gimmick in any language

C has enough features the problem is the bad design that makes it too hard to write reliable code let alone remember the syntax
>inb4 muh (*)(*nigger((*jew*)*))[123][123] is soo easy to remember if i just practice every day making 4d arrays of function pointers to 2d arrays of function pointers

t. doesn't know how to use C correctly

Argument not found

Just because you don't need it doesn't mean I don't. It doesn't even need to be complicated, I could technically make do with sending a pointer to an array literal along with an item count, but the point to begin with was that if the language supported varargs properly, doing that would be a lot more comfortable and simple.

1. Still not an argument about the language but an ad hominem. 2. You still didn't name a comparable alternative.

you don't need varargs. literally nobody has ever needed varargs. varargs is fucking terrible in every language anyone uses. it causes problems all the way up to the type level

I program C fuckface, I'm not arguing against it, I'm stating facts about it. Literally every other language is portable and better at it than C. You can't have portability and low level at the same time, which is exactly how C fails. You can write portable code that's efficient due to use of low level mechanics (as opposed to something that would be efficient in _any_ language) in C but few people know how to do it.

I agree with your general premise, but tell that to LLVM or Go's pseudoassembler

Okay. Then what do you do when you want to call a function across several objects? Let's say you want to find the center position of N objects in space, or a bounding box that surrounds all of them. Or what if you want to merge several objects for some purpose. Are you going to write those calculations manually every single time you want to do it rather than just sending them to a varag function that does it?

You're asking me to keep writing shit like this:
float x_min, x_max, y_min, y_max;
Vec2f positions[] = {{1, 2}, {6, 2}, {0, 6}, {6, 6}};
x_min = x_max = positions[0].x; y_min = y_max = positions[0].y;
for (int i = 1; i < 4; i++) {
    if (positions[i].x < x_min) x_min = positions[i].x; if (positions[i].x > x_max) x_max = positions[i].x;
    if (positions[i].y < y_min) y_min = positions[i].y; if (positions[i].y > y_max) y_max = positions[i].y;
}

Not if you null terminate. The same thing happens if you pass an array instead.

...did you not know you could pass arrays?
>Vec2f center = findCenter(4, positions);
wow so difficult

Did you even read the argument at all?

For comparison, here's how it works in python:
def func1(*args):
    for a in args:
        print(a)

def func2(*args):
    return func1(*args)

myarray = [i*i for i in range(6)]
func2(*myarray)
Why is it "a lot more comfortable and simple" if we just get rid of all the fucking dereference operators?

HolyC is best C.

Attached: 1535800747654.jpg (1933x1795, 1.92M)

Maybe "a lot" is subjective, but I hate having to deal with the argument counter and casting and shit manually. To me
would be a lot more clear and easy to use than
There are two extra components that are completely useless and make it harder to read and more tedious to manage.

Come to think of it maybe this doesn't even count as "variadic arguments". I imagine them as the same thing.

This. I purposefully seek and disable any program not written in C or C++ or anything that needs memelibrary to function.

This has to be the most retarded argument I've ever read

Attached: 19ed0b73cca0bfb30a5c6e3e84ba0a46dc6cef89b28a688302995477749dd3ae-b.jpg (570x572, 24.31K)

as the other guy said, literally why can't you just pass an array or other list structure?

yes it would stop you from having to pass an array length, but at what cost? what if you iterate over the varargs the wrong way? what broken states or memory corruption can result?

varargs are absolute shit in python as well. anyone who thinks varargs matter are to be suspected as n00bs

Attached: a1f8071f84a221a383b6ed98d1eeda4e8752a9cb9cf8849496f6c37d136fb156-b.jpg (1252x1252, 152.01K)

Oh and also now I can't randomly access my arguments unless I waste cycles counting arguments which is just wonderful.

I could, as mentioned, which has its own problems.

Wrong how? The number of items should be available automatically based on how many items I send; that's what I mean by "improved varargs". If C arrays had length information then that would basically solve half of the problem. Obvious type inference would solve the second half.

...

...

The answer is that the caller passing the number of arguments is the best solution; the problem is that I have to pass this number everywhere, which increases code clutter when this job could easily be delegated to the compiler at absolutely no cost to the programmer. You don't get this because your code is unmaintainable garbage and you would fucking die if you ever had to write something of value.
Watch Uncle Bob's clean code so you don't destroy anything else with your arrogant retardation.

Attached: IMG_20180726_195846_018.jpg (540x641, 47.89K)

Only C weenies consider that a feature.

C weenies have the same level of intelligence as people who think a browser is great because you can get exploited by viewing a JPEG.


That mentality was created by AT&T in the 80s. They blamed the user for every single mistake until they believed they really were responsible for every bug. UNIX weenies think it's their job to fix bugs in AT&T's broken software that they paid for.


C takes away value. Computers are less reliable, software is buggier, programs need tens of millions more lines of code, more than a half century of research is ignored because it won't work with C, and so on. It only provides value to people who are paid to fix bugs and enjoy that mind-numbing bullshit more than actually improving software.

Bullshit. Ada, Forth, PL/I, and many other languages can do the same "low level" things you can do with C, if you use the C definition of "low level" as being able to work with pointers.


It works in Common Lisp and Scheme. Common Lisp provides a &rest argument, which is passed as a list.

That's true. Low level depends on how a specific machine architecture works and C is based on how the PDP-11 works. Most programming languages can use the features of the machine natively, but C requires you to emulate a PDP-11 on top of the real machine. Fortran and Pascal on Lisp machines use Lisp strings, arrays, integers (bignums), floats, and complex numbers for their own data types. C on Lisp machines has to emulate null-terminated strings on top of hardware already designed to work with strings and pretend Lisp types are machine words.

Low level depends on the hardware so it can't be portable except to different models of the same architecture. Flat memory pointer arithmetic is the only "low level" thing C gives you, which works on the PDP-11 but not x86 unless you ignore a major part of the memory model (segmentation). On a PDP-11 and x86, overflow sets the flags. On MIPS, it can trap or be ignored depending on the instruction. On a Lisp machine, it converts the number to a bignum.

This poor user tried to use Unix's poor excuse for DEFSYSTEM. He is immediately sucked into the Unix "group of uncooperative tools" philosophy, with a dash of the usual unix braindead mailer lossage for old times' sake. Of course, used to the usual unix weenie response of "no, the tool's not broken, it was user error", the poor user sadly (and incorrectly) concluded that it was human error, not unix braindamage, which led to his travails.

then do, faggot
#define myvariadicfunction(...) _myvariadicfunction(sizeof((int[]){__VA_ARGS__})/sizeof(int), (int[]){__VA_ARGS__})
or better yet
struct my_list { int n; TYPE *data; };

Having to type pointlessly literal extra shit when there's really no reason to is exactly what makes people prefer other languages.

...

After you've learned how it works, you begin to want to stop having to type all that shit all the time.

You're putting more importance on teaching Pajeets than letting experienced programmers do their work effectively.

t. I have never had to debug anything ever
C macros are absolute garbage that make things difficult if not impossible to debug fam.

Attached: 0co08OC.jpg (500x515, 48.58K)

You also still have to populate that struct retard.

if typing speed is your bottleneck when programming, you are doing something very wrong. In any case, if done properly, it should require very few characters.

Yeah. Write a macro to create the struct, and then pass the struct to your function. Dead easy.


I debug macros all the time. Here's my standard script:
gcc -E $1 -o - | sed '/^#/d' | clang-format > ${1//.c/.i}
Either look at the .i directly, or compile it and step through it in a debugger. This is directly comparable to compiling with -S, and looking at the asm.

Gas.

I swore someone would complain that I used both gcc and clang
Anyways, how do you do this in posix shell? bonus points if you don't use external programs.

Nigger what the actual fuck is this are you retarded.
God help you if this is an acceptable way to debug code to you.

Attached: 5f01a1c00f49e6732a1ffd4ccb583bab4417795d62af7f7d6ce8efc3970496d7-islam.png (400x400, 347.18K)

This is jsfag level autism.

Holy shit this thread is filled with butthurt.

...

it's unlikely you care about cycles when you're fucking around with varargs

false. if we added every feature some fags on the internet wanted, we'd have JS, Java, or C++
u fuckin wot m8. you were just subtly trolling all along?

meanwhile, varargs is shit and full of edge cases in every other language as well as in C.
for example it makes it non-obvious what code should do in these 2 examples:
stackoverflow.com/questions/8529501/method-overloading-using-varargs-in-java
stackoverflow.com/questions/17728425/varargs-method-modifies-callers-array-instead-of-its-own-copy/17728529#17728529
the only thing being shitted up here is the language

Program them once, reuse them later.

No faggot, you shouldn't need anything more than your code and GDB.

...and gcc ofc. sed and clang-format are niceties, feel free to just run gcc -E.

So let's not implement anything ever then ig? Why aren't you writing everything in microcode?

It's not just about varargs, it's about your masochistic attitude to software development.

Educate yourself nigger.
en.m.wikipedia.org/wiki/Robert_C._Martin

The problem here is overloading, not varargs. But maybe you would know that if you knew what you were talking about.

Attached: YmluYXJ5OjEyMTM1MTE.jpg (400x225, 29.1K)

No, you should never need your compiler for debugging runtime.

You need a compiler for compiling, retard.
enjoy having slow code.


kys

Yes, and that's what they're for, not debugging runtime.

Enjoy autistically optimising shit only to find your optimisations are useless if not detrimental on other CPUs and your optimisations are slower than what the compiler produced anyway because you don't actually know any assembly. Add to that the fact you end up producing an unholy mess of unmaintainable code in an attempt to optimise some mundane shit that probably isn't a hotspot because like I said: You can't code for shit.

Git gud fgt

Attached: 7a0671000315f682499dd638ba0419fcdd2101ce325042a75f7bf50dd035c528-b.jpg (484x577, 44.97K)

Then how about you stop using Java and make a good varargs feature instead.

I'm a little confused about the exact definition of varargs at this point, but what I want is basically to send an array of things without having to type pointless shit every time. This is what my perfect "varargs" looks like (function header syntax up for debate):
void doStuff(int foo, int stuff[int stuff_count]..., float things[int things_count]...) {
    for (int i = 0; i < stuff_count; i++) { /* ... */ }
}

stop trolling

Great argument.

You mean mildly changing how a function header is parsed. You have an interesting definition of "fundamentally changing the language".

Ada has advanced support for type checking: you can create types with specific ranges (e.g. an integer with values from 0 to 12), you can write modules, you get exceptions (instead of error codes the system hands you and that you can silently ignore until the software crashes), and 'generics', which means you can write a data structure (like a list) once and have it work on different kinds of data: integers, floating point numbers, strings. So instead of writing one function per data type, you write just one, and the language gives you run-time checking and many other features on top.
The kicker of Ada is that it's a higher level language than C and about a million times safer. The GCC compiler framework generates the fastest code for C, but GCC also supports Ada, and the code generated for Ada is nearly as fast as the code for C. It also supports every architecture/platform that is supported by GCC.
Why do programmers hate Ada? C programmers in the 80's would use bullshit excuses like the compilers being expensive or the language being too complicated (it's much simpler than C in day to day usage). The real reason is that they're just too lazy to learn Ada.
The reason for 'bloat' is that C makes it difficult to write good, maintainable, secure code. To you morons everything that isn't C is "bondage-and-discipline". Imagine the horror of using 1950's programming language features and practices.

I'd probably be more willing to use Ada if it didn't have such a linguified syntax.

yeah sure it is great and "well designed ;D"

varargs is just C programmers knowing how function calls work under the hood and wanting access to the arguments they know can be passed in any quantity but in a portable way. The difficulty is not everything is passed as a single word size so the varargs implementation needs to know the type to know how many words were used to pass it. It's trivial to understand if you know assembly but the edge-cases are confusing as fuck if you're coming from a HLL background. It's absolutely not meant for passing arbitrary length arrays, though.

Are you retarded? "${1%.c}.i"

When I was a newbie, I was terrified of pointers and type conversions in C because I didn't understand them. Now after a couple of years of experience, I effortlessly use pointers, function pointers and everything related.
Pointer arithmetic is a boon, no matter what anyone says

C is just so amazing, I love it

Attached: 1514940247283.gif (460x351, 309.06K)

In that case I guess I don't care about varargs. I can imagine things that being able to push arbitrary things into a function would make more convenient, but I wouldn't feel comfortable doing that because it makes it hard to see and control exactly what you're doing.

Typo that I didn't bother to delete and repost for.

what the fuck nigger, i just realized you aren't even asking for varargs in this entire thread. you can already do variable length arrays in C99:

#include <stdio.h>
void nigger(int numnigs, int nigs[numnigs]) {
    for (int i = 0; i < numnigs; i++)
        printf("%d\n", nigs[i]);
}

Yes, I already mentioned that method and explained why I hate it in this thread.

TL;DR: I already know what's happening, so I no longer want to be forced to type extra shit I don't need to because it makes writing and reading and maintaining the code slower and more annoying.
I really do not understand this idea of "be as literal and verbose as you possibly can even though it's obvious what should happen if you wrote less".

${parameter%word} is POSIX. Try man dash next time.

But that's what I said, nigger? He used ${var//pat/repl}. Also, dash isn't entirely POSIX, it's better to look at pubs.opengroup.org/onlinepubs/9699919799/.
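
The difference, for anyone following along: ${var//pat/repl} is a bash extension, while suffix removal with ${parameter%word} is in the POSIX spec. A tiny sketch:

```shell
#!/bin/sh
# POSIX suffix removal: ${f%.c} strips the shortest trailing match of
# ".c", unlike bash's ${f//.c/.i} pattern substitution which is not POSIX
f="foo.c"
echo "${f%.c}.i"
```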

function pointer syntax is cancer.
It is only defined for doing so inside of an array plus one past the end.
How can you enjoy using that piece of unix weenie garbage?

Why does literally everyone here hate Unix?

How can you not? I think the people who created, maintained, and standardized the C language were intelligent and thoughtful. Certainly their work has seen more success than anything preceding it. But all you want to do is argue opinion, because you don't have a leg to stand on when all computers today operate nearly identically regardless of the programming language used.

They are likely a bunch of try-hard academics who realized that their MIT CS degree doesn't mean jack in the real world of business and application. They want everyone to work on their LISP machines because it favors their worthless education, regardless of the debilitating conditions that LISP machines impose on users.
They also could be die-hard Stallman losers who cannot form a single, solitary thought of their own without his approval. They agree with Stallman that the GPL should remove all rights from a person over their labor, and that software cannot be sold for profit.

If that was true why is function pointer syntax so poor? Why doesn't the C language include lambdas? Why doesn't the C language support interrupts?

The GPL does not prevent you from charging for your software.

>why is function pointer syntax so poor?
Opinion.
>Why doesn't the C language include lambdas?
Opinion.
>Why doesn't the C language support interrupts?
Opinion.
>The GPL does not prevent you from charging for your software.
Yes it does. It only grants you license to sell physical copies or technical support. It does not allow you to sell the copyright to your software while it is licensed under GPL.

...

C was awesome and there are some good reasons to hate it.
Prove me wrong.

because it's garbage?
t. gentoo lUNIX user

What would that even mean? Some ad hoc bullshit instead of setting up the function pointers yourself? How many instances of interrupt handling code do you have that merits a language feature (assuming the feature even accomplishes anything, which i doubt)

I don't argue opinion. The GPL regulates how source code is distributed, therefore, it usurps control of the copyright to the software.

Function pointer syntax is fine; it's when you abuse function pointers inside arrays inside structs that things get ugly. That's because of one of C's best features: ugly things look ugly.
I'm not sure what you think C is missing here. Just name your function _tmp123 or something, problem solved.
UNIX supports signals, which behave like interrupts.

to go with your gun analogy, it's only more dangerous to have a gun in your house if you don't know how to use it properly

Yes, but if you own it you have full control over the copyright. You always have the ability to change the license, make it proprietary, or sell the copyright to someone else.

C is turing complete, so yes it can emulate any feature from any other turing complete language. The problem is that it is going to be more verbose.

You always have the ability to ... sell the copyright for it to someone else.
Not if the license is GPL.

If you own the copyright on the code you can. If we want to pretend in your silly rule that you can't, then you can just relicense the code to be proprietary and then sell it.

If char count is what matters, then name your function "a" or something. This isn't exactly brainfuck level, turing tarpit style verbosity. What I don't get is why you think lambdas are the pinnacle of language design.

Lambdas are not just separate functions; they share the scope where they are defined. The reason this matters is that I don't want to create a trivial function for every HOF I want to use.

That's exactly what the GPL says.