Which one and why?
Each one has its place
You're allowed to read more than one book.
t. guy who has gone through both
Also go watch the original SICP lectures on youtube
just get both
Both obsolete.
Get a book 'bout Rust. Unironically.
I'm not a sjw soyboy who has nothing at all against shitty ersatz ingredients like high fructose syrup or hardened palm oil, but for some reason hates gluten and lactose with a passion because xe was conditioned to.
Stay unemployed, LARPer.
tbh
fam
ok
Are these the only two 10/10 programming books out there? If not, what others are there?
why
can someone post the books in OP and not this?
Too bad Dennis Ritchie didn't give lectures (at least none I'm aware of).
Pretty sure they're in the gentoomen library.
Because Forth> C >>>> Everything else
They are all bad. All are absolutely full of shitty runtime errors because of things that are trivially statically checkable.
if you have to ask, you probably are retarded.
tcpl should come first and is more of a tutorial and style guideline
sicp takes a while, and offers challenges you'll find tricky on the first read.
SICP is the greatest book I've ever read, ch4&5 blew my mind.
SICP teaches you algorithms, imperative code, functional code, logic programming, lazy programming, discrete logic, language design, interpreters, compilers, register machines / instruction set simulation, assembly language.
SICP is a fucking joke. The hardest parts of it are the few cases where the order of evaluation matters and the ass shit Lisp syntax. Only a legit mentally retarded code monkey would struggle with its math too. It's clearly a /g/-tier book. Also, FP is a meme for people who aren't intellectually capable of dealing with program states.
K&R is the designated shitting street of programming books.
...
Guaranteed you never made it past fizzbuzz.
I don't use functional programming languages, but it's true and not controversial that excessive state causes bugs. It's one of the reasons global variables are discouraged.
The more the execution of the body of a function is affected by state outside it, the harder it is to reason about it. A purely functional function takes it to the logical conclusion and only allows execution to be affected by the arguments explicitly passed into the function.
Whoa there. There exists an actual pajeet edition of the book, you know. No need to insult the normal version.
What would he lecture about? How to program like pajeet?
Painfully true. You've never had to deal with a mess of global arrays in Tcl. It's enough to make you want to ban the use of state permanently.
What useful and/or widely deployed things were programmed in Lisp again?
This is a non-sequitur, and therefore not an argument.
Although, if you really want to know, tons of expert systems are written in Lisp, as are AutoCAD plugins, or GIMP plugins. Those are widely deployed, therefore fulfilling your request for useful or widely deployed.
lol you passing in 50 variables into every function... oh wait ding ding ding I think we have an OOP tard.
The airline search and pricing software used by most major US airlines and lots of sites like Orbitz is written in Common Lisp. Most people who have taken a commercial flight in the US or on a US carrier in the last 2 decades have used the software either directly or indirectly. That's millions of flights affecting millions of people involving billions of dollars.
It runs great on even the smallest 8-bit computers and microcontrollers.
left if you're a newbie (nothing wrong with that, reading this will level you up) and right if you're intermediate
I learned C with K&R and my Amiga 500. Worked out pretty well. Getting paid well now. Knowing how a computer actually works gets you money. Don't believe the "fearless concurrency" and "if it compiles then it's error free" hype from the new Rust kids. Compile-time checks are not a replacement for actual understanding and they are not a free pass. The restrictions that Rust imposes will force you into non-optimal code. From what I've seen, it encourages an arrogant attitude towards memory, memory allocation, and understanding how this impacts performance and stability.
If you'd need to pass 50 variables into each function your program is already fucked. Don't do that. Keep your functions small, reusable and understandable.
The C book is just a basic reference guide. SICP is clearly superior.
Lisp if you're a soyboy hipster who wants to wank over his "superior" code; C if you want to actually get shit done.
Good old days, in a sense. But in another sense not so much - it was way more difficult to get things like software, books, or even information on what to learn and what you need to learn it (such as books and software). I would have loved to have a C compiler at my disposal in my Amiga days, but never had one because as mentioned above information and software wasn't just a click away as it is these days.
Both. K&R will teach you how to get work done, SICP will teach you how to not shit up your code. They are different books for different purposes, one is a language reference, the other is a textbook.
If you need to pass that many variables you have already fucked up. That said, Racket has a great solution to global variables: parameters.
docs.racket-lang.org
Basically, you have a "global variable" which you can mutate temporarily, and once the scope is left it reverts back to its original value. For example, the 'current-output-port' is basically stdout, but you can parameterize it to a file, call a writing procedure, and when it's done the parameter reverts back to stdout.
KnR C is like 100 pages. Why wouldn't you read it, even if just to mock other people who do or don't read it.
Mine is 270 pages. But yeah, it's short. Just read it.
t. nodev
It's deceptively short though, as working through it thoroughly can take quite some time. And even then returning to it after some time may have you notice and learn something you didn't before. It's a part of the book's "magic" I guess.
Functions stay small and understandable if you aren't memed away from globals.
It's kinda interesting but it's hardly the same thing.
Stop projecting, Lispfag.
You've had 60 years and the best thing to come out of your shitty language is a bloated text editor made by some pedo commie that eats his own toejam.
No, because if you count all the explicit parameters and used globals and come out at 50, it's much too complicated.
You could have mail ordered a C compiler if you didn't have a nearby shop that sold Amiga software. There were always tons of ads in magazines, with phone number to call and/or mailing address to send off for a catalog. I ordered some Fred Fish disks this way to get PD software, since I didn't have a modem.
Computer books were pretty easy to get this way too, at least for the ones still in print.
So basically you really just needed a magazine subscription to keep informed back then.
1st will become obsolete (and already is to some extent)
2nd is timeless, as it's not about any specific programming language.
also if you're a competent programmer, you should be able to pick up a language simply by checking out the official specification/docs. PL books are usually bloat, unless the language comes with some unique concepts (C is not one of them).
because of C, or because another book made it obsolete?
both, but mostly because of C
Why do you say this ? And what do you see replacing C ?
Not baiting you or anything it's just that I see people along your line of thought all the time but I never engage in discussion.
C is obsolete enjoy writing programs with heaps of vulnerabilities
Maybe for you brainlet
Rust
Ha! Nope.
you don't need any spaces before '?'.
C is a footgun, and takes up too much developer time on mundane shit.
C++ (very unlikely but possible if they manage to finally unfuck it in another 10 years), Zig, Rust, or nothing (kind of likely; that would also mean we as a civilization would be sent back several decades)
…or maybe also a next version of C which will be unfucked enough to be viable again.
after all, the latest C standard is somewhat less of a shit bomb than the first versions were.
Maybe he's French. They put spaces before '?' and '!' (go figure).
If they weren't, how could they justify the price? At least 8 out of 10 of those yuge 800+ page textbooks are >60% bloat, but thanks to their intimidating size they can keep charging $100 or more per copy. Nobody would pay 100 bucks for a 300 page book (even if it had exactly the same content as the 800+ page one but none of the bloat, saving the reader lotsa time, ironically).
oh man… there's a lot of things that get bought all the time despite them being worthless.
you just made me sad again by reminding me about this.
Forth is like everything I like about LISP (Polish notation, self-bootstrapping, expandability) and C (close to the metal, fast, light on memory) rolled into one.
Is functional C even possible? Maybe lambdas and closures are not possible, but passing pure data over everywhere should be possible.
#include <stdio.h>
#include <stdlib.h>
#include <time.h>
/* (a fourth #include was lost in posting; presumably the header
   defining MAKE_REDUCE, make_array, map_int, filter_int) */

int my_reduce_func(int sum, int item) { return sum + item; }
/* reduce_int created by MAKE_REDUCE(int, reduce_int); predefined */
int my_filter(int num) { return num > 25; }
void my_map(int num) { printf("%d ", num); }

int main(void)
{
    srand(time(NULL));
    printf("%d\n", reduce_int((int[4]){1, 2, 3, 4}, my_reduce_func));
    map_int(filter_int(make_array(int, 20, rand), my_filter), my_map);
    puts("");
}
Possible, but not probable.
The issue with C is that it's very difficult to make safe abstractions. For example, if you want to make a generic list / vector / map / whatever, you have to make the element type void*. This means you are going to end up manually casting all the fucking time, which is very error-prone.
I like Go better.
What's Go's runtime overhead like? Wondering mainly about time. I'm curious which meme langs would be suited for system utils.
It's relative. Think of Go like a statically linked Java.
What about pdf related?
Implemented a "game of life" program in C. Honestly, menial stuff like facilitating a reasonable way to input the initial state and displaying frames took way longer than the actual logic (the state transformation function I wrote worked flawlessly the first time around). Would Lisp dialects alleviate the tinkering with getting the I/O boilerplate to work satisfactorily in comparison to C?
any other language will be better than C for this.
Including assembly?
There's no such language.
If you mean all of the assembly languages, it surely depends on the architecture.
And, most of the time, however funny, you still have less Undefined Behavior when writing in some assembly. Go figure.
just lol
Every time I write a program in C, I have to rewrite half the functions from the Scheme run-time before things become bearable.
yeah. just generate code at runtime.
Not recommended, something along the lines of this (needs changes to work)
void *lambda(void *fun, void *dat)
{
    char code[] = {
        0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
        0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
        /* 0: */ 0x48, 0x8d, 0x05, 0xe9, 0xff, 0xff, 0xff, /* lea -0x17(%rip),%rax */
        /* 7: */ 0x4d, 0x89, 0xc1,                         /* mov %r8,%r9 */
        /* a: */ 0x49, 0x89, 0xc8,                         /* mov %rcx,%r8 */
        /* d: */ 0x48, 0x89, 0xd1,                         /* mov %rdx,%rcx */
        /*10: */ 0x48, 0x89, 0xf2,                         /* mov %rsi,%rdx */
        /*13: */ 0x48, 0x89, 0xfe,                         /* mov %rdi,%rsi */
        /*16: */ 0x48, 0x8b, 0x38,                         /* mov (%rax),%rdi */
        /*19: */ 0x48, 0x8b, 0x40, 0x08,                   /* mov 0x8(%rax),%rax */
        /*1d: */ 0xff, 0xe0,                               /* jmpq *%rax */
        /*1f: */ 0x90,                                     /* nop */
    };
    char *x = palloc(64);
    void *p1 = dat;
    void *p2 = &fun;
    memcpy(x, code, 48);
    memcpy(&x[0], &p1, 8);
    memcpy(&x[8], &p2, 8);
    return &x[16];
}
SICP is a good primer on base computer science concepts, and how to express computations in small, abstract units, and is applicable to all languages. K&R C is a good introduction to C, though it's insufficient on its own (and full of a bunch of flaws by necessity; tons of the examples are vulnerable to buffer overflow and other handfuls of errors because the proper error checking code would at least triple the sizes of the examples). C Programming: A Modern Approach is probably the best book to get into C these days.
SICP is still gold.
for each one of them, there's only 1 architecture.
and even if you finish this book you will still write vulnerable code in C.
Writing bug-free code in C is hard as fuck, and if you want to do it you'll spend all your time debugging memory corruption stuff.
The only thing that is trivially type checkable is either 'hello world' or sordid Java-tier shit (and even then it's not really type-checked; it's more of a mandatory lint). Actually useful typing goes beyond introductory CS/software engineering courses.
Is the memory allocated by this function executable?
This is just wrong.
it's my own function, but yes.
I wrote a quick & dirty interpreter for making direct calls to C functions.
Decided to support closures in the language, the interpreter here is fairly limited, but demonstrates "closures" in C.
If you want to test it, you need my symbolic expression library.
code here: pastebin.com
This was the snippet I should have posted.
void *palloc_heap = NULL;
void *palloc_ap = NULL;

void palloc_init()
{
    palloc_heap = mmap(NULL, 1024*1024, PROT_EXEC | PROT_WRITE,
                       MAP_PRIVATE | MAP_ANONYMOUS, 0, 0);
    palloc_ap = palloc_heap;
}

void *palloc(size_t t)
{
    if (!palloc_heap)
        palloc_init();
    void *x = palloc_ap;
    palloc_ap += t;
    return x;
}

void *lambda_dispatch(ast *exp, void *a0, void *a1, void *a2, void *a3, void *a4)
{
    void *new_env[5] = { a0, a1, a2, a3, a4 };
    void **old_env = runtime_env;
    runtime_env = &new_env;
    void *val = ast_exec(exp);
    runtime_env = old_env;
    return val;
}

void *ev_lambda(ast *exp)
{
    char code[] = {
        0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
        0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
        /* 0: */ 0x48, 0x8d, 0x05, 0xe9, 0xff, 0xff, 0xff, /* lea -0x17(%rip),%rax */
        /* 7: */ 0x4d, 0x89, 0xc1,                         /* mov %r8,%r9 */
        /* a: */ 0x49, 0x89, 0xc8,                         /* mov %rcx,%r8 */
        /* d: */ 0x48, 0x89, 0xd1,                         /* mov %rdx,%rcx */
        /*10: */ 0x48, 0x89, 0xf2,                         /* mov %rsi,%rdx */
        /*13: */ 0x48, 0x89, 0xfe,                         /* mov %rdi,%rsi */
        /*16: */ 0x48, 0x8b, 0x38,                         /* mov (%rax),%rdi */
        /*19: */ 0x48, 0x8b, 0x40, 0x08,                   /* mov 0x8(%rax),%rax */
        /*1d: */ 0xff, 0xe0,                               /* jmpq *%rax */
        /*1f: */ 0x90,                                     /* nop */
    };
    char *x = palloc(48);
    void *p1 = exp;
    void *p2 = &lambda_dispatch;
    memcpy(x, code, 48);
    memcpy(&x[0], &p1, 8);
    memcpy(&x[8], &p2, 8);
    return &x[16];
}
I'll post a working demo after din-din.
Now you understand it. A well-written C book would have been very bad news for C and RISC too. C with proper error checking code would be much bigger and slower than the same program in other languages, and it would still be buggier. RISC running that program would be slower than hardware designed to do the checks in parallel, like the Lisp machines. C would be slower (but safer) on a Lisp machine because it has to execute redundant code to do things the hardware already does.
Subject: why Unix sucks
Some Andrew weenie, writing of Unix buffer-length bugs, says:
> The big ones are grep(1) and sort(1). Their "silent
> truncation" have introduced the most heinous of subtle bugs
> in shell script database programs. Bugs that don't show up
> until the system has been working perfectly for a long time,
> and when they do show up, their only clue might be that some
> inverted index doesn't have as many matches as were expected.
Unix encourages, by egregious example, the most irresponsible programming style imaginable. No error checking. No error messages. No conscience. If a student here turned in code like that, I'd flunk his ass.
Unix software comes as close to real software as Teenage Mutant Ninja Turtles comes to the classic Three Musketeers: a childish, vulgar, totally unsatisfying imitation.
Every fucking time you post. You never include a source or attribution.
WHEN YOU BLOCK QUOTE SHIT ESP AS PART OF AN ARGUMENT SAY WHERE ITS FROM FUCKYOU
Pretty cool. I have also tried writing my own JIT compiler before.
Some ideas. You might want to make a code generation API. It's like an assembler: you just make calls such as emit(opcode, arguments...) or opcode(arguments...) and it accumulates the output in a dynamic array. You can use regular memory for this part. Once you're done, you transfer the whole thing to the executable page and turn off its write bit.
it's closer to a direct-threaded interpreter, like Forth.
Here's the code archive (.tar.gz) in base64.
pastebin.com
Use cat | base64 -d | tar zxf -.
Pic related is the only way to really learn how computers actually work (especially for beginners), as it actually teaches you how one is built, and in doing so subtly introduces basic programming concepts instead of throwing a bunch of abstract, outdated information at you a la SICP and the C book.
look at that kid go, programming his abacus
surely this is not cross platform anymore?
why bother with C at this point if you can build directly on top of your assembler of choice?
Of course it's not cross platform. Assemblers don't exist at runtime.
How (if at all) is it better than pic related?
Rustfags insulting C.
Juliafags insulting R.
PHP and JS shit everywhere
It's trivial to rewrite on other platforms.
anyhow, my code is more novelty than novel.
pairing a function pointer with data should be done with a struct.
HOPs in C are fine with a little boilerplate and pointer casting.
it's the same approach used by every library with callbacks.
typedef struct vproc1 { void (*p)(void *, void *); void *d; } vproc1;

void vproc1_call(vproc1 *x, void *v) { x->p(x->d, v); }

void for_each_array(vproc1 *p, void **x, int s)
{
    int i;
    for (i = 0; i < s; i++)
        vproc1_call(p, x[i]);
}
I've spent loads of time trying to extend C using lowlevel hacks and it always falls short of doing it the dumb way.
I've used the codegen approach to make methods work like x->meth(...) instead of x->meth(x, ...), but that's runtime overhead for syntax sugar.
same thing with coroutines. you might think the posix getcontext/setcontext are an option, but the performance is absolute shit compared to a callback queue.
If you try to make C elegant you waste time.
tldr; if closures, objects, or coroutines are how you model your software, trying to adapt the style in C will result in worse performance and infinitely more bugs.
FTFY
Any time you see someone talk about "C" being replaced here it is Rustcuck talking to you.
maybe in your dependent typed haskell with 30 extensions or shitty ocaml or whatever hipster bullshit system you use
heh, I agree.
the software that made C popular was designed to discourage use of C (sh, lex, yacc, awk, sed).
C was for compiler construction, and is more of a UNCOL paired with an ABI.
Why did it catch on as *the* systems programming language?
Why not Algol? Bliss? Pascal?
My guess is ken's cc was the most studied piece of "real world" code, whereas other languages were fucked over by ivory-tower committee-driven design.
Never read that one.
I should also point out that I was referring to how good the books are in terms of CS introduction, rather than teaching coding style.
As for coding style, SICP is much better for containing proper programming vocabulary and ideology.
In my opinion, one should ideally read The Elements of Computing Systems first, then SICP if he finds it proper, then whatever he finds necessary if he does so.
They used C because it was the only compiled language guaranteed to exist on Unix. C's poor support for data structures makes it a bad language for compiler construction.
It's because of the PC and Windows. These also made x86 the most popular architecture even though 68K and ALPHA were faster at the time.
Because Microsoft chose to write Windows in C, which made other languages on Windows second-class. Microsoft had a C compiler. If the Mac was more popular than the PC at the time, we would probably be using Pascal. They were talking about making a successor to Algol 68 in the 80s. If IBM went with CP/M, we would probably be using PL/I. Microsoft killed classic VB when it was their most popular language. Those all failed for business or "political" reasons, not technological ones.
Most early 80s programming books don't mention C, and as the 80s went on, C started showing up gradually. Most C code was written after ANSI standardization in 1989. There were a lot more active ANSI and ISO standards groups back then (Pascal, PL/I, Algol, APL, Basic, Lisp, Prolog), so C was just another language at the time. There was a clear continuity from the 50s to the early 90s and then when C became popular, that knowledge and technology was lost.
What's with the Unix Haters fags and calling people weenies? It sounds whiny and retarded.
Yeah seriously. I wasn't big on reading CS papers, I thought they were all irrelevant theoretical crap... But when I started browsing these older papers I got hooked. It's impressive how much our predecessors accomplished... Many of our databases are based on 70s technology and science. I used to think today's technology was state of the art, but when I look at some older IBMs I doubt that notion. Areas that today are pretty much dead, such as programming languages and operating systems research, used to be brimming with life. I started collecting these papers... And I'm always worried that I'm missing some hidden gem that was lost to time.
I wish there was an extensive, annotated bibliography.