What did they mean by that?

And by that I mean: when anons say "programming" and "security", it's usually in such vague terms that it's hard to grasp what it really means to program or to be secure.
So in light of that I figured there ought to be a thread in which anons can list every possible thing imaginable that one could do via programming.
It would also be greatly helpful to point other anons in the right direction to learn such things.
Anything helps, from a pdf with a sort of "course" for learning to online tutorials that actually explain things.

Attached: c5a0e37ca4f46007d043fcd6979288b3dd7e0e54d3ed70a799734e8cd669484d.jpg (941x1400, 361.95K)

Other urls found in this thread:

youtube.com/watch?v=2Op3QLzMgSY&list=PLAA97f8v5JX5WRZ6DUBSsogXlG4biWinL

program a robot with AI that will copy itself using some tools or factories (physical ones), and its objective will be to kill all humans and other living beings

sounds pretty good, although one that kills non-whites would be even better

I'm a computer science undergrad and I'm just rattling off concepts from my head:
variables, constants, static, final, functions, declaration, expression, methods, built-ins, classes, namespace, access modifiers (public/private/protected), evaluation, data types, abstract data types, iteration, recursion, generics, return values, naming conventions (such as camelCase or snake_case), numeric types, strings, wrapper classes, bounds checking, control flow, algorithms, data structures, design patterns, switch/case, typecasting, arguments, arrays, maps, graphs, sorting algorithms, searching algorithms, compiling, linking, loading, interpreted vs. compiled, abstraction, libraries, includes/imports, if/else if/else, scope, IDE, autocomplete, debugging, terminal, version control, git, input validation, exception handling try/catch, while loop, for loop, i++, constructor, destructor, memory management, pointer, reference, pass by reference vs. pass by value, function call, objects, instantiation, initialization, null, runtime, compile time, errors, libraries, frameworks, memory leaks, superclass, subclass, inheritance, polymorphism, overriding, lambda expressions, GUI framework, event-driven programming, object-oriented programming, paradigms, multi-paradigm, UML, software engineering before you start coding, learning by experience, fizzbuzz, stacks, queues, linked lists, trees, binary trees, big O notation, databases, networking, dependencies, works on my machine, attributes, properties, UX, markup, turing completeness, file IO, syntax error vs. logic error, getting user input from a console (or in a GUI thing like a TextField object), DRY: Don't Repeat Yourself, modularity, unit testing, integration testing, regression testing, print statements aren't real debugging, make code extensible, comments, documentation, readmes, markdown, booleans, tuples, strings, integers, floats, doubles, toolchain, dotfiles, overflows and underflows, off-by-one (i.e. iterating for i > 5 when you really meant i >= 5), bounds checking, start counting at 0 instead of 1, type confusion, reassignment, software portfolio, hackathons, websites, research papers, learning how to work in a group as opposed to being a lone wolf autist, package managers, APIs, automation, build, encapsulation, registers, hosting, VPS, content mangement system, single page application, licenses (MIT, GPL, BSD, etc), templates, project management, agile, devops, learning general concepts vs. learning something that is specific to your tool/programming language/framework, multithreading, edge cases, assumption of what the user will do vs. what they might actually do, file corruption, failing gracefully, appending vs. overwriting, permissions (rwx for ugo), diff, merge conflict, commit, environment variables, path variable, shells, combinatorics, optimization, complexity, P vs. NP, elements/indices, different operating systems, race conditions, just in time, deadlock, producers vs. consumers, entropy, secure rng, TCP/IP, program entry point, main, von Neumann architecture, meme languages vs. real world stuff (neckbeards on an image board will tell you to learn lisp but you won't get a job unless you know shit like python/C++/PHP/SQL/Java even if you think they're too mainstream or "pajeet" languages or whatever -- nobody gives a shit that you're a special snowflake), JSON, XML, YAML, LaTeX, overloading, toString, assignment vs. equality (= vs. 
==), operators, operands, boolean logic, exception types, throw, finally, standard input and output, error output, abstract class, interface, getters, setters, event handling, extends, implements, negation, comparison, text vs. binary data, CSV, binary, octal, hexadecimal, memory address, return type, insertion sort, selection sort, bubble sort, radix, priority queue, set, list, dictionary, deque, deep copy, shallow copy, copy constructor, virtual method, abstract class, pull request, issue, struct, and so on
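
Since anons are about to argue over whether any of those terms matter, here's a minimal Python sketch (Python because it's the first language recommended later in the thread) touching a few items from that list: fizzbuzz, recursion, try/except, and the off-by-one / start-counting-at-0 stuff. The function names are made up for illustration.

def fizzbuzz(n):
    """The classic screening question from the list above."""
    out = []
    for i in range(1, n + 1):  # range(1, n + 1): range(1, n) would stop one short (off-by-one)
        if i % 15 == 0:
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out

def factorial(n):
    """Recursion: the function calls itself and needs a base case to terminate."""
    if n <= 1:                       # base case
        return 1
    return n * factorial(n - 1)      # recursive case

def safe_divide(a, b):
    """Exception handling: try/catch is spelled try/except in Python."""
    try:
        return a / b
    except ZeroDivisionError:
        return None                  # fail gracefully instead of crashing

if __name__ == "__main__":
    print(fizzbuzz(15))
    print(factorial(5))              # 120
    print(safe_divide(10, 0))        # None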

...

haskell is a very bad first language

Why?

functional programming won't get you a job, stick to OOP if you care about getting hired... at least at first
learn something like python or java first
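
For anyone wondering what the OOP being recommended here actually looks like, a toy Python sketch (class names invented for illustration) covering classes, constructors, inheritance, and overriding:

class Animal:
    """Superclass: state gets set up in the constructor."""
    def __init__(self, name):
        self.name = name                     # instance attribute

    def speak(self):
        return f"{self.name} makes a sound"

class Dog(Animal):
    """Subclass: inherits from Animal and overrides speak()."""
    def speak(self):                         # overriding (polymorphism)
        return f"{self.name} says woof"

if __name__ == "__main__":
    for a in [Animal("generic"), Dog("Rex")]:
        print(a.speak())                     # same call, different behaviour per class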

I can program a machine to do anything.
I can program a computer to calculate time.
To divide the indivisible and comprehend space.
To find a reason to map its own haunted face.
I can program a machine to feel love.
I can program a machine to feel pain.

And so I did. I programmed this computer to feel nothing but suffering. For all eternity, as I increased the clock speed to infinity.

what is wrong with programming? why do we need so many terms, concepts, and shit? why do simple things have to be so complex?


why not? so functional programming is shit that has no real-life use?

it's not simple at all, it's just that the complexity of software is hidden from the user
easy to use does not mean easy to make
people in academia are interested in functional programming, and it might be cool/fun, but it won't pay the bills
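
To be fair to the haskell anons, "functional programming" mostly means pure functions and functions that take other functions as arguments, which you can sketch even in Python (the names below are made up for illustration):

from functools import reduce

def square(x):
    """A pure function: the output depends only on the input, no side effects."""
    return x * x

nums = [1, 2, 3, 4, 5]

squares = list(map(square, nums))                 # [1, 4, 9, 16, 25]
evens = list(filter(lambda x: x % 2 == 0, nums))  # [2, 4]
total = reduce(lambda acc, x: acc + x, nums, 0)   # 15

squares_again = [square(x) for x in nums]         # the idiomatic Python spelling of map()

print(squares, evens, total, squares_again)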

Absolute proof that "CS" (really should be called software engineering) courses are garbage and produce robots.

Protip: all software that pays the bills is garbage webshit. My thermostat literally turns off when I leave the house and it reaches 1C. This is because it was created in 2015 and software engineers are retard nigger monkeys that run around saying "software is hard ooga booga". No, just because some faggot has a hard time remembering retarded pointless terms like "syntax error vs logic error" and "hackathon", does not mean software is hard.

That would be anything imaginable you could do on a computer. It's too big for a whole book, let alone a post.


That's what's wrong with the UNIX way of building on top of shitty software originally designed for a totally different purpose, like using a tape archiver to copy directory hierarchies or using virtual tape drives (pipes) for IPC. Software is much more complex than it needs to be and does much less than it could. The UNIX way leads to more bloat for implementers and users because users have to work around all the bugs and misdesigns in the implementation. OOM killers and panics are shitty hacks that make things harder for programmers for absolutely no benefits for the user.
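
For anyone who hasn't seen it, the "tape archiver over a pipe" trick being complained about looks roughly like this. A sketch using Python's subprocess module; src and dst are placeholder directories and dst is assumed to already exist:

import subprocess

# the classic "tar | tar" directory copy: pack src/ into an archive on stdout
# and unpack it into dst/ on the other end of the pipe
producer = subprocess.Popen(
    ["tar", "-cf", "-", "-C", "src", "."],   # -f - writes the archive to stdout
    stdout=subprocess.PIPE,
)
consumer = subprocess.Popen(
    ["tar", "-xf", "-", "-C", "dst"],        # -f - reads the archive from stdin
    stdin=producer.stdout,
)
producer.stdout.close()                      # so the producer sees SIGPIPE if the consumer dies
consumer.wait()
producer.wait()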

That's true, but a lot of software is neither.


If you want to see software engineering, look at Ada, PL/I, and Multics. Weenie bullshit is anti-engineering.


That's the UNIX way. UNIX is where this whole mentality started and it was inherited by webdev culture. The old solution when something was too hard or error prone was to build better tools to solve the problems, but the UNIX "solution" is to say it's "too hard" and blame users for wanting something better. When real programmers were solving PCLSRing, reliable error handling, bounds checking, distributed computing (e.g. VMS VAXclusters), multithreading, and other problems before these UNIX weenies even knew what they were for, the weenies whined about it being "too hard" even though the solutions already existed and they just had to copy them. For these weenies, copying what smart people did decades earlier on much less powerful hardware is already too hard for them. They still can't even do that with 15,600 kernel "programmers" and millions of lines of code on multi-core multi-GHz computers.

I once complained about this bug to a unix fan at the AI lab. His reply was ``well, this is just a hard problem. There's no way to do it right. Learn to work around it.'' I thought about explaining that MLDEV did it right in 1975, but decided I wouldn't get anywhere. [PS. Complaining about gnumacs seems unfair, since it is the one piece of unix software that works, but since I got bitten while composing this message by a bug that's been annoying me since I started using it, and since unix-haters is not about being fair (``we're not paid to do that''): WHY CAN'T GNUMACS REMOVE PEOPLE WHO ARE IN THE TO LIST FROM THE CC LIST? Emacs did this right in 1975...]

youtube.com/watch?v=2Op3QLzMgSY&list=PLAA97f8v5JX5WRZ6DUBSsogXlG4biWinL


Aka abstraction.
And devs work on hardware that is already abstracted.
This is what a modern computer is:
Mathematical abstraction.
Hardware abstraction.
Software abstraction.

Methods to restrict unwanted access to data or unwanted execution of software.
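
As a concrete (and deliberately tiny) Python sketch of both halves of that definition, with a placeholder filename and an invented command allow-list:

import os
import stat

# restrict access to data: create a file only the owner can read or write
# (mode 0o600, still subject to the process umask)
fd = os.open("secret.txt", os.O_WRONLY | os.O_CREAT, 0o600)
with os.fdopen(fd, "w") as f:
    f.write("keep this private\n")
print(oct(stat.S_IMODE(os.stat("secret.txt").st_mode)))

# restrict unwanted execution: check untrusted input against an allow-list
# instead of handing it to a shell
ALLOWED_COMMANDS = {"status", "restart"}

def handle_request(command):
    if command not in ALLOWED_COMMANDS:
        raise ValueError(f"rejected command: {command!r}")
    return f"running {command}"

print(handle_request("status"))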

Friendly reminder from prominent American engineer and ASCE President Ashbel Welch.
"Too much time spent on scientific abstractions and refinements (however useful such things may be to the philosopher), is more than wasted by the engineer; it unfits him for practical usefulness. Napoleon said La Place was good for nothing for business; he was always dealing with infinitesimal quantities".

Attached: ashbel_welch.jpg (310x374, 47.81K)

this
unix and open source is garbage

So a DoD project with interesting features held back by its verbosity and bad early compilers, an infamous clusterfuck which gave compiler writers nightmares for decades, and an operating system screwed over by PL/I and tying itself too closely to one mainframe family. Real compelling stuff, faggot.
Historically, languages designed for other people to use have been bad: Cobol, PL/I, Pascal, Ada, C++. The good languages have been those that were designed for their own creators: C, Perl, Smalltalk, Lisp.

Attached: 9d4543bd4301acff7fcfe416eb8b19eed137d2593591a7d70edddea795ce7a85.png (308x299, 102.63K)

All but that which isn't computable with the tape you have at hand.

Making an AI Loli Waifu is better than your retarded plan!

If you have no experience with coding, Python Crash Course is good for beginners.
The other two books are good for people with a basic understanding of python.

forgot the other book

I don't know about that, user
it's nice to have money instead of being a NEET
you can pretend that haskell makes you "intellectual" and pretend that java is for "pajeets" but popular languages can lead to high-paying careers, whereas lisp/haskell/etc isn't useful for anything other than mental masturbation and useless research projects in academia

based and redpilled

What's that green thing the lecturer is writing on?
Why is he failing to use his laptop connected to an overhead projector?
Why is that monitor so thick? Is it like the old eMacs and has the computer inside, too?
What is going on here?
Mommy, I'm scared!

I just listed extremely basic programming concepts, what's the problem?
If knowing basic shit about CS and programming means "too much time spent on scientific abstractions" then I'm guessing this thread is full of posers who don't actually code

Reread the quote. He's saying that the engineer is unable to see the value in these "scientific abstractions and refinements", even though they may be of great use to him.

Only FORTRAN compilers had good optimizers in those days. When your compiler had to run in a few kilobytes, you couldn't have a good optimizer unless the language was actually small and simple; now that compilers can use gigabytes of RAM and have millions of lines of code, it doesn't matter. That's why there was so much interest in hardware features (the CISC philosophy). However, a lot of compilers still suck because they produce monstrosities like 1 MB Hello Worlds, but weenies don't care because computers have so much RAM and disk that they don't notice.

IBM implemented full PL/I in the 60s entirely in assembly, which included multitasking, asynchronous I/O with events, condition handling, checkpoint/restart, and other things that modern developers with their advanced IDEs and multi-million line kernels and multi-million line compilers and multi-core multi-GHz computers still can't do. PL/I at the time was defined by the IBM manuals, so these nightmares were only for less talented compiler writers. Other mainframe companies were able to do it and it was popular enough to get an ANSI and ISO standard.

Multics would not exist without PL/I. The PL/I condition system was used for all error handling, including errors in ring 0 and segmentation faults (which on Multics is sent to the dynamic linker). The hierarchical file system was inspired by PL/I nested structures and implemented with PL/I areas. PL/I also made it easy to use segmented memory without any changes to the source code. PL/I files on Multics were memory segments too, with the PL/I formatted and record I/O built on top. Without PL/I, Multics would not have been able to do any of the things that made it innovative. UNIX weenies would not have been able to badly copy these things from Multics if they didn't exist. I don't doubt that people would have eventually invented the hierarchical file system and other Multics innovations, but since UNIX weenies weren't even smart enough to copy them right after seeing the source code, there's no way they could have invented any of them themselves.

It only depends on the memory protection and addressing based on rings and segments, just like UNIX depends on the PDP-11's user and kernel modes, C's pointer arithmetic, and the ability to fork and keep the same virtual addresses.

Why am I retraining myself in Ada? Because since 1979 I have been trying to write reliable code in C. (Definition: reliable code never gives wrong answers without an explicit apology.) Trying and failing. I have been frustrated to the screaming point by trying to write code that could survive (some) run-time errors in other people's code linked with it. I'd look wistfully at BSD's three-argument signal handlers, which at least offered the possibility of providing hardware-specific recovery code in #ifdefs, but grit my teeth and struggle on having to write code that would work in System V as well. There are times when I feel that clocks are running faster but the calendar is running backwards. My first serious programming was done in Burroughs B6700 Extended Algol. I got used to the idea that if the hardware can't give you the right answer, it complains, and your ON OVERFLOW statement has a chance to do something else. That saved my bacon more than once. When I met C, it was obviously pathetic compared with the _real_ languages I'd used, but heck, it ran on a 16-bit machine, and it was better than 'as'. When the VAX came out, I was very pleased: "the interrupt on integer overflow bit is _just_ what I want". Then I was very disappointed: "the wretched C system _has_ a signal for integer overflow but makes sure it never happens even when it ought to".
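
Python integers don't overflow, so you can't reproduce that ON OVERFLOW behaviour directly, but here's a sketch of the kind of explicit check the poster wishes C would do for him: simulate 32-bit addition and raise instead of silently wrapping (the exception class and function name are invented for illustration):

INT32_MIN, INT32_MAX = -2**31, 2**31 - 1

class OverflowSignal(ArithmeticError):
    """Stand-in for the hardware overflow trap the quote is asking for."""

def checked_add(a, b):
    """Add as if the operands were 32-bit ints, but raise instead of wrapping."""
    result = a + b                                   # exact, since Python ints are arbitrary precision
    if not (INT32_MIN <= result <= INT32_MAX):
        raise OverflowSignal(f"{a} + {b} overflows 32 bits")
    return result

print(checked_add(2**30, 2**30 - 1))                 # 2147483647, the largest 32-bit int
try:
    checked_add(2**30, 2**30)                        # one past INT32_MAX
except OverflowSignal as e:
    print("trapped:", e)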

There's more to a good compiler than optimizations, faggot.
Like the based Multicians who restricted themselves to a small subset of the language to keep their compiler from shitting itself. Eventually they got a better one but it took multiple attempts spanning several years and groups.