Logistics of information

Just watched a talk about the degradation of technology over time (youtu.be/pW-SOdj4Kkk), and it made me want to start taking what I do more seriously. But when I think about it, I really have no idea where to even begin looking for information or even what to look for.

What can we do to improve the spread of information and ability to learn/teach Zig Forums properly? For example deep systems-level understanding of computer software and hardware, not python scriptkiddy-tier shit. How to compile a program, how to draw shit onto the monitor, how operating systems work and what it would take to make one, how to do networking, how to make your program enjoyable to use, how to make a decent GUI, how do you "hack" or mod other programs, how would you replace a given part of your Linux distro with another or even your own, how to install Zig Forums onto your computer so you can modify the source code, how to contribute code to other people's projects...
I run into this problem very often where I want to learn something or teach other people about something but it's so hard to find information that's comprehensive, easy to understand, not sunken into a sea of useless "information" (like what books tend to have), and is shared by someone who's clearly competent and personally understands the topic well and doesn't shove retarded opinions that are clearly questionable down your throat.

Try to search something in a search engine, and you're more likely to get some javascript retard from stackoverflow telling you not to do what you wanted to know about and instead do a different thing. Try to learn some fundamental programming topic and you'll end up in places where you're told to install visual studio right before being asked to import a bunch of libraries into some hideously convoluted OOP machine. Search for games on Steam and you'll get all kinds of bullshit that has nothing to do with your search query; if you search for RPGs right now you'll literally get a bunch of porn and visual novels. There's something fundamentally wrong with the way the logistics of obtaining and sharing information are currently set up.

Now I don't play enough games to care about the videogames thing that much, but how do we improve this problem for the actually important things like learning technology? You can shit on retarded web developers all day long, but part of me thinks you can't really blame them, because it's so difficult to find actually good material to learn to do things properly and not fall for hackernews-tier """best practices""" and other bullshit that people genuinely believe to be great even though the methods people used 20 years ago are 20 times simpler and result in 20 times better performing software. Nobody actually understands anything or how to use it correctly; instead they regurgitate some offhand shit they learned about installing some abominable node.js configuration that does 99% of the actual work for you, which stops you from actually understanding anything about what's actually happening or how to do it without that specific configuration.

Should we try to catalog information and tutorials? Make some kind of rating system where you rate the usefulness and type of information in a given website/link? Collect information from many sources and combine them to create the "ultimate guide" for given topics? And how do we create an environment where it's okay to ask basic questions and directions without being told to kill yourself by a bunch of elitist larpers or told to use some epic Python library?

And I know there's someone who wants to shut this thread down right now by linking 70 books and saying the problem is "already solved", and if you're that guy then please Ctrl+W the fuck out. From my experience books are always bloated with a bunch of useless shit and are way more verbose than they need to be just to pad the length; you often need to read MULTIPLE of them to get the full picture; they often fall for the same kind of shitty best practices and visual studio and third-party-library-importing traps; it's much harder to read a lot of sample code from a book and get practical examples; it's less convenient to search; revising and fixing information in a book is very difficult; and sharing 20 megabyte pdfs is very inconvenient compared to being able to link and browse various topics from something more digital, not to mention many of them aren't legal to share.

TL;DR: how do we make it easier to learn and teach to do and understand tech correctly, instead of just telling people they're retarded for not inherently having that knowledge?

Attached: 145246634.jpg (1025x709, 141.64K)

People are too dumb to understand anything

People are brainwashed and controlled by jews, jews print money so they have unlimited funds to spend on marketing, brainwashing, misinformation

Unironically all true.

But to give a serious answer to a not-serious post: just because the unwashed masses are retarded doesn't mean we shouldn't seek to improve the situation for the few of us who still have some brain activity left and do care about doing things the right way. Nobody knows everything, but everyone knows a little bit of something, or at least a good resource for knowing something. If we could somehow make it easier to share and find and create valuable information from all the noise on the internet, we could make OUR lives much better and over time grow more people to "our side". And we could make other people's lives easier too, since it would be easier to learn to make the things we want to make and fix the problems we want to fix and do it competently, and possibly, while we're at it, take some of the ground back from the jews and other starbucks slurpers who keep shoving bloated botnet shit into our computers.

You can't spread information about things that are purposefully kept closed and obscured (modern CPU architecture, for example). It's not complete coverage, but this is one of the angles free/open source software should push - not just tons of released code with practically no comments and a sparse readme, but stable releases with visual representations of code parts and explanations about why and how algorithms were implemented. Not to mention commented code, obviously. This way, people who are interested in software can start casually reading code that is actually readable and accessible - as opposed to a golfed repo by some turboautist that requires an even higher power level to read and understand. And then, armed with this practical knowledge, you can pick up theoretical books much more easily.
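To make the "commented code" point concrete, here's a minimal sketch of what a why-comment looks like, as opposed to a comment that just restates the code (the function and the overflow rationale are only an illustration, not from any particular project):

[code]
#include <stdint.h>

/* Average two unsigned 32-bit values.
 * Why not (a + b) / 2: the sum can overflow and silently wrap, producing a
 * garbage "average". Halving each operand first keeps every intermediate
 * value in range; the last term restores the bit lost when both are odd. */
static uint32_t average_u32(uint32_t a, uint32_t b)
{
    return (a / 2) + (b / 2) + (a & b & 1u);
}
[/code]

A comment like "return the average of a and b" would add nothing; the point is recording the reasoning that isn't visible in the code itself.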

As for hardware, I don't know. There are FPGAs for amateur development, but for actual competitive manufacturing (even in the smallest sense of the word), it's just not financially feasible. People used to be able to do everyday tasks on machines that are hundreds of times less powerful than what we have now; all we can really do is preserve those architectures and have minimal, clean code to accompany them.

Attached: sheeple.png (376x401, 45.84K)

Attached: ClipboardImage.png (566x577, 99.97K)

People fixed it for him

Attached: Free Speech Extended.png (757x3030, 240.39K)

problems that come with being smart

The only real thing I can find in world order that seems to be the case, without pointing fingers, is that the propaganda machine is doing its job in guiding life. And I do think it's to guide the world into a heterogeneous race. Instincts of the Herd in Peace and War by Wilfred Trotter confirmed my suspicion on that. His essay is on the first world war and how to avoid it. He talks about intrinsic motivation vs "strong leaders", and how the former always beats the latter, because when the strong leader loses a battle, he loses credibility and thus fighting power, while the intrinsically motivated gain strength when "their people" suffer. This is an argument for world order & race mixing, since it gives less incentive to warlike efforts as cultural differences fade.

His book concludes with a couple of points on how to avoid another war in the future. I can't remember what they were unfortunately, but it's a good read if you're interested in that sort of thing.

If this was made for normies I can understand that being left in there, but for anyone with half a brain it takes away from the rest of the graphic.

As you say, the issue with trying to learn tech properly is that we are drowned out by web developer Silicon Valley types who like the current paradigm of layering garbage on top of garbage. These people spawn themselves exponentially quicker than those seeking to fix the foundations, due to the lowering of the barriers of entry to computers. My understanding is that in the old days of the internet, the people who used it tended to be much more technically sophisticated, and it had the right environment for the type of information transfer we're looking for. The people who entered did so because they were curious and liked to tinker with things, and those are really the only types of people we can hope to teach. The key to recreating such an environment is to create space(s) on the internet with a high barrier of entry of some sort, which may or may not be technical. Of course, you have to strike a balance wherein people who lack knowledge but have the right mindset we're looking for have a path to enter. This is really the age-old problem of creating a community that's big enough while still remaining high quality.

Maybe something like gopherspace is what we need. That inherently scares away web developer types. But we need some sort of pipeline, a trail of breadcrumbs, that leads the right people there.


The free speech angle of net neutrality is ISPs throttling or blocking websites they don't like. For instance, ISPs being pressured by outrage mobs into blocking "hateful sites" like Zig Forums. Net neutrality doesn't mean ISPs can't throttle you for using too much bandwidth (though there are probably lots of normies who think that), it just means any throttling must not be based on content.

If you want to spread comprehensive knowledge on computing, it may be best to implement some old architecture on an FPGA, writing about how you're doing it and why and all that, basically a tutorial. Then proceed to write basic software for it, like a BASIC or FORTH implementation to run stuff on, connect it to an HDD, etc., continuing to write the tutorial, then a TCP/IP socket to connect to BBS/gopher, etc., with tutorial of course.
Basically, implement an autistic shitposting machine from the ground up and write a guide on how to do it.
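As a hedged sketch of what the "write a BASIC or FORTH for it" step could start from (in C rather than on any particular FPGA, with everything here invented purely for illustration), a stack interpreter small enough to fit on one tutorial page:

[code]
/* A deliberately tiny FORTH-flavoured evaluator: numbers are pushed onto a
 * stack, "+", "-" and "*" pop two values and push the result, "." pops and
 * prints. Illustrative only. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <ctype.h>

#define STACK_MAX 64

static long stack[STACK_MAX];
static int sp;                                /* number of values on the stack */

static void push(long v) { if (sp < STACK_MAX) stack[sp++] = v; }
static long pop(void)    { return sp > 0 ? stack[--sp] : 0; }

static void eval(const char *src)
{
    char buf[256];
    strncpy(buf, src, sizeof buf - 1);
    buf[sizeof buf - 1] = '\0';

    for (char *tok = strtok(buf, " "); tok; tok = strtok(NULL, " ")) {
        if (isdigit((unsigned char)tok[0]))
            push(strtol(tok, NULL, 10));      /* literal: push it */
        else if (strcmp(tok, "+") == 0) { long b = pop(), a = pop(); push(a + b); }
        else if (strcmp(tok, "-") == 0) { long b = pop(), a = pop(); push(a - b); }
        else if (strcmp(tok, "*") == 0) { long b = pop(), a = pop(); push(a * b); }
        else if (strcmp(tok, ".") == 0) printf("%ld\n", pop());
    }
}

int main(void)
{
    eval("2 3 + 4 * .");                      /* prints 20 */
    return 0;
}
[/code]

Each chapter of such a tutorial could then grow this by one concept: named words, a dictionary, then the disk and network layers mentioned above.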

Interesting thread. I'll reply in more detail later, but know that it isn't being ignored.

ik

KYS.

If people know they are retarded, they have 3 choices:
- Ignore you because you are mean, and thus ignorant
- Do nothing, but keep it in mind (maybe they have other priorities)
- Git gud, searching for knowledge and answers

Just insult them and point them to the good stuff, after that it's up to them to check where they stand and if they have more questions.

I was going to suggest chiseling SICP onto granite so that its truth will endure for ten-thousand years, but this guy has better advice.

Wirth's Project Oberon was basically that, though I don't remember if it had networking.

...

The architecture of current hardware is part of the problem where nobody understands the whole thing. Take a look at the Intel manual sometime. ALL of it is relevant and I will bet you any sum that there is not a single person on earth that knows everything in there, and in particular where the hardware deviates from the manual, which it does.
Because that's a complete waste of time since none of it will carry over to anything else. Webkit and Gecko are supreme spaghetti code; you are basically doing a study of the characteristics of select turds.

So what do you suggest then? Nobody is going to build a useful alternative, and realistically speaking, it's not clear whether it's even possible to render modern web pages without reaching that level of complexity. Better to use something while only understanding some aspects of it than to use something and understand nothing about it.
And don't tell me you use netsurf exclusively or something like that because I won't believe you.
Same thing about the CPUs. It's not even clear whether it's possible to have performant hardware without making it impossible for a single person to understand the whole chip.
As the botnet progresses it'll become basically impossible to live without interacting with it in various ways, if anything just because of government records and surveillance, be it through CCTV, taxes and legal documentation, drones, TEMPEST, passive radar, audio sniffing and beaconing, stylometric analysis, power analysis, etc. It's better to leverage the capabilities of modern software and hardware by managing complexity and risk rather than ignore what's going on around you and just be happy that you understand how the ZX spectrum in your drawer works at the gate level.

I believe the problem also lies in the difficulty of finding a sequential path in order to learn some topic. Currently it is easy to find nodes on the web about specific topics, but it is hard to find a path of resources to learn about CPU hardware without getting lost.

Spent all today and all of my energy learning assembly on a Windows machine. Got stuck though and not sure how to proceed... within the first 10 minutes when it was time to link the program.

I've exhausted every single link from several pages of search results from multiple different searches; no matter what example I use it will just refuse to link, usually because of some architecture incompatibility or failing to find the libraries/procedures or whatever to print the fucking hello world. In other words, to rephrase: I spent all of today and all of my energy trying to figure out how to link an assembly program, without finding out how by the end of the day.

Attached: 18ps27.jpg (1024x860, 102.31K)

Pretty much the same thing suggested, though it doesn't have to be on an FPGA or a reimplementation of the ZX Spectrum. I've linked Project Oberon earlier, check it out.
Why would you want to preserve "modern" website abominations when your goal is to kill complexity? You can't get rid of complexity if you are not ready to cut it out. In fact this is one of the big reasons it flourishes so much: People can't say no to dumb features that fester and multiply.
You do seem to suffer from the (common!) misunderstanding that Javascript is this absolutely fundamentally necessary thing holding the web together. It isn't, but it's by far the largest source of complexity in a browser. Install something like uMatrix and try it on the appropriate settings for a while, you'll be surprised. Right now I have JS enabled on seven sites, of which four could be trivially cut and two are work-related.

I use Lynx for most things, which admittedly makes a very brutal cut, removing even CSS. Unfortunately I can't use it for all sites because of my bank (that one might be fixable though) and a library site. Obviously it is not necessary to go to such extreme lengths, but even that is usable for many sites. Whether you believe this is your decision, though if you can't even believe that someone would be using netsurf (which while buggy is still graphical), I doubt you'll believe this story.

In a very similar way this heavily depends on what "performant" means. Can it run whatever AAA game came out yesterday? No, not even in a decade. Can it run some Electron abomination? No, but this is not the goal anyway. You don't exterminate vermin by very nicely asking it to kill itself. Such a simple CPU can be more than enough for serious work, which you can see with the RISC-V CPUs. Now these are definitely not the simplest things possible and I personally dislike them, but they are far simpler than current x86-64 processors yet still usable.
Consider: You aren't doing many more amazing things requiring shitloads of processing power than you did ten or twenty years ago; in a loose sense this observation started the thread.

There is a difference between interacting with the botnet and letting it onto your box. Botnet requirements from the government side can obviously only be opposed politically, so you'll have to do that.

You aren't managing anything. You may have fooled yourself into believing you are (if there is one thing the current cancer is good at, it's making you believe that the spinning wheels are doing something), but consider that when you tried to convince me, you only gave vague buzzwords instead of an example. Why can't you name one?

The Mill guys are trying to build an alternative.

Have you tried FASM?

>>1070154
Tor could be such a barrier

>>1070234
why do that when you can buy a non-ME CPU which is still fast as hell?

This. I disable JS on nearly all websites; most, except work-related crap, work. Bloat is bloat because it is not needed.
RMS has the right idea of cash-only and minimal information "sharing". Those who depend on the botnet are part of it. Everyone hosted their own mail server in the past, face-to-face banking exists, snail mail still works. Freedom is not free; those who sell their own freedom and soul at the slightest glimpse of convenience are why freedom is expensive now. You can't expect god to do everything.

If you're interested then figure it out on your own; you literally have the source in front of you.

The code doesn't contain the "why".

What is easier, in your opinion, when you're taking a look at a program for the first time, especially if it's split over multiple files in multiple directories? To get a basic gist, I mean. Poring over the API? Looking through files one by one? Or looking at a rough visual representation of all the moving parts by the person who coded them? Hell, it may not even be in a language that you know. Then this will help you port the project faster, as you will have a head start already.

Agreed. Also recommending this book for learning how to code.

Attached: copying-and-pasting-from-stack-overflow.jpg (600x788, 71.67K)

An architectural diagram is invaluable when beginning to work on a new project. Reverse engineering relies heavily on looking at flow charts.
It's not just about figuring out how a program works. If you have the binary you can figure it out. The question is how fast you can do it.
Source code availability and documentation (yes, including graphics) helps with that.

RMS has people to do everything for him, that's why he doesn't need "convenience". He doesn't need to know how to install his operating system. He has his income sorted out for the rest of his life with no chance of ever being fired. He can spend the rest of his life answering a couple of emails a day through carrier pigeons if that's what he wishes, and still have any food, clothing item, tool, or hardware he desires delivered to a cabin in the middle of the forest, while paying in cash. Life isn't so easy for the rest of us.

Because they either cost like 10 times the money for the same performance (Talos II) or are way slower (ARM boards), there's still no independent audit of the silicon for any of them, and you can't run x86 binaries.
At least ME is somewhat researched and documented, those other CPUs, on the other hand, are pretty much unknowns.

Sounds like a LARP to me.

I decided to fall for this meme, and from what I can tell from reading a bit and skimming around, it doesn't actually seem to teach anything useful. The title "structure and interpretation of computer programs" sounds very interesting, but it seems to me that a more appropriate title would be "programming in Lisp". And in typical book fashion, there are many pages worth of literally nothing at the beginning; you can unironically skip about 30 pages from the start and not miss a single thing. Then it proceeds to explain very simple ideas in the most convoluted, pretentious and lengthy way you could imagine for several pages, until around page 34 it begins to talk about things that actually matter.

What follows, I do not follow. I don't understand what it's trying to teach you; it starts by supposedly feeding mathematical calculations into a mystery interpreter which supposedly spits out the result. What do I learn from this? The only thing I learn is the syntax for adding numbers in the Lisp language. It repeats this for about another 5 pages, after which it introduces the syntax for variable names. After I went "what the fuck" long enough I decided to take a closer look at the table of contents and jump around a bit, and it literally looks like a book for learning Lisp, not a book about the "structure and interpretation of computer programs". I already know how to program in C, I know what data is and roughly how it behaves. It was not hard to learn, and it sure as hell was not as hard as this book is making it.

At this point I cannot imagine that book to be recommended unironically by anyone who actually read it.

It's an introductory programming and CS book, aimed at people who have never programmed before. Of course you are not going to find the first few pages very interesting if you already know basic programming. You sound like OP, so is your complaint seriously that any given book does not magically intuit how much prior knowledge you have and skip the random parts you know?
A lot of programming books are complete garbage, so I thought that was what you were talking about in the OP, but it seems like you are just a nigger who can't handle books. SICP is not about Scheme, and this should be obvious if you read it. Have you ever read a single book in full?

I've always heard it's a very advanced book in some way, but maybe people are confusing "advanced" with "obtuse".

My main problem with books is how much useless shit and badly worded explanations they have. You can almost always find online tutorials that are significantly more to the point and easy to understand and give better examples and visual representations for what they're explaining.

I had the same problem with K&R; I did not learn anything C-related from it. After reading some Star Wars references and stories from the author's past, I decided it was much easier to learn C by looking up some tutorials and then experimenting and looking things up where I lacked. Maybe it's a personal problem, but my brain seems to shut down whenever it detects I'm about to waste a whole lot of time consuming a load of shit instead of getting the useful information directly.

I don't mind if the book starts from the beginning if it's clear about the whats and the whys and doesn't waste time with them; I might even be interested in reading that just in case I've missed some tidbit of knowledge. But that's not the case when you read tens of pages of Lisp instructions for some vague reasons and undefined purposes, and then keep reading more and more about Lisp without ever getting any practical purpose for any of it. I'm sure it comes at some point, but that's the wrong angle to teach things from.

Maybe it gets better at some point, but from skipping around that didn't seem to be the case. Maybe I was unlucky and just hit all the wrong places by chance but my faith in it already died.

SICP is pretty advanced for an introductory book from the mid 80s. How many introductions to programming include functional programming, implementation strategies for generic operations and their tradeoffs, message passing, stateful computation vs. immutability, or the construction of a compiler? Chapter 4 is arguably a bit Scheme-specific, but only because thanks to the flexibility of Scheme, you need not pass to a different language that supports the discussed esoteric features.

That does seem like a personal problem, especially considering what you classify as a "waste of time". The "vague and undefined purposes" are to establish the basis on which the entire book proceeds. In order to discuss programming with a programming language, you need, well, a programming language. Scheme is a good choice there because it's very simple and uniform; quoting is just about the only idiosyncrasy you encounter in SICP, but compare that with books using other languages.
I haven't read K&R so I can't speak for it, but C online tutorials are notorious for containing incorrect information. How many C tutorials claim that signed overflow is well-defined? That's certainly easier to visualize, but it's also wrong.

Like I hinted at earlier, I rather suspect that your definition of "wrong place" is broken.
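For a reader coming from C, one rough (and heavily reduced) analogue of what's meant above by "implementation strategies for generic operations" is dispatching through function pointers stored with the data; a sketch, with every name invented for illustration:

[code]
#include <stdio.h>

struct shape;
typedef double (*area_fn)(const struct shape *);

struct shape {
    area_fn area;     /* each value carries its own implementation of "area" */
    double a, b;      /* radius for circles, side lengths for rectangles     */
};

static double circle_area(const struct shape *s)    { return 3.14159265 * s->a * s->a; }
static double rectangle_area(const struct shape *s) { return s->a * s->b; }

int main(void)
{
    struct shape shapes[] = {
        { circle_area,    2.0, 0.0 },
        { rectangle_area, 3.0, 4.0 },
    };
    for (int i = 0; i < 2; i++)               /* same call site, different code runs */
        printf("%f\n", shapes[i].area(&shapes[i]));
    return 0;
}
[/code]

The tradeoff between attaching the operation to the data like this and keeping a central dispatch table is the kind of thing the book walks through.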

Yes, but do you really need 5 pages to tell people how to add and subtract numbers? It doesn't even say a single thing about the data that represents those numbers; it should have taken a couple of sentences just for the purpose of teaching how to read it. And all of that is being taught in a hypothetical environment where you get the result as text after typing it in a console, not in any kind of practical context where you can/could put the knowledge to use.

To be fair it had to include a section about "pretty printing" the code because the syntax is so retarded that it's hard to read middle school math normally. Maybe there's a reason for that syntax but it didn't tell me let alone mention it would be explained later, so if there is one, then it too is vague and undefined. And yes I know it says "you can add 3 numbers without typing the + twice" but that's not a good reason to deviate from standard syntax for mathematics especially when it becomes harder to read as a result.

To make it worse, every single sentence is worded in a way that makes it difficult to speed through unless you have lots of experience reading similar text. This isn't difficult stuff, yet I can't help but marvel at how complicated the book makes it sound; there's no reason not to use simpler words to describe it. It's hard to describe what I mean by this, but it's as if a scientist is teaching other scientists the wonders of adding 1 and 2 together, in a non-mathematical snowflake syntax that one of them invented for some reason that they didn't explain.

True, but on the other hand I've seen books that tell you to install and do some esoteric build configuration in visual studio to program in Windows. What reason do I have to believe that a given book has 100% accurate information and no subjective opinions? They wouldn't need revisions if they did; SICP has a second edition as well.

It's not five pages, it's five lines of example code. The rest treats how evaluation works, with arithmetic as an example. This is different from teaching arithmetic, jesus.
What kind of practical context would allow you to "use" these complete basics? At this point you are basically learning the alphabet, did you ask the teacher what you can use a G for?
It mentions extensibility and uniqueness, though admittedly this downplays the most important advantage: Everything has the same uniform syntax. But this is a really minor point to get hung up on.
It's a university textbook. Is English not your first language?
Am I being trolled? There's a massive difference between a minor slipup that gets fixed in a revision and a text teaching complete bullshit, but revisions typically don't exist purely to correct errors like that anyway: The second edition of SICP primarily added material, changed some explanations and made the code compliant with the then recent standard. You can find all of this info in the preface. Otherwise, you evaluate potential literature like you would anything else, by either trying it out or looking for recommendations. SICP happens to have a lot of the latter.

How about a sample program? That's how I learned C. Give a simple program that does a simple calculation, and then start explaining the components and what they mean and what they're made of. It's a real concrete thing that you can see, and then you're being taught to understand it. I'm not reading a math book for my grade school math class, I'm reading a book about the structure of computer programs. But that's not any kind of structure; there's no data, there's no program, there's nothing but this mathematical calculation.

The whole starting point is wrong as far as I'm concerned. If I were teaching this I would start by describing what data is, that a number on the computer/program is X amount of data in the memory. Then use variables to hold and interact with that data. That's an actual thing in the computer, and you can use it as a basis for understanding what's actually happening when you do something. The language used in the book doesn't even seem to have data types so I'm even further removed from understanding the context of where/how it runs and what those calculations are as a program. The book seemed to just keep going and adding more information with that line of thought without putting it into a practical context, in my head it's just information that kinda floats up there somewhere but I don't really know what it is.

Is that really an advantage though? I'm not saying there isn't possibly a reason for it to be, but you don't make things clearer by making different things look the same. Is there something the syntax allows that would be impossible with for example C-like syntax (not C syntax, but something that's structured similarly)?

It's not, but that's not my point. My point is that there's no reason nor excuse to explain something very simple in a very complicated way.
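To make the request concrete, here's a sketch of the kind of teaching example being asked for: one complete C program doing a single calculation, where every line names something concrete that can then be explained in turn (the program itself is made up for illustration):

[code]
#include <stdio.h>

int main(void)
{
    int price = 3;                  /* an int: a fixed-size chunk of memory  */
    int count = 2;
    int total = price * count;      /* the CPU multiplies the two values     */

    printf("total = %d\n", total);  /* write the result to the terminal      */
    return 0;                       /* exit status handed back to the OS     */
}
[/code]

From here the explanation can go in either direction: down into what an int looks like in memory, or up into how printf and the OS get involved.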

Zig Forums is a moribund community where religion overtook utility and solving actual problems. It reminds me of code golf on stackexchange. Golfing incentivizes writing trivial programs in trivial and convoluted ways. Zig Forums incentivizes writing trivial programs that are as "pure" as possible.

You can write a program that cures cancer and builds colonies on Mars and Zig Forums will just bitch that you used a proprietary library at some point. They will not suggest a good free alternative, and often they will bitch about it even though there is no alternative (they'll just say "drop what you're doing and write your own"). Basically there is no ability to accept a flawed prototype, focus on solving the problem and improve it in steps. Instead if a project is not perfect from day 1, fuck it! This kind of perfectionism is something all successful developers must overcome, but Zig Forums gives in to it entirely and productivity here is crippled as a result.

Another related problem is out of control ambitions. Even in the replies here you see it: people are asking for 5 minute tutorials that will somehow teach them all of Gecko's internals. Nobody wants to spend a month learning how one sound driver works and trying to do something useful with it. Everyone wants to learn how to make their own OS, how to make their own browser (with a superset of Firefox's features, lol), how to make their own internet. The best way to learn is to do bite-size projects, but the prevailing attitude is to always bite off more than you can chew. Nobody gets anything done as a result.

To be clear: there are competent programmers here. Many of them are getting shit done. But they don't talk about it on Zig Forums because Zig Forums is incapable of providing useful feedback. It is only capable of telling them to tear down everything and posture about being the most based/free/oss/whatever. So people solve problems orthogonally to Zig Forums, and all the beautiful ideals of Zig Forums are moribund because it is impossible to collaborate with anyone on a Zig Forums project. Granted, I've since made my peace with it - I treat Zig Forums as a sports bar to drop by and shoot the shit with a bunch of other badgoys. I don't expect to get quality training advice from a bunch of fat drunks, nor do I seek it. It's not a place to learn, just a place to unwind and vent.

Where do you go for serious discussion™ then?

Attached: ae1a5cc8c114b88024e711c2046c037af0426a8d3c7213defe7db8b196057494.jpg (540x720, 53.59K)

These are example programs. (* 2 (+ 2 3)) is a Lisp program that calculates the product of 2 with the sum of 2 and 3. The square procedure defined a few pages later is a program that calculates the square of its input. This increasingly reads like you are stuck on some restricted fixed idea of what Programming™ is (probably based on C) and consider everything dissimilar "not programming". In that case, you should definitely read SICP. Let go of whatever preconceptions you have for a bit and read it, it'll be worth it.

That's just being part of the problem, no? Honestly I feel like the largest problem of Zig Forums is something that plagues almost everything in software, but Zig Forums doubly so since it's anonymous: You have a lot of people who aren't actually all that knowledgeable, but are 100% convinced they are geniuses and act like arrogant shits with nothing to back it up. You have entire threads that consist of nothing but people spouting hot opinions on things they (sometimes even admittedly!) have no idea about like it was God's wisdom.

You can't program without data, nigger. "3" isn't a thing as far as the computer is concerned; binary 00000011 however is, and it's a byte of data in memory representing the number 3. So when you say "(+ 2 3) makes 5", you're teaching the syntax of Scheme or Lisp or whatever to do math, not explaining the structure of a program nor that calculation nor its components. Like I said, it's the wrong angle to start from because I have no context for what's actually happening in the computer.
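A minimal C sketch of the "3 is just a byte pattern in memory" point (the exact output assumes a little-endian machine with 4-byte ints, which is an assumption, not a guarantee):

[code]
#include <stdio.h>

int main(void)
{
    int n = 3;
    const unsigned char *p = (const unsigned char *)&n;  /* view the same memory as raw bytes */

    for (size_t i = 0; i < sizeof n; i++)
        printf("%02x ", p[i]);                           /* typically prints: 03 00 00 00 */
    printf("\n");
    return 0;
}
[/code]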

My suspicion was right. Read SICP.

No.

Essentially what we're dealing with is a lack of community for higher quality people. This begs the question: Why do we not have such places? One possibility is that people have a tendency of being lazy or thinking that nothing can be done. This mentality can be seen very well in this thread and on this board in general. Another possibility is that there simply aren't enough higher quality people for such communities to emerge. Since our genetic intelligence has been dropping for centuries, I think it is reasonable to assume that at least part of the problem lies there.

However, it is not only high IQ people who suffer from this. Various internet counter-culture movements are getting deplatformed from mainstream websites, making it harder for them to reach larger audiences. The most prominent example would be the alt-right. Yet, even under such pressure, these notable alt-right people never seem to actually build their own communities. They always keep talking about "redpilling the normies", as if Twitter and Youtube were the only websites that matter. I believe that these movements will not succeed until they are willing to let go of their dependency on popular social media.

But why won't they let go? It is easy to rationalize that being in the public limelight is important for a growing movement, and I'm sure there is some truth to that, but I think the real driving factor for most of these people is the fact that they want to be part of the tribe. They are driven by social status and belonging, like most people are. It does not seem to be enough for them to build their own little communities, but instead they really, really want to be part of the big community. This may be why these figures barely even use Bitchute, even though they could clone their channel onto Bitchute with a click of a button.

Could it then be, that higher quality people are driven by the same instinct as these counter-culture movements? It would certainly explain why there is so much complaining on Zig Forums, but not much action. People want to be part of the big tribe, but they will never be accepted, which leads them to become frustrated and lose hope. If that is the case, how many of these people would have internal strength to overcome themselves, and start being more productive to build little übermensch communities for themselves?

If you can't wrap your head around absolute basics like what an abstract data type is, you have ridiculous knowledge gaps and need to read a fucking book.

True. We wouldn't have any of the stuff we have today if people in the past had thought it was impossible or too hard, or that things could already be done with the current setup so there was no need to make something better. Yet that always seems to be the case when you propose something big: everyone besides ideaguys and logo designers is unmotivated to think or talk seriously about doing it.

There's definitely not enough people. I think a lot of people who have a problem with tech and the will to improve it also have the problem that there's simply no time to make everything they'd like to make.

I think it would be fun to make some videogames. More than that, I want to start working on a replacement for some Adobe software which I need for work and personal stuff. But on the other hand I think it would be important to make a more free and controllable browser engine before we lose all the software freedom we have when using the internet. Prototyping a potential replacement for the entire web system for an alt-web sounds interesting as well. All of the above run on an operating system, every one of which is a different kind of shit, so I'd like to work towards either making them better or learning to Terry a better proposal. Double that because the OSes on mobile devices are separate. Then I want to discuss it and remember that all imageboard software is trash and there basically isn't a very good one, period. Thinking of websites, there are many that I would want to make, like a game/software store that doesn't ban wrongthink, or an esoteric webcomic site that supports animated/video pages. To manage businesses' websites I'd want something like wordpress except not shit. There just isn't enough of me to start working towards all of those, some of them so big that it's basically impossible to do entirely yourself, let alone start splitting your attention to other projects.

Another one of the things I'd like to make is a website where it's easier to manage and discuss and find projects somehow, so you don't need to skim low quality social media discussions or oceans of dead github repos for them and then subsequently be drowned in the noise. In the case of imageboards the entire "community" is set up to think that every board should have shitposting and porn and 0 rules, or else you're an SJW reddit soy cuck. And the software never changes, so the capability to manage projects in imageboards never improves.

I think the internet did something very bad to people. The ones who grew up with it are somehow broken and don't know how to function separate from it, because of how much social activity and instant gratification they're used to getting from it so easily. Going to a smaller community makes that flood of activity stop in its tracks, so what you're used to doesn't work anymore. Getting ostracized on the internet is like being ostracized from society itself. It's probably why politics are so polarized too; you can't propose a third option or you'll be hated by everyone.

It actually kinda relates to this thread's topic. Firstly because people with unusual but similar views can't find each other through all the noise. Secondly because people who might be interested in making their own community/website might have difficulty learning how. I was thinking how useful it would be to have, for example, some kind of tutorial course for teaching people how to make their own website from the ground up, and then directing them to additional tutorials for expanding the website technology in different ways if desired. And have different layers for different amounts of depth, so that you can choose between programming the entire server from scratch if you want to do something in-depth like a game server, or gluing together some node.js abomination if you just want to slap something simple onto the web. Different people with different preferred methods of approaching it can all learn how to do it without having to collect fragments of information from amateur blog tutorials. Then repeat the same for things other than just website technologies, and make all of them easy to find.
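As a taste of the "program the entire server from scratch" end of that spectrum, here's a hedged sketch: a minimal POSIX C server that answers every connection with a fixed HTTP response. The port and the response text are arbitrary, and real code would need error handling everywhere:

[code]
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>

int main(void)
{
    const char reply[] =
        "HTTP/1.1 200 OK\r\n"
        "Content-Type: text/plain\r\n"
        "Content-Length: 14\r\n"
        "\r\n"
        "hello, world!\n";

    int srv = socket(AF_INET, SOCK_STREAM, 0);
    struct sockaddr_in addr = { 0 };
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    addr.sin_port = htons(8080);

    bind(srv, (struct sockaddr *)&addr, sizeof addr);
    listen(srv, 8);

    for (;;) {
        int client = accept(srv, NULL, NULL);    /* block until someone connects      */
        if (client < 0)
            continue;
        write(client, reply, sizeof reply - 1);  /* send the response, ignore request */
        close(client);
    }
}
[/code]

Everything above this in the stack - parsing requests, routing, templating, TLS - is exactly the material such a layered tutorial could add one piece at a time.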

I posted my projects on Zig Forums before and I never got bitched at actually.

You can't stop retards, especially from 4chan, from migrating and shitting up the place with crappy threads. It then turns your "higher quality" people off from visiting, who then either (a) socialize with their peers irl, or (b) find something else to do after work.

I never said to do it in 5 minutes nor that it had to be "ALL" of Gecko's internals you fucking kunt. I was thinking more like at least a 1 hour video or a series of 1 hour videos, teaching you at least the HIGH LEVEL ARCHITECTURE and maybe going into details about some interesting parts, not "all the internals".

He's more right than you are. Abstract data types are imaginary things; what actually exists in the physical world is transistors, voltage levels and current. If you want to do computations using abstractions then at some point these abstractions need to be turned into a labeled set of (V,I) pairs.

No, it's pure sophistry, and if this kind of obsessive thinking stops him from reading and comprehending a book on basic programming, it's actively harmful. None of this is relevant for writing a correct program. When you write arithmetic code, do you think of it as arithmetic or transistor gates? Were you physically unable to program before you learned how the machine works? Actually scratch that, do you still think a byte is actually a contiguous collection of bits in RAM? You'd be wrong. Yet mysteriously your code was still correct every time.

Starting to think of your data as some kind of abstract objects and disregarding what it represents brings you further from understanding what the program is and how it works as a "physical" thing in your computer.

Answer my question instead of spouting empty phrases.

I'm fairly ignorant about computer science. I want to bridge the gap between computers as voltage gates and computers as abstraction... Honestly I don't know exactly what the definition of a computer is. Does anyone recommend any educational material for me, besides Project Oberon, which I'm already dedicated to?

think about how many different sequences you can represent in binary with, let's say, 4 digits. [spoiler]the answer is 16[/spoiler]. Well a transistor is just on or off, so it's equivalent to 1 digit. Alright, but how does that let us compute? Well, let's make a very basic circuit where one transistor turns on if two transistors connected to it are on, and off all other times. That's an AND gate; let's make an OR gate and a NOT gate so we have all the logical gates we need [spoiler]duck logic gates[/spoiler]. Alright, so now given that, we could set up a series of complex arrays of gates to have a system where, depending on the input, we go into one of many predefined states, or output things based on the input. From there we add cache and memory and you'll have to research the rest. [spoiler]we keep abstracting it[/spoiler]

lol oops i fucked up the spoilers
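A tiny C sketch of the same idea: express AND/OR/NOT as operations on single bits, build XOR out of them, and compose those into a one-bit full adder, i.e. the simplest version of the "complex arrays of gates" described above (function names are made up for illustration):

[code]
#include <stdio.h>

static int NOT(int a)        { return a ? 0 : 1; }
static int AND(int a, int b) { return a & b; }
static int OR(int a, int b)  { return a | b; }
static int XOR(int a, int b) { return OR(AND(a, NOT(b)), AND(NOT(a), b)); }

/* Full adder: adds bits a, b and a carry-in, producing a sum bit and a carry-out. */
static void full_add(int a, int b, int cin, int *sum, int *cout)
{
    int partial = XOR(a, b);
    *sum  = XOR(partial, cin);
    *cout = OR(AND(a, b), AND(partial, cin));
}

int main(void)
{
    int sum, cout;
    full_add(1, 1, 1, &sum, &cout);              /* 1 + 1 + 1 = binary 11 */
    printf("sum=%d carry=%d\n", sum, cout);      /* prints: sum=1 carry=1 */
    return 0;
}
[/code]

Chain 32 of these together (plus registers to hold the operands) and you have the adder inside a real ALU; everything past that is, as the post says, more layers of the same abstraction.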

Hello newfriend. Please lurk for 2 years before posting.

The underlying problem can never simply be fixed by software. The demand for software has translated into an increase in the supply of developers, a significant portion of whom are morons who couldn't give less of a shit about code quality. So long as the bar of entry for programmers remains low and the demand for them stays high, low quality code built on mountains of failed abstractions will be the norm. The number of CS majors I've met with no passion or pride for their work whatsoever, who only care about computer science for easy income, makes it abundantly clear to me that technology will stay terrible regardless of any efforts we can make. Fuck learning the abstract foundations of a computer, they've never touched anything lower level than Java, and see no reason to do so. If you want to fix computing you'd have to kill the demand for software and technology, make computers as weak as they were in the 90s, and undo the last 3 decades of Windows having a pissing contest over how they can market to an even lower common denominator. No software project can ever fix that inherently social issue.

The problem there (which I don't have a solution to, sorry) is that bad programmers create more programming jobs because somebody has to maintain/fix the broken crap they produce, thus creating a "need" for more programmers. It's self-reinforcing.

Read a fucking book your comparison is shit

Attached: peasant.jpg (668x623, 111.11K)

Transistors are analog you mong

KISS or Keep It Simple, Stupid should be regarded as the first and only rule. I don't know much more than that.

urbit.org/

switchedtolinux.com

privacytools.io/

anonguide.cyberguerrilla.org/

learnpython.org/

maths.cam.ac.uk/sites/www.maths.cam.ac.uk/files/pre2014/undergrad/admissions/readinglist.pdf

strategyumwelt.com/frameworks/feynman-technique

Attached: free speech alternatives.jpg (640x456, 45.65K)

mailinabox.email/

This infographic is retardation

Attached: DEXATI20190715152139.png (1079x768, 300.85K)

No shit, user, I didn't make it, I just culturally appropriated it from normie twitter. It is meant to be disseminated among their kind, not ours. For actual OPSEC in the electronic realm, the best I can think of is encrypted and coded mailing lists among small groups.

Attached: cia glow niggers.jpg (327x346, 62.26K)

Attached: kys faggot.jpg (429x271, 9.04K)

It depends on whether you subscribe to the NPC hypothesis. I do, though evidence one way or another is scarce, so I'll be coming from that angle.

It's not so much a game of having everyone in the population be technologically savvy, but of having them see us as a sect that knows what we're doing and take their information from us instead of from the Apple commercial they just saw.

Lol?