RAGE thread: Tech edition

What is fucking wrong with these people? What could POSSIBLY cause them to behave like literal retards and shit up computers so much? Is this shit just general incompetence? I'm thinking I should just make a database of everyone that's involved with garbage webdev projects like this just to make sure I'll never work with, recommend or hire anyone that has ever touched this webdev cancer.

post made vague to not give any more attention to this webdev cancer

Attached: pierre bernard's recliner of rage.jpg (888x500, 147.51K)

Other urls found in this thread:

developer.apple.com/library/archive/documentation/OpenSource/Conceptual/ShellScripting/Introduction/Introduction.html
write.flossmanuals.net/command-line/introduction/
gnu.org/software/bash/#documentation
github.com/osnr/horrifying-pdf-experiments).
fastcompany.com/28121/they-write-right-stuff
zerohedge.com/news/2019-06-29/boeing-outsourced-its-737-max-software-9-hour-engineers-0
gnu.org/proprietary/proprietary-insecurity.html
twitter.com/NSFWRedditVideo

Oh yeah, post your own rage stories.

The computer ran like shit for a year and then supposedly quit working altogether. I say supposedly, because he gave it away to some dude at a garage sale for free, without even removing the hard drive first.

There is no video player for Android that plays webms reliably.

You must have some fucked up webms. They work in VLC, MX Player, the default Samsung video player, Firefox. I don't think I have an app that won't play them.

Imagine being so autistic that you need to take it upon yourself to decide how your own parents use their own PCs

Not rage but wtf moment.

I'll probably get a Radeon once my 960 finally bites the dust, but isn't it a little pathetic that there aren't any non-proprietary drivers for Nvidia? People stay away from Linux because they expect everything to just werk when they first use it, but on Linux that's just not possible, and that's why it's never going to catch on. You can shit on Windows and OSX all you want, but they work on first setup without any extra bullshit or CLI commands required, and that's more than any Linux dev can say.

Attached: Business Card.png (750x450, 415.94K)

Linux is like a religion, you need to join the Linux religion if you want anything more than a web browser out of it.

yet despite all the bullshit I still want it to be a good alternative. I am annoyed with Windows' spyware and SJW bullshit, but there's no good alternative. ReactOS is still a joke and will be until the end of time. But until Linfags get their shit together I don't have a choice.

There is a FOSS driver, Nouveau, but you can only expect any kind of performance out of it on 7xx cards.

Driver support, especially for graphics cards, is primarily related to how much the card vendor supports the OS. It has little to do with the quality of the OS.

...

Attached: Heavy Bait.png (680x1147, 916.11K)

Nouveau was what I tried to install, followed the directions and yet it didn't work

There is your problem. There is nothing wrong with Ubuntu for beginners, you can always switch to a more hands-on distro later. What you did was like going to the gym, seeing all the buff guys lifting 70kg weights and deciding that you should also lift 70kg weights despite never having lifted before. So of course you get yourself injured, and then you complain how the reason people don't get fit is because they end up hurt when they go to the gym.

Ubuntu lets you do most things from the GUI as well. You complain how Ubuntu is too much like Windows, but you still want to use a system like Windows. WTF is your problem?

If you want to use a distro for more knowledgeable users like Debian or Void, then you have to sit down and study how the pieces fit together. If you don't want to waste all that time, then use a distro that has everything and the kitchen sink included for better or for worse, and live with the bloat. Or as a third option, find someone who will install and configure a lean distro that just werks according to your individual autistic specifications. There is no magic solution.

Do not do this.
In any situation, you should get it from the distro's repository.
There is a free driver, it's called Nouveau. It's shit with any of the recent Nvidia graphics cards.

Also, next time you try Linux, if you happen upon the same situation, try running "nvidia-xconfig". X configs are mostly automatic these days.

This isn't windows, if it's a free driver it should already be included in the kernel.
Nouveau wouldn't need to be installed.

where does one even start with learning basic Loonix commands? Is there a manual I should start with or is it just a "learn as you go" kind of thing?

Attached: heav.png (705x651, 423.88K)

This is why we have kikes, niggerfaggots, sandniggers, and (You)

you gay

Attached: march-of-webdev.jpg (900x691, 182.99K)

It's GNU/Linux, and the part you need to learn is GNU (the userland), not Linux (the kernel). GNU is a Unix-like system, so any resource on Unix will do. There are a number of books and tutorials, some of them free. Start small with things like ls, cp, pwd, echo, touch, rm, mkdir, rmdir (don't bother memorizing those names for now). Then learn that all these "commands" are actually just the names of programs and that you can add new "commands" by installing or writing your own programs. You can write down a series of commands into a file and then run that file; that's called a shell script (because the thing that interprets your typed-in commands is called a shell).
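
The whole lesson fits in a few lines. A minimal sketch (the file and directory names here are made up for illustration):

```shell
#!/bin/sh
# Each "command" below is just a program found via $PATH.
mkdir -p demo                        # create a directory (no error if it exists)
touch demo/notes.txt                 # create an empty file
cp demo/notes.txt demo/backup.txt    # copy it
count=$(ls demo | wc -l)             # list the directory, count the entries
echo "files in demo: $count"
rm -r demo                           # clean up
```

Save that in a file, run it with "sh yourfile.sh", and congratulations, you have written a shell script.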

Back when I was a Mac fag I used this resource:
developer.apple.com/library/archive/documentation/OpenSource/Conceptual/ShellScripting/Introduction/Introduction.html

There is also a popular guide written by GNU:
write.flossmanuals.net/command-line/introduction/

I haven't read it, so no idea how well it is written, but it seems to be popular. You can also buy a printed copy if you really want to from the FSF.

Both. You need to know some basics, which is what the tutorials teach you. Beyond that you can always read the manual pages (also called manpages for short). Type 'man whatever', where whatever is the name of the command; for example, 'man ls' will show you the manual of the "ls" command. Be aware though that the manual is a true reference manual: it is very technical and expects you to be familiar with Unix conventions. Manpages are very terse and are usually not proper user manuals. For a large, nicely written user manual you have to look up the project's page; usually there is a link to a full manual. For example, GNU Bash's page links to the full manual in various formats:
gnu.org/software/bash/#documentation

In the case of GNU projects you can also often type "info whatever" to read the large manual in the terminal. It uses the standalone info reader which has retarded key bindings though, so you should instead read "info info" first. Or use another info reader. Or just get the manual in HTML or PDF format. Or buy a printed copy.

A final word on documentation: Ubuntu often does not install the full manuals when you install a package. For example, if you type "info bash" you will only see the manpage. To get the full manual you will have to install the package "bash-doc" (sudo apt install bash-doc). In general, if the package is named "whatever", then the documentation is in "whatever-doc".
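
To make the naming convention concrete, here's a sketch (assuming an apt-based system like Ubuntu; the "-doc" convention is common but not guaranteed for every package):

```shell
#!/bin/sh
# Given a package name, derive the conventional name of its doc package
# and check whether a manpage is already installed.
pkg=bash
docpkg="${pkg}-doc"                           # "whatever" -> "whatever-doc"
manpage=$(man -w "$pkg" 2>/dev/null || true)  # path to the manpage, if any
echo "manpage: ${manpage:-not found}"
echo "to get the full manual: sudo apt install $docpkg"
```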

go to terminal
type man man
go from there

man intro

Try Parrot or some other distro with a Windows-style theme, those are pretty nigger tier.

Know how to identify who's responsible for what.
There's a free/libre driver for Nvidia GPUs that required a lot of REVERSE ENGINEERING, and Nvidia, like the faggots they are, force blobs on newer GPUs to be signed, so even if there's free/libre software made to replace those parts, it can't work since it needs to be signed by Nvidia.
The only reason hardware doesn't work on GNU/Linux is that hardware manufacturers are turbo cunts.

That's because they don't know how to use their computer either. They don't even know how to correlate physical and digital functions. Even when blatantly obvious.

Install LFS one day and then come back and try to say the same thing again.
It's not the software that needs to catch up to people. It's already extremely simplified, and if you go any further in simplification we will be at Idiocracy levels of simplification.
No, it's people who need to catch up to software. They need simple generic information and pedagogical content so they can correlate the physical tools they use every day with digital tools.
For example, just naming the tools they use instead of the brand: spreadsheets (not Excel), search engine (not Google), software (not apps), "the World Wide Web" or just "the web" (not Google), a web browser (not Google), video hosting service (not YouTube), et cetera.

Typical response of faggots that shouldn't be allowed to use anything more complicated than a lawnmower.

Bait, anyway.
Git gud. Install Gentoo.
You can go like that, but you don't need to learn that sort of thing if you're a normie.
Basically just list the functions you need: a web browser to surf the web, spreadsheet software for spreadsheets, a video player to play videos, an email client to read and send email, and then just install them via the simplified installer systems that exist on more beginner-friendly distributions.

Imagine being so autistic that you don't care about your parents having to suffer through a shitty user experience on their computer, which culminates in your dad giving a stranger a computer with your parents' banking and tax information + photos and videos of ur mum getting pounded on it. You're a bad son.

People who install Adobe Reader need to be shot.

Why not something mupdf- or poppler-based? I've never used Adobe Reader enough to like or dislike it.

adobe is bloated to hell and back, it's 600+MB when Sumatra is only around 60.

ok maybe not 600MB but way more than it should be

Sumatra is bloated to hell and back, it's 60+MB when mupdf-x11, muraster, mutool, and libmupdf.so are only, exactly, 57.9K, 29.9K, 98.1K, and 42.7M respectively, which could further be reduced by statically compiling it. I don't even know how you are getting that 60MB figure. Is that just the standalone sumatra.exe without any of its companion dll/exe's?

I don't rage, but I simply can't understand people who pay good money to make their computer slow to a crawl with antivirus software. I was tasked to upgrade a family friend's computer to Windows 10. It was running super fast after upgrade, then I realized it was still doing updates and probably in safe mode with networking enabled. As soon as it finished and rebooted, it took forever to fully load again. This person will have to spend $400-$500 on new hardware just so some POS antivirus software can run on a POS OS.

It's superstition, plain and simple. They think antivirus is like a magic charm that protects them from the bad, and who can blame them if their OS of "choice" is a swiss cheese that advertises this particular scam?

Antiviruses help running cracked software with less risk of catching malware. It's not magic, it simply does morphological analysis on binaries and code that falls semantically close to the samples they've seen in the wild gets tagged as malware. No matter how the OS works, short of a "capabilities" based OS (also called sandboxing, etc.), programs will at the very least have access to your personal files, even on Linux. Android admittedly has a better security model in that regard. Having a program that automatically analyzes software for unwanted behavior reduces the risk when running unaudited binaries. The problem with antiviruses that are not continually running and scanning memory, is that programs may encrypt their payload on the disk and only decrypt them on RAM, and have anti-debugging features that stop them from working when virtualized by another program (such as a potential antivirus). Will it make the PC slower? Sure. Is it worth it? Depends on how valuable your data is, how much manual monitoring you're willing to do so you detect an intrusion if it does indeed happen, how good is your backup system, how fast is your hardware, etc.
The main problem is that antiviruses themselves are a privacy risk. I had Avast installed but I found out the Windows kernel was making connections to other IPs geographically close to me. I had the "delivery optimization" stuff disabled, so figured it must have been Avast doing some shady p2p updates shit. Uninstalled it and voila, no more weird connections from "System" (I had blocked all the known Microsoft IPs beforehand to the point where the OS didn't show any connections when not using programs that access the network).

>it works okay after digging through a heap of different drivers on websites
And yes, it's a 960
Only real problem I've had is nVidia's removal of proper DRM/KMS support in favor of a stubborn contrarian attempt to exclusively support EGL.

Attached: works-on-my-machine-starburst_2.png (497x480, 236.28K)

I install mupdf on PCs that don't need the printing function.
I was not aware of its existence.


In any case, all of these are faster and less bloated than the botnet that is Adobe Reader. And I'm not exaggerating when I say botnet: if you read the EULA, you give Adobe authorization for Adobe Reader to be part of their private meshnet, sharing RAM, CPU power, HDD space and more.

This. It's because they don't even know that an antivirus just compares the file hash against a database of known malicious files.
Once they understand that (and also the fact that they install the malware themselves), you can install ClamWin.
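
Signature scanning in its simplest form really is just a hash lookup. A toy sketch (the "database" here is a made-up file for illustration; real AVs use huge signature sets plus heuristics on top):

```shell
#!/bin/sh
# Hash a file, then look the hash up in a list of known-bad hashes.
printf 'pretend malware payload\n' > sample.bin
hash=$(sha256sum sample.bin | cut -d' ' -f1)
echo "$hash" > bad-hashes.txt          # our fake signature database
if grep -qx "$hash" bad-hashes.txt; then
    verdict=malware
else
    verdict=clean
fi
echo "sample.bin: $verdict"
rm -f sample.bin bad-hashes.txt
```

Which also shows why this scheme is so easy to defeat: flip one byte of the payload and the hash no longer matches.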

99.99% of all malware for normie users can be avoided by:
-Not installing non-free/libre software.
-Not loading JS in the browser.
-Deactivating macros for .docx/.xlsx.
-Using a minimal PDF reader (github.com/osnr/horrifying-pdf-experiments).
-Not using an admin account

With just these five, even on Windows (except 10), users' PCs stay pretty sane for a long time.

m8 you just had to read the privacy policy.
Never trust non-free/libre software.

You don't belong here.

Fixed it for you:
use Microsoft's own PCs
Imagine letting your parents use nonfree software that hurts their freedom and privacy. You're a bad person.

Attached: Who Writes Linux Infographic Sept 2013.png (300x150 133.88 KB, 2.83K)

...

Security "proofs" are LARP, but you're right about auditing. Muh 1000 eyes is a complete meme.

I'm aware of all this, and your point is?
We have legal control of the sources thanks to the gplv3 and that's what matters.

Thanks again to the geniuses who developed this and let smartphone manufacturers embed it and DRM it. seL4 and its variants are, like Minix 3, among the most used pieces of software in the world. Thanks again for licensing it under the flawed GPLv2 and the permissive BSD-3/Expat.

Linux is not GPLv3.

I think he meant GPL at all and freedom it gives. But you're right it's GPLv2 only.

Even if just one person checks the source code of a program, it's still better than nonfree software, where not only do developers often add malicious features and hide 'unintentional bugs', but they also create bugs that are hard to fix and that only crackers know about. And if you discover a bug in nonfree software, you can't fix it without begging the developers to do it.

With free software we have developers, users, white hats and crackers checking the code, whereas with nonfree software only the developers see the source and it's mainly crackers who reverse engineer it, because it's profitable for them. Even if a user of a nonfree program knows programming, he/she probably knows nothing about disassembling. Also, white hats are less likely to audit this software, because it's not their job. When auditing free/libre software you own the thing you're contributing to, so it's your business to make the program work well. No one will do Microsoft's job for them, because they get nothing out of it.

I wonder why M$ hasn't started a good goy points system on shithub where those with politically correct enough standing in their commits can become low tier Wangblows devs and get a special badge on their account plus increased posting privileges; that way Microsoft could fire its entire pajeet QA contingent.

Ain't them pajeets diversity hires anyway?
*ba-dum-tssss*

You fucking wish. Most open sores code has been read by its author at most, if even that. Do you believe that mistakes in Wikipedia get corrected, too?
It's not the number of eyes that matters, it's the quality, and in FOSS, the quality is abysmal. Most of these "selfless" whitehats get paid for their work. Which is how things should be, don't get me wrong, but they could just as easily get paid to work on proprietary shitware. You seem to have bought REALLY hard into the open sores meme. Argue all you want whether FOSS is morally right or something, but "available source = better code" is completely false.

wasted 10 minutes debugging this a few times this week after forgetting it's a thing

Yes, which implies security, which open sores is no better at.
Wow, great, fabulous.
You realize anything evil in firmware blobs can simply be shifted to silicon, right? Also, open sores, copyleft license or not, is completely irrelevant to any restraint whatsoever against stuff like DRM and hostile management hardware.

HAHAHAHAHAHAHAHAHA!!


They're actually the first step along a path that modern software (especially the small, low-level, low-maintenance, security-centric software open sores whores pride themselves on, such as kernels, libs, and APIs) desperately needs to embark on. Self-documenting code, separating implementation from standard, proper "big design up front" waterfall development, etc.:
fastcompany.com/28121/they-write-right-stuff
Developers need to act less like retarded toddlers, and more like professional engineers.

Attached: sonatype_survey_infographic.jpg (1350x4800, 1.37M)

They need to be treated and paid like professional engineers if you want them to behave like professional engineers. Instead, we have critical aerospace systems being farmed out to $9/hr pajeets:

zerohedge.com/news/2019-06-29/boeing-outsourced-its-737-max-software-9-hour-engineers-0

And plenty of other areas of software development are similarly a race to the bottom. Companies drive most software development, including open source development, and most companies would rather get "developers" on the cheap who can code up something that works "good enuff" than pay for properly engineered systems.

I think a large part of that problem lies with the fact that programming, even the absolute simplest macro scripting that barely qualifies as programming, is still stuck in the stone ages compared to any other aspect of using a computer. Entering arbitrary plaintext into a file, handing it over to a compiler/interpreter, getting cryptic non-feedback from a debugger if it fails: none of it has really advanced since the 1960s.

If the UI for programming had kept up with other software, not only would actual low-level programming be more efficient, but a lot of things that should just be done by normalfags instead of dragging down the prestige and pay of programming as a profession, actually would be.

We need better scripting for normalfags, and better automation throughout standard UI, so that "programming" will neither be nor need to be polluted by normalfaggotry.

Attached: Eager.mp4 (640x480, 6.66M)

It is people like you who are responsible for the world becoming niggerized

You can do all that without pretending that an unreadable "proof" (that five people understand) being accepted by the holy program (whose implementation three people understand) is meaningful. A good first step would be to introduce (or bring back) safety margins into programming. The current crop of programmers is pathologically obsessed with eliminating every safety feature it can find, at horrendous cost. But honestly, if you look at the deskilling most visible in webshit, and the C weenies who welcome every single thing that would make an engineer's hairs permanently erect, this won't happen without a big fat purge.

I hope it never catches on to the cancerous marketshare levels of Windows. I'm very glad it's relatively niche.

means fuck all when management values costs more than quality, which is the whole reason for that shit (and sad to say they're right, because apparently it works).
It wasn't the $9/hr programmer that downed the airplanes, it was the people in charge of QA and design, plus the schemes to get around re-certification and training. Pajeets shitting out shit code means nothing if it works (because that's what the suits see); if it doesn't, how much does it cost to "fix" it?
It's all about the money, always has been, always will be.

Windows braindamage never ceases to piss me off.

I imagine developers of nonfree software reading over the 100 LOC they wrote and understanding everything, it's so real! (Yes, because nonfree software is often more bloated; for example the recently open-sourced Windows Calculator is about 140MB of code, while gnome-calculator is 500KB or so.)
Comparing Wikipedia to free software isn't 100% right, because:
1. Wikipedia allows anonymous editing, without even having an account; free software projects often don't, and even if they do, developers will probably check what code they are accepting.
2. Wikipedia is a broad project with millions of articles and contributors who often don't know each other. Free software projects are often much smaller, developers know each other, and if they don't know someone, again, they will check what code they are accepting.
3. On Wikipedia, if an article is not good, nothing happens until someone who knows the truth comes along and points out that the article is bullshit. On the other hand, if someone makes a bad commit to a free software project, the software is likely to be slower, buggy, or not working at all; a commit is easier to validate than an article on Wikipedia.

But even articles on Wikipedia are corrected sometimes, and Wikipedia has mechanisms for keeping article quality up, for example locking articles against editing and marking them as protected. Same with free software: you can't just go to kernel.org and change the code anonymously, because the project is on a higher level of security than shitty one-person projects from GitHub.
Nonfree software is not better.
gnu.org/proprietary/proprietary-insecurity.html
Availability of source code doesn't always equal better quality, but if a project is led by the right people, the open model of development is actually better. Just look at all the big companies switching from proprietary to open models: Google's Android and Fuchsia, Apple freeing their tools, for example WebKit and the Darwin OS, even Microsoft recently freeing .NET (ignoring the fact that they hold patents on it, so they can sue you if you modify it) and tools like the Terminal application and the calculator. They're all switching to this model because it is cheaper, because people contribute and report bugs gratis. I myself have never reported a bug for Windows, or any proprietary program, because they don't pay me for this. I have a 'product' and I expect it to work. On the other hand, I have reported many bugs to free software projects, because I own the code I'm contributing to.

This obsession over Microsoft is typical for FOSS people but there is other proprietary software. I brought up Wikipedia because it gets advertised on the same tired "thousand eyes" principle and fails in the same way: Just because people could in theory proofread everything, that doesn't even remotely mean that they do. Most project leads do not proofread, they at most scan over contributions. This goes even for high profile projects such as Linux, check out some of the horrifying driver code sometime. There is no way in hell anyone read that.
>gnu.org/proprietary/proprietary-insecurity.html
The GNU servers seem to be down right now, so this will have to wait.
What exactly do you mean by "better"? Also this is special pleading. Why may FOSS projects assume good leadership but proprietary projects may not?

For comparison, a non-bloated calculator:
$ du Calculator
8.0K    Calculator
$ file Calculator
Calculator: AmigaOS loadseg()ble executable/binary

Attached: amiga1wb13.png (640x512, 11.33K)

Did you actually expect women to give a shit rather than virtue signal?

Actually Linux has a problem, because it is a badly designed monolithic kernel. The Hurd is better.
It's up now.
In my opinion good quality is harder to achieve with the proprietary model, because people coding proprietary programs often do it just so they don't starve. I've been in a corporation myself, and well, what I saw was tired people, one grandpa was even sleeping during work, and the code was a total mess. No one forces you to maintain a free software project though; people often do it as a hobby, they have fun while coding and they have less pressure. They won't use shitty hacks just to push the software into production, because they're not paid for their work. There are some proprietary projects led well enough, but there's a reason big tech companies switch to the open model. Maybe coders are happier coding free software, and because of that care more about software quality, I don't know.

and

mupdf is that small because it is not a 1:1 equivalent to SumatraPDF. The latter might be bloated, but you'd have to hack something up with that suckless tab tool and some sort of outline support to get closer (outlines seem to be present only in the OpenGL version of mupdf). It's obnoxious when people insist their favorite crappy tool is equivalent to a program with more features. I use mupdf, and its only advantages are that it's fast and there's not much getting in the way; half the time I'm using PDF Tools for Emacs simply because there's more I can do there. SumatraPDF also has a lot of Windows-specific crap in its codebase, which may also have something to do with the executable size.

...

The problem with GNU/Linux is that people aren't dedicated enough to make it convenient. Rather, there aren't enough people who want convenience on a platform that barely anybody uses (only 2-3% of all PC users).

Microsoft Windows (although closed-source) gives you a lot of convenience despite being proprietary. It basically works with the principle: "for everything, with everything". Linux magically expects you to know how the command line works from the very beginning (especially hard when there's no internet nor a manual) for the most menial of tasks. I get that not every program needs a GUI, but Ease of Access is something that GNU/Linux Distros have never been able to fully grasp.

The reason Android succeeded is because they worked on the principle of convenience by promoting their Operating System as the better alternative to Apple's Closed-Source hard-locked "for our approval, with our approval" mandatory registration Mobile Operating System. Basically anybody could make their own phone and sell it unlike a company that works on the basis of pursuing a Monopoly of middle-class idiots. Android had a sense of freedom where basically any company today can dedicate a little bit of their time to make a phone better than what the iPhone ever was. Every single one of these companies had something to gain from this while making it worthwhile for everybody else to use. That's how the true FOSS principle works...

And the result is today nearly 80% of all Smartphones in the World are Androids.

Same thing could happen to GNU/Linux someday, but they need to start dedicating themselves to the "for everything, with everything" principle to win people over instead of arguing with each other over pointless debates.
Until then, we're walking through tar.

Attached: morninglonelytree.png (1920x1080, 57.73K)

Tryhard. New lusers only need to learn to install zsh or fish and then spam tab to do whatever they desire.

The 64bit standalone executable is 6.91MB.

blablabla linux the best
Look at my filesizes, without including the filesizes of its dependencies used exclusively by that program
and ignore that they can only do one format while the other can also display other formats and is a lot more polished.

Attached: adobe reader.jpg (2040x8832, 2.38M)

linux is best when you use it in the most classic way, as the Unix knockoff it once was. All that legacy stuff is in there. Build your own vt100 terminal and hook it up to your serial port. Use X11 with a classical toolkit at a low resolution. That stuff just works and you can run it on a fucking pocket calculator. Build your workflow around classical usage. It requires getting the soy out of your system first, but it's very efficient, and after a while you can only laugh at the problems "modern linux users" have. It's hard to do because you actually have to know how things work and become a wizard, but it's very rewarding once you do. People get irrationally angry at you if you talk about going the classic route; my guess is because they themselves know they wouldn't have what it takes and are lost without modern handholding and bloatware. As a bonus you'll also quickly realize we went very wrong somewhere around the early to mid 00s.

Things go wrong when you try to turn it into a free (as in beer) Windows knockoff with maximum normie/soyboy appeal, transparent windows, Dolby surround sound, gayming et al. Hint: drop the charade and just use Windows. Trying to please everyone in the GNU/Linux landscape just ends up pissing off everyone. Everything that has been bolted onto GNU/Linux to give it normie appeal or make it more similar to Windows (systemd, modern GNOME et al.) is utter garbage written by people who actually want something entirely different from what it was once intended to be.

Apropos of nothing: try using Android as a desktop. Doesn't fucking work. I seriously tried, because I wanted a machine for botnetting and I am always open to trying new things myself. Android with keyboard and mouse and one of the small ARM SBCs seemed like a good choice. It's theoretically supported to be used that way but honestly, it's garbage. Like using a Fisher-Price toy. Android in general is cobbled together, poorly documented garbage and eventually it'll collapse in on itself. They just used the Linux kernel because it was already there and it was easy to hack for the different ARM SoCs. That's also all they're doing. Hack.

Use a pen and paper. Realise electronics are shit. Feel 1337 because you have the best workflow in history.

Enjoy filing and sorting 1000 pieces of paper. Enjoy your literal copy and paste.

Actually, I take that back about Avast. I think I was looking at TIME_WAIT connections (maybe from my torrent client or firefox), which are just connections that are taken over by the kernel for a few minutes after being closed to catch any delayed packets and prevent them from disrupting new connections.
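
For anyone who wants to verify this on their own box: a quick sketch that counts TIME_WAIT sockets straight from the kernel's table (assumes Linux's /proc interface; "ss -t state time-wait" from iproute2 shows the same thing with nicer output):

```shell
#!/bin/sh
# In /proc/net/tcp the 4th field is the socket state in hex;
# 06 is TIME_WAIT. Skip the header line, count matching rows.
tw=$(awk 'NR > 1 && $4 == "06"' /proc/net/tcp 2>/dev/null | wc -l)
echo "TIME_WAIT sockets: $tw"
```

These entries are owned by the kernel, not any process, which is why a firewall/monitor attributes them to "System" on Windows.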

What if the police raids your home? Then they'll be able to read anything you stored on these papers.

How well did your attempt to play the programming game in C go?

...

that looks amazing.

real difficult


Whereas computers are notoriously hard for people to access right?