Technology, at least commercially available technology, should have stopped advancing after the year 2000.

During the '90s, especially the late '90s, technology hit the sweet spot: it was useful, it opened up new opportunities, it made our lives easier, and at the same time it wasn't as pointlessly addictive and full of botnet as it is now.

Despite objectively better hardware and a better connection infrastructure, newer devices offer pretty much nothing we couldn't already do in the late '90s, minus the fancy bloated interfaces.
We are effectively utilizing more resources for useless shit that doesn't actually benefit us in any way.

While there is legitimate use for this increased processing capability (think scientific or real-time computing), the average consumer never needed such powerful hardware before bloated software and the bloated web came along, and they still wouldn't need it now if software had stayed lean and light.

I firmly believe that commercially available technology started evolving in the wrong direction after the year 2000 or so. While I think the situation was overall better in the past, I'm not saying it was the best possible scenario ever. We still can, and should, do better; we just have to get back on the right track.

tl;dr technological innovation took a wrong turn after 2000

Attached: comfydoom.jpeg (500x500, 38.25K)

Other urls found in this thread:

primitiveways.com/agriculture.html
washingtonpost.com/wp-srv/national/longterm/unabomber/manifesto.text.htm

You're completely right, and for the most part we're still using much of the same software that was around in 2000. There have been a few good advances (laptop batteries, cheaper SBCs, affordable FPGAs, digital SLR cameras, etc.), but nothing significantly different, certainly not in the transformative sense of 1980 to 2000.

Just because you have no use for modern computers beyond playing video games doesn't mean other people aren't doing real work that will always need faster hardware.

This. If your use case is shitposting you can use anything, but people who compute with their computers appreciate the advancement.

What kind of "real work" do you do that always demands faster hardware?

Absolutely not. Technology should've stopped advancing after the Upper Paleolithic.
Agriculture only brought weakness and suffering to the human race: from apex predators we became weak pushovers, able to defend ourselves only in groups.
You can read more about how agriculture was an act of desperation here: primitiveways.com/agriculture.html

We shouldn't have progressed past the Levallois technique. Neanderthal technology was already enough for a beautiful, resilient, and cultured species.

Not only that, from a healthy, thriving species with humongous brain capacity we became unhealthy, oversocialized slobs with no control over our own lives and fates, mere pawns in the hands of the techno-industrial system.

Attached: a447637c-3faf-4c86-8f37-4045d08cbfc2.jpeg (720x561, 64.38K)

ReactOS shows how bloated Windows is

Dumb samefag who completely missed the point.

Do we really need these people doing their "computing" for a better life? Last I checked, most technological progress has ended up pushing even more Orwellianism rather than enhancing your life or mine in any meaningful way.

For example, why do we need smartphones when we should have just stopped at cell phones with voice and texting? What have smartphones done for us besides making people even easier to hate than they already were?

It's like you didn't even read the OP.

>Despite having objectively better hardware and a better connection infrastructure, however, newer devices offer pretty much nothing we couldn't do already in the late 90s, except we didn't have those fancy bloated interfaces.
Patently false. PC-wise, the fastest and most expensive single system in 1999 was probably an SGI VW 540, which you could load with up to 2 GB of RAM, quad Xeon 550/2Ms, and three 10K SCSI disks internally. Cobalt graphics weren't gaming-oriented, but they blew everything else out of the water at OpenGL, and paired with a Voodoo2 SLI setup they could be pretty great for anything, aside from DirectX or games that won't run under Windows NT (or 2000).
That said, I'll admit software became garbage after the late '90s.

How did the industrial revolution make things easier? How did agriculture make things easier? You can go on and on, and you always end up at the same fact: all agricultural technology is useless and is used to enslave humans.

Agriculture made humans weaker, made them dependent on crops not failing for their food, ruined their diet, and left them frail and specialized in only one thing.

The industrial revolution just made things worse on a global scale. You can figure this out yourself, but as a fun fact, some philosophers thought the industrial revolution would shrink our work week to 20 hours. Yet here we are, still working 40+ hour weeks.

But that was exactly OP's point: current software quality simply doesn't match current hardware quality, the former having worsened and the latter having improved, botnet aside.
What good is powerful hardware if the software it runs is garbage?

Be unborn.

washingtonpost.com/wp-srv/national/longterm/unabomber/manifesto.text.htm

It really depends. There are no *great* DEs for Linux, but there are a lot of WMs that give you a '90s interface (IceWM for example). So, the GUIs today are terrible, but you can't claim that a workstation with a Threadripper + an NVMe RAID + OC'd RAM wouldn't blow the shit out of the best '90s box. We could do so much more with modern hardware; it's just that we have too many clodhopper idiots programming.

No, the 90s (and to a lesser extent late 1980s) were when everything fell apart:

Attached: report post.jpeg (211x120, 6.33K)

Read it 3 times.

What do you mean? Are you implying our current standards of living are superior to those of Paleolithic times? If yes, please respond with your reasons, and I will refute most, if not all, of them (provided they aren't made-up standards).

Well, no shit, who claimed that? Have them run the exact same application, and of course the former will murder the latter into oblivion.
As I said, the issue is not with hardware itself. Current hardware is undeniably better, anyone who denies this is just a delusional neckbeard. The issue, as I already said, is that it's not actually used to do more, bigger, better things, and we have idiots programming, as you pointed out.
That also has the nasty side effect of letting programmers get away with inefficient, unoptimized solutions and less discipline in general. I find it quite ironic that all this bloated shit software seems to come from proponents of "agile" methodologies, meme design patterns, and the like, while programmers who couldn't give two shits about any of that had far more discipline and produced lean and mean end products.

Sorry, just re-read your post. I really suggest you read Industrial Society and Its Future; it has all the answers you need. But in short, yes, I agree productivity has increased, but is it really for the good?

The hunter-gatherer has a 40-hour week. He spends most of it on activities that most hunter-gatherers consider leisure (and which, unlike surrogate activities such as reading books or advancing science, actually fulfill basic needs): chasing game, gathering plants, and so on. The rest of the time goes to activities that are boring to the hunter-gatherer, such as skinning, cooking, tanning, weaving baskets, and collecting firewood. But the funny thing is, after work, the primitive man either socializes with his tribe and family or does literally nothing (actually nothing) for hours. Doing nothing is inconceivable to the modern man, yet the hunter-gatherer finds leisure in it.

Leisure time in modern society, however, is spent fulfilling your power process (roughly, the requirement for every human being to have control over his own life and feel fulfilled; it's described very well in ISaIF) through surrogate activity (as described above): reading books, browsing the internet, having excessive sex, et cetera. It's spent on fulfilling yourself, because you can't do that through your job the way the hunter-gatherer does. The primitive man fulfills the power process by hunting and the general hunter-gatherer activities that make him use his body to the fullest to sustain his own and his tribe's needs.

If by better standards you mean we are healthier, then you're outright wrong. Civilization creates problems and then tries to solve them (failing miserably in the process). For example, with agriculture came permanent settlement, with permanent settlement came high population, and with high population came disease. Technology tried to solve it and succeeded, but nature is fighting back with superbugs (bacteria resistant to antibiotics) that can destroy humanity at the flip of a switch. You might have heard that some bacteria are now getting resistant to hand sanitizer.

If by better standards you mean we eat more and drink more, then you're wrong again. Hunter-gatherers are known for their amazingly balanced diets (without even knowing what a diet is! Would you believe that rationalizing everything doesn't always work out well for you?); you can read the actual info in pic related.

If there's anything else you claim is progress but is in fact regress, feel free to question me.

Attached: 87080e79ad46f4f4c573d78bd13272b05f76e0bf0e8a1bbffdf9772286a7f956.gif (236x255 6.17 KB, 14.81K)

My point is that we are so insanely productive, interdependent, and specialized, that for nearly all of humanity in industrialized society, the amount of labor required in a workweek to maintain a perfectly acceptable standard of living and technological advancement isn't 80 hours, 40 hours, 20 hours, or whatever, IT'S ACTUALLY ZERO.

Industrialization has granted 99% of the population the opportunity for what was once the exclusive preserve of a handful of nobles, but only if we make a philosophical rather than technological advancement: Freedom from toil.

Surely they still teach people how to (properly) optimize programs in school? I'm also puzzled as to why it got this bad.

...

Costanza asked Sussman why MIT had switched away from Scheme for their introductory programming course, 6.001. This was a gem. He said that the reason that happened was because engineering in 1980 was not what it was in the mid-90s or in 2000. In 1980, good programmers spent a lot of time thinking, and then produced spare code that they thought should work. Code ran close to the metal, even Scheme — it was understandable all the way down. Like a resistor, where you could read the bands and know the power rating and the tolerance and the resistance and V=IR and that’s all there was to know. 6.001 had been conceived to teach engineers how to take small parts that they understood entirely and use simple techniques to compose them into larger things that do what you want. But programming now isn’t so much like that, said Sussman. Nowadays you muck around with incomprehensible or nonexistent man pages for software you don’t know who wrote. You have to do basic science on your libraries to see how they work, trying out different inputs and seeing how the code reacts. This is a fundamentally different job, and it needed a different course. So the good thing about the new 6.001 was that it was robot-centered — you had to program a little robot to move around. And robots are not like resistors, behaving according to ideal functions. Wheels slip, the environment changes, etc — you have to build in robustness to the system, in a different way than the one SICP discusses. And why Python, then? Well, said Sussman, it probably just had a library already implemented for the robotics interface, that was all.
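
For anyone who hasn't lived it, here's a minimal sketch of what that "basic science on your libraries" workflow looks like in practice. Plain Python; mystery() is a hypothetical stand-in for any under-documented third-party call, not a real library:

# A minimal sketch of "doing basic science on a library":
# instead of reading a spec, you probe an opaque function with varied
# inputs and watch how it reacts, then guess at its contract.

def mystery(xs):
    # Pretend this came from some package with no usable man page.
    return sorted(set(xs), reverse=len(xs) % 2 == 0)

probes = [
    [],            # edge case: empty input
    [1, 1, 2],     # duplicates
    [3, 1, 2],     # odd length
    [4, 3, 1, 2],  # even length
]

for p in probes:
    try:
        print(f"{p!r} -> {mystery(p)!r}")
    except Exception as e:
        print(f"{p!r} -> raised {type(e).__name__}: {e}")

The toy function doesn't matter; the point is that the workflow is experimental (poke, observe, infer) rather than compositional, which is exactly the shift Sussman describes.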

I hope you're not claiming hunter-gatherer work is toil, because it isn't. But I do have to disagree with you.
Without human intervention, the current industrial system at least would fall apart in a matter of seconds. Humans are still at the core of industrial society, and they still have to toil to keep the machines running. Not only that, industrial society made up shit like service work, which is outrageous; it's the most abstract kind of job imaginable.
What the industrial system needs to function automatically and independently is to let the machines control themselves and all human action (i.e., AI). That would of course mean the end of humanity as we know it, as AI would know better (perhaps incomprehensible to us) ways to manage society than we do.

Large-scale simulations. It would be nice if meteorologists could make semi-accurate predictions even an hour in advance. Machine learning is incredibly taxing on computers as well. As soon as hardware becomes fast enough, problems that were deemed impossible to solve become solvable. It never ends.

Is hunting down something to eat (let alone butchering, preserving, and storing it) toil compared to pressing a button and instantly receiving food, especially when you don't feel like hunting right now? You fucking bet.
But how much human labor is truly needed? My point is that, if we reduced our standard of living/progress by 10x, 90% of the population would no longer need to work, period. Modern industrial society, however, is TENS OF THOUSANDS OF TIMES more productive.

The need for toil today is literally an illusion constructed out of pure ideology, absent any material basis whatsoever.

Ted Kaczynski explains in ISaIF why doing nothing at all is the worst idea. It's a good read and will answer most of the questions you have. I can't be bothered to open it right now, but at least please read about the power process and surrogate activity; they are actual chapters in ISaIF.

And yes, I agree, it's pure ideology. That's why I noted the gov will always come up with new jobs and shit to keep the citizens occupied.

What about artificial intelligence? The impending singularity? It seems like life has been made more convenient over the years despite many problems, but that happens with absolutely everything. We need more powerful computers to create more powerful AI. Progress can't be stopped.

I trust neurobiologists more than I trust sensationalist yellow journalists. The former say AI will hit a wall, and soon.

Sussman is the epitome of a butthurt lispfag.

I refuse to believe that, given freedom from any obligation to claw and scrabble for base survival, most or even many of humanity would collapse into alienated neurotic masturbation. Greater material freedom has always resulted in an outpouring of human potential to new fields.


"Strong"/"general" AI is unquestionably possible for us to invent, unless you're an autistic dualist. The likelihood of our current research efforts actually yielding in any specific timeframe? Pretty much totally unknowable, as our grasp on the function of the human brain (let alone other more alien sapiences that are possible) is still extremely basic. It's a bit like fusion power, in that it would be cool to have, but there's no indication of when exactly somebody will stumble on everything needed to make it work, so we can't rely on its future existence when planning our technological development.