Go core devs on suicide watch

TL;DR: My GF wrote a coroutine library whose context switches are more than 20 times faster than Go's, and basically Go is cringe tier.
My GF wrote a C++ coroutine library. When she benchmarked it on her Intel(R) Core(TM) i7-8700 CPU (3.20GHz) on 12 threads, she achieved a system-wide throughput of 166MHz (6ns per switch) for coroutine context switches, while the equivalent Go code (also 12 threads, same number of goroutines and iterations per goroutine) performed miserably at 7MHz (140ns). Compiled with gccgo -O3 it was even worse: 3.8MHz (263ns). This means her implementation of coroutines is more than 20 times faster. How can you fuck up the main feature of a language that badly, considering Google had a whole team of developers working on it, and she did this on her own?
Benchmark suite repo: github.com/sm2coin/libcr-test
Go benchmark code (excuse the shitty code):

    package main

    import (
        "fmt"
        "runtime"
        "time"
    )

    func loop(i int, finish chan int) {
        for i > 0 {
            runtime.Gosched()
            i = i - 1
        }
        finish <- 1
    }
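The posted snippet is cut off before main. A reconstruction of the missing harness, assuming the 600-goroutine count from the Rust port posted later in the thread (the iteration count is scaled down here so it finishes quickly):

```go
package main

import (
	"fmt"
	"runtime"
	"time"
)

// loop yields to the scheduler i times, then signals completion.
func loop(i int, finish chan int) {
	for i > 0 {
		runtime.Gosched()
		i--
	}
	finish <- 1
}

// benchmark spawns n goroutines that each yield iters times, and returns
// the measured throughput (MHz) and cost per context switch (ns).
func benchmark(n, iters int) (mhz, ns float64) {
	finish := make(chan int)
	start := time.Now()
	for g := 0; g < n; g++ {
		go loop(iters, finish)
	}
	for g := 0; g < n; g++ {
		<-finish
	}
	secs := time.Since(start).Seconds()
	switches := float64(n) * float64(iters)
	return switches / secs / 1e6, secs * 1e9 / switches
}

func main() {
	// 600 goroutines matches the Rust port later in the thread; the
	// iteration count is reduced here so the run finishes in a moment.
	mhz, ns := benchmark(600, 10000)
	fmt.Printf("%.2fMHz\n%.2fns\n", mhz, ns)
}
```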

Attached: ClipboardImage.png (500x607, 189.79K)


Stupid nigger

Attached: ClipboardImage.png (200x200, 62.75K)

I rate it tranny/cuckchan.

My GF doesn't know about the chans. And she's a real woman.

You wearing drag and a wig does not count as having a GF.

It was real in my mind, though.

your code sucks and you say that just so people would not say mean things to you


The best part is it's probably not even faster because of anything she wrote, it's just faster because Go's snowflake compiler can't optimise to save its life.
also how 2 get smart gf, all my crushes end up being physics girls

[spoiler] traps :^) [/spoiler]



ffs that whole GF thing was a fucking joke, meant to serve as pseudo deanonymisation because of a self-doxxing post. I never even held hands before.

I presented my coroutine lib which is verifiably 20+ times faster than goroutines.

As stated before, the whole GF thing was a joke.

I meant to write depersonalisation

I knew it. It's a pretty clean implementation. About the only evidence of potential chick-code I could detect was superfluous comments in the headers, e.g.
/** Initialises the scheduler. */HybridScheduler();
I mean, c'mon...

I know it's retarded in some cases, but I document every entity for the Doxygen docs. This sometimes results in stating the obvious, but otherwise it can quickly happen that my laziness takes over and I document less and less.



pic or fake

I don't think goroutines are coroutines.


Goroutines are coroutines: routines that can be paused and resumed at will. Although Go doesn't have the yield and notify keywords, it does have runtime.Gosched(), and channels can be used to notify and await stuff. It's certainly not as powerful as coroutines in other languages, but I'd still classify it as a coroutine. And since they have restricted rather than general capabilities, one would expect them to be noticeably faster than coroutines elsewhere, but alas, Google writes bad code and fucked it up.
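That restricted yield/notify model can be sketched with an unbuffered channel, where a send suspends the goroutine until the consumer resumes it with a receive. A minimal illustration, not taken from the OP's benchmark:

```go
package main

import "fmt"

// generator mimics a coroutine yield: each send on the unbuffered channel
// suspends the goroutine until the consumer "resumes" it with a receive.
func generator(n int, out chan<- int) {
	for i := 1; i <= n; i++ {
		out <- i // suspends here until the consumer receives
	}
	close(out)
}

// collect drives the generator to completion and gathers its values.
func collect(n int) []int {
	out := make(chan int)
	go generator(n, out)
	var got []int
	for v := range out {
		got = append(got, v)
	}
	return got
}

func main() {
	fmt.Println(collect(3)) // [1 2 3]
}
```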

btw, even in debug mode it's way faster, so even if the go compiler could optimise, it would still lose against my code. That's why Go is cringe tier.

No surprise. Go is fucking shit. The only "positive" things about Go are:

Go doesn't even have min/max functions for integers. What a joke.


after reading this I still don't understand why you think it's so apocalyptically bad, most flaws boil down to its essential philosophy (dumbed down, goroutines, muh unix, muh 70s), or it being too young (muh generics, muh "slow", muh shitty compiler)

Go does nothing that other languages aren't doing better.
The only reason Go exists and is popular is Google.

Get a real compiler.

Idk man, I think it's doing pretty good, plus, no other programming language is as easy to optimize for or even gets near Go's goroutines.

It could easily replace babby's first languages like python or ruby while leaving some room to grow, and without being extraordinarily terrible like the aforementioned.

The google hate is unjustified, even if it weren't for google it would've existed, it just happens that they got Rob Pike and Ken Thompson as token employees

Attached: Screenshot from 2019-06-21 11-15-53.png (1145x300, 41.57K)

Did you even read the OP?

yes, C++ isn't exactly a babby's first programming language

Yes, that's exactly what I hate about Go.
They omitted them on purpose because muh simplicity spiralling, because first-time coder interns at google can't into meta programming. And now that even experts are using it, they suddenly want to add back generics or something similar to it. In the end, the community will make it a second C++, but with completely gay syntax and semantics and worse performance.
If you really want to make a good modern programming language, then just improve C++'s syntax to be PDA-parsable and remove some obscure legacy shit.

My first programming language was C++. That was back in 2010. Before that I did a few hours of C64 basic, but that's hardly what I'd call real programming, it is so oversimplified that it hardly relates to modern programming.
Only pussies start out with a "beginner's language", which gives them only a superficial understanding of how computers work, and they stick to it for far too long because it's comfortable. Instead of needing 3x the time before being able to write a program, but gaining a deeper understanding, they just fuck around scratching on the surface for years. Those types (and SJWs (inb4 they are the same)) are exactly what is wrong with the programmer community, and probably the only reason languages such as python and Go are shilled everywhere.

Did you not read the OP properly? I said that gccgo was even worse than Google's compiler, which implies that I use both. Or are you hinting at a third go compiler that is superior?


If the compiler isn't complete trash this will get optimized away.

You fucking idiot. This is a scheduler benchmark.
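For what it's worth, both posters have half a point: a loop whose body calls into the scheduler has side effects and can't be deleted, but a pure counting loop can be. The usual defence for the pure case is a package-level sink — a generic benchmarking trick, not something from the OP's repo:

```go
package main

import "fmt"

// sink is package-level: storing the loop's result in it gives the loop
// an observable effect, so dead-code elimination can't remove it.
var sink int

// spin is a pure counting loop that a sufficiently smart compiler could
// otherwise optimise away entirely.
func spin(n int) int {
	acc := 0
	for i := 0; i < n; i++ {
		acc += i
	}
	return acc
}

func main() {
	sink = spin(1000)
	fmt.Println(sink) // 499500
}
```

A loop over runtime.Gosched(), by contrast, needs no sink: the scheduler call itself is the observable effect being measured.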

Are you retarded?




No. I am posting with 2 IDs in this thread as I used my phone and my work PC.

Attached: ClipboardImage.png (310x124, 8.45K)

Fine, I'll only use c++ now, enjoy your segfaults

And to clarify, the only phone posts I made are

Use Rust

The last post is mine though.

No, I enjoy having a penis

Why would I have segfaults from you using C++?
Go back to your safe space language

Oops, well I thought that I could have written that, whatever. Point is, I did not samefag the way I was accused.

Ever used anything written in c++ by hobbyists? It's a hot mess
owncloud desktop client

I'd rather have them written in safe space languages, simply because I agree with go's philosophy that programmers are stupid and shouldn't come near c++

Don't tell me about smart pointers, they're calculated at compile time and thus limited in effectiveness

You're right, you shouldn't be using C++. But you shouldn't be programming computers at all. Even using your safe space language, you're just shitting up the code so your more competent colleagues will have to spend more time fixing your shit than the time they'd spend to write it from scratch. So just do the world a favour and become a trashman or something.

Sadly there's not enough c++ geniuses to fill up basic crap jobs like microservices engineering, which go covers well.

So I understand that your problem with go is that it isn't c++?


Ever compiled anything written in rust?

Coming back to this, context switches aren't a real-world benchmark and no one will ever use your gf's library for anything


Proof that goroutines are fast in a real world environment

I was going to reply saying you are just 100% wrong about goroutines but then I noticed you were being ironic. haha good one faggot.

Most programs nowadays are made by the masses for the masses. To someone who is even a little bit of an idealist, any mainstream software is fucking cancer. You'd literally have to be some niggercattle to be content with mainstream software. When some autist voluntarily writes software that will last, without any deadlines and retarded tradeoffs (so basically, if he's not prohibited from "muh over-engineering" by the company), then once it is mature, that software makes it much easier to write more complex high-quality applications. Company code was never on the same level as autistic hobbyists' code.
Well, I also have problems with C++, but C++ is by far the best language I ever came across. But yeah, go is so much worse than C++ that there is no excuse for using it unironically.

So, if you have massive amounts of parallelism (say, 100k to 1M coroutines), which basically only occurs in high-performance applications, then it makes a huge difference whether you spend 1% of your time context-switching, or 20%.
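Putting rough numbers on that claim, using the thread's measured switch costs and an assumed 560ns of useful work between switches (the work figure is picked purely for illustration):

```go
package main

import "fmt"

// overhead returns the fraction of runtime spent context switching, given
// the per-switch cost and the useful work between switches (both in ns).
func overhead(switchNs, workNs float64) float64 {
	return switchNs / (switchNs + workNs)
}

func main() {
	// 6ns (libcr) vs 140ns (Go) are the thread's figures; 560ns of
	// work between switches is an assumed workload.
	fmt.Printf("libcr: %.1f%%\n", 100*overhead(6, 560))   // ~1.1%
	fmt.Printf("go:    %.1f%%\n", 100*overhead(140, 560)) // 20.0%
}
```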
read the thread before posting.

I bet it would be much better if they used C++ though. This is exactly the "good enough" mentality that I hate so fucking much.

Many times. It just werks.


That's using a database, I'm talking about the library itself

You're talking about microbenchmarks that have no basis in reality.

Isn't this what the thread is about?

Yes. Although context switching is a serious concern once you go massive parallelism.

once you go that far wouldn't you be using c++ anyways

Well, that's what I was thinking, but then came all the Go fags saying "muh cheap parallelism" and triggered me into this rage.

It's cheap, but it's not that cheap, 1kb is reasonable af

Again, for doing fast crappy jobs it's perfectly fine, and it's probably worth the savings in development time. Good enough means not throwing money and time into a bottomless pit. It's better than python by a long shot, and python is the go to language for that kind of job, so be glad it exists and embrace it

And make no mistake, if someone writing a language designed to support cheap parallelism doesn't make it truly cheap, you know what levels of sloppiness to expect in all other parts of the language. Google doesn't give a fuck about the language, there are no ideals behind it. They just want something barely functional and simple enough to employ masses of cheap programmers for their world domination machinery.

It's very sloppy overall, especially the standard library, but it werks

Cool thread

In libcr, coroutines have an overhead of 56/48 bytes (depending on whether you use the compact instruction pointer mode). Although you pay that overhead for every level of nesting, it would still beat goroutines until you reach a nesting depth of around 20, which is pretty rare.
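A back-of-the-envelope check on that depth figure, taking the 1KB-per-goroutine number quoted earlier in the thread (modern Go actually reserves a 2KB minimum stack, which would roughly double the break-even depth):

```go
package main

import "fmt"

// breakEven returns the nesting depth at which stacked coroutine frames
// (frameBytes each) reach the fixed per-goroutine cost (goroutineBytes).
func breakEven(frameBytes, goroutineBytes int) int {
	return goroutineBytes / frameBytes
}

func main() {
	// 48/56 bytes per frame from the post above; 1024 bytes per
	// goroutine is the figure quoted earlier in the thread.
	fmt.Println(breakEven(48, 1024)) // 21
	fmt.Println(breakEven(56, 1024)) // 18
}
```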

Considering that python already existed, I wonder why Google even made the language. Nvm, it's because they want to lock in everyone technologically. That's also why you have to hardcode the remote location of your dependencies into your source files, so that components become hard to exchange, and you can't just use forks by third parties without having to go over your whole codebase.

see . Read the thread before embarrassing yourself.

I assume it's slow because it needs to have all the NSA Spyware shit in there.

I ran the Go code. Here is the output:
Is this a joke??????????????

I wrote a Rust version:
#![feature(async_await, duration_float)]
extern crate futures;
use {
    std::{
        future::Future,
        pin::Pin,
        sync::mpsc::{self, Sender},
        task::{Context, Poll},
        time::Instant
    },
    futures::{
        executor::ThreadPool,
        task::SpawnExt
    }
};

struct Yield(bool);

impl Future for Yield {
    type Output = ();
    fn poll(mut self: Pin<&mut Self>, ctx: &mut Context<'_>) -> Poll<Self::Output> {
        if !self.0 {
            self.0 = true;
            ctx.waker().wake_by_ref();
            Poll::Pending
        } else {
            Poll::Ready(())
        }
    }
}

fn main() {
    let mut executor = ThreadPool::new().unwrap();
    let iterations = 10_000 * 20;
    let (sender, receiver) = mpsc::channel();
    let start = Instant::now();
    let coroutines = 600;
    for _ in 0..coroutines {
        executor.spawn(l(iterations, sender.clone())).unwrap();
    }
    for _ in 0..coroutines {
        receiver.recv().unwrap();
    }
    let duration = start.elapsed();
    println!("{}s", duration.as_secs_f64());
    println!("{}MHz", (coroutines * iterations) as f64 / duration.as_secs_f64() / 1_000_000.0);
    println!("{}ns", duration.as_nanos() as f64 / (coroutines * iterations) as f64);
}

async fn l(mut i: u32, channel: Sender<i32>) {
    while i != 0 {
        Yield(false).await;
        i -= 1;
    }
    channel.send(1).unwrap();
}

This is where diversity and anti-meritocracy and "good enough" leads you to. Dumb your software down so even retards can contribute, and then praise whatever the outcome is (regardless of quality), confirming your bias that diversity is a strength.

Should've gone all the way buddy, ffs don't drop the act that quickly when pressured faggot.

I wasn't even trying to convince anyone, it was actually supposed to be so badly crafted of a story that everyone would instantly see through it, but alas, Zig Forums has some newfriend troubles.

I switched computer so the results are a bit different.


C++: pastebin.com/VnSA1DDA

Thanks, appreciate it.
Heh. Another reason I won't ever use rust.

Attached: ClipboardImage.png (747x850, 548.25K)

stop lying or gtfo

I don't think you're aware of how bad Go's optimisation is.

Yes, that's the problem. Essential philosophy permeates an entire language.

Stop being a fag and write D

I was making the case that even without C++ optimisations, my code is still much faster than Go. So the hypothetical argument "you're comparing optimised code with unoptimised code" is irrelevant: even if Go had optimisations, it would never be as fast as my optimised C++ code.

goroutines are still not coroutines you mong

Whatever it is, it is painfully slow.


You're probably doing something completely different from the goroutine. Go pretty consistently benches at around 2-5x the time of a decent C/C++ solution for similar algorithms.

This one irks me. Everyone praises Rust for generics, which it monomorphizes. That means the compiler generates a type-specific variant of your function for every possible type the function could take (that the compiler detects). There are plenty of code generation tools in go that do the same thing, and it's a pretty uncomplicated tool to write oneself. But everybody flails their arms and acts outraged about it. I mean, typing
`go generate && go build` vs
`cargo build`
is treated like some real big fucking deal. They're doing the SAME thing, but because one is done by the compiler, it's enlightened, woke, based, and redpilled. Using a tool or writing your own for an infrequent problem just gets all these delicate programmers in a huff.
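Concretely, the "SAME thing" claim: what a codegen tool emits is just one hand-monomorphized copy of the function per concrete type, which is what rustc does internally for generics. A sketch of typical generated output (the names are illustrative, not from any real generator):

```go
package main

import "fmt"

// MaxInt is the int instantiation a generator would emit.
func MaxInt(a, b int) int {
	if a > b {
		return a
	}
	return b
}

// MaxFloat64 is the float64 instantiation of the same template.
func MaxFloat64(a, b float64) float64 {
	if a > b {
		return a
	}
	return b
}

func main() {
	fmt.Println(MaxInt(2, 9), MaxFloat64(2.5, 0.5)) // 9 2.5
}
```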

Meanwhile, rolling your own generics saves your compiler a lot of complexity, which almost definitely improves build times. Paying your $150K/yr software "engineer" to read fucking xkcd and Reddit for 10-30 minutes EVERY time he compiles is a huge waste of money. You can hate Google for insulting your obvious brilliance with an unworthy compiler and language spec, but, from a business perspective, they made the right decision.

This is peak UNIX braindamage.
Holy shit you're retarded.

Also why should the compiler read the source code from multiple files scattered around in various directories? Just write a tool that does this and pipes it to the compiler.

Why stop there? Make every optimization pass and the code generators for different architectures separate binaries.
Do one thing and do it well. Why isn't there a single compiler conforming to the UNIX philosophy?
Maybe we should change that. I'll get started on the logo.


Don't forget the furry anime mascot!

Did you write a bunch of shit defending C++ with its header and makefile bullshit in one breath, and in the other make a reductio ad absurdum argument because the Go compiler doesn't do literally everything you want? Shit, dude, why doesn't your compiler write the program for you? Why doesn't it provision some cloud bullshit, containerize your "app" and deploy just by reading your thoughts? Real brain damage, there, fren.

Are we comparing benchmarks between futures (an event loop) and green threads? Is that what this whole thread is about? Holy shit, Mr. BS in Comp-sci, this is even bad by Zig Forums standards.

What the fuck is wrong with you? Read the post he was replying to, everything was satirical. The joke is that Go doesn't have generics, and you need an extra tool just to generate template code, which raises the question of why it is not part of the language.
He had to manually implement a yield function, if I observed this correctly. That's why he has to use futures. But the whole thread is about my C++ implementation being superior to Go and Go being cringe tier.



The "why" was rhetorical, and more of an appeal to common sense.

I know it sounds like bullshit but it's actually well thought out. The only problem is that none of us have the spare time to finish the project. I think we have around 80% of the specification done, but there has been no progress in about a year.

Go is not intended to out-compete properly-written C/C++ on pure performance benchmarks. It's intended to be a useful general-purpose language with *reasonable* performance that automatically prevents bad programmers from making the sorts of dumb mistakes that have led to so many security issues due to poorly-written C/C++.

And yet it fails to implement even basic functionality

Stop drinking soy and get the cargo cults you learnt about on reddit out of your head.
It doesn't matter if it's superfluous, it will avoid trouble.

Attached: make anon less of a faggot.jpg (1508x1000, 233.98K)

Nobody has a profit model for any blockchain "technology" that doesn't involve swindling fools out of investment capital with a buzzword salad. More than a few software "engineers" are going to need to take up professional lawn care because the investment capital (and the Fed through QE) who prop up the entire industry aren't seeing companies with stable, sustainable profit models emerge from those dollars. If your return on investment completely depends on finding more, new investors, then you have a ponzi. About 50-75% of devs I know are in the ponzi-tech/vaporware-marketing-hype economy. These pajeets and hipsters are about one bad month in the stock market away from "do you want fries with that." The rest work for government contractors or are in-house development for companies that make actual products. Oddly enough, those are almost all stable, married, java developers.

Contrast that with you:
"Minimal ecological footprint" is actually one of your stated goals. Hey, here's a plan. Don't make any of this gay vaporware shit nobody asked for. You'll be amazed at how much electricity you saved. Carbon-fucking-neutral, baby!

What are the odds that OP's truly brilliant event loop that... uhh... computes a benchmark, definitely has the same runtime characteristics and is a comparable implementation to goroutines. I'm sure the ability to detect race conditions is built right in, too. Of course, OP is just much, much more brilliant than anybody at Google, so his machine instructions are just faster than dumb googler-written machine instructions. Computers can sense the intelligence of their programmers and go fast or slow accordingly.

That doesn't follow. By virtue of being superfluous, it wastes space and increases cognitive load while not adding anything of value. So explain: how exactly do superfluous comments "avoid trouble"? I'm all ears.

Meme responsibly user.

Attached: 261b855b66490c21d8e430bfaa9d673684351395ab09a13c61a0602849106919.jpg (786x800, 314.54K)

No. C++ is trash. Go is worse though.
Are you retarded? How do you think those futures get executed?
Protip: The first line in the main function. It's a fucking ThreadPool. There is no event loop. Only green threads.