Haskell programming language

hackage.haskell.org/package/primitive-maybe-0.1.0 llol @ haskell

This library provides types for working with arrays of Maybe values. The types in this library can be used as replacements for Array (Maybe a) and SmallArray (Maybe a) that consume less memory and have fewer indirections.

so haskell is so slow that you have to optimize arrays of Maybes. And you can't do it properly, you have to use a totally new array type. It gets worse:

nothingSurrogate :: Any
nothingSurrogate = error "nothingSurrogate: This value should not be forced!"
{-# NOINLINE nothingSurrogate #-}

unsafeToMaybe :: Any -> Maybe a
unsafeToMaybe a =
  case reallyUnsafePtrEquality# a nothingSurrogate of
    0# -> Just (unsafeCoerce a)
    _  -> Nothing
{-# INLINE unsafeToMaybe #-}

newSmallMaybeArray :: PrimMonad m => Int -> Maybe a -> m (SmallMutableMaybeArray (PrimState m) a)
{-# INLINE newSmallMaybeArray #-}
newSmallMaybeArray i ma = case ma of
  Just a -> do
    x <- newSmallArray i a
    pure (SmallMutableMaybeArray x)
  Nothing -> do
    x <- newSmallArray i (unsafeCoerce nothingSurrogate)
    pure (SmallMutableMaybeArray x)

unsafeToMaybe :: Any -> Maybe a
unsafeToMaybe a =
  case reallyUnsafePtrEquality# a nothingSurrogate of
    0# -> Just (unsafeCoerce a)
    _  -> Nothing
{-# INLINE unsafeToMaybe #-}

(yes its duplicated)

github.com/andrewthad/primitive-maybe/blob/master/src/Data/Primitive/Array/Maybe.hs
github.com/andrewthad/primitive-maybe/blob/master/src/Data/Primitive/SmallArray/Maybe.hs

look at the 2 files side by side exact same shit LOL. Fucking beautiful abstraction powers. lambda calculus is amazing.

this language is a joke

Attached: main-qimg-814f75aadd13c0b23059fa906aa5d226.png (450x354, 263.22K)

Other urls found in this thread:

en.wikipedia.org/wiki/Memory_leak
patents.google.com/patent/US6560773B1/en
en.wikipedia.org/wiki/Unreachable_memory
ocaml.org/learn/tutorials/garbage_collection.html
twitter.com/AnonBabble

LMAO fuck off shill

Attached: DgAh0ACX0AEsa-N.jpg (960x576, 83.73K)

Hi guys! I'm doing some really low-level Haskell here (manual ptr management). I've got a container, which I created a debug Show instance for: instance Show (SmallVectorA a) where show = show . unsafePerformIO . toList . There is a NOINLINE pragma on it. However I observed that sometimes this show prints wrong values in a non-threaded env. Could anybody tell me why it could happen? I just want to understand it
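For anyone wondering, here's a minimal sketch of the hazard in that question (the names are illustrative, not from whatever SmallVectorA is): unsafePerformIO tells the compiler that a read of mutable state is pure, so GHC is free to cache, share, or reorder it, and the printed value can lag behind the real contents.

```haskell
import Data.IORef
import System.IO.Unsafe (unsafePerformIO)

-- Claims to be a pure function of the ref, but actually reads mutable state.
-- GHC may float it out, share it, or evaluate it at an unexpected time.
snapshot :: IORef Int -> Int
snapshot r = unsafePerformIO (readIORef r)
{-# NOINLINE snapshot #-}

demo :: IO ()
demo = do
  r <- newIORef 1
  print (snapshot r)   -- may print 1
  writeIORef r 2
  print (snapshot r)   -- with optimizations, may still print 1 (result shared)
```

A NOINLINE pragma stops inlining but not sharing: once `snapshot r` has been evaluated, nothing obliges GHC to evaluate it again.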

practical functional programming == bending over backwards to convince yourself you're not using global mutable state. if you don't tell no one will notice *wink*

baka baka

haskell is gay the more languages developed the more lisp proves it was ahead of its time

u guys are real high IQ I can tell

Attached: DWk2uWDUMAEjzev.jpg (396x396, 25.82K)

1) IO is not the only monad
2) Entire point of IO monad is NOT having side effects in program, but moving them outside (into hypothetical runIO that lies outside main). Otherwise you could just have functions with side effects, like ML
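To illustrate (2) with a minimal sketch: an IO action is an ordinary value that describes an effect, and nothing happens until the runtime actually runs it.

```haskell
-- An IO action is a first-class value describing an effect.
greet :: IO ()
greet = putStrLn "hello"

demo :: IO ()
demo = do
  let action = greet   -- binding the action performs no effect
  action               -- the effect happens only when the action is run
  action               -- and the same value can be run again
```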

Wrong. Monad literally means one or unit.

bunch of people debugging whether making Either strict or making pattern matches lazy would fix a memory leak. nobody knows why it's leaking. I thought this language had GC?
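For reference, the classic shape of that kind of leak, sketched with plain folds (the strict-Either / lazy-pattern-match cases are the same idea, just harder to see): strictness controls when thunks are forced, which controls what stays live on the heap.

```haskell
import Data.List (foldl')

-- Lazy foldl builds a chain of unevaluated (+) thunks the length of the
-- list before anything is forced: the heap grows O(n) even though the GC
-- is working perfectly, because every thunk is still reachable.
lazySum :: [Int] -> Int
lazySum = foldl (+) 0

-- foldl' forces the accumulator at each step: constant space.
strictSum :: [Int] -> Int
strictSum = foldl' (+) 0
```

Both return the same answer; only the heap profile differs, which is why these leaks are found by profiling rather than by reading the code.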

Lol

user this is 101 tier programming language shit. Everything has memory leaks GC or no.

GCed languages don't leak. I'm aware that webdev shitters have tried redefining what "leak" means to where simply not discarding something is a leak, but the real meaning of leak is that the object has leaked out of the programming environment and is no longer accessible to the program yet is still consuming resources. Allocating unmanaged memory and then losing the reference to GC would be a real leak in a GCed language, like with UnmanagedMemoryStream in C#.
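The same distinction exists in Haskell, roughly like this (a sketch, not from any particular library): memory from the foreign allocator is invisible to the GC, so dropping the pointer without free really is unrecoverable.

```haskell
import Foreign.Marshal.Alloc (mallocBytes, free)
import Foreign.Ptr (Ptr)

-- Unmanaged allocation: the GC neither tracks nor reclaims this memory.
leaky :: IO ()
leaky = do
  _p <- mallocBytes 4096 :: IO (Ptr ())
  pure ()   -- pointer dropped: the block is leaked for good

-- The only fix is to hand it back yourself.
careful :: IO ()
careful = do
  p <- mallocBytes 4096 :: IO (Ptr ())
  free p
```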

There is no "redefining" here retard. It's what it means everywhere.

LMGTFY
en.wikipedia.org/wiki/Memory_leak

Yeah I know faggot, I told you it was redefined. We never used to call forgetting to free something a "leak".

Memory leak makes no sense if it's just memory you still have control over.

Analogy for the correct use of memory leak:
- You fill a sink with water but there's a crack. Some water leaks out into the room. You can no longer remove all water from the room by emptying the sink as it has escaped from the sink.

Retarded modern definition by Java Pajeets:
- You fill a sink with water. Pajeet points to the water in the sink and says some of it has leaked, yet it remains in the sink and can be emptied at any time.

LOL. Go home kid.

Attached: DUzkP4FWsAY71y8.jpg (750x717, 65.74K)

...

Attached: DZQqR9hVMAAQode.jpg (475x268, 38.3K)

In your second analogy you are forgetting the part where the water overflows the amount the sink can hold and spills out into the room.

You worthless goy, you can't even think for yourself, you trust the most manipulative people to decide what's true or false for you even when the correct answer is self-evident. Here's a patent application from 1997 with the definition we used to use of "memory leak" before Javafags co-opted a very old term. This is a legal definition that has been reviewed by a patent reviewer for accuracy.
patents.google.com/patent/US6560773B1/en
FUCKING OWNED, BITCH. SUCK MY DICK. CHOKE ON IT.

Attached: b-but kikepedia.png (487x80, 15.15K)

LOL

...

So he uses the same modern definition as wikipedia

...

...

Haskell is a joke, but you actually have to be a smart person to get the punchline (and that can come at the cost of many months of having been sucked into a secret club...)

Attached: DeO2FLzUQAAYBy-.jpg (828x465, 75.08K)

yeah dude we should all have accepted that the earth was flat and was the center of the universe

the rope awaits your supple neck numale.

user this is not math

I'm partial to the "correct" one, but think about it like a cycle of a program not returning resources it's done with. It will take more each time in a way that's unsustainable and doesn't reflect actual resource use.

"correct" aka the wrong one

...

Yeah memory usage increasing forever when it should be freed. No way that could be a memory leak.

"The toilet leaked"
t. Zig Forums user

bad analogy
We don't call it a garbage collector leak

"The toilet leaked"
t. Zig Forums user

Attached: DZ3qU4oWkAAPjEk.jpg (1200x1200, 81.32K)

Anime rotted your brain. Ask your doctor if it's not too late to reattach your balls so you can resist shoddy redefinitions of a phrase that used to make sense.

LOL

You mean the original definition that has always been used by everyone except for larpers like you.

...

Attached: 38a06bf079f727be1fd04016b5bf9494987511f9e45ef20f8751f363de482150.jpg (492x661, 105.25K)

He's right, though. When people talk about irl leaks they mean a flaw in the system not simple negligence.

Yes a flaw. Like letting random shit collect in your memory forever.

Attached: creating leaks.jpg (900x675, 442.25K)

LOL

Attached: DWhtk_KWkAAHHTm.jpg (749x702, 56.52K)

LOL
Memory in C can leak or clog.
Memory in a GCed language can only clog.
Negligently not cleaning when you should leads to a clog. Losing the ability to clean is a leak. This is superior terminology and I shall use it from now on and you should, too.

Attached: DZ5nLNVVwAEYTRp.jpg (396x594, 45.05K)

Here user you are thinking of
en.wikipedia.org/wiki/Unreachable_memory
not
en.wikipedia.org/wiki/Memory_leak
glad I could clear up those 101 tier definitions for you

This only supports my point, user.
There are two issues:
- Unreachable memory
- Neglected memory
In C:
- Unreachable memory -> leak
- Neglected memory -> clog
In Java:
- Unreachable memory -> normal operation
- Neglected memory -> clog
GCed languages CANNOT LEAK! Wikipedia agrees.

...

No you retard. It's that losing a handle in a manual language causes a leak. Whereas losing a handle in a managed language does not. Not every leak is caused by losing a handle. JFC what pajeet university did you graduate from. Get off this board kid.

Attached: DWUw9veW0AAPSqt.jpg (1200x800, 157.16K)

huh lets see what they say

let's go back to this:

What's the point?

Attached: DYbRilRX0AAUl7G.jpg (1080x607, 43.36K)

ocaml.org/learn/tutorials/garbage_collection.html
free()dom ain't free. It can cost more than a good GC costs.

It's not an error as far as the language is concerned. The language is still operating properly and tracking resources in the memory clog scenario. I can even connect in via JPDA and unclog a running instance if really necessary (e.g. a Mars probe).
Real memory leaks are legitimate errors at the language level as the ability of the language to ever do anything about it is gone. For implementations of malloc that don't store the metadata necessary to do a heap walk, I couldn't even cure a running instance via hacking outside of the language with ptrace. That Mars probe is fucked.
Static analysis cannot identify memory clogs, but it can often identify memory leaks. That's a clue that they're completely different issues.
It's never too late to make a change. You just have to persist and not give up. RMS persisted for almost 25 years on the GNU/Linux naming and today many people call it GNU/Linux. Being shut down by a mere wikipedia page where the term is obviously inaccurate is pathetic and shows a lack of necessary autism for programming.

You mean everywhere

Attached: DcZRUDPVQAAtNfC.jpg (500x324, 26.16K)

Humm let me google your new shitty term "memory clog"... first result:
:^)

It's disingenuous. GC proponents never mention how severe the penalty of compaction-based GC is in cache-constrained code which is practically everything fast today. And they need it to make their claims about generational collection not being slow. The first rule of optimizing GCed code is to avoid ever freeing anything which should tell you something about how fast GC really is in practice.
Meanwhile, fast C code avoids malloc/free, we use slabs. In networking, we do this not just for speed but for reliability - it avoids unpredictable memory use due to fragmentation that could cause a crash. Were GC a better method we'd use boehmgc.

Attached: DYGf59vXUAAoid4.jpg (1200x759, 142.92K)

You'll never be someone like RMS. You'll always be some weeb LARPer who has to ask someone's permission to believe something.

Have fun believing in 42 genders, trannies not being mentally ill, and gamergate being an attack on women in video games.

Attached: DW--KenUMAAQGLT.jpg (657x493, 46.35K)

Proper GCed languages (Javalikes) don't have slab allocators as they're limited by type safety. The closest thing without resorting to unsafe code is object caching. Back to India, Pajeet.

what kind of retard faggot are you

You must contribute so much.

okay rat. more c fags that think they are the only ones that can do pointer arithmetic. it is simply impossible to manipulate the hardware in anything other than c.

...

you sure know a lot about java user

Attached: DXXXjnFX4AE_1XM.jpg (1200x941, 119.07K)

I know a lot about everything, qt. I actually program.

Just make sure to wash your hands before using the computer Aditya then you can continue your Java allocator work.

How did you even manage to finish typing this?
Who gives a shit what the language thinks?
On the many subjects of human error, languages are either helpful or not. Mostly, they're not.

Is it that hard to process? Memory clogs are a significantly different class of error. It's an error that the language cannot tell is an error, and thus static analysis cannot tell is an error, and thus runtime analysis cannot tell is an error, but can be corrected at any time during execution by a sufficiently capable language (many safety-critical languages have facilities that can be used for that purpose).
It's impressively stupid to try to roll that up with memory leaks, which are errors according to the language, and thus can often be detected by static analysis, and thus can often be detected by runtime analysis, but might become uncorrectable by any means during execution.
Memory leaks are a systemic failure, not just a decision made by the human that is later considered to be a mistake. As such, they are far more dangerous and warrant care to avoid. They're also worth designing automated tools to identify, and doing research on finding the best methods of detection and avoidance. There's no point to doing research on memory clogs as their cause is your programmer is a slob. No automated tool can save them from being a slob. The only tools in this space are heap visualizers which assume your slob is too slovenly to notice a problem without a visualizer rubbing their nose in it until they decide to change.

Attached: memory clog.jpg (870x652, 76.68K)

This is incorrect. Imagine this scenario. When a user connects to the system a user object is created. When the user disconnects this user object should turn into garbage. You can prove that this happens by showing that all the references to it have been removed. I don't get why you think it is impossible. These """clogs""" happen when an object does not turn into garbage at the end of its desired lifetime.
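Sketched in Haskell (names are illustrative): the "clog" is literally just the missing delete, and nothing in the language flags it, because every user object stays reachable through the session table.

```haskell
import qualified Data.Map.Strict as Map

-- A session table that keeps every user object reachable.
type Sessions = Map.Map Int String

connect :: Int -> String -> Sessions -> Sessions
connect uid user = Map.insert uid user

-- Forget to call this on disconnect and the object never becomes garbage,
-- yet the GC is operating exactly as designed.
disconnect :: Int -> Sessions -> Sessions
disconnect = Map.delete
```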