Zig Language

Zig is an open-source programming language designed for robustness, optimality, and clarity. It is intended to directly compete with C.

Key Features:
- The potential for developer awareness of every memory allocation that happens; AKA the potential for perfect software, with memory management handled correctly even in the edge cases (EDGE CASES MATTER)
- Emphasis on compile-time computation and heavy optimization
- Syntactic additions are heavily discouraged -- no adding random bits of syntactic "sugar" just because it satisfies your eyes
- Cross-compilation from any OS
- Seamless usage of C code; NO bindings needed

Links:
ziglang.org/
github.com/ziglang/zig
youtube.com/watch?v=Z4oYSByyRak

Attached: zig.png (386x91, 9.97K)

Other urls found in this thread:

ziglang.org/documentation/master/
github.com/ziglang/zig/wiki/Why-Zig-When-There-is-Already-CPP,-D,-and-Rust?
youtube.com/user/superjoe30/videos
odin.handmade.network

Hello World

const std = @import("std");

pub fn main() !void {
    // If this program is run without stdout attached, exit with an error.
    var stdout_file = try std.io.getStdOut();

    // If this program encounters pipe failure when printing to stdout, exit
    // with an error.
    try stdout_file.write("Hello, world!\n");
}

But typically you don't want to just print to stdout, so there's a simpler interface for debug warnings:

const warn = @import("std").debug.warn;

pub fn main() void {
    warn("Hello, world!\n");
}

Importing and using C header files is as simple as:
const c = @cImport({
    @cInclude("stdio.h");
});

Master branch documentation ziglang.org/documentation/master/

More on why you should care about zig: github.com/ziglang/zig/wiki/Why-Zig-When-There-is-Already-CPP,-D,-and-Rust?

Join us on IRC at irc.freenode.net #zig

Your own dupe:

Shit, didn't realize it did that. :/

Won't let me delete, pretty sure I'm using the same pw too. Rip

Wait 1.5 years for JAI

Looks like even more retarded ECMAScript

...

I want to like it but every time I look at it I like the syntax less or find another piece of ugliness in it. Like the switch statements.

spotted the LARPer

If you think Brainfuck is a bad language then you're a hypocrite.

Protip: Brainfucks syntax is fine. There are other issues with it.
Kill yourself, LARPer.

Double LARPer detected. Syntax can be a real issue. Look at Scala or C++.

LARPer spotted

>>>/g/

Currently doing this.

They don't really have the same goals; for example, Jonathan Blow loves implicit calls while zig purposefully avoids anything that isn't explicit

And JAI doesn't look like Perl.

I'm still waiting for a language that is actually safe, that has NO edge cases where it crashes (unlike Rust). I have yet to see a language that handles stack overflows properly without crashing. Zig doesn't have that yet, but in the video that OP linked the language's author says he's working on it.

Why wouldn't you want it to crash when you fuck up?

I'd like my software to be able to clean up after itself if it fucks up. Or even better, to recover from the error and continue in a well-defined and consistent state instead of relying on the operating system to simply restart my application.

Stop using computers with finite memory. Leave the real world and join the world of academia.

with Ada.Text_IO; use Ada.Text_IO;
with Ada.Integer_Text_IO; use Ada.Integer_Text_IO;

procedure Smash is
   procedure Smashit (N : in Integer) is
   begin
      Put (N, Width => 0);
      New_Line;
      Smashit (N + 1);
   end Smashit;
begin
   Smashit (0);
end Smash;

$ ./smash
...
174474
174475
174476
174477

raised STORAGE_ERROR : stack overflow or erroneous memory access
how's that?
60% of the time that same program dumps core

How about doing your job correctly? How about taking some responsibility and having some kind of quality control instead of wasting system resources because your software is shit?

Show me your quality control that checks stack usage.


Looks like guard pages, which requires paging. You automatically have those in your C programs, too. But that's not what I'm talking about. Crashing with a message is not "recovery".

That's an exception, so you can catch it:

174534
174535
shit's fucked, trying to recover...done

from:

with Ada.Text_IO; use Ada.Text_IO;
with Ada.Integer_Text_IO; use Ada.Integer_Text_IO;

procedure Smash is
   procedure Smashit (N : in Integer) is
   begin
      Put (N, Width => 0);
      New_Line;
      Smashit (N + 1);
   end Smashit;

   Running : Boolean := False;
begin
   Running := True;
   Smashit (0);
   Running := False;
exception
   when Storage_Error =>
      Put ("shit's fucked, trying to recover...");
      Running := False;
      if Running then
         Put_Line ("and we failed!");
      else
         Put_Line ("done");
      end if;
end Smash;

realistically tho, since this still fails 60% of the time, do as Erlang does and have an architecture that expects failure. Which means having the OS deal with it. An OS that "simply restarts" a daemon is an OS that can do more than that.

How would one write an OS that recovers from kernel stack overflows?
Also how can you be sure that your stack overflow never overwrites any of your application's other data? I think your stack-allocated objects would have to never be larger than your OS' guard page size, whatever size that may be. Imagine you were to put a large (multiple megabytes) object on the stack on a platform where the stack grows downwards. If you start writing to that object, you could potentially start writing below the guard page where other application data might be.

statically check stack sizes for kernel code
in case of non-tail recursion, have a hard limit on the number of calls which is proved statically to be impossible to exceed
problem solved

you really don't understand manual memory management

What?

Zig looks nothing like perl, and that's coming from someone who has used perl for years (I don't recommend it). Zig is practically the antithesis of perl

Place the stack in a separate segment, like in the good old DOS days. Enjoy the return of near and far pointers.
More realistically, we have 64 bits of address space now. Make the guard page a gigabyte large. Problem solved.

You forgot the most important feature: IT MOVES FOR GREAT JUSTICE

Andrew's youtube channel: youtube.com/user/superjoe30/videos

It does already offer the possibility for smart developers to make crash-less code in ways that C didn't.

it crashing on your shitty code is a good thing. buggy code is a security hole; forcing you to work it out is a good thing. shitty software should crash more, like it does when you try to run it on OpenBSD

that is what happens with zig, just the compiler commonly calls you a retard if it detects shitty code on top of that

it's the POSSIBILITY of writing bug-free software while retaining low-level control that zig is aiming for

Yeah yeah, alright then OP.

odin.handmade.network

Fuck Zig

Is this meme vaporware edition of Jai

Isn't zig exactly the same thing

slower activity than a bag of rocks, looks shit


no

Attached: suckless dick.jpg (907x651, 242.81K)

Let's say I wanted to start a project in either Odin or Zig.
Which language should I choose and why?
So far all I can see is Odin works and has third party libs, while Zig only compiles on x64 and has a ton of issues. But I didn't actually use either.

I wonder what happens when that one's developer's money runs out?

yeah it's a real puzzle
why is this thread here about this language that actually works?

What happens if it hits the maximum recursion depth?


That's very architecture specific.
Sounds kinda arbitrary.

read:
>have a hard limit on the number of calls which is proved statically to be impossible to exceed
what happens is that compilation fails due to the failure to prove, statically, that the program won't exceed the limit.

You can use any C/C++ library in zig, and it is more stable than Odin in my experience (despite being so young).

Absolute shit syntax, just like C. It's garbage.

The internet is truly fucked.

times have changed

Choke on this, nigger.

sudo apt-get remove rust* libstd-rust* cargo*

sudo apt-get remove snapd* libsnapd*

Look harder.

he'd never do that with the amount of soy he's ingesting

Rust made Zig obsolete before it was even created.

see

You can't really limit recursion depth statically if it depends on user input. Whether or not it depends on user input is going to be difficult to check. Unless maybe you use a pure language that makes you consider all kinds of side effects, like Haskell, but at least that one has its own downsides.

No but you can detect when it is dependent on user input and then require the error to be manually handled.

you can, by passing along a count of recursive calls and ending the recursion if that count reaches a limit.

Why not just use Crystal?

Why is the syntax so ugly tho?

unusable for the same things

How do I manually handle stack overflows that may or may not hit the guard page? Also if recursion depth did not depend on user input then I could compute its value at compile time.


How do I know that number?

I'm going to fork this language and call it "Zag". LOL REKT

Attached: a105fe6cb5d875ad93730dfd9fec9a496b8ebcc8718963cdc8fc17b5f26aa716.jpg (604x527, 242.89K)

wtf Haha bro epic

def op_dick(*some_shit, recursive_calls=0):
    if recursive_calls > 10:
        return 'deal with it'
    # do some shit
    a = op_dick(*some_shit, recursive_calls=1+recursive_calls)
    b = op_dick(*some_shit, recursive_calls=1+recursive_calls)
    return a + b

A hypothetical compiler can verify that:
1) the recursion ends on a simple condition depending only on arguments
2) the initial value of the argument has a sufficient lower bound (at every call site)
3) with each step the value goes strictly towards the condition which terminates

Of course there will be some cases where it's not obvious to the compiler; then you need to either make it obvious or avoid recursion if possible.

not the same at all

repost this thread when it gets microcontroller support or other low-level chip support from manufacturers. till then there is no chance for it to compete; it will just be a hipster language.

what are you talking about?

there has already been a thing that the main guy made where he booted directly into a written-in-zig kernel

look at all of the architectures zig directly supports

i said manufacturer support. how development works in the real world for bare-metal programming on microcontrollers and the like is that manufacturers sell companies both their chips' specs and the development software/kit for the chip. the IDEs, the compilers, and maybe emulators they give out for free. so let's say the company you work for wants to develop for some chip: are you going to use Zig because some guy on the internet says it fully supports your chip? or are you just going to use the C compiler the chip maker gives you, which you will get support for and which you know is fully compatible? then think about the fact that your engineering staff only knows C, and the new people coming in also know C and not Zig. and let's say you aren't a LARPer talking up a hipster-tier language and you actually want a job: is it worth learning Zig? no. fuck off, LARPer.

You have no comprehension capabilities.

I hate these shitty IDEs that each of them tries to sell, and I refuse to work with any microcontroller that does not have a working gcc toolchain, à la the MSPs / STMs. For any platform where there's a FOSS compiler, there's no reason zig could not support it.