FORTH

Anyone here program in FORTH? It seems like a really nifty little language. I have no experience with it. I'm thinking of working through "Starting FORTH" after I finish SICP. FORTH is interesting in that in many ways it's the opposite of Lisp. Lisp is a high-level language where code can be treated as data. FORTH is a low-level language where data can be treated as code. It's extremely minimalist, basically a step up from assembly. It has a REPL and is often used as a shell on microcontrollers.

Attached: Starting-FORTH-Cover-216x300.jpg (216x300, 12.14K)


That sounds... interesting? I'll look into it. Let's hope this won't end up a meme-lang like Lisp or Rust.

It's a very old language. Like I said, it's still used on microcontrollers because of how minimal it is. It is stack-based: you have your data stack and return stack. The data stack holds the data, obviously, and the return stack holds the memory addresses to return to after a function is exited. Everything is done in postfix notation. Again, this is the opposite of Lisp, which uses prefix notation.
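To make the postfix thing concrete, here's roughly what a session looks like (standard words only, so it should work in gforth or any other ANS Forth):

```forth
2 3 + .    \ push 2, push 3, add, print: 5
: square ( n -- n*n ) dup * ;
5 square . \ prints 25
```

The ( n -- n*n ) bit is just a comment describing the stack effect: what the word consumes and what it leaves behind.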

I don't know why people call it minimalist as it's not something that easily produces arbitrary assembly like C, it's a primitive stack-based thing. It was semi-popular in embedded a while ago because the way the machine code was generated made it extremely space efficient. It was almost like running a binary while it was still compressed as the virtual machine jumps to the implementations of words which will be reused a lot in your program, so instead of a function being written out as a flat block of assembly like in C, it's written out as a list of words. That makes it slow but it was worth accepting slow in exchange for small.

Slow compared to what? Plain assembly?

Slow compared to C. Each word is an indirect jump and back, and it doesn't optimize across them. That's why it makes tiny binaries, but you pay a high cost in performance.
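A sketch of what that means (not any particular Forth's actual memory layout):

```forth
: double ( n -- 2n ) 2 * ;
: quad ( n -- 4n ) double double ;
\ In an indirect-threaded Forth, QUAD's body compiles to a list of
\ addresses, roughly [ double | double | exit ]. The inner interpreter
\ (NEXT) walks that list, one indirect jump per entry, instead of the
\ flat inline machine code a C compiler would emit for the same thing.
```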

Interesting. I know back in the day one of its main selling points was that it was faster than BASIC. I guess it's only minimal and fast when compared to other interpreters.

That's why you use it on stack-based processors. Obviously Forth is slow on x86.

I used it on embedded in an academic context; it's alright. Really just the fact that it's interpreted makes it extremely useful for real-world systems. You can just keep trying different values for pulse width until your servo is in exactly the right position. Although I think it's now out of favor, with people using eLua instead; the reverse Polish notation probably throws people off. Fun fact: the arm on the Space Shuttle was programmed in Forth.

lisp machine guy, is that you?

Friends don't let friends use Forth.

Jokes aside: Programming in forth is one of a few Zig Forums related things I will never do for money again.

Sucking dick certainly isn't on that list

I'd like to hear the story.

Forth is used in several aerospace applications, something they're quite proud of. forth.com/resources/space-applications/


Please tell us about your forth experience.

Sometimes stupid people want to do complex things with simple systems. And sometimes that can lead to you sitting on the floor of your office, holding a marker, surrounded by several boxes of colored building blocks, thinking about either crying, or becoming an elementary school teacher after all.
I finished that job and promised myself to never do anything like it ever again. It just broke my brain.

Oh come on. Give us all the nitty gritty. We want a good programming horror story here.

And I don't want to tell it. It's a sad story about PLCs, government certifications and stacks upon stacks. And the only moral to be learned from it, assuming I was a good story teller, would be that there are reasons why you want to use different computer architectures for different applications.

The thing that makes/made FORTH great: Refactoring.

At a time when programming was just starting down its complexity slippery slope, FORTH solved a lot of Mythical Man-Month problems: refactoring, verification, and code density. Efficiency? Very good, compared to anything else commonly available, with direct-threaded and subroutine-threaded FORTHs (and even indirect-threaded still shat all over BASIC). And unlike most other languages, it worked for ROM-based turn-key/embedded systems-- no manual loading of code from mechanical storage devices.

Your millennial is showing. Again, at that time-- C compilers produced shit code (it wasn't until the mid-90s that that changed), they were proprietary programs with like $500/year/user licenses, and you needed an $80,000 computer as well.

That industry is state of the art circa 1960.

Exactly. That's why forth works so well for them.

More like the last time they were allowed to choose tech was before most of today's tech existed. Even SpaceX still uses Nazi nozzles that control the expanding gas despite the existence of aerospikes as no one in aerospace wants to take the risk to get a 1960s new tech to market.

youtube.com/watch?v=mvrE2ZGe-rs

Here's a video where a guy shows how data can be treated as code in Forth. It's an hour long, but I found it pretty fascinating.

I was interested in it for a long while, and it definitely helped me get a good grasp on low level matters (the stuff people recommend you learn C to get good at--which C hides and which C purists will tell you is not atoms made up of quarks held together by powerful forces but actually mere undefined behavior that releases nasal demons). I'd say that Forth is better than most languages, and that I'd still use it happily today--
if not for ML. If not for functors in OCaml, or dependent types and theorem proving in ATS.
if not for the 'community' being 95% deranged LARPers by population, and 100% (rounded up) deranged LARPers by output. When I thought they were merely deranged I could tolerate it. If they were wannabes like I assume many of you are here I would tolerate it.

would I recommend it? no. Forth is intrinsically a relatively hard language to do well in, because 1. to do it well you need to write lots and lots of small definitions, which need names, and 2. naming is the single hardest task in Computer Science (it's a joke but it's also true), and 3. Forth lacks a lot of visual cues that every other language has. You can't visually match parameters to functions. If you screw up the stack in a function, not only do you not know until shit breaks and you trace it back, but you can't easily see where you're affecting the stack - you have to start at the beginning and read through to each exit and maintain state while knowing what each word (with a name you hopefully remember--see #1 and #2) does with the stack. It's doable. I did it. It's objectively and materially a harder task than programming in another language, even before you get to the poverty of the community that has you writing all your own libraries and implementing your own hash tables (although see benefit: getting good at low level matters), or the compiler helping you with almost nothing.

if you do Forth anyway, fork over 100 USD and pick up iForth. It's really, exceptionally good (at everything except making tiny executables that exit really soon--ciforth and SwiftForth are the best at that). Use stack comments. Disregard anyone telling you to write Forth definitions like they're glossary entries, like in that one video. Give yourself a little bit of slack and use ALLOCATE and local variables every once in a while, or you'll burn out really fast and find yourself (God help you) writing Common Lisp or Haskell as a kind of retaliation against good sense and efficiency.
Forth is best at linguistic extensions that feel natural, DSLs that feel natural, doing lots of work with the whole language at 'compile time' to take a load off of runtime (and off of human-reading time--better to PARSE and populate a table from a string than have a bunch of CHAR C C, CHAR O C, CHAR C C, CHAR K C, nonsense.)

Attached: cidr.png (590x256, 37.3K)
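For reference, the stack-comment and locals styles he's talking about look like this (the { ... } locals syntax below is gforth's; Forth-2012 standardized the {: ... :} form):

```forth
: dist2  ( a b -- a^2+b^2 ) dup * swap dup * + ;  \ stack juggling
: dist2' { a b } a a * b b * + ;                  \ locals: no juggling
3 4 dist2 .   \ prints 25
```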

Forth is more of a philosophy than a language. I think it's entirely useful to learn too, simply to get that mindset.

Read Jonesforth and Thinking Forth

for fuck's sake just name things f, g, h, f1, f2, f3 and have a shadow file like Forth used to be in the good old days

OP try using Retro Forth by crc

wat
A C compiler generates assembly the same way someone who knows assembly would. Forth generates something more like a bytecode vm with the bytecode mixed into the binary.
Oh, your head is full of fuck from Jew school. The reason why almost all famous programmers were self-taught before entering college isn't due to the head-start but that they knew enough to see this shit as poz when it was presented in class and easily resisted the infection.
Link us to some of your projects.
Forth is almost entirely unoptimized by compilers other than for attempting to combine words. Everything else is self-defeating as making optimized combinations of words would sacrifice the small binaries which are the only reason anyone ever used forth.

Weapons-grade smug seems to be a trait of memelangs.

It seems to me that FORTH has been getting increasing attention over the last couple of years due to its suitability on FPGA...

Attached: ice40-hx8k-breakout-pacakge-2400.png (980x540, 356.9K)

Indirect-threaded FORTHs do that.
Direct-threaded, subroutine-threaded, and native-compiling FORTHs don't have bytecode.

I'm not sure how direct thread changes that as it just removes a level of indirection as a trade of size for speed, it's still the same table dispatch design. There's an argument that either way it's not true bytecode as there's no translation, but imo that's really pedantic.

what's NULL?
Forth person: the address 0
C person: actually that's defined by the binary ABI which is outside C and blurble blorble theoretically hostile compiler nasal demons

what's the difference between a pointer and an int?
Forth person: there's no difference, really. it's all numbers, man
C person: triggered they're different bit-widths on x86_64! they could be totally different on exotic Martian pinball quantum machines! if you add 1 to a float-type pointer you advance the pointer by the size of a float which
Forth person: it's your compiler doing that. you could really add 1 to any pointer but
C person: I know that but it would be a misaligned pointer which is undefined behavior
Forth: misaligned fetches are just slow on x86. on other architectures
C person: you'll get nasal demons you witch

what's OO?
C person: a high level language thing. C is a low level language so it doesn't have OO
Forth person: OO? yeah I've written a few of those. I figure, why use structs when OO is nearly as easy.


fuck you, ML is divinely inspired and I didn't learn it in school; rather, I was blinded to its glory by hubris and an onanistic search for the next linguistic novelty. A Jew language would be, probably, MATLAB, Mathematica. Maybe an APL variant.
iForth and MPE Forth have analytic compilers. Other optimizing compilers that generate straight machine code, not "a bytecode vm", generally just have peephole optimizers.

Something's really wrong with you. Here, let me help. You've been trying to argue that C hides the "low level matters" (>>959068) which you imply forth exposes and that you desired a "good grasp" of. Yet evidence you present is of forth hiding the low level matters which C exposes. And then you treat low level matters as complexity that you did not desire. Can you see how that doesn't fit what you were trying to say? It makes you sound like a raving lunatic.
Hmm. Seek help.

The C person there is unusually knowledgeable. If you learn C by the book, you won't learn any of that stuff; meanwhile you will have a paranoid and hostile attitude towards actual low level matters: for you the answer to "what's the difference between a pointer and an int" is that they are separate types and ANSI C says different things about them and caring about any so-called 'real difference' is a sin against portability-- and at such points the "C person" I portrayed can step in and act as an apologist, explaining why it's better not to know the things that he himself does know.
Meanwhile with Forth, you can hardly miss what's really going on, and there is no priestly class that will encourage you to fear nasal demons.

What is bad about OCaml? It's arguably less jewish and more useful than Lisp, and it's quite fast to boot.
So if you want to add something less imperative/more functional and garbage collected to your toolbox, it seems like a sane choice.

Well yeah, because it's wrong. Misaligned pointers don't work at all on many architectures; it's not just a question of performance. There hasn't been any performance difference with them on x86 for a while, either. And NULL has been defined as 0 for a long time. E.g. from the standard, "An integer constant expression with the value 0, or such an expression cast to type void *, is called a null pointer constant." and "The macro NULL is defined in <stddef.h> (and other headers) as a null pointer constant"[1].
Perhaps you should have learned by the book?
Yet it sounds like you've missed that your forth code with unaligned access will just fault and crash on architectures that don't allow it, while the "paranoid" C guy knew that there was a low level difference there to be careful of?

[1] open-std.org/JTC1/SC22/WG14/www/docs/n1124.pdf

ok what's wrong?
ok cool. what's wrong again?
je ne crois pas
but whatever, the point is that an awareness of this matters, but it never matters if you only ever touch proper C. Just one day, ten years into your career, someone will explain to you that your structs have padding in them and that's why you're seeing such-and-such
I said I was interested in Forth for a long time, of fucking course I noticed it? Jesus why don't you start pointing out that ints and floats use different registers.
The point of the narrative isn't that everything the C guy says is wrong. I--what the fuck, I just fucking explained this right here:
C and Forth both put barriers between you and the machine. Forth pretends that the machine has efficient stacks and that it's ideal to fit any data into some multiple of 'cell' sizes--which on x86_64 causes naturally written Forth to be 2x as slow on some tasks as naturally written C. C and Forth each have a standard, and with both there's pressure not to be gratuitously unportable. But Forth's standard has lots of "implementation defined" behavior (often meaning architecture defined--what happens is what makes sense for the hardware) where C has "undefined behavior" (often meaning "GCC fucks your code up with nonsensical optimizations and then sneers that you should thank it for not also fucking your daughter"), and this difference comes with a difference in mentality that makes one better than the other for actually coming to an understanding of how shit works.
Go ahead and disbelieve me, but you're never going to find someone who spent a lot of time with C and Forth and then credited any eye-opening realizations to fucking C.

C niggers best be staying the fuck away from my safe space!

Attached: pepe based.png (1200x760, 96.62K)

Sorry about all that swearing. I've been browsing reddit and hacker news a lot lately and it's getting to me.
There's some 'treat data as code' here: groups.google.com/forum/m/#!topic/comp.lang.forth/nWAaPl6A-iY
My newer version of -----BEGIN is: : -----BEGIN "-----END PGP SIGNATURE-----" parse "-----BEGIN PGP SIGNATURE-----" s:/ a:open drop /Hash:.*/ s:/ a:open nip eval ; (that's 8th, a proprietary Forth-like language from Israel.)
Imagine an environment where all the tools are written like this. I was disgusted when it came out that coreutils strings(1) could be exploited with malicious binaries -- i.e., the very stuff you'd typically want to run strings against. And when the only common protection against arbitrary execution in YAML was some weird check that a PID in the YAML matched the caller's (OK, no problem, let's just create a YAML file with 32k attempts at this exploit.)
meanwhile you do something you couldn't even imagine causing a problem, like checking that a cert looks valid, and instead it roots your server. great.
Forths can run EVALUATE in a very locked-down environment however. But I'd be wary of interpreter extensions relating to literal data.

This.

ML and Forth > C

I don't think 8th is from Israel. Ron is trying to make money and not be a poor homeless jobless loser. Use his Reva Forth if you want.

Test it. Since Skylake there is no penalty for regular misaligned accesses.
Sure it will. Try running a proper C CUDA kernel that does a misaligned access. You'll get an access violation.
Because I'm not still living in 1999. In "modern" use this century you'll find the xmm registers store floats and ints interchangeably on x86. Having unified registers is pretty common.
The main reason for C's undefined behavior is to let the programmer leave room for both portability and an optimizer. Forth doesn't really care about either so has no need. C can still be written brutally like forth without UB, portability, and tying the hands of the optimizer although it's bad form in C and requires knowing a lot about the language to do. But it's still something that fully learning the language would teach.

this is a weird response. you're weird.
also wrong: science.co.il/companies/Software.php
if money's the concern you could also just use the free version of 8th.

8th's compiler is proprietary, I'm not going to use it full stop.

$500? Surely, you've adjusted for inflation when you give that number?

Hell, in the early 2000s the GNU C compiler would just spit out completely unhelpful garbage for error messages. It's incredible how far it has come along. People think they're more intelligent than their forebears, but really the tools have gotten easier.

You have clang to thank for that. When the GCC devs saw how much better and more helpful clang's error messages were than gcc's, they dropped a collective India-sized dump in their pants and got to work on bringing gcc up to par.

Whatever else you might think of clang (being an Apple joint, licensing, technical details, whatever), the competition has certainly helped the jokers at GNU to get their shit together.

Ah, actually I didn't...
Hey Nerd, you want the C development environment for your AIX2 box? The one based on the PCC/Portable C Compiler that's in the Public Domain? But you and everyone can only have it as a proprietary licensed product-- 1983$500 per user, per year, thanks. The source code? FUCK OFF!

No we didn't.
I discovered and reported a bug in gcc-1.37 involving case statements-- a really weird one that buggered up INN. But I had the source. I was a 1980s Unix User Master Race, so I fixed it. We dissed other C compilers (Sun's C..) where we couldn't do that.

Probably worth noting for you younguns that in the 1980s you needed to pay AT&T big money to use Unix, even if you were a BSD shop with their 'open source before open source!'. bell-labs.com/usr/dmr/www/licenses.html

How do I get used to the stack? Reading complex forth code makes me feel like it's using lots of unnamed variables and it's really hard to keep track of what's going on and which operation gets applied to which values.

Structure the code so that the scope of any value is confined to a single line, so you can focus on it.

: times-table cr
11 1 do
11 1 do
i j *
.
loop
cr loop ;

"Stop trying to write Pascal in FORTH!" "oh."

: times-column ( n -- ) 11 1 do dup i * . loop drop cr ;
: times-table 11 1 do i times-column loop ;

"But sensei! My code /looks/ better! And you got all the dup/drop stuff that's confusing!!" "Now write a 6-dimensional times table." "oh."

Attached: the prices of single-user /microcomputer/ C compilers back in 1983, produced by small companies/one guy, with 1983-era compiler code quality. Note that Microsoft wanted 1983$200 for an assembler!

Attached: byte-3.jpg (513x1080 74.97 KB, 265.55K)

That's not so much different than getting used to the implicit variables like $_ and @_ in Perl.

Those variables are globals and have a very specific purpose. Forth's intermediate values can't even be seen: they live implicitly on the stack and don't even have a name in the code.


I don't get it...

There's no k l m n o p access to loop indexes. The student's code would've needed very non-portable hacks to N-OVER the values from the return stack and produced Yandere-level code. The second way only needs the programmer to think about 'i'--and all on the same line. It's the sort of thing that crops up in larger programs, and FORTH somewhat introduced the 'structured programming' idea, albeit in RPN...

Maybe you can use the .S thing in the places you're confused.

Attached: .S.png (1366x768, 86.68K)
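.s prints the stack without consuming it (gforth shows a <depth> prefix; other Forths format it differently), so you can sprinkle it through a definition while debugging:

```forth
1 2 .s   \ <2> 1 2
+ .s     \ <1> 3
.        \ prints 3, stack empty again
```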

gforth has support for local variables:
Gforth 0.7.3, Copyright (C) 1995-2008 Free Software Foundation, Inc.
Gforth comes with ABSOLUTELY NO WARRANTY; for details type `license'
Type `bye' to exit
: foo { a b } a b + ;  ok
1 2 foo  ok
.S 3  ok

If you're not liking forth, you can also try Factor, which has locals and is more normal-y, python-y, still concatenative though:
Press F1 at any time for help.
Factor 0.98 x86.64 (1888, heads/master-39cbe60fd2, Jul 16 2018 03:34:25)
[GCC 5.2.1 20151010] on linux
IN: scratchpad USING: locals math ;
1: Note: Already using ``math'' vocabulary
IN: scratchpad :: foo ( a b -- n ) a b + ;
IN: scratchpad 1 2 foo
--- Data stack:
3
IN: scratchpad

Fuck, you can even try Kitten, but it's purely functional (kittenlang.org/):
Welcome to Kitten! Type //help for help or //quit to quit
 1: define foo (T, T -> T): -> a, b; a b (+)
 2: 1 2 foo
3i32
 3: define foo2 (T, T -> T): -> a, b; a + b
 4: 1 2 foo2
3i32

Factor's a completely different experience. Really gives the lie to the notion that 'concatenative' is even a meaningful category of programming language.

Not really; concatenative programming is just a special case of functional programming, most importantly using a stack for data and treating code as data.
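That's the cleanest way to see it: in a concatenative language, writing two words next to each other is function composition, each word being a function from stack to stack. A quick Forth illustration:

```forth
: square ( n -- n^2 ) dup * ;
2 square square .  \ prints 16: 'square square' means square-of-square
```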

That sounds... dangerous.

Said the smart pajeet, and called the support to recover from a backup because someone screwed up his website with SQL injection.

Tbh I don't see any reasonable use case for FORTH outside of the demoscene community.

Forth is minimalist, unlike C. Idiot.

No it isn't. If you're coming up with meaningful names for every single variable in _any_ language, you're doing it wrong, and wasting time. Naming mostly just allows niggers to avoid comprehending your code and make invalid assumptions, as autists love to do.

What thread do you think you're in?
Maybe you could run through a Forth tutorial (gforth's crash course is good) and then come back

Forth was left behind because, like Smalltalk, it ends up being too easy for users to write software and current day computer companies hate that.

Gonna have to put smalltalk on the list of languages to learn.

Object oriented programming makes little sense outside the context of Smalltalk.