Just made a blogpost EXPLICITLY to BTFO all the LARPers on here

Should you learn C to “learn how the computer works”?


>I don’t think most people mean this phrase literally, so that is sort of irrelevant.
>Understanding the context means that learning C for this reason may still be a good idea for you, depending on your objectives.


words.steveklabnik.com/should-you-learn-c-to-learn-how-the-computer-works

Attached: steve klabnik.jpg (1400x933, 384.3K)

Other urls found in this thread:

sgi.com/tech/stl/drdobbs-interview.html
bell-labs.com/usr/dmr/www/chist.html
harmful.cat-v.org/software/c++/linus
quora.com/Why-is-the-C-language-taught-if-the-future-is-Java
glitchcity.info/wiki/Arbitrary_code_execution#In_Generation_I
gbdev.gg8.se/wiki/articles/Pan_Docs
sgate.emt.bme.hu/patai/publications/z80guide/
chibiakumas.com/z80/multiplatform.php
z80.info/lesson1.htm
homepages.ed.ac.uk/jwp/history/autocodes/
kednos.com/pli/docs/reference_manual/6291pro_025.html
en.wikipedia.org/wiki/Persistent_object_store
en.wikipedia.org/wiki/WinFS
twitter.com/SFWRedditImages

C is shit

Haha I saw that on Hacker News too OP ;) what's your username?

I'm Steve Klabnik

Attached: steve klabnik 7.jpg (1024x683, 119.92K)

based steveposter fighting the echochamber

For string handling, yes. Otherwise gas yourself.

What about memory handling?

Steve will chop his dick off and xis suicide commit will be pushed to master.

It's mistress now.

What fucking retard ever said "learn C to learn how a computer works"?

Literally all the LARPers on here. That's why I wrote this.

Holy C is the only language that one should bother learning, so fuck off with all your retarded useless languages

Attached: SleepingTerryDavis.png (548x410, 173.55K)

Sleep Tight Terry

Sleep Tight Terry

There is such a thing as anti-knowledge.

And here I was thinking C started as a joke among mates


Never heard that either, avoiding idiots has its perks

OP didn't even read the fucking article apparently. Here are some snippets, because OP can't fucking focus for shit.


Note that when he says "C operates in the context of a virtual machine", he doesn't mean a sandbox, and he doesn't mean a bytecode interpreter. A set of assumptions about execution that ignores hardware specifics (like ISAs) is a "virtual machine". Every non-assembly language whose specification describes execution is describing a virtual machine.


Steve Klabnik is a fag, but OP is a retard who can't read. The entire point of the post is that learning C will not teach you how computers work. It will teach you more about how computers work than a higher level language, but C is still very abstracted from the hardware. C doesn't teach you shit about registers, dealing with interrupts, dealing with manual stack frames, manual manipulation of the program stack, or anything like that. C will not teach you how the computer works.

Steve is right in this case.

When people talk about "how the computer works", they mean the operating system and operating-system-level components. This sometimes means the hardware, insofar as operating systems are designed to work on a small number of architectures (and sometimes only one). It also means OS-level APIs, e.g. graphics, IO, IPC, etc. In higher-level languages, these concepts are abstracted away into generic OS-agnostic interfaces. In C, you are encouraged to use the low-level concepts directly, allowing you to learn how the high-level interfaces are implemented, and how the operating system really works.

how about looking into what computers even are.

You should learn C to learn how modern computers with flat untagged memory work.

A very insightful contrarian comment about the nature of C appears in an Alexander Stepanov (C++ STL designer) interview by Al Stevens in Dr. Dobb's Journal (3/1995) (sgi.com/tech/stl/drdobbs-interview.html):
"Let's consider now why C is a great language. It is commonly believed that C is a hack which was successful because Unix was written in it. I disagree. Over a long period of time computer architectures evolved, not because of some clever people figuring how to evolve architectures---as a matter of fact, clever people were pushing tagged architectures during that period of time---but because of the demands of different programmers to solve real problems. Computers that were able to deal just with numbers evolved into computers with byte-addressable memory, flat address spaces, and pointers. This was a natural evolution reflecting the growing set of problems that people were solving. C, reflecting the genius of Dennis Ritchie, provided a minimal model of the computer that had evolved over 30 years. C was not a quick hack. As computers evolved to handle all kinds of problems, C, being the minimal model of such a computer, became a very powerful language to solve all kinds of problems in different domains very effectively. This is the secret of C's portability: it is the best representation of an abstract computer that we have. Of course, the abstraction is done over the set of real computers, not some imaginary computational devices. Moreover, people could understand the machine model behind C. It is much easier for an average engineer to understand the machine model behind C than the machine model behind Ada or even Scheme. C succeeded because it was doing the right thing, not because of AT&T promoting it or Unix being written with it." (emphasis added)

we'll never stop.
you'll lose.
get used to it.

You're both right. C arose at a time when multiple machine architectures existed but there was no easy way to port code from one architecture to another. C was designed as a light abstraction layer above this differing hardware, with the compiler being responsible for translating from the abstract C machine to the target machine.

Then we went and built different C libraries for the different OSs, and reinvented yet another problem from our professional past.


Yes I know. I'm deliberately skipping them to keep this short and focused.

Scheme and other functional languages are by design more abstract than languages like C. If you already "know how a computer works" then something like scheme or sml is going to look alien at first, and it's difficult to reason about how the machine executes your code.
Ada is also more abstract than C, with things like parameter modes, thick pointers and array indexes. But in contrast to scheme-like languages it is still fairly easy to reason about how the machine would execute it, e.g. iterating over an array and updating values should compile down to almost the same assembly, even though with Ada you can use
for x of foo loop
   x := x + 1;
end loop;
-- or
for I in foo'range loop
   foo(I) := foo(I) + 1;
end loop;

Now the question is what the supposed benefit is of C where you do pointer arithmetic instead of working with indexed arrays, given that they compile to near identical code.
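For the curious, here's a rough C sketch of both forms (the function names are made up, not from any post here); with optimizations on, mainstream compilers typically emit near identical assembly for the indexed and the pointer version:

#include <stddef.h>
#include <stdio.h>

/* Indexed form, closest to the Ada loops above. */
void bump_indexed(int *foo, size_t n)
{
    for (size_t i = 0; i < n; i++)
        foo[i] = foo[i] + 1;
}

/* Pointer-arithmetic form, the stereotypical C way; same effect. */
void bump_pointer(int *foo, size_t n)
{
    for (int *p = foo; p != foo + n; p++)
        *p = *p + 1;
}

int main(void)
{
    int a[4] = { 1, 2, 3, 4 };
    bump_indexed(a, 4);
    bump_pointer(a, 4);
    printf("%d %d %d %d\n", a[0], a[1], a[2], a[3]);   /* prints 3 4 5 6 */
    return 0;
}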

Basically right. Steve is also basically right, but phrased his argument in a way designed to induce controversy, not to facilitate understanding (he probably did this deliberately because he's a fag.)

Attached: thus always to tyrants.png (618x948, 315.88K)

F
Sleep Tight Terry

For Steve to be right, his point has to be so fucking retarded that it's mind boggling: something like 'computer languages are for humans, not machines'. Y'know why? Because this is typical lefty-style propaganda.

No, he's just stating an obvious, boring truth using deliberately controversial language. Contrived controversy is a great way to promote your faggot blog and act like you're really smart and great. You having such a strong negative reaction to it is exactly what he wanted.

You can use array subscripts in C too, you know. But the point is that pointer arithmetic isn't black magic the way array indexing is, since it's what's really happening when you use those convenient brackets.
But I really find pointer arithmetic neat.
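If anyone doubts it, here's a tiny sketch (made-up array, nothing from the thread) of what the brackets actually mean: a[i] is defined as *(a + i), which is also why &a[i] and a + i compare equal.

#include <assert.h>
#include <stdio.h>

int main(void)
{
    int a[] = { 10, 20, 30, 40 };
    int i = 2;

    /* The subscript operator is defined in terms of pointer arithmetic. */
    assert(a[i] == *(a + i));
    assert(&a[i] == a + i);

    /* Which is also why the famously silly i[a] compiles and prints 30. */
    printf("%d\n", i[a]);
    return 0;
}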

Literally who?

Alright, riddle me this /g/. In most modern languages you can turn off safety features like array bounds checking and null checks using compiler flags. So why not use a modern language like swift or rust instead of C? Both of them support imperative programming and have nicer features. You can be just as close to the metal in swift and rust. C has an ancient build system, it's bloated, and you don't even have RAII. It seems you should only use C if you're working on legacy code.

Use whatever you want to for your own projects. And be assured that everybody else is going to use whatever they want to. Stop worrying so much about it. If you spend your time trying to get everybody on the same page, you are wasting your time.

Anybody who's confused by scheme probably has brain damage. It's one of the easiest languages to learn ever made. Certainly easier than C (not that C is very hard either.)

lemme check the niggersphere. Nope upboats everywhere. You think upboats come from people who think he's actively deceitful or from people who socially cuck themselves into thinking he's genuine?

The "upboats" come from people who think he's saying something novel or interesting instead of banal, due to the way he deliberately presented the banal as something other than banal.

Not really since the behavior, at least in Ada, is quite well defined.

I don't understand why people accept abstractions like if/then/else and loop constructs without second thought, but things like indexed arrays and strings are seen as the spawn of satan.

You didn't understand the point. It's a lot easier to guess what assembly C will produce, even in complex cases. For a dynamically typed garbage collected language? Fuck no.

Or switch, which will often work in a way that's completely unexpected to novice programmers (who expect that it always compiles down to an if-else-if chain).

Basically it's because people are dumb. They don't recognize their own biases, and frequently aren't aware of how little they know.

I haven't seen anybody recommend C for that purpose here; I say this because I shill assembly/emu dev for that purpose. The norm here is scheme for that. Are you sure you are on the right imageboard?

Web developers, and people involved in that entire ecosystem sure don't.

People accept if/else/etc because of .

AT&T shills knew that C sucked, so they made up bullshit about it being "portable assembly" and this is one of the reasons why people think computers and software have to be so complicated and that bugs and exploits are normal. AT&T shills blamed everything C couldn't do on the hardware. C doesn't have nested functions because the PDP-11 didn't. C doesn't have array bounds or multi-dimensional arrays or slices because the PDP-11 didn't. C doesn't have garbage collection because the PDP-11 didn't. Lisp machines have all this, but the PDP-11 didn't. Then UNIX companies like Sun created RISCs which are designed around C and UNIX, so they could blame their new hardware for why C sucks and C for why their hardware sucks.


That's revisionist bullshit. The evolution into the PDP-11 style of hardware happened before C and it was just one kind of machine. String handling does not need byte-addressable memory. More advanced hardware was motivated by more powerful languages and more possibilities for speeding up programs and simplifying software.

This is more revisionism. C was a quick hack and it also sucked even more before ANSI. Most C weenies idolize C because they don't know anything else, or if they do, it's a UNIX language like Java or awk.
bell-labs.com/usr/dmr/www/chist.html

Tagged and segmented architectures aren't imaginary. Too many computer companies went out of business and everyone left has a monopolistic mentality. Everything now has to be "compatible" which means running 50 million line browsers and bloated C++ compilers.

C and C++ are so complex and poorly defined that everyone has to oversimplify and misunderstand everything. Even comp.lang.c and the standards committee get confused.

To me, "doing the right thing" means working correctly, signalling errors, and so on. UNIX weenies think the "right thing" is popularity and popularity is the only measure of value, which sucks. They need to defend everything wrong with C because losing popularity would mean C is worse.


That's absolutely true. Computer design used to be about how humans use computers and it always was. Programming languages are for people to read. Tagged architectures were built to make computers better for people. UNIX's tape drive emulators (pipes and byte sequence files) have almost nothing to do with how we use computers today and lead to enormous wastes in code, memory traffic, CPU usage, and so on, but they fit the PDP-11 and how it was used. Programs have tens of millions more lines of code than they need because they're based on "cobbled together gunk" that has nothing to do with how we use computers.

Hey. This is unix-haters, not RISC-haters. Look, those guys at berkeley decided to optimise their chip for C and Unix programs. It says so right in their paper. They looked at how C programs tended to behave, and (later) how Unix behaved, and made a chip that worked that way. So what if it's hard to make downward lexical funargs when you have register windows? It's a special-purpose chip, remember? Only then companies like Sun push their snazzy RISC machines. To make their machines more attractive they proudly point out "and of course it uses the great general-purpose RISC. Why it's so general purpose that it runs Unix and C just great!" This, I suppose, is a variation on the usual "the way it's done in unix is by definition the general case" disease.

First half of the article is only about staying true to the clickbait. Don't waste my time and get to the point, everyone knows C is not asm.

Read "fixing my broken language for lazy bums who want an overly complex machinery doing their own malloc/free at runtime". Once you accept that only compile time garbage collection is okay, you'll be liberated from your ivory tower of lambda calculus retardation.

Stealing this
Sleep Tight Terry

He's stating the obvious by attacking a strawman in a quite obnoxious way. And even when I started learning C and knew nothing, I knew what assembler is and I knew that I wasn't writing direct machine code - I mean I needed to compile the stuff, and that automatically made me think about it. No one can be that stupid. C is, though, a good compromise between trying to be an autist and starting with assembler, and using Java and dealing with object orientation and all that modern bullshit that is really useful in software development but really not that interesting when you start out. Having pointers and memory management in C will indeed teach you something about 'how a computer works'. Or actually about how a computer doesn't work when you fuck things up.

Accurate.

Hmm, if I use a switch statement I expect it to be fast by not staggering through that chain. It should be a lookup table. I wonder if bytecode languages handle it the same or if it all gets either optimized or mangled to whatever in the process.

yeah sure, I'll use the apple language. Not too sure about who's behind rust either. C on the other hand is ancient. There's the gcc we all know and used for years. And there's a C compiler for everything while until a few weeks ago, I didn't even have a Rust compiler on my machine. (Some package I installed needed it to be compiled) Same thing with PHP when it comes to web stuff. It runs basically everywhere, it can do everything I'd ever need a website to do and it's pretty open whereas I'm always a little suspicious when it comes to Oracle.

somebody screencap this, it doesn't fit on my poorfag screen.

Most novice programmers get this wrong. Ask around and see for yourself.

Yeah, I think at some point even one of my professors said that. I wouldn't know either if it wasn't for Terry.

You know that not all switches get compiled into jump LUTs, right? A
switch (i) {
case 0:      func1(); break;
case 100000: func2(); break;
}
won't, for example. But yeah, you're supposed to use it when you know the compiler will not produce any branch.
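For contrast, a dense, contiguous run of cases is the pattern that usually does get lowered to a jump table (sketch with made-up handlers; whether a table is actually emitted still depends on the compiler and optimization level):

#include <stdio.h>

static void handle0(void) { puts("zero");  }
static void handle1(void) { puts("one");   }
static void handle2(void) { puts("two");   }
static void handle3(void) { puts("three"); }

static void dispatch(int i)
{
    /* Contiguous case values 0..3: compilers commonly build an indexed
       jump table here instead of an if-else-if chain. */
    switch (i) {
    case 0: handle0(); break;
    case 1: handle1(); break;
    case 2: handle2(); break;
    case 3: handle3(); break;
    }
}

int main(void)
{
    for (int i = 0; i < 4; i++)
        dispatch(i);
    return 0;
}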

>Or switch, which will often work in a way that's completely unexpected to novice programmers (who expect that it ALWAYS compiles down to an if-else-if chain).
It helps to read.

I remember I liked this book.

I'm with Linus Torvalds on this: "even if the choice of C were to do *nothing* but keep the C++ programmers out, that in itself would be a huge reason to use C."

harmful.cat-v.org/software/c++/linus

C++ and object-oriented languages do more to teach you bad habits on how to avoid programming properly than help you in any way. Once you try to write a proper efficient program, you're frequently back to doing things the C way. If you take a C programmer but make him do OOP bullshit, the C programmer will still manage a decent job. But if you take a OOP programmer and ask him to write C, it will be a fucking disaster and you will realize what a clueless monkey you are dealing with. And when you want a OOP programmer to write efficient software, he's gonna have to resort to doing things the C way more instead of leaning on all those layers of abstraction and bullshit frameworks (and those layers of abstraction and frameworks also contribute wonderfully to bugs and unreliable behavior). And that's a big reason why people should learn to program in C instead of C++ and other BS.

baste and redpilled

ok ok, learn C and how GCC works, as well as assembly/machine language.

You should learn Forth and assembly for your arch, tbh. And start with something simple, like a Z80.

No. C is too high level, it'll teach you nothing.

Why the Z80? I don't even think PolyFORTH was ported to the Z80. Stop LARPing and pretending you're a forther, and just use gforth on .

Sleep Tight Terry

Here you go.

Attached: I'm the smartest programmer on earth! Only LISP is good! Your hardware isn't the PDP-11 anymore!.png (1472x847, 150.99K)

Imagine being exactly as retarded as UNIX weenies, with the same disdain for backwards compatibility, aka "making software that stays useful and reliable".

Dumb nigger, every 80's computer had several Forth implementations. Z80 also has the advantage of tons of software via CP/M.

Attached: front_300.jpg (2839x1406, 3.8M)

If they actually did this you wouldn't need backwards compatibility.

That's because you are a newfag.
This is Zig Forums, not /g/. (((Scheme/LISP))) niggers are not allowed here.


based

Attached: steve klabnik a.jpg (1280x720, 61.78K)

Sleep Tight Terry

Sleep Tight King Terry ;_;

Attached: sample_752062ff81127f4f8dac89b2ef7c035e1f62d3c1.jpg (850x642, 71.6K)

CP/M isn't what I think of when I think Forth. The only Forth I know for z80 is CamelForth. If you want to learn Forth and a CPU, learn the 6502.

Sleep Tight Terry

Sleep Tight Terry
Press F

Attached: DjI_ZtuWwAgpoRJ.jpg (977x543, 92.38K)

Well that's one of them, but you didn't look very hard. That CDROM cover I posted is a hint of the vast amount of tools you can get from CP/M (in addition to whatever "native" software is available for any given Z80 computer, like say the Amstrad CPC/PCW series). And it just so happens, those systems have an 80-column screen, which is quite a bit more comfy for programming than a VIC-20 and its 22 columns. But otherwise it doesn't really matter which 8-bit computer you use, because they're all simple and a good starting point to learn the hardware.

I know javascwipt Im basicwy a pwofwessional pwoggrammaw I'm going to wite evewything in nodejs even my computer apps in node-webkit :)

thanks fren

Sleep tight Terry.

I didn't look, I was going off memory.

Sleep tight Terry

Tagged architecture has many serious advantages. All data structures can be used no matter where they came from because they share the same type system. That's great because it means you don't have to care about what language or garbage collector other people use. Not having to malloc/free is just one of the advantages.


UNIX weenies are brain damaged.
quora.com/Why-is-the-C-language-taught-if-the-future-is-Java

C teaches how C works. C shits on the last 60 years of compiler technology (even though it's not that old), so you will have no idea what compilers and programming languages can actually do.

Understanding how a GC moves objects and pointers teaches you a lot more about memory management than malloc and free. There are also a lot of languages that have pointers and the equivalent to malloc and free but don't suck.


That's because C++ sucks. The "C way" is the shitty way. People in the 80s and 90s were calling C++ "weakly object-oriented" but AT&T shills blamed OOP because blaming C for why C++ sucks would be bad for their business. OOP that was good meant different languages like Smalltalk and CLOS in Common Lisp. Common Lisp actually does have all the code reuse benefits that OO proponents said OOP has. Java is an improvement over C and C++, which is all it was meant to be.

Those are C++ problems, not OOP problems. OOP is about not needing frameworks and reducing bugs. Computer scientists used to say that most design patterns and frameworks are to make up for flaws and deficiencies in the language.

There are more languages than C and C++.


Browsers and Linux are not useful and reliable. They constantly need more and more millions of lines of code because they're badly designed, but they still can't do what mainframe OSes did in the 60s, like file versions and proper error handling instead of panic. UNIX linkers still can't handle anything more than PDP-11 assembly could. UNIX "design" is based on bullshit like running out of disk space on a PDP-11 (/usr) and using a tape drive archive for code libraries ("ar" format), so UNIX weenies assume that everything is only used because of popularity and "backwards compatibility" and not because it's actually good, or even that it can be good.

The fundamental design flaw in Unix is the asinine belief that "programs are written to be executed by computers rather than read by humans." [Now that statement may be true in the statistical sense in that it applies to most programs. But it is totally, absolutely wrong in the moral sense.] That's why we have C -- a language designed to make every machine emulate a PDP-11. That's why we have a file system that forces every file to be viewed as a sequence of bytes (after all, that's what they are, right?). That's why "protocols" depend on byte-order. They have never separated the program from the machine. It never entered their tiny, pocket-protectored with a calculator-hanging-from-the-belt mind.

And thank god it didn't! Because otherwise it wouldn't be fun to do anymore.

The reason why C is so ubiquitous is because C allows for manipulating what you blatantly say it cannot. C is also considered dangerous for this very reason -- extreme freedom.

Why does nobody recommend learning verilog to learn how computers work?

So what is the alternative to C? I have yet to find a programming language that is not an over-engineered piece of cancer that never stops growing. Literally everything is bloated with (((enterprise))) anti-features or has a shit ecosystem. Show me a simple programming language that is at least comparable to C in simplicity, speed and ecosystem size and I will not write a line of C in my life ever again.

C closely resembles algebra. If people understand algebra, it's easy for them to understand C more quickly than OOP languages.
Captcha:plebQ.

Times change. C wasn't so popular in the 80's, and probably won't be in 30 years.

I see no language existing today replacing C as the de facto lingua franca of programming and computers in general. If anything, programmers will revert/resort to programming assembly to the specific make and model of processor. I don't know if Ritchie or Brian W. Kernighan would object to a student learning the electronic switches necessary to operate a computer effectively.

...because Verilog has nothing to do with how computers work?

Quite frankly, any object system short of CLOS is just pathetic.

Yeah but nobody knows what's going to happen in 30 years. Maybe the world will have moved to Javascript machines or whatever (not necessarily Lisp, but same concept).

Javascript machines? There's even more that can go wrong than in C!

It could be a LUT where the index is divided. If you had a lot of cases with constant distance like 100 200 300 400...

Because nobody knows, you cannot argue that point. I say that most programmers know C and that it leads to little cost. I do not think a job programming C would be a poorly paid job.

C/C++ has a lot of tools and 3rd-party libraries, unlike fag languages, like rust and swift, or some meme languages that nobody uses. C and C++ also have better performance than rust and swift.


That's a good book!


Z80 is popular and simple.

>tfw computers themselves become based on shitty frameworks, like (((electron)))

It might be a stretch to say C makes you "understand computers", but I think it helps to understand other languages. I started out with C++, and my understanding was surface level. Like, what the hell does #include even mean? Why does std::cout have such weird syntax? What even is std::? Don't even think about what a string is, just make a variable of the string type and hope it works.

Eventually I ended up playing with C because that's what the devkitpro compiler for Wii homebrew uses. Later on I dabbled in HolyC which legitimately helped. It really helps you to understand better when you don't have all these classes and weird operator overloads confusing you when you're just trying to learn what's actually going on. People who start with C++ or java tend to become the "mindlessly copy/paste from stack overflow" type of programmer. In other words, pajeet.

Z80 is fun too! Z80 assembly was actually the first language I ever learned.

I think people who say 'learn C to learn how computers work' should be saying 'learn C to learn how computer software works'. Computers are much more than software.

Software is just a layer of abstraction on computers. The most incomprehensible stuff, at least to normies, is in the circuitry.

I doubt compilers do these kind of checks, though.

Some people tell beginners to start with X86 ASM; however, I DISAGREE. Z80 doesn't have any dumb crap that gets in your way and the ASM code is readable. If you like games (L-Liking to play games every now and then doesn't make me a man-child, right?), then learning GameBoy ASM might be fun. Play around with arbitrary code execution bugs in Pokemon Red (glitchcity.info/wiki/Arbitrary_code_execution#In_Generation_I), or something. The CPU used in the GameBoy is a bit like a synthesis of the Intel 8080 and the Z80. Read the Pan Docs (gbdev.gg8.se/wiki/articles/Pan_Docs) if you are interested in GB ASM.

Learn Z80 ASM
sgate.emt.bme.hu/patai/publications/z80guide/
chibiakumas.com/z80/multiplatform.php
z80.info/lesson1.htm

Attached: 692.jpg (500x478, 53.71K)

I strongly agree. In my case I started out programming games on the TI-83+ (which is a z80 calculator)

That quote is exactly right. The UNIX weenie idea of fun is writing tens of millions of lines of code to do simple things in complex ways. Linux has hundreds of system calls and there are tens of thousands of kernel developers and 15 million lines of code. Pipes are fragile and depend on the bit encodings of text which is supposed to be human readable. The UNIX way is to pretend that programs communicate through virtual PDP-11 tapes, making everything even more complicated. C bit operators are based on the PDP-11. UNIX weenies "have never separated the program from the machine." They "make every machine emulate a PDP-11."


He's absolutely right. C does not do anything about registers, interrupts, stacks, or many other things you can do in assembly and with a compiler, like garbage collection, coroutines, overflow checks, arbitrary precision arithmetic, lazy evaluation, closures, and so on. That's because C is not your computer. C is just a language with a compiler that doesn't do as much as other compilers. Weenies do not respect compilers that do things like bounds checking and string handling because AT&T shills have convinced them that worse tools are better.


High level languages resemble algebra. This goes back to "autocodes" designed for scientists and mathematicians in the 50s, which led to Fortran and Algol. C does stupid bullshit like 00011 being equivalent to 00009 and "hello" + 2 meaning "llo" (but you can also subtract 2 again) because it's pointer arithmetic.
homepages.ed.ac.uk/jwp/history/autocodes/
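A quick sketch of the two quirks being mocked there (my example, not from the post): a leading zero makes an integer literal octal, and adding an integer to a string literal is plain pointer arithmetic.

#include <stdio.h>

int main(void)
{
    /* Leading zero means octal: 00011 is 9, not eleven. */
    printf("%d\n", 00011);

    /* "hello" decays to a pointer, so + 2 skips two characters... */
    const char *s = "hello" + 2;
    printf("%s\n", s);       /* prints "llo" */

    /* ...and you can subtract 2 again to get the whole string back. */
    printf("%s\n", s - 2);   /* prints "hello" */
    return 0;
}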


That's monopolistic thinking. UNIX weenies believe there should be one language used for everything even if it sucks because that's the AT&T culture. The whole idea of a "de facto lingua franca" sucks because different languages have different ways of doing things. Different data types, concurrency, object systems, and so on. Even worse is a language like C that is designed for flat memory PDP-11 hardware because it prevents better hardware from being built.

They're AT&T employees. AT&T employees didn't do anything to correct these weenies who say UNIX is the first OS written in a high level language and all that other bullshit, so they probably don't care much about you learning anything.


That weird std:: syntax comes from Lisp packages, like cl::setf means setf in the common-lisp aka cl package. Most languages that use . for methods and structs/records use . for that too, like Ada, Java, and OCaml. I don't know why C++ copied Lisp instead of something that would fit better with the rest of the language. Then again, C and C++ suck, so something that actually makes sense would stick out.

C, C++, and Java are all UNIX languages. A lot of things in UNIX languages make no sense and have no reasoning besides being whatever the compiler or interpreter did. They copy/paste because they don't know what the code does.

But it's much worse than that because you need to invoke this procedure call before entering the block. Preallocating the storage doesn't help you. I'll almost guarantee you that the answer to the question "what's supposed to happen when I do ?" used to be "gee, I don't know, whatever the PDP-11 compiler did." Now of course, they're trying to rationalize the language after the fact. I wonder if some poor bastard has tried to do a denotational semantics for C. It would probably amount to a translation of the PDP-11 C compiler into lambda calculus.

real mvp of the thread

You've piqued my interest, what else would a file be viewed as?

(I'm a non-techie)

I'd call you a cheap whore for sucking off anything which claims it isn't Unix, but that's too kind. You've abandoned any idea of useful payment long ago and your throat is looser than Nausicaa's valley of the wind, so sometimes you vomit up a mixture of everything you've swallowed lately and call it a post.

If his track record is any indication, he'll either forget to respond or give you some retarded filesystem concept from the 70s with such obvious downsides no one has tried it since.

Attached: disabled_1.png (263x326, 109.64K)

Typed files or data structures. PL/I has keyed files which can be accessed in a random order by key because a disk is not a tape. Files could also be part of an object store or a database. None of these require any changes to hardware because they're just ways to access data on the disk. UNIX weenies who think C is how the computer works also think UNIX file systems are how the disk works.
kednos.com/pli/docs/reference_manual/6291pro_025.html
en.wikipedia.org/wiki/Persistent_object_store
en.wikipedia.org/wiki/WinFS


Hardware is designed for programs, so it's actually the other way around. You're using hardware that resembles the languages less, which means they could be faster on different hardware. RISC does this too, except it's designed for C and UNIX.

If you've been following my posts, you would notice I've always had the same list of things I like.

There are no downsides. Not being supported by the C standard library is a flaw in C, just like not having strings that don't suck. Typed files don't prevent you from using sequences of bytes and pretending your SSD is really a PDP-11 tape, but it would be stupid.

>In another article WD writes:
>|> [of VMS]
>|> I sure hope so. Any o/s which puts file types in the o/s
>|> instead of the program is really creating problems for the
>|> user, and I find that more of a problem than the user
>|> interface. If programs really want a "$ delimited left
>|> handed nybble swapped hexadecimal" file type, let it be
>|> done in the program or shared library, and not at a level
>|> when all user-written file transfer and backup programs
>|> have to deal with it. As my youngest says "yucky-poo!"

Huh? Let's think about this. Tighter integration of file types to the OS are not a problem. In my experience, UNIX offers the weakest file maintenance offerings in the industry, save for MS-DOS. In using Tandem Guardian and VMS I've found that ultimately, one could:
* Back up files.
* Transfer files.
* Convert files.
...much more easily and safely than with UNIX. Yes, it was between Guardian or VMS systems but instead of going into an "open systems" (whatever THOSE are) snit, read on. As a result:
* Each RDBMS has its own backup and restore facility of varying functionality, quality, and effectiveness, complicating support for sites adopting more than one RDBMS.
* All existing UNIX backup and restore facilities are highly dysfunctional compared to similar facilities under the aforementioned operating systems. They can make only the grossest assumptions about file contents (back it up or not, bud?) and thus cause vast redundancy in backups; if you change a single byte in the file, you back up the whole thing instead of changed records.
* Transferring files from system to system under UNIX requires that all layers of functionality be present on both sides to interpret files of arbitrary form and content. Embedded file systems ensure that file transfer is enhanced because the interpretation and manipulation facilities will be there even if the highest layers aren't (ie: you can at least decompose the file). Find me one person who guarantees they can decompose an Oracle or Ingres file (ie: someone who has a product that will always do it and guarantees it'll work for all successive releases of these packages).

Once one strips away the cryptology, the issue is control. UNIX is an operating system that offers the promise of ultimate user control (ie: no OS engineer's going to take away from ME!), which was a good thing in its infancy, less good now, where the idiom has caused huge redundancies between software packages. How many B*Tree packages do we NEED? I think that I learned factoring in high school; and that certain file idioms are agreed to in the industry as Good Ideas. So why not support certain common denominators in the OS?

Just because you CAN do something in user programs does not mean it's a terribly good idea to enforce it as policy. If society ran the same way UNIX does, everyone who owned a car would be forced to refine their own gasoline from barrels of crude...