Do we really still need C?

Considering that C++ can be just as fast as C, there is probably no need for C anymore; C is just too difficult to use correctly, even for experienced programmers.

Just write everything in Rust

Pascal/Lisp should have been adopted over Cee.

What are you smoking? C will go away once there is a better alternative available. For a large number of tasks, C is still the right tool for the job.

The question is do we really need C++ when all it does is increase compile times and create more OOP pajeet shitters.

No other language's compiler can give you the same performance (other than Ada, I think).

kill yourself rustfag

such as?
face it, even in "muh embedded shitstems" c++ is used


nice sage

In all seriousness, I'm not the only one who finds object-oriented programming overcomplicated, right?

Javascript must be a great language, since it's used by so many people.

It's overused, probably because it was hyped so much during the '90s, as if it was going to magically fix all software problems.

Javascript is not good, but it's absolutely the right tool for a lot of jobs. You can't really get around it, except by avoiding those jobs.

Stop pretending that you don't know.

>implying C++ is easier to use right even for experienced programmers
kek

Which job is Javascript the "right tool" for?

No, you're not. In my humble opinion OOP is a clever way to obfuscate code for job security (at its worst). With OOP, you no longer have plain functions that transform input data into output data, now you have "objects" (blobs of code + internal state) which have relationships and dependencies on other objects. But of course the machine doesn't think in OOP, it still thinks in code+data. So OOP is for the people, not the computer. People thought OOP would help with code reuse. They were well intended but wrong. You know what helps code reuse? Backward compatibility (not only the ability to compile old stuff as-is but also the ability to run old stuff as-is). Libraries and engines are also examples of code reuse; and surprise, they have nothing to do with OOP either.


Yes. C persists because C++ is a nastier and more complex language. Consider the difficulty of learning all of C++17 from scratch. Fun experiment: try writing a tutorial on it for beginners. Things will get even fuckier with C++20 when they introduce Concepts. Don't get me wrong, C++17 is nice and all, finally adding std::filesystem 20 years too late, but don't you have the feeling that C++ has been turning into a bloated, unlearnable mess ever faster since C++11? I do, but then again I'm just a dumbass who has yet to master move semantics, naively believing that sort of optimization shit is for the machine to figure out, not me, the supposedly high-level programmer.

Good question!
Allow me to
talk about a
... ugh.
...
C++ is too difficult, even for experienced programmers. What you can't even begin to understand, you certainly can't secure or build well, except by accident. As hard as these theorem-proving languages are, I've never once thought "man, I bet C++ isn't as hard as this", because, although that's probably actually true, it's like saying that a 1kg bag of dicks is easier to eat than 5kg of steak. Absolutely true. Now give me the steak. I will pass on the bag of dicks.

Scripting on web pages.

All programming languages are. Otherwise we would be writing in assembly.

we bow down to your superior experience

Assembly is also for the people, especially on CISC processors.

That's only because it's the only language that browsers support, not because it's the best language for the job.

Isn't C++ like 95% the same as C, minus the stuff they added for classes? Can't you technically run C code in C++?

It's the best language for the job because it's the only language for the job. It's not the best language in the full possibility space of how computing could have turned out, but it's the only choice in this world.

no. C++ has a ton of shit, and if you're doing C++ right you're not writing anything remotely like C. And no, you can't write a lot of C in C++. C has continued to evolve and C++'s grandfathered version of C has not. The grandfathered version is actually quite old, and still has practices like casting malloc()'s return, because C++ refuses to implicitly convert the void* that malloc() returns, while in C simply assigning it to a variable of the appropriate type will do what you want.
C++'s ability to work with C is inferior to ATS's ability to do that.
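
A minimal sketch of that malloc() difference (the function name is my own invention); the commented-out line is the idiomatic C spelling that g++ rejects:

```cpp
#include <cstdlib>

int* make_buffer(std::size_t n) {
    // int* p = malloc(n * sizeof *p);                       // valid C, error in C++
    int* p = static_cast<int*>(std::malloc(n * sizeof *p));  // the cast C++ demands
    return p;
}

int main() {
    int* buf = make_buffer(16);
    std::free(buf);
}
```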

Well now you're just arguing semantics, but fine.

It seems like the only reasonable meaning in the context of talking about whether we still need languages.
We definitely need javascript, even though it's a bad language. We also need C, and FORTRAN, for somewhat similar reasons.

The context was C vs C++ and/or what is the best tool for the job. The argument wouldn't exist if C++ was the only choice.

Umm sure if you count exception handling as "stuff they added for classes", so that you can signal an error from a failing constructor or overloaded operator but curiously not a failing destructor... I mean you could because it compiles fine but really, really shouldn't. C is 95% simpler though.

You don't even need C++, there is Rust.

That's because destructors are called during cleanup when an exception happens. You really, really, really don't want exceptions to except your exceptions *insert yo dawg meme here*; bad things would happen.
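
A minimal sketch of those "bad things", assuming a destructor explicitly allowed to throw (they are noexcept by default since C++11):

```cpp
#include <stdexcept>

struct Risky {
    ~Risky() noexcept(false) {
        throw std::runtime_error("destructor threw");
    }
};

int main() {
    try {
        Risky r;
        throw std::runtime_error("original error");  // unwinding destroys r...
    } catch (const std::exception&) {
        // ...never reached: a second exception thrown during unwinding
        // makes the runtime call std::terminate() immediately.
    }
}
```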

Stop trying to make excuses for your incompetence and lack of critical thinking.

yet another rust shill thread

It was never claimed C++ is better because more people use it.
Learn to read you brainlet.

lul

my daycare's floor is covered, three feet high, with plastic balls. it's a big ball-pit, like children's fast-food stores used to have.
it's a little irritating to walk through, but it's critical that we keep the floor densely covered in balls.
why?
because there's broken glass under the balls. What if a kid fell over? without the balls, he'd be immediately horribly injured. look at this mfer right here, hates kids, thinks that stone-age "buildings with clean uncluttered floors" is some kind of model that'll carry us into the future.

bumping shit thread

(Me)
brb, have to kill myself

Would've been funny if you also forgot to sage.

...

There are certain, small parts of C that are different in C++. It's been a while since I've written anything in C++, but I can remember there being some problems with some C habits that I still used. Again, it was mostly small, obscure stuff that I had to google around to see why g++ didn't like it.
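
One small, obscure difference of exactly that kind: character literals have type int in C but char in C++, so compiling this same file with gcc and with g++ prints different numbers.

```cpp
#include <stdio.h>

int main(void) {
    printf("%zu\n", sizeof('a'));   /* typically 4 under gcc, 1 under g++ */
    return 0;
}
```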

Is this bait? I can't even tell anymore. C++ is too hard to implement and nobody in the world actually understands it (they don't even understand C but at least they can somewhat get away with it).

C++ takes a lot longer to compile because C++ compilers are worse than C compilers, even in 2018. If C++ is used for embedded systems, then you have to use a C-like subset; you can't use the OOP stuff, because it uses too much memory. If you are looking for a simple language, then C++ is a bad choice because it's so huge and inconsistent. There is no point in using C++ for low-level stuff, as one of the main points of C++ is to hide low-level details.


no, as others said. but usually a C-like subset of C++ is used when C++ is used in embedded systems
if it is C89

As an actual embedded systems guy, C++ is a terrible replacement for C: everything useful in C++ would be disabled due to requiring far too complex a runtime or non-portable magic (e.g. no zero-cost exceptions), the standard library wouldn't be present due to it relying on template bloat to get the job done, and the absolute dogshit support for debugging C++ would make your life hell when you're faced with debugging with serial debuggers and crash dumps ala crashkernel. Have you ever tried to even get the right line at which a C++ program that uses templates crashed, let alone dump specific symbols and set breakpoints?

No, they're no slower, the speed difference is because C++ code depends on massive reams of template expansion and ends up asking the compiler to effectively compile far more code. Metaprogramming was a huge mistake and I was disappointed to see Rust repeat mistakes we recognized as mistakes 20 years ago leading to the same unsolvable compile performance problems.

I noticed Xilinx seems to be pushing C++ for their High level synthesis track, I do wonder if industry is taking to it though.

C++ takes longer to compile because the standard library headers liberally abuse template metaprogramming warts that are extremely slow to evaluate, and you include heaps of them with every compilation unit. Try compiling a C++ unit that doesn't use any standard headers. Blazing fast, just like C.
Bullshit. There is a freestanding subset of the language that does away with some features to get rid of the runtime, but the core language is left intact. The only real problem with OOP in embedded is overuse of dynamic memory allocation, and that can be fixed with custom allocators and careful choice of algorithms (see the sketch after this list).
When people use a C-like subset of C++ it's usually either:
a) they're old C hands who can't be bothered to learn better (very common in embedded), or:
b) the compiler available for the platform in question implements an ancient dialect of C++ without all the useful constructs, so there is not much point in going above the basics, and/or:
c) the compiler available for the platform has a half-assed implementation of C++ bolted on top of C constructs and produces awful code whenever C++-only features are used (again, common on embedded platforms).
There is nothing wrong with the language itself for embedded usage.
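
A minimal sketch of that custom-allocator fix (the type name and pool size are my own invention): back a standard container with a fixed static pool so nothing ever touches the heap. It's a bump allocator, so it ignores per-type realignment and never reclaims; fine for illustration, not production.

```cpp
#include <cstddef>
#include <cstdint>
#include <new>
#include <vector>

template <typename T, std::size_t PoolBytes = 1024>
struct PoolAllocator {
    using value_type = T;

    alignas(std::max_align_t) static inline std::uint8_t pool[PoolBytes];
    static inline std::size_t used = 0;

    PoolAllocator() = default;
    template <typename U, std::size_t N>
    PoolAllocator(const PoolAllocator<U, N>&) noexcept {}

    T* allocate(std::size_t n) {
        const std::size_t bytes = n * sizeof(T);
        if (used + bytes > PoolBytes) throw std::bad_alloc{};  // pool exhausted
        T* p = reinterpret_cast<T*>(pool + used);
        used += bytes;
        return p;
    }
    void deallocate(T*, std::size_t) noexcept {}  // bump allocator: freeing is a no-op

    friend bool operator==(const PoolAllocator&, const PoolAllocator&) noexcept { return true; }
    friend bool operator!=(const PoolAllocator&, const PoolAllocator&) noexcept { return false; }
};

int main() {
    std::vector<int, PoolAllocator<int>> v;  // grows inside the static pool
    for (int i = 0; i < 8; ++i) v.push_back(i);
}
```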


That's a bit of an overstatement. Sure, exceptions are out but they're cancer anyway, even on desktop, but plain OOP doesn't require any runtime.
...most of which is instantiated on use or inlined. You only get what you need/use compiled in. The freestanding subset of C++ specifically says which parts of it are available without a runtime, and it's most of the useful parts.


Metaprogramming is a very useful feature and it can be fast, when done the right way (hello LISP). However it needs to be designed into the language from the very start — not bolted on after accidentally noticing that some constructs evaluated at compile time are Turing-complete.
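
C++ itself demonstrates the "bolted on" half of this. Compile-time programming first arrived by accident through template recursion, and only later got a deliberately designed form in constexpr; a minimal side-by-side sketch:

```cpp
// Accidental Turing-completeness: compute at compile time via template recursion.
template <unsigned N> struct Fact   { static constexpr unsigned value = N * Fact<N - 1>::value; };
template <>           struct Fact<0> { static constexpr unsigned value = 1; };

// The designed-in form added later: an ordinary function usable at compile time.
constexpr unsigned fact(unsigned n) { return n ? n * fact(n - 1) : 1; }

static_assert(Fact<5>::value == 120 && fact(5) == 120, "both evaluate at compile time");

int main() {}
```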

A voice of reason among all the FUD

Attached: herbs.jpg (648x1052, 75.84K)

Any language that relies on lambdas and delegates instead of coroutines for pervasive multithreading is not long for this world. C++ header files destroy locality and force you to write pajeet code. C++ has so few structured multithreading constructs that it forces you to take unmaintainable shortcuts. No two "multithreaded" parts of your app will use the same lines of code. You are always so scared to write anything elaborate that you pray every API includes a context pointer so they can handle architecture instead of you.

All C++ scalability comes from minimalism instead of thorough design. You can't get to the future with minimalism. Scaling down is scaling for death.

Java was a practice run for C#. You can't create god-tier shit in Java either.

C++ threading libraries are heavy on blocking threads and polling at worst, and force app developers to jump between lily pads of unmaintainable callbacks at best. Open Source should be called Open Boilerplate because that's all it is: the love, study, and proliferation of worthless boilerplate and the languages that require it.

Even if C++ was as fast as C at run time (which it's not), it still takes longer to compile.

Even if you knew what you are talking about (you don't), you still haven't written anything in either C or C++.

...

it is, if you disable exception support

Pascal, Lisp dialects, and Ada - C is an ugly bondage and discipline language.

We have this "debate" every week.
I give everyone permission to shit-post ITT.

Attached: 801256c649abb6ce8b6f0f18727bef9abeb5bc7f486ce1bc9a0f511cdaf3954a.jpg (600x600, 56.71K)

No one in embedded wants OOP, though. It doesn't even make sense in embedded. Look at something like the skb struct in Linux - how would you translate that to OOP?
You're retarded. We'd love to have exceptions, the problem is they can never be relied on to be implemented efficiently or at all.
That level of bloat is rarely acceptable.
How's undergrad?

Are you scared of pointers and printf or something? Yes you are because you are a little bitch.

I'm so fucking tired of kids on Zig Forums too afraid or too dumb to use an address register of a CPU. If you can't even understand indirect addressing modes and arrays, why are you here? Shitpost on Zig Forums or cuckchan with all the other children.

a board full of brainlets

if OP thinks that pointers are too hard for humans to use reliably
truth is, they're hard like "clean uncluttered floors" is hard. even an idiot can do it and over time even an expert will fuck up. the solution is vigilance, not giving up and putting ball-pits over broken glass and AIDS needles
then it would be odd for OP to also be good at pointers, no?
I don't get this Zig Forums meme of everyone here being LARPers to the point of literally not having written anything, ever. Anyone you could ever single out could look up a tutorial and get back to you in a day, from a cold state of "programming? will that help me cheat at WoW?"

if it's so easy to avoid problems when using pointers then why are there so many security problems related to them?
checkmate faggotface

This is true, but C++ is not actually good for programming in general, even before getting into performance, footprint, portability, or memory usage. But feel free to read the latest spec to find a workaround whenever you run into a philosophical problem.

C++ is great for programming in general. You're just a brainlet.

C++ does a great job at fixing what isn't broken by bolting on several object-oriented nightmares onto it in order to satisfy several awful use cases.

The only reason C++ is as popular as it is, is because it's actually 5 or 6 languages interpreted by the same compiler - so it has the popularity of that many languages.

If anything, this is a prime evidence that people badly need Scheme, Haskell and Perl6.

Attached: 2812558A-6F58-11E5-9A38-7E733498AD2D.png (1000x706, 11.73K)

OOP is the worst for programming, and you're the brainlet for not learning a proper functional language.

wiki.haskell.org/Learning_Haskell

any beauty you ever saw in Haskell was a reflection of some ML feature.

Why? Just use Go. Go solves all of C's problems without fully embracing complicated software design paradigms.

All programming is for people. "The fundamental design flaw in Unix is the asinine belief that 'programs are written to be executed by computers rather than read by humans.'"

That's more UNIX weenie fearmongering. OOP does help with code reuse. C++ and Java suck because they copied brain damage from C and weenies blame OOP instead of the real problem, C.

Backwards compatibility is only good for "code reuse" for large corporations which are or can push around hardware companies. It doesn't help reduce bloat, which is the main reason for code reuse. If you can "reuse" code that is slow and bloated and increases the amount of code you have to write, it's better off not using it at all.


C is also bondage for your brain and your hardware. It creates a mentality where you can't imagine anything not sucking.


UNIX weenies blame pointers, strings, and arrays but the real problem is that the C implementations of them suck. Pointers, strings, and arrays shouldn't cause any security problems. All of these problems were solved in the 60s.


The only thing good about Go is that it's better than C, but so are C++, PHP, and JavaScript.

The worst disadvantage of C++ at the moment is political: accepting C++ as the standard OO language de facto tends to kill other existing languages, and stifle the development of a new generation of essentially better OO languages.

I've been waiting for an appropriate time to use this quote. This is as good a time as ever.

    Programs are written to be executed by computers rather than read by humans. This complicates program comprehension, which plays a major role in software maintenance. Literate programming is an approach to improve program understanding by regarding programs as works of literature. The authors present a tool that supports literate programming in C++, based on a hypertext system.
    - abstract to an article in the Jan 1992 issue of the Journal of Object-Oriented Programming

The fundamental design flaw in Unix is the asinine belief that "programs are written to be executed by computers rather than read by humans." [Now that statement may be true in the statistical sense in that it applies to most programs. But it is totally, absolutely wrong in the moral sense.]

That's why we have C -- a language designed to make every machine emulate a PDP-11. That's why we have a file system that forces every file to be viewed as a sequence of bytes (after all, that's what they are, right?). That's why "protocols" depend on byte-order.

They have never separated the program from the machine. It never entered their tiny, pocket-protectored with a calculator-hanging-from-the-belt mind.

I'm convinced you're taught all these useless languages in Jew school to keep you from being productive. Self-taught programmers don't end up like you. Meanwhile, almost everything substantial (not webshit) you use is written in C, C++, java, or .NET.

Missing the point, the point being that OOP is a tarpit between the programmer and the machine. OOP doesn't help the programmer talk to the machine, it does the opposite. And no, I'm not advocating for coding in assembly language, although doing so makes one appreciate the straightforwardness of C.
So that's where the "all programming is for people" truism comes from, reeking of socialist idealism. Programs are not written to be read by humans. Oh look at this highly optimized game executable, Imma open it in Notepad and read it! Oh look at this random JavaScript code, Imma read it. I may not be a developer, but Imma read it just the same!

Your true pursuit is that of high quality, non obfuscated, well documented code which is straightforward, elegant, and preferably not object oriented. Or it would be, if it wasn't for OOP apologetics wasting your brain space.

C++ and Java are not even object oriented languages, nor are the common coding styles in them OOP. Most people still program procedurally with them.

Wrong. It's all about maintainability. If you write obscure code, optimized for the compiler to generate good code for the machine you are on, that no human can understand, then that code is unmaintainable. That might be fine for some small utility functions, but once you reach the size of a large application, having everything be so obscure makes it a nightmare to work with.
We are not talking about executables, but rather the source code.

Having to pretend to be the processor 100% of the time wastes human time that could be spent thinking on a more abstract plane.

Load and use a C++ library from Python. I dare you.

boost.org/doc/libs/1_67_0/libs/python/doc/html/index.html
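
For reference, the hello-world from those linked Boost.Python docs is genuinely this short; compiled into hello_ext.so, Python can import it directly:

```cpp
#include <boost/python.hpp>

char const* greet() {
    return "hello from C++";
}

BOOST_PYTHON_MODULE(hello_ext) {
    boost::python::def("greet", greet);
}
```

Then from Python: import hello_ext; print(hello_ext.greet())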

We all make that mistake once.

>boost.org/doc/libs/1_67_0/boost/python/module_init.hpp

Every time.

to avoid other programmers using the boost library out of nowhere

you literally cannot build OpenGL/other graphical modules in the OpenCV libraries without using boost.

It's a stupid complaint. Do you complain that we use text-based serialization formats for interchange between languages?

The fact is the only way to write C++ programs that can interface with anything else is to limit yourself to a C interface. You're not really talking to a C++ program, you're talking to a C wrapper that has none of the benefits of C++.
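
What that C wrapper looks like in practice, as a minimal sketch (class and function names are my own invention): hide the C++ object behind an opaque handle and extern "C" functions, which is all another language's FFI can see.

```cpp
class Counter {
    int n_ = 0;
public:
    void add(int v) { n_ += v; }
    int value() const { return n_; }
};

extern "C" {
    // Opaque handle: callers only ever see a void*.
    void* counter_new(void)           { return new Counter; }
    void  counter_add(void* h, int v) { static_cast<Counter*>(h)->add(v); }
    int   counter_value(void* h)      { return static_cast<Counter*>(h)->value(); }
    void  counter_free(void* h)       { delete static_cast<Counter*>(h); }
}
```

Note the cost described above: no constructors, no RAII, no overloads, no exceptions across the boundary; none of the benefits of C++ survive the wrapper.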

Again, it's a stupid complaint. Do you complain you can't natively embed javascript in python? What about ML? Haskell? No. You reduce the interface to something everything can speak. C is very close to what the hardware speaks so interfaces between languages are almost always roughly C.

On a Lisp machine, you could mix any languages because of tagged memory. Multics and VMS do this with data descriptors. If you wanted to pass a data structure from Lisp to a Pascal or Fortran program and back, you could do that without needing a "foreign" language like C in the middle or converting to text. That's real compatibility. Today, it might be something like combining Python, JavaScript, and Ruby into one program.


C++ originated on UNIX, is based on C, was a preprocessor for C, and was made by the same company as C, but they can't even get their OS to work with it after decades of running C++ programs, and they have no interest in fixing the problem. That's how incompatible everything is on UNIX.


I certainly complain about that. In languages not designed for UNIX, including BASIC, you can pass an array or other data structure to another program. Multics and Lisp machines are designed around that kind of sharing. UNIX text serialization is a huge source of bloat and waste, not only in terms of cycles and memory, but also debugging and development time.

Generally when slamming C++ all you need to do is find something C does badly, point and jeer and say, "lookit how DUMB that is good golly!" and then pause, ask the audience, "can you imagine something worse?" and while they're all still shaking and puking and they finally quaver "nonono!" you point out that C++ in fact *DOES* something worse!! Gets 'em every time.

Let's do the exercise:

C's notion of how data objects are stored to disk is pretty primitive. This can be excused to a degree because C wasn't *designed* to solve that problem, it's a lower-level language. However, it's clear that any reasonable higher-level language needs to support some kind of not entirely brain-crippled persistent storage.

C++ is designed as an "object oriented programming language" but, if you look at how object persistence can be done..... uh.... It *CAN* be done, right? You mean you need to write some kludgy stuff that does a C-style write() and assigns a type flag and then does a *cast* or something down in its guts?? And if you cast an object to another type of object it forgets what it is? Waitamminit!!!! You mean every application that wants to do persistence has to kludge its own application-specific tricks together to make it work?

Yes. C++. At better compiler stores near you.

Then you're a LARPer. No one's interested in handling every language's special snowflake binary serialization format, or wrestling shitty languages like LISP into dealing with unsigned machine words.

Read the quote again - they are compatible, you just have to use the lower-level ABI. How do you export classes to C? You convert them to structs, then pass those to C. The same problem, BTW, exists between higher-level languages: JS uses UTF-16, Python uses ASCII/UTF-16/UTF-32 depending on the largest codepoint in a given string (include one emoji, quadruple the size of your string), and Ruby uses UTF-8. You can't just pass the memory between languages, you need to convert between formats first. The easiest way to do this is to convert to a standard conversion format, then convert back. This conversion format is called JSON.
The problem is that the data format is inextricably tied to the machine. If the language is tied to the machine this isn't a problem. But nowadays we care about P O R T A B I L I T Y, so the data has to be represented the same on every machine. MShit has a common data representation for .NET applications, which you can use between C#, F#, PS, VB, etc. (probably, I don't touch windows). On the JVM you can mix and match Java, Kotlin, Scala, etc. freely. Both of those do this by implementing a custom VM, which are highly compatible internally but incompatible externally. If you want Python on either VM, you need to use a custom implementation which is probably still on 2.6, because fuck the last decade.

Attached: ClipboardImage.png (75x135, 5.41K)

It's not just about data structures. It's also about calling conventions. Function calls have very specific and rigid requirements regarding how they're supposed to be called at the assembly level; they expect specific arguments in specific places and have cleanup requirements and other such business. With C, Ada, Fortran, Pascal and other such languages, it is simple to follow a simple standard ABI such as the SysV ABI and produce a polyglot system that interoperates correctly.

C++ doesn't care about any of that. Using C++ begets more C++ and the only code that will ever touch your C++ interfaces is more C++ code, preferably code compiled with the same compiler and the same version.

So your solution to this problem is to serialize data and communicate via sockets. That's seriously slow. It's slow as fuck. It's what a dynamic language would do.


Those are virtualized languages, moron. They don't pretend to be OS-loadable programs. They don't compile to ELF executables. They don't export symbols and entry points. They're up front about the fact they need their own virtual machine in order to run.

Who said you can't embed JS? It's pretty much designed for that. In fact, you can load up the JS, Lua, Scheme, Java and C# virtual machines into any program and talk to them and interoperate with virtualized programs.

Good luck performing a method call on a C++ object from another language. You can't even get at the required symbol because of how C++ implements overloading, member functions and other such features. The symbol mangling probably isn't consistent even within the same compiler family; I've seen people get burned when they tried to reuse old binaries compiled with old ass compilers. Not even C++ can reliably reuse its own code.
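
A sketch of the symbol problem with hypothetical names; the mangled spellings in the comments are what g++/clang (Itanium ABI) produce, while MSVC uses a completely different scheme, so there is no portable name to link against.

```cpp
namespace gfx {
    struct Vec { float x, y; };
    float length(const Vec& v);      // Itanium ABI symbol: _ZN3gfx6lengthERKNS_3VecE
    float length(float x, float y);  // the overload: _ZN3gfx6lengthEff
}

// C linkage opts out of mangling entirely: exported as plain "vec_length".
extern "C" float vec_length(float x, float y);
```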

Oh fuck off with that bullshit.


It's just a high-level instruction set implemented by a virtual machine. Obviously, if your language compiles to Java bytecode, it becomes possible to directly interoperate with other languages that also generate the same code and use the same high-level JVM data structures.

If C++ was implemented on top of the JVM, it would generate code for its own implementation of classes with ridiculously fucked up symbols that the other languages simply can't interface with; the C++ code would be able to call Java code, but the reverse isn't true. C++ projects are parasites that don't even try to play nice with anything but other C++ projects.

What you're talking about here isn't an intrinsic C++ problem, i.e. a problem with the language itself, but rather an issue of standardisation between compilers. The standards committee elected to leave many details of compiler implementations flexible, as this has typically been "the C++ way". This is not without cost, and has led to the symbol inconsistency nightmare you've described. But this could be easily corrected in a future revision of the standard, if so desired.

It can't be corrected as some platforms lack the hardware required for some of C++'s features.

Such as?

Ask reddit if you need spoonfed.

In other words you don't know of any, and were just being flippant. My reason for pressing you is that it is likely that any C++ feature dependent on the presence, or lack of, particular hardware would have little to do with the ability of a compiler to generate symbols in a standardised manner. The problems are orthogonal; and if not, the interaction would be very minor, limited to those particular features. Thus it would be possible to standardise symbol generation across compilers if the committee wanted to, whilst keeping those hardware dependent bits platform specific.

Communism was never tried before. Real communism. Pure communism. Forgive me, I'm just trying to be funny because your OOP fanboyism invites ridicule.

So tell me, what are the "real" object oriented languages? Preferably, languages which you actually use? Which you write code in, for other humans to read, because they'll magically understand your obscure language of choice by virtue of them being human and your style being OO?

But they're not pretending to be the processor or they'd be writing machine code directly, or at least assembly code. Your mistake is thinking that you're doing yourself a favor by alienating yourself from the machine. Actually who knows, your fantasy of "thinking on a more abstract plane" may yet come to fruition with the advancement of AI... future computers may be programmable by voice commands and facial expressions. Would that be abstract enough for you? Or is it not the "correct" plane?

Kill yourself, retard. Make detailed posts or fuck off.


The usual target architectures and ABIs simply don't support C++ in any reasonable form. Linux loads ELF executables, and that format is made out of various sections and tables. The symbol table is just a C string to address map that associates an exported symbol to the relevant data or code. It's very simple and languages that export functions this way integrate very well with the rest of the system.

In ELF there's no notion of exported classes, per-class function name spaces, virtual method tables, nothing. Anything that isn't a name-to-value map is gonna be implemented as an abstraction on top of ELF. The C++ compiler implements this by encoding this information in-band with the ELF data. There's no ELF data structure that allows disambiguation between A::func and B::func, so it just exports two different symbols with the namespace information encoded into them. Same thing happens with overloads: the parameter information is encoded into the symbol. Obviously, this means other code can't simply load up your binary and start referring to symbols within it -- they'd have to know the compiler's metadata scheme in order to interface with it.

Linux uses the SysV ABI for the architecture in use, so when you load some binary you know how to call its functions. When you load a C++ binary, you don't know jack shit. You can't even get at the functions themselves, and even if you manage that, how the hell do you call them? What's the calling convention? How do things like 'this' pointers get passed? There just isn't a C++ equivalent of SysV. Using C++ libraries from other programs is about as sane as trying to talk to the Go runtime from another program. It's so ridiculous, compilers have broken compatibility even with themselves.
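
Here is what "you can't even get at the functions" looks like at the dlopen() level, as a sketch (the library and symbols are the hypothetical ones from the mangling example above):

```cpp
#include <dlfcn.h>
#include <cstdio>

int main() {
    void* lib = dlopen("./libgfx.so", RTLD_NOW);  // hypothetical C++ library
    if (!lib) return 1;

    // A C symbol has a stable, documented name:
    auto c_fn = reinterpret_cast<float (*)(float, float)>(dlsym(lib, "vec_length"));

    // The C++ overload must be named by one compiler's mangling, hardcoded;
    // it silently breaks under MSVC or a changed ABI version:
    auto cpp_fn = reinterpret_cast<float (*)(float, float)>(dlsym(lib, "_ZN3gfx6lengthEff"));

    if (c_fn) std::printf("%f\n", c_fn(3.0f, 4.0f));
    (void)cpp_fn;
    dlclose(lib);
    return 0;
}
```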

Two off the top of my head are smalltalk and eo.
I don't use OO languages. I think it's an inferior way to think about computing.

Those are slow input methods, that makes it tough to communicate your ideas to the computer. Those ways also seem prone to not thinking about edge cases / other complications that your design might need.

That's since before C++. (Checked Wikipedia, 1972 with a stable release 38 years ago.) Thanks for the history lesson.
Never heard of it, ever. (Checked Wiki, nothing there but Esperanto...)

OK. What languages do you use that aren't "an inferior way to think about computing"?

I just knew someone was going to say that. How about direct access to the programmer's brain waves?
That's the point, being high-level means relying on the machine to do the low-level things on its own. The higher level you are, the less control you have. C is the best balance between low-level and high-level hence its longevity, but that's just my opinion.

Not that user, but "eo" was probably a typo: en.wikipedia.org/wiki/Io_(programming_language)

I think standardising the metadata scheme for symbol export would go a long way toward resolving these compatibility issues. Rather than the ad-hoc system in place today where every compiler just does as it likes, a compiler would be required to generate them in a well-formed manner in order to be considered standards-compliant. You'd have to have glue to map it into each binary target (ELF, Windows, etc.), but that's mostly a job of translation; the heavy lifting would have already been done in deciding on a standard naming scheme in the first place.
True, and this is ideally where a standardisation effort should be focused, imo - at least in establishing common basics like calling convention and vtable/virtual dispatch. I would be surprised if most compilers hadn't converged on a common implementation these days anyway, just copying best practices. As for a full C++ ABI, though, there are good arguments against passing "objects" across application boundaries, most notably validation: the receiver of an object can never trust an object to be well-formed, despite its assurance that it validates on construct. You simply don't have these issues when working with built-ins and PODs, because with plain old data and no implied "intelligence" there is no false assurance of correctness. There's a lot to be said for the simplicity of the C ABI for that reason, and I don't view full support for objects to be that much of a win.
Yup. It's a complete mess at the moment, but I wouldn't rule out the possibility of standardisation. The language has changed enormously in the last decade, particularly in how it is governed, so it's not completely outside the realm of possibility. And one of the primary arguments for permitting complete flexibility in compiler implementation - competition in compiler development space - has been more-or-less nullified these days with the availability of mature, free-software compilers.

unsurprisingly this thread is full of people who have no idea what they're talking about
no one on Zig Forums has programmed in any professional capacity but everyone here loves to spread FUD

Bullshit. Having to use assembly or a foreign language like C in the middle means it's not compatible. If they were compatible, you would use the same native formats.

Lisp machines have standard classes, structures, numbers, strings, and arrays. You don't have to serialize or convert anything at all. On a Lisp machine, you really do "just pass the memory between languages" because there's one address space.

This isn't true. A binary data format is no more "inextricably tied to the machine" than a text data format.


That's a bullshit comparison. OOP has been tried before and it was shown to work a long time ago. That's why everyone wanted to add objects and classes to their language, but a lot of them did it wrong and it doesn't work as well. It's more like claiming airplanes don't work because someone built one out of sticks and it doesn't fly.


These are UNIX/ELF design mistakes. ELF is incompatible with anything but C and something with types like C, which sucks. Everything else integrates very poorly with the rest of the system.

Lisp machines support an object system and packages, so this is a UNIX flaw. It was a flaw in the 80s too. Other systems could do it correctly.


This is true in a sense, but higher level gives you more control over the implementation. On a Lisp machine, numbers are usually automatically promoted to bignums and replaced by a pointer to the bignum because everything is a tagged word. UNIX's anti-user philosophy leads to distrust of OS developers, so users are brainwashed to believe anything that makes things better has to have an ulterior motive.

C is a horrible balance. Its longevity comes from its flaws. C still uses null-terminated strings because a better string data type would not be compatible. C arrays are still broken because they decay to pointers so fixing them would break every C program. C's longevity comes from extreme incompatibility, where you are not even allowed to pass an array from one function to another in the same program. You have to use a pointer to the first element instead.
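
The array complaint in miniature: the size is part of the type inside the defining function, then silently gone at the call boundary.

```cpp
#include <cstdio>

void takes_array(int a[10]) {                    // the 10 is decoration; a is really int*
    std::printf("inside:  %zu\n", sizeof(a));    // sizeof(int*), e.g. 8
}

int main() {
    int xs[10] = {0};
    std::printf("outside: %zu\n", sizeof(xs));   // 40: the full array, known here
    takes_array(xs);                             // decays to &xs[0]
    return 0;
}
```

(C++, for what it's worth, can sidestep this with a reference to array, void f(int (&a)[10]), which preserves the size; C itself never got that fix.)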

The crowning glory of it all is C++, the birthplace of function prototypes. Function prototypes must have been implemented the way they were because it was easier for the compiler writer to get all the parameters before starting to parse the function, rather than having to figure them out in the ensuing lines. When a language is bent, twisted, and malformed to save the compiler writer a bit of effort, you've got to wonder why the guy's writing compilers in the first place.

Now, C++ could really use a decent compilation environment, with link time name resolution. But NnnnnoooooOOOOOOO, rather than "do it right" some late night caffeine and taco fever nightmare scheme was concocted whereby symbols get transformed into DES-encrypted dog vomit and back. C++ is heading for standardization. All of this happened because nobody wanted to put a few hooks for type checking in their linker and object file format. I think there's a lesson in all this, but I'm damned if I can see it.

'OO' is a bunch of stuff. You'll benefit from some of that stuff compared to someone writing FORTRAN in an ancient manner.
Today, other people are using the good stuff and the difference is now that you're spinning your wheels or committing self-harm with the bad stuff.

yeah kid, I had this experience too, as a 12-year-old watching US Senators debate each other on CSPAN2.
Since it was a back-and-forth, and since I was persuaded by each speaker in turn, I realized pretty quickly that I was too clueless to evaluate the arguments I was hearing.
Another thing you'll eventually notice is that snake-oil salesmen are much more versed in the virtues of snake-oil than are its detractors in the vices of snake-oil. The side that eschews the stuff naturally comes to have less experience with it and be caught in 'gotchas' like its actual color, whether such-and-such snake's oil is often used, etc.

My anti-OOP sentiment I draw in part from personal experience with C++ but also from reading "OOP Oversold" (may it rest in peace) and Ben Lynn's C craft*.

With that out of the way, I'm skeptical of your claim that "[OOP] was shown to work a long time ago" when, apparently, languages such as C++ and Java "did [OOP] wrong" despite being the most successful.

As far as I'm concerned, naively, OOP is a combination of three things:

1. Put the noun before the verb: "data.func()" instead of "func(data)"
2. Have internal (hidden) state associated with a group of functions
3. Juggle with relationships between classes, hoping it makes the source code easier to understand and to reuse (the cargo cult of design patterns)

Now explain to me how OOP can help me when the machine still thinks in terms of code working with data. To rephrase: what's the benefit in distancing myself from the "natural" way the machine works?
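
Points 1 and 2 in miniature, with hypothetical names; either spelling compiles down to the same thing, a function taking a pointer to two doubles:

```cpp
#include <cmath>

// Plain function over plain data:
struct Point { double x, y; };
double norm(const Point& p) { return std::sqrt(p.x * p.x + p.y * p.y); }

// The OOP spelling: noun before verb, state hidden behind methods.
class PointObj {
    double x_, y_;                  // point 2: internal (hidden) state
public:
    PointObj(double x, double y) : x_(x), y_(y) {}
    double norm() const {           // point 1: data.func() instead of func(data)
        return std::sqrt(x_ * x_ + y_ * y_);
    }
};

int main() {
    Point    a{3, 4};
    PointObj b{3, 4};
    return static_cast<int>(norm(a)) - static_cast<int>(b.norm());  // both 5
}
```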

* crypto.stanford.edu/~blynn/c/