What are the advantages and shortcomings of JSON?

I'm writing a networked program which can send and store data either in XML or JSON. I've already mostly finished it and it uses JSON, but it's not too late to change. What are some good reasons to use or not use it?

Pic somewhat related.

Attached: modern programmers.jpg (500x618, 91.3K)

Other urls found in this thread:

langsec.org
langsec.org/papers/Bratus.pdf
langsec.org/insecurity-theory-28c3.pdf
seriot.ch/parsing_json.php
msgpack.org/
guides.gradle.org/performance/

json is used everywhere by everything, you're going to be forced to use it to interact with something at some point.
it works seamlessly with python. C, on the other hand, is "fun" to deal with, but there's a bunch of C libraries to read it. I just got done wiring up cJSON, which was fun, but once you get the hang of it it's not that complicated.

i've never even bothered with xml, nothing i've interacted with required it over json, and json seems to naturally fit the data with no text bloat. i don't know why you would really want to represent data in what's basically html with custom tags.
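
To put a number on the "text bloat" bit, here's a rough sketch of the same record in both formats, stdlib only (python's json and xml.etree; the field names are made up for illustration):

import json
import xml.etree.ElementTree as ET

record = {"name": "anon", "id": 42, "tags": ["net", "storage"]}

# JSON: one call, and the output maps 1:1 onto the dict
print(json.dumps(record))
# {"name": "anon", "id": 42, "tags": ["net", "storage"]}

# XML: every field becomes an element built by hand, and the output
# repeats every name as an opening and a closing tag
root = ET.Element("record")
ET.SubElement(root, "name").text = record["name"]
ET.SubElement(root, "id").text = str(record["id"])
tags = ET.SubElement(root, "tags")
for t in record["tags"]:
    ET.SubElement(tags, "tag").text = t
print(ET.tostring(root, encoding="unicode"))
# <record><name>anon</name><id>42</id><tags><tag>net</tag>...</tags></record>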

fast, but don't expect to use it in security-oriented environments

Care to elaborate? How can it impact security?

are you 5

JSON isn't javascript.

JSON is not executable, and therefore cannot directly pose a security risk. Furthermore I'm not using jabbascript as my programming language.
Yes, I did state "jabbascript inspired" as a disadvantage, but that was an autistic joke and not intended to be serious.

If you use XML, never make anyone interact with it directly. The syntax is far too painful to write by hand.

that is retarded

Are YOU retarded?

you have a childs understanding of security

why claim a security issue and post zero evidence there's a security issue? how about you post some proof. right now it sounds like the only thing you know about json is from a google search to figure out what it stands for.

You have no argument.

lol

lol

...

What if I told YOU...
Your computer is compromised. Your mouse and keyboard beam data directly to the NSA. The NSA has developed a microscopic, zero-waste-heat nuclear power source sufficient to power a long range high penetration wireless transmitter inside every single transistor, so they can continue to broadcast microphone and camera recordings to the NSA even if they are switched off. The only way to escape this is to throw all electronic devices out of the window, stop posting on 8ch and spare everyone from your retarded babbling.

Attached: hurr durr durr durr.png (274x321, 26.13K)

langsec.org

All you need is a parser that's proven to be correct and bug-free.

Attached: 7853475894378503.jpg (515x518, 59.59K)

on it

Use CBOR then, you autistic retard.

great

It's an IETF standard, moron.
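
For anyone who actually wants to try it: a minimal sketch of CBOR in python, assuming the third-party cbor2 package is what you'd reach for (pip install cbor2):

import cbor2

data = {"name": "anon", "id": 42, "tags": ["net", "storage"]}

blob = cbor2.dumps(data)       # bytes, not text
print(len(blob))               # smaller than the equivalent JSON string
print(cbor2.loads(blob))       # round-trips back to the same dict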

wow what a fucking sea change there columbus

It's a simple data interchange format, not a fucking language. What do you suggest we use instead, fucking XML?

Attached: 0561ca6e14ee9012e6d1ad7f1e407761d2fc5248a218963985c5c0360ecee352.jpg (720x858, 118.94K)

Let's just stop replying to him.

Protobuf is more efficient.

XML is dead, there is no reason to use it anymore, unless you are facing a particular edge case where you need XML, and if you were facing that case you would already know. So, use JSON.

Why has no one in this thread mentioned namespaces? Does JSON have an extension for them?

Plus, I may be an XML nerd, but I think XML entities may have a handful of valid uses as well.

is there data that can be expressed in xml and not in json and vice versa?

As I wrote in the OP, I've already implemented my stuff using JSON. So there's no pressing reason to change, I was just curious whether XML would offer any advantages. (Also I'm autistic about getting things perfect, so there's that.)

Well, this depends on what you mean by "expression." You can add conventions like comment:'foo' or namespace:'bar' to your personal JSON format, but to give such user-defined extensions any meaning outside your own code, you would effectively have to fork JSON as a whole, so that those extensions have defined behaviors everywhere (being ignored, raising errors, and so on).

But, to reply more straightforwardly, if you're asking about the subset of "hierarchical expression of text structures," no, I don't think so, unless either format imposes some obscure limits on field lengths or character encodings.
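
To make the "personal JSON format" point concrete, this is roughly what an ad-hoc comment/namespace convention ends up looking like; the underscore-prefix rule here is invented for the sketch, nothing standard:

import json

raw = '{"_comment": "keys starting with _ are metadata", "_ns": "myapp", "port": 8080}'
data = json.loads(raw)

# the "extension" only exists in your own loader: you strip metadata keys yourself
meta = {k: v for k, v in data.items() if k.startswith("_")}
payload = {k: v for k, v in data.items() if not k.startswith("_")}

print(payload)   # {'port': 8080}
# any other JSON consumer just sees three ordinary keys -- which is the point above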

XML assumes some sort of schema/validation, so if you want that, it might be a valid reason to use XML. I'm sure there are tools for validation of JSON too (JSON Schema is the usual one), but I don't think it's as common.
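
On the JSON side that usually means JSON Schema; a rough sketch with the third-party jsonschema package (assuming it's installed; the schema itself is made up):

import json
from jsonschema import validate, ValidationError

schema = {
    "type": "object",
    "properties": {
        "name": {"type": "string"},
        "id": {"type": "integer"},
    },
    "required": ["name", "id"],
}

doc = json.loads('{"name": "anon", "id": "not-a-number"}')

try:
    validate(instance=doc, schema=schema)   # raises when the document doesn't match
except ValidationError as e:
    print("rejected:", e.message)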

I looked at a couple of those papers because I thought maybe they would be about solving actual problems, or at least copying how Multics and VMS did it, but it's more UNIX and C bullshit. Multics in the 60s was better than UNIX in 2018.

langsec.org/papers/Bratus.pdf
They're both bugs caused by C bullshit that wouldn't happen if you used a different language.

In the time it took to turn bugs into a “weird assembly” language, they could have done a whole bunch of things, like building a Lisp machine.

Do you know what's funny about all this? That “weird assembly” language is probably a better programming language than C.

If there's one thing which truly pisses me off, it is the attempt to pretend that there is anything vaguely "academic" about this stuff. I mean, can you think of anything closer to hell on earth than a "conference" full of unix geeks presenting their oh-so-rigourous "papers" on, say, "SMURFY: An automatic cron-driven fsck-daemon"? I don't see how being "professional" can help anything; anybody with a vaguely professional (ie non-twinkie-addled) attitude to producing robust software knows the emperor has no clothes. The problem is a generation of swine -- both programmers and marketeers -- whose comparative view of unix comes from the vale of MS-DOS and who are particularly susceptible to the superficial dogma of the unix cult. (They actually rather remind me of typical hyper-reactionary Soviet emigres.) These people are seemingly -incapable- of even believing that not only is better possible, but that better could have once existed in the world before being driven out by worse. Well, perhaps they acknowledge that there might be room for some incidental clean-ups, but nothing that the boys at Bell Labs or Sun aren't about to deal with using C++ or Plan-9, or, alternately, that the sacred Founding Fathers hadn't expressed more perfectly in the original V7 writ (if only we paid more heed to the true, original strains of the unix creed!)

In particular, I would like to see such an article separate, as much as possible, the fundamental design flaws of Unix from the more incidental implementation bugs.

My perspective on this matter, and my "reading" of the material which is the subject of this list, is that the two are inseparable. The "fundamental design flaw" of unix is an -attitude-, an attitude that says that 70% is good enough, that robustness is no virtue, that millions of users and programmers should be hostage to the convenience or laziness of a cadre of "systems programmers", that one's time should be valued at nothing and that one's knowledge should be regarded as provisional at best and expendable at a moment's notice. My view is that flaming about some cretin using a fixed-sized buffer in some program like "uniq" says just as much about unix as pointing out that this operating system of the future has a process scheduler out of the dark ages or a least-common-denominator filesystem (or IPCs or system calls or anything else, it -doesn't matter-!) The incidental -is- fundamental in dissecting unix, much as it is in any close (say, literary or historical) reading. Patterns of improbity and venality and outright failure are revealed to us through any examination of the minutiae of any implementation, especially when we remember that one cornerstone of unix pietism is that any task is really no more than the sum of its individual parts. (Puny tools for puny users.)

I hate LARPers

Bravo.

Attached: 1444601700261.png (528x297, 258.07K)

Look I think Unix is flawed as well but what the fuck are you even talking about

Reminder that you could solve all of your problems by creating a new LispOS yourself.

I would like to thank you for your posts. You've made me take a step back and actually see that C and Unix were not as good as I thought they were. Thanks to your posts I tried out Common Lisp (using ecl) and actually enjoy the language quite a bit. It just really clicks with me and is nice to work in.
Once again thanks for your posting.

I'm talking about how much C and UNIX suck compared to other languages and operating systems. If a malloc has so many bugs that they can form their own assembly language, it sucks. Why does C suck so badly with its array pointer bullshit that you can't even add bounds checking?

Here's another example.
langsec.org/insecurity-theory-28c3.pdf

Software sucks because UNIX weenies don't care and they ignore all that stuff. A lot of security and error handling techniques were known in the 60s when Multics was created, but that didn't stop the UNIX programmers from ignoring them.

What they're doing wrong is using C. C sucking is a law of nature and the problem of C sucking is not solvable, but that doesn't mean all software has to suck forever, only C software. C sucks so badly that people completely give up on trying to make software not suck. Some languages have the opposite effect.


That's what I'm trying to accomplish. I post these quotes because most of them are still valid today. The UNIX weenie mentality hasn't changed at all. Common Lisp does a lot of things well and is a good general purpose language. An OS written in it would eliminate an enormous amount of bloat and waste that is part of every program now.

Yes, and they've succeeded. Hordes of grumpy C hackers are complaining about C++ because it's too close to the right thing. Sometimes the world can be a frightening place.

I've been wondering about this. I fantasize sometimes about building better programming environments. It seems pretty clear that to be commercially viable at this point you'd have to start with C or C++. A painful idea, but. What really worries me is the impression that C hackers might actively avoid anything that would raise their productivity. I don't quite understand this. My best guess is that it's sort of another manifestation of the ``simple implementation over all other considerations'' philosophy. Namely, u-weenies have a fixed idea about how much they should have to know in order to program: the amount they know about C and unix. Any additional power would come at the cost of having to learn something new. And they aren't willing to make that investment in order to get greater productivity later. This certainly seems to be a lot of the resistance to lisp machines. ``But it's got *all* *those* *manuals*!'' Yeah, but once you know that stuff you can program ten times as fast. (Literally, I should think. I wish people would do studies to quantify these things.) If you think of a programming system as a long-term investment, it's worth spending 30% of your time for a couple years learning new stuff if it's going to give you an n-fold speed up later.

kill yourself, rust shill

yes. yes.
C could be good for things like making a Quake game for hundred-MHz computers, but it's horrible for developing operating systems, where you need security and reliability.
Why did UNIX idiots choose and keep using C instead of something like Ada? Yes, UNIX is older than Ada, but they kept using C instead of switching to something better. And if the OS turned out to be slow, they could rewrite *some* parts in something faster and audit those parts very carefully.
but unix and open source idiots cannot into DESIGN and PLANNING. that's why they will always lose. linux is a big pile of poop that random devs shit into.

There are none.

I think the biggest advantage of JSON is that it's human-readable. You can write a JSON file by hand in a text editor without problems, or you can take generated JSON and see what's inside.

The downside is that the JSON grammar actually has holes and it's never really clear which version of JSON one means.
seriot.ch/parsing_json.php
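
One example of those holes: duplicate keys. The RFC only says object names "SHOULD" be unique, so parsers disagree about what to do; python's stdlib parser, for instance, silently keeps the last value:

import json

doc = '{"role": "user", "role": "admin"}'
print(json.loads(doc))   # {'role': 'admin'} -- the last duplicate quietly wins

# other parsers keep the first value or reject the document outright,
# which is exactly the kind of divergence the seriot.ch article catalogues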

Being human-readable means that it's harder and slower to parse. If you don't need your data to be human-readable (e.g. when used for IPC) that's a pure downside. A good alternative would be MessagePack, it's similar to JSON, but binary, so it's smaller, simpler, and faster to parse.
msgpack.org/
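
A rough size comparison, assuming the pip-installable msgpack package:

import json
import msgpack

data = {"name": "anon", "id": 42, "values": [1, 2, 3, 4, 5]}

as_json = json.dumps(data).encode("utf-8")
as_msgpack = msgpack.packb(data)

print(len(as_json), len(as_msgpack))   # the msgpack blob comes out smaller
print(msgpack.unpackb(as_msgpack))     # and round-trips to the same structure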

The idea behind MessagePack is that you have a stream of binary data, and the first byte of each datum tells you immediately how to interpret the data that follows. For example, when reading an array, the first byte tells you what kind of array header it is; from that you know how many bytes to read to get the array's length, so you can allocate the necessary amount of memory up front and then read the actual contents. By contrast, in JSON you would have to iterate through the list until you reach the closing bracket, counting along the way, to find out how long the array is.
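
You can see that length prefix directly in the encoded bytes (same msgpack package as the sketch above):

import msgpack

blob = msgpack.packb([1, 2, 3])
print(blob.hex())   # '93010203'

# 0x93 is a "fixarray" header: the high bits mean "array", the low four bits mean
# 3 elements, so a reader knows the length before touching the elements at all.
# The JSON text "[1,2,3]" only reveals its length at the closing bracket.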


You could have at least picked a Scheme, but Common Lisp is a Frankenstein monster that begs to be put out of its misery. Everything about the design of the language makes me rage. There is a beautiful language hidden somewhere under all that MIT autism.

Give one fucking example of a security vulnerability embedded in JSON, or never talk to me and my board again.

I wish. The fucking Java and Maven crowd won't ever let it die.

Attached: 1002.jpg (192x192, 10.23K)

Ada is a terrible language and no-one should use it.

The problem with Maven isn't XML, and with the best alternative being Gradle, you're just swapping one declarative format for another (a Groovy or Kotlin DSL instead of XML).

I hate gradle. It just seems to be REALLY slow to do anything.

The traditional (stupid) way of parsing JSON is to eval() it in a Javascript interpreter. This is the accepted, canonical (stupid) way to do it. When people continue to do this, it opens up tons of security holes. Therefore, even stupider people will ban JSON rather than firing the incompetents under their care.

I thought that was JSON.parse or something.

No, the canonical way to parse JSON in javascript is to fucking JSON.parse() it. You have no idea what you're talking about, retard.
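
(The same footgun exists outside javascript, for what it's worth. In python terms the difference looks like this:)

import json

untrusted = "__import__('os').system('echo pwned')"

# a real JSON parser can only ever produce data, so this is just rejected
try:
    json.loads(untrusted)
except json.JSONDecodeError:
    print("not valid JSON, nothing executed")

# eval(untrusted)   # this would actually run the shell command -- that's the eval() hole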

A developer wrote some pretty deep and unnecessary hacks in a "Gradle Commons" project that the other projects inherit where I work.
Now every project is stuck, pinned on an ancient Gradle version for all time because nobody wants to go unravel that shit.
Still beats npm.

You can't really compare npm and gradle as they differ in usage. You have to rerun gradle for every change, which means you have to wait an extremely long time for gradle to start up before it can compile and you can test out your changes. With npm, at least, you only have to run it when you're pulling down new packages.

There is a Gradle daemon that's supposed to keep a warm JVM around and cut that startup cost, so maybe I'm just basing this off an experience I had years ago, but I find it takes forever to do a regular compile.

guides.gradle.org/performance/