Are markup languages actual programming languages?

What makes a normal programming language better than a mark-up language? Do you consider mark-up languages such as HTML and XML to be true programming languages? And what about lightweight markup languages such as WikiText and Markdown?

Attached: 1ec2c72ea8ea30c6674c1c023bd5cd8855b65ef593830ef9c940a34dbcb68aa2.png (1277x1080, 1.68M)

It's just a matter of intended use. Some markups, like LaTeX, can be used for general-purpose computation. Most, like HTML (no JS or CSS, as neither of those is markup), can only be used directly to represent formatting rather than general-purpose computation. They are built for the primary purpose of displaying text, and being able to do otherwise with them isn't really enough to stop considering them markup languages, just potentially Turing-complete markup languages.
These aren't scientific labels. It's general consensus. You considering a markup language a programming language doesn't change what you can and can't do with it.

nah, if anything i would argue they're just formatted data that can be parsed by a computer program

babbage, confusion of ideas that would prompt such a question, etc.



Attached: DeclarativeProgrammingClassification.png (588x307, 21.62K)

is it turing complete?

It depends, but generally no.
Not really.
Definitely not.

The primary purpose of a markup language is to mark up the content. This means for example that I need to be able to clearly state that one piece of the text file is a heading, another piece a paragraph, another piece a list item, and so on. In that regard a markup language is not a programming language: it describes content, not logic.

With that said, some languages do include the ability to also do logic to some extent. For example, TeX is designed to mark up text for typesetting, but you can also use TeX commands to control the TeX process itself. This is where the lines get blurry. You could also use XML to control the flow of a program and implement actual logic in it (I'm not saying that's a good idea). Markdown is an example of a pure markup language: all you can do is describe the content, but you do not get any control over the interpreting process.
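
As a minimal sketch of what that XML-as-logic idea looks like (and why it's questionable), here is a toy interpreter in Python using the standard xml.etree module. The element names <repeat> and <print> and the whole "program" format are invented for this illustration; nothing standard defines them.

```python
import xml.etree.ElementTree as ET

# A toy "program" encoded as XML. The tags <repeat> and <print>
# are made up for this sketch.
program = """
<program>
  <repeat times="3">
    <print text="hello"/>
  </repeat>
</program>
"""

def run(node, out):
    # Dispatch on the element name, treating markup as control flow.
    if node.tag == "repeat":
        for _ in range(int(node.attrib["times"])):
            for child in node:
                run(child, out)
    elif node.tag == "print":
        out.append(node.attrib["text"])
    else:  # container elements like <program> just run their children
        for child in node:
            run(child, out)
    return out

result = run(ET.fromstring(program), [])
```

The markup here no longer describes a document at all; it encodes control flow, which is exactly where the lines get blurry.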

Oops, I misread the first question.

They serve totally different purposes, so the question doesn't really make sense. A programming language is for implementing program logic, for making things happen. A markup language is for when you already have a thing (i.e. some text), but you need a way of expressing it in a machine-readable way. The machine can then take your markup and process it into another thing. For example, if you have a food recipe in mind you write it down as HTML and a web browser then transforms it into a nice graphical representation.

The downside of pure markup languages is that due to the lack of logic you cannot automate anything. Let's say you often want to repeat some piece of text:

< All work and no play makes Jack a dull boy
< All work and no play makes Jack a dull boy
< All work and no play makes Jack a dull boy
< All work and no play makes Jack a dull boy
< All work and no play makes Jack a dull boy

I had to write that out explicitly five times. What if I made a typo? I would have to correct each line individually. What if I want it written 12 times, but I don't know how many times I have already written it? I would first have to count the existing lines, subtract that number from the target number, and finally copy-paste one line that many times. And what if every line had a little variation (like printing the line number)? Then even copy-paste fails me.

If the markup language had logic capabilities I could embed a loop like
for i in 1 .. 5: printf("%i) All work and no play makes Jack a dull boy\n", i);
and have the programming language generate all those lines for me. However, programming languages bring with them complexity and security issues; now you have to be careful that your text document does not contain malware.

A middle ground could be to use a programming language to generate a markup document. The person doing the generation still has to be careful that the program does not contain malware, but the generated output (in a markup language) can then be given out to people without concern. Static site generators for example generate static HTML pages that way. If you want to play around with generation you could give the Jinja Python library a try:

You write a template, which is a markup file with little placeholders inserted, then run the generator to have those placeholders in the template filled in. The Lisp languages can do something similar and have it built right into the language.
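
A minimal sketch of that template-filling workflow, using the third-party Jinja2 library (assuming it is installed via pip install jinja2; the template text and the count variable are made up for this example):

```python
from jinja2 import Template  # third-party: pip install jinja2

# A template with a loop placeholder; the generator fills it in.
tmpl = Template(
    "{% for i in range(1, count + 1) %}"
    "{{ i }}) All work and no play makes Jack a dull boy\n"
    "{% endfor %}"
)

page = tmpl.render(count=5)
```

Fixing a typo now means editing one template line, and changing the repetition count means changing one number.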

Mark-up langs are for documents.
Documents store formatted text. Documents get parsed and displayed by a program and are as such arbitrary data AND NOT A PROGRAM!
Images, text, and videos aren't programs.
Mark-up languages aren't programming languages but mark-up languages. Is that too hard to understand?

Here's a rough tree, so you get it:

natural languages: German, English, Italian, ...
programming languages:
    compiled programming languages: C, C++, etc.
    interpreted/JIT (may or may not be compiled) languages: Java, Lua, Python, C#, JS, ...
text storage languages:
    INI files (simple name-value pairs)
    CSS (stores name-value pairs, but with scoping and cascading; good for document formatting)
>Some markups like LaTeX can be used for general purpose computations.
    LaTeX (yeah you fuckers, storing a math formula is what a document does)
    markup languages (store documents or settings in human-readable form): XML, HTML, MathML, ...

Some degree of programming can in fact be done with HTML, but most of it is scripting.

Are lainniggers actual human beings?
I think not.

coldfusion. *shudders*

Attached: 75EADD25-F05E-4F0B-9C6B-78A32267D66A.jpeg (1008x720, 91.52K)

The escape sequences in forum text markup should be easy to remember, capable but uncomplicated, extensible, and most importantly, difficult to use accidentally.
This makes SGML/XML (probably with the familiar HTML tags) best for forum and BBS usage. But for a text database, it's more important to use fewer characters and reduce overhead; readability doesn't really matter. For those, if a comma-separated file isn't enough, then something terrible for human usage like Markdown should be used.


That's one ugly ear, to be honest.

A lot of people would say that a programming language must be Turing complete by definition, since "it can run on a Turing machine" is the most common definition of what an algorithm is, and the purpose of programming languages is to encode algorithms.

Why would you consider it a negative?

Because there are programs which should not be valid. For example: for(;;); Turing-complete languages allow you to write an incorrect program like that one, which will never terminate.

Have you ever seen a lainfag in real life?
I thought not.

Life is a loop that never terminates.

According to my programmer friends, HTML is considered markup and does not count as a programming language.

Totality is a meme. All the total by construction languages achieve it by sticking arbitrary restrictions in the middle of their spec. Much more sensible to just attach a maximum runtime after which the program is autokilled. You'll still get your precious "programs always terminate". The restriction won't make programming too much harder. Where it does, someone can raise the limit. And it will be clear to anyone that your language is "turing incomplete" in the stupidest way possible, just like it was before.
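
The "attach a maximum runtime" approach can be sketched with Python's standard subprocess timeout. The one-second budget and the deliberately non-terminating child program are placeholders for this illustration:

```python
import subprocess
import sys

def run_with_limit(cmd, seconds):
    """Run an arbitrary program, but kill it once it exceeds the budget."""
    try:
        proc = subprocess.run(cmd, timeout=seconds, capture_output=True)
        return ("finished", proc.returncode)
    except subprocess.TimeoutExpired:
        return ("killed", None)

# The moral equivalent of for(;;); it would never terminate on its own.
status, _ = run_with_limit([sys.executable, "-c", "while True: pass"], 1)
```

Raising the limit is a one-argument change, which is the point: the "totality" lives in the supervisor, not in the language.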

No. It can make sense.
Inside the loop there could be an if(something) break; or an if(something) return 0;

Not to mention that a program terminating after 20 years isn't better than a program never terminating by itself.

The problem is that it is a loop that does nothing and never terminates. I don't necessarily have a problem with programs that never terminate. As long as Life is implemented correctly, it should not go into a loop doing nothing but rather continue to generate new states. It would obviously be a bug if it were to go into a loop doing nothing which is what I want to avoid.
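
The "loop doing nothing" bug can be detected at runtime by hashing each generation and watching for a repeat. This is a sketch, not something from the thread: find_cycle is a generic state-repetition detector, life_step is a plain Conway's Life update on a set of live-cell coordinates, and the blinker is a standard period-2 oscillator used as test input.

```python
from collections import Counter

def find_cycle(step, state, max_gens=10_000):
    """Iterate step(); return (first_seen_gen, repeat_gen) once a
    previously seen state recurs, i.e. the simulation is in a loop."""
    seen = {}
    for gen in range(max_gens):
        key = frozenset(state)
        if key in seen:
            return seen[key], gen
        seen[key] = gen
        state = step(state)
    return None  # no repetition within the budget

def life_step(live):
    # Count live neighbours of every cell adjacent to a live cell,
    # then apply the standard birth (3) and survival (2 or 3) rules.
    counts = Counter((x + dx, y + dy) for x, y in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    return {c for c, n in counts.items()
            if n == 3 or (n == 2 and c in live)}

# The "blinker" oscillates with period 2, so a cycle is found quickly.
blinker = {(0, 0), (1, 0), (2, 0)}
cycle = find_cycle(life_step, blinker)
```

A genuinely progressing simulation would exhaust max_gens without a repeat, while a stuck one is caught as soon as it revisits a state.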

you seem to misunderstand. A programming language where everything is terminated after 1 second is total. This fixes your "does nothing and never terminates" problem. The larger problem is left unsolved: "what if it hadn't finished computing?". If we want computability (as Turing defined it), then this problem is unsolvable. Your options then are killing the program after a fixed amount of time or after a variable amount of time.
You haven't demonstrated that the task is huge. Taking lots of resources != needing lots of resources (as you've shown). I can easily write a program in a total language that takes 20 years to run and does nothing. I wrote this total fizzbuzz for keks a while back: GCC takes 130 ms to compile it. A control fizzbuzz I wrote takes 30 ms to compile. Those extra 100 ms aren't needed - they are wasted time. Total languages cannot guarantee that you didn't waste your time.
Suppose I wrote a fizzbuzz that takes 100 h to compile instead. Are you going to let it run because it has to finish eventually? No, you'll kill the program, replace it with a real programming language, and fire the idiot that wrote it.

But it is not very useful since you can't run any programs that last longer than a second. You can't run a server anymore since it will just get killed after a second.
Well, I don't want it. I am very happy with a subset that will produce total programs.
The previous poster did. He said that if something takes 20 years to complete, it might as well take forever. I then replied that using distributed computing that 20 years might turn into a reasonable time period.
Also a total program is a proof that it is total in all cases. Even cases which you may have not thought to test. You make it sound like you will easily be able to find the case which goes over your threshold, but in a complex codebase it may not be.

All total languages are similarly restrictive.
You don't want to, say, be able to look for things that might not exist? Or maybe you argue that as long as you fix the number of tries your program makes before it gives up, then it counts as total? But then how do you know you've allocated enough tries?
He said it took twenty years to complete. He didn't say it needs to take twenty years to complete. A large task and an inefficient solution are different things. Consider the program that looks for the roots of x^2 + 1. I argue that this program should be given N = a trillion tries, just in case. Is this a large task? No, a cretin could prove the solution doesn't exist. Will it take twenty years to run? For large enough values of N, sure. But at least it's total!
But how do you know it's a twenty year program? You run it, and it's taking too long, and you want to switch over to a distributed implementation... Do you dare disturb the program? No, better to leave it. We wouldn't want to kill a running process, now would we?
Or maybe you did make a mistake when typing. One trillion?! I meant to type one million! x^2+1? It should be x^2 - 1! Me and my fat fingers... well too bad, guess I have to make sure this computer doesn't get rebooted for the next twenty years...
Then consider the following proof that all programs are total:
One day the universe will end.
Consider yourself absolved of all duties to use shitty languages.
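
The bounded search described above can be written down directly; this sketch shrinks the trillion to something runnable. It is total by construction, since the loop bound is fixed, and still wasteful, since x^2 + 1 has no integer root at any bound:

```python
def bounded_root_search(f, tries):
    """Look for an integer root of f among 0, +-1, ..., +-tries.
    Total by construction: at most 2*tries + 1 evaluations, found or not."""
    for n in range(tries + 1):
        for x in (n, -n):
            if f(x) == 0:
                return x
    return None  # gave up; this doesn't prove no root exists beyond the bound

# x^2 + 1 has no real (let alone integer) root, so every try is wasted.
assert bounded_root_search(lambda x: x * x + 1, 10_000) is None
# x^2 - 1 ("what I meant to type") is found immediately.
assert bounded_root_search(lambda x: x * x - 1, 10_000) == 1
```

The totality checker is satisfied either way; it cannot tell the pointless trillion-try search from the one-try success.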

No they aren't. Total languages either will run for a finite amount of time, or run forever while being productive. This means you can implement things like web servers which need to stay up forever.
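
That "runs forever while being productive" shape can be sketched with a Python generator: the computation as a whole never terminates, but each individual step does, so a consumer always makes progress. (Python doesn't verify this property; a total language would.)

```python
from itertools import islice

def responses():
    """An infinite but productive stream: it never ends, yet each
    next() call terminates after a bounded amount of work."""
    n = 0
    while True:              # the server loop runs forever...
        yield f"reply #{n}"  # ...but every step yields in finite time
        n += 1

# A consumer only ever forces finitely many steps at a time.
first = list(islice(responses(), 3))
```

This is roughly what coinductively "productive" definitions in total languages guarantee: every finite prefix of the infinite output is computable in finite time.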

As a professional software engineer, I can estimate the approximate time things should take to run. For something on the scale of twenty years, you would run a small part of the data and extrapolate from how long it takes to do that.
Ideally we would not. You should add a way for the program to gracefully shutdown and then use that.
If the universe ends, it doesn't make a program total. That's like arguing, "Since I can pull the plug out of the wall for my computer I can make any program terminate."
There's no reason why you couldn't build a more powerful totality checker and make it usable with common programming languages now. It just gets harder than the simple approach most languages are taking now.

Sure you are. And I'm a processor engineer working for ARM.
It's called saving progress. Speak English. And if it doesn't have progress but just provides a service to other programs, there's no reason why shutting itself down is better than killing it.
What's the point?

what reason would you need to ask such a fucktarded question, aside from the fact that HTML allows you to embed JS in it?
markup means you're statically defining a bunch of elements, presumably nested in each other for it to be useful
programming languages are a completely different thing: they have stuff like variables, and loops or at least functions

A graceful shutdown does not only potentially save progress.
No, it should also tell the other services that the program is going offline.

You work for ARM? Raspberry Pi 4 when?

I told him that we aren't on reddit. No, I don't work for ARM.
Neither can he prove he's a professional software engineer (plain English: fellow HTML/JavaScript programmer), nor that that makes him all-knowing about the topic.
Welcome to the internet, kiddo.




Kike mods are trying to get TOR banned. Don't let them get away with it!

Lots of shills in this thread.

A lot of these posts seem to be automated!

I'm seeing a lot of bot posts spammed around here. Be careful anons

Yeah, right, and the moon is made of cheese.