Demoscene Thread

In short, demos are computer programs that show off the capabilities of a given platform in an entertaining way, sort of like a music video, except instead of rendering an animation offline everything is computed in real time. Of course modern computers are so powerful that it's hard for just a few people to make something graphics-wise on PC that was never done before, so to keep things interesting people restrict themselves to small executable sizes like 64kb or 4kb, sometimes even just 256 bytes.

I'm personally way more interested in the demoscene for retro computers, Amiga and C64 in particular. Some of you might remember me making threads like this from time to time, where I just posted clips from demos I liked. Well... I finally started coding on the C64 this year and managed to release 3 productions, one of them even at the largest C64 party, held in Holland.

For me nothing can beat the fun I get from programming demos on the C64. The platform has lots of weird quirks and limitations that really make you think differently about creating your desired effects. That's true for most retro platforms, where you don't have enough RAM or a fast enough CPU to just calculate everything, so instead you play with the raster beam to bend the graphics and use the weird bugs of the chips to produce unusual results, but I feel the C64 has the most unintentional features of all the old computers, and it also has an awesome sound chip.

Well, here's my crap, feel free to download it and run it in an emulator, or look up the names on YouTube for videos. If you have a C64 and are able to run it on real HW, that's even better. It's PAL only.

I can explain some of the C64 trickery if people are interested

csdb.dk/release/?id=168237
csdb.dk/release/?id=170934
csdb.dk/release/?id=171684

Attached: images.png (540x405 112.94 KB, 89.29K)

Please explain.

You need to understand that the CPU is in sync with the video, so you can know where on the screen you are while executing your code, and you can change the video registers on the fly. This is true on almost all retro systems. For example, let's say (purely as a made-up example) that writing 1 to address $d016 enters bitmap mode and writing 0 puts you back in text mode. If you set up an interrupt at rasterline 0 and in its handler set $d016 to 1 and request another interrupt at line 150, then in that handler set $d016 back to 0 and request line 0 again, you will create a split screen with bitmap on the top and text at the bottom.
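Here's roughly what that setup looks like in 6502 assembly. Only a sketch, not code from my prods: I'm using the real bitmap-mode bit (bit 5 of $d011) instead of the made-up $d016 example above, hooking the KERNAL IRQ vector at $0314/$0315, and the raster line values and the $ea81 exit are just the usual lazy defaults.

init:    sei
         lda #$7f
         sta $dc0d        ; shut off the CIA timer interrupts
         lda $dc0d        ; acknowledge anything already pending
         lda #$01
         sta $d01a        ; enable raster interrupts in the VIC
         lda #$00
         sta $d012        ; first interrupt at rasterline 0
         lda $d011
         and #$7f
         sta $d011        ; bit 8 of the raster compare value lives in $d011
         lda #<irq_top
         sta $0314
         lda #>irq_top
         sta $0315        ; point the KERNAL IRQ vector at our handler
         cli
         rts

irq_top: lda #$3b
         sta $d011        ; bitmap mode on (bit 5 set) for the top part
         lda #150
         sta $d012        ; next interrupt at line 150
         lda #<irq_bot
         sta $0314
         lda #>irq_bot
         sta $0315
         asl $d019        ; acknowledge the VIC interrupt
         jmp $ea81        ; let the KERNAL restore registers and RTI

irq_bot: lda #$1b
         sta $d011        ; back to text mode for the bottom part
         lda #$00
         sta $d012        ; and ask for line 0 again next frame
         lda #<irq_top
         sta $0314
         lda #>irq_top
         sta $0315
         asl $d019
         jmp $ea81

In a real demo you'd usually point the hardware vector at $fffe/$ffff at your own handler and skip the KERNAL entirely, but this version is easier to try out.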

Some quick numbers to keep in mind. These are for the PAL C64:
312 rasterlines, 63 cycles per line, 50 FPS. 312 * 63 * 50 = 982800, which is (near enough) the C64's CPU clock in Hz (about 0.98 MHz). One cycle covers 8 pixels on the screen, the same width as one character.

Different instructions take different numbers of cycles, but those counts are fixed and identical on every machine, so you can time your code precisely, down to a single cycle. Most of the interesting C64 stuff is exactly that: changing some video register at a very specific time to trigger some edge case. I'd say that more than half of the C64 tricks (and I mean individual tricks, not whole effects; effects are usually combinations of several such tricks plus some maths, LUTs and all that) are caused by one video register... $d011
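To make that concrete, here's a tiny made-up fragment; the cycle counts in the comments are the documented 6502 costs, and the values written to the border colour register are arbitrary.

         lda #$00         ; 2 cycles
         sta $d020        ; 4 cycles, the border colour changes on this write
         nop              ; 2 cycles
         nop              ; 2 cycles
         bit $ea          ; 3 cycles, the classic filler when you need an odd count
         lda #$01         ; 2 cycles
         sta $d020        ; 4 cycles, this write lands exactly 13 cycles after the first

Chain enough of those together and you can place a register write on the exact cycle (i.e. the exact 8-pixel slot) of a rasterline that you want.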

Unlike the NES, which has hardware scrolling in both directions with wrap-around, making infinite-scrolling games like Mario or Contra easy (it's not a coincidence those types of games were the most common on the NES), the C64 only has a fine hardware scroll of 8 pixels (0-7 pixels to the right or downward). If you want to scroll the whole screen, you step that fine scroll register through its 8 positions, and every time it wraps around you copy the whole character screen one column (or row) in the direction you're scrolling. This takes a lot of CPU, but that's what you have to do.
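A per-frame scroll step looks roughly like this (a sketch for moving the picture one pixel left per frame; xscroll is just a free zero-page byte I picked for the example, and copy_screen_left is a placeholder for that big 1000-byte copy):

xscroll = $02             ; current fine scroll value, counts 7 down to 0

scroll_step:
         dec xscroll
         bpl set_reg      ; still in the 0-7 range? just update the register
         lda #$07
         sta xscroll      ; the fine scroll wrapped around...
         jsr copy_screen_left ; ...so shift the whole char matrix one column left
set_reg: lda $d016
         and #$f8         ; keep the other control bits
         ora xscroll      ; new 0-7 fine scroll goes into bits 0-2
         sta $d016
         rts

You call it once per frame, typically from a raster interrupt; real games usually double-buffer the screen matrix so the big copy doesn't have to race the raster.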

On the other hand, the NES locks the CPU out of video memory while the image is being painted, so you basically only get to touch the graphics during vblank / the overscan area, while on the C64 the video chip and the CPU work at the same time. That means you effectively get more usable time per frame for this kind of work, even though the clock is almost half the speed (the NES runs at about 1.79 MHz), and you can do the tricks I will talk about shortly. How is that achieved?

The RAM is twice as fast as the CPU and the video chip. I'll simplify to whole MHz: RAM runs at 2 MHz, the CPU and the VIC at 1 MHz, and they interleave. On one RAM cycle the CPU accesses the RAM, on the next the video chip does.

There's one situation where the CPU is stalled on the C64. It's called a badline. If you look at a C64 screen you'll notice that the visible part (I'll ignore the border for now) is a 40x25 grid of 8x8 pixel characters. The first pixel line of every character row is a badline. The VIC has to do extra fetches on these lines to read the character codes, so it uses the full 2 MHz of RAM bandwidth and the CPU is stalled for that time.

Now what if you add vertical scrolling into the mix? You can't hardcode the badlines to always be lines 0, 8, 16, 24..., because if you scroll the image down by one pixel, the first line of each character row won't land on lines 0, 8, 16 but on 1, 9, 17 and so on. That's why the rule for a badline is: if the low 3 bits of the rasterline counter are equal to the vertical scroll value (the low 3 bits of $d011), it's a badline.
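Spelled out in code, the rule is literally a 3-bit compare (tmp is just a scratch zero-page byte I picked for this sketch; in real code you don't usually test it at runtime, you arrange your timing around it):

tmp = $03

         lda $d012        ; low 8 bits of the current rasterline
         and #$07
         sta tmp
         lda $d011
         and #$07         ; the vertical scroll bits
         cmp tmp          ; equal (and the raster inside the $30-$f7 display
                          ; window, with the screen turned on)? then it's a badline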

But wait... we can change $d011 whenever we want, right? So what if on every line I change $d011 so the badline condition never happens? What if I trigger a badline more often than every 8th line? What happens if I trigger a badline in the middle of the screen? That's when all the interesting things happen.

Attached: dustlayer.com-screen-raster-cycles.png (750x509, 174.7K)

This brings back good memories of C=64 and later PC scene and parties.

I've gotta give props to the guys who push new hardware as there's some real creative and visually impressive stuff out there (pouet.net/prod.php?which=77371) but most of it doesn't tickle my technical bone in the way it was tickled before OpenGL2/DX9c. Nothing gets me hard like .kkrieger.

I remember downloading and running .kkrieger when it was released, but I didn't know about demoscene yet. I got it because there was an article about it on a gaming site I used to browse.

great thread OP
poking the C64 is a fun experience.

FLD (Flexible Line Distance) is probably the first $d011 trick, discovered back in 1987. It allows you to push the whole screen down without copying any graphics. The scrolled-out character lines do not appear at the top again (no wrap-around), so it can't be used to implement an infinite vertical scroll, but you can move bitmaps or character graphics up and down: think bouncing scrollers, or just an additional way to fade a bitmap in or out.

It works by denying the badlines. The VIC is designed to read the screen matrix on the first line of each character row (the badline) and then, for the remaining 7 lines, it only fetches the pixel data of those characters from the character set. After that the cycle repeats, but if you change $d011 on the 7th line so the badline condition is not met, the VIC doesn't have new character codes to draw... it goes into its idle state, painting only the background colour (and a "ghost byte" from $3fff, but if you just set $3fff to 0 you can't see it). When, a few rasterlines later, the badline finally happens, the VIC just draws the rest of the screen like nothing happened. Voilà, you get a screen shifted down.

You don't have to do it at the top of the screen either; you can also do it in the middle, moving individual character rows (see the sketch below).
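The core of it is just a few instructions per rasterline. A minimal sketch, close to the classic routine everyone passes around: it assumes text mode with the screen on, that you start it right after a badline (e.g. from a raster interrupt), and that X holds the number of lines you want the picture pushed down.

fld:     lda $d012
wait:    cmp $d012
         beq wait         ; busy-wait until the next rasterline starts
         clc
         lda $d011
         adc #$01
         and #$07         ; bump the vertical scroll bits by one...
         ora #$18         ; ...while keeping the screen on, 25 rows, text mode
         sta $d011        ; now (raster & 7) never equals yscroll, badline denied
         dex
         bne fld
         rts

Everything you draw below that point simply starts X lines lower, without a single byte of graphics being copied.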

youtu.be/x0vFFiQ2rZM?t=59

there is still a decent demoscene?

t. 18yo

I'll never forget when I was in high school, I had the hottest computer in the whole school, I can't remember if I had my 386 or had upgraded to a 486 by then, but I had a Soundblaster Pro and the first kick ass Future Demo dropped. I was the only kid out there who could play that thing without stuttering or lag of any kind. It was bitchin'.

Nobody does this kind of thing any more, there's no point. The hardware is all so advanced and so far from the programmer, what'd be the point?

* i mean to say, with respect to all current demo scene people, nobody does this shit on mainstream PCs because there's no point. I've seen some wicked bad shit lately though, the sample editor / player for the C64 is especially amazing.

youtube.com/user/RevisionParty

Are there some people who "demoscene" on modern hardware? Because game consoles could definitely benefit from somebody who can optimize everything so well.


.kkrieger is legendary tier, check the source code:
github.com/farbrausch/fr_public/tree/master/werkkzeug3_kkrieger

I doubt it, with all the shaders and other shit in the pipeline these days you're not really going more bare metal than the toolkits and drivers allow. Demo scene coders took advantage of the hardware directly because it was documented and addressable, you have no clue what's going on behind your GPU's drivers and firmware. I haven't looked in years but I suspect any modern demos on the PC are being made with CPU graphics only, not even touching the GPU.

OP here. In terms of consoles you are correct; modern PC demos, however, of course use the GPU, it's all DirectX or OpenGL. There are a few that still do software rendering. The most popular and most respected modern PC demos are the 64kb and smaller ones, since that's a clear-cut limitation, and quite a severe one too I think. With these kinds of prods you are allowed to dynamically link against the DirectX DLLs or anything else that is installed by default on the system (99% of PC demos are windoze).

The demoscene is alive, especially on PC; I'm sure there are way more PC sceners today than C64 ones. A month ago I was at X2018, an exclusively C64 party, and there were more than 400 people there.

Attached: a1bcab3e3170cb255f1dd0ed7bc0946f.jpg (2016x1134 53.5 KB, 953.02K)

Do you mean that it's entirely impossible to do or that it's impossible to do as long as we don't have the GPU instructions?

The last time I was this impressed was when I discovered modern LineRider tracks.
This is beautiful.

Attached: WOOOOOOOOAAAAAAAHHHHH.jpg (1920x1080, 329.27K)

If you have the spec, you write a driver and then something like DirectX or OpenGL that uses that driver. You would end up with similar results to what we have now.

My group is working on a ps1 demo for revision 2019

Here's some choice picks because why not
youtube.com/watch?v=ie4u2i_5OdE
youtube.com/watch?v=sQNcIKAIBF0
youtube.com/watch?v=rNqpD3Mg9hY
youtube.com/watch?v=IQIsGgbmpww

Is there a full game that was made by demoscene guys?

many 2000s game studios were filled with demoscene people, for example Remedy (Max Payne) and Croteam (Serious Sam).

My neighbour was a demoscene coder and did work on Tomb Raider (1996).

XDDDD haxz0rz the pl4nit XDDDDD

Is it the guy at 5:46?
youtube.com/watch?v=N7dTNHaU2jQ

So 99% of PC demos are shit quality that don't do anything interesting with the underlying hardware, only the API.

If these fags used hardware supported by the nouveau/nvidia open source driver on Linux they would get bare metal GPU access along with community documentation. Or if they wanted AMD they would need to use the r200/r300 series of supported AMD GPUs, which are decades old now; the nvidia ones supported by nouveau are newer and more powerful.

It is impossible unless you have the documents and bare metal access things like nouveau or r200/r300 drivers provide.

That's shit, those guys are mediocre at best.

One of the all time great demos, Elevated.
youtube.com/watch?v=jB0vBmiTr6o

Because the focus shifted mostly to small-size demos (64kb and below). What's the point of not using DirectX or OpenGL when these APIs don't have that much overhead and guarantee the demo will run on all graphics cards, not only on the one GPU (or GPU series) they wrote the demo for? I'm coding on the C64 and like bare metal very much, but that doesn't mean I'm a retard who would want everything done this way on modern hardware; it's just not feasible. For example, look at TempleOS. In order to be compatible with lots of PCs he could only use what's in the BIOS API, which is limited to the old VGA graphics modes and reading hard drives/CDs. If he ever wanted something more, like a network card, he would have to write drivers for all the fucking chipsets that exist, which are most likely not documented.

Because you can push the hardware to its limits and do amazing things. Hence the point of demos in the first place, you colossal and lazy retard. Now go lurk two years.

If you want to push the hardware to its limits, you have to use the hardware and not let someone else do it for you. You lazy faggot, I already gave examples of semi-modern GPUs you can go bare metal on with docs, doing what you want with the processor ISA, DMA, and GPU ISAs like on the hardware of old. But the hardware is different, and so are the programming quirks; it's the same kind of autism OP has with the C64, just at a different level. You don't need networking to use a computer program to push the hardware to its limits btw. You can directly address the GPU over PCI/PCIe space with things like nouveau, bypassing the MMU and using DMA in ring 0/-1 for Intel GPUs. git good faggot, malware makers are apparently more skilled at pushing modern hardware to its limits than 100% of the lazy demoscene faggots.

Woops
Should be

Attached: f3f4991e5f93f1cb05ff49af9c7178bf611ea5b034e931771dc7f6b960ebe686.png (604x717, 813.74K)

I'm the OP, show me your hot bare metal prods, mighty larper. What you are asking for is realistically impossible, sorry to crush your dreams. How does one write a demo for a bare metal, undocumented GPU? And why would they segment their production to such a radical extent? It's not like we have 5 graphics cards; there are hundreds of them. People realized that, and instead of trying to "push the hardware" (and by that you mean drawing 10 million polygons instead of 8?) they focused on making small executables. Using the DX/OpenGL API doesn't magically make your demo 64kb when a simple hello world is already a few kilobytes. Try doing it yourself and you'll see. Though I know it's not gonna happen because you're just a nodev lamer.

His argument doesn't even make any sense, because malware works in user space or is trying to leave it and go to the kernel, or maybe even deeper. You need to go through several APIs to do it, of course while looking for bugs in said APIs and abusing them to your advantage. Next he'll say the demoscene ain't shit cuz they use x86 opcodes instead of making their own with microcode.

New intro from my crew
csdb.dk/release/?id=173185

Attached: 00079651.png (360x266, 8.54K)

Vid?

youtu.be/91wF7jmLM6w

Tight.
|░░░|░░360 ░░|░░░|| | 370 █▀▀ █▀█ █▀▀ █▀▀ ▀█▀ █▀▀ ▄ | ||░░░|░░380 ▓░█ ▓▀▄ ▓▀ ▓▀ ▓░ ▀▀▓ ▄ ░░|░░░|| | 390 ▀▀▀ ▀ ▀ ▀▀▀ ▀▀▀ ▀ ▀▀▀ | ||░◯░|░░400 ░░|░◯░|| | 410 >> to your Crew & (you)

It's nice, what else do you code usually?

demoscene related stuff, only C64. I started this year, you can see more of my stuff in the OP. Outside of the demoscene I use C++ at work. I also worked on a PC game in C++ some time ago, but now the C64 has taken priority.

tomorrow is the last day to submit entries for 2018 intro creation compo on csdb. Check out all the intros submitted already. csdb.dk/event/?id=2757

Attached: 171924.png (384x272, 7.17K)

Is there an event where actual games are being made or is it just intros and well optimized eye candy?

it's called the intro creation competition, so figure out what it's about. Game compos happen from time to time, but sceners aren't really that interested in games it seems (myself included).

Those would be game jams, and I've seen at least one in my lifetime that had a filesize limit. That's probably the closest thing you'll find to a demo-y game-y jam.

Probably a stupid question that will get me laughed off this board but here we go!
How would someone go from know-nothing nodev to making their first demo?
I like the idea of having something behind my work and something neat to show for it, but I'm operating from babby tier knowledge.
I know this is basically asking for the spoon but rest assured I'll feed myself and if all goes well I'll have some OC to show for it

Most demoscene makers are Windows users.

PC or another platform? On PC people even do demos in Unity, but I hate that crap (and most people will just roll their eyes if they see it). It also depends on whether you want to do 3D or just 2D stuff. If you want to do some 2D oldschool style stuff, you could just download SDL, get a framebuffer and paint some pixels (one pixel will be 4 bytes for RGB + padding, or 1 byte if you use a 256-colour palettized mode). It's kinda hard to tell you what you should start with. For PC it's pretty much like game engine programming, except you skip coding AI and pickups and "gaming" stuff and just do the graphics.

That's true, but that's mostly because:
1) They want to reach the most people
2) Size does matter, and you are allowed to use DirectX / Windows DLLs without them being counted towards your executable size. You can find quite a few demos that run on an X server (so you get Linux / BSD).

wut. waste of trips

port scanning that shit would be really fun and interesting

Are you using some kind of debugger? no$psx maybe?

I don't know why we don't see more of this, a single hardware platform and trying to wring every bit of performance out of it. I bet the PS2 is still capable of some neat stuff.

You tell him to lurk more when a couple posts ago you were asking him if there are x86 demos... What an unironical autist you are to get mad just because modern demos run on Windows.
Get real freetard, nouveau pushes the hardware so hard that you get about 10% of the performance of the proprietary drivers.
And even if you accepted that your demos are going to look like shit, the point of x86 demos is that anyone can run them on their computer. And even then, if you accepted that demos could run on only 0.1% of the hardware out there, everybody would end up just packaging the proprietary drivers on a stripped-down Linux install, because no demoscene team has hundreds of programmers working for years to write optimized drivers for a GPU in addition to writing the actual content.

3D consoles sit somewhere between the simple 8-bit computers like the C64, which didn't need that much documentation because they were so simple, and modern x86, where the documentation for the APIs is public. The documentation publicly available for the PSX and N64 is really shitty and you'll end up doing a lot of guesswork and trial and error, and for the PS2 and newer it's probably even worse.

PS2 would be nice to see, also PS3 because of its unique architecture.

Not sure, I don't have anything to do with it, I just saw a couple of previews from the guy working on it. Desire has a lot of members working on different platforms; I'm the C64 guy.


The more modern the hardware, the fewer unintentional bugs there are to turn into features. It also has the same problem as modern PC demos: if you have a really powerful machine like the PS3, it's hard for just a few people to make assets that would push it. I mean, unless you really just want to display some weird model and say it has 10 million faces! never done before...
8 and 16 bit computers were really simple, you could look at the schematic and see how individual bits travel between the chips, where and when. That's just not the case on modern platforms.
I think there are a lot of games on the PS2 that really do wonders with that hardware, for example GTA San Andreas. For a console from 2000 I think it's remarkable.

Intro to a new demo I'm working on.

Attached: simplescreenrecorder-2019-01-05_23.04.41.mp4 (740x552, 637.91K)

Good work user

the demoscene is pretty sweet, definitely the C64 one. It's a good way to flex programming skills.

Attached: 49515763_2037152019704753_7451975542271639552_n.png (960x750, 781.89K)

I don't know about "parties" but there's usually yearly game making contests, for 8-bit platforms. For example:
cpcretrodev.byterealms.com/en/
Pic was 3rd place in 2015.

Attached: Top Top.jpg (500x707, 240.01K)

CPC is even weirder than C64 and mostly frog

Except maybe for the unusual 3-inch floppy disks, it's less weird than C64 because it doesn't have any special custom chips like SID. You have a Z80 and a common sound chip, an 80-column screen mode (for CP/M support), and a really fucking good dialect of BASIC that doesn't force you to use PEEK and POKE constantly just to do basic (heh) things.
It was popular in UK, France, Spain, and Germany. Other places not so much.
For some reason it didn't get many interesting demos until the Batman one from 2011, which showed the system was vastly under-utilized and overlooked.
pouet.net/prod.php?which=56761
There are also the CPC+ models, which greatly expanded the colour palette. But at that point people were already moving to 16-bit computers like the ST and Amiga, so they never got popular.

Attached: 1.png (384x272, 14.96K)

But that's what I'm talking about. It's really hard to make anything for the CPC. For example, look at its framebuffer. The C64 doesn't have a linear framebuffer either, but it's not as crazy as the CPC, where the next pixel line is 2048 bytes further in memory or something crazy like that... On the other hand it's extremely easy to repeat lines, which makes it perfect for stretchers and shit like that. Also HW scroll (I think?)

Sure, it's not linear, but was anything back then? Anyway you just make a putpixel routine to abstract that low level addressing stuff and don't have to worry much about it after that.
cpcmania.com/Docs/Programming/Painting_pixels_introduction_to_video_memory.htm

That would be too slow for anything serious. Same goes for the C64; people come up with crazy ideas to avoid doing exactly that. And yes, there are 8-bits with a linear framebuffer, for example the 8-bit Ataris and the CoCo. That's why those computers mostly do 3D or, generally speaking, "pixel" effects like zoomers, rotators, etc., while platforms like the C64 and CPC rely more on the hardware quirks and fake the real calculations. For example, since you can stretch the screen on the Y axis with hardware tricks, you only need to do the expansion in X in software. If you use sprites, you can just put them closer together so they overlap, and only cut away the sides of each sprite's graphics.

I don't know, this method is pretty fast for drawing sprites, so good enough for games.
cpcmania.com/Docs/Programming/Sprites_I_Introduction_to_using_sprites.htm
On the CPC, most games didn't have a scrolling background anyway, so you just move your sprites around, pretty much. Fancier games would require special routines for background scrolling and whatnot.
For demos, I agree that you'll spend most of your time trying to squeeze performance and tricks out of the hardware and then this won't cut it.

Attached: Amstrad CPC464 - Bruce Lee-HjwCRcIE_hU.mp4 (384x268, 9.71M)

Yeah, but why is that? The hardware was apparently too complex for game makers. Though truth be told, almost all CPC games are lazy ZX Spectrum ports and they don't take advantage of the hardware. What they would do is swap in "putpixel" routines that work on both the ZX and the CPC and not give a fuck that it's slow as shit. This Bruce Lee game is a classic for many people who played these early computer games, but in my opinion it's shit.
I'm not even throwing shade at the CPC, just saying it's weirder than even the C64, and that's simply true; it was so weird that people ignored it for many years. Now it kinda seems like the best 8-bit, except for the horrible sound chip (the AY is alright, but it's nothing compared to the SID). The CoCo3 is also very attractive. It doesn't have any fancy video HW for scrolling or stuff like that, but it's extremely convenient to program. The framebuffer is totally linear, you can set the screen pointer with fine granularity, and the screen width is 256 so you can just bitshift to calculate line offsets etc... Its sound chip is even worse than the AY though.

Attached: coco3_2.mp4 (400x300 1.33 MB, 526.82K)

I'm thoroughly impressed. I never was into the Windows demo scene but holy fuck that's awesome.

That is from 2009 too.

How much of the demoscene is on linux?

Not much but most demos will run in wine

oldskool.org/shrines/lbd
youtube.com/watch?v=hNRO7lno_DM
(in recommended reading/viewing order)

If that is true, big kudos to Wine.

wine is awesome for sure, but it's nothing special really. As I said, modern PC demos use the standard directx or opengl api and wine almost never has problems with that. If anything wine can struggle with win32 api (drawing the gui elements) or input.

Why not Vulkan?

Malware writers make money though.

Many computers don't have Vulkan drivers. It's up to the programmer to choose if they want to target Vulkan to the exclusion of DX and OpenGL.

Not much I'd think. Most of them I mean, not the few that manage to get organized and build a large network of automatons.

there are probably some demos on Vulkan, but Vulkan requires a lot more boilerplate code, so it doesn't fit 64kb or 4kb demos

At this point, the best documentation you can get is the source code of the leading emulators of those platforms. They're pretty damn close to the originals quirk-wise these days.

It's pretty easy to draw on the framebuffer in Linux. It's basically opening a device file and then keeping your own buffer that you copy over to get animation working.

Attached: hardness.webm (640x480, 121.81K)

Not if you have x running. You have to switch to another virtual console. And you can't change the resolution.

Yeah it's not easy because you have to press a key combo. What was I thinking.

chvt 1