on what computer was this made?
gotta admit, despite being more than 20 years old the 3D looks amazing!
Movie
Most of the scenes were built using motion control models and practical effects. Digital was used sparingly.
ibm thinkpad with blender
What difference does it make? Probably some Intel processor or Nvidia card, but I don't know. They work at lower quality while they're building the scenes (on beefy machines where the graphics cards alone cost more than a gaming PC, because Nvidia takes a really fat cut on the professional models), then add all the effects for the final product later. But those effects aren't rendered on one of those dev machines; they're rendered on many cheaper machines in what's called a render farm.
Also, you can get good graphics because you have time. It's different with a game, where you have one machine only and must render the next frame really quickly.
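To put some (made-up, back-of-envelope) numbers on that gap between a game's frame budget and an offline film renderer, plus why farms of cheap machines win:

```python
# All numbers here are assumptions, just to show the scale of the gap:
# a game must finish each frame before the next refresh, while an
# offline film renderer can spend hours per frame and hide the cost
# by farming frames out to many machines.
game_budget_s = 1 / 60        # ~16.7 ms per frame at 60 fps
film_budget_s = 2 * 3600      # say, 2 hours per frame offline

ratio = film_budget_s / game_budget_s
print(f"an offline renderer gets ~{ratio:,.0f}x more time per frame")

# and because each frame renders independently, throughput scales
# almost linearly with the number of farm machines:
frames = 130_000              # ~90 min of film at 24 fps
farm_size = 500
days_one_box = frames * film_budget_s / 3600 / 24
days_on_farm = days_one_box / farm_size
print(f"{days_one_box:.0f} days on one box vs {days_on_farm:.1f} on the farm")
```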
You're looking at models. Zorg's critter was a model, the bad-guy aliens (the Mangalores) were wearing prosthetic masks, etc.
Digital was used for compositing, some explosions, the scream scene at the end where they stop the baddies, etc, but models were king.
It's the same reason ST:TNG looks good to this day, practical effects when done well don't age badly. 2001: A Space Odyssey was done with all practical effects.
Fifth Element
*sips* Yep, that was a good movie.
makes sense, the amount of detail in those scenes would have taken years to make in 3D, especially in 1997
The city backgrounds were digitally composited from photographs of existing buildings, I'm pretty sure. Take a nice high-resolution scan of a photo of a building and use that as a texture, and then you can pan the camera around and it will look great, especially since most of the action in the city scenes was happening so fast.
One of the movies from the tail end of the era of actual cinema. I have it on LaserDisc and it looks amazing on that format. It's supposed to be a silly comic-book-style romp, and it succeeds perfectly. Modern capeshit has none of the charm; they try to instill way too much gravity into a genre built from throwaway stories printed on pulp paper to lure away children's pocket money.
You don't use much processing power for practical effects; OP's question is about rendering. In that pic he means the car, right?
In the hovercar chase scene the cars would have been shot individually as models in front of a chroma-key screen, then digitally composited onto the background, which was obviously 3D-rendered but with very high-resolution textures. The final scene was digitally composited, but the assets would have been mostly "real," in the sense that the textures on the buildings were photos of real buildings which had been Photoshopped to the proper sizes and such, and the cars and props were models which were filmed.
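A toy sketch of that chroma-key-then-composite step, per pixel. The 1.3 "green dominance" threshold and the sample colors are assumptions for illustration; real keyers are far more sophisticated (soft mattes, spill suppression, edge treatment):

```python
# Toy chroma key: decide per pixel whether we're seeing the green
# screen or the filmed model, then composite over the background plate.
def chroma_key_matte(pixel, green_dominance=1.3):
    """Return 1.0 for foreground, 0.0 where the pixel looks like green screen."""
    r, g, b = pixel
    is_screen = g > green_dominance * r and g > green_dominance * b
    return 0.0 if is_screen else 1.0

def composite(fg_pixel, bg_pixel, matte):
    """Keep the foreground where matte=1, show the background elsewhere."""
    return tuple(f * matte + b * (1.0 - matte)
                 for f, b in zip(fg_pixel, bg_pixel))

green_screen = (0.1, 0.9, 0.1)   # screen behind the car model
car_paint    = (0.8, 0.2, 0.1)   # part of the filmed car model
city_plate   = (0.3, 0.3, 0.3)   # background plate pixel

# screen pixel lets the city plate show through; car pixel survives
print(composite(green_screen, city_plate, chroma_key_matte(green_screen)))
print(composite(car_paint, city_plate, chroma_key_matte(car_paint)))
```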
It's also why the cars don't look too shiny. I read a bit about it because of this thread; apparently one of the main props was the space cruise ship, which was eight feet long and fully motion-controlled (it had little motors for moving and actuating parts).
As said, much of it was still practical effects back then, but what digital work they did do was probably rendered on SGI systems. Some studios that went purely digital, such as Pixar, designed and built their own systems.
Studios don't even use GPUs today for final rendering. As it turns out, GPU floating-point results aren't bit-for-bit reproducible across runs and hardware (parallel reductions sum in varying orders), so for repeatability they render on CPUs.
Yep, all we get these days is rehashed crap and nostalgiabait, nothing original in terms of new IP comes out of Hollywood these days.
QBasic
best tetris ever was in qbasic
Some people excuse it by saying there have always been remakes. King Kong has been remade what, six times now in the official franchise, and a dozen times more under slightly off-brand names. This doesn't excuse the current studio heads from developing new stories, or adapting old ones, for the screen. It feels like the only effort made these days goes toward milking every last dollar out of a franchise until the momentum is halted by apathy.
Someone finally mentioned SGI. I think we finally threw out the old Octane that we were using as a door stop.
That's a shame but I feel you're just a troll so it's OK.
I think it's simpler than that. The old movies we look back on are the best of their era. The movies you and I think of as the movies of the '50s, '60s ... '90s, etc., are the best ones. I mean, do you realize there are hundreds of thousands of feature-length films? Of course when you're watching all the shit the studios throw out every year, you're going to think "wow, movies have gone to shit." However, in 20 years, when your grandkids are watching Dunkirk or whatever, you're going to be sitting there thinking "yep, that was truly the end of the era. The last great movies were from 2010-2020."
No, because popular films of previous decades were actually good and intended for a mature audience. Films today are just garbage for kids or soy consumers. Look at the top-grossing films of each year from 2010 on: utter crap, and those are supposed to be the cream of the crop, the way Star Wars was when we look back at it.
The truth is there are loads of new ideas coming from writers, but it's all in books. Filmmakers are too afraid of losing money, so they just stick with what works until it's done to death. It's the same mentality that killed American TV. Like, what season is The Simpsons even on now? 90?
She only got the job by sleeping with her agent.
Or the producer of the show, or whatever.
This. They started making films to market in China as well, so a whole array of topics simply doesn't appear in cinema any more.
Who was taking the photos?
Spooky
Those were the only good photos in the original fappening.
If I recall, the guy was some sort of amateur photographer. Unless he had someone else press the shutter button, I'll assume he used a timer.
Not your consumer-grade cards, maybe, but both the MI and Tesla product lines are targeted at rendering/compute/AI.
Is that really her?
The Supergirl stuff is extremely dumb, because the female version of Superman already exists: Wonder Woman. Maybe they're trying to get rid of her because she behaves too womanly?
They were motion-controlled models combined with compositing and other traditional film-based visual effects. During the production period in question there were very, very few computer graphics systems capable of rendering film-ready shots; Evans & Sutherland's hardware was probably the top machine back then.
the movie was released in '82, anon.
You must be thinking of a completely different movie.
Probably around 1995 (the movie was released in 1997, so it must have been in production for a while) the top computers for 3D rendering were SGI machines. Terminator 2 came out in 1991 and its CG was done on SGI hardware; IIRC it was the first big movie where SGI was at the forefront. Anyway, Evans & Sutherland had fallen out of that market by then; they'd had trouble competing in 3D computer graphics and animation against software running on Symbolics Lisp machines. Tron, for what it's worth, was done by other houses entirely (mainly MAGI and Triple-I).
You're completely wrong, and that's both funny and sad.
The statement is just as true for the Teslas; Pixar and other large studios still do final rendering on CPUs. What GPUs provide is faster turnaround for individual animators working on scenes, since any changes can be rendered more quickly.
GPUs, unlike CPUs, don't guarantee the same floating-point results from run to run: between fused operations, flushed denormals, and nondeterministic parallel reduction order, you get slightly different numbers, and it mostly doesn't matter to people because the platform is locked down and abstracted away behind CUDA.
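The reduction-order part of this can be demoed even on a CPU: floating-point addition isn't associative, so summing the same values in a different order (which parallel GPU reductions routinely do) changes the last bits of the result, which is enough to break bit-for-bit repeatability.

```python
# Floating-point addition isn't associative: the same three values
# summed in two different orders give two different doubles.
a, b, c = 0.1, 0.2, 0.3
serial_order   = (a + b) + c   # one fixed summation order
shuffled_order = a + (b + c)   # a different reduction order
print(serial_order == shuffled_order)   # False
print(serial_order, shuffled_order)
```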
Well it's a 1997 film, so it's more or less guaranteed that it was authored on an SGI workstation but the cluster they rendered on is anyone's guess. Alias (Maya), Softimage and RenderMan are mentioned in the Wikipedia article, all still current or precursors to industry standard tools.
Note that you're not looking at a single render result, but a composite of dozens of layers: the cityscape is a real miniature model with a computer-controlled camera, the cars are all rendered individually with a virtual camera that lines up with the shot, and then the frames are manually masked out, color levels adjusted in post to make everything fit together, shadows matted in where necessary... Rendering the whole scene in CGI, and all at once, would have been far more expensive and looked substantially worse.
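A sketch of what stacking those layers looks like mathematically, using the classic "over" operator (straight alpha, one color channel for brevity; all the layer values below are made up for illustration):

```python
# Porter-Duff "over": composite a foreground layer onto a background.
def over(fg, fg_a, bg, bg_a):
    """Return (color, alpha) of fg layered over bg (straight alpha)."""
    out_a = fg_a + bg_a * (1.0 - fg_a)
    if out_a == 0.0:
        return 0.0, 0.0
    out = (fg * fg_a + bg * bg_a * (1.0 - fg_a)) / out_a
    return out, out_a

color, alpha = 0.50, 1.0                       # miniature cityscape plate
color, alpha = over(0.80, 0.60, color, alpha)  # rendered car, soft edges
color, alpha = over(0.00, 0.30, color, alpha)  # hand-matted shadow layer
print(round(color, 3), alpha)
```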
Don't know about that, can you cite some source?
My sources are industry people I've run into at conferences/expos, so my knowledge is 1-2 years old, but a quick Google search returns articles like this one, which talks about Disney rendering on a 55,000-core supercomputer:
engadget.com
This blog post from Pixar in March talks about the new rendering engine they're developing, which can use both CPUs and GPUs.
renderman.pixar.com
What's funny about that post, though, is that when they talk about how the GPU produces the same pixels as the CPU, the two images they use as an example are clearly different, with the shadows less noisy and the textures better defined in the CPU-rendered image. They later admit that getting the GPU to render the same as the CPU is a significant challenge.
One wonders if they'll switch to ray tracing. When I was in high school I knew a couple of kids who interned at Cray Research, and they were doing real-time ray tracing even back then.
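The core test inside any ray tracer is ray-primitive intersection; here's a minimal ray-sphere sketch (assumes a unit-length ray direction, and is illustration only, not production code):

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Distance along the ray to the nearest hit, or None for a miss."""
    ox = origin[0] - center[0]
    oy = origin[1] - center[1]
    oz = origin[2] - center[2]
    b = 2.0 * (direction[0] * ox + direction[1] * oy + direction[2] * oz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c       # quadratic 'a' is 1 for a unit direction
    if disc < 0.0:
        return None              # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

# a camera ray straight down +z toward a unit sphere 5 units away
print(ray_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # -> 4.0
```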
I visited the site with my class, and all I really remember is sitting down and getting xroached by my friend. It did fuel my Unix interest, though, which I keep to this day and which has earned me some cash over the years.
wtaf, dude.
No, the filename wasn't enough warning -- I just assumed it would rag on the show's T'n'A flavour
Silicon Graphics. I even know one of the guys who worked on this movie: Unix workstations filled with RAM and excellent 3D artists, letting renders run all night so they could see the result the next day.
Now I'm working at a 3D animation studio in Dublin, on gaming PCs running Windows and C4D software. It's just like how people used to work with their SGIs and Softimage 3D, except we animate ZEBRAS instead of aliens n shit.