How many devices do you own that are not y2038 compliant? For me it's just my router, since I got a 64-bit Android device.

Attached: f75bb-y2k1.jpg (259x194, 20.31K)

If it ain't broke, don't fix it.

I'm currently employed producing network devices that will fail in 2038. You own more things that will fail than you know of.

Hey, niggers!
I am too optimistic about the future, but are you THAT sure that you will survive the oncoming war? Not a single one of us probably will unless we are to be reeducated in the 'good goy' summer camps. Verily,
THIS IS THE FUTURE YOU CHOOSE

OH SHIT WE ONLY HAVE 20 YEARS TO FIGURE OUT HOW TO MAKE A VARIABLE BIGGER

OpenBSD already fixed it a few years ago, so I'm good. Anyway most current gear will be disposed of by then.

It's not quite that simple. All the software that uses time functions has to be modified as well, because the old time functions will be deprecated and new ones created. You don't just change a variable, because that makes it impossible to transition cleanly: you can't make everyone upgrade on the same day.
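
A minimal sketch of what "keep the old function, add a new one" looks like in practice; the names here (get_time32, get_time64) are hypothetical, not any real libc's:

#include <stdio.h>
#include <stdint.h>

/* Hypothetical parallel API: the legacy 32-bit entry point stays for
   already-compiled binaries while new code calls the 64-bit one. */
typedef int32_t old_time32_t;
typedef int64_t time64_t;

time64_t get_time64(void) {
    /* stand-in for a real clock read: one second past the 32-bit limit */
    return (time64_t)2147483648LL;
}

old_time32_t get_time32(void) {
    /* legacy shim: truncates, which is exactly where the 2038 bug lives */
    return (old_time32_t)get_time64();
}

int main(void) {
    printf("64-bit clock: %lld\n", (long long)get_time64());
    printf("32-bit clock: %ld\n", (long)get_time32()); /* typically wraps negative */
    return 0;
}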

Well I'm dead in 2038.

Attached: 51b.gif (297x229, 2.39M)

Maybe we should just fucking reset time every 68 years...

Attached: snooze_button1.jpg (500x375, 43.57K)

20 years is pretty old for technology. Most people will have upgraded by then.

Operating systems that aren't based on UNIX don't have this bug.

Cron went haywire today (big surprise, huh?) on one of the netnews servers I maintain. Since misery loves company, I was tempted to put [email protected] on the usenet alias so you could share in my joy at getting peppered every 10 minutes by a demented process claiming it couldn't run nntpsend (which was present, in the directory it was supposed to be in, and with the permissions it was supposed to have). Merely killing cron and then restarting it solved the problem. Needless to say, I didn't succumb to the temptation to redirect the mail, but I'll keep all you wonderful folks in mind next time I have a few dozen spare messages lying around.

Unix - not only doesn't it have a real scheduler, it doesn't have a real batch submission system either. Sorry if I offend the saints of ITS by bringing up "lesser" operating systems for the '10, but it's amazing how often I wish I had Galaxy, damnit.

What a paragon of reliability. Speaking of time, could it be that now that it's well past the 20-year mark, Unix is starting to go a little senile? Why, not too long ago, I had a time server lose its clock, and all the other machines on the net happily decided that they, too, were back in 1970. Kind of like your roommate's great aunt who the police keep bringing back from the other side of town 'cause she thinks she still lives in the house where she grew up years and years ago...

"... sometimes, they even boot spontaneously just to let you know they're in top form..." (retch)

based

man 1p at

Anyway, yes, merging cron and at would be a good idea. at should just schedule a one-time cron job.

Prove it.

2038 comes from the UNIX epoch. If you haven't let UNIX timestamps into your system, you won't be affected by this. Too bad UNIX's brain damage has infected almost every operating system out there.

Get lost. You've provided no proof. The bug is using a 32-bit value to hold time, not anything specific to UNIX. Windows has the bug in 2032, Macintosh in the 2080s. You've proved nothing except that you're a shit shill who needs to leave.

Yes, but the base time (t=0) and the tick rate (the frequency at which time increments) were chosen by the designers of UNIX.
For example, Windows timestamps count from 1601-01-01 and increment every 100 ns.
If the people behind UNIX had decided to start the count at 1960, we would have hit this problem in 2028.
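
To make the epoch difference concrete, here's a small sketch (not any OS's actual API) converting a Unix timestamp into Windows-style 100 ns ticks; the 11644473600-second gap between 1601-01-01 and 1970-01-01 is the standard conversion constant:

#include <stdio.h>
#include <stdint.h>

/* Seconds between the Windows epoch (1601-01-01) and the Unix epoch
   (1970-01-01). */
#define EPOCH_DIFF_SECS 11644473600ULL
#define TICKS_PER_SEC   10000000ULL    /* one tick = 100 ns */

uint64_t unix_to_windows_ticks(int64_t unix_secs) {
    return ((uint64_t)unix_secs + EPOCH_DIFF_SECS) * TICKS_PER_SEC;
}

int main(void) {
    /* the second after 32-bit time_t overflows: 2038-01-19 03:14:08 UTC */
    printf("%llu\n", (unsigned long long)unix_to_windows_ticks(2147483648LL));
    return 0;
}

A 64-bit tick count at this resolution doesn't run out until around the year 30828, which is why Windows' native clock isn't the weak link.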

Shut up.

I plan to have already killed myself by valiantly charging into a machine gun in the war against Eastasia by then. But I'm sure we'll have upgraded all of our important devices to use 64-bit integers by then.

Windows isn't a piece of shit like Unix. There's no 2032 bug.

Sure...

20 years is brand new by government standards, though. The Canadian government was still using at least one Multics computer in the year 2000.

32-bit Windows version support will end in 2024

There's 20 years to start rolling out updates; if there's something people want, someone is probably going to try to future-proof it.

Correct. But not every OS has the same time variable (time_t). Most OSes do because of the influence of UNIX and POSIX (for better or worse), but Windows does NOT count its time in seconds past 1970. It will have a time overflow, but much later than 2038; I think it's somewhere closer to 2100 for the Win32 API. Granted, many Windows applications that link to *nix libraries for portability reasons and use time_t will still break.

I maintain deployed software older than 20 years. I'm well aware that a ton of this shit will still be live in 2038. Did you know even Novell NetWare is still widely deployed? There was industry-wide panic a few years ago when they said they'd discontinue support. That shit's a horror show too - pre-internet (for Windows) LAN on Win 3.11, fully custom and proprietary. Next time you get an x-ray at the dentist, remember that Win 3.11 and NetWare are likely powering it.

You're missing the point. Hardware and software SHOULD be capable of lasting indefinitely, to the extent the laws of physics allow. Even if nobody is going to be using a 32-bit machine by 2038, it still showcases our shockingly disposable culture. I have a 70-year-old typewriter and radio, I know a guy with a Model T, and they all work fine like they should. Something made today should, out of principle, last your lifetime, your children's, and your children's children's.

RE user here. Windows software is heavily infected by UNIX timestamps. Some of it is due to cross-contamination with portable code, but most of it is because servers and databases are UNIX and rather than write layers of native type conversions people just roll with whatever they're given. And spare me the 'but if they wrote it correctly' excuses for an imaginary world.

It's easy to say that shit today as you walk around with supercomputers in your pocket you use to share pictures of your balls with, but in 1970, hardware was very limited and very expensive. There were real practical limits to the max size of things.

I really don't think that they will come out with new computers or OSs by 2038.
We're screwed.

We are going to be using Windows, Mac, and Linux for the next 50 years.

There are no OSes that aren't based on UNIX code, because they all died, and that's why no one uses them.

They will. Spoiler: it will be designed to lock you out of your own hardware more securely than possible with Linux.

aren't UNIX epochs a unit of time themselves? What's wrong with entering UNIX epoch 2?

They're considered real-life epochs only to the most autistic of Rick and Morty-tier Redditors

2038 is when 32-bit UNIX time_t overflows. That's basic arithmetic. Do I have to prove addition, subtraction, multiplication, and division too?
en.wikipedia.org/wiki/Unix_time
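
Anyone can check that arithmetic; a minimal sketch, assuming a machine whose time_t is already 64 bits (the comment shows the expected UTC output):

#include <stdio.h>
#include <time.h>

int main(void) {
    time_t t = 2147483647;              /* INT32_MAX seconds since 1970 */
    printf("%s", asctime(gmtime(&t)));  /* Tue Jan 19 03:14:07 2038 */
    return 0;
}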

You shouldn't have to care about how times are stored in the computer. UNIX weenies "have never separated the program from the machine." UNIX programs depend on what should be an implementation detail. There is no 1970 date in the hardware clock itself. That's a software bug and defect in UNIX. They could have made it possible to change the start time of the counter, or used a longer counter the first time they "fixed" it, but they didn't.

Even though UNIX time has a range of only 68 years in either direction, and so was broken from the beginning, it was even worse before: the original clock counted sixtieths of a second in a 32-bit word, so it wrapped in about two and a half years.
I wish UNIX itself had only lasted 2.5 years. What we have now is the "Plan 9" version of UNIX time, an attempt to fix it that still sucks.


What is it with UNIX weenies saying things that are obviously false, like "There are no OSes that aren't based on UNIX code"? Next you're going to tell me that a signed 32-bit count of seconds starting on January 1, 1970 will not overflow in 2038, or that UNIX was the first OS with a hierarchical filesystem.

I've been waiting for an appropriate time to use this quote. This is as good a time as ever.

"Programs are written to be executed by computers rather than read by humans. This complicates program comprehension, which plays a major role in software maintenance. Literate programming is an approach to improve program understanding by regarding programs as works of literature. The authors present a tool that supports literate programming in C++, based on a hypertext system." - abstract to an article in the Jan 1992 issue of the Journal of Object-Oriented Programming

The fundamental design flaw in Unix is the asinine belief that "programs are written to be executed by computers rather than read by humans." [Now that statement may be true in the statistical sense in that it applies to most programs. But it is totally, absolutely wrong in the moral sense.]

That's why we have C -- a language designed to make every machine emulate a PDP-11. That's why we have a file system that forces every file to be viewed as a sequence of bytes (after all, that's what they are, right?). That's why "protocols" depend on byte-order.

They have never separated the program from the machine. It never entered their tiny, pocket-protectored with a calculator-hanging-from-the-belt mind.

Currently typing from a 32-bit ARM device.

The shill cries in pain as he strikes you.

all of them and i don't give a fuck

also, this is for whatever LISP weenie will post ITT: this isn't a real problem, it's just retarded unix bullshit. however i can't imagine what nightmare of a solution they would use to "fix" it

64-bit time encoding

Changing a single variable from an int32 to an int64 is a "nightmare of a solution"???

The gun has already been fired. It's just a matter of time before the bullet reaches the foot. All that can be done today is to stop shooting, but we're still going to bleed come 2038.

The true problem with unix timestamps is that they track UTC, which has leap seconds, so they can be ambiguous: for example, timestamp 915148800 denotes both 1998-12-31T23:59:60 and 1999-01-01T00:00:00.
en.wikipedia.org/wiki/Unix_time#Leap_seconds

It's not ambiguous, it just ignores leap seconds, because time_t is supposed to be a consistently increasing variable. It's the job of the programmer using it to account for leap seconds.
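
The "ignores leap seconds" part is easy to demonstrate; a minimal sketch, assuming timegm (a common BSD/glibc extension; strictly portable code would set TZ=UTC and use mktime):

#include <stdio.h>
#include <time.h>

int main(void) {
    struct tm a = { .tm_year = 98, .tm_mon = 11, .tm_mday = 31 }; /* 1998-12-31 00:00 UTC */
    struct tm b = { .tm_year = 99, .tm_mon = 0,  .tm_mday = 1  }; /* 1999-01-01 00:00 UTC */
    /* A leap second made 1998-12-31 last 86401 real seconds, but POSIX
       time pretends it didn't happen, so this prints exactly 86400. */
    printf("%.0f\n", difftime(timegm(&b), timegm(&a)));
    return 0;
}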

In Common Lisp, the time has no upper bound.

Much better than a buggy 64-bit Intel tbh.

Currently typing from the toilet.
What are you trying to tell us?

This is actually true, and it's one of the best parts of the Lisp paradigm. We really wouldn't have to deal with compatibility retardation over 60-year-old variables.

But in Unix, time has no defined upper bound either. Just change the size of time_t when the need arises and recompile your software.
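
On 32-bit glibc that really is just a recompile, at least once your libc grows the switch (glibc 2.34 added _TIME_BITS=64, which has to be paired with _FILE_OFFSET_BITS=64):

/* time_width.c - build it twice and compare:
     cc time_width.c                                        -> 32-bit time_t
     cc -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 time_width.c -> 64-bit time_t */
#include <stdio.h>
#include <time.h>

int main(void) {
    printf("time_t is %zu bits here\n", sizeof(time_t) * 8);
    return 0;
}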

You're acting like people can just recompile 32-bit kernels for their home routers. You know it's not that simple.

With LISP, the impact on compatibility would certainly be far less severe, however.

Hi John here,
I just got back from 2038 and saw this post.
Only faggots like OP are worried about Y2K38, everyone else uses q-bit computers.
t. J.Titor

Well, I didn't know it was a thing.
I've got a couple of devices, though I don't really see how this y2038 problem will be problematic for private users.
Sure, I guess corps and big institutions will be maximum paranoid about it, like with the Y2K bug, but seriously, you and me - how will this affect us?

But if you read the OP you'd see he wasn't bothered by it at all, since the only 32-bit *nix thing he has is his router. t. OP
The real John Titor wasn't this bad at reading comprehension and context clues tbh

It's worse than Y2K. Y2K crashed some word-processing software and prevented some devices from booting until their BIOS batteries were removed. Y2038 has been known to brick 32-bit Android devices beyond recovery, because some fastboot ROMs sync their kernel time with the OS for logging purposes.

The OP always makes a thread on a subject they are concerned with/bothered about - otherwise there would be no thread. QED.
So, Titor is better at comprehension than you, newfa/g/

I was trying to say that I own a device with a 2038 EOL. It's also a six-year-old phone with Linux 3.0, and it doesn't run anything; it walks Android 7.1 briskly.

Can't decide if I should laugh at phonefags or feel immense hatred for CIAniggers and the Pajeets working for them. I'll just do both.

- Do not use nonfree software.
- The manufacturer should fix this, as this clearly is a bug.


But then you have the additional overhead of arbitrary-size integers.


What stops you from using 64 bit integers on 32 bit machines?

One entire row of bits

This compiles and runs fine on my 32-bit machine:

#include <stdio.h>

int main(void) {
    /* 64-bit arithmetic on a 32-bit machine: the compiler just emits
       register-pair operations for it. */
    unsigned long long x = 1000000000000000ULL;
    printf("%llu\n", x);
    return 0;
}

Of course it works if you free one entire row of bits.

oh yes, it sounds simple until the UNIX weenies start implementing it

case in point
github.com/golang/proposal/blob/master/design/12914-monotonic.md
yes, gophers are UNIX weenies, in case there's any confusion here

It works just as well with "long long" and "%lld". Why don't you try it yourself instead of talking about stuff that you don't know about?

Throw a minus in there if you like.

What does any of that have to do with y2038?

In an age when fucking Rust is being used for systems programming, I don't think it would matter all that much

He's a moron who constantly derails threads for his own gain and needs to be banned.

He triggers faggots like you and I think he deserves to stay tbh

Is that all the reason you have for being an asshole?

I'm the OP of this thread. I think you need to grow thicker skin.

Prove it.

Attached: 121 (1).png (600x1081, 330.64K)

are you retarded? routers are meant to be secure!

based

That's why you update the OS's time functions; all well-behaved software then transparently receives the benefit. Software that does its own datekeeping is broken by design.

They could've made a separate variable to store a year offset. You could still do that, so the kernel and programs would think they're in 1972 but your software would show 2040 and change behaviour depending on the offset value. Though you'd have to make every piece of software do this; at least governments and companies wouldn't have to move off 32-bit, so no hardware changes would be necessary.
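
A toy sketch of that offset idea; OFFSET_YEARS and the 68-year figure are just this post's example (real fixes of this flavour used 28-year shifts, which keep weekdays and leap years aligned):

#include <stdio.h>
#include <time.h>

/* Hypothetical: the kernel clock has been wound back, and the offset is
   added only when formatting time for display. */
#define OFFSET_YEARS 68

int main(void) {
    time_t raw = time(NULL);        /* the wound-back kernel clock */
    struct tm tm = *gmtime(&raw);
    tm.tm_year += OFFSET_YEARS;     /* apply the offset at display time */

    char buf[64];
    strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S", &tm);
    printf("displayed time: %s\n", buf);
    return 0;
}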