Your Compiler is Backdoored

Countering "Trusting Trust"
schneier.com/blog/archives/2006/01/countering_trus.html
web.archive.org/web/19961220000813/acm.org/classics/sep95/ (an archived copy of the dead link at the top of the article)

There have been cases of this in the wild, including a high-profile one only a couple of years ago, but I'd have to dig it up. Most people, even in the security industry, don't know about this type of attack, and almost no one even tries to mitigate it. It's tough enough to get people, even developers, to compile their own code; but now they have to compile it with multiple, possibly unfamiliar compilers and then do extra work beyond that.
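
For anyone who hasn't read Thompson's lecture: the backdoored compiler recognizes two kinds of source, the login program and the compiler itself. Here's a toy sketch of that pattern in C. It is not Thompson's actual code; the real attack buries the same two tests in the compiler binary so they survive recompilation from perfectly clean source.

    #include <stdio.h>
    #include <string.h>

    /* Toy "compiler": it pattern-matches its input and injects extra code. */
    static void compile(const char *src) {
        if (strstr(src, "int login(")) {
            /* test 1: recognize the login program, inject a backdoor */
            puts("/* injected */ if (strcmp(pw, \"magic\") == 0) return 1;");
        }
        if (strstr(src, "static void compile(")) {
            /* test 2: recognize the compiler itself, re-inject both tests */
            puts("/* injected: a copy of these two pattern-match tests */");
        }
        puts(src);  /* then emit the honest translation of the source */
    }

    int main(void) {
        compile("int login(const char *pw) { /* ... */ }");
        return 0;
    }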


There was a RedHat repo compromise about a decade ago where just the GCC packages were targeted. Everyone suspected it was a Thompson backdoor attack.

Even pretending this article has a shred of evidence, couldn't you just make your own compiler, assemble it by hand or with a trusted assembler, use it to compile a modern compiler, and then proceed to compile from that one?

How your compiler may be compromising application security
Researchers at MIT develop a tool to identify code that your compiler may inadvertently remove, creating vulnerabilities
archive.fo/NNele
i.e. the compromised compilers deliberately make software insecure, but with plausible deniability.
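
The canonical example of what that tool flags (function and variable names here are made up for illustration): a bounds check written as pointer-overflow arithmetic is undefined behavior in C, so the optimizer is entitled to assume it never triggers and delete it.

    #include <stddef.h>

    /* The overflow test below is UB when buf + len wraps, so an optimizing
     * compiler may assume it is always false and silently remove the check. */
    int request_fits(char *buf, char *buf_end, size_t len) {
        if (buf + len < buf)          /* intended overflow check, may be deleted */
            return 0;
        return buf + len <= buf_end;  /* the bounds check the programmer wanted */
    }

    int main(void) {
        char buf[16];
        return !request_fits(buf, buf + sizeof buf, 8);
    }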

Maybe that is what zeromq is for, then: someone backdoored the compilers decades ago to insert zeromq/TANGO code into everything. IDK though, and I have no proof.

Fucking genius. That's why these guys are working for the air force, and plebs like you and I are stuck here.

No seriously, a real solution would be to have a series of bootstrapping programs. The first would be a simple assembler written in machine code. The second would be a simple compiler written in assembly. The third would be a compiler for the complete language, written in the simple language. To verify the chain, you would first check that the machine-code binary operates as expected, then that the assembly stage operates as expected, then that the resulting compiler works as expected, as in the sketch below.
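
A minimal sketch of driving that chain. Every tool and file name below is hypothetical; the point is the order of the stages and recording a hash of each artifact so you can compare independent runs.

    #include <stdio.h>
    #include <stdlib.h>

    /* Run one stage of the chain, aborting if it fails. */
    static void run(const char *cmd) {
        printf("+ %s\n", cmd);
        if (system(cmd) != 0) {
            fprintf(stderr, "stage failed: %s\n", cmd);
            exit(1);
        }
    }

    int main(void) {
        run("./stage0_asm stage1.s -o stage1_cc");     /* assembler entered as hand-audited machine code */
        run("./stage1_cc stage2.src -o stage2_cc");    /* simple compiler written in assembly */
        run("./stage2_cc full_cc.c -o full_cc");       /* full compiler written in the simple language */
        run("sha256sum stage1_cc stage2_cc full_cc");  /* record each stage's hash for comparison */
        return 0;
    }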

Actually, there was a blog post I saw on HN that talked about an attack that subverted literally everything: the compiler, assembler, and linker, then Wireshark, so you couldn't see the packets it was sending out, then the router, so you couldn't check with that either. They dragged out an old router from before the botnet and saw the packets being transmitted on it. Then they wondered why the botnet had given itself away in the first place, making them do all this debugging, and they realized it was using them as penetration testers: now it would patch the exploit they found and try again. Spooky.

More on mitigation
dwheeler.com/trusting-trust/
David A. Wheeler’s Page on Fully Countering Trusting Trust through Diverse Double-Compiling (DDC) - Countering Trojan Horse attacks on Compilers
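
The gist of Wheeler's technique: compile the compiler's source with a second, independent compiler, use the result to recompile that same source, and check that the regenerated binary bit-matches the compiler's own self-compilation. Here's a minimal sketch of just the final comparison step, assuming a deterministic build; the file names (and the GCC/TCC pairing) are examples.

    #include <stdio.h>

    int main(void) {
        /* stage2_via_gcc: compiler source rebuilt by itself after a GCC
         * bootstrap; stage2_via_tcc: same source regenerated through an
         * independent TCC bootstrap. These should be bit-identical. */
        FILE *a = fopen("stage2_via_gcc", "rb");
        FILE *b = fopen("stage2_via_tcc", "rb");
        if (!a || !b) {
            perror("fopen");
            return 2;
        }
        long off = 0;
        int ca, cb;
        do {
            ca = fgetc(a);
            cb = fgetc(b);
            if (ca != cb) {
                printf("binaries differ at byte %ld: nondeterminism or subversion\n", off);
                return 1;
            }
            off++;
        } while (ca != EOF);
        puts("stage-2 binaries are bit-identical");
        return 0;
    }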

Anyone have anything more recent than this decade-old paper? I'm learning about mitigating this attack, which has bothered me severely for a long time. Anything on Diverse Double-Compiling being used in practice, or on improved methods?

Why?

1. it would be easier to use fancy tooling
2. they didn't expect things to be subverted so deep
also, how do you know the oscilloscope isn't botnetted too?

Because you can build an oscilloscope with an appropriate sampling rate yourself, dumbass. Then you can either decode the binary data on the line manually or make an SoC to do it that is separate from the scanner part of the scope.

Huh?

You are essentially building a line of trust from the oscilloscope on up to the compiler and so on. Who the fuck is going to go through that effort when they can just write the software themselves for hardware that hasn't ever touched botnet software, or can be nuked of botnet software? Or at least, who would go through that effort except 1337 hax03s for ISIS, or GCHQ, or some other secret service?

Modern CPUs run at billions of hertz. Building such a thing at home is non-trivial.
Modern CPUs run at hundreds of millions of operations per second. Even an operation that takes a tenth of a second is millions of operations, which you propose decoding manually.
If you're using an off-the-shelf SoC, then you can't guarantee it isn't backdoored either. If you're making it at home... good luck.

You are looking at thousands of dollars, probably a lot more, to accurately measure at GHz bandwidths.

You don't have to go through billions of instructions. Only use it for simple programs and a simple OS, which still means many instructions, but you can underclock/undervolt a low-power processor to narrow it down and get repeatability. Start with simple software like that and eventually work up to something like LLVM/Clang or GCC.

Sorry for posting controlled opposition, but I was hoping someone would call me a faggot and post better articles. I think it was good enough to get the thread started, though.

So which compilers to use?


How to implement your techniques?

Literally write your own compiler and stop being a nigger

For this to work you would also need to trust your firmware (leaving you back at stage 1), and your hardware (leaving you somewhere in the late 1970s)

A PDP-11 would probably be a good choice if we're talking about C code, as in the Thompson hack.
With ~12 MHz CPUs, 4 MB RAM, and HDDs in the tens of MBs, you could probably cross-compile a minimal *NIX for other architectures and slowly bootstrap the rest. We still have no way of checking any newer commercial CPUs for backdoors though.

I read that article a long time ago and, if I remember correctly, their solution is basically to use a trusted compiler. The only serious approach to a defense against the Thompson attack I know of is Ada, which gives you pretty massive control over the compiler to make binary audits easier. However, I never really used it, so a more knowledgeable adafag is probably better suited to explain the details.

Stop shilling in literally every thread. Not even the Rustfag is this obnoxious.

Get help.

Stop shilling your shit everywhere.

Seriously, get help.

Great argument. What if I had written my comments in Ada? I would be ok then, right?

OP, a good payload runs completely in memory and never touches the hard drive unless you instruct it to. A good payload leaves as little forensic evidence as possible.

Altering programs on the target machine will eventually bring attention to what the attacker is doing. Here are some thoughts though.

Why not just insert a few lines of code into the source code?

Going to the trouble of backdooring a compiler is not a bad idea, but it suggests you already have code execution on the target machine. Sure, if the target is a developer/maintainer, you can backdoor an entire software repository and its entire userbase.

But if you have code execution on the target machine why not just replace the kernel?

I think what I'm getting at is that hacking can be a lot of work, and there are methodologies that make it easier.

Here's the thing with a backdoored program: there's a cryptographic checksum of the legit program, and people check it, or at least they should.
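
For reference, that check is just hashing the binary and comparing against the published digest. A sketch using OpenSSL's EVP API (assumes OpenSSL is installed; and per the objection further down the thread, a subverted toolchain could lie about this output too):

    #include <stdio.h>
    #include <openssl/evp.h>

    int main(int argc, char **argv) {
        if (argc != 2) {
            fprintf(stderr, "usage: %s FILE\n", argv[0]);
            return 2;
        }
        FILE *f = fopen(argv[1], "rb");
        if (!f) {
            perror("fopen");
            return 2;
        }
        EVP_MD_CTX *ctx = EVP_MD_CTX_new();
        EVP_DigestInit_ex(ctx, EVP_sha256(), NULL);
        unsigned char buf[4096], md[EVP_MAX_MD_SIZE];
        unsigned int mdlen;
        size_t n;
        while ((n = fread(buf, 1, sizeof buf, f)) > 0)
            EVP_DigestUpdate(ctx, buf, n);      /* stream the file into the hash */
        EVP_DigestFinal_ex(ctx, md, &mdlen);
        for (unsigned int i = 0; i < mdlen; i++)
            printf("%02x", md[i]);
        printf("  %s\n", argv[1]);              /* same format as sha256sum */
        EVP_MD_CTX_free(ctx);
        fclose(f);
        return 0;
    }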

Heuristic detection should pick up programs doing what they're not supposed to be doing. Things like svchost running as a regular user on Windows are a dead giveaway.

Anyhow, there's a cool tool for backdooring binaries called Shellter. You can use it to backdoor a Windows binary. What it does is read through about the first 100,000 instructions in the program and insert the raw output of a Metasploit payload.

shellterproject.com/

But really it's a matter of getting your instructions to execute on the target machine.

Linux Mint repositories were hacked a few years back and basically the entire distro was compromised. Not sure how the attacker got in or what kind of tricks he was using.

Attached: matrixback2.jpg (1152x864, 1.22M)

The 1337ness of this.

Attached: ackerman.jpeg (300x168, 10.86K)

You are a faggot and a script kiddy. How do you know this all-encompassing malware didn't change the hash output of *insert hash library here* because the compiler that compiled it was backdoored? You don't. It could change the hash output to be the same, and you would never know the difference short of examining every bit going across the line and checking the difference using a physical tool that also didn't report false output.
Think about this methodology. You hack a compiler to insert a backdoor running in 64 KB of memory that can be accessed over every interface of every OS of that time. You did this decades ago, when no one cared about compiler security or looked for a rogue program running in RAM. From there you just kept the malware up to date with the latest and greatest of everything it could be accessed with, using your own computer that doesn't have that malware on it.

That's decades of computers hacked just because a compiler kept spreading your 64 KB of malware into everything it compiled, from one hack. All you had to do is make sure you can update that malware for a new OS (e.g. OpenBSD), a new architecture (e.g. NEON instructions on ARM), or up-and-coming detection software (e.g. antivirus or Wireshark). That malware is called tango/corba/zeromq and very possibly has been in software for years.

Fascinating topic OP. I'm reading through your paper and all the other articles anons have posted on this now.

No u.
You're the one imagining an attack that requires every single compiler in the world to be compromised in an inter-compatible way.

Look at this glow-in-the-dark.

This kind of hack is another example of the monopolistic mentality, and Bloatware is Proprietary. Weenies want one kernel (Linux), one instruction set (RISC-V), and one compiler (GCC) for everything. Unlike academic researchers or 1970s computer companies, there are no plans for the future beyond that. This monopolistic mentality creates a monoculture that lets these kinds of hacks spread to every computer undetected, with no way to prevent these backdoors in the future. The culture of UNIX, which rewards shoddy hacks, also makes them easier to hide, because any exploit can be considered a "clever" hack, just like ncurses has "clever" memory leaks.

Bloatware is Proprietary allows backdoors to hide in multi-million line programs without anyone noticing that the source code and object code don't match. Backdoors can be hidden in plain sight in the source code itself. Even worse is that shitty "languages" like C are so broken that nobody can tell whether a backdoor was an intentional "optimization" (another respectable CS term these weenies took a big shit on). How would you know if a debugger had a backdoor or just an "honest" bug?


The problem is that the C standards committee doesn't know what C is because it's too poorly specified. It's both underspecified in the sense that basic meaningful operations are undefined for no reason (well, it's because the standards committee was afraid to make anyone rewrite code, even though ANSI C needed a lot of changes anyway) and overspecified in the sense that it is designed specifically for a PDP-11 memory model and is difficult to run on segmented and tagged architectures, preventing 60 years of CS research from being used.


That hack was actually implemented in PDP-11 UNIX, which is just another reason to avoid anything UNIX. "V7" tools are also so shitty that you would be better off making your own OS. They actually give the PDP-11 a bad name, like UNIX does for all hardware it runs on. The weenies who copied the vacuum cleaner slogan "nothing sucks like a VAX" for DEC's computer weren't running VMS.


The solution is simply to have a variety of compilers and architectures based on a variety of real standards, not weenie "standards" based on whatever one OS, compiler, or browser does. This kind of hack wouldn't have been possible in 60s FORTRAN and COBOL simply because there were so many implementations.

Nice theory, but I'm afraid you are too generous. When I was porting a Scheme compiler to the RT, I managed to make adb -- nothing like a fancy, source-level debugger, or anything, just dumb ol' adb -- dump core with about 6 or 7 keystrokes. Not arcane keystrokes, either. Shit you would tend to type at a debugger.

It turned out that the symbol table lookup code had a fencepost error that barfed if a particular symbol happened to be first in the list. The C compiler never did this, so... it's good enough for Unix! Note that the RT had been around for *years* when I found this bug; it wasn't raw software.

The RT implementation of adb also had the interesting feature of incorrectly printing the value of one of the registers (r0). After I had spent a pleasant, relaxing afternoon ineffectively trying to debug my code, and discovered this, I remarked upon it to my RT-hacking friends. They replied, "Oh, yeah. The debugger doesn't print out r0 correctly." In order to use adb, it seems, you just had to know this fact from the grapevine.

I was much amused at the idea of a debugger with large, obvious bugs.

I recently managed to wedge our laserwriter server by queueing up a large number of short files to it. My officemate flamed me for doing so. I replied, which led to the following illuminating interchange:

>> How was I to know the idiot system would vapor-lock on me?
>> Because it's Unix?

And that about sums it up for Unix.

When the revolution comes, and the mob drags Bill Joy screaming from his Ferrari at the factory gates, I, for one, will not be there to urge clemency.

This post is so stupid I'm actually wondering if someone is collecting this fag's quote collection and impersonating him for laughs.

Attached: 0c9d761acffa9893693c283334382aa40047f81eec3b5c45fdf276019d5c9ff4.png (316x342, 98.33K)

No, you're just too stupid to understand it.

Attached: (you).png (635x627, 67.62K)

That's the point of this thread, and you've missed it entirely. Unless the developer is trying to mitigate this attack (he's not), he's likely already been compromised. It's not standard practice yet to compile code with multiple compilers, which is why this threat is so dangerous right now. And what if the software is proprietary? You literally have no recourse of any kind, because you can't compile the code yourself. This means the attack is likely extremely widespread, and it makes the checksum worthless for detecting it. How the fuck did you not get that all this was about needing more than checksums?

What happens if the developer of the compiler is a spook, or is himself compromised? Everyone using that compiler will be unknowingly aiding the spooks. For whatever reason, you're thinking in terms of targeted attacks against random nobodies, which literally never happen. The attacks that concern us are those trying to reach millions of people.

AV software works mainly on virus signatures and is notoriously bad at anything else. If the malware has not already been detected by an actual person so its signature could be added to the AV database, the infection will likely never be detected.
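
A toy of what signature matching amounts to: scanning bytes for a known pattern. The 4-byte signature here is made up for illustration; real engines layer wildcards, emulation, and unpacking on top of this.

    #include <stdio.h>
    #include <string.h>

    /* Made-up 4-byte "signature" standing in for a real virus pattern. */
    static const unsigned char sig[] = { 0xde, 0xad, 0xbe, 0xef };

    /* Naive scan: report whether the signature occurs anywhere in buf. */
    static int has_signature(const unsigned char *buf, size_t len) {
        for (size_t i = 0; i + sizeof sig <= len; i++)
            if (memcmp(buf + i, sig, sizeof sig) == 0)
                return 1;
        return 0;
    }

    int main(void) {
        unsigned char clean[]    = { 0x01, 0x02, 0x03, 0x04, 0x05 };
        unsigned char infected[] = { 0x01, 0xde, 0xad, 0xbe, 0xef, 0x02 };
        printf("clean: %d, infected: %d\n",
               has_signature(clean, sizeof clean),
               has_signature(infected, sizeof infected));
        return 0;
    }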

It's fairly common for software targeting multiple platforms, but that's usually limited to GCC, LLVM, and (((MSVC))). Smaller compilers like TCC and QBE are less likely to be compromised but there's a performance tradeoff.

And it's bullshit, because binaries will calculate the checksum correctly, any program compiled before the attack will checksum correctly, and because it's impossible to compromise all possible implementations of a certain function in a consistent way.
The entire thread is bullshit.

Figured easily.
(((it glows)))

How was the router even subverted?
I can see how Wireshark could be subverted if it's on the same host, but if you have a logging router, that's where you can check, unless the malware is capable of checking the MAC vendor and suppressing the leak on the vendors suspected of modified firmware.
also that is some spooky shit considering how every communications commission around the world literally does stuff like this
remember the guys at defcon who hacked sim cards? they're gone now
they wouldn't want you looking at the shit inside, so they put up this whole organized ecosystem for ID-ing all citizens and technologies, with beacons for the human goyim
not even talking about jew bullshit here; the entirety of mankind was possibly only meant for gold mining and nothing else, planted originally by alien archons because somehow they think earth is dirty and full of diseases they might catch, while also being a spiritual energy loosh farm, because spirits don't just vanish - they travel like any other particle
now you know why life is so frail and shit here
bioweapon makers are based

Your DNA is backdoored.

muh trusting trust isn't a groundbreaking attack. it's already an obvious consequence of running malicious code
then why have i heard about it every fucking day for the last 15 years?

How does having reproducible builds close this vulnerability? I am a brainlet, so no bully, but aren't the compilers themselves compiled, potentially by another compromised compiler? Wouldn't the only solution be to have a chain of custody that deblobs everything to do with the compilers upstream?

You have the original person who bootstrapped the compiler tell you what the hash should be. Now your problem is making sure your hardware can be trusted.

just learn to compile in your head and write down the hex code bro

...

what do you even use to compile the compiler? now they just do it with gcc, but they couldn't do that for the first version

You use an existing compiler. Generally the first compiler is written in an existing language. Once you have a working compiler, you can write a new compiler in the new language, and compile it with the old compiler.

See for example the history of T, which recounts the first version being written in maclisp: paulgraham.com/thist.html

Worst comes to worst, you could always write an assembler in machine code, then a C compiler in assembly, and go from there.
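
To get a feel for the machine-code end of that chain, here's a hand-assembled fragment (mov eax, 42; ret) executed from a buffer. This assumes x86-64 Linux; hardened systems may refuse the writable+executable mapping.

    #include <stdio.h>
    #include <string.h>
    #include <sys/mman.h>

    int main(void) {
        /* Hand-assembled x86-64: b8 2a 00 00 00 = mov eax, 42 ; c3 = ret */
        unsigned char code[] = { 0xb8, 0x2a, 0x00, 0x00, 0x00, 0xc3 };
        void *buf = mmap(NULL, sizeof code, PROT_READ | PROT_WRITE | PROT_EXEC,
                         MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
        if (buf == MAP_FAILED) {
            perror("mmap");
            return 1;
        }
        memcpy(buf, code, sizeof code);          /* load our bytes into the page */
        int (*fn)(void) = (int (*)(void))buf;    /* treat the page as a function */
        printf("hand-assembled code returned %d\n", fn());
        return 0;
    }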

...

The vulnerabilities are in C itself. You don't even need a backdoor when you have a language where the most basic standard functions are exploitable.

C is as exploitable as the programmer is retarded. But if you're so afraid, write a Rust compiler instead, or whatever meme language of the week it is.

So you're saying everyone who has ever used C is retarded, especially its creators. Why would I want to use a language created by retards?

You have never written a nontrivial correct C program.

Calm down descartes. Remember your cogitos.

uhhh why not just compile it once with the trusted compiler?

At some point you have to trust something; determining who/what you're placing that trust in, and what vulnerabilities/attack vectors could abuse that trust, is very important.

bumping for interest

Maybe I misunderstand the paper, but for this to work, doesn't your payload first have to spread to most compilers in use? Until most people's main compiler is actually compromised, there's a pretty big chance of being detected by programs from compilers that haven't been subverted yet. For example, people who go ages without updating their compiler.

If this attack had really happened, it must have taken over the whole planet a long time ago, otherwise somebody would have detected it. But how come there's been no damage done by such a wide-scale subversion: no money missing, no networks mysteriously compromised? If you pwned the whole world, would you really not do something big with it?

As others have pointed out, it's technically possible to write your own compilers from scratch and researchers actually do it. Some of them could have stumbled on the virus by accident. News of such widespread infection would be huge so we would all hear about it. Especially people working on low level stuff in esoteric systems where the payload isn't very compatible are likely to notice it. And in any case, doesn't the idea rely on some hacker decades ago coming up with a compiler virus that never screws up and manages to stay compatible with all new technology? Seems a bit much.

If one person tried this attack then they couldn't be the only one. The virus could subvert your software to prevent detection of itself, but would it also prevent detection of a rival virus? How would the programmer even know about the rival virus? What if the rival virus comes out after he started his infection? Just generally this seems like a very long term attack where a lot could go wrong and lead to detection. If it's such a great attack a lot of people would be doing it. Somebody would eventually screw up and get caught. Then security researchers would go hunting it in the wild. It doesn't sound that hard to detect if you know what to look for and if the virus hasn't completely infected every compiler out there yet. Plus you would expect there to be multiple viruses so you're implying they somehow maintain a perfect pact of silence with each other. How would they even communicate? It's not like there's a virus writer conference all hackers go to where everybody announces their latest virus.

What the fuck are you talking about, everyone knows this, even fucking winfags, even a whole bunch of non-programmers know.

You linked to people who were working on how to mitigate it...

No they don't as your solution isn't implemented yet.

I don't get your stupid whiny bitch post. Is this a very elaborate shill to turn nerds into defeatist console faggots doing mundane smalltalk on reddit and counting upvotes?

...and whoever can hack gcc devs/repos can probably also hack my desktop environment's, which also gets them root once i reboot.

Still something worth looking into.
Another mitigation method is one invented (or made popular?) by the bitcoin community and now also available for the Tor Browser: deterministic compilation.
Meaning independent builders compile the same source and compare hashes, and if the hashes all match then you can start distributing what you got.

based

tl;dr:
What makes you think that if FORTRAN and COBOL were still useful programming languages that hadn't died out, they wouldn't be as monopolized as the current ones are?

And also, a few years ago Apple was shilling their shitty Clang compiler everywhere, so it's not like there is only one.
There's also the trusted, reliable and verified Microsoft® Visual Studio™ and Borland C++ (if that's still alive).

It's also not like everyone compiles with the very same version of GCC, so idk how well that attack even scales.
Imho there are way worse issues, like how Linus did nothing but talk shit when the deep state basically killed PaX/grsecurity for normal users.
Or how Linus insists that broken SHA-1 is fine for git because, as he says, nobody puts binary data (images, etc.) in git repositories.

[continued]
In fact I think OP's "problem" might not even be an issue if Linus got his shit together.

Git commits can be PGP-signed, so if the latest commit is signed by the NSA instead of GCC then I know I shouldn't compile it.
With SHA-1, however, the NSA can modify any commit and keep a valid tree, because git signatures sign the hash, not the diff.
All you get is some binary data in the modified commit; otherwise the bruteforce would be too expensive.
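
Concretely, "signing the hash" means the signature covers a SHA-1 object id, which is just SHA-1 over a short header plus the content. A sketch that recomputes a blob id the way git does (assumes OpenSSL; the output should match `echo hello | git hash-object --stdin`):

    #include <stdio.h>
    #include <string.h>
    #include <openssl/evp.h>

    int main(void) {
        const char *content = "hello\n";
        char header[64];
        /* git object id = SHA-1("<type> <size>\0" + content) */
        int hlen = snprintf(header, sizeof header, "blob %zu", strlen(content));
        unsigned char md[EVP_MAX_MD_SIZE];
        unsigned int mdlen;
        EVP_MD_CTX *ctx = EVP_MD_CTX_new();
        EVP_DigestInit_ex(ctx, EVP_sha1(), NULL);
        EVP_DigestUpdate(ctx, header, hlen + 1);   /* +1 keeps the NUL byte */
        EVP_DigestUpdate(ctx, content, strlen(content));
        EVP_DigestFinal_ex(ctx, md, &mdlen);
        for (unsigned int i = 0; i < mdlen; i++)
            printf("%02x", md[i]);
        putchar('\n');
        EVP_MD_CTX_free(ctx);
        return 0;
    }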

Also: The demoscene will notice when their 4k files are 400k.

It's a cautionary tale about trusting your compiler/chip architecture/OS. It's not referring to any contemporary attack of this nature. Its purpose is to get you to think about this kind of attack.

then you just have to trust that only the right person has the key. the nsa could have a copy of it and you would never know

Yes, this is all nothing new. Snowden brought those thoughts into mainstream culture.
All sorts of obscure attacks are possible.
OpenVAS told me to firewall certain types of ping packets, because people could get my exact system time, which makes it easier to bruteforce pseudorandom numbers.

I still don't understand what you want to tell us.

For people who are interested in it.

A processor is not a trusted black box for running code; on the contrary, modern x86 chips are packed full of secret instructions and hardware bugs. In this talk, we'll demonstrate how page fault analysis and some creative processor fuzzing can be used to exhaustively search the x86 instruction set and uncover the secrets buried in your chipset.

youtube.com/watch?v=KrksBdWcZgQ

We could all go back to 68k...

Of course it's possible, but people just don't care. People are too addicted to their high-performance processors to downgrade to lower-speed chips. I've got a Lemote Yeeloong 8101B for the sake of my freedom and a Thinkpad X240 as my low-priority shitposting machine.

bumpadem

Terry solved this backdoored-compiler problem.
You simply roll your own. You write your own compiler, and your own OS.
Problem solved. QED.

The trusting trust problem also includes the possibility of the processor itself being malicious to the owner. This scenario is referenced in the paper.

Which is probably why Terry never implemented networking.
Building your own CPU is tough though; sub-micron lithography isn't as cheap and easy as 3D printing.
Universities and others do have lithographic systems that can be used by students, however.
They are usually terribly outdated, but they are probably a more secure option than multi-project wafers, although the latter is still better than buying mass-produced crap.
en.wikipedia.org/wiki/Multi-project_wafer_service

Terry didn't implement networking because God told him not to do it. God also told him to use 16 colors and a very low screen resolution.

It really is a UNIX and C problem. POSIX had 1100 "functions" plus dozens of commands and tons of other bullshit. It's a random pile of shit AT&T got standardized in order to collect licensing fees. It was successful at that purpose, since all sorts of non-UNIX companies like DEC and IBM started selling UNIX to comply with the "standard," but it sucked for users. With a good OS or programming language, you can read the history and papers and discover why they chose to do things one way over another, but with UNIX all the reasons are brain damaged bullshit like putting binaries with the home directories in /usr because /bin ran out of space. What really sucks about UNIX is that most of the bad things in UNIX were originally good things in another OS, but the UNIX implementation is so bad that even non-weenies think the whole idea is bad. It's not just OOP or binary files. I learned from the Microsoft article on "fork" that it was originally from Project Genie, which did it better. The one thing I thought was a real innovation of UNIX (albeit a bad one) turned out to be yet another misimplementation of something in a better OS.

That's another problem with UNIX and the "minimalism" philosophy. Everything is less flexible than the operating system it was originally from. It used to be that new versions of something were more flexible and more powerful. Now things get less powerful and remove features but still get more bloated.


Because the languages were simpler.

If a vendor decides to do something about the crass inadequacies of UNIX we should give them three cheers, not start a flame war about how the DIRECTORY command *must* forever and ever be called ls because that is what the great tin pot Gods who wrote UNIX thought was a nice, clear name for it.

The most threatening thing I see in computing today is the "we have found the answer, all heretics will perish" attitude. I have an awful lot of experience in computing, I have used six or seven operating systems and I have even written one. UNIX in my view is an abomination, it has serious difficulties, these could have been fixed quite easily, but I now realize nobody ever will.

At the moment I use a VMS box, I do so because I find that I do not spend my time having to think in the "UNIX" mentality that centers around kludges. I do not have to tolerate a help system that begins its insults of the user by being invoked with "man".

Apollo in my view were the only UNIX vendor to realize that they had to put work into the basic operating system. They had ACLs, shared libraries and many other essential features five years ago.

What I find disgusting about UNIX is that it has *never* grown any operating system extensions of its own, all the creative work is derived from VMS, Multics and the operating systems it killed.

Why wouldn't you put system resources for unix in unix system resources? You seem to be the one with braindamage.

Use 7400 series TTL or CPUs that have been extensively reverse-engineered like the 6502.

What are you going to do, post to Zig Forums from an NES? You couldn't really do anything contemporary with an old processor like that, but I'm not saying demoscene stuff isn't cool and technically challenging.

I'm sure there are some autists out there using a text browser to post from an NES.
And yes, you can do things with it; you can play games on it, for example. What you mean to say is that your selection of modern-ish software to use on it would be extremely limited, since most programs are bloated and unoptimized pieces of shit.

Do you not know what the word 'contemporary' means? Pull your head out of your ass.

bumpy

HAPAS ARE SUPERIOR TO WHITES

classic

I can post with Seamonkey on OpenBSD 6.4 and a Thinkpad X21, it's not even slow or anything.

Whatcha sliding mordecai?

Wow. Just wow.

I smell some satanic fuckery here.

Heil Israel

Masons, Masons everywhere...

This is a Q Boomer psy-ops.

this is completely disconnected from unix or c, you schizo; it affects everything
take meds

Those kinds of bugs predate UNIX, you MIT reject.

multicians.org/security.html

Because that's not what it means. It's revisionist bullshit to pretend /usr wasn't really a disk for user home directories. One of the things I hate about UNIX culture is that they shit on real history and hate when people know the truth. It's not enough that they made a shit OS and a shit language; they have to pretend better ones didn't actually do what they really did. That's why they say the hierarchical file system came from UNIX and not Multics: crediting Multics would get people looking into it and comparing it with UNIX. They made up this myth that Multics died in the 60s and didn't do anything besides influence UNIX.

unix.org/what_is_unix/history_timeline.html
That's all bullshit, and that's the official UNIX site. For some reason, only UNIX weenies do this, then they say truth and historical accuracy don't mean anything when you call it out. Another thing they do is only talk about popularity and how widespread UNIX is. Never once do they mention technical aspects, like how some innovation in C or UNIX improved programming or made things easier or more reliable. Their whole "argument" is that UNIX is "good" because it's popular and replaced operating systems and languages that were originally used for the same things, then they complain when anyone wants to replace C and UNIX with something better.


Monopolies and monocultures are bad for security, but especially when they're based on an insecure OS written in a known bad language. Computer companies used to have multiple hardware systems each with multiple operating systems. It would be a lot harder to "take down the Internet" when people aren't all running the same thing.


I posted that myself once to show that the "Thompson" hack was before Ken Thompson. That's why I mentioned multiple compilers and FORTRAN and COBOL. Real standards are about having multiple implementations and not designing a language that would prevent hardware innovation. Weenie "standards" are about making multiple implementations harder and forcing everyone to use one single implementation. It's happening for Linux and Chrome.

Raise your hand if you remember when file systems had version numbers. Don't. The paranoiac weenies in charge of Unix proselytizing will shoot you dead. They don't like people who know the truth.

Heck, I remember when the filesystem was mapped into the address space! I even re

based


based

There's nothing better.

based


based


unbased and LARP