Webcam Paranoia

Everywhere I go, someone has their webcam covered up on their laptop or phone, and they can't give any legitimate reason for doing it. They always say it's so nobody spies on them, but they can never explain to me how anyone would actually manage that besides some kind of magic.
They don't even go to the trouble of covering up their microphone, which is arguably more damning than the webcam. When you're using your device, you probably wear the exact same expressionless face the whole time. If you're dumb enough to do something visible that people can use against you, like stripping naked and taking pictures of yourself, then that's on you; don't blame hackers when they inevitably leak. Having the microphone exposed is vastly more exploitable, since any form of auditory information, such as passwords and other personal or incriminating details passing through the room you're in, is worlds more interesting than your blank face. And if someone really wanted to know more about you, they'd just install a keylogger or a screen-capture tool; they couldn't give less of a fuck about what you actually look like.
Now we're at the point where companies are actually trying to spin disabling webcams into a feature we should pay more for:
invidio.us/watch?v=1BiPVQD0EoU
My question is: why the fuck are normies so concerned about the ebil boogeymen getting a quick peek at their face? It's all placebo in the long run, and I cannot stand it when people think they're safe just because of a piece of tape. Either go all the way or don't fucking try at all.

Attached: DISABLE CAMERAS BLYAT.jpg (929x960, 61.4K)

Other urls found in this thread:

duckduckgo.com/html?q=principle spies on students through laptop
slashgear.com/samsung-smart-tv-a-spy-in-the-living-room-as-webcam-hack-revealed-02292655/
newscientist.com/article/dn9059-invention-apples-all-seeing-screen/
twitter.com/AnonBabble

Why does it have to be magic? Is a remote code execution exploit magic?
Sage for fedposting. NSA spooks confirmed for spying on people through their webcams.
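
To make the "not magic" part concrete: once arbitrary code is running on the machine (via an RCE, a trojan, whatever), grabbing a webcam frame is a handful of lines. A minimal sketch, assuming Python with the opencv-python package installed and a default camera at index 0; purely illustrative, not any particular malware's code:

# Minimal illustration: once code runs locally, the camera is just another device.
# Assumes opencv-python is installed and the default camera sits at index 0.
import cv2

cap = cv2.VideoCapture(0)            # open the first video device
if not cap.isOpened():
    raise SystemExit("no camera found (or it's disabled at the BIOS/driver level)")

ok, frame = cap.read()               # grab a single frame
cap.release()

if ok:
    cv2.imwrite("frame.jpg", frame)  # tape over the lens and this is just a black frame
    print("captured frame of shape", frame.shape)
else:
    print("capture failed")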

If you ain't ever using it but aren't gonna remove it either, then taping it up is an "it just werkz" solution.

People don't understand the technical details of how technology works. Covering the camera is a low cost and highly reliable solution to spying using the camera. Covering the microphone is also a good solution.

It's too bad that they're ignorant, because they should also realize that non-free software is spying on them.

I don't know
duckduckgo.com/html?q=principle spies on students through laptop
slashgear.com/samsung-smart-tv-a-spy-in-the-living-room-as-webcam-hack-revealed-02292655/

One can scan for certain open ports to view people's home/business cams. Whatever. Appliances have shipped with these ports open out of the box, or by default. Likewise, smart TVs had (and still have) mics open and cams wide open. Even putting a passphrase on these was often useless; a perfectly adequate passphrase protected you about as well as "qwerty" did.
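
For what it's worth, the scanning part really is that simple. A rough sketch in Python of checking whether a few ports commonly exposed by cheap IP cameras and smart TVs are open on a device you own; the address and the port list here are illustrative assumptions, not a survey of real hardware:

# Check a device you own for a few commonly exposed ports (80/8080: web UI, 554: RTSP).
import socket

HOST = "192.168.1.50"      # hypothetical address of your own appliance
PORTS = [80, 554, 8080]    # illustrative, not exhaustive

for port in PORTS:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(1.0)
        is_open = s.connect_ex((HOST, port)) == 0   # connect_ex returns 0 on success
        print(f"{HOST}:{port} is", "open" if is_open else "closed/filtered")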


No doubt a simple answer is that 'they' cover their webcams too.

In certain products and circumstances, even when the option was there and people turned off their webcam and mic, the exploits were still active and fully open, as easy as connecting to a person's IP address.

When taping a webcam, a good idea is to put a piece of paper over the lens and then tape over that, to keep glue from getting on the lens. I have done work for pretty well-off people whose businesses could be compromised just as easily: a passphrase, or an email/username/password written on a piece of paper, visible for a moment and caught through a flaw or by spyware/a virus designed to do exactly that.

i only disabled them in the bios on the only device that has them. maybe the nsa can still access them but i dont really care. they would get disgusted and bored quickly.

Metasploit.

Attached: cubietech.png (512x512, 24.47K)

There's a photo of Zuckerberg in his office with his webcam taped over. That's a good enough reason for me.

The CIA spooks disable Intel ME on their own systems. Why don't they just leave it open like everyone else? Really makes you think.

After those lostboy.exe threads I can't imagine leaving a webcam uncovered on any of my laptops.

op is gay ah

Nice goalpost moving, fag.

Why are you even unhappy? The sheep are doing a positive thing. Why don't you encourage them to cover up their microphones too, and explain why while you're at it?

their smartphone is the real spying device. didn't google even openly admit that they have actual people listening to what people say near their phones?

what

OP, you fucking idiot,
you should encourage the practice.

Remember how Ubuntu sent your search results to Amazon, freetard?
With Jews, you lose. Independent of whether you have access to the source code.

People don't want slow shit.

Post pic pls.

That's my question too. How do you cover them? Won't they still hear everything, or most of it? Wouldn't it be more reasonable to unplug them?

ubuntu isnt free software. a real "freetard" would use only a fsf approved distro and ubuntu isnt one of them.

Shut up freetard. No one wants to hear your whining.

They're not people. They're normies who love botnet. Anyone who doesn't want botnet loves slow shit.

actually had a question about this. I had to get a new laptop and it has a camera on it. I have it covered and disabled but I don't trust that.

What I'd like to do is wire a switch into the ground line of the camera, microphone(s), and speaker so that I could actually cut power to those things.

Is there any reasonable way a regular person could do that? I have a cheap soldering iron and that's all, I doubt I'd be able to somehow solder leads to a switch inline with the small wires on a ribbon cable or connector.
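
Not the clean power cut you're after, but if the camera is attached internally over USB (most laptop webcams are), Linux lets you "deauthorize" the device through sysfs, which logically unplugs it until you re-enable it or reboot. A hedged sketch: the "cam" substring match is an assumption about how your device names itself, it needs root, it won't stop an attacker who already has root (unlike your switch), and it does nothing for non-USB mics or speakers:

# Logically unplug any USB device whose product string looks like a camera.
# Reversible: write "1" back to the same file (or reboot) to re-enable it.
import glob, pathlib

for dev in glob.glob("/sys/bus/usb/devices/*"):
    prod = pathlib.Path(dev, "product")
    auth = pathlib.Path(dev, "authorized")
    if not (prod.exists() and auth.exists()):
        continue
    name = prod.read_text().strip()
    if "cam" in name.lower():          # crude match; check lsusb for your hardware's name
        print("deauthorizing", dev, "->", name)
        auth.write_text("0")           # requires root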

What a retarded statement. Just because you can't have something that's both free of spyware and fast doesn't mean you have to lie to yourself and pretend you like slow shit.

Slow shit means the code can't be bloated trash, so that's one reason I like it. The other reason is I grew up on 8-bit computers, so it's comfy. Besides, I don't like GUIs and don't care about games past the early 90's.

Attached: 2.png (384x272, 5.56K)

You stupid fuck.
The Five Eyes have had the "Optic Nerve" program for some time now.
Even when a computer is turned off but left connected via ethernet or wifi, the NSA and other government processor back doors allow constant surveillance.
Don't believe me?
Just google it, or whatever you search with.
Zuckerberg has been seen in a photo with tape over his cam and mic.
With new laws, lower-level agencies will get access to this software, like they already do with "Stingray" devices for capturing cellular SMS and calls.

Attached: download.jpeg (290x174, 12.07K)

Remote code execution vulnerability in VLC remains unpatched

Same. You can have a quality, fully functional X11 environment in under 80 MB of memory used (which might arguably still be too much, but I like desktop environments with comfy colors). I have such a setup on my cubox. Neat machine. Contrary to what I grew up with, none of this is slow. You just have to cut out the garbage made by bad programmers; since every soy manlet goes into tech now, there's just a ton more garbage. There are still people out there writing programs who know what they're doing, though.


In your never-ending chase for basically meaningless higher numbers, ask yourself: "What do I actually gain?" Often less than you think.


k. Now put your pants on again.

You realize the SCREENS themselves have been camera lenses since 2006.

I'm surprised more people on these forums don't talk about that fact. I guess most of you don't know.

newscientist.com/article/dn9059-invention-apples-all-seeing-screen/

I'd rather not sell my eye movements to Amazon so they can place ads where I look most often.

The screen is a camera lens. That's why your eyes appear focused on your eyes when you look at yourself any time you use the camera. If you look at the camera lens, it appears as if you're looking ABOVE the camera. That's because you are looking above the camera. The screen is the camera.

The lens on top is just there to hide the fact that your screen is a camera lens.

You're fucking clueless. Even a cursory glance at your computer/display connection can tell you whether that technology is being used.

Spoiler: it's fucking not.

80 MB is a lot though. I used to run XFree86 on a 486DX/33 with 8 MB RAM, and it ran fine there. Could even use Netscape 3 on it, but I considered that bloated back then, so I mostly used Lynx and Mosaic. And actually that computer originally only had 4 MB RAM, and there I sometimes ran a trimmed-down version of X called TinyX (but no Netscape, of course). But mostly I got used to the console, which at the time could do graphics via SVGAlib, and that was plenty good enough for an image viewer, a PostScript viewer, and various games (even Doom, but it needed 4 MB for itself, so it was only realistic on the 8 MB machine).
But even that was bloated compared to my old Amiga 500 that ran Workbench fine with just 512 KB RAM (and actually the older Amiga 1000 could even do it in 256 KB). But hey, even if we say 1 MB, which became the OCS Amiga "standard configuration" (given that trapdoor 512K RAM expansion thing), you're still looking at 1/8th the memory for a full-featured windowing system (equivalent to X + desktop environment, not just barebones X). And its execution was flawless too: no jumpy mouse pointer even under load, none of the modern problems like tearing and whatnot. Instead you got features like being able to use virtual screens of different resolutions at the same time. But now the modern bloated Xorg wants to keep me stuck at the same resolution I booted my ARM SBC into; I can't even switch to 640x480 (which my monitor supports) in order to play some Doom without needing to scale up the graphics (and thus waste CPU). Instead it wants to use all its shitty modern, complicated, bloated, inefficient GPU code to do the scaling at a fixed monitor resolution, which is why games and emulators are now using OpenGL when they're just doing 2D operations. But someone has to *support* this functionality via drivers, and it's a huge mess and a security nightmare, as pointed out in the old CCC talk 30c3-5499-en-X_Security_h264-iprod.mp4 (search for it). So their "solution" is now to abandon X entirely and create a new project, Wayland, but that too will get complicated drivers and extensions with security bugs. And it's because these cunts don't like simplicity; they just love to make complicated designs. It makes them feel real smart to take something simple and turn it into a disaster (and that goes for other modern projects like systemd and the web in general).
I've got a Nintendo DS Lite with 4 MB RAM that can run Inferno just fine, with plenty of memory to spare (it really just needs 1 MB, so 4 is more than enough). Now *that* is what I don't call bloated. Everything else modern is. And the price you'll pay is the dependency on botnet hardware needed to run the bloated modern junk code. People don't get this, they just think it'll all magically work itself out in the end, that somehow the magical hardware fairy will show up and fix the botnet. That will never happen of course, instead it will just keep getting worse.

Attached: igothonks.jpg (546x546, 45.77K)

Heh, yeah, 80 MB of RAM is a lot, but in a way that's the trade-off for some flexibility. There's sadly no other realistic way to get there. That machine has 2 GB and I am willing to accept it. I have to honestly say that optimizing memory consumption wasn't even a priority; it just kind of settled there with the programs I use.

I started off with the Amiga 2000, bought it first thing in ~1987 when it became available; I bet many people reading this weren't even alive or were still shitting in their diapers then. Had it for many, many years. Got a hardware MPEG-1 decoder for watching Video CDs, an accelerator, and even a graphics card. I ran MacOS on it later to keep new software coming for a while longer, which worked incredibly well because, contrary to AmigaOS software, MacOS apps were written to strict guidelines with little direct hardware-poking and bare-metal coding. You could run the OSes concurrently. I remember playing that Star Trek TNG adventure game on it, voice acting and everything. I still have that computer and it still works; I did some repairs over the years on the power supply and such, typical aging stuff. I've barely touched it in the last few years after my last SCSI drive died.

The graphics cards were interesting. Since nobody made integrated graphics for the tiny Amiga market, the solution was to fake an ISA or PCI bus onboard the graphics card and communicate with the graphics chip through a "window" into it from Zorro. It was by no means perfect, and many implementations were buggy and/or slow, but better than nothing. Sadly AmigaOS had no support for RTG. Commodore promised it for a long time but it never came, so graphics cards needed OS hacking, a bit like the custom-hacked kernels for some ARM SoCs, although on a much smaller and much more competent scale. I also had an early 24-bit video card that went into the video slot. It gave you a 24-bit paint program and image viewer and worked cleverly by basically overlaying the picture over the Amiga's video signal; this all happened in the world of analog video, so the computer itself wasn't really "aware" of it. A bit like the Video Toaster, but not quite. There was an incredible amount of know-how in the Amiga hardware market, especially considering how limited the resources were. I cannot stress that enough. Contrary to the cleverness in the Amiga market, PC development thrived on the merits of capitalism and winner-takes-all strategies; problems were brute-forced with ungodly amounts of man-hours and by just throwing faster and faster silicon at them. That's exactly what you still see today. The big problem is that the silicon has stopped getting faster in ways that really matter, and apparently we also acquired technical debt by making it so fast so quickly to begin with.

The Amiga Workbench was by no means perfect though, and Commodore promised a lot and ended up not delivering on it. Also, oh god, the bugs. Both in hardware and in software. The Zorro II bus on my Amiga only got somewhat stable when I learned more, acquired an oscilloscope, and started adding and removing resistors while holding my tongue at the right angle. You can't imagine that today. AmigaOS also just needed to be rebooted sometimes. That was a fact of life. No way around it. Eventually people started hacking the OS to fix annoyances. I think they still do. As bloated as the Linux landscape sometimes is, I have to say that compared to that stuff back then, it is stable, and you can usually rely on things simply working, and that is incredibly enticing.

X is a strange animal in that the farther you go back in time on it, the better thought out the software and the hardware utilization is. In my personal opinion that's not the fault of X, it's simply just that the quality of the work has gotten worse. Wayland will have exactly the same problems, no matter what is claimed now, because it's made by the same people who cause this.

I always think that if a community of OS and hardware hackers would focus on something simple like one specific, good ARM SoC, and one only, and write something entirely customized for it without even thinking about compatibility or support for anything else, we could achieve amazing things. Sadly, there just isn't any money in that, and the current crop of programmers would get too busy hunting titles and social clout inside and outside the project, and less busy actually doing something. They're mainly about marketing and appearances; they're not academics.

What a trip down memory lane, sorry everyone for the wall of text.

Kek. That's some next level autism. Good job.

Attached: _.png (1129x971, 2.13M)

Just use a desktop. No mic or camera unless you put one there. Also, you may want to remove the beeper/speaker, as it can be retasked as a microphone.

I never had any problems with the Zorro bus on my A500, but I actually bought it in 1992, so it was a later-revision motherboard and they had probably fixed any problems by then. The only major upgrades I did were a GVP SCSI HD+RAM sidecar, and installing the Kickstart 2 ROMs and Workbench 2. All that went really smoothly, and I never saw any reason to upgrade to WB 3 or get better graphics. I was satisfied with the OCS, even though PC VGA had outclassed it by that point (and actually I have a great fondness for the low-res old pixel art, and don't much like the high-res renderings and 3D models that became the norm after the early 90's). The only reason I moved to PC is that Commodore went bankrupt, and almost every software company immediately moved on to greener pa$tures. In retrospect, I should have just bought a 68030 accelerator card, or better yet the A3000 (16 MHz) some old guy had put in the local paper ads for like $700. I would have ended up using that a lot longer than the 486, which even with Linux became outclassed fairly quickly. So that's my biggest regret in the tech department. My life could have been drastically different if I'd managed to stay away from all this insane open source stuff. At first, Linux seemed like a good idea, but it wasn't long before the bloat came in the form of desktop environments and all kinds of extensions and bloat in X. And now when I look at my package lists, there are endless libraries and things where I don't even know wtf they're doing there. And much of this requires constant upgrading and attention. When the compiler changes, some old code needs to be updated, and ditto with libraries. So the open source world is never a stable target, but more like a barely managed chaos. And the security problems are never-ending (something I never really had to care about on the Amiga).
As far as the SBC thing goes, I think people have picked the Raspberry Pi, but there aren't many OS projects being done for it, except RISC OS and maybe AROS (but I think that's just "hosted" on Linux, so it doesn't really count). Of course there are the typical Linux, BSD, Minix, but nothing terribly exciting there... They all suffer from the open source problems. They all depend on complicated compilers, libraries, and whatnot that are constantly in a state of flux. It's not manageable, and there's no real future in this, because the security problems are only going to get worse. It's a house of cards built on top of a shaky foundation (hardware included).
So at this point, the solution isn't to repeat the same mistake and do a big open source project, but either very small dedicated teams, or just going solo like Terry Davis. That also means you can focus on the parts you're interested in. I don't particularly care about high-res graphics or GUIs, for example. And I don't need advanced audio either, since I mostly just listen to tracker and MIDI music. And I don't need a C compiler, because I can just use Forth instead.

Attached: aaa60-2.jpg (1600x1200, 414.73K)

why did they suddenly start adding cameras and mics to every laptop? can't believe that anyone asked for it, but they just started doing it

No, it's next level truth.

Also, the thing is, they started putting SIM slots in tablets around late 2012. It used to be an additional $20 for the feature, but then they just did it on all the upcoming models. It's much worse than the mic/camera, since it can act as a beacon, along with military-level tech like GPS, which used to be the plot of based 90s movies; most phones and all tablets didn't even have it before. Now everything just has it. It doesn't matter if it's a tablet, it's just a fucking larger phone. The beacons can also be detected by a Mossad van, so if you carry a phone in Palestine you should expect an RPG to hit you. Or they could use it to exfiltrate the shit in your pockets.

I mean that's true for any radio transmitter. In WWII the tech was already there to track down the IF oscillator in AM radio receivers at close range.

The A500 didn't have a Zorro II bus; it literally lacked the IC providing the additional functionality. The expansion slot was a lot less complicated and leaned heavily on some intrinsic functionality of the 68k. The interesting thing about Zorro II was that it was automatically self-configuring, which the PC world would only see much later with PCI. It was literally plug-and-play and simply worked even if every slot was full; it took a shit ton of revisions in the PC world to get this working reliably for most chipsets. The problem on my A2000 wasn't the theory, more the cheap-ass implementation, typical for Commodore. I got an A1200 later on and instantly regretted it; AGA was trash and didn't feel like an upgrade in any way over my then-upgraded A2000. Traded it in for my first PC, a 486, which I used alongside the A2000 for a few years. What always stuck with me about the PC was how everything just felt so needlessly complicated on it. It did the job, I guess, but it just wasn't that fun.

I understand the unmanageable package lists. I use Gentoo because its whole portage system lets you get really involved with what gets installed and what doesn't, and every bigger update feels like I'm fighting the system to keep the garbage at bay. I am very careful to keep the number of installed packages down and try to avoid more complicated software with a laundry list of dependencies, and maintaining the system gets easier, but it's not always fun. Most GUI programs I use I pick very carefully so that, at best, they only rely on the X toolkit. For example, I use worker as my file manager, which is *very* close in functionality and look/feel to the old-school 4.x version of Directory Opus before it went weird, and it also has a plethora of functionality while basically relying only on itself. (It can even do the magic-byte thing to determine file formats, just like DOpus.) The window manager is jwm. I'm using an i.MX6 with 2 GB of RAM (which is overkill; I could do with 1 GB or less, but it comes in handy for ramdisks when compiling), and the fact that certain graphics acceleration functions and even OpenGL are supported without any blob certainly helps with the graphical environment. That my screen resolution is 800x600 on a small 12" screen helps too. It's very comfy. Also, bitmap fonts all the way. I removed the Spectre mitigations from the kernel and compiled everything without PIE or stack protectors. IDGAF about those slow band-aids; I'm not using the machine with untrusted code anyway. Internet is done on a different machine.

It's by no means an ideal solution, but it works as well as it can, I guess. I agree that the work would have to be started from the ground up to get something truly great again, but I don't think it's ever gonna happen. This is the hill we're going to die on.
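
To make the "keeping the garbage at bay" part a bit more concrete for anyone who hasn't used portage: the pruning mostly happens through global USE flags in /etc/portage/make.conf plus per-package overrides in /etc/portage/package.use. The flag selection below is just an illustrative sketch of that idea, not the actual config described above:

# /etc/portage/make.conf : illustrative sketch only
# turn off the big desktop stacks globally, keep plain X + ALSA
USE="-systemd -pulseaudio -gnome -kde -qt5 -gtk -dbus X alsa"
CFLAGS="-O2 -pipe"
CXXFLAGS="${CFLAGS}"
# per-package exceptions then go in /etc/portage/package.use as needed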

Are you fucking shitting me? Normies are at least a bit concerned about opsec and you're bitching about it. Go think about what you've done and come back when you're sorry.

...

You assume I can't actually use my computer to accomplish more than responding on an imageboard. Yes, those "higher numbers" actually translate to more productivity for me.

That's like 1 in 50, you fucking idiot. And I've never seen a single person cover their phone cam in any way. Even the cases that hold the phone and protect it somewhat from fall damage have a hole for the camera with no shutter.
Trojans and exploits have been used to capture webcam footage from users' PCs since the 2000s or earlier (literally from the moment webcams came out).