WIRED: Big Tech Wants Centralized Censorship

DECENTRALIZED P2P INTERNET ALTERNATIVES: >>>/prepare/75
YES, RELATED:

In the immediate aftermath of the horrific attacks at the Al Noor Mosque and Linwood Islamic Centre in Christchurch, New Zealand, internet companies faced intense scrutiny over their efforts to control the proliferation of the shooter's propaganda. Responding to many questions about the speed of their reaction and the continued availability of the shooting video, several companies published posts or gave interviews that revealed new information about their content moderation efforts and capacity to respond to such a high-profile incident.

But some of these responses have also included ideas that point in a disturbing direction: toward increasingly centralized and opaque censorship of the global internet.

Facebook, for example, describes plans for an expanded role for the Global Internet Forum to Counter Terrorism, or GIFCT. The GIFCT is an industry-led self-regulatory effort launched in 2017 by Facebook, Microsoft, Twitter, and YouTube. One of its flagship projects is a shared database of hashes of files that the participating companies have identified as “extreme and egregious” terrorist content. The hash database allows participating companies (which include giants like YouTube and one-man operations like JustPaste.it) to automatically identify when a user is trying to upload content already in the database.
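
The matching step itself is trivial; everything contentious lives in who decides what goes into the database. A minimal sketch in Python of what an upload check against a shared hash list looks like, assuming a plain SHA-256 digest as the fingerprint (the real database reportedly uses perceptual hashes so re-encoded copies still match, but the lookup has the same shape; all names here are illustrative, not the GIFCT API):

    import hashlib

    # Hypothetical shared database: hex digests of files flagged by member
    # companies. Contents are placeholders, not real entries.
    SHARED_HASH_DB = {
        "0" * 64,  # placeholder digest of a flagged file
    }

    def fingerprint(file_bytes: bytes) -> str:
        # One fixed-size digest per file; identical files always match here.
        return hashlib.sha256(file_bytes).hexdigest()

    def should_block_upload(file_bytes: bytes) -> bool:
        # A platform runs this on every upload before the file goes live.
        return fingerprint(file_bytes) in SHARED_HASH_DB

Note the brittleness of the exact-hash version: change a single byte of a file and its SHA-256 digest changes completely, so every re-encode or edit of a video needs its own database entry.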

In Facebook's post-Christchurch updates, the company discloses that it added 800 new hashes to the database, all related to the Christchurch video. It also mentions that the GIFCT is "experimenting with sharing URLs systematically rather than just content hashes"—that is, creating a centralized (black)list of URLs that would facilitate widespread blocking of videos, accounts, and potentially entire websites or forums.
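
Mechanically, a shared URL list is even simpler than a hash list; the work is in canonicalizing URLs so trivial variants still match. A rough sketch, with made-up list contents:

    from urllib.parse import urlsplit

    # Hypothetical shared blocklist of canonicalized URLs (illustrative only).
    SHARED_URL_DB = {
        "example-forum.net/watch/abc123",
    }

    def canonicalize(url: str) -> str:
        # Lowercase the host and drop scheme/query so variants still match.
        parts = urlsplit(url)
        return parts.netloc.lower() + parts.path

    def should_block(url: str) -> bool:
        return canonicalize(url) in SHARED_URL_DB

    # "https://Example-Forum.net/watch/abc123?share=1" -> blocked

The blunt part is scope: a single list entry can cover a video, an account page, or an entire site, which is exactly the widespread-blocking concern raised above.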

Microsoft president Brad Smith also calls for building on the GIFCT in a recent post, urging industry-wide action. He suggests a "joint virtual command center" that would enable tech companies to coordinate during major events and decide what content to block and what content is in "the public interest." (There has been considerable debate among journalists and media organizations about how to cover the Christchurch event in the public interest. Smith does not explain how tech companies would be better able to reach a consensus view, but unilateral decisions on that point, made from a corporate and US-based perspective, will likely not satisfy a global user base.)

No one outside of the consortium of companies knows what is in the database. There are no established mechanisms for an independent audit of the content, or an appeal process for removing content from the database. People whose posts are removed or accounts disabled on participating sites aren't even notified if the hash database was involved. So there's no way to know, from the outside, whether content has been added inappropriately and no way to remedy the situation if it has.

The risk of overbroad censorship from automated filtering tools has been clear since the earliest days of the internet, and the hash database is undoubtedly vulnerable to the same risks. We know that content moderation aimed at terrorist propaganda can sweep in news reporting, political protest, documentary footage, and more. The GIFCT does not require members to automatically remove content that appears in the database, but in practice, smaller platforms do not have the resources to do nuanced human analysis of large volumes of content and will tend to streamline moderation where they can. Indeed, even YouTube was overwhelmed by a one-video-per-second upload rate. In the days after the shooting, it circumvented its own human-review processes to take videos down en masse…

wired.com/story/platforms-centralized-censorship/
web.archive.org/web/20190419000635/https://www.wired.com/story/platforms-centralized-censorship/

Attached: Big Tech Wants Centralized Censorship.jpg (1618x1079, 1.27M)

The post-Christchurch push for centralizing censorship goes well beyond the GIFCT hash database. Smith raises the specter of browser-based filters that would prohibit users from accessing or downloading forbidden content; if these in-browser filters are mandatory or turned on by default, this pushes content control a level deeper into the web. Three ISPs in Australia took the blunt step of blocking websites that hosted the shooting video until those sites removed the copies. While the ISPs acknowledged that this was an extraordinary circumstance, this decision was a stark reminder of the power of internet providers to exercise ultimate control over what users can access and post.

But proposals for quick and widespread takedown, with no safeguards or even discussion of the risks of overbroad censorship, are incomplete and irresponsible. Self-regulatory initiatives like the GIFCT function not only to address a particular policy issue, but also to stave off more sweeping government regulation. We've already seen governments, including the European Union, look to co-opt the hash database and transform it from a voluntary initiative into a legislative mandate, without meaningful safeguards for protected speech. Any self-regulatory effort will face this same problem. Safeguards against censorship must be an integral part of any proposed solution.

This hash-matching and blocking won't work on P2P networks that are decentralized, node to node.

Keep that in mind in the near future. Have plans ready too.
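
Worth spelling out why. In a content-addressed P2P network, the hash is the address itself: any node holding the bytes can answer a request for them, so there is no central upload funnel to bolt a filter onto, and each node enforces only its own policy. A toy sketch of the shape in Python (no real protocol, just two in-memory nodes):

    import hashlib

    class Node:
        # Toy content-addressed node: blobs are keyed by their own SHA-256.
        def __init__(self):
            self.store = {}
            self.peers = []

        def put(self, data: bytes) -> str:
            key = hashlib.sha256(data).hexdigest()
            self.store[key] = data
            return key  # the hash IS the address

        def get(self, key: str):
            if key in self.store:
                return self.store[key]
            for peer in self.peers:         # ask peers directly, node to node
                data = peer.get(key)
                if data is not None:
                    self.store[key] = data  # content replicates as it's fetched
                    return data
            return None

    a, b = Node(), Node()
    b.peers.append(a)
    key = a.put(b"some file")
    assert b.get(key) == b"some file"  # fetched peer-to-peer, no middleman

Blocking a hash at one node does nothing to the copies already replicated elsewhere; there's no single database that all traffic has to consult.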

We have botnets attacking Zig Forums now!

Did Jim stop the botnet that was attacking Zig Forums this morning? Or is the BO back on here?

Bad goyim! Use our centralized social platforms! Stay away from those evil alternative platforms!

Of course they do; they're state-subsidized entities built to spy on people! Hell yes they want to control us.

When corporations control the government, it's fascism. When the government controls corporations, it's communism. Which extreme has America become again? I forget.

Are these niggers retarded? Gunna make a hash juggler script that refreshes all the files in my folder, fucking idiots.
I'm going to get EVERYTHING banned.

Noyce digits 4 dat

I want one of those!

Is there a keyboard signal emulation scrambling script out there that you can run in the background to defeat keystroke logging?

HAPAS ARE SUPERIOR TO WHITES

Of course they do.

If you believed this you wouldn't be spamming it.

I'm grabbing everything I can from P2P and backing it all up; have been for years. Fuck the future of the internet, I have no intention of relying on it, especially if it becomes another corporate-dominated platform like TV.

Oh, and just another update: someone said in ANOTHER THREAD a couple weeks ago that CDs stop working within 5 years because they start to deteriorate…

WRONG!

As an UPDATE on that, I checked 10 different CDs I burned at least 10 to 15 years ago. Some of them: Slim Whitman, Roy Orbison, Metallica, Scorpions, OZ, Dr. John, Iron Butterfly, Y&T, AC/DC, et al.

Every single old CD still worked JUST FINE!

Here's the trick to keeping CDs in good working condition:

1) Don't leave them lying around to catch dust or get scratched up.
2) Always hold them by the edges so you don't get fingerprints or smudges on them.
3) Keep them in binders, away from humidity and out of the sun.

That's it, they'll still work just fine! Do keep some digital backups of your favorite music though, just in case!

Pressed CDs, that is, the ones you used to buy from the record store, all play fine here.
Burned CDs, that is, blank CD-Rs I burned back in the 90s with a Yamaha 4x, range from readable to having literal holes, depending mostly on their quality. My suggestion is to back up now. Or at least put Linux on a USB stick and make some parity files with dvdisaster.
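
For anyone unfamiliar: dvdisaster stores Reed-Solomon error-correction data alongside (or inside) the disc image, so a scatter of unreadable sectors can be reconstructed later instead of being lost. A rough illustration of the same principle in Python with the reedsolo package; this is not dvdisaster itself, just the underlying idea, and the API details are from memory, so double-check:

    # pip install reedsolo
    from reedsolo import RSCodec

    rsc = RSCodec(32)  # 32 parity bytes per block: repairs up to 16 bad bytes

    original = b"sector data read from an aging CD-R" * 4
    protected = rsc.encode(original)       # original bytes + parity appended

    damaged = bytearray(protected)
    damaged[5] ^= 0xFF                     # simulate bit rot in two places
    damaged[60] ^= 0xFF

    recovered = rsc.decode(bytes(damaged))[0]  # [0] = the repaired payload
    assert recovered == original           # intact despite the damage

Same math dvdisaster applies at disc scale: the parity files are dead weight until the day a sector dies, and indispensable after.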

Well, all the CDs I tested were burned in the late 90s to early 2000s. All of them were still good. HOWEVER, I am very picky about backup storage and always keep my optical backups (CDs and DVDs) in closed binders, away from humidity and heat, and I'm very careful not to scratch them. Plenty of people treat their CDs/DVDs like shit; I'm not one of them.

So all of them still worked, but I do have multiple digital backups of my whole music collection as well, many of them in lossless FLAC or ALAC format! I have about 7 to 8 backups of my entire music collection, so if one or two happen to fail, I have plenty more in my Faraday cage!

Just here to Sage.
Reporting this thread and this post.

Excellent! It's good to see music preservation.

I do the same with older TV shows and older movies too, only with DVDs. I have the original copies (VOB format) backed up on several different media too, in case one DVD were to suffer data loss over time. This has happened with very old DVDs, 10+ years old, a few times before, and I've had to re-burn those. It's seldom though; I'd say re-burn any discs that are getting to be around 15 years old.

Typically you can tell when an old DVD starts to suffer bit loss over time, because it will start to glitch every so often while you're watching it. Most of the time it's no big deal, just a little hiccup now and then… but if it starts to get really bad, you'll need to burn a fresh disc from your digital backup.
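
You can also catch that early instead of waiting for playback glitches: keep a checksum manifest of the ripped files and re-verify it once in a while. A minimal sketch in Python (paths and the manifest name are made up):

    import hashlib, json, pathlib

    def file_digest(path):
        # Hash in 1 MiB chunks so big VOB files don't need to fit in RAM.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    def write_manifest(root, manifest="manifest.json"):
        digests = {str(p): file_digest(p)
                   for p in pathlib.Path(root).rglob("*") if p.is_file()}
        with open(manifest, "w") as f:
            json.dump(digests, f, indent=2)

    def verify_manifest(manifest="manifest.json"):
        with open(manifest) as f:
            digests = json.load(f)
        # Any file whose digest changed has rotted (or been modified).
        return [p for p, d in digests.items() if file_digest(p) != d]

Run write_manifest once after ripping, then verify_manifest whenever you spot-check the binder; anything it returns is due for a re-burn from a good copy.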
