I have little ideas but big ambition

So, I may have access to a server soon. Being inspired by our local Jesus's Chiru.no, what kind of services would Zig Forums be interested in? I wanna host some neat, privacy-respecting, and fun things for my fellow Zig Forumsnicians. I don't think I can do an internet radio, because making another one when chiru.no exists (and has so much excellent FLAC content) would be kinda silly. I was thinking, maybe a cytube instance?

Attached: Haskell-full.png (498x707, 362.91K)

Other urls found in this thread:

gitgud.io/m712/goyt
cheekyvideos.net/
diforfree.org/online
yacy.net/en/index.html
yacy-websearch.net/wiki/index.php/De:InterfaceÜberHTTPS
twitter.com/AnonBabble

video hosting site

Yes, the world needs a lot more YouTube competitors. Only China has them, because they have no YouTube.

gitgud.io/m712/goyt is a good YT replacement
cheekyvideos.net/ but with more variety like /a/ and /co/
Also mirror of inclibuql666c5c4.onion or inclib.i2p would be useful

Host nodes/instances of decentralized services and platforms. Also a Tor node. We desperately need more of them.

This.
And a hosting service, if you want to end up like the guy behind cocaine.ninja

How about you host a radio with actual good music?

No such thing, the only software that can be remotely privacy-respecting is one that runs locally
Chiruno radio is garbage, don't ever call it "radio" if you don't do live talks, it's just a looped music stream full of autistic weeb crap.
Go host some basic stuff: metasearch proxy, git and pastebin.

diforfree.org/online

Well, if you want live talks, ask people from matrix.ordoevangelistarum.com, since many there want to have their own talk show online.

Terminal user interface when?

this.

give us shell accounts so we can mess around with your box

There's a list of m3us there, so we can CLI it no sweat
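For the CLI part, m3u is just plain text, so pulling the stream URLs out is trivial. A sketch (the sample playlist and URL here are made up, not chiru.no's actual list):

```python
# Sketch: extract stream URLs from an m3u playlist and hand them to a
# CLI player. The sample playlist below is hypothetical, not chiru.no's.
def stream_urls(m3u_text):
    """m3u is plain text: lines starting with '#' are metadata, the rest are URLs."""
    return [line.strip() for line in m3u_text.splitlines()
            if line.strip() and not line.startswith("#")]

sample = """\
#EXTM3U
#EXTINF:-1,Example Stream
https://example.com/stream
"""

for url in stream_urls(sample):
    print("would play:", url)  # e.g. subprocess.run(["mpv", "--no-video", url])
```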

Run a YaCy node! yacy.net/en/index.html

yacy blows and you know it. A searx instance would be much more useful if you want to have a search engine.

YaCy doesn't need any proprietary search engine to work. It can do all indexing itself. Can searx actually do anything like that, or is it just a dumb proxy?

YaCy is amazing but has two flaws that stop me from using/downloading/compiling it: it does not support encryption of the data being transferred, and their website only serves the code over HTTP. Which means for all we know it is backdoored in transit to your computer, assuming you don't compile it yourself, and even then the download isn't encrypted either.

Someone add encryption to YaCy and I will never have to use a metasearch or botnet-dependent engine again.

The problem with YaCy is that the results are terrible and it is simple to game the system to get your site to be the first result. In fact, if you actually used YaCy you would know there's at least one site that is first for almost every single search because they gamed the system. I gave it an honest shot and used it for around a week, but found the results too inferior for my needs. And they're not just inferior, they also take like 10 seconds to load. It's terrible.

Searx is not a dumb proxy, because it searches many search engines at once and smartly combines them into a single display. Searx has great results and hasn't let me down. I haven't had to use a centralized search engine since I started using it, because the results are so good.


You obviously have never used it.

This guy is right; a very good idea, but it's impractical right now.
There was a domain that would connect you to a random searx instance, but I lost it. Do you have it?

in what country does the server reside?

Also a BBS is always a good thing to try

It actually supports https.
yacy-websearch.net/wiki/index.php/De:InterfaceÜberHTTPS


That's because the YaCy network isn't big enough yet. Site ranking is built when a user clicks the recommend button, so it needs people to actually use it for searching. But you're right, the results are weak. YaCy should steal other search engines' results too.

Just e-mail would do, since all the others suck. Especially if it supported email clients instead of just webshit.

Don't talk about what you know nothing about.

this

Actually, if you DO make this email service, I will recommend it on my website.

This is your best option, OP. People will actually use it, and if the domain name isn't pure autism you might attract a good number of users. It'll be a good lesson in server management too.

How's it going, user?

Run a i2pd node and seed i2p torrents

I own /t/ and would be interested in us having our own tracker, but I don't currently have time to maintain it. Contact me if you're interested and we can discuss what form it could take. See my e-mail.

As others have said, more decentralized nodes. Tor, IPFS, PeerTube if you can afford the bandwidth costs. If you want to do internet radio, I'd love to see a classics station; classic stations are riddled with ads and their song selections keep getting worse, so an internet-based alternative would be excellent.

Attached: c369e6570333418b7589149b1d579bfa840a7e0ad9f03299b7547c4e77d16822.jpg (640x349, 37.89K)

If you want to give back, it would be useful if you could create a service to make it easier for refugees from the middle east and Africa to find accommodation in Europe. On the one side, the sponsors can list their available accommodations and facilities, and migrants can search for homes and locations that will meet their needs.

What kind of hardware would i need to run my own tracker?

An SSD for fast access and an ethernet/landline connection for low latency. Everything else scales: a Pentium 4 would be fine for a server with fewer than 10,000 seeders. What really matters is your ethernet card and that you run *nix/*BSD and not Windows, for concurrency of requests.

Depends on how complex you want it to be.

If you want a small one for you and a few friends, you can just set up a tiny server and let them connect. Then you guys just need to manually add torrents to it.

If you want something that's actually usable by a wider audience, you're going to need a website configured to manage the tracker, and you'll likely want to track what people upload and download. Most sites try to encourage good behavior.
The tracker doesn't do much; it coordinates between nodes (your client and other users'), so it really just needs to be able to handle connections. In terms of hardware requirements, it's a question of how many requests you can handle. For a handful of users, you could run one off a single-board computer. For something like The Pirate Bay, you'd need heavy-duty server-grade hardware to scale to that many requests.
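To show how little the tracker actually does: here's a minimal sketch of its bookkeeping. This is hypothetical illustration code, not a real BEP 3 server (no HTTP layer, no compact peer lists, no UDP, no peer expiry, and all names are made up); the whole job is "keep a peer list per infohash and hand it back bencoded."

```python
# Minimal sketch of tracker bookkeeping -- NOT a production BEP 3 server.
# No HTTP layer, no compact peers, no peer expiry; names are made up.
from collections import defaultdict

def bencode(obj):
    """Encode ints, strings, lists, and dicts in bencoding (BEP 3)."""
    if isinstance(obj, int):
        return b"i%de" % obj
    if isinstance(obj, str):
        obj = obj.encode()
    if isinstance(obj, bytes):
        return b"%d:%s" % (len(obj), obj)
    if isinstance(obj, list):
        return b"l" + b"".join(bencode(x) for x in obj) + b"e"
    if isinstance(obj, dict):  # bencoded dict keys must be sorted
        return b"d" + b"".join(bencode(k) + bencode(v)
                               for k, v in sorted(obj.items())) + b"e"
    raise TypeError(f"cannot bencode {type(obj)}")

class Tracker:
    """Per-infohash peer registry: the whole job is 'who else has this?'"""
    def __init__(self, interval=1800):
        self.swarms = defaultdict(dict)  # info_hash -> {peer_id: (ip, port)}
        self.interval = interval         # seconds between client announces

    def announce(self, info_hash, peer_id, ip, port):
        self.swarms[info_hash][peer_id] = (ip, port)
        peers = [{"ip": p_ip, "peer id": pid, "port": p_port}
                 for pid, (p_ip, p_port) in self.swarms[info_hash].items()]
        return bencode({"interval": self.interval, "peers": peers})
```

Each announce is one tiny request/response pair; the actual file data never touches the tracker, which is why the hardware bar is so low.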

Attached: 1459489823044-2.gif (500x375, 324.31K)

So I could run a Zig Forums tracker off my 2009 Dell laptop? Damn. I just need to get some sort of ethernet switch first.

You're underestimating the scale required for that. You could host something small for a few friends off of an old laptop, but the moment you want something big enough for a board like Zig Forums you're going to want better networking equipment, dedicated hardware built for that task, and you'll want a skillset that lends itself to optimizing and maintaining that infrastructure.

Attached: 1454098340003.png (199x200, 145.74K)

How many active users does Zig Forums have anyway? Isn't it like 300?

He's just a bot shilling against you hosting your own tracker outside of its control. Zig Forums has fewer than 350 unique /16 IPv4 ranges connected to it, with a maximum of 256 users per range. So it's anywhere from 350 to 90,000 users/lurkers. Considering the posts per hour is 23, I'd say about 5,000 people max would be connecting to it, unless it got DDoSed by some salty faggot.

You are underestimating the scale older hardware can operate at. You most certainly could connect as many human users as Zig Forums has to it without a problem with the right ethernet/internet connection and software set up. Dedicated hardware for switches and such shit is for if you are tracking/decoding SSL traffic in real time with shit like packet tagging or redirection. And even that is only if you are doing hundreds of millions of requests a second.

Create a cloud-based service to share pre-trained neural networks leveraging block-chain technology, say for natural language processing models, self-driving car technology or other deep learning applications.

Any estimates on how much bandwidth each one of these people would take?

It depends on what you are running. With a bittorrent tracker that's like 10KB a person or less, and at 5,000 people that would be 48MB if literally every single person connected to you at the exact same time and downloaded exactly 10KB. Realistically, though, not everyone is going to connect at the same time, and not everyone is going to be transferring a large magnet as a piece of text. There will be fewer people connected at any given moment, and even fewer doing large transfers.
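Spelling out the arithmetic above (the 10KB-per-announce figure is the worst-case guess from the post, not a protocol constant):

```python
# Back-of-envelope worst case: every user hits the tracker at once.
# 10 KB per announce is a generous assumption, not a protocol constant.
peers = 5_000
announce_size = 10 * 1024            # bytes per announce (assumed)
worst_case = peers * announce_size   # everyone at the exact same moment
print(f"{worst_case / 2**20:.1f} MiB")  # -> 48.8 MiB, the '48MB' figure
```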

The problem is less that the hardware will be bogged down and more that, unless you're using a network and networking hardware configured for a large number of concurrent requests, you're going to run into a problem when you have a bunch of users with a bunch of torrents all trying to update the tracker.

Generally a high-bandwidth network is going to have room for more connections without dropped packets, but some configurations will cap your number of connections, either with a hardcap (Windows is infamous for its default cap) or a softcap where one device simply can't handle that many requests and starts dropping them.
I'm admittedly used to older hardware than a 2009 laptop, but I still foresee congestion issues. If Dell cheaped out on his networking chipset (entirely likely on a laptop) or his home network is mediocre, it could result in clients being unable to update his tracker. This isn't the end of the world, but it's bad for swarm efficiency. It also neglects the need for an external web interface, which is another set of requests and dynamic pages that will need to run, preferably on the same hardware.
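One concrete softcap you can check and raise yourself on a *nix box is the per-process file-descriptor limit, which bounds how many sockets a single tracker process can hold open at once. A sketch (exact defaults vary by distro; the 4096 target is an arbitrary example):

```python
# Check/raise the per-process fd limit -- one of the 'softcaps' that
# silently bounds concurrent connections. Unix-only (resource module).
import resource

soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"fd limit: soft={soft}, hard={hard}")

# An unprivileged process may raise its soft limit up to the hard limit.
wanted = 4096 if hard == resource.RLIM_INFINITY else min(4096, hard)
resource.setrlimit(resource.RLIMIT_NOFILE, (wanted, hard))
```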

I'd highly recommend looking into an OpenWRT router at minimum for maximum network control and security. No real upgrade path for the laptop, though.

Real question is, what happens if it actually takes off? Or if the hardware dies? What if his home network becomes congested with many small packets and it starts to present a problem? It's not very forward-thinking. I'd host one as a test and see how it goes. If the guy doesn't even know how to guess his bandwidth/connection requirements then he's not ready to host a public-facing server open to, essentially, anybody. Better to start with a small project, encounter problems there, learn from them, and then expand afterwards.

Attached: 1444407174021.png (242x312, 74.06K)

wew. Let's start small

I'm not at all trying to discourage him from trying; I'm trying to get him to understand the benefits of _prototyping_. He should push the limits of a first attempt and then learn from that. Follow it up with some better design choices.
If he doesn't know how to measure his network capabilities or how to raise his connection cap, he's not going to know how to back up his database and share it.

I'm recommending he take careful stock of his hardware options so he understands what will fail first and why, followed by starting a sample server and opening it up for some initial testing and troubleshooting of initial problems. Once he knows how it all fits together and can adapt to changes, he can look into how to export his data.

If it takes off running on botnet hardware with no backups, then it's both a security liability for users here and a waste of time if the data is lost or if he exposes himself and gets caught by the authorities. A small server for friends won't attract much attention, but an 8ch server is going to have a target on its back from Day 01.
Plus it'll be over residential lines. That's another consideration.

You imply you want P2P services not controlled by botnet devices, but you think it's okay to just tell everyone to use a service hosted by an inexperienced user on hardware that's guaranteed pozzed, through similarly cucked networking equipment.
I'm here trying to tell him that, hey, you can learn all this stuff and then apply it to make a decent semi-public tracker, but start small so the first time it blows up it doesn't impact anybody else. You learn way more from screwing up something like this a couple times than from calling it a day and hosting it with no maintenance or consideration for where to go.
He should just make his own thread where he can ask questions and we can debate solutions and he can apply what he thinks is feasible. At least that way it would be collaborative and he'd learn a lot.

If it ends up taking off, then we can look into some CSS hacks to make integration with the site easier. We'd need to have a discussion about the best way to keep things anonymous while keeping out Feds and malicious users.

Attached: 1459300486643.png (550x550, 49.94K)

No, run IPFS

Well, you can run multiple services on a single piece of hardware.
Start with a Tor exit node.

dear lord

What happened with cocaine.ninja?

Someone uploaded cp, the owner got vanned by sharia police

a website about secrets.
people can post secrets and only those with the link can read them

Don't use anything made by that faggot. I had to leave a chan because he apparently wrote a new proprietary frontend for it.

Underrated.

Tor exit node
email server

dumb idea, what's it for?
It's the same as posting an encrypted file anywhere, really.

Nah, unless the fun things you're referring to are shota/lolis, 2d traps, qts, kikis/libbies, then maybe yah.

Make tech's redpill.
nah


found the predator :^)

What do you expect from (((k00lkikes)))?

THIS

hentai
i like hentai

Well >>>/hydrus/

Peertube with a usable default space