🇨🇦


Fair point.
The self-hosting part was mostly about total control over my own systems and less about the paid features. It’s very much not necessary.
As far as pro features go, it was the TOTP authenticator integration that was important to me. ~20% of my accounts have TOTP 2FA, and Bitwarden's clients will automatically copy the latest 2FA code to the clipboard when filling a password.
Bitwarden will even tell you if a saved account could have 2FA (the service offers it), but it isn't set up/saved in Bitwarden atm.


I currently have 110 unique user+password combos. I wouldn’t want to change all those even once, if I were breached and had used similar credentials everywhere.
Bitwarden keeps them well managed, synced between devices, and lets me check the whole database for matches/breaches via Have I Been Pwned integration. Plus, because I prefer to keep things in-house as much as possible, I even self-host the server with Vaultwarden, walled off behind my own VPN instead of using the public servers. (This also means it's free, instead of a paid service.)
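For anyone curious, the whole self-hosted setup fits in a few lines of docker-compose. A minimal sketch, assuming the official `vaultwarden/server` image; the host path and port are placeholders, and the loopback-only binding is just one illustrative way to keep it reachable only over your own VPN:

```yaml
services:
  vaultwarden:
    image: vaultwarden/server:latest
    restart: unless-stopped
    volumes:
      - ./vw-data:/data            # all vault data lives here; back this up
    ports:
      - "127.0.0.1:8080:80"        # loopback only; reach it through the VPN
```

Point the Bitwarden clients at that URL while on the VPN and they sync against your own server instead of the public one.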


I feel horrible for the kids; they're forced to follow their parents' mind-boggling stupidity.
Derek and DeAnna, on the other hand, are adults who thought out, planned, and followed through with their idiocy.
Politics aside: how could you possibly think moving your family to a country that's actively at war with its immediate neighbor, and struggling to find bodies to throw on the front lines, is even remotely a good idea?
It touches on the fact that Derek Huffman ended up being sent to the front lines in the war against Ukraine, and for a while there were rumors he'd died there.
I would genuinely be more surprised to hear they were left to live peacefully.


That will solve part of the problem, preventing downloads before an item has even been released; but there's still lots of potential to grab unwanted torrents and leave the arrs asking for intervention when they can't import them.
Ideally the indexers would be filtering out this junk before users can even grab it, but failing that I think we've got a decent solution. Check out the edited OP.


Check out the edited OP.


I'm taking a look at this. It looks like the malware-blocker portion is what I'm interested in, but if I enable it with 'delete known malware', it just complains every minute that there are no blocklists enabled. (Though the docs say it's supposed to fetch one from a pages.dev URL that has almost no content.)
Do you have a specific malware blocklist configured? Enabling the per-service blocklists demands a URL for one.
I can host/build a list over time for these to use if that's what I've gotta do; just wondering if there's a public collaboration on one already on the go.
/edit: found it
https://raw.githubusercontent.com/Cleanuparr/Cleanuparr/refs/heads/main/blacklist


That’s what I’d already done as per the OP, but it leaves Sonarr/Radarr wanting manual intervention for the ‘complete’ download that doesn’t have any files to import.


I just did some digging and found I do have some good quality content from them, but they were all grabbed via NZBGeek.
Every torrent I’ve gotten with that label has been garbage/malware.


This comment prompted me to look a little deeper at this. I looked at the history for each show where I’ve had failed downloads from those groups.
For SuccessfulCrab: any time a release has come from a torrent tracker (I only have free public torrent trackers), it's been garbage. I have, however, had a number of perfectly fine downloads with that group label whenever retrieved from NZBgeek. I've narrowed that filter to block the string 'SuccessfulCrab' on all torrent trackers, but allow NZBs. Perhaps there's an impersonator trying to smear them or something, idk.
ELiTE on the other hand, I’ve only got history of grabbing their torrents and every one of them was trash. That’s going to stay blocked everywhere.
The 'block potentially dangerous' setting is interesting, but what exactly is it looking for? The torrent client is already set not to download file types I don't want, so will it recognize and remove torrents that are empty (everything's marked 'do not download')? I'm having a hard time finding documentation for that.


Awesome. Thanks, you two, I appreciate the help. :)


Ok, I think I’ve got this right?
Settings > Profiles > Release Profiles.
Created one, set up 'must not contain' words, indexer 'any', enabled.
That should just apply globally? I’m not seeing anywhere else I’ve got to enable it in specific series, clients, or indexers.


To be perfectly honest, auto-updates aren't really necessary; I'm just lazy and like automation. One less thing I've gotta remember to do regularly.
I find it kind of fun to discover and explore new features on my own as they appear. If I need documentation, it’s (usually…) there, but I’d rather just explore. There are a few projects where I’m avidly following the forums/git pages so I’m at least aware of certain upcoming features, others update whenever they feel like it and I’ll see what’s new next time I happen to be messing with them.
Watchtower notifies me whenever it updates something so I’ve at least got a history log.
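For reference, that notification history is just environment variables on the Watchtower container. A hedged compose sketch, assuming Watchtower's email notifier; the SMTP values are placeholders and the variable names are from Watchtower's documented email settings, so verify against the current docs:

```yaml
services:
  watchtower:
    image: containrrr/watchtower:latest
    restart: unless-stopped
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
    environment:
      - WATCHTOWER_SCHEDULE=0 0 5 * * *   # 6-field cron: daily at 05:00, after backups
      - WATCHTOWER_NOTIFICATIONS=email
      - WATCHTOWER_NOTIFICATION_EMAIL_FROM=watchtower@example.com
      - WATCHTOWER_NOTIFICATION_EMAIL_TO=me@example.com
      - WATCHTOWER_NOTIFICATION_EMAIL_SERVER=smtp.example.com
      - WATCHTOWER_NOTIFICATION_EMAIL_SERVER_PORT=587
```

Every update run then lands in your inbox, which doubles as the history log.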


I've had Immich auto-updating alongside around 36 other Docker containers for at least a year now. I've very rarely had issues, and I just pin specific version tags on the things that have caused problems. Redis and Postgres in both Immich and Paperless-ngx, for example, have fixed version tags because they take manual work to upgrade the old databases. The main projects, though, have always auto-updated just fine for me.
The reason I don’t really worry about it: Solid backups.
BorgBackup runs in the early AM, shortly before Watchtower updates almost all of my containers, making a backup of the entire system (not including bulk storage) first.
If I were to get up in the morning and find a service isn't responding (Uptime-Kuma notifies me via email if it can't reach any container or service), I'll mess with it and try to get the update working (I've only actually had to do this once so far; the rest has updated smoothly). Failing that, I can just extract yesterday's data from the most recent backup and restore a previous version.
Because of Borg's compression and de-duplication, successive backups of the same system can be stored in an absurdly small amount of space. I currently have 22 backups of ~532GB each, going back a full year. They are stored in 474GB of disk space. Raw, that'd be ~11.7TB.
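As a back-of-envelope check on those numbers (sizes copied from above, approximate):

```python
# Rough check of Borg's de-duplication savings, using the sizes quoted above.
backups = 22
avg_backup_gb = 532        # approximate size of each full backup
stored_gb = 474            # actual on-disk size of the whole Borg repo

raw_gb = backups * avg_backup_gb      # what 22 independent full copies would need
raw_tb = raw_gb / 1000                # ~11.7 TB
reduction = raw_gb / stored_gb        # effective space-reduction factor

print(f"raw: {raw_tb:.1f} TB, stored: {stored_gb} GB, reduction: {reduction:.0f}x")
# → raw: 11.7 TB, stored: 474 GB, reduction: 25x
```

Roughly a 25× effective reduction; a year of backups costs less disk than a single raw copy.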



It's strange, but not a terrible use for the excess heat a PC gives off, I guess.
Seems like the kind of thing you'd find as a random 5.25" bay accessory though (where a DVD drive would go).


https://github.com/nicolargo/glances
I have a dashboard as well (Homepage), but this is a nice look at system resource usage and what’s running, at a glance.
Uptime-kuma emails me when services or critical LAN devices are unreachable for whatever reason.
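If anyone wants to try Glances, there's an official Docker image with a web UI; a sketch based on my reading of the project's README (the `-w` option and port 61208 are the README's web-server defaults, the docker.sock mount lets it list containers — double-check against the repo linked above):

```yaml
services:
  glances:
    image: nicolargo/glances:latest
    restart: unless-stopped
    pid: host                      # see host processes, not just the container's
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock:ro
    environment:
      - GLANCES_OPT=-w             # serve the web UI instead of the terminal TUI
    ports:
      - "61208:61208"
```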


OMG Joey doesn’t have a gun!
20 students draw on him
Yeah; I mean, if this was any other content from the same shows/movies it’d be a non-issue covered under Fair Use.
I can understand being upset about entirely new content, AI deepfakes for example; but this content was created and distributed to the public, intentionally, with the consent of the individuals filmed in it. It's just been transformed into a different format; arguably, in a creative and educational manner (the same way something like a 'Family Guy funny moments' compilation is).
If you didn’t want people looking at your nude body, why did you perform nude scenes in front of a camera, knowing it’d be distributed to the public…