You are pretty much as safe as it gets as long as you update that container. IP/port scanning basically isn't a thing in IPv6 land, as you'd have to scan the entire /64, which amounts to 18,446,744,073,709,551,616 addresses.
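For scale, a quick back-of-the-envelope sketch in Python (the probe rate is a made-up assumption for illustration):

```python
# A /64 leaves 64 host bits, so the address count is 2**64.
addresses = 2 ** 64
print(addresses)  # 18446744073709551616

# Even assuming a very generous one million probes per second
# (an assumption, not a measurement), sweeping one /64 would
# take hundreds of millennia.
seconds = addresses / 1_000_000
years = seconds / (60 * 60 * 24 * 365)
print(round(years))  # 584942
```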
I’m Oha and I run the funny fake download button instance. More about me: https://ohaa.xyz/


I know how to self-host stuff, I just don't want to deal with Docker


Any way to set this up without Docker? Couldn't find anything in the docs
Having something as critical as your VPN depend on a third party kinda sounds like a bad idea


Are there any European alternatives to Serverpartdeals?
IMAP doesn't support 2FA.
Neither the IMAP protocol nor email clients support second-factor authentication for mailboxes. Even if they did, IMAP connections are made far too often, which would make such authentication unusable: imagine needing to enter your TOTP token every few minutes. We could enable 2FA on the webmail, but IMAP/POP/SMTP access would remain unprotected, which defeats the purpose. We are working on a solution here that will allow sandboxing a username/password pair to webmail use only. We do offer so-called app-specific passwords via mailbox identities, though. These are commonly touted by email providers as 2FA. They are not.
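To illustrate why per-connection TOTP entry would be unusable: codes rotate every 30 seconds by design. A minimal RFC 6238 sketch (using the RFC's own test secret, not anything provider-specific):

```python
import hmac
import hashlib
import struct

# Minimal TOTP per RFC 6238: HMAC-SHA1 over the 30-second
# time-step counter, dynamically truncated to 6 digits.
def totp(secret: bytes, unix_time: int, step: int = 30, digits: int = 6) -> str:
    counter = unix_time // step
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test secret (the ASCII string "12345678901234567890"):
print(totp(b"12345678901234567890", 59))  # "287082"
print(totp(b"12345678901234567890", 31))  # same 30 s window, same code
```

Every new time window means a new code, so any client opening IMAP connections in the background would constantly be prompting for fresh tokens.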
I use Migadu with my own domain
swap PhotoPrism with Immich. It's a lot better imo


been using https://migadu.com/ for a few months now and it's pretty great


Minecraft Java, for some reason, doesn't


Just updated, thanks!


A native Linux client would be nice


nah, selling messages is way easier


My website's backend is written in Flask, so it was pretty easy to add


probably. never used it tho


Only in the robots.txt
I know that. That's why I don't ban everyone, only those who don't follow the rules in my robots.txt. All “sane” search engine crawlers should follow those, so it's no problem
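A minimal sketch of the check a well-behaved crawler performs, using Python's stdlib robots.txt parser (the rules and URLs here are made up):

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
# Example rules: one path forbidden for all user agents.
rp.parse("""\
User-agent: *
Disallow: /private/
""".splitlines())

print(rp.can_fetch("SomeBot", "https://example.com/blog/post"))  # True
print(rp.can_fetch("SomeBot", "https://example.com/private/x"))  # False
```

A crawler that requests the `/private/` URL anyway is exactly the kind that ignores robots.txt and gets banned.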


because many crawlers seem to explicitly crawl “forbidden” sites
Nope. Search engines should follow the robots.txt
that's… not how CPU sockets work