jeena
@jeena@piefed.jeena.net
- Comment on PSA: In case you were experiencing problems with feddit.org, this is because a post from feddit reached the front page of Hacker News. 1 day ago:
I had the problem that PeerTube redundancy only works on public videos, and most of my videos are private. In my specific case I hosted them in Germany, where my server is, and because of routing and peering they would always buffer a lot in South Korea, where I am. So I had to solve it in a creative way: the S3 bucket is one part of my solution, putting it in the right country was another, which I explain in detail here: https://tube.jeena.net/w/uXZN52xsH75LbHWNt8dsLY
- Comment on PSA: In case you were experiencing problems with feddit.org, this is because a post from feddit reached the front page of Hacker News. 1 day ago:
I also put the video itself into an S3 bucket, so PeerTube basically only has to serve the metadata and the comments from my server, kind of like what Mastodon or Lemmy/PieFed have to do. I just had a look at the [PeerTube nginx config](https://github.com/Chocobozzz/PeerTube/blob/develop/support/nginx/peertube) but couldn't see anything there which would do caching, so I assume the app does its own caching somewhere.
For my website, which is a Rails application, I did:
proxy_cache_path /var/lib/nginx/cache/jeena.net keys_zone=jeenanet:30m;
and then
location @rails {
    # ...
    proxy_cache jeenanet;
}
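For context, a slightly fuller version of that block could look roughly like this; it's only a sketch, and the upstream name rails_app, the cache times and the debug header are illustrative assumptions rather than my exact config:

location @rails {
    proxy_pass http://rails_app;                       # assumed upstream name
    proxy_set_header Host $host;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;

    proxy_cache jeenanet;                              # zone from the proxy_cache_path line above
    proxy_cache_valid 200 301 10m;                     # keep good responses for a few minutes
    proxy_cache_use_stale error timeout updating;      # serve stale copies while the app is busy
    add_header X-Cache-Status $upstream_cache_status;  # shows HIT/MISS, handy for debugging
}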
- Comment on PSA: In case you were experiencing problems with feddit.org, this is because a post from feddit reached the front page of Hacker News. 1 day ago:
I wonder if the caching is not aggressive enough or something.
I had a PeerTube video from my instance on the HN front page last week and the load was only minimally higher compared to before or after.
I have had several of my blog posts on the HN front page in the past. The first time it happened it brought my poor VPS to its knees, but I learned from it and cached pages with nginx for some minutes, and since then I have never had any problems. Just invalidate the cache when there are changes.
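The invalidation part can be as simple as bypassing the cache when a request carries a special header which the application sends after content changes. A minimal sketch, assuming a made-up header name X-Refresh-Cache and the jeenanet cache zone from the other comment:

location @rails {
    proxy_cache jeenanet;
    proxy_cache_valid 200 5m;                  # pages only stay cached for a few minutes anyway
    proxy_cache_bypass $http_x_refresh_cache;  # such a request skips the cache and stores a fresh copy
    # ... proxy_pass and the usual headers as in the other sketch ...
}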
- Comment on Self hosted calendar 3 days ago:
I use Radicale for it.
- Comment on Pied 3 days ago:
Looks like she was PieFed.
- Comment on What is with this new generation of shooters writing stuff on the bullets? Is this some new fad like if I go deer hunting or something I write FUCK BAMBI on the bullet? 5 days ago:
- Comment on Forgejo fills up hard drive with repo-archives 6 days ago:
For now, disabling archives plus my simple list of bots to drop in nginx seems to work very well: it doesn't create the archives anymore, and the load on the server went down as well.
- Comment on Big Tech: Convenience is a Trap 1 week ago:
That's what Richard Stallman has been preaching since the '80s.
- Comment on Will I become a bad person in a year? 1 week ago:
Yep, off to prison you go! /s
- Submitted 1 week ago to technology@lemmy.world | 0 comments
- Comment on Did I used to be homophobic? Am I? 1 week ago:
The funniest part of the post is the last line :D
- Comment on Forgejo fills up hard drive with repo-archives 1 week ago:
Hm, but that only works on tmpfs, which is in memory. It seems I could have done it with XFS too: https://fabianlee.org/2020/01/13/linux-using-xfs-project-quotas-to-limit-capacity-within-a-subdirectory/ but I used ext4 out of habit.
- Comment on Forgejo fills up hard drive with repo-archives 1 week ago:
For now I asked ChatGPT to help me implement a simple return 403 on bot user agents. I looked into my logs and collected the bot names which I saw. I know it won't hold forever, but for now it's quite nice: I just added this file as /etc/nginx/conf.d/block_bots.conf and it gets loaded before all the vhosts and rejects all the listed bots. The rest just goes normally to the vhosts. This way I don't need to implement it in each vhost separately.
➜ jeena@Abraham conf.d cat block_bots.conf
# /etc/nginx/conf.d/block_bots.conf

# 1️⃣ Map user agents to $bad_bot
map $http_user_agent $bad_bot {
    default 0;
    ~*SemrushBot 1;
    ~*AhrefsBot 1;
    ~*PetalBot 1;
    ~*YisouSpider 1;
    ~*Amazonbot 1;
    ~*VelenPublicWebCrawler 1;
    ~*DataForSeoBot 1;
    ~*Expanse,\ a\ Palo\ Alto\ Networks\ company 1;
    ~*BacklinksExtendedBot 1;
    ~*ClaudeBot 1;
    ~*OAI-SearchBot 1;
    ~*GPTBot 1;
    ~*meta-externalagent 1;
}

# 2️⃣ Global default server to block bad bots
server {
    listen 80 default_server;
    listen [::]:80 default_server;
    listen 443 ssl default_server;
    listen [::]:443 ssl default_server;

    # dummy SSL cert for HTTPS
    ssl_certificate /etc/ssl/certs/ssl-cert-snakeoil.pem;
    ssl_certificate_key /etc/ssl/private/ssl-cert-snakeoil.key;

    # block bad bots
    if ($bad_bot) {
        return 403;
    }

    # close connection for anything else hitting default server
    return 444;
}
- Comment on Forgejo fills up hard drive with repo-archives 1 week ago:
I already have LVM, but I was using it to combine drives. Still, it's not a bad idea; if I can't do it with Docker, at least that would be a different solution.
- Comment on Forgejo fills up hard drive with repo-archives 1 week ago:
Sadly that's not the solution to my problem. The whole point of open-sourcing, for me, is to make it accessible to as many people as possible.
- Comment on Forgejo fills up hard drive with repo-archives 1 week ago:
Hm, none of them really seems to cover the repo-archives case, therefore I'm afraid that size:all doesn't include the repo-archives either.
But I'm running it in a container; perhaps I can limit the size the container gets assigned.
- Comment on Forgejo fills up hard drive with repo-archives 1 week ago:
I have monitoring for it, but it happened during the night while I was sleeping.
I actually saw a lot of Forgejo activity on the server yesterday, but didn't think it would fill up so fast.
- Comment on Forgejo fills up hard drive with repo-archives 1 week ago:
Codeberg is an instance of Forgejo; I run my own instance because I don't want to be dependent on others.
- Comment on Forgejo fills up hard drive with repo-archives 1 week ago:
I need to look into it, thanks!
- Comment on Forgejo fills up hard drive with repo-archives 1 week ago:
Yeah, I really need to figure out how to do quotas per service.
- Comment on Forgejo fills up hard drive with repo-archives 1 week ago:
But then how do people who search for code like yours find your open source code, if not through a search engine which uses an indexing bot?
- Comment on Forgejo fills up hard drive with repo-archives 1 week ago:
It makes a zip file and a tarball, and keeps them cached for other people to download in the future.
- Comment on Alternative to github pages? 1 week ago:
I also thought about it, but the custom domain feature only works on the $5 / month plan.
- Comment on Alternative to github pages? 1 week ago:
If you want free static hosting then probably: https://wasmer.io/
If you have the machine at home then you could set up port forwarding to it, but you would need to do everything yourself (there's a small nginx sketch after this list), like:
- running a web server like nginx
- setting up ssl for it with certbot
- storing the static files in /var/www/html for example
- port forwarding from your router to that machine
- using some service like DuckDNS to point a domain to your dynamic IP at home
- pointing a CNAME to the DuckDNS subdomain on your domain
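For the web server part, a minimal sketch of such a static vhost, where the DuckDNS name and the paths are just placeholders for illustration:

server {
    listen 80;
    listen [::]:80;
    server_name example.duckdns.org;    # placeholder; your DuckDNS name or the CNAME'd domain

    root /var/www/html;                 # the static files live here
    index index.html;

    location / {
        try_files $uri $uri/ =404;      # plain static file serving, nothing dynamic
    }

    # certbot --nginx can later add the 443/ssl listener and certificate lines here
}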
- Comment on Forgejo fills up hard drive with repo-archives 1 week ago:
It does not, because that feature is usually used by scripts to download some specific release archive, etc., and other git hosting solutions do the same.
- Comment on Forgejo fills up hard drive with repo-archives 1 week ago:
I have nothing against bots per se, they help to spread the word about my open source code which I want to share with others.
It's just unfortunate that Forgejo fills up the hard drive to such an extent and doesn't quite let you disable this archive feature.
- Comment on Forgejo fills up hard drive with repo-archives 1 week ago:
Yeah I understand, but the whole point of me hosting my instance was to make my code public.
- Submitted 1 week ago to selfhosted@lemmy.world | 35 comments
- Comment on What I host myself 1 week ago:
No, I'm running everything on one server. There is sometimes a lot going on on PieFed and the load gets too high, so it times out. I haven't had the time to research it.
And it only says it's on because I set it to retry a few times.
- Comment on What I host myself 1 week ago:
I have 3 locations right now:
1. Hetzner cloud (1 server)
2. Home (my PC and a Raspberry Pi)
3. My parents' house (a Raspberry Pi)
I have most of those things on https://uptime.jeena.net/status/everything