There is an issue with your database persistence. The file is being uploaded but it’s not being recorded in your database for some reason.
Describe in detail what your hardware and software setup is, particularly the storage and OS.
You can probably check this by trying to upload something and then checking the database files to see the last modified date.
I use ZFS so I'm not sure about others, but I thought all CoW file systems had deduplication already? ZFS supports it natively (you just have to turn it on per dataset). Why make your own file deduplication system instead of just using a ZFS filesystem and letting that do the work for you?
Snapshots are also extremely efficient on CoW filesystems like ZFS, since they only store the diff between the previous state and the current one, so taking a snapshot every 5 mins is not a big deal for my homelab.
I can easily explore any of the snapshots and pull any file from any of them.
I'm not trying to shit on your project, just trying to understand its use case, since it seems to me ZFS already provides all these benefits.
Start with this to learn how snapshots work
https://fedoramagazine.org/working-with-btrfs-snapshots/
Then read this to learn how to make automatic snapshots with retention:
https://ounapuu.ee/posts/2022/04/05/btrfs-snapshots/
I do something very similar with ZFS snapshots and deduplication on. I take one every 5 mins and keep an hour's worth, then keep 24 hourly snapshots per day, one daily snapshot for a month, etc.
For backup to remote locations you can send a snapshot offsite
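The rotation described above can be sketched roughly like this. Dataset names, the `auto-` prefix, and the `backup-host` target are all placeholders of mine, not from the original post, and in practice tools like sanoid/syncoid or zfs-auto-snapshot handle this for you:

```shell
#!/bin/sh
# Sketch of a 5-minute snapshot tier with pruning and offsite send.
# Dataset and host names are illustrative placeholders.

DATASET="tank/data"

# Print all but the newest $1 entries from a newline-separated list on
# stdin (oldest first, as `zfs list -s creation` emits them).
prune_candidates() {
    head -n -"$1"
}

# Intended usage (not executed here), e.g. from a */5 cron entry:
#   zfs snapshot "${DATASET}@auto-$(date +%Y-%m-%d_%H%M)"
#   zfs list -t snapshot -o name -s creation -H "$DATASET" | grep '@auto-' \
#       | prune_candidates 12 | xargs -r -n1 zfs destroy   # keep 1 hr worth
#
# Offsite backup: send the newest snapshot to a remote pool over SSH:
#   zfs list -t snapshot -o name -s creation -H "$DATASET" | tail -n 1 \
#       | xargs -I{} sh -c 'zfs send {} | ssh backup-host zfs receive -F tank-bak/data'
```

Incremental sends (`zfs send -i`) make the offsite step cheap after the first full transfer.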
Thanks for this list!
First thing I would do is boot a live Ubuntu image from a USB. Make sure the hardware all works as expected.
Does anyone know if these AI chips will be good at transcoding (Jellyfin) or facial detection on a security camera (Frigate)? Seems like these might be good for homelabbers.
This right here. I tried to join Mastodon today.
Download the most recommended app, Moshidon
Open the app and get asked which instance I want to join. There are no suggestions.
Do a search for instances and pick one, go to the website and register with email and password. Requires email confirmation. Still waiting on the email confirmation link, 4 hrs later and 2 resends.
Literally haven’t been able to sign up yet.
Even if it had worked, the workflow would have been to switch back to the app, type out the instance, then log in again.
I'm not sure how anyone expects anyone other than the most hardcore users to sign up for these services. Maybe that's the point, but if the point is to grow, the sign-up process is a significant barrier overall.
Mastodon being 65% was a surprise to me for sure. What’s the best mastodon app?
Also how do I use it efficiently for tech related news and info? I never got into twitter.
Edit: I should probably specify for Android
Ya, Bazzite is based on Fedora with an immutable file system, which is called Fedora Atomic. Fedora Atomic then has variants like Bazzite (from the Universal Blue project), etc.
I’m curious if the baseline fedora desktop would have the same issues.
https://fedoraproject.org/spins/kde/download
Mixed refresh rates across monitors are a relatively new thing for Linux, so bugs are still being ironed out. It sucks that things like this are still not at parity with Windows, but it's improving.
Interesting. If you have some time, it might be worth booting a live USB of something like the Fedora KDE spin or Pop!_OS with the COSMIC DE, just to see if the issue persists on other distros.
In theory this should be working now; it's too bad it isn't. My desktop is a 4-monitor setup that I'm hoping to move to a Fedora-based distro as well.
Did you use Bazzite with GNOME or KDE? If I recall correctly, KDE Plasma 6.1 has support for multiple monitors with different refresh rates.
That’s really awesome.
Does an old version of MS Office like 2010, 2013, or 2016 work with this?
I tried to find this on DDG but also had trouble, so I dug it out of my docker compose.
Use this docker container:
prodrigestivill/postgres-backup-local
(I have one of these for every docker compose stack/app)
It connects to your Postgres and runs the pg_dump command on a schedule you set, with retention (you choose how many dumps to keep).
The output then goes to whatever folder you want.
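A minimal compose service along these lines (service names, credentials, and paths are illustrative placeholders; check the image's README for the exact environment variables and defaults):

```yaml
services:
  db:
    image: postgres:16
    environment:
      POSTGRES_DB: app
      POSTGRES_USER: app
      POSTGRES_PASSWORD: changeme
    volumes:
      - ./database:/var/lib/postgresql/data

  db-backup:
    image: prodrigestivill/postgres-backup-local
    depends_on:
      - db
    environment:
      POSTGRES_HOST: db
      POSTGRES_DB: app
      POSTGRES_USER: app
      POSTGRES_PASSWORD: changeme
      SCHEDULE: "@hourly"        # cron-style dump schedule
      BACKUP_KEEP_DAYS: "7"      # retention tiers: daily/weekly/monthly
      BACKUP_KEEP_WEEKS: "4"
      BACKUP_KEEP_MONTHS: "6"
    volumes:
      - ./db-bak:/backups        # dumps land in this host folder
```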
So I have a main folder called docker-data; this folder is backed up by Borgmatic.
Inside, I have a folder per app, like authentik.
In that, I have folders like data, database, db-bak, etc.
The Postgres data lives in database, and the output of the above dump goes into db-bak.
So if I need to recover something, the first step is to just copy the whole app folder and see if that works. If not, I can grab a database dump and restore it into the database and see if that works. If that fails too, I can pull a db dump from any of my previous backups until I find one that works.
I don’t shutdown or stop the app container to backup the database.
In addition to hourly Borg backups kept for 24 hrs, I have ZFS snapshots every 5 mins kept for an hour, and the pg_dump happens every hour as well. For a homelab this is probably more than sufficient.
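The dump-restore step described above might look roughly like this. The container name, credentials, and folder layout are my own placeholders, not from the original comment:

```shell
#!/bin/sh
# Sketch of restoring the most recent pg_dump into a running Postgres
# container. Names and paths are illustrative.

# Pick the newest dump in a backup folder; the image writes files with
# timestamped names, so a plain lexical sort puts the newest last.
newest_dump() {
    ls "$1" | sort | tail -n 1
}

# Intended usage (not executed here):
#   DUMP="./db-bak/last/$(newest_dump ./db-bak/last)"
#   zcat "$DUMP" | docker exec -i authentik-db psql -U app -d app
```

Restoring into a scratch database first is a cheap way to verify a dump before touching the live one.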
Fair enough. I primarily use NFS for Linux-to-Linux server communication and heavy file access.
SMB is mostly for moving files around occasionally.
Not sure if running a database over SMB is a good idea, but I do it over NFS all the time.
Regardless, it doesn't have to be exclusive; OP can change it up depending on the application.
You can use both without issue. I use NFS to share between two Linux servers (Unraid and Proxmox/Docker), and then some of those same folders are shared via SMB for a Windows desktop or Linux laptop.
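Exporting the same folder both ways is just two config entries; a minimal sketch (path, subnet, and user are placeholders, and you'd reload `nfs-server`/`smbd` after editing):

```
# /etc/exports — NFS export for the other Linux box
/srv/shared  192.168.1.0/24(rw,sync,no_subtree_check)

# /etc/samba/smb.conf — same folder over SMB for desktops/laptops
[shared]
    path = /srv/shared
    read only = no
    valid users = me
```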
Just checked, it's working fine for me.
Seadroid: 3.0.0 (from F-Droid)
Server: 11.0.8
Pixel 8, Android
It was working for me before as well with the 2.3.x version that I was using (don’t know the exact version)
For example, in my Aegis I can select backup folders; then in the file browser, Seafile shows up and lets me select a folder for backup as expected.
You could try restarting the phone in case it's a weird Android issue. Then you could try Seadroid 3.0.0.
What happens when you select the three-line button at the top left? For me it shows SeaDrive.
Nobara or Bazzite are the way to go.
Self-hosted AI seems like an intriguing option for those capable of running it. Naturally this will always be more complex than paying someone else to host it for you, but it seems like that's the only way if you care about privacy.
Use the multi container extension for Firefox and have all your Google stuff in one container, banks in another, social media in another etc.
https://addons.mozilla.org/en-US/firefox/addon/multi-account-containers/
I would strongly suggest a second device like an RPi with Gitea. That's what I have.
I use portainer to pull straight from git and deploy