Does every PeerTube instance store all other instances’ content metadata (title, description, comments)? Would federating virtually with YouTube (through a YouTube frontend like Piped) push too much data onto other instances to store?
As of June 2022, more than 500 hours of video were uploaded to YouTube every minute. If that video is stored at 6 Mb/s, a conservative estimate, that works out to roughly 1.35 TB of new video every minute. Obviously you wouldn’t want everything, but I’m sure it’s still far more than the PeerTube network could handle in its current form. I cannot fathom the amount of data that Big Data™ handles; the fact that most of these services are free makes me highly suspicious.
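A quick back-of-envelope sketch of that math, assuming the 500 hours/minute upload figure and a 6 Mb/s average stored bitrate (both are rough assumptions, not official numbers):

```python
# Back-of-envelope: storage added per real-time minute, assuming
# 500 hours of video arrive every minute at an average of 6 Mb/s.

HOURS_UPLOADED_PER_MINUTE = 500      # reported YouTube figure, June 2022
AVG_BITRATE_MBPS = 6                 # assumed average stored bitrate (megabits/s)

seconds_of_video = HOURS_UPLOADED_PER_MINUTE * 3600          # 1,800,000 s of footage
bytes_per_second = AVG_BITRATE_MBPS * 1_000_000 / 8          # 750,000 B per second of footage

terabytes_per_minute = seconds_of_video * bytes_per_second / 1e12
print(f"{terabytes_per_minute:.2f} TB of new video per minute")
# -> 1.35 TB per minute, i.e. roughly 2 PB per day
```

And that’s just the video files, before you even count the metadata (titles, descriptions, comments) the question is actually asking about.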