deleted by creator
They could even arrange meetups like double dates and parties and such. The future is gonna be so chaotic. I love it.
deleted by creator
Some would do that. Some wouldn’t. Some would nudge in the direction of becoming even more fscked up.
Some moral limitations we take for granted in real people (often only imagined even there) don’t exist for computer programs at all.
If the algorithms of social media are any indication, it’s more likely they’d be programmed to manipulate the user into spending as much time with the AI as possible, while the AI serves them ads.
Hahahaha
I like how you think!
Until those partners have bodies …
A lot of top minds are saying 2024 will be a big year for robotics ಠ_ಠ
What would be the issue with that?
Big privacy advocate here, so I was curious what it takes to self-host something like that. Mostly I just want a very flexible personal assistant for product and weather alerts, all in one.
Takes a lot of RAM and GPU power, more than I have sitting around.
deleted by creator
Have you looked at quantised models? You can get pretty good ones at the 20 GB RAM+VRAM level, which is very reasonable if you have a gaming PC and are OK with responses not being instant.
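For anyone wondering where the "20 gig" figure comes from, here's my own back-of-envelope sketch (my numbers and overhead factor, not anything from this thread): at 4-bit quantisation each parameter takes roughly half a byte, plus some runtime overhead for the KV cache and buffers.

```python
# Rough memory math for quantised LLMs (back-of-envelope, my own assumptions).
# 4-bit quantisation ~= 0.5 bytes per parameter; the 1.2 overhead factor for
# KV cache and runtime buffers is a guess, not a measured value.

def quantised_size_gb(params_billions: float, bits: int = 4, overhead: float = 1.2) -> float:
    """Approximate RAM+VRAM needed to load a quantised model, in GB."""
    bytes_per_param = bits / 8
    return params_billions * bytes_per_param * overhead

# A ~33B model at 4 bits lands right around the 20 GB gaming-PC ballpark:
print(f"{quantised_size_gb(33):.1f} GB")   # ~19.8 GB
# while a 7B model is small enough for most mid-range GPUs:
print(f"{quantised_size_gb(7):.1f} GB")    # ~4.2 GB
```

So "pretty good ones at the 20 gig level" roughly means a 30B-class model quantised down to 4 bits.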
Already a thing on /g/
Not just scraping data, but spreading disinfo and radicalizing people. I saw a case about that recently. It's not just AI lovers, but potentially a number of AI applications.
Neural engines are coming to basically all CPUs. It won’t be long before you can run your own girlfriend offline on your phone. Training the model is the expensive part, after all. I can already run a basic llama 2B on my iPad, though I had to sideload the software instead of just downloading it off the App Store.
I’m fairly sure anyone with a good GPU can also run these, but I haven’t tried.
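The same arithmetic as above (again my own sketch, not from the thread) shows why a 2B model is plausible on a phone or iPad: at 4 bits the weights alone fit in about a gigabyte, versus several gigabytes at fp16.

```python
# Weight size only (ignoring KV cache and runtime overhead), for a
# phone-scale 2B-parameter model at different precisions.

def weights_gb(params_billions: float, bits: int) -> float:
    """Gigabytes of raw weights for a model of the given size and precision."""
    return params_billions * bits / 8

print(f"4-bit 2B model: {weights_gb(2, 4):.1f} GB of weights")   # 1.0 GB
print(f"fp16 2B model:  {weights_gb(2, 16):.1f} GB of weights")  # 4.0 GB
```

A gigabyte of weights fits comfortably in the RAM of a recent phone or tablet, which is why on-device inference works at this scale even without a discrete GPU.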
Pasterama, Don’t Date Fumos that are also Tulpas, and if you’re cold they’re cold, put them behind 3 secret walls that are on fire, under the sea.