It’s unfortunate that they’re using an old processor, but this is super cool and shows that the framework platform allows companies to tinker with unusual laptop motherboards without having to design the rest of the device.
It hasn’t been decided in court yet, but it’s likely that AI training won’t be considered a copyright violation, especially if there is a measure in place to prevent exact 1:1 reproductions of the training material.
But even then, how are the questionable choices of some LLM trainers a reason to ban all AI? There are some models that are trained exclusively on material that is explicitly licensed for this purpose. There’s nothing legally or morally dubious about training an LLM if the training material is all properly licensed, right?
All it would take is a simple law to be passed.
I get the impression with a lot of Israeli politicians, they didn’t hate the Nazis so much as they hated being the out-group of the Nazis. Since, as soon as they were able to establish a foothold of power they’ve largely followed the same playbook, but with themselves as the “Master Race” and everybody else as the vermin to exterminate.
Most computer displays support HDMI too, though. In the past, though, there were usually tradeoffs to using the HDMI input. Now HDMI has caught up enough that there’s usually no difference, assuming the manufacturer is using the latest standard.
Oh yeah those are problematic, but I’m pretty sure a court has ruled in a customer’s favor when the AI fucked up, which is good at least.
I’m all down for weight reduction, like switching to titanium.
Admittedly yeah this would be super cool, especially if it was as thin as the new iPads. I’ve never considered buying a foldable phone, and part of that is that when folded up they do seem too thick. An iPhone that could fold out to basically be an iPad mini and fold completely flat would be cool.
If it ends up being ruled that training an LLM is fair use so long as the LLM doesn’t reproduce the works it is trained on verbatim, then licensing becomes irrelevant.
I have a 14” M3 Max MBP for work and I have to agree, the design is fantastic. The weight and thickness have never been an issue for me, and it’s nearly silent unless I have all cores maxed out for more than a couple of minutes. Battery life is phenomenal too; I love that when traveling it can make it through an entire day no matter what I throw at it. If they ruin that for me I’ll be so disappointed.
Chatbots are fine as long as it’s clearly disclosed to the user that anything they generate could be wrong. They’re super useful just as an idea generating machine for example, or even as a starting point for technical questions when you don’t know what the right vocabulary is to describe a problem.
I thought we did this already and came to the conclusion that thinness only matters up to a point, then it’s just removing battery life/functionality/durability chasing a benchmark that nobody actually cares about. Oh well, hopefully they learned their lessons last time and it’s better this go around.
Yeah this is super sensible. Out of curiosity, do you have any decent examples of bad usage? I find chatbots and GitHub Copilot type stuff to be fine, and the rewording applications too. I haven’t used it, but Duolingo has an AI mode now and it sounds questionable, though maybe it’s elementary enough and fine-tuned well enough for the content in the supported courses that errors are extremely rare or at least detectable.
Lmao you got some criticism and now you’re saying everyone else is a bot or has an agenda. I am a software engineer and my organization does not gain any specific benefits for promoting AI in any way. They don’t sell AI products and never will. We do publish open source work however, and per its license anyone is free to use it for any purpose, AI training included. It’s actually great that our work is in training sets, because it means our users can ask tools like ChatGPT questions and it can usually generate accurate code, at least for the simple cases. Saves us time answering those questions ourselves.
I think that the anti-AI hysteria is stupid virtue signaling for luddites. LLMs are here, whether or not they train on your random project isn’t going to affect them in any meaningful way, there are more than enough fully open source works to train on. Better to have your work included so that the LLM can recommend it to people or answer questions about it.
This kind of comment just stigmatizes people getting actual medical help. Be better
The fucking government is more concerned with punishing made up “criminals” than ensuring people have access to the doctor-prescribed drugs they need to function.
Anything you put publicly on the internet in a well known format is likely to end up in a training set. It hasn’t been decided legally yet, but it’s very likely that training a model will fall under fair use. Commercial solutions go a step further and prevent exact 1:1 reproductions, which would likely settle any ambiguity. You can throw anti-AI licenses on it, but until it’s determined to be a violation of copyright, it is literally meaningless.
Also, if you just hope to spam tab with any of the AI code generators and get good results, you won’t. That’s not how those work. Saying something like this just shows the world that you have no idea how to use the tool; it says nothing about the quality of the tool itself. AI is a useful tool, it’s not a magic bullet.
I assume the primary market for this is insurance companies who salivate at any data they can use to justify a rate hike. Secondarily advertisers, but they probably wouldn’t pay nearly as much since they have all sorts of data sources to pick from.
This seems like a win for privacy. Modern cars collect a creepy amount of data, often without the user’s knowledge or any ability to opt out. This article makes it seem like some car manufacturers are no longer selling the data, but I’m not sure how true that is.
Probably not. Electron is popular not just for its cross-platform support, but also because its skills transfer directly from existing web dev.