• 1 Post
  • 96 Comments
Joined 1 year ago
Cake day: July 1st, 2023


  • Microsoft’s bread and butter has been selling and servicing to businesses.

    So with that in mind, what the hell are they thinking? Windows 10’s end of life guarantees that businesses specifically will have to switch. Then the next option in line is one that will by default vacuum up all your proprietary information to feed into an AI, effectively “copyright laundering” it?

    Even if there are ways to deactivate the feature, the non-tech-savvy managers will just go off the headlines, and the tech-savvy ones will recognize the security risk. And government/healthcare computing might just fork Linux into a non-open-source version.

    Ironically, it feels like they’re focusing too much on consumers (on extorting them) and shooting themselves in the foot with their business clientele.




  • I’ve met four different people involved in the military, and I’ve also met four questionable people.

    My dad, never got deployed, was in prison for fencing items, also owned businesses that in retrospect were suspiciously ideal for money laundering. o7

    Then my childhood friend, sprayed Nazi graffiti around town, went to juvie, now serves the troops. o7

    Then a coworker, former military (allegedly), has a psychosis (which isn’t bad!), and was harassing his ex at her work based on delusions (which is bad). o7

    Then a different coworker at a different place, active military, very authoritarian despite not knowing much and not being our supervisor. He made everyone uncomfortable and frustrated, including our actual supervisor. Now he’s joining the National Guard. o7







  • In terms of LLM hallucination, it feels like the name very aptly describes the behavior and severity. It doesn’t downplay what’s happening because it’s generally accepted that having a source of information hallucinate is bad.

    I feel like the alternatives would downplay the problem. A “glitch” is generic and common, “lying” is just inaccurate since that implies intent to deceive, and just being “wrong” doesn’t get across how elaborately wrong an LLM can be.

    Hallucination fits pretty well and is also pretty evocative. I doubt that AI promoters want to effectively call their product schizophrenic, which is what most people think of when they hear “hallucination.”

    Ultimately, all the sciences are full of names borrowed by analogy to make conversation easier; it’s not always marketing. No different than when physicists say particles have “spin” or “color” or that spacetime is a “fabric” or [insert entirety of String theory]…


  • I’m a bit annoyed at all the people being pedantic about the term “hallucinate.”

    Programmers use preexisting concepts as analogies for computer concepts all the time.

    Your file isn’t really a file, your desktop isn’t a desk, your recycling bin isn’t a recycling bin.

    [Insert the entirety of Object Oriented Programming here]

    Neural networks aren’t really neurons, genetic algorithms aren’t really genetics, and the LLM isn’t really hallucinating.

    But it easily conveys what the bug is. It only personifies the LLM because English almost always personifies the subject. The moment you apply a verb to an object, you imply it performed an action, unless you limit yourself to esoteric words/acronyms or use several words to over-explain every time.