• EnderMB@lemmy.world
    edited 3 months ago

    An LLM is basically just an orchestration mechanism. Saying an LLM doesn’t do reasoning is like saying a step function can’t send an email. The step function can’t, but the lambda I’ve attached to it sure as shit can.
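
    To make the analogy concrete, here’s a toy sketch in Python. Everything in it is hypothetical (fake_llm, send_email, and orchestrate are names I made up, not any real API); the point is just that the model only emits a decision, and the attached tool performs the action.

    ```python
    import json

    def fake_llm(prompt: str) -> str:
        """Stand-in for a model call: returns a tool invocation as JSON."""
        return json.dumps({
            "tool": "send_email",
            "args": {"to": "ops@example.com", "body": prompt},
        })

    def send_email(to: str, body: str) -> str:
        """The 'lambda': the side effect the model itself cannot perform."""
        return f"email sent to {to}: {body!r}"

    TOOLS = {"send_email": send_email}

    def orchestrate(prompt: str) -> str:
        """The 'step function': route the model's decision to a real tool."""
        call = json.loads(fake_llm(prompt))
        return TOOLS[call["tool"]](**call["args"])

    print(orchestrate("disk usage at 90% on host-7"))
    ```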

    ChatGPT isn’t just a model sitting somewhere. There are likely hundreds of services working behind the scenes to coerce the LLM into producing the right result: entity resolution, expert mapping, perhaps even techniques that will “reason”.
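
    Here’s the rough shape of that kind of pipeline, as a hypothetical sketch (the service names - resolve_entities, route_to_expert, call_model - are stand-ins I made up, not anything ChatGPT actually runs):

    ```python
    import string

    def resolve_entities(query: str) -> dict:
        """Entity-resolution stub: tag known entities before the model sees them."""
        known = {"paris": "city:Paris,FR"}
        words = [w.strip(string.punctuation) for w in query.lower().split()]
        return {w: known[w] for w in words if w in known}

    def route_to_expert(query: str) -> str:
        """Expert-mapping stub: pick a specialised model/prompt for the topic."""
        return "geography-expert" if "paris" in query.lower() else "generalist"

    def call_model(query: str, expert: str, entities: dict) -> str:
        """Stand-in for the actual LLM call, given the enriched context."""
        return f"[{expert}] answer for {query!r} using {entities}"

    def answer(query: str) -> str:
        entities = resolve_entities(query)           # pre-processing service
        expert = route_to_expert(query)              # routing service
        return call_model(query, expert, entities)   # the model itself

    print(answer("How large is Paris?"))
    ```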

    The initial point is right, though. This ain’t AGI, not even close. It’s just standard compositional stuff with a new orchestration mechanism, one better suited to long-form responses - and to wild hallucinations…

    Source: Working on this right now.

    Edit: Imagine downvoting someone who literally works on LLMs for a living. Lemmy is a joke sometimes…