• theherk@lemmy.world · 18 days ago

    The human could be described in very similar terms. People think we’re magic or something, but we too are just weighted neural networks assembling outputs based strictly on training data built up through reinforcement. We are, for the moment, just much, much better at it, with far more massive models. Of course that is reductive, but many seem to forget that brains struggle similarly when operating outside their training data.

        • rottingleaf@lemmy.zip · 17 days ago

          I could find a dozen better ones on Google, but I’m not a neurophysiologist.

          The important thing here is that neural nets do not describe the human brain.

            • areyouevenreal@lemm.ee · 17 days ago

            Artificial neural nets, no; but neural networks in general, yes. Just because the computer version isn’t like the real thing doesn’t mean humans don’t use a type of neural network.

      • theherk@lemmy.world · edited · 17 days ago

        I’m slightly confused. Which part needs an academic paper? I’ve made three admittedly reductive claims.

        • Human brains are neural networks.
        • Their outputs are based on training data built up through reinforcement.
        • We have a far more massive model than any current artificial network.

        First, I’m not trying to make some really clever statement. I’m just saying there is a perspective from which the human brain can be given a broadly similar description. Nevertheless, let’s look at the only three assertions I make here. Given that the term neural network takes its name from the neurons that make up brains, I assume you don’t take issue with the first. On the second, I don’t know if linking to scholarly research is helpful. Is it not well established that animals learn through reward circuitry, such as the role of dopamine in neuromodulation? We also have… education, where we are fed information so that we retain it and can recount it down the road.
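
        Loosely in that spirit, here is a toy sketch in Python of a single artificial “neuron” whose weights are nudged by a scalar reward signal. It is my own illustration of reinforcement-style weight updates, not a model of actual neurobiology or of how any particular system is trained:

        ```python
        import random

        def act(weights, inputs):
            # Thresholded weighted sum: the "neuron" fires (1) or stays silent (0).
            return 1 if sum(w * x for w, x in zip(weights, inputs)) > 0 else 0

        # Target behaviour: logical OR of two binary inputs.
        examples = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]

        weights = [random.uniform(-0.5, 0.5) for _ in range(2)]
        lr = 0.1

        for _ in range(200):
            inputs, target = random.choice(examples)
            output = act(weights, inputs)
            reward = 1 if output == target else -1  # scalar reinforcement signal
            # Reward-modulated update: strengthen the weights behind a rewarded
            # output, push them the other way after a punished one.
            for i, x in enumerate(inputs):
                weights[i] += lr * reward * (1 if output == 1 else -1) * x

        print("weights:", [round(w, 2) for w in weights])
        print("outputs:", [act(weights, x) for x, _ in examples])
        ```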

        I guess it is worth exploring the third, even though I really wasn’t intending to make a scholarly statement. There is an article in Scientific American that puts the number of neural connections at around 100 trillion. Now, how that equates directly to model parameters is absolutely unclear, but even if you take glial cells, where the number can be as low as 40–130 billion according to The search for true numbers of neurons and glial cells in the human brain: A review of 150 years of cell counting, that number is in the same order of magnitude as current models’ parameter counts. So if your issue is that AI models are actually larger than the human brain, maybe there is something cogent there. But given that there is likely at least a 1000:1 ratio of neural connections to neurons, I just don’t think that is fair at all.
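
        For what it’s worth, that size comparison is just order-of-magnitude arithmetic. A rough sketch of it in Python, where the artificial-model parameter counts are illustrative assumptions rather than figures for any specific model:

        ```python
        # Rough order-of-magnitude comparison using the figures cited above.
        NEURONS = 86e9                 # ~86 billion neurons (commonly cited estimate)
        CONNECTIONS_PER_NEURON = 1000  # the ~1000:1 connections-to-neurons ratio mentioned above
        synapses = NEURONS * CONNECTIONS_PER_NEURON  # ~8.6e13, i.e. roughly 100 trillion

        # Illustrative (assumed) parameter counts for large artificial networks.
        models = {"~100B-parameter model": 1e11, "~1T-parameter model": 1e12}

        for name, params in models.items():
            ratio = synapses / params
            print(f"{name}: the brain has roughly {ratio:,.0f}x more connections than parameters")
        ```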