• Clbull@lemmy.world
    19 points · 3 hours ago

    Cortical Labs are the ones who pulled this off. They already have biological computers running on 800,000 lab-grown neurons available for ~$35,000 (just going on what a quick Google search told me) and are planning to open up a cloud computing service with its own API soon.

    This makes me feel uneasy. Imagine if reincarnation were a thing and you got brought back into this world, and your whole purpose was to learn how to play DOOM.

    • gerryflap@feddit.nl
      13 points · 2 hours ago

      Personally, my worry really isn't reincarnation; there's no reason to believe that's true. But if these are fundamentally the same neurons that make up our brains, then how many do you need to put together before they acquire some form of "sentience"? Does a clump of 800,000 human neurons experience pain, sadness, a sense of self? Where is the line between an emotionless biocomputer and torturing a living organism for its entire lifespan?

      Much as I hate "AI", that question was of course already somewhat relevant for the latest AI models, even though we can generally conclude that they're not there yet at all. But real neurons are different; we know what they're capable of. How many do you need before a clump of neurons has rights?

      • Jyek@sh.itjust.works
        1 point · 7 minutes ago

        Large language models are not intelligent. They are predictive text applications with massive dictionaries of circumstantial sentence structures to choose from. Nothing more. They do not feel and do not think for themselves. The only time they do anything is when the API calls them to produce more text with an updated context string.
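
The "predictive text" framing in the comment above can be illustrated with the simplest possible next-word predictor. This is a toy bigram lookup table, not how real LLMs work (they are neural networks over tokens, not dictionaries); the tiny corpus here is a made-up example:

```python
from collections import Counter, defaultdict

# Count, for each word, which words have followed it in the corpus.
corpus = "the cells play doom the cells fire the cells play doom".split()

following = defaultdict(Counter)
for word, nxt in zip(corpus, corpus[1:]):
    following[word][nxt] += 1

def predict(word):
    """Most common word seen after `word` in the corpus."""
    return following[word].most_common(1)[0][0]

print(predict("cells"))  # "play" (seen twice, vs "fire" once)
```

Everything the model "does" is look up counts; it has no state between calls, which is roughly the point the commenter is making about context strings.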

        • Schadrach@lemmy.sdf.org
          3 points · 52 minutes ago

          Sure, but is the full human brain the minimum set necessary?

          Sentience/sapience is probably an emergent property of a set of neurons needing to coordinate, plan, predict the future and oneself in relation to it.

          I suspect that AI is capable of sentience with sufficient complexity and training, but it's not there yet. I also suspect we'll be well past the point where it is there before we realize it is, but not until we make some kind of fundamental change in how we do it: we know human-level intelligence is possible in the volume and power consumption of, well, a brain, so we're orders of magnitude away from the efficiency limits.

          • Washedupcynic@lemmy.ca
            1 point · 9 minutes ago

            It's estimated that mice have 70 million to 100 million neurons in their brains. They are capable of feeling pain and have social hierarchies. They also experience emotions like fear, pleasure, and anxiety. (We use them in pharmacology models of many mental illnesses.)

            Have you ever heard the phrase, “the neurons that fire together, wire together” ? Our neurons are in a constant feedback loop with the environment we experience. Our experiences shape how our neurons make interconnected networks, which then impacts how we behave upon the environment.
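
The "fire together, wire together" idea (Hebbian learning) can be sketched as a toy weight update. This is a standard textbook illustration, not Cortical Labs' actual method; the learning rate, neuron counts, and activity patterns are arbitrary assumptions for the example:

```python
# Toy Hebbian update: the weight between two neurons strengthens when
# both are active at the same time ("fire together, wire together").
def hebbian_step(weights, pre, post, rate=0.1):
    """Return updated weights given pre- and post-synaptic activity (0/1)."""
    return [
        [w + rate * pre[i] * post[j] for j, w in enumerate(row)]
        for i, row in enumerate(weights)
    ]

# Two pre-synaptic and two post-synaptic neurons, all weights start at 0.
w = [[0.0, 0.0], [0.0, 0.0]]
# Neuron 0 on both sides fires repeatedly; neuron 1 stays silent.
for _ in range(5):
    w = hebbian_step(w, pre=[1, 0], post=[1, 0])
print(w)  # only the 0->0 connection has strengthened
```

The point of the analogy: the connectivity that forms is entirely a product of whatever activity the environment drives, which for these cells is only the game.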

            If those neurons connected to the computer chip only ever experience playing the game “DOOM,” how would they know about anything else? How could they know about pain without having limbs to innervate and experience the pain with? How could they have a social hierarchy without others to interact with? We may as well be god to those neurons on the PC chip, because we are controlling the entire world they have access to.

            What I find sad is that our society is ok with hooking living cells up to a computer to make smarter computers, but has a problem with ethically harvesting stem cells to be used to treat diseases.

  • matlag@sh.itjust.works
    5 points · 2 hours ago

    Am I the only one who wonders why it HAS to be Doom? In a world where there are already concerns about machine rebellion, when we train rats, robots, and a bunch of neurons to play a game, can't we pick another, non-violent, or let's be bold, non-destructive game??

    • UnderpantsWeevil@lemmy.world
      9 points · 2 hours ago

      They trained a tiny patch of neurons to respond to low-voltage electric impulses. The cells don’t know they’re playing Doom. They don’t have any kind of social context or even video feedback.

      Imagine if I stuck you in a sensory deprivation chamber, handed you an NES controller, and asked you to hit the buttons. Then, periodically, I said “Yes” or “No” based on the buttons you pressed. And when I pulled you out of the tube at the end of an hour, I told you “the yes and no messages were intended to encourage you to correctly navigate Mario through the first level of the original game.” What if, instead of Mario, I’d been telling you how to play Street Fighter?

      It doesn't matter if it's Doom. They likely picked Doom because the I/O is so rudimentary that you can run the game on practically anything. The cellular matter has no idea what it's doing beyond the "Yes/No" signaling.
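
The sensory-deprivation analogy above amounts to a closed loop of blind actions and yes/no feedback. Here is a minimal sketch of such a loop; the button set, hidden target, and scoring scheme are all made up for illustration and are not Cortical Labs' actual protocol:

```python
import random

# An "agent" that only ever sees yes/no feedback, with no idea
# what game (if any) its button presses are driving.
random.seed(0)

BUTTONS = ["up", "down", "left", "right"]
TARGET = "right"  # the experimenter's hidden goal

# Start with equal preference for every button.
prefs = {b: 1.0 for b in BUTTONS}

def press():
    """Pick a button with probability proportional to its preference."""
    total = sum(prefs.values())
    r = random.uniform(0, total)
    for b, p in prefs.items():
        r -= p
        if r <= 0:
            return b
    return BUTTONS[-1]

for _ in range(500):
    b = press()
    if b == TARGET:        # experimenter says "yes"
        prefs[b] += 0.5
    else:                  # experimenter says "no"
        prefs[b] = max(0.1, prefs[b] - 0.1)

# After training, TARGET dominates, yet the agent never "saw" the game.
```

From the inside of the loop, "playing Doom" and "playing Street Fighter" are indistinguishable; only the experimenter's reward mapping differs.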

      • matlag@sh.itjust.works
        1 point · 19 minutes ago

        I know there is no real association between the game and real life. It's more a question about the mindset of the researchers. I'm sure there are other games that would fit their needs.

    • bampop@lemmy.world
      5 points · 4 hours ago

      OK but hear me out here, I think I have the beginnings of a business plan:

      1. Create the Torment Nexus

      2. ?

      3. Profit

      Some components of the plan are still under development, but let’s not lose momentum. We can advance with the initial phase while brainstorming to refine the plan in real time as we progress. It’s an exciting opportunity and we mustn’t forfeit our first-to-market advantage.

    • ouRKaoS@lemmy.today
      5 points · 5 hours ago

      Scientists: “No, this isn’t The Torment Nexus, this is ‘The Nexus of Torment’! It’s totally different!”

    • AEsheron@lemmy.world
      8 points · 5 hours ago

      IIRC, it doesn't actually play the game itself. We prod the cells, they fire in a certain way, and that response is read and converted into an output for the game. The cells aren't a rudimentary Doom bot; they're the controller.

  • MonkderVierte@lemmy.zip
    13 points · edited · 4 hours ago

    we grew a human brain

    200’000 brian “brain cells” (of which about 1/3 are neurons) is roughly equivalent to a really simple microcontroller.

    Edit: left the typo for funny

    • Sturgist@lemmy.ca
      15 points · 8 hours ago

      Honestly? Sounds preferable to being stuck in the universe of I Have No Mouth And I Must Scream… I’ll take a challenging power fantasy with some massively overpowered weapons over millennia of endless physical and psychological torture by an insane AI… might just be me though…

    • TheLeadenSea@sh.itjust.works
      1 point · 2 hours ago

      I think they should already have been worried: basically every animal endures immense suffering in its life. You also don't need to believe in literal reincarnation for this to matter; you just need empathy.

  • JojoWakaki@lemmy.world
    9 points · 7 hours ago

    “Scientists were so preoccupied with whether or not they could, they didn’t stop to think if they should”

    Michael, on his fictional book about sex-changing lizards in an unconventional zoo.