• wonderingwanderer@sopuli.xyz
    9 hours ago

    We don’t know enough about human consciousness to know that for sure. Plenty of animals have fewer brain cells than humans, but we don’t know enough about their consciousness to say whether they have an internal experience.

    That’s what I mean. It’s hubris to assume it’s acceptable to culture human brain cells in a petri dish just because there’s no evidence one way or the other about whether they’re aware.

    • TechLich@lemmy.world
      14 hours ago

      There’s a lack of evidence for anything not being conscious.

      Neurons work by generating electrical signals in response to stimuli (either electrical inputs from other neurons or physical/sensory inputs activated by light or touch etc.) and they do this in a physical way.

      If they’re conscious, then there’s a pretty good chance that power plants are conscious, computers are conscious and pretty much everything else in the world is conscious.

      I’m not sure there’s any requirement for consciousness to include “human-like reasoning” or “understanding” for it to have some kind of experience and perspective or awareness. Humans make a lot of assumptions about the world to make it fit the patterns we’re used to.

      A cluster of neurons trained to play doom might have consciousness but it’s not likely to think like a human, just like a rock or a plant or an ant or an iPhone might have consciousness.

      Whether it’s ethical to squash an ant or turn off an iPhone or stimulate a lab-grown neuron depends on your ethical framework and your philosophical worldview.

      • wonderingwanderer@sopuli.xyz
        8 hours ago

        There’s a lack of evidence for anything not being conscious.

        So should we just assume that nothing is conscious? After all, I can’t prove that you’re conscious, nor you I. So should we relegate ourselves to an amoral solipsism?

        Neurons work by generating electrical signals in response to stimuli and they do this in a physical way.

        I know how neurons work. Nobody knows why they produce consciousness or what particular mechanism is responsible for human awareness.

        I’m not sure there’s any requirement for consciousness to include “human-like reasoning” or “understanding” for it to have some kind of experience and perspective or awareness.

        That’s… irrelevant. I never said they have “human-like reasoning” or “understanding.” I said we don’t understand enough, meaning humanity writ large, including the experts. There are too many unknowns about the nature of consciousness.

        A cluster of neurons trained to play doom might have consciousness but it’s not likely to think like a human

        Again, it doesn’t need to think like a human in order to be capable of experiencing suffering. Babies don’t “think like humans,” or at least we don’t have any solid evidence that they do, but they’re certainly capable of suffering.

        Your mentality is the same one people have used for generations to justify circumcising infants without anaesthetics. How far are you willing to extend it? Do pets “think like humans”? Do uncontacted tribes “think like humans,” in whatever vague way you define it in order to justify cultivating human brain cells in a petri dish?

        Do you not see how problematic this is? What if the technology grows and in a decade they’re studying a clump of 2 billion neurons in a vat? Will it suddenly become human enough to deserve your consideration? What about when it becomes 20 billion?

        Whether it’s ethical to squash an ant or turn off an iPhone or stimulate a lab-grown neuron depends on your ethical framework and your philosophical worldview.

        Whether it’s ethical to murder an entire village of your enemies “depends on your ethical framework and philosophical worldview.” See what a slippery slope moral relativism is? Amoral people exist, moral cynicism exists, nihilism exists, solipsism exists, hell even social darwinism exists.

        Any of those frameworks and worldviews can be used to justify atrocities in the minds of those who hold them. And yes, an unethical or even anti-ethical persuasion is still an “ethical framework,” in the strictest sense of the term.

        Just because something can be couched in philosophical jargon doesn’t mean we should grant it license to do whatever it wants.

      • bss03@infosec.pub
        13 hours ago

        I think there’s a lower limit of complexity for sentience, based on memory-persistence, self-firing, and self-recognition. I think there’s no need for moral concern for non-sentient things. (But, that’s just my ethical framework and philosophical worldview; the only “evidence” I’m at all aware of is thin and vague.)

        But, as far as having a subjective experience, I think that might go quite small and alien, including fungi and plants or even certain sub-cellular structures. Probably anything that maintains a border and internal homeostasis, including parts of the bodies of larger experiencers, could be having an internal perspective; and any human words applied to those experiences would tell you more about human bias than about the experience itself.

        • LH0ezVT@sh.itjust.works
          10 hours ago

          In my view, although I am neither a neurologist nor a philosopher, moral concern should absolutely scale with the complexity of the neuron blob, and it should do so non-linearly. I dislike harming an animal with a complex brain, like mammals, cephalopods, etc., much more than I dislike harming the equivalent nerve mass in insects, for instance.

          • bss03@infosec.pub
            2 hours ago

            That’s also the way I feel, but I think that’s probably human bias, closely related to the evolutionary pressure behind my mirror neurons: how strongly they trigger correlates with how closely the other being’s phenotype resembles ours.

            I think if I knew what it felt like (if anything) to be an ant colony, I might have different views around the casual use of boric acid (and related substances) to keep them out of human spaces.