• Gnome Kat@lemmy.blahaj.zone
    1 year ago

    There is a flip side of the coin for #2, and it is something no one really wants to talk about. People actually get very emotional if you even suggest it: the consciousness issue.

    Basically, if the claim is that machine learning is on the right path to explaining how our minds work, which is a claim I'm inclined to agree with, then it seems unreasonable to dismiss the idea that deep neural networks might now have some kind of qualitative conscious experience. I am not going to say for sure they do have conscious experience, they might not, but I think it's wholly unreasonable to dismiss the possibility out of hand.

    As it stands, we don't have any well-accepted theories on how consciousness arises at all. The issue is actually something science is not well equipped to address in its current state; we need fundamental philosophy to address it (I'm talking academic philosophy, not woo woo crystals shit, I shouldn't need to say this).

    The best we can do now is try to find what are referred to as "neural correlates of consciousness," which are the correlations between neural states and conscious experiences, but we don't have a way of explaining why those activity patterns produce the experiences they do. We have theories on how matter acts, not on what matter experiences. There is no connection between information processing and experience; that link just does not exist in our theoretical frameworks, and it's unlikely to appear with just more understanding of the details of how information is being processed in the brain. We need some way to link types of information processing to types of conscious experience. The closest we have is something like integrated information theory, but it's not fully accepted.

    • jadero@programming.dev
      1 year ago

      I agree that consciousness is a sensitive issue. I haven't refined my thinking on it far enough to really argue my position, but I suspect that it's just one more aspect of the "mind of the gaps". As with the various "god of the gaps" creationist arguments, I think that consciousness will end up falling into that same dead end. That is, we'll get far enough to start feeling comfortable with the idea that gaps are only gaps in the record or in our understanding, not failures of theory.

      Some current discussion of the matter is already starting to set up the relevant boundaries. We have ourselves as conscious beings. Over time we've come to accept that those with mental and intellectual disabilities are conscious. Some attempts to properly define consciousness leave us no choice but to conclude that consciousness is like intelligence in that it comes in degrees. That, in turn, opens the door to the possibility of consciousness in everything from crows and octopuses to butterflies and earthworms to bacteria and even plants.

      I find it particularly interesting that the “degrees of consciousness” map pretty nicely to the “degrees of intelligence”.

      So if you were to ask me today whether my old Fidelity chess computer was conscious, I'd say "to a low degree". Not because I claim any kind of special knowledge, but because I'd be willing to bet a small amount of money that we'll get to the point where the question can actually be answered with confidence, and that the answer would likely be "to a low degree".

      To your discussion of the neural correlates of consciousness: my opinion is that claiming this still tells us nothing about "what matter experiences" is a step into the "mind of the gaps". I'm happy enough to have those correlates as evidence that information processing and consciousness cannot be kept separate.