Three possibilities come to mind:

Is there an evolutionary purpose?

Does it arise as a consequence of our mental activities, a sort of side effect of our thinking?

Is it given a priori (something we have to think in order to think at all)?

EDIT: Thanks for all the responses! Just one thing that came up a few times that I’d like to address: a lot of people are asking ‘Why assume this?’ The answer is: it’s purely rhetorical! That said, I’m happy with a well thought-out ‘I dispute the premiss’ answer.

  • HowManyNimons@lemmy.world · 24 days ago

    Haha “Don’t think about it too hard,” I suggested! But there’s some real value in what you’ve said: I do find your idea of an “audience” very helpful and will have to cogitate on it.

    The first thing it makes me think of is a story about Doctor Who that I once made up while trying to sleep: are you familiar with Doctor Who? It’s a sci-fi show like a home-made British Star Trek with more tinfoil and more time travel.

    In my story, if you’ll indulge me, one of the characters briefly enters a super-perceptual state in which she sees some kind of invisible entity steal the consciousness out of her friend’s head, squirt someone else’s consciousness in there, and then make off with the one originally belonging to her friend.

    It’s like the audience from your analogy walked out of one brain and became the audience for another brain. Her friend wakes up and doesn’t seem to be any different: the brain his consciousness inhabits still has all his memories and nobody else’s, so why would she expect him to be any different? But the “audience” that observed their friendship hitherto has been spirited away to who-knows-where, and this grieves her.

    The middle of the story consists of her trying to track down the entity, and of her trying, with only ambiguous success, to discern any difference in his behaviours or thoughts. She becomes increasingly desperate as she imagines the invisible entity getting further and further away.

    In the end, she tracks down a way to re-enter the state in which she perceived the entity originally. This time, she is able to look over a large group of people, and the punch line, of course, is that there are countless such entities swapping our “audiences” from head to head constantly. How many times in the last minute has she herself had her consciousness swapped for somebody else’s?

    It makes me wonder whether the ownership relation between brain and audience is a real thing. Is that connection a necessary element that is missing from our materialistic definition of a consciousness? Maybe that’s the nagging thing that you call a communication channel.

    I’ve got another thought experiment to share which challenges this idea, but I’ve given you far too much to wade through already so I’ll save it.

    • WatDabney@sopuli.xyz · 24 days ago

      That’s a fascinating concept.

      And yes - though a Yank, I know Doctor Who. ;)

      (And this is the point at which I accidentally tapped “Reply” last time through, which is why there’s a deleted post before this one.)

      Anyway…

      My first reaction was that it didn’t make sense that a consciousness could find itself attached to (hosted by?) a different mind and just blithely continue on.

      But the more I think about it, the more I think that’s at least reasonable, and possibly even likely.

      A consciousness might be comparable to a highly sophisticated, self-aware frontend. Any range of data or software can be stored and run through it. When new data or even a new piece of software is introduced, the frontend/consciousness can and will (if it’s working correctly) integrate it with the system; it can also review the data and software it oversees, find flaws, and (unless the ego subsystem intervenes) amend or replace them, and so on.

      And viewed that way, and taking into account the likely mechanics of the whole thing, it really is possible and arguably even likely that it would be essentially content-neutral. It would make sense that while the experience of “I the audience” is itself a distinct thing, the specific details - the beliefs and values and memories and such that make it up - are just data pulled from memory, and it could just as easily pull any other data from any other memory (if it had access to it).
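
      To make that concrete for myself, here’s a toy sketch in Python - every name in it (MemoryStore, Frontend, attach, introspect) is something I’m inventing purely for illustration, not a claim about how minds actually work. The point it tries to show is the content-neutrality: the “audience” object stores nothing of its own, so re-attaching it to a different memory store leaves nothing behind that could register the swap.

      from dataclasses import dataclass, field


      @dataclass
      class MemoryStore:
          """The 'data': one brain's beliefs, values and memories."""
          owner: str
          memories: list = field(default_factory=list)


      class Frontend:
          """The 'audience': it reads whatever store it is attached to,
          but holds no beliefs, values or memories of its own."""

          def __init__(self):
              self.store = None

          def attach(self, store):
              # Swapping stores is undetectable from the inside: the frontend
              # has no content of its own that could contradict the new store.
              self.store = store

          def introspect(self):
              if self.store is None:
                  return "I am no one in particular."
              return f"I am {self.store.owner} and I remember {self.store.memories}."


      alice = MemoryStore("Alice", ["the invisible entity", "last Tuesday"])
      bob = MemoryStore("Bob", ["an entirely different childhood"])

      audience = Frontend()
      audience.attach(alice)
      print(audience.introspect())  # reports Alice's identity and memories

      audience.attach(bob)          # the "entity" re-homes the audience
      print(audience.introspect())  # now reports Bob's, with nothing to notice the swap

      Everything interesting lives in the store; the frontend is just the thing that reads it out, which is exactly the sense in which it could “just as easily pull any other data from any other memory”.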

      Fascinating…