> I suspect that there isn't anything to the experience of being like a worm
Even a worm has goals to attain - food and reproduction. In order to attain those goals, it needs to have perceptions and to rank those by the estimated return value. That is what it's like to be a worm.
Talking about a worm as if it had goals is a teleological analogy that makes discussing its behavior easier, but I think it is a mistake to take this usage literally. If you do, you must also conclude that a thermostat has a goal, and therefore (by your argument) that there is something it is like to be a thermostat. I can only see this line of argument as a distraction that will confuse any attempt to understand consciousness.
The rest of this discussion is hard to follow because it has apparently been edited, but you can't draw a line between applying your argument to animate things and not to inanimate ones simply by choosing one word over another.
Yes, you can draw a very clean line. It's the line of self-replication. Self-replicators have an internal goal: to maintain their existence by growing and replicating. The more advanced ones develop complex responses to changes in their environments. Humans are just the apex of this process. A toaster or a thermostat doesn't self-replicate, so it has no goals. Having no goals means no consciousness, because goals shape the development of consciousness.
You are taking the teleological analogy literally again. It was Darwin's biggest insight that this is precisely not how evolution works.
Adding the word 'internal' doesn't help either. You could just as well say that a thermostat has an internal goal of staying warm.
This is moot anyway. Like Lewis Carroll's Humpty Dumpty, you have crafted a definition of consciousness that suits you perfectly, but which doesn't match what everyone else means.
Fire self-replicates. It maintains its existence by consuming fuel and spreading itself. It's not as skilled at this as a human, but neither is a worm. What about an individual bacterium? Does it have a goal? Is it conscious in a way a humble virus is not?
Such goal attainment does not require consciousness. Even in our species, functions like breathing and blood circulation are autonomous, and they continue even in our sleep when consciousness is absent.
The fact that you have no memory of time while you are asleep does not prove that consciousness is absent.
The fact that your brain's executive network is unaware of the workings of "autonomous" activities does not indicate that those activities lack consciousness. You are also unaware of my consciousness, but I assure you that it exists.
> The fact that your brain's executive network is unaware of the workings of "autonomous" activities does not indicate that those activities lack consciousness.
I'm lost. To me it means those activities lack consciousness. It's also the opinion stated in Jaynes' book, which was mentioned by user breckinloggins.
It's not the same as me being unaware of your consciousness. Your consciousness is foreign to me; it is impossible for me to perceive it directly. My own consciousness is internal, and I can perceive it. You may argue there are some activities of my own consciousness that I cannot perceive -- that's debatable, but we can discuss it -- but reasoning from my perspective about your consciousness does not apply here.
You are making a special distinction between stuff that is "part" of you and stuff that is not "part" of you. In actuality there is no such clear distinction; there is only correlation in state, which is not binary but rather occurs on a spectrum.
Understand that I'm defining consciousness as the property of having awareness, not as the human mind. The human mind is what it is like to be a cerebral cortex with our given structure. Think of consciousness as being like a computer screen, and the mind as a picture displayed on that screen. That screen could display a very different image, but that wouldn't change what the screen itself is.
> You are making a special distinction between stuff that is "part" of you, and stuff that is not "part" of you
But this is a crucial distinction when talking about consciousness; it's not arbitrary. Consciousness requires a subject to be conscious.
Splitting awareness from the human mind seems weird to me. It's not very useful to consider, say, my liver to be conscious. Even worse, there's no way to prove or disprove the assertion that it is conscious; it's even less falsifiable than talking about other people's consciousness!
The whole point of Jaynes' book is that consciousness is not required for most activities. He argues that not only are your body organs not conscious, but also that most activities you engage in every day aren't conscious either.
Decoupling awareness and its contents is a necessary prerequisite to a much more parsimonious philosophical viewpoint. Without taking that step, you're stuck with the idea that human awareness is somehow different in kind from the physical awareness that drives the evolution of everything else in the universe.
I don't understand what you mean by "the evolution of everything else in the universe", either. Consciousness isn't required for the evolution of living beings. Or do you mean something else?
When I say evolution here I mean a change in state, not Darwinian evolution.
I'm drawing a distinction between humans, who seem to have the ability to make choices, and the prevailing scientific world view that everything is the result of mindless causal forces.
I suppose you could hold the epiphenomenalist view that consciousness is a side effect with no causal agency, but then you have to explain why things that hinder survival/reproduction feel bad, and things that aid it feel good. If consciousness were truly an epiphenomenon, there would be no reason for any concordance between the survival value of events and their phenomenal character.
I think you have it backwards: if there is no reason for consciousness to have any concordance, then it can go either way. And there is evidence of lots of evolved traits that have indeed gone "either way". Or it could be that consciousness is not a blind side effect but actually provides survival value. Or maybe it is a side effect but it's harmless. There are lots of possibilities.
In any case, it seems this is a diversion from the main argument: if you aren't aware of a process, then it's not a conscious process on your part by definition.
> That's like saying a dead worm and a live worm are not so different.
What? No.
> If there is a difference in how they evolve, there is something it is like to be it, because it has to have some kind of perception and action selection mechanism. What it's like to be a worm is the perception stream evaluated by the value function of the worm.
Assuming that by "evolve" you mean "evolve as a dynamical system", then yes.