So should we just assume that nothing is conscious?
Not at all! In fact, I believe that we should assume almost everything is conscious. I think it’s a bit of human arrogance to think that we brain creatures have a monopoly on perspective.
Nobody knows why they produce consciousness or what particular mechanism is responsible for human awareness.
Exactly my point.
That’s… irrelevant
I don’t think it is. If the argument is that it’s unethical to poke a neuron because it might have consciousness, would the same argument not apply to anything else? I think you might be getting a bit hung up on the “think like a human” thing. My point is not that it’s okay to torture something if it doesn’t “think like a human.” It’s that there are potentially a lot of things in the world that are conscious that don’t often get the same consideration.
capable of experiencing suffering
This is an interesting one. It shifts the question from “does it have a consciousness?” to “does it have a consciousness that is suffering or able to suffer?”
The idea of suffering is a very human concept that we have a whole section of our brains devoted to. There’s a lot of ethics devoted to alleviating suffering (e.g. humanitarianism), and we sort of use it as a means of directing our goals: we avoid things that make us suffer and seek things that bring us happiness. What makes us happy or makes us suffer varies a bit from person to person due to experience and learning/training, but a lot of it is biologically evolved. Physical and emotional pain make us suffer for evolutionary reasons.
So in one sense, you could define suffering as a stimulus that some conscious system avoids? In which case, training neurons essentially teaches them what suffering is. They’re trained to activate or not activate based on what avoids irregular stimulus (suffering) and results in regular stimulus (happiness).
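That avoid/seek framing can be sketched as a toy reinforcement loop. To be clear, this is purely illustrative of the definition (“suffering = stimulus the system is trained to avoid”), not the actual protocol used to train cultured neurons; every name and number here is made up:

```python
import random

def stimulus(action):
    """Hypothetical environment: action 1 receives "regular" stimulus
    (the stand-in for happiness), action 0 receives "irregular" stimulus
    (the stand-in for suffering)."""
    return "regular" if action == 1 else "irregular"

def train(steps=1000, lr=0.1, seed=0):
    rng = random.Random(seed)
    p = 0.5  # probability of choosing action 1
    for _ in range(steps):
        action = 1 if rng.random() < p else 0
        if stimulus(action) == "regular":
            p += lr * (action - p)   # reinforce whatever preceded regular stimulus
        else:
            p -= lr * (action - p)   # suppress whatever preceded irregular stimulus
        p = min(max(p, 0.01), 0.99)  # keep the probability in bounds
    return p

print(train())  # after training, the system strongly prefers the "regular-stimulus" action
```

The system ends up “avoiding suffering” in exactly the definitional sense above, which is the point: the behavior falls out of a dozen lines of feedback, with no inner experience required.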
If that’s how you define it, though, there could be many other systems that work the same way: obviously animals, plants, fungi, etc., but also computers and lots of mechanical systems. If making decisions to avoid or seek electrical stimulus is suffering, then a computer is basically a pleasure/torture box.
Personally I think that suffering is more than that. I think it’s a larger system we brain creatures have developed that doesn’t necessarily apply very well outside the context in which we use it. Would a vat of 20 billion neurons be able to suffer? I think that depends on how they’re arranged and whether they have that concept.
Whether it’s ethical to murder an entire village of your enemies “depends on your ethical framework and philosophical worldview.” See what a slippery slope moral relativism is?
Just because different ethical frameworks and worldviews exist, doesn’t mean they should all be treated equally. Sure, if someone is super utilitarian they might be fine with torturing people for medical research when they feel that the ends justify the means. If someone has a strict deontological code of ethics that tells them homosexuality is a sin punishable by death, they might campaign for that. I think those people suck and their beliefs are evil because of my own ethics and worldview.
When it comes to questions like “is an ant capable of suffering?” or “is it okay to swat a fly or set a mouse trap?” or “how many human neurons does it take to suffer while changing a light bulb?”, you’ll get varying answers from people based on who they are. Personally, I think the right answer to those questions depends on the brain of the person answering them.
Moral universalists face the same slippery slopes you mentioned. If right and wrong are fixed and objective and not dependent on people, then groups claiming to know the one true morality will use it to persecute those labelled as evil or morally bankrupt (see the homophobic asshole example above).
Moral relativism doesn’t mean that morality doesn’t matter or that it’s wrong to fight against what you think is evil. I believe you should fight for what is right and I’m hopeful that the things that I think are good will win out against the things that I think are evil. Absolutism is maybe a bit easier for that because it simplifies moral choices a lot, but I think it’s hubris to think that evil is the same everywhere to everyone and not an artifact of the human mind.