Most moral questions that arise in our universe have easy, intuitive answers, because human nature is fundamentally rational and responds to positive and negative emotions.
This situation, though, is very hard to build an intuition for, because so much of it is at odds with how we understand human nature. Moral intuitions operate in a realistic universe, and in our universe, someone who experiences sufficient pain, or who derives no joy or happiness from an activity, stops doing it. Yet the person described by FC carries on regardless. To me, that describes someone with no free will.
So we should alter our understanding of the situation to make it more realistic while keeping it comparable to the hypothetical. What kind of entity would operate under the conditions described? And how should we treat that entity?
In this case I think we should view it as a machine, something like AM (the Allied Mastercomputer) from Harlan Ellison's "I Have No Mouth, and I Must Scream": a machine programmed to inflict pain on others, but one that is also capable of experiencing joy. Is it worth it, in that scenario, to take its happiness away?
In that context, I think the answer is no. The machine is not responsible for causing suffering; it has no choice in the matter. No amount of pain inflicted on it, not even an infinite amount, would change its behaviour. It is therefore acting illogically by definition: its actions are pointless acts of cruelty that cannot be "worth" the pain in any substantial way. It must simply have been programmed to act this way.
Taking away its happiness is therefore truly pointless and only adds more suffering to the world. The machine is a victim of bad programming, and it should not be made to suffer for the evil instructions of its programmer. And if you insist that the human in this severely twisted hypothetical is consciously choosing to cause harm no matter how much harm is done to him in response, I would say he suffers some form of brain damage and, as a disabled person, isn't really culpable for his actions.