In the last essay it was suggested that although a hologram is not a person, it could be seen as a simplistic virtual person. This raises the question of whether this would be enough to provide a foundation for a moral argument against celebrity holograms. To make this case, I will employ Kant’s classic arguments about the moral status of animals.
In his ethical theory Kant is clear that animals are means rather than ends. They are mere objects. Rational beings, in contrast, are ends. For Kant, this distinction exists because rational beings can (as he sees it) choose to follow the moral law. Because they lack reason, animals cannot do this. Since animals are means and not ends, Kant claims we have no direct duties to animals. They belong with the other “objects of our inclinations” that derive value from the value we give them.
While this might suggest that Kant would have no moral concerns about how we treat animals, he argues we should be kind to them—despite their having no moral status of their own.
While Kant is not willing to accept that we have any direct duties to animals, he “smuggles” in duties to animals in a clever way: our duties towards animals are indirect duties towards humans. To make his case for this, he employs an argument from analogy: if a human doing X would obligate us to that human, then an animal doing X would also create an analogous moral obligation. For example, a human who has long and faithfully served another person should not simply be abandoned or put to death when he has grown old. Likewise, a dog who has served faithfully and well should not be cast aside in his old age.
While this would seem to create an obligation to the dog, Kant uses a little philosophical sleight of hand here. The dog cannot judge (that is, the dog is not rational) so, as Kant sees it, the dog cannot be wronged. So, then, why would it be wrong to shoot the dog?
Kant’s answer appears consequentialist in character: he argues that if a person acts in inhumane ways towards animals (shooting the dog, for example) then his humanity will likely be damaged. Since, as Kant sees it, humans do have a duty to show humanity to other humans, shooting the dog would be wrong. This would not be because the dog was wronged but because humanity would be wronged by the shooter damaging his humanity through such a cruel act. To support his view, Kant discusses how people develop cruelty: they often begin with animals and then work up to harming human beings.
Kant goes beyond merely enjoining us to not be cruel to animals and encourages us to be kind to them. Of course, he encourages this because those who are kind to animals will develop more humane feelings towards humans. So, roughly put, animals are practice for us: how we treat them is training for how we will treat human beings.
In the case of dead celebrity holograms, they clearly and obviously lack any meaningful moral status of their own. They do not think or even feel. They even have no independent existence—they are mere projections of light. As such, they lack all the qualities that might give them a moral status of their own.
While this might seem odd, these holograms seem to be on par with animals—at least in the context of Kant’s moral theory. For him, animals are mere objects and have no moral status of their own. The same is clearly true of holograms.
Of course, the same is also true of sticks and stones. Yet Kant would never argue that we should treat sticks well. Perhaps this would also apply to virtual beings such as a holographic Amy Winehouse. That is, perhaps it makes no sense to talk about good or bad relative to such virtual beings. Thus, the issue is whether virtual beings are more like animals or more like rocks.
I think a case can be made for treating virtual beings well. If Kant’s argument has merit, then the key concern about how non-rational beings are treated is how such treatment affects the person engaging in it. So, for example, if being cruel to a real dog could damage a person’s humanity, then he should (as Kant sees it) not be cruel to the dog. This should also extend to virtual beings. For example, if creating and exploiting a hologram of a dead celebrity to make money would damage a person’s humanity, then they should not act in that way. And if refraining from such exploitation would make a person more inclined to be kind to other rational beings, then the person should refrain.
If Kant is right, then holograms of dead celebrities can have a virtual moral status that would make creating and exploiting them wrong. This view can be countered by two obvious lines of reasoning. The first is to argue that ownership rights override whatever indirect duties we might have to holograms of the dead. In this case, while it might be wrong to create and exploit such holograms, the owner of the likeness would have the moral right to do so. This is similar to how ownership rights can sometimes give a person the right to do wrong to others, as paradoxical as this might seem. For example, slave owners believed they had the right to own and exploit their slaves. As another example, business owners often believe they have the right to exploit their employees by overworking and underpaying them.
The second line of reasoning is to argue that holograms are just light and this is not a solid enough foundation on which to build even an indirect obligation. On this view, there is no moral harm in exploiting such holograms because doing so cannot possibly cause a person to behave worse towards other people. This view does have considerable appeal, although the fact that many people feel that creating such holograms is creepy and disrespectful does provide a strong counter.
“If Kant is right, then holograms of dead celebrities can have a virtual moral status that would make creating and exploiting them wrong.”
Disagree. Images of dead public figures should enter the public domain and we should be free to do anything we want with their images.
No limits at all? Also, who counts as a celebrity for this?
Sure, why not. Holograms want to be free.