In contemporary ethics, paternalism is seen as really bad. On the other hand, in contemporary technology practice, paternalism is extremely widely practiced, especially in the name of security: all sorts of things are made very difficult to unlock, with the main official justification being that if users unlock the things, they open themselves to malware. As someone who always wants to tweak technology to work better for him, I keep on running up against this: I spend a lot of time fighting against software that wants to protect me from my own stupidity. (The latest was Microsoft’s lockdown on direct access to HID data from mice and keyboards when I wanted to remap how my laptop’s touchpad works. Before this, because Chromecasts do not make root access available, to get my TV’s remote control fully working with my Chromecast, I had to build a hardware dongle that sits between the TV and the Chromecast, instead of simply reading the CEC system device on the Chromecast and injecting appropriate keystrokes.)
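To make the counterfactual concrete: had root access been available, the workaround could have been roughly the following minimal Python sketch, which listens for HDMI-CEC remote keypresses through the python-cec bindings to libcec and re-injects them as ordinary keystrokes through a virtual uinput keyboard (via python-evdev). The key mapping and the callback payload are illustrative assumptions, not Chromecast specifics.

import cec                             # python-cec: thin bindings over libcec
from evdev import UInput, ecodes as e  # python-evdev: virtual input device

# Illustrative mapping from CEC user-control codes to Linux input key codes.
CEC_TO_KEY = {
    0x00: e.KEY_ENTER,  # Select
    0x01: e.KEY_UP,
    0x02: e.KEY_DOWN,
    0x03: e.KEY_LEFT,
    0x04: e.KEY_RIGHT,
    0x0D: e.KEY_BACK,   # Exit
}

ui = UInput()  # virtual keyboard; needs write access to /dev/uinput, hence root

def forward_key(event, *args):
    # Assumed callback payload: (key_code, duration), with duration 0 on press.
    code, duration = args[0], args[1]
    key = CEC_TO_KEY.get(code)
    if key is None or duration != 0:
        return
    ui.write(e.EV_KEY, key, 1)  # key down
    ui.write(e.EV_KEY, key, 0)  # key up
    ui.syn()

cec.init()                                         # open the default CEC adapter
cec.add_callback(forward_key, cec.EVENT_KEYPRESS)  # subscribe to remote keypresses
input("Forwarding CEC remote keys to a virtual keyboard; press Enter to quit.\n")

It is precisely this sort of small, local script that the lack of root access rules out, forcing the external-dongle route instead.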
One might draw one of two conclusions:
Paternalism is not bad.
Contemporary technology practice is ethically really bad in respect of locking things down.
I think both conclusions would be exaggerated. I suspect the truth is that paternalism is not quite as difficult to justify as contemporary ethics makes it out to be, and that contemporary technology practice is not really bad, but just a little bad in the respect in question, even if that “a little bad” is very annoying to hacker types like me.
Here is another thought. While the official line on a lot of the locking down of hardware and software is that it is for the good of the user, in the name of security, it is likely that often another reason is that walled gardens are seen as profitable in a variety of ways. We think of a profit motive as crass. But at least it’s not paternalistic. Is crass better than paternalistic? On first thought, surely not: paternalism seeks the good of the customer, while profit-seeking does not. On second thought, it shows more respect for the customer to have a wall around the garden in order to be able to charge admission than in order to control the details of the customer’s aesthetic experience for the customer’s own good (you will have a better experience if you start by these oak trees, so we put the gate there and erect a wall preventing you from starting anywhere else). One does have a right to seek reasonable compensation for one’s labor.
The considerations of the last paragraph suggest that the special harm of paternalistic behavior is a dignitary harm. There is no greater non-dignitary harm to me when I am prevented from rooting my device for paternalistic reasons than when I am prevented from doing so for profit reasons, but the dignitary harm is greater in the paternalistic case.
There is, however, an interesting species of dignitary harm that sometimes occurs in profit-motivated technological lockdowns. Some of these lockdowns are motivated by protecting content-creator profits from user piracy. This, too, is annoying. (For instance, when having trouble with one of our TV’s HDMI ports, I tried to solve the difficulty by using an EDID buffer device, but then I could no longer use our Blu-ray player with that port because of digital rights management issues.) And here there is a dignitary harm, too. For while paternalistic lockdowns are based on the presumption that lots of users are stupid, copyright lockdowns are based on the presumption that lots of users are immoral.
Objectively, it is worse to be treated as immoral than as stupid: the objective dignitary harm is greater. (But oddly I tend to find myself more annoyed when I am thought stupid than when I am thought immoral. I suppose that is a vice in me.) This suggests that, as far as dignitary harms go, the motives for technological lockdowns would be ordered from hardest to easiest to justify as follows:
Copyright protection (hardest to justify, with the biggest dignitary harm to the user).
Paternalism (somewhat smaller dignitary harm to the user).
Other profit motives (easiest to justify, with no dignitary harm to the user).