Even though nobody thinks Strong AI has been achieved, we attribute beliefs to computer systems and software:
Microsoft Word thinks that I mistyped that word.
Google knows where I’ve been shopping.
The attribution is communicatively useful and natural, but is not literal.
It seems to me, however, that the difference in kind between the beliefs of computers and the beliefs of persons is no greater than the difference in kind between the beliefs of groups and the beliefs of persons.
Given this, the attribution of beliefs to groups should also not be taken to be literal.
2 comments:
What do you make of Philip Pettit and Christian List's arguments in this respect? In particular, they note that a group may endorse positions on issues that no particular member of that group holds, and that groups may behave systematically illogically with respect to the beliefs they hold (so that a group may accept "P" and accept "Q" but reject "P & Q"); I think they call this the discursive dilemma.
Endorsement isn't belief, so I'm not sure I see how this affects my position.