Neil Jordan: Selling Bullets and Beers – A Matter of Responsibility

The company American Rounds is supplying vending machines from which gun owners can buy bullets, with machines currently available in food shops in the states of Alabama, Oklahoma and Texas. There are plans to expand this provision to states where hunting is popular, such as Louisiana and Colorado. Customers simply select the ammunition that they would like to buy using a touchscreen, scan their identification and collect their bullets below, the machine having used ‘built-in AI technology, card scanning capability and facial recognition software’ to match the buyer’s face to his or her ID and to ensure that he or she is over 18 years old.
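On that description, the machine’s gate reduces to two checks: is the buyer at least 18, and does the live face match the photograph on the ID? The following minimal sketch makes that logic explicit; every name, type and threshold here is a hypothetical illustration, not American Rounds’ actual implementation:

```python
import math
from dataclasses import dataclass
from datetime import date

MIN_AGE = 18            # the only age check the article says the machine applies
MATCH_THRESHOLD = 0.90  # assumed similarity cut-off for the face comparison

@dataclass
class ScannedID:
    date_of_birth: date
    photo_embedding: list[float]  # face features extracted from the ID photo

def age_on(dob: date, today: date) -> int:
    """Whole years elapsed between the date of birth and today."""
    return today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Stand-in for the facial-recognition model's face-to-photo comparison."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def approve_sale(id_card: ScannedID, live_face: list[float], today: date) -> bool:
    """Dispense only if the buyer is an adult and the live face matches the ID."""
    if age_on(id_card.date_of_birth, today) < MIN_AGE:
        return False
    return cosine_similarity(id_card.photo_embedding, live_face) >= MATCH_THRESHOLD
```

A real system would sit a trained face-recognition model behind the comparison step, and it is precisely there that the questions of reliability discussed below arise.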

The states in which such machines are available at present place no minimum age limit on the purchase of ammunition, do not require the vendor to keep a record of the purchaser, impose no licensing regime for the sale or purchase of ammunition and do not prohibit those disqualified from purchasing or owning firearms from buying ammunition (though federal laws might impose such a restriction, without necessarily obliging vendors to check whether a customer is in fact disqualified). In checking the ID of a purchaser and maintaining a record of the transaction, the machines provided by American Rounds therefore appear to do more than state law requires. This might be intended to ensure that the machines are unquestionably within the law, or, by restricting sales to adults, it might be an exercise in reputation management – perhaps both – but it does mean that the machines are likely to remain legally compliant when installed in other states where tighter restrictions apply.


Artificial Intelligence, Risk and Trust

Without entering into the wider issue of gun ownership and its regulation, there are nonetheless moral questions about dispensing something so potentially dangerous from a vending machine. Can we be certain that the technology will always perform as it is supposed to? We might ask whether such machines are capable of discerning a forged ID from a genuine one. Moreover, will they identify buyers correctly? After all, numerous cases of wrongful arrest resulting from facial recognition technology have been documented (at least seven in the US last year), and some technologies of this kind appear prone to reflecting and perpetuating biases in the data with which they are trained. Whether the technology in American Rounds’ vending machines will accurately match the purchaser’s face to a photograph on an identity document is therefore a legitimate question. These concerns raise the much broader question of responsibility.
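The bias concern can be made concrete with a toy calculation (the numbers below are invented purely for illustration and describe no real system): if a model systematically produces lower match scores for genuine users in one demographic group, a single global threshold yields unequal false-rejection rates, and analogous reasoning applies to false acceptances.

```python
# Toy illustration with synthetic numbers: one global threshold, two groups
# whose genuine-match scores happen to be distributed differently.

THRESHOLD = 0.90

# Invented match scores for genuine ID-holders in two demographic groups.
group_a_scores = [0.97, 0.95, 0.93, 0.92, 0.91, 0.90, 0.96, 0.94]
group_b_scores = [0.93, 0.89, 0.91, 0.87, 0.92, 0.88, 0.90, 0.86]

def false_reject_rate(scores, threshold=THRESHOLD):
    """Share of genuine users wrongly rejected at this threshold."""
    return sum(s < threshold for s in scores) / len(scores)

print(f"Group A false-reject rate: {false_reject_rate(group_a_scores):.0%}")  # 0%
print(f"Group B false-reject rate: {false_reject_rate(group_b_scores):.0%}")  # 50%
```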


Decisions, Decisions…

Where there exists a right to own firearms and ammunition, there is no prima facie reason to disallow the sale of ammunition by technological means, provided that the technology is reliable and ensures that sales are only ever made to the right people. What, then, is the role of people in such transactions? In a jurisdiction in which would-be buyers of ammunition were checked against a register of individuals disqualified from buying or owning guns, one would expect purchases to be carefully monitored – not least because the shop-owner’s livelihood is likely to be at risk for breaches of regulations. Such verification would doubtless be conducted by means of access to a database, such that the checks, while instigated and concluded by a human being who makes a decision ‘in store’, would nonetheless be dependent on technology. Ultimately, therefore, while relying on the information provided, the individual vendor would be responsible for the sale. The question, then, is whether this decision, based on the same information, might safely be deferred to a machine that uses facial recognition software and searches databases itself.
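The contrast between the two workflows can be sketched as follows (a purely illustrative sketch: the register, the fields and the rules are hypothetical, and no real jurisdiction’s scheme is implied). The database lookup is identical in both; what changes is whether a person or a face-match score makes the final call:

```python
# Illustrative contrast between a human-in-the-loop sale and a fully
# automated one. The register and all rules here are hypothetical.

DISQUALIFIED_REGISTER = {("JANE DOE", "1985-03-02")}  # stand-in for a state database

def is_disqualified(name: str, dob: str) -> bool:
    """The technological component common to both workflows."""
    return (name.upper(), dob) in DISQUALIFIED_REGISTER

def clerk_approves_sale(name: str, dob: str, clerk_confirms_id: bool) -> bool:
    # The clerk relies on the lookup but takes, and owns, the final decision.
    return clerk_confirms_id and not is_disqualified(name, dob)

def machine_approves_sale(name: str, dob: str, face_match_score: float,
                          threshold: float = 0.90) -> bool:
    # The machine substitutes a face-match score for the clerk's judgement;
    # who answers for this branch when it errs is the question at issue.
    return face_match_score >= threshold and not is_disqualified(name, dob)
```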

The risks involved are different, but a similar question can be asked about the sale of alcohol. Practices vary, but in some countries alcoholic drinks can be bought from vending machines, with the identity of the buyer being verified either by biometric data obtained by scanning the customer’s fingerprint, or simply by supplying the purchaser with a wristband to show that his or her ID has been checked by a member of staff. In other countries, alcohol can only be bought at certain times from state-approved vendors.


Decisions and Responsibility

Whether the sale is of alcohol or ammunition, are those businesses and states that continue to require and rely upon a human decision at some stage in the transaction doing so on the basis of an unjustified or outdated mistrust of technology, or because they acknowledge that responsibility can ultimately only be attributed to free human beings, who recognise the potential consequences of error? The question, therefore, becomes one not only of trust, but also of responsibility in relation to technology. Where certain decisions are handed over to technology – which, of course, can be done more easily and more safely in some areas than in others – we are left with the matter of where responsibility lies, particularly when the technology ‘gets it wrong’. Other things being equal, the owner of a hunting supplies store will be liable if he or she sells a firearm to someone who is underage or disqualified from purchasing guns. Where does responsibility lie if a vending machine sells alcoholic drinks to children in error? Does this rest with the corporate owners or suppliers of the machine? If the machine is on licensed premises, is the landlord responsible? Perhaps there is a case for holding the suppliers of the technology used by the machine liable. This might not be a straightforward matter, as fatalities involving self-driving cars demonstrate: in one case, the back-up driver of the vehicle was convicted while the operating company was judged not to be criminally liable. And when an algorithm becomes involved in decisions relating to criminal sentencing or the provision of social security, where does responsibility for those decisions lie?

Regardless of the scenario, responsibility, as a moral category, must always reside with a person or (human) organisation, never a machine. Machines, however ‘intelligent’, are neither conscious nor free and as such, they are not moral agents. Where decisions are devolved to technology – and that technology ‘decides’ incorrectly – the challenge is for us to identify the responsible subject.



Neil Jordan is Senior Editor at the Centre for Enterprise, Markets and Ethics.