A PhD student studying applied mathematics at a university in Britain reflects on the links between academia and the arms trade.
When I started my PhD in applied mathematics, around one third of my proposed funding was to come from BAE Systems. Knowing very little about BAE or the arms trade in general, I did some research, and was horrified at what I found. It soon became clear that there was no way I wanted their money or any involvement with them. Fortunately this did not mean dropping out of my PhD, just taking a funding cut. The move was a knee-jerk reaction; one which I have been trying to justify ever since. I do not struggle to justify standing against weapons research, but mathematical research by its nature can have many unpredictable outcomes; a new technology that might seem like a genuine asset to humankind is only a couple of modifications away from the latest killing machine, so even non-military research cannot truly be considered safe.
A few months in, I noticed that many other students and professors were collaborating with weapons companies. I had always felt that academic science is driven purely by a thirst for knowledge, not wealth, given that most academics (PhD or above) could typically earn a far higher salary in 'industrial' employment. By contrast, in areas where the weapons industry flourishes, money seems to be the driving factor. So I couldn't understand why these people at my university, who seemed like decent, honest people, were happy to receive funding from such an unethical source. The arms industry does not carry the stigma among academics that I would expect. Three years have passed, and I have been to conferences all over Europe, met a range of industrial and academic professionals, and often noticed the strong yet silent presence of the weapons industry.
Mathematicians enjoy working on idealised versions of real-world problems. Herein lies the danger: this simplification is so significant that it entirely separates the problem from its application, removing any potential sense of guilt. This state of blissful ignorance seems to be common, and is very convenient for the arms industry. In the same way that the armed forces glamourise the military through advertising and Hollywood movies, arms companies benefit from this inherent disconnect between the problem they ask researchers to solve and the underlying application they have in mind. After all, if it is the researcher's intention to move humankind forward, and their work is developed into weapons, how are they responsible? But then, who is responsible? If the government authorises the soldier to use the weapon sold at the arms fair, designed by the engineer, applying the research of the mathematician, at which point in the chain does immorality begin? Of the students I have known to leave academia after graduation, around half are now employed by arms companies. Like many defence engineers, they are not bad people; this is where the industrial jobs are. But engineers are not soldiers, and so naturally place accountability closer to the front line. With DSEI 2015 boasting an entire unmanned section, the buck of 'pulling the trigger' is one that cannot be passed for much longer.
Mathematical research has a huge range of applications, scattered across the ethical spectrum. There are mathematicians such as myself, working on simplified problems that may relate to a range of real-world applications, and there are mathematicians solving problems that have no real-world applications at all (at least not yet). It can be difficult to find a moral alignment when doing either. My research can be applied to radiotherapy treatment of cancer patients, non-destructive oil exploration, designing concert halls, radar on warships (hence the BAE interest), or any problem involving acoustic or electromagnetic waves. In terms of the ultimate progress of humankind, it would be nice to know that my research will result in more steps forward than steps back... But scientific research is part of the solution to, and a contributor to, a large number of humankind's problems. Planes and boats became fighter jets and warships a long time ago, but (for example) the facial recognition software on your new iPhone can be used for much more than collating selfies. Soon, weapons will know who to shoot at before soldiers do.
So what is the right thing to do? Should mathematicians (and scientists in general) not involve themselves in research which may some day play a part in a military regime? As history has shown, even the genii of a given generation cannot predict the applications of their own research. In 1940 G. H. Hardy wrote his famous essay "A Mathematician's Apology", in which he apologised for the uselessness of the mathematics of his day, in particular citing many areas of mathematics he believed would never have practical applications. These included the study of prime numbers, which has since provided the basis for securing and encrypting all of the sensitive information we send online, and much of the encryption breaking too. One has to wonder what potential horrors can arise from the result of any research, and therefore whether merely refusing funding from weapons companies is even enough.
In retrospect, it has become clear that refusing BAE's money on moral grounds whilst continuing the PhD was somewhat hypocritical, because I always arrive at the same depressing conclusion: the only way to not contribute to weapons research is to not do research.