About Helsing

Dedication To Mission

We founded Helsing to help protect our democratic values and open societies. The freedom to say what we want to say, and to be who we want to be, is a privilege that cannot be taken for granted. We believe it is our civic duty to preserve this freedom for ourselves and future generations.

Defence has become a software problem. It takes a software-native company to develop the advanced architectures and AI algorithms that keep our societies from harm.

We work with governments and industry partners to transform the capabilities of existing hardware assets. Despite our young age, governments already trust us to deliver a number of large contracts.
Our Team

We are a diverse team of software engineers, deep learning specialists and customer-facing programme managers.

Ethics

Our responsibility

Protecting open, democratic societies is our civic duty and collective responsibility. Increasingly, this requires the development of advanced technologies like artificial intelligence to deter and defend. As democracies, we believe we have a special responsibility to be thoughtful about the development and deployment of these technologies. We take this responsibility seriously.

Ethics at the core

Helsing was founded to put ethics at the core of defence technology development. We aim to think about ethical questions and trade-offs ahead of time, and we believe our work uniquely positions us to contribute to the public debate and international guidelines in this field.

Navigating the ethical landscape

We put particular emphasis on deciding which democracies we work with. To this end, we have established quantitative and qualitative guidelines to help us assess potential customer countries. Often, the decisions are straightforward; sometimes, they require further context and information. In each case, our internal ethics processes must be transparent and standardised, providing accountability both at the time and in retrospect.

The development of the technology itself is our second large area of focus. While, for example, the topic of a human-in-the-loop is well established, we have found that a human's effectiveness in scrutinising AI depends strongly on a number of factors, including cognitive load, perceived reliability of the AI, fatigue and UX design. For us, these are not theoretical considerations; we must address these issues and build solutions on a daily basis.

Our opportunity

If we, as open societies, care about ethics, we cannot leave the development of advanced defence technologies to third parties. Instead, we need to give sovereign democracies control over the ethical decisions and trade-offs involved, and we need to make the underlying technology understandable, transparent and auditable.

A joint effort

We are proud that nearly every candidate interview raises the question of ethics, and we encourage critical thinking and questions from all prospective and existing employees. Regular ethics workshops help us strengthen our ethics muscle, gain experience in assessing particular cases, and calibrate the trade-offs that may be involved. In all this, we try to be on the front foot, thinking through potentially complex cases before they come to pass.