FOR A SAFER TOMORROW.

Our mission

We created Hope to help protect our democratic values, freedom, and open society.

Freedom

The freedom to say what we want to say, and to be who we want to be, is a privilege and should not be taken for granted. At Hope, we believe it is our civic duty to preserve this freedom for ourselves and for future generations.

Integrity

To know what to change, we first need to know where we stand. Only with a sincere look at reality can we get closer to the truth. One of those truths is that the West is starting to lag behind other countries in developing and manufacturing advanced autonomous systems for defense.

Hope

Motivation is tied to perspective. Without perspective, there is no reason to act. We believe there is hope, and that hope is needed. We can defend our freedoms, we can build what is truly necessary, and we can succeed.

Responsibility

BUILDING AUTONOMOUS SYSTEMS COMES WITH RESPONSIBILITY.

Our responsibility

Protecting open, democratic societies is our duty and collective responsibility. This increasingly requires the development of advanced technologies, such as artificial intelligence, to deter and defend. We believe that democracies have a special responsibility to carefully consider how these technologies are developed and deployed. At HOPE, we take this responsibility seriously.

Ethics at its core

HOPE was founded to put ethics at the heart of the development of defense technology. We strive to think through ethical questions in advance, and we believe our work puts us in a unique position to contribute to the public debate and to international guidelines in this area.

Navigating the ethical landscape

We place particular emphasis on defining the democracies with which we work. To this end, we have created quantitative and qualitative guidelines that help us assess potential customer countries. Often, the decision is simple; sometimes it requires more context and information. In every case, our internal ethical processes must be transparent, standardized, and accountable, both at the time a decision is made and in hindsight.

The development of the technology itself is our second major area of focus. For example, while the concept of "human-in-the-loop" is well established, we have found that a human's effectiveness in overseeing an AI system depends heavily on a number of factors, including cognitive load, perceived AI reliability, fatigue, and UX design. For us, these are not theoretical considerations; we confront these problems daily and have to come up with solutions.

Our opportunity

If we — as open societies — care about ethics, we cannot leave the development of advanced defense technologies to third parties. Instead, we need to give sovereign democracies control over the ethical decisions and considerations that come with them, and we need to make the underlying technology understandable, transparent, and auditable.

A collaborative effort

We are proud that ethics comes up in almost every job interview with candidates, and we encourage critical thinking and questioning among all prospective and current employees. Regular ethics workshops help us sharpen our ethical judgment and build experience in assessing concrete cases and calibrating the considerations involved. In all of this, we try to lead the way and think through potentially complex issues before they arise.

Strengthen our team in Delft, and join us in making the Netherlands stronger.

Apply here