Software surrounds us in all aspects of life. Our cars are full of software, our communication is based on software, and almost every device in our daily life contains software.

In the recent past, there have been some incidents that showed us the problems and dangers latent in the use of software. Incredibly large amounts of customer data were stolen from eBay and Sony, among others. Surely many readers of this article were affected. Further, the disclosures of Edward Snowden show the large impact of software on our privacy today.

During the last few years, I realized one fact: We software developers make these scandals possible. Our software influences the life of most people on this planet. Our systems get hacked and exploited. Our employers and customers use our software to change the world for good or ill. And, in the end, we are the people who build this software.

During the past decades, we have learned a lot about building good and reliable software. The software crisis showed us our technical limits and as good engineers, we invented processes and methods to master that complexity. Unfortunately, we forgot to look beyond our technical challenges and to recognize our growing responsibility.

The software industry is very young. The first modern programming languages were invented fifty years ago. Thirty years ago, personal computers were far from being in every home. At the same time, the impact of the software industry over those fifty years is unprecedented.

The invention of the printing press changed our world in a lasting way, but that change took almost 300 years. The invention of the automobile changed our world as well, and its mass adoption took about 150 years. Today, almost every country has its own institution to assure the safety of cars and to approve new models for public use.

Professions like medicine or civil engineering have histories spanning several millennia. These disciplines have had enough time to develop a solid set of ethics, rules like the Hippocratic Oath. But they also needed to grow into their responsibilities. Architects in ancient Greece had to live under a new bridge for a week with their families; if the bridge was poorly constructed, they suffered the consequences.

We software developers have accumulated a massive amount of influence within a comparatively short time, but we have had only that same short time to grow into the responsibility arising from this influence. Over the last several years, I encountered the same topics again and again and had many interesting discussions about responsibility with bright software developers. From these discussions, together with others, I created the “Manifesto for Responsible Software Development” to encourage thought and discussion about our responsibilities. In the following paragraphs, I describe the rationale behind each aspect of the manifesto.

Manifesto for Responsible Software Development

In order to foster a free and fair society, I affirm that I will practice my profession responsibly and with dignity. I will abide by the following principles:

I am ethically responsible for my decisions and I will act according to my conscience.

The impact of software is growing continuously in all areas of our lives. I acknowledge the consequences to humanity and the environment that evolve from our work.

I will not develop software that is intended to violate human rights and civil liberties.

It is increasingly possible to violate personal and human rights with the use of software as the boundaries between the real and digital worlds become blurred.

I know that I can't control software once it is released, so I have a responsibility to consider the potential for my software to violate people’s rights before I start to implement it. I will reject projects which facilitate this abuse.

I will be worthy of the faith the public has in me as an expert in my profession.

The possible negative consequences of the improper use of complex software are inconceivable to most users. Therefore it is our responsibility as software developers to communicate the boundaries of proper use clearly.

When I realize that software I have released is obsolete, my minimum responsibility is to let potential and existing users know.

I will collect only the data that is essential for my task. I will store it only as long as needed.

My applications are likely to collect personal information. I will safeguard this information and use it only as its provider intended. I will treat this data as if it were my own.

I will do my very best to prevent the waste of energy and resources.

The increasing number of devices that contain software has a strong impact on the global use of resources and energy.

I make these promises solemnly, freely and upon my honor.

I am ethically responsible for my decisions and will act according to my conscience.

The impact of software is growing continuously in all areas of our lives. I acknowledge the consequences to humanity and the environment that evolve from our work.

When I searched for my first job as a professional software developer, I gave myself one simple rule: I would never write software for a weapons manufacturer. I have had some very heated discussions with other people over the years, and I know there are good arguments for and against my decision.

Weapons are a very controversial topic. Everybody has an opinion and can easily argue their side. Nonetheless, you should always look for possible unintended negative consequences of your work and make a decision based on your values. I know some people who write software for weapons systems. They justify this with patriotism and national defense, but what happens when an antiaircraft missile is stolen by rebels who use it to destroy a civilian aircraft carrying 298 innocent passengers?

When I exclude weapons, nuclear power, aircraft, or computer games, I reduce the number of job opportunities open to me from the start. But as soon as I start a job or a contract, I have to take responsibility for the consequences of my work.

I will not develop software intended to violate human rights or civil liberties.

As the boundaries between the real and digital worlds become blurred, it is increasingly possible to violate personal and human rights with the use of software.

I know I can't control software once it is released, so I have a responsibility to consider the potential for my software to violate people’s rights before I start to implement it. I will reject projects which facilitate this abuse.

The disclosures of Edward Snowden showed us the degree of surveillance possible today. It is almost beyond belief how many people's privacy is affected.

The argument about the necessity of surveillance is as controversial as the one about weapons. But the fact is we have laws to ensure the privacy of innocent people. Today this privacy is weakened by software so much that we are not far from George Orwell's 1984.

We software developers should realize much more often that software cannot be withdrawn once it is released. Sometimes our programs show up in places we never imagined when we created them. Perhaps we should think about possible abuse before we build the next face recognition algorithm.

I will be worthy of the faith the public has in me as an expert in my profession.

The possible negative consequences of the improper use of complex software are inconceivable to most users. Therefore it is our responsibility as software developers to communicate the boundaries of proper use clearly.

When I realize software I released is obsolete, my minimum responsibility is to let users know.

For us software developers, it is hard to grasp how little understanding most users have of technical limits. There are real problems that affect a large number of users and developers. Only a few developers thought about the security of OpenSSL until the Heartbleed bug was announced. Almost nobody knew that only one employee was responsible for OpenSSL at that time. Nobody realized the limits of OpenSSL.

I cannot check all external libraries for bugs and security issues; I rely on the documentation and the statements of other developers. That means other developers rely on my statements, too. When I am the expert in some specific field and I release or sell software, I should communicate its technical limits very clearly. Users of my software usually have less experience with its behavior, so it is my responsibility to be comprehensible.
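Telling users that software is obsolete can even happen inside the software itself. A minimal sketch in Python using the standard `warnings` module; the function name and message are hypothetical:

```python
import warnings

def legacy_export(data):
    """Old export routine, kept only for backward compatibility."""
    warnings.warn(
        "legacy_export is obsolete and no longer maintained; "
        "use a supported replacement instead.",
        DeprecationWarning,
        stacklevel=2,  # point the warning at the caller, not this function
    )
    return list(data)  # unchanged legacy behavior
```

Callers then see the notice the moment they hit the old code path, instead of discovering the limits of the software on their own.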

I will collect only the data essential for my task. I will store it only as long as needed.

My applications are likely to collect personal information. I will safeguard this information and use it only as its provider intended. I will treat this data as if it were my own.

Fans of the “store everything, forever” mentality offer several arguments. They say, “We store only metadata, not real data.” They say, “We need all that data to provide good service to our customers.” Both are insincere arguments. Today it is a common business model to collect and sell data. If you want to do so, be honest and don't hide behind fake arguments.

Even if you don’t intend to sell the data, there are dangers you need to acknowledge and control. The power that comes from large amounts of data will always attract unwanted attention from third parties. When you store tons of data, someone will try to steal it. This happened to companies like Sony and eBay; why wouldn’t it happen to you? And even if you can protect yourself from hackers, there are still governments that will request access to your servers.

When someone steals your data, at best you lose your reputation; with some bad luck, there are even legal ramifications. You might think about data privacy because you are an idealist and believe in a better world. But even if that is not the case, there are some very good, self-interested reasons to think twice about the data you gather and store.
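The retention principle can be made concrete in code. A minimal sketch in Python, assuming each record carries a `created_at` timestamp; the 90-day window and field names are illustrative, not a recommendation:

```python
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=90)  # keep personal data only as long as needed

def purge_stale(records, now=None):
    """Return only the records still inside the retention window."""
    now = now or datetime.now(timezone.utc)
    return [r for r in records if now - r["created_at"] <= RETENTION]
```

Running such a purge on a schedule means a breach can only ever expose the last 90 days of data, rather than everything ever collected.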

I will do my very best to prevent the waste of energy and resources.

The increasing number of devices containing software has a strong impact on the global use of resources and energy.

There are reasons why Google and other companies invest hundreds of millions in renewable energy. The operation of data centers consumes large amounts of energy. Optimizing software for better energy efficiency can have a large impact when that optimization is scaled across thousands of servers.

At the same time, we experience the consequences of inefficient software daily when a badly written app drains our smartphone battery within hours. As software developers, we have a significant impact on the energy consumption of our systems.

Besides the obvious ecological aspects, there are economic reasons to optimize resource use. Efficient software in our data centers reduces the operating costs of our servers. An app with poor energy efficiency will score poorly in app stores. Both scenarios can cost or save real money.

Caring about the energy consumption of our software is an important task, whether you are trying to save the world or just your workplace.

Conclusion

None of the topics outlined here is new. Still, most software developers don't realize how much we impact societies and individuals. This influence grows every day, with no end in sight. We also don’t always think about the societal and ecological repercussions of seemingly small decisions. As a very young industry, we should face our responsibility and deal with the consequences of our products and projects.

I want to encourage all of you to think about your workplace and your latest projects. Do you have a red line, work you would reject? Do you have rules about how your company stores data? Discuss these topics and the Manifesto with your colleagues. You will be surprised how many of them have opinions you didn't know about!

About the Author

Nils Löwe is co-founder and CEO of Vedaserve, a tech start-up specializing in knowledge-based decision making with tools like the proofler. He helps potential founders shape and test their business models and create first prototypes and MVPs. Nils takes a holistic approach and likes to fill several roles, such as developer, architect, teacher, and manager. Find more information at nils-loewe.de.