The technical standards that govern the future of the web have hit the spotlight at the World Conference on International Telecommunications, being held in Dubai. Groups of people sitting for days on end in windowless rooms will decide what technology looks like in the future. These groups travel from country to country, meeting their counterparts from other companies as part of a multitude of different Standards Developing Organisations, or SDOs.

Standards themselves are created in different ways. A friend of mine refers to the "American model", where everyone brings their solutions to the table and agrees what the standard is, as opposed to the "European model", where a standard is invented from scratch. Both models work and both models have issues. Arguably, the most successful European standard built from scratch is the GSM standard for mobile phones from the European Telecommunications Standards Institute (ETSI).

One mobile phone company told me that they were members of more than 140 standards bodies. Managing this must be incredibly difficult, never mind costly.

Intellectual property

For many companies, the objective is not only to work together to enable amazing new products that can operate together, but also to get their own inventions adopted within the standard. For standards bodies, a lot hinges on the IPR (Intellectual Property Rights) policy that is in operation.

Most bodies in the mobile phone industry have a Frand (Fair, Reasonable and Non-Discriminatory) policy, under which companies license any patents they create on fair terms. However, this has led to patent wars between big players in the mobile industry, and what initially helped the industry grow is fast becoming a way for large incumbents to stifle innovation.

Another policy that can be adopted is the royalty-free (RF) policy. This is the patent policy under which the World Wide Web Consortium (W3C) operates. It helps innovation because, in theory, anyone can adopt the standard without having to pay licensing fees to companies with patents, allowing even the smallest developer to create the next world-beating product. Even the poorest countries can adopt technology that will help them to develop.

However, this too is not without its issues. Companies will avoid work items where they feel they have existing intellectual property, and they are paranoid about so-called "submarine patents" lurking beneath the surface.

In some cases, the situation is a little less clear. The WHATWG was a breakaway group of companies from the W3C who developed some of the specifications for what is now known as HTML5. Astonishingly, they didn't have a patent policy until April 2012, despite working on the project since 2004.

Dominance and participation

Voting can be a big part of standards setting, but it is where the political game really starts. Mostly, the work is driven by an honest joint vision to get things done for mutual and societal gain. However, in some cases, big corporations are adept at playing the standards game and there have been accusations of some companies trying to 'buy' votes in the run-up to important decisions.

Other companies will seemingly deliberately manipulate mechanisms intended to prevent voting abuse. The situation looks worse in organisations like the International Telecommunication Union (ITU), an agency of the UN. In the ITU, voting rights only go to countries. This would be all well and good if the products were actually being developed by those countries, but they're not – it is industry that invents the products and employs all the engineers who are working on the designs.

Companies can participate, but at the end of the day it is the countries that decide what is in or out of a standard, or whether it gets published. The end result is a complete lack of participation and standards that often don't get adopted. This is the worst possible outcome for an engineer who wants to see their work adopted into products and by the world.

Quality in standards

The standards body's office itself plays a key role in ensuring that the SDO is as competent as its own membership. Managing everything from difficult individuals and IPR disputes through to conflict resolution and useful co-operation with other standards bodies helps ensure that the output of the SDO is ultimately of good quality.

If a standard is to be adopted, it has to be of good quality above everything else: a poor standard could cost millions, or simply go unadopted, leaving the body to die a death with only a few tech journalists noticing its final demise as the web domain expires.

Organisations like ETSI and W3C have proven that through the combination of active industry membership, outreach to other bodies and an excellent office team, international standards can achieve great things.

Open vs closed

Open standards such as those from the W3C can be fantastic for the world, as anybody can email the mailing list and help the standard along. This kind of peer review is really necessary if you want to get something that works, although dealing with all the comments from interested parties can take months.

In other bodies, they have a closed/open method – which is that the standard is developed in a closed group of paying members, before being made public once the standard is complete. Sometimes these standards will be sent out in pre-release form for a public review. The argument for this is that it encourages participation in that particular body (and membership fees to keep the thing going), but also enables the group not to be distracted by any old input. In theory, all the experts are around the table.

This doesn't work so well in some closed bodies. The ITU suffers from not having enough experts together, which means that the documents may not necessarily get the required scrutiny, greatly affecting quality. Sometimes there are no comments at all on inputs in the working groups and no discussions on mailing lists.

For any organisation, this is going to result in disastrously flawed outputs. The scariest thing here is that documents are produced with little real regard for the work of existing industry organisations, or any discussions with them before proposals are put together. It all comes across as a big land-grab to the other organisations in the same space, even if it isn't intended to be.

Standardising the scary stuff

In Dubai at the moment, the ITU's World Conference on International Telecommunications (WCIT) is discussing standards-related matters that affect the general public. It is being characterised by some as the fight for the future of the internet. Only last week, there were reports about standards being set on Deep Packet Inspection (DPI). You may not understand what DPI is, but suffice it to say that it is probably the most privacy-invading technology there is, as it can see every single bit of traffic crossing a network like the internet.
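To make the idea concrete: ordinary routing equipment looks only at packet headers (addresses and ports), whereas DPI examines the payload itself. A minimal sketch of that payload-matching idea in Python follows; the function name and blocklist patterns here are invented for illustration, not taken from any real DPI product or standard.

```python
import re

# Invented example patterns an inspector might scan payloads for.
# Real DPI systems use large, frequently updated signature sets.
BLOCKLIST = [
    re.compile(rb"forbidden-keyword"),   # a banned string
    re.compile(rb"\x4d\x5a\x90\x00"),    # an example binary signature
]

def inspect_payload(packet_bytes: bytes) -> bool:
    """Return True if the packet payload matches any blocklisted pattern."""
    for pattern in BLOCKLIST:
        if pattern.search(packet_bytes):
            return True
    return False

# A header-only router never reads past the envelope; a DPI box does:
print(inspect_payload(b"GET /page HTTP/1.1\r\n\r\nforbidden-keyword"))  # True
print(inspect_payload(b"GET /page HTTP/1.1\r\n\r\nhello"))              # False
```

The privacy concern flows directly from this design: to decide whether any packet matches, the inspector must read the contents of every packet, including the innocuous ones.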

There are genuinely important uses of DPI, such as helping companies make sure your service works well, but it is one of those things that can be used for bad too. How would you like it if a private email to a family member was censored, or worse, you were arrested for something you said in it? Even the most well-meaning standards, if set by only a small closed group of people from similar backgrounds, can jump to conclusions that you or I would question.

The ability of technology to do many things entices people to make silly decisions – standards engineers have a burden of responsibility, so should always consider that "just because we can, doesn't mean we have to". For example, using DPI to 'solve' the problem of malware may on the face of it seem like a logical decision. But it isn't. Our society isn't that black and white. How do we define what is and isn't malware and then who ultimately decides and controls that – and is it someone we're going to trust?

If the price of stopping ten thousand people getting infected by a virus is the privacy of one billion, is that proportionate? It is akin to smashing a very small nut with an industrial press. And after all of that, who is to say that the malware will actually get caught?

It is absolutely critical that we all work together to ensure that technologies, if standardised, respect everyone's needs and our innate human right to privacy. The only way to do this is by allowing everyone to have a view.

We all should have a voice in how we want our web to be

There are good and bad in all standards organisations, but it seems that the best solution is to have what is called the "multi-stakeholder model" – one where everyone has a voice. The W3C greatly benefited from having the participation of organisations like the Centre for Democracy and Technology while developing specifications for interfaces from the web to sensitive features like the phonebook or camera on a mobile phone.

In addition, the W3C was able to talk to the UK Child Exploitation & Online Protection Centre (CEOP) about the same subjects. Their input would not have occurred to the engineers around the table, who were interested in the nuts and bolts, and it would not really have been possible if the work had not been publicly available. This is really the crux of the argument.

The web and the internet affect us all; they are no longer just the concern of technical people or governments. We should all have a voice in how we want our web to be.

David Rogers is a mobile phone security expert who runs Copper Horse Solutions Ltd, a software and security company based in the UK. He tweets @drogersuk and regularly blogs at blog.mobilephonesecurity.org
