Preface

Roads are something we all take for granted. But in fact, our roads are the arteries supplying the heart of our global economy. To keep the economy running, road infrastructure has to be well maintained and any damage has to be fixed.

Time is an important factor in road maintenance: a small crack easily grows into a serious pothole within a single winter. And who hasn't heard the argument of businesses relocating because of "better infrastructure"?

Roads carry an importance we usually fail to see.

Damaged roads add cost to the economy, not only in terms of repairs, but also delays and their consequences. Accidents due to bad infrastructure are usually followed by secondary burdens like further delays, possibly higher costs due to more serious damage, and sometimes tertiary costs like healthcare, in case of injuries.

When it comes to road infrastructure, delaying repairs thus automatically results in increased risk and additional cost down the road (pun intended).

Long story short: the sooner any damages to road infrastructure are tended to, the lower the economic burden on society.

The problem

Processes for maintaining public road infrastructure are as archaic as you could possibly imagine.

As of today, road damage is reported by maybe a few bored seniors, but mostly by police officers or observant public servants – if they find the right contact to submit a report. In addition, some municipalities actually employ people who drive around all day looking for damage.

When a report does find its way, an inspector is usually dispatched to check the reported damage, often arriving at the reported location only to find that the damage isn't severe — or finding no damage at all, because the location description was not accurate enough.

Reporting damages is a completely unstructured process, riddled with inefficiencies, delays and loss of data.

As a result, municipalities responsible for maintaining roads don't have information about their current condition and have no efficient means of gathering information about the status quo.

The following idea could change that.

The pitch

In essence, this idea revolves around a global road monitoring system.

The core of this idea is to equip the worldwide road infrastructure with a tight sensor array that automatically monitors road conditions in order to detect any damages and thus enable municipalities and public services to drastically improve their processes and allocation of resources — and ultimately lower the monetary burden on tax payers.

You probably think I am nuts.

It’s actually pretty easy and hardly costs anything:

Most people use means of transportation like cars, motorcycles, buses, bicycles or trains on a daily basis. And the vast majority of them carry a smartphone in their pocket that includes an array of roughly a dozen fairly precise sensors.

In fact, numerous smartphone apps already gather data while we carry our phones around, as recently demonstrated in a detailed article by the New York Times. Unfortunately, their owners are usually neither aware of their every movement being tracked, nor are they rewarded for their unknowing "contributions".

We are going to tap into this already existing, widely available, global sensor array in order to gather data about road conditions on a worldwide level — but by including users of their own free will, on their terms and in an anonymous (or at least pseudonymous, see the paragraph further down) fashion, while rewarding them for their contributed information.

All we need is a smartphone app able to register vibrations while we drive, ride, cycle or use trains.

Note that this is only a mockup illustrating the idea, not a finished design.

The app taps into the smartphone's sensor data (accelerometer, gyroscope, shock sensors, positioning systems, etc.) and registers vibrations when the user drives over a bump, a crack or a pothole.

Whenever the app registers such events, it writes the gathered data in a standardised form (eCl@ass?) to the IOTA Tangle.
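As a rough sketch of what such event detection could look like, the snippet below thresholds the vertical accelerometer channel and serialises hits as JSON payloads. The threshold value, field names and payload layout are all illustrative assumptions, not a finished specification:

```python
import json

# Hypothetical shock threshold in g -- real values would need calibration
# per vehicle type (a car damps vibrations differently than a bicycle).
SHOCK_THRESHOLD = 2.5

def detect_events(samples, threshold=SHOCK_THRESHOLD):
    """Scan (timestamp, lat, lon, vertical_accel) samples for shock events."""
    events = []
    for ts, lat, lon, accel_z in samples:
        if abs(accel_z) >= threshold:
            events.append({
                "ts": ts,
                "lat": round(lat, 5),   # ~1 m resolution is plenty
                "lon": round(lon, 5),
                "severity": round(abs(accel_z), 2),
            })
    return events

def to_payload(event):
    """Serialise one event as the message attached to a Tangle transaction."""
    return json.dumps(event, sort_keys=True)

samples = [
    (1546300800, 52.52000, 13.40500, 0.3),  # smooth road, below threshold
    (1546300801, 52.52001, 13.40510, 3.1),  # pothole hit
]
print(to_payload(detect_events(samples)[0]))
```

In a real app, the raw sensor stream would of course be filtered and batched before anything is written to the Tangle.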

Contributed data is rewarded in IOTA micro-transactions.

As we are talking about rewards on the level of micro-transactions, we can let the app generate an IOTA seed, and addresses derived from it, automatically in the background. In my opinion, there is no need to include full wallet functions in our road condition monitoring app.
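Generating such a seed in the background could be as simple as drawing 81 random trytes from a cryptographically secure source; deriving addresses from it would be delegated to an IOTA client library and is omitted in this sketch:

```python
import secrets

# An IOTA seed is 81 trytes drawn from the tryte alphabet.
TRYTE_ALPHABET = "9ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def generate_seed(length=81):
    """Generate a random seed with a cryptographically secure RNG.
    Address derivation would be handled by an IOTA client library."""
    return "".join(secrets.choice(TRYTE_ALPHABET) for _ in range(length))
```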

This app is intended to enable the average Joe. User experience would be crucial to attract non-crypto enthusiasts.

Therefore, functions giving users the ability to transfer earned IOTA rewards to another address should be sufficient. Most probably, Trinity modules could be repurposed in order to integrate these functions into our app.

Aggregating and enriching data

The current version of the IOTA Data Marketplace, to my knowledge, doesn’t include features to manage data sources nor does it offer functions to aggregate data.

I believe that a service aggregating the collected road condition information for potential customers would substantially lower the entrance barrier for them to utilise that information. Municipalities aren’t high tech companies that easily set up their own servers or software. I therefore propose to provide the necessary infrastructure:

A service is established that continuously listens for new road condition information issued on the IOTA Tangle, while remembering who contributed individual data items in order to be able to reward contributors later.

The service aggregates all individual data points and maps their locations onto a digital street map in order to enable customers like municipalities and public services to “see” current road conditions through a web interface.
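A minimal sketch of this aggregation step, assuming a simple latitude/longitude grid (the tile size of 0.02 degrees, roughly 2 km of latitude, and the field names are illustrative assumptions):

```python
from collections import defaultdict
from statistics import mean

# Illustrative tile size: 0.02 degrees is roughly 2 km of latitude.
TILE_SIZE_DEG = 0.02

def tile_id(lat, lon, size=TILE_SIZE_DEG):
    """Snap a coordinate to the grid tile containing it."""
    return (int(lat // size), int(lon // size))

def aggregate(events):
    """Group events by tile and compute the average severity per tile,
    which is what a heat-map colour key would be driven by."""
    tiles = defaultdict(list)
    for e in events:
        tiles[tile_id(e["lat"], e["lon"])].append(e["severity"])
    return {t: round(mean(sev), 2) for t, sev in tiles.items()}
```

A production service would use a proper geospatial index, but the principle of snapping contributions onto map tiles stays the same.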

Data sales

Heat maps created by our aggregation service showing road condition information are made accessible through a website to anyone interested in any area on the planet (if sufficient data has been aggregated).

Areas of interest can be checked for available data in a preview-mode that obfuscates any detailed information while indicating the severity of the average road condition through a colour key.

The preview thus enables potential buyers to determine whether it would make sense for them to buy information on a detailed level in order to determine where exactly road maintenance would be necessary.

The obfuscated preview could look something like this:

Example of a preview mode obfuscating the exact location of road damage.

Detailed information is sold ad hoc for MIOTA, ideally without any form of registration.

Also, detailed information should only be sold in bulk, spanning a fixed area (for example in sizes of 2 km × 2 km), in order to be able to reward continuous contributions of users submitting data and to prevent buyers from selectively picking only the most valuable data (i.e. specific spots with the most severe damage).

After a payment has been made, the selected area can be inspected in detail by the buyer to determine exactly where the recorded damages are and how severe they are.

Example of detailed information that has been paid for and "uncovered": the buyer had to buy data for a fixed area spanning several tiles, which enables us to reward more contributors — not only the ones who ran into the most severe road damage.

Rewarding contributors

This idea is first and foremost one that enables anyone to contribute to society by effortlessly improving an inefficient process that ultimately lowers tax expenditures to a certain degree. In the end, everyone benefits: Better roads, lowered costs.

Nevertheless, there should be fair remuneration for data contributions, especially as we are living in a world where data is usually gobbled up for free: proceeds from sales generated through the aggregation service/website should be split among everyone who contributed data for the areas being bought.

This is no get-rich-quick idea. There is no guaranteed reward for contributing data per se, but only if data is bought. And in any case, no-one is going to become rich from contributing data about potholes. This is an idea aimed at enabling anyone to make a small contribution to the betterment of society.

Payouts

Data contributions and payouts could either be handled through MAM channels or through individual transactions.

Using MAM channels would probably be a simple process, but would require an open channel to anyone who contributed data. Rewarding data points saved to the Tangle in individual transactions would most probably create a lot of overhead, but would of course be possible. I suppose reusable addresses, as recently described by IOTA Foundation member Hans Moog, could also be of great benefit to this use case.

In any case, which option would be more suitable can be decided if this idea is followed up on.

The same applies to the sales price to uncover detailed information of an area. It could e.g. depend on area size or the amount of available data — or both.

Also, further details regarding the split of proceeds among contributors (three-year-old data might, for example, not be as valuable as 'fresh' data) would probably have to be worked out.
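To illustrate one possible split, the following sketch weights each contribution by its age with an assumed one-year half-life before dividing the proceeds. The decay rule is purely hypothetical; the point is only that age weighting is a small computation:

```python
from collections import defaultdict

# Purely hypothetical decay rule: a data point's weight halves per year of age.
HALF_LIFE_YEARS = 1.0

def payout_split(sale_amount, contributions, current_year):
    """contributions: (contributor_id, data_year) pairs inside the sold area.
    Returns each contributor's share of the sale proceeds."""
    weights = defaultdict(float)
    for contributor, year in contributions:
        weights[contributor] += 0.5 ** ((current_year - year) / HALF_LIFE_YEARS)
    total = sum(weights.values())
    return {c: round(sale_amount * w / total, 6) for c, w in weights.items()}
```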

Fake data

Any system including a monetary incentive attracts malicious actors trying to exploit it. I can easily imagine ‘dishonest’ actors trying to submit fake data for random geo-positions in hope of their contributions being included in data sets that are bought by someone.

There are probably several options to avoid this. Off the top of my head, the easiest seems to be a threshold at the aggregation level:

Over the course of time, people will detect the same damages at the same geo-locations, thus creating duplicate data points. By applying a threshold to every possible geo-location, any data contributions for locations with fewer than X reported data points could easily be excluded from sales.

As dishonest actors cannot know which geo-locations receive enough data points to pass the threshold, they would have to generate incredible amounts of fake data points for random geo-locations in order to hit, by chance, an area that is included in data sets offered to customers.

Data from "honest" users, on the other hand, will be included once enough people have reported readings for the same location for it to pass the threshold "organically". With a threshold in place, only dishonest actors would therefore waste their contributions; for honest users, it is only a matter of time until their data contributions are included.
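The threshold itself is straightforward to apply at the aggregation level. This sketch, with an assumed minimum of three distinct contributors per location, filters out locations that haven't been independently confirmed:

```python
from collections import defaultdict

# Assumed threshold: a location needs three distinct contributors.
REPORT_THRESHOLD = 3

def sellable_locations(reports, threshold=REPORT_THRESHOLD):
    """reports: iterable of (location, contributor_id) pairs. Only locations
    confirmed by enough distinct contributors are offered for sale."""
    confirmations = defaultdict(set)
    for loc, contributor in reports:
        confirmations[loc].add(contributor)
    return {loc for loc, who in confirmations.items() if len(who) >= threshold}
```

Counting distinct contributors rather than raw reports means a single actor cannot pass the threshold by spamming the same location.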

Of course, dishonest actors could also repeatedly generate data for specific geo-locations to pass the threshold.

This vector could probably be closed by embedding a secret algorithm into the app, issuing 'empty checkpoint' data at predefined intervals. Think of it like 2FA, where two parties, the app and the aggregation service, share the same key used to verify data. Or like the IOTA Coordinator, whose milestones are recognised by nodes running the IRI.

This way, the aggregation service can determine whether data contributions come from someone who knows the key (only the official app does) and therefore classify those contributions as "honest".
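One way to implement such a shared-key check is a standard HMAC over the checkpoint interval, sketched below. The key handling is deliberately simplified; a real deployment would at least need key rotation and protection against extracting the key from the app binary:

```python
import hashlib
import hmac

# Illustrative shared secret baked into the official app and known to the
# aggregation service -- a real scheme would rotate and protect this key.
SHARED_KEY = b"example-shared-secret"

def sign_checkpoint(interval_id, key=SHARED_KEY):
    """Tag the app attaches to its periodic 'empty checkpoint' message."""
    return hmac.new(key, str(interval_id).encode(), hashlib.sha256).hexdigest()

def verify_checkpoint(interval_id, tag, key=SHARED_KEY):
    """Aggregation-service check that a tag came from the official app."""
    return hmac.compare_digest(sign_checkpoint(interval_id, key), tag)
```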

This isn’t a new problem. I am confident that people more knowledgeable in this area of expertise could come up with a viable solution in case my approach is amateurish.

What if no-one contributes data?

In order to collect enough data to create heat maps of road damages, at least a few people will have to drive, ride or cycle into/over them.

While it might sound as if hundreds of people would be needed, I suspect only a handful would be sufficient to generate accurate measurements for any given damage:

For example, five people living in the same area and using the same route to get to work every day would generate around 100 duplicate measurements for every single pothole or bump over the course of a month (five people × roughly 20 working days). These 100 measurements are far more than necessary to determine an accurate average.

I am therefore certain that this concept could come to fruition as a grassroots movement, enabled by the IOTA community alone.

But, maybe it doesn’t have to.

Opportunity for synergies with the European +CityxChange initiative

IOTA is an integral part of the +CityxChange initiative, starting in January 2019. Core participants of the initiative are seven European cities, committed to developing feasible and realistic smart city demonstration projects. The initiative is funded by the European Union as well as participating companies and institutions.

As cities are already committed to the initiative and budgets have already been allocated, using a truly tiny fraction of them to pay for data, and thus reward the participating population under this concept, doesn't seem completely outlandish.

Especially since it seems certain that the public will be included in several aspects, projects and trials within the +CityxChange initiative anyway.

In addition, a proposal like this one, including an easy way to directly involve the general population under the umbrella of +CityxChange, has a high chance of being promoted by the cities themselves – as well as gaining attention through the local, national or even international media covering the initiative throughout the next years.

What if no-one buys data?

Participating cities of the +CityxChange are most likely interested in smart data anyway. Going beyond the seven cities will most likely take a bit of effort.

In principle, any city maintaining public infrastructure, as well as railway corporations should be interested in improving processes of early damage detection in order to save money.

Besides likely conferences and media coverage, talking to the right people could help spread the word and accelerate awareness.

IOTA already has a huge community — and everyone can do their part in trying to at least make their local municipality aware of new opportunities that could help it cut costs.

In case there isn’t any feedback, unpack your trollface:

Let one of your local opposition politicians who is trying to build a career in public service know about the data available for your city.

Explain that the insights would dramatically improve maintenance processes and help the municipality save tax-payers' money.

Tell them that the municipality/ruling party wasn't interested in data generated by their own constituents and that they seem to prefer "doing it the old way" while wasting tax-payer money.

After that, get some popcorn, lean back and watch the political show that will unfold.

Privacy

IOTA isn’t anonymous but pseudonymous by design. Even if encrypted (instead of saving clear text information in individual IOTA transactions), contributed information could theoretically be used to derive meta-information about individuals in case the key used to encrypt road condition information is ever broken, stolen or accidentally revealed.

This makes it theoretically possible to derive meta-information from contributed data, like for example "contributor exceeded the speed limit at location x" or "contributor always leaves a certain location at a certain time".

While linking addresses to individuals isn't possible as long as contributors keep their seed (and thus the addresses derived from it) private, sending earned rewards to another address they own could theoretically provide evidentiary links to their identity: for instance, by sending rewards to an address used to pay a merchant while providing that merchant with a name and/or delivery address.

The use of MAM channels instead of standard IOTA transactions to contribute road condition information would keep any contributed information "private". But I suspect that MAM channels might bring architectural challenges when it comes to a large number of users contributing data.

Regardless of whether standard transactions or MAM would be used — and while risks seem low even for standard IOTA transactions, privacy might be given a bit more thought in case this proposal is selected for further investigation.

Potential extensions and synergies with existing projects

Predictive maintenance: A potential use case for Qubic

Having access to information about current road conditions would be a big step forward for municipalities. Being able to predict where maintenance will be necessary in the future would probably be a quantum leap for them.

While incoming data might indicate moderate damage, for example from a small pothole, continuous recordings will reflect the increase in damage as that pothole grows over time. The growth rate could probably be extrapolated into the future.

In essence, the heat map showing current road conditions could therefore also be used to project future road conditions, basically telling municipalities where maintenance will be necessary and, most importantly, when it will be due.
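A first approximation of such a projection could be a simple least-squares trend over the recorded severity readings of one damage site, extrapolated to the day an assumed severity limit is reached. The linear model and the limit are illustrative assumptions; a real predictor would likely be non-linear and account for local conditions:

```python
def predict_due_day(readings, severity_limit):
    """readings: (day, severity) pairs for one damage site.
    Fits a least-squares line through the readings and extrapolates
    the day the assumed severity limit will be reached."""
    n = len(readings)
    mean_x = sum(d for d, _ in readings) / n
    mean_y = sum(s for _, s in readings) / n
    slope = (sum((d - mean_x) * (s - mean_y) for d, s in readings)
             / sum((d - mean_x) ** 2 for d, _ in readings))
    intercept = mean_y - slope * mean_x
    return (severity_limit - intercept) / slope
```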

Having this knowledge in advance would give them time to plan ahead and allocate resources accordingly and more effectively – which again saves money.

The computations necessary for such predictions could of course be outsourced to a Qubic. Going beyond that, I imagine an AI based prediction model could even take local modifiers (harsher winters in the northern hemisphere, highways being used more frequently by heavy trucks, etc.) into account to ‘learn’ from global data and derive more precise results for local areas.

High Mobility vehicle blueprints

Given that many modern vehicles already have a similar (and in fact larger) array of sensors at their disposal than smartphones do, the smartphone-app-based model could probably be extended to modern vehicles through High Mobility blueprints:

Instead of recording events with a smartphone app, vehicles could record them and save them to the IOTA Tangle.

In fact, this idea could be the first real-life application through which vehicles generate revenue for their owners (even if only nano-revenue).

Public services vehicles

Cities and municipalities, the ones with the biggest interest in the generated data, also maintain large fleets of vehicles like buses, police cars, ambulances, fire trucks and all sorts of public service vehicles. These could also be used to help collect data.

A small device detecting vibrations on its own, hooked to the CAN bus and/or OBD port of older vehicles (to gather movement and/or position data), could take the role of a smartphone and thus contribute data as well.

This would probably be a rather small challenge for Nordic Semiconductor and the NTNU. I am certain they could help with the design of low cost hardware.

Feasibility and timeframe

I am certain that this idea would "work". A few years back, I was tasked with developing a concept for an app making use of similar data input for a well-known German vehicle manufacturer. The app was realised within six weeks by a single iOS developer. I suspect that an MVP of this idea could be developed in half that time.

For the aggregation service, I suspect a simple setup would take a frontend/backend team less than four weeks to build a stable first iteration.