Can information systems be designed in such a way that all data is used by those who collected it and still serve high-level decision makers?

Registration desk outside the Grenfell rest centre

I was the lead information manager for the rest centre as part of the Grenfell fire response. A whole other article could be written on the experience and lessons learned in the days following the event, but for now I want to focus on one particular aspect that resulted from it. That summer the British Red Cross responded to multiple large incidents, including the Manchester bombing and the London bombing, with the Grenfell fire being the largest. It stretched the capacity of the organisation and highlighted some gaps.

After a review it was decided that the organisation should build an information management system and processes to help with such responses and to ensure that data and information are acted upon, and decisions made, in a systematic manner.

Initially, a traditional approach was taken with a long research lead: after six months there had been a mapping of the information flows needed, the decisions that should be made by management, and the data needed to make those decisions. It was very much a top-down approach, with little consideration for those delivering the response. (Since then, after careful discussion, a more ground-level, agile approach has been taken.)

Reboot wrote a wonderful and relevant article highlighting the difference between upstream and downstream data, and how systems need to consider downstream data more. It defined them as:

Upstream Data, or reporting data, is mostly for high-level decision makers and oversight bodies. It is useful for institutional accountability, strategic planning, and stakeholder coordination.

Downstream Data is useful for adaptive management and can help programs generate impact.

Many humanitarian data systems prioritise upstream data collection, with only smaller amounts used operationally. What if humanitarian information systems were instead designed with downstream data as the primary purpose and upstream data as a beneficial byproduct, derived from the downstream data?
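To make the idea concrete, here is a minimal sketch of what "upstream as a byproduct" could mean in practice. The record structure and field names are hypothetical, not from any real Red Cross system: responders query the operational records directly to drive their own work, and the upstream summary is simply an aggregation over those same records rather than a separate collection exercise.

```python
from collections import Counter
from dataclasses import dataclass, field

# Hypothetical case record captured at a rest centre.
# All field names are illustrative assumptions.
@dataclass
class CaseRecord:
    household_id: str
    needs: list[str] = field(default_factory=list)  # e.g. ["shelter", "clothing"]
    follow_up_required: bool = False

# Downstream use: responders act on the records themselves,
# e.g. listing which households still need a follow-up visit.
def open_follow_ups(records: list[CaseRecord]) -> list[str]:
    return [r.household_id for r in records if r.follow_up_required]

# Upstream use: the same records are aggregated into a summary
# for senior decision makers, derived rather than collected separately.
def needs_summary(records: list[CaseRecord]) -> dict[str, int]:
    return dict(Counter(need for r in records for need in r.needs))

records = [
    CaseRecord("H1", ["shelter", "clothing"], follow_up_required=True),
    CaseRecord("H2", ["shelter"]),
    CaseRecord("H3", ["food"], follow_up_required=True),
]
print(open_follow_ups(records))  # ['H1', 'H3']
print(needs_summary(records))    # {'shelter': 2, 'clothing': 1, 'food': 1}
```

The point of the sketch is the direction of dependency: the data exists because responders use it, and the report is computed from it, not the other way around.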

All data collected at the response is used for ground level decision making

When I look back at Grenfell, I do believe a system could be created that primarily served those responding at the rest centre while still giving senior decision makers an overview of the situation.

Many responders who collect data suffer from data fatigue and often question the value of collecting it. By prioritising downstream data, those collecting it become its custodians, using the data to empower their own response. Hopefully the result is higher data quality and better data-driven responses.

Classic examples of upstream data systems are the cluster system and donor reporting. What would these look like if they prioritised downstream data? Could the shelter cluster provide assistance and tooling for organisations to manage their own shelter responses, with federated, standardised 3W data as a byproduct? How would large funders have to engage the sector to become more flexible with indicators? And is prioritising downstream data simply not viable for every use case?