Many developers envisage Ethereum harnessing distributed computing power to one day rival the world's most powerful non-distributed clusters.

http://www.distributedcomputing.info/projects.html

https://www.top500.org/lists/2016/11/

^ According to this site's November 2016 ranking:

Sunway TaihuLight clocks in at 125,435.9 TFlop/s

and

Tianhe-2 (MilkyWay-2) clocks in at 54,902.4 TFlop/s

Assuming the Ethereum platform can successfully synchronise peers, do we have any mathematicians here who can model how many distributed Ethereum nodes it would take to compete?

Perhaps, as part of a future Ethereum protocol, we could ask contributors to partition only 5-10% of their CPU power (so as not to affect an average user's other daily processing requirements)?
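As a rough sanity check on the question above, here's a back-of-envelope sketch. The TaihuLight figure is from the TOP500 list linked earlier; the ~0.1 TFlop/s per consumer CPU is purely my assumption, and this ignores synchronisation and coordination overhead entirely:

```python
# Back-of-envelope node-count estimate. The per-node peak figure is an
# assumption for illustration, not a measurement.
TAIHULIGHT_TFLOPS = 125_435.9  # Sunway TaihuLight, TOP500 November 2016
NODE_PEAK_TFLOPS = 0.1         # assumed peak for a typical consumer CPU

# Contributors donate only a 5-10% slice of their CPU, per the suggestion above.
for share in (0.05, 0.10):
    nodes = TAIHULIGHT_TFLOPS / (NODE_PEAK_TFLOPS * share)
    print(f"{share:.0%} share -> ~{nodes:,.0f} nodes")
```

Even under these generous assumptions (perfect scaling, zero network latency), the answer lands in the tens of millions of nodes, which gives a sense of how far off raw-FLOPS parity is.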

Personally, this is why I think Ethereum is so important! It would be disruptive to currently available commercial software suites, promoting the development of "everyday" "Ethereum-powered" DApps, from media editing to realtime GIS applications, initially. Beyond that, I think active collaboration with existing non-distributed cluster projects could greatly advance our current scientific modelling capabilities and improve our overall efficiency. Just a thought for the EF to explore; many of these projects are open source / non-proprietarily licensed too?

We could dynamically adjust the gas/ETH price to offer these applications at a fraction of the price charged by leading commercial software providers, and make them even cheaper for registered educational/research institutions to use.

I think Project Golem has a similar scope, but it's not at the EVM protocol level and is already separately tokenised?