In a keynote on day four of EuroPython 2015 in Bilbao, Spain, Carrie Anne Philbin looked at using Python in computer education for children. She covered some of the background on a new curriculum that has been established in England and on why Python fits well into that curriculum. She also talked about using the Raspberry Pi as a platform for that teaching and looked at some of the challenges in using Python in the classroom.

Philbin considers herself an educator foremost and was a teacher at the time she started out working with the Raspberry Pi and Python. She now works for the Raspberry Pi Foundation, which is a registered educational charity. All of the profits from selling Raspberry Pi hardware go to the foundation, which was set up to advance the education of adults and children in computers and computer science.

Her journey

Her journey started in the classroom where she was teaching an "information and communications technology" (ICT) class. In 2010 or 2011, Google's Eric Schmidt had made a speech in the UK (her home) where he said that UK computer science education was really bad and that it was being taught the wrong way. That was picked up by the media and was turned into a statement that all of the ICT teachers in the UK were terrible.

That was depressing to her, so she started thinking about ways to bring some of the new thinking about computer education into her classroom. She heard about the Raspberry Pi and tried to get one on the first day it was out, which was unsuccessful for her as it was for many. She did eventually get one two months later, plugged it in, and saw that it was simply a Linux computer, which was not anything new or exciting to her.

But she still saw it as a tool for teaching about computers, so to try to find ways to use it in her classroom, she decided to attend a nearby Raspberry Jam that was being held at the Mozilla Space in London. It was attended by fifty men and three women; all of the women were teachers. There were lots of "geeky projects" that were presented at the meeting, but she didn't see anything that she could bring into her classroom.

She eventually had the opportunity to address the meeting, so she explained that she was a teacher and that she wanted to bring the Pi into the classroom. The projects she saw might not inspire students, she said, and in particular might not inspire girls. That elicited a heckling response from one attendee who suggested that maybe she would need to program the device to go shopping or organize sleepovers in order to attract girls.

That was a "terrible experience" (also recounted here), but once she returned to her seat "something really amazing happened". Ten different people came up to her and told her to ignore the heckler, that they had ideas for projects that would inspire students both male and female, and that they wanted to help. From there, everything started to snowball. She "kind of just took over" the education track at the next PyCon UK and she now serves on the board of the Python Software Foundation.

In September 2014, a new computing curriculum for England was announced. From the age of five, children will learn to program and from eleven, they will learn at least two programming languages, at least one of them text-based. Unfortunately, the government only put an additional £3.5 million toward the effort, which works out to about £175 per school. That is not really a useful amount of money for training and the like.

England is something of a pioneer in requiring computer education. Israel has long had computers as part of its curriculum. Estonia, Australia, and New Zealand have recently added the subject. Some Scandinavian countries look like they may be doing so in the near future. That's it, she said.

Why teach computing?

Philbin asked, why teach computing? There are a number of reasons to do so. To start with, children are creative and imaginative; they are also not afraid of failure. Educators actually train that out of them along the way. Children will also tinker with things, which is how we all learn.

Another reason is for social mobility. Most people with jobs in computer science have come from affluent backgrounds. Programming is also empowering. It makes people feel good about themselves and what they can accomplish. Bringing more diversity into the technical fields is also a reason for teaching about computers. Only 16% of IT workers in the UK (and 20% in the US) are women. Those who are creating technology are not representative of those who are using it.

The final and most important reason is shown in a video called "Humans Need Not Apply". Many things will become automated over time, including tasks for people like supermarket clerks or baristas. In addition, technology such as self-driving vehicles has the potential to put a lot of people out of a job, including taxi and lorry (truck) drivers, when it becomes legal. So the goal is to train today's children to be able to program or fix these machines.

But the focus is shifting from "why?" to "how?". The visual programming language Scratch is an easy win for young children, but eleven-year-olds need to learn a text-based language. So people started having a look at Python, she said.

Why Python?

There are good reasons to look at Python for education. It is used all over the world and is powerful enough to be used for real development. It is actually used in the real world by organizations like NASA, for example. It has a simple syntax; by comparison, writing "hello world" in Java takes six lines of code.

Probably the most important reason for educators, though, is the community. Python has a strong and helpful community. Raspberry Pi has a great community, too, she said, but Python's tops it. It is not just national and international communities that are the big draw; it is the local communities that really make Python special.

Python conferences have also taken up the cause of using the language in education. PyCon UK has had an educational track for the last eight years. This year, there were 40 teachers at PyCon UK. If you build it, they will start to come along, she said.

PyCon 2015 in Montréal had an educational track. There is an educational summit at this year's EuroPython; she was headed to PyCon Australia soon, which will also have an educational track. She suggested that conferences should try to help teachers by starting these tracks. Teachers may not come the first year, which would be sad, but they will take notice and start coming.

Barriers

There are some significant barriers to students trying to learn Python, however. To start with, it is a struggle to move from Scratch to Python. The Raspberry Pi Foundation is trying to overcome some of that difficulty with a game called Pyland. Python 2 versus Python 3 is also a problem, but it really shouldn't be. There is no reason not to just teach Python 3. All of the foundation's resources are written in Python 3 and most major libraries are available for that version.
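One concrete example of the kind of Python 2 versus Python 3 difference that matters in a classroom (not one Philbin cited specifically, but a commonly taught one) is integer division:

```python
# In Python 2, 3 / 2 silently truncated to 1, a classic source of
# classroom confusion.  Python 3 separates the two operations, so
# teaching Python 3 from the start avoids the trap entirely.
print(3 / 2)    # 1.5  (true division)
print(3 // 2)   # 1    (explicit floor division)
```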

There are a number of naming issues that make Python difficult for children, she said. People write libraries for themselves and are not always consistent with the names they choose. She mentioned the general-purpose I/O (GPIO) library for Raspberry Pi as an example. It was written by a UK brewer to regulate the temperature in his brewery. That library is now used in education all over the world. Like other Python libraries, it has inconsistencies that make it hard for students.

Another problem stems from children saving their program with the same name as the library they are using. It may seem obvious to everyone in the room that naming your program that uses the pycamera library pycamera.py will not lead to happiness, but she has done it herself. When she did, there were five Python developers trying to help her figure out the problem and it actually took some time to do so.

Another example is the module for Minecraft. It is a real hook for children; it "kind of blows their mind" that you can build things in Minecraft by writing a program. But even the first line of a program causes problems:

from mcpi.minecraft import Minecraft

The capital letter often trips them up, and children are impatient. In addition, the setPos() function's name is not all that helpful. In Ruby, for example, the function is called mc_teleport, which is much more obvious.

Another snippet of Minecraft code shows additional problems:

glass = block.GLASS.id
mc.setBlock(40, 50, 60, glass)

That is long-winded and has the capital letters to deal with. By contrast, in Ruby it is simpler:

mc_set_block :glass, 40, 50, 60

"Please think about children when naming things", Philbin said.

Pygame is a great library for children, but it is difficult to use in the classroom. There are a lot of concepts that need to be explained before you can start doing anything with it. That makes it hard to teach in an hour-long class. Daniel Pope came up with Pygame Zero to help bridge that gap. It simplifies Pygame by providing a runtime so that students don't need to understand game loops, event queues, and other concepts right away. That means a simple program that actually does something can be taught in a bite-sized lesson. It does make some decisions for you, she said, but you shouldn't be afraid to do that for children.

Adding extra libraries is another hurdle. It is easy to do it on one computer, but not for 30 or 50—or for Raspberry Pi devices. For one thing, school computers tend to be behind firewalls or filtering software that can make it difficult to simply access the extra libraries. She and Pope have been discussing an "education bundle" that would gather up useful libraries (e.g. Pygame Zero, NumPy) into a single package to help with that problem.

IDEs

Finally, there is the problem of a Python integrated development environment (IDE) for children. Finding programming interfaces for children is difficult. She asked how many in the audience used the Python-included IDLE as their IDE and got two or three hands out of more than 500. Being bundled with Python is a big advantage for IDLE, since it will always be available, whether there is internet connectivity or not.

She noted that her school was not able to upgrade to the newest version of Scratch because you needed to sign up online to get it. At her school, the internet was slow and firewalled. There are lots of online resources and IDEs for Python, but those won't work for children and schools that don't have access (or have restricted access) to the internet.

One offline option is the PyCharm Educational Edition, which is free and open, but it is not obvious how to use it. It is great for children sixteen or older, but won't work for those who are younger. There are too many buttons and too much setup required. There needs to be a simpler open solution.

IDLE is one option. It does come with simple syntax highlighting and some auto-indentation. But it runs in two separate windows, which is confusing. Adding a Minecraft window just makes that worse. In addition, the error reporting in IDLE is "atrocious", she said.

She showed a picture of the interface for an educational program developed by Dr. Sam Aaron called Sonic Pi. It is a synthesizer that can be live-coded to create various kinds of sounds and music. She apologized because it is written in Ruby, but it is a good example of a nice interface for students.

In Sonic Pi there is a button for "Run" and one for "Stop". There is a way to make the text smaller and larger (which is needed in IDLE). It has line numbers and its windows are all together as panes in a larger window. There is an inbuilt tutorial and it has a button to align the Ruby code. She wondered why there can't be something like that for Python.

Helping out

There are plenty of ways for those interested to help, Philbin said. Meeting with educators, talking with them, and listening to their problems is one way. Adding education tracks to conferences is another. Having a special education session at a local user group would be helpful, as would mentoring a teacher so they have someone to turn to with questions. In addition, creating and contributing "awesome libraries" (which have a consistent API) would be quite helpful.

There is a new Python education workgroup that is getting started. The intent is to "make some things happen", like making IDLE better, for example. The group needs any help people can bring to its mailing list. She would like to see the group set some specific goals before she tries to get recognition from the PSF board.

Ever the teacher, Philbin gave out some homework at the end of her talk. First was to join the workgroup mailing list. Second was to read and contribute to Al Sweigart's IDLE Reimagined project. Last was to read Nicholas Tollervey's Python in Education book, which is available as a free ebook. All of the homework is due by EuroPython 2016 (which will also be held in Bilbao). "I will be checking", she said with a laugh.

There is a danger that writing code will become an educational fad and that visual programming (with languages like Scratch) will dominate. "Please help make sure that doesn't happen". She firmly believes that code should be part of the curriculum and that it should be done with Python. If that doesn't happen, JavaScript could take over instead.

Computer education could change society in a positive way, she said. When the Raspberry Pi generation grows up in 20 years, they will not all be developers, certainly. But they could all benefit from the problem-solving skills that learning programming will provide.


MyPaint is a free-software painting program that provides a minimalist user interface optimized for use with pressure-sensitive graphics tablets. The project recently released the first beta builds of its upcoming 1.2 series. This update makes a number of noteworthy changes, including support for vector-graphics layers, new tools, and support for color palettes that emulate the traditional mixing of paints rather than the RGB model used in most computer graphics formats.

The beta release was made on July 22. Source is available through GitHub and there is a personal package archive (PPA) for Ubuntu. Both of these constitute recent changes in the project's infrastructure—it has only recently migrated its repositories to GitHub, and previous releases did not include an Ubuntu PPA. The transition from the old infrastructure to GitHub is not entirely complete, however: one is likely to find many confusing links to earlier MyPaint sites (such as mypaint.org) that are redirected back to GitHub—and not always to the correct destination. This makes looking up documentation particularly difficult, as the GitHub wiki does not yet appear to include everything that was available at the old site. On the plus side, the team has triaged quite a few bugs and successfully planned and released several milestones to lead up to the final 1.2 release, so the infrastructure change appears to be a positive one.

In addition to the application itself, the MyPaint GitHub "organization" now also maintains libmypaint in a repository of its own. Libmypaint is usually referred to colloquially as the "brush engine;" it is the library that simulates natural-media painting tools. The engine is well-liked for how easy it makes the process of designing and adjusting brush tools. Users, in particular, seem to like the MyPaint engine's ability to create randomized variations within the strokes as one paints.

In recent years, MyPaint's brush engine has developed enough of a following that users have called for its adoption in several other tools—most notably in Krita, which is the other major natural-media simulation program in free software. But Krita's attempts to add support for the MyPaint engine have been less than successful due to the speed of changes within MyPaint. Perhaps maintaining the engine as a standalone library will increase its stability and, therefore, its usefulness to other projects.

The most substantial new feature in the 1.2 update is improved support for layers within files. MyPaint's native file format is OpenRaster (ORA), which was designed to facilitate file interchange between graphics applications and which borrows heavily from the design of the OpenDocument format. However, in previous releases there have been ORA file features that MyPaint did not support—such as layers containing vector graphics like SVG. The use cases are fairly straightforward: users would like to assemble an image that contains SVG elements as well as the naturalistic ink-or-paint drawing styles at which MyPaint excels. Comics are often drawn this way, although simply inserting a vector logo into a painting file would require the same feature.
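The OpenRaster container is simple enough to sketch: an ORA file is a ZIP archive holding a mimetype entry and a stack.xml manifest that lists the layers, each pointing at a file inside the archive. The layer names and sizes below are invented for illustration, but the overall structure follows the format:

```python
import io
import zipfile
import xml.etree.ElementTree as ET

# Build a toy ORA-style archive in memory: a mimetype entry plus a
# stack.xml listing one raster layer and one vector (SVG) layer --
# the kind of mixed file MyPaint 1.2 can now open.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("mimetype", "image/openraster")
    z.writestr("stack.xml",
               '<image w="64" h="64"><stack>'
               '<layer name="ink" src="data/layer0.png"/>'
               '<layer name="logo" src="data/logo.svg"/>'
               '</stack></image>')

# Reading it back is just ZIP plus XML parsing.
with zipfile.ZipFile(buf) as z:
    root = ET.fromstring(z.read("stack.xml").decode())
    layers = [layer.get("src") for layer in root.iter("layer")]

print(layers)  # ['data/layer0.png', 'data/logo.svg']
```

A real ORA file carries actual PNG (or SVG) data for each layer and more attributes per layer, but this is the shape of the interchange problem the format solves.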

MyPaint can now open ORA files containing vector layers, though it cannot edit the content on those layers. For most users, this will certainly be sufficient, since Inkscape is designed for vector editing and supports ORA. The new vector-support work also enables MyPaint to read layers in many other content types. A necessary evil in the OpenRaster world is that layers tailored for use in a particular program might not render the same when displayed in every program, but this is something that the user needs to be aware of anyway. MyPaint may not render all of Inkscape's SVG filter effects exactly as Inkscape does, for example, but it should open the layer containing them correctly, which is a big improvement.

A related feature is that MyPaint can now use more image types as its background layer. The background layer in MyPaint is not simply the bottom-most image layer; it is a special feature designed to provide a canvas-like (or paper-like, depending on the user's preference) background to simulate the natural-media painting surface. Since MyPaint lets the user extend the canvas ad infinitum in any direction, these background images must be tileable. While there has always been a good selection available, supporting more image types is a welcome change.

There has also been some work improving the painting tools. There is a "fill" tool for the first time, for example, and there are now panels that list the history of the brushes and colors used during the active session. Access to brush and color history may sound like a small thing, but in daily usage it is (arguably) more useful than a simple "undo" history. Given how many brushes come with the default MyPaint and the fact that users can add their own, it can be all too easy to forget which one was just used. Rolling back edits with undo does not help. And it is certainly important to be able to re-select a previously used color, too.

Speaking of colors, one interesting addition in the 1.2 series is the ability to switch between the red-green-blue (RGB) primary color model used for almost all digital graphics and the red-yellow-blue (RYB) primaries that traditionally correspond to physical paint in the real world. For a program that advertises "natural media" simulation, it cannot get much more "natural" than RYB. Obviously, any particular color can be transformed from one model to the other and back; what this feature does in MyPaint is present a different color-selection wheel to the user. Presumably it will seem more natural to at least some users for some tasks. The application also offers a third option: the complementary-pairs wheel from the opponent-process model, with red across from green and blue across from yellow.

The final new-tool feature in the 1.2 series is a new brush pack designed by MyPaint user "kaerhon." It consists of 20 brushes that emulate traditional ink and paint tools. While MyPaint's engine supports a wide variety of creative effects—and the default complement of brush packs shows off those effects—straightforward tools are still what most users need.

As one might expect, there are other changes to be found under the hood in the 1.2 beta release. Perhaps the most significant is that MyPaint has been ported from GTK+2 to GTK+3. Users might only notice the change through MyPaint's ability to use GTK+3 "dark" theme variants. If the user's GTK+ theme supports a dark variant, the user can tell MyPaint to use it through the application preferences. In the long run, though, staying current with the GUI toolkit of choice is simply something that the project needs to do. It has been more than two and a half years since the release of MyPaint 1.1, so GTK+3 support is arguably a bit overdue.

Ultimately, the upcoming MyPaint 1.2 release seems poised to be a solid, incremental update rather than something revolutionary. In the years since the 1.1 release, though, the project has gotten its house in order—migrating infrastructure to GitHub, separating the brush engine into its own library, and porting to GTK+3 are not glamorous jobs, but they will presumably pay dividends as time goes by. In the meantime, MyPaint users will gain an important feature in the ability to use ORA files that contain vector images and the ability to experiment with non-RGB color selection. Natural-media simulation is a bit of a niche, and may always remain so, but there is still no shortage of development work for free-software projects tackling the problem.


Holger Krekel is a longtime Python developer, starting back in 2001. He is one of the co-founders of the PyPy project, created the pytest Python testing tool, and has worked on several other Python-based projects. But his keynote on the third day of EuroPython 2015 was not particularly Python-centric—it was, instead, a look at the history of centralization in communication technology and some thoughts on what might lie in the future.

Krekel began by noting that he has given lots of talks at various conferences, but that he still gets nervous and uneasy when standing up in front of an audience. Part of the problem is that we are wired to feel uneasy when lots of people are staring at us; in archaic times that probably meant they wanted to kill us. Since it is a natural reaction, overcoming it is difficult, but recognizing the underlying cause helps. Those giving their first talk will likely feel it even more strongly, he said.

Over the last few years, Krekel has been meeting with other communities, including Node.js, Erlang, and Haskell groups, but also other non-language-specific groups that focus on higher-level concepts. His talk was meant to relay some of what he has learned. But first, he wanted to talk about the past in order to talk about the future.

The past

"Real rocket science" took place almost 50 years ago, with the Apollo moon landing. The Apollo missions set the speed record for humans at roughly 40,000 km/hour. But after that, the rocket science advances started to slow down. From 1685 on, the number of scientific papers published doubled every fifteen years—he likened it to Moore's Law—but that leveled off in the 1970s.

Who was doing this rocket science, he asked; who was programming these rockets and spacecraft to land on the moon? He put up a slide of Margaret Hamilton standing next to a stack of printouts as tall as she was: the source code for the Apollo program. She led the programming effort for the project.

In the 1960s, more women than men were programmers. That changed because more money flowed into the computer industry, which attracted more men. Research has shown that as fields attract more money, men tend to dominate. In the early days, programming was seen as a "lowly" task that involved typing so it didn't seem particularly important. Hamilton was one of the leading rocket scientists.

He then showed a picture of an old rotary-dialed phone. In 1939, those types of phones started using "pulse dialing", where each digit of the phone number actually controlled relays across the country to switch the wire to connect to the phone at the other end. That was all run by one company (e.g. AT&T in the US), which controlled all of the hardware (phones, relays, network) to make it run reliably.

In 1974, another "rocket science invention" came about using modems that allowed creating an overlay network on top of the voice network. Many researchers believed it was the wrong approach, though, because it could not be any more effective than the underlying network. So they came up with the idea of a packet-switched network where each packet gets a "higher-level telephone number" (the IP address) for its destination.

That idea had a big advantage that was not obvious at the time: there are no setup costs, unlike with phone calls. You can just put a packet on the wire and the router will make a decision about how to forward it toward its destination. It was envisioned as a distributed network and one that was resilient in the face of failures—packets can be rerouted around them. It turned out that decentralization "was a bit of a hippie dream", he said.

The present

What actually happened is that certain endpoints started collecting the lion's share of the traffic. The IP network is still kind of using the idea of the original telephone network, where there are endpoints that we connect to. Instead of an evenly distributed network, we have a collection of star networks, where many people connect to a single telephone number.

Why did this happen? Companies recognized that if they are the endpoint that everyone uses, they have to be able to deal with all that traffic, but they get an excellent overview of what all that traffic is doing. Scaling up to handle the traffic is more than offset by the gains made by having more traffic information.

It comes from economies of scale. Going from one user to 100 costs more, per user, than going from 100 to 10,000. That makes the "complexity tax" regressive: companies pay less and less to gain more and more users. There is a tipping point where that process "becomes very profitable" from advertising and things like that, he said.

Krekel quoted former Facebook researcher Jeff Hammerbacher, who said: "The best minds of my generation are thinking about how to make people click ads. That sucks." Instead of spending time on "getting us into space, flying cars, or whatever", the best minds in IT are focused on how to get people to click more ads.

So, we have ended up with million (or billion) to one architectures on the web. Lots of startup companies are trying to become one of the mediators of the traffic, but the impetus behind the traffic is that people want to connect with other people. They want to view videos or communicate with text and pictures, but they do that through YouTube, Twitter, and the like. On a social level, people are "peer to peer", but today there are intermediaries that monetize those interactions and profit from it.

The future

Returning to the subject of space, Krekel noted that Elon Musk wants to get humanity to Mars by 2026. Do we think that 41-year-old technology like TCP/IP or the 21-year-old HTTP will work on Mars? Can you use Gmail as a web app on Mars? Someone in the audience suggested that it would just take "patience", which elicited widespread laughter. Krekel said that the protocols we have will not work on Mars.

But we already have Mars on Earth in places where internet connectivity is not all that good. In 1981, there were 300 computers connected to the internet, but now there are billions of devices in the world that are still using this phone-based model. It turns out that's not actually true, he said; some are following other models. There are communication and synchronization mechanisms that some of these devices are using to transfer data directly between them without using the internet.

For example, you can synchronize your mobile phone and laptop directly without using some remote server. Sometimes it is more practical to use a remote server "in California somewhere" to transfer files between two local devices, but there are ways to avoid having to do that. These mechanisms don't use standard protocols, but instead use some proprietary protocol. It is much more efficient to transfer files locally, especially given that upload bandwidth is often much smaller than that for downloads.

There is an organization based in Berlin called Offline First that has recognized that our endpoints have become much more powerful, with lots of local connectivity, so that it doesn't make sense to make connections across the world to talk to something local. People want local applications that work, even if they are not connected to the internet. At some point, the device will be able to get a connection to the net; when it does the application can simply synchronize. Like its name implies, the group is focused on an offline-first strategy.

If you look at successful projects over the last ten years, many are using synchronization and replication techniques that don't work according to the client-server paradigm. Git is a good example, he said, since it stores the whole history locally and allows local changes that eventually get synced, which is offline-first thinking.

Another example of distributed networks is BitTorrent, which came out of the realization that you shouldn't have to make a phone call back to California to get a video. Others already have the data, but you just aren't talking to them. Instead, with BitTorrent, people can register hashes of the data they have and others can get it locally, which is much more efficient. At one point, BitTorrent traffic was half of all internet traffic.

There are other projects that use hashes to identify data, including ZFS, Bitcoin, and Tahoe-LAFS. They are all based on Merkle trees, which are trees of hashes. We have "reasonably safe" cryptographic hashes, Krekel said, which can be used to hash data blocks; those hashes can be hashed to identify files, directories can be identified by the hash of their file hashes, and so on. He wryly pointed out that this Merkle is not the same as the Chancellor of Germany (Angela Merkel); "I totally disagree with her politics", he said to applause.

Immutability

Merkle trees are an immutable (unchangeable) data structure. If you change one of the data blocks, all of the hashes in the path to the root of the tree must change, including the root. But that root hash uniquely identifies the whole tree and any corruption of data during transfer can be easily detected by simply verifying the hashes.

Immutability of data structures is also a property of some programming languages. In nearly every language that has been created or become popular over the last ten years, immutability is a key feature of the language. It helps with scalability by allowing parallel operations. In addition, programming with immutable data structures is safer. There is a project called Pyrsistent that provides immutable dictionaries, sets, and such to Python, which allows experimenting with immutability.
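The flavor of working with immutable mappings can be shown with the standard library alone; Pyrsistent provides richer persistent structures, but a read-only mapping view already illustrates the "update by building a new value" style:

```python
from types import MappingProxyType

# A read-only view over a dictionary: mutation attempts raise
# TypeError, so sharing it between threads or callers is safe.
frozen = MappingProxyType({"x": 1, "y": 2})

try:
    frozen["z"] = 3
except TypeError:
    pass                      # mappingproxy does not support item assignment

# "Updating" means constructing a new mapping; the old one is untouched.
updated = MappingProxyType({**frozen, "z": 3})
print(frozen.get("z"), updated["z"])  # None 3
```

Pyrsistent's pmap, pvector, and pset follow the same pattern but share structure between versions, which keeps such updates cheap.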

Krekel then turned to the last entry in "The Zen of Python": "Namespaces are one honking great idea -- let's do more of those!". He noted that he loved the introspection features of Python and that it was "namespaces all the way down". Classes are just dictionaries, as are objects and modules, and all of that can be inspected programmatically. Creating his own implementation of that was part of his motivation for co-founding PyPy.
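The "namespaces all the way down" observation is easy to see interactively; a hypothetical Point class serves to show that objects, classes, and modules all expose their namespace as a dict-like mapping:

```python
import math

class Point:
    def __init__(self, x, y):
        self.x, self.y = x, y

p = Point(1, 2)

# An instance's attributes live in a plain dict...
print(p.__dict__)                    # {'x': 1, 'y': 2}

# ...a class's methods live in a (read-only) mapping...
print("__init__" in Point.__dict__)  # True

# ...and a module's names live in an ordinary dict as well.
print(type(math.__dict__))           # <class 'dict'>
```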

In thinking of "more of those", he has come up with a nascent idea about "immutable namespaces" for Python. Having a reference to the namespace would mean that nothing it referred to could ever be changed—it would be like a Git commit of the contained namespaces. It is worthwhile to see how this might be beneficial and could be a step toward removing the global interpreter lock (GIL) from Python. It is a "vague idea", but even if it doesn't work out, thinking about immutable data, perhaps combined with namespaces, is something that will make programs easier to reason about.

IPFS

A new peer-to-peer protocol, the InterPlanetary File System or IPFS, was next up. Instead of location-based addressing using names (like http://lwn.net/...), IPFS uses content-based addressing (ipfs://<hash>/...). So instead of asking to connect to a phone number, users ask for a particular piece of content wherever it is stored. They don't need to verify the sender of the data since they can validate the hash of the content returned.
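The core property of content addressing can be sketched with a toy in-memory store. This is an illustration of the principle, not of the IPFS protocol itself, which uses its own multihash format and networking:

```python
import hashlib

class ContentStore:
    """Toy content-addressed store: data is looked up by its hash,
    so a response can be verified no matter which peer served it."""

    def __init__(self):
        self._blobs = {}

    def put(self, data: bytes) -> str:
        # The "address" is derived from the content, not a location.
        addr = hashlib.sha256(data).hexdigest()
        self._blobs[addr] = data
        return addr

    def get(self, addr: str) -> bytes:
        data = self._blobs[addr]
        # The fetcher validates the content without trusting the sender.
        assert hashlib.sha256(data).hexdigest() == addr
        return data

store = ContentStore()
addr = store.put(b"hello, interplanetary world")
assert store.get(addr) == b"hello, interplanetary world"
```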

But hash values are even harder to remember than domain names (or phone numbers), so there needs to be another layer that maps names to hashes. The current scheme uses mutable hashes stored as TXT records in the DNS that map to the actual immutable hash of the content. Mutable hashes are used so that the content can change without requiring an update to the DNS entry for a given domain. That scheme is called IPNS (which doesn't seem to have a web page) and is based on the naming used by the Self-certifying File System (SFS).
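The mutable naming layer on top of immutable content can be sketched as a simple mapping from a human-readable name to the current content hash, analogous to a DNS TXT record pointing at the latest hash (the domain name here is hypothetical):

```python
import hashlib

def content_addr(data: bytes) -> str:
    """The immutable, content-derived address of a piece of data."""
    return hashlib.sha256(data).hexdigest()

# A mutable name layer: the name stays fixed while the content
# address it points to is updated, much like a DNS TXT record.
names = {}
names["docs.example.org"] = content_addr(b"version 1")

# Publishing new content updates the pointer, not the name.
names["docs.example.org"] = content_addr(b"version 2")
assert names["docs.example.org"] == content_addr(b"version 2")
```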

IPFS is a work in progress, but it can be used today. It currently uses IP and DNS, but can operate over other protocols when they become available. For example, Namecoin might be used instead of DNS someday. Data in IPFS is exchanged using a BitTorrent-like mechanism; routing is handled using a distributed hash table (DHT) or with Multicast DNS (mDNS) for purely local transfers.

There is still an issue about how to bootstrap a list of DHT nodes. If you think about the offline-first scenario, where devices are not connected for days or weeks, there will be changes in which IP addresses are participating in the DHT. Peer-to-peer networks solve the problem by having stable nodes that are always available, but that is not a decentralized solution.

He pointed to a blog post by Adam Ierymenko that talks about the problem. In it, Ierymenko suggested the idea of a "blind idiot god" for the internet. It would be a minimal centralized resource that could be used to solve the bootstrapping issue, but the key is that it would need to do so without being able to see much of the information it was handling—provably. It is a tall order and there is an open debate on how to do it, Krekel said.

Back to rocket science

Instead of blaming Google and Facebook, which have provided great services, released open-source software, and given good jobs to many in the industry, he said, we should just replace them with something decentralized. He quoted Buckminster Fuller ("You never change things by fighting the existing reality. To change something, build a new model that makes the existing model obsolete.") and agreed with that sentiment. We should just build something better, he said.

There is still a lot of innovation going on, but the pace seems to have slowed since the 1970s. That is borne out by the scientific-publication numbers he mentioned earlier in the talk, he said. A book by David Graeber called The Utopia of Rules has lots of research showing that the rate of innovation has "kind of leveled off". If you look at the changes from 1910 to 1960 or 1970 and compare them with the changes since, many of the things that were expected have not arrived. We set the speed record for humans and haven't surpassed it (or done much in the way of a real space program) since.

This and other examples contradict the idea that we are innovating exponentially and making huge technical advances—doing rocket science, essentially. Progress is being made in specific areas (there are more and more ways to scale up to million-to-one architectures, for example), but it tends to be focused on monetization rather than on basic research for things like the space program.

There may be things going on today that do hearken back to the 1970s, though. Back then, a few people created the internet protocol and changed the world in a fundamental way. That might be happening again with things like IPFS. While IPFS may not be successful, or even the right solution, he thinks the right solution will be something decentralized that looks similar to where IPFS is headed.
