
All code ultimately comes down to electric charges, so should you learn physics in order to program?

Tl;dr: Learn the high-level stuff first, as long as it forms a complete abstraction layer.

Previously, I discussed the languages a beginner needs to learn to create a web application with Rails (such as HTML and Ruby), but I skipped over more fundamental questions. Perhaps we should follow universities and require people to first study many basic topics before coming near something practical like web development? The Web Development curriculum would start with various branches of logic and math, move on to assembly languages, C programming and operating systems, and after a few years get up to Ruby on Rails. Alternatively, we could follow the practice of some tutorials and start building a complete application right away before even covering programming basics, Ruby or HTML. When every technology is built on levels beneath it, what is the balance of prerequisites one should learn before jumping into a higher subject?

The Complete Layer

I think what's important is having a complete abstraction layer to work with. Learn the highest level that can serve your purposes without requiring you to dig beneath it. For example, every general-purpose language (such as Java, Ruby or JavaScript) provides a complete abstraction layer to build with. While these high-level languages may be implemented in lower-level languages, you don't need to worry about those details when coding. You can build a wide range of applications in Ruby without ever touching the C code that it may be interpreted with. Ruby is not a leaky abstraction; it's a complete package. In certain cases a specialist could use their knowledge of the lower-level details to optimize code, but this is not something a beginner needs to worry about. Learn how to build applications first; you can pick up advanced optimization techniques later. After all, "Premature optimization is the root of all evil". 1
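To make this concrete, here's a minimal sketch of Ruby as a complete abstraction: string building and memory management just happen, with no hint of the allocation and bookkeeping the interpreter performs in C underneath (the names are purely illustrative):

```ruby
# In Ruby you work entirely at the level of objects and blocks --
# no malloc/free, no buffer sizes, no pointer arithmetic.
names = ["Ada", "Grace", "Alan"]
greetings = names.map { |n| "Hello, #{n}!" }

puts greetings
# Hello, Ada!
# Hello, Grace!
# Hello, Alan!
```

Every one of those lines would take significantly more code, and more opportunities for bugs, in C; the beginner never needs to know that.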

Ruby or Rails First?

If a level does not form a complete abstraction layer, you will need to learn more than one layer to create things. Yet you can often choose between learning the higher level or the lower level first. For example, one could start learning Ruby on Rails before even learning Ruby, the language it's written in. While this is possible, I don't think it makes sense for people new to programming. Instead, you should get used to programming by creating simple Ruby programs before getting lost trying to tackle a complex framework in a language you don't understand. After you've coded with Ruby, you can then tackle Ruby on Rails one part at a time.
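A "simple Ruby program" here might be nothing more than a small, self-contained class — no web server, no framework, just the language (the `Greeter` class is my own illustration):

```ruby
# A first Ruby program: plain objects and methods, nothing else.
class Greeter
  def initialize(name)
    @name = name
  end

  def greet
    "Hello, #{@name}!"
  end
end

puts Greeter.new("world").greet  # prints "Hello, world!"
```

Classes, methods, and instance variables like these are exactly what a beginner will meet again inside Rails models and controllers, so this practice carries over directly.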

SQL or ActiveRecord?

SQL is used to communicate with databases, but Rails provides an alternative library, ActiveRecord, to generate the SQL for you. ActiveRecord does not form a complete abstraction layer, since there are useful database actions it cannot perform. A professional Rails developer will need to learn both: ActiveRecord for concise, clear code, and SQL for advanced queries. What should a beginner learn first? As discussed before, I think beginners can start with ActiveRecord, since it will be easier for them initially and it can handle everything a basic app requires without any hand-coded SQL. Programming students can learn SQL syntax later, when they need it. (I recognize people could have different opinions in this area, since ActiveRecord cannot actually do everything SQL can.)
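As a sketch of the difference, here's the same query both ways, assuming a hypothetical `User` model backed by a `users` table. This fragment only runs inside a Rails app, so treat it as illustration rather than a standalone program:

```ruby
# Inside a Rails app with a (hypothetical) User model:

# ActiveRecord -- concise, and the SQL is generated for you:
admins = User.where(role: "admin").order(:name)

# Roughly the SQL ActiveRecord generates underneath:
#   SELECT "users".* FROM "users"
#   WHERE "users"."role" = 'admin'
#   ORDER BY "users"."name" ASC

# And when you eventually need something ActiveRecord can't express,
# you can always drop down to raw SQL:
rows = ActiveRecord::Base.connection.select_all(
  "SELECT role, COUNT(*) AS total FROM users GROUP BY role"
)
```

The beginner can live entirely in the first style for a basic app; the escape hatch is there when it's needed.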

Abstracting Abstractions

This basic idea of abstraction can be further abstracted and applied to other areas. If a subject can be applied fully at a certain level (without going into lower-level details), it can be learned at that level. For example, there are many areas of math that can be done by computers. In some of these areas, the student could benefit from understanding the concepts the math is built on so as to apply it to new cases. However, in many areas the techniques used to solve the problem aren't relevant, and the computer forms a complete abstraction layer on top of the mathematical details. For example, there are many techniques for calculating integrals, but these aren't relevant to the fundamental study of calculus. Students can just learn how to use computers to solve such problems; they don't need to learn how to solve them manually. Solving problems with paper and pencil when you can use a computer is like writing machine code when you can use Ruby or Python.
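As a small sketch of what "letting the computer do it" looks like, here's a generic numerical integrator in Ruby using the trapezoid rule. The student only needs to know what an integral means, not any symbolic integration technique:

```ruby
# Numerical integration via the trapezoid rule: the computer does the
# calculation; no integration-by-parts or substitution tricks required.
def integrate(a, b, steps: 10_000)
  h = (b - a) / steps.to_f
  sum = (yield(a) + yield(b)) / 2.0
  (1...steps).each { |i| sum += yield(a + i * h) }
  sum * h
end

puts integrate(0, 1) { |x| x**2 }  # ≈ 0.33333 (the exact value is 1/3)
```

The point isn't this particular algorithm (a real student would reach for an existing tool); it's that the mechanics of solving the integral sit entirely below the abstraction layer the student works at.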

Humanity has come a long way since we scribbled equations on stones. This has only been possible by building abstraction on top of abstraction. A topic that was essential in the past could have been abstracted away since then. Let people learn what they need or what they're interested in, don't make them learn the past. People who remember the past too much are doomed to repeat it.

1. Donald Knuth, on optimizing your code too soon. Not saying he would necessarily agree with my application.

Interested in learning the essential principles and practical aspects of web development? Check out my Kickstarter!