The (Old) Scientific Method

Hey guys.

Listened to an awesome podcast today by Economist Radio’s Charles Babbage, with the same title as this post, and the punchline is a rather jarring one.

Especially if you are some empirical traditionalist, or high school science teacher.

The advent of artificial intelligence just may upend the entire paradigm of knowledge gathering as we know it.

Kind of like how the printing press changed what it meant to write a book, or how the internet radically altered how we consume and share information.

That’s a Big Claim Buddy

Okay, granted.

But before we get there, what exactly is science?

Simply a set of rules?

Or some methodology we’ve adopted?

As explained in the podcast (btw, you should really give it a listen here), it is the process of accumulating a set of agreed-upon facts, from which we can produce a theory.

A process begun by Galileo, and systematized by Francis Bacon (also known as the ‘father of empiricism’).

The always cool — Mr. Neil DeGrasse Tyson

“Science is a way of querying nature using methods and tools to establish something that is objectively true…” - Neil DeGrasse Tyson

But, in order to get to the meat of what we collectively know, we have to come up with a system of knowing.

This being: hypothesis, experiment, observation, and theory.

And so, we have the oft-revered scientific method.

But really and truly, the scientific method is just one tool in the grander box we have for discovering universal truths in a way that is reproducible.

Though, not the most efficient. Most religions simply say “Hey you. Here’s truth. Believe it. Have faith.”

Language can do better, as it distills and disperses the notions inherent in a population, making it possible for us to create a sort of marketplace of ideas.

Though, the limitations are quite obvious.

For one, our minds are not computers. We make prejudgments about our world all the time.

Thus, even scientists enter their sanctified experiment rooms rife with preconceptions and biases.

It’s not wrong, per se.

Just human.

What I’m trying to say is that it’s impossible, even for the most pragmatic and objective of us, to rid ourselves of the need to understand what is not yet understood.

Which inherently leads to biases popping up, whether latent or explicit. These biases can then profoundly alter the direction of whatever experiments you decide to conduct.

This is the first ‘advantage’ of a supercomputer running the experiments for us, because computers see inputs and outputs as they are — unrelated.

That is, until they become so.