Automation and algorithms already play a big role in the media industry. Programmatic advertising, algorithm-operated front pages and machine translation are some of the ways automation has infiltrated the news desk.

So-called “robot journalists” are another booming example of automation in the newsroom. The name is a bit misleading, since it is neither a robot nor really a journalist, but an algorithm. Using Natural Language Generation (NLG), the algorithm produces text from a pool of structured data, following a set of predefined rules and templates. Done right, it is a great tool for journalists and editors, freeing up time for more valuable work.
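The rules-and-templates idea can be sketched in a few lines. This is an illustrative toy, not any vendor's actual system: a rule inspects the data and picks a phrasing, and the result is slotted into a pre-written template. All names and fields here are hypothetical.

```python
# Toy template-based NLG: a rule chooses wording from the data,
# then a fixed template is filled in with the values.

def describe_change(pct: float) -> str:
    """Rule: pick a verb from the sign and size of the change."""
    if pct >= 5:
        return "surged"
    if pct > 0:
        return "rose"
    if pct == 0:
        return "was unchanged"
    if pct > -5:
        return "fell"
    return "plunged"

TEMPLATE = "{company} shares {verb} {pct:.1f}% on {day}, closing at {close:.2f}."

def generate(record: dict) -> str:
    """Fill the template from one structured data record."""
    return TEMPLATE.format(
        company=record["company"],
        verb=describe_change(record["pct"]),
        pct=abs(record["pct"]),
        day=record["day"],
        close=record["close"],
    )

print(generate({"company": "Acme", "pct": 6.2, "day": "Monday", "close": 104.50}))
# → Acme shares surged 6.2% on Monday, closing at 104.50.
```

Production systems layer many such rules and templates, but the core loop is the same: structured data in, rule-selected phrasing, template out.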

Even though the field of Natural Language Generation has existed for over 40 years, commercial applications of the technology have only been mainstream for six or seven years. As the connected world produces exponentially more data, the potential of NLG tools grows with it. Companies like Automated Insights, Arria and Narrative Science have taken the lead in bringing NLG technology to market.

Within journalism, NLG technology didn’t really break through until 2014, when the LA Times launched its QuakeBot, which pulled data on larger earthquakes and slotted it into pre-written templates. QuakeBot’s main objective was speed: getting a report to the public as fast as possible.
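A QuakeBot-style pipeline can be sketched as follows. This is a hypothetical reconstruction, not the LA Times code; the field names and thresholds are assumptions. The point is that once the template exists, a first report can go out seconds after the data arrives.

```python
# Illustrative earthquake-report sketch (hypothetical fields and thresholds):
# an incoming data record is dropped straight into a pre-written template.

def quake_report(q: dict) -> str:
    """Turn one earthquake record into a short, publishable sentence."""
    # Rule: classify the event by magnitude to pick an adjective.
    if q["magnitude"] >= 7.0:
        size = "major"
    elif q["magnitude"] >= 5.0:
        size = "moderate"
    else:
        size = "minor"
    return (
        f"A {size} earthquake with a preliminary magnitude of {q['magnitude']:.1f} "
        f"struck {q['distance_km']} km from {q['place']} at {q['time']}."
    )

print(quake_report({
    "magnitude": 5.3,
    "distance_km": 12,
    "place": "Westwood, California",
    "time": "08:14",
}))
# → A moderate earthquake with a preliminary magnitude of 5.3 struck 12 km from Westwood, California at 08:14.
```

Because everything except the numbers is written in advance, speed is limited only by how fast the data feed delivers the alert.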

This was the backdrop when we teamed up with NTB to create our own robot. The ambitious goal was for the robot to produce summaries free of blatant errors, good enough to skip the editor and be distributed straight to NTB’s customers. By the end of the project we had reached this milestone.