Using a mix of man and machine, we are working to create a great summary experience. The process starts with curating news articles. We combine our personal preferences with analysis of our preferred websites and of article activity on Twitter, Reddit, and similar platforms to find articles that meet our criteria. This first part is not an exact science; our goal is simply to find correlations between the types of articles we pick and user feedback, so that we can pick better articles in the future.

Once an article has been chosen, it goes through several steps, using a number of services, to produce a quality summary. We first analyze the article, then scrape it into a clean, text-only format. This works most of the time, but every website structures its articles differently, so we maintain several different methods for extracting the content.
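To make the scraping step concrete, here is a minimal sketch of one way to pull clean, text-only content out of an article page. This is an illustration, not our actual scraper: it uses only Python's standard-library `html.parser`, collects `<p>` text, and skips obvious non-content regions (`nav`, `script`, and so on). The class and function names are hypothetical, and a real pipeline layers per-site rules on top of a generic pass like this.

```python
from html.parser import HTMLParser


class ArticleTextExtractor(HTMLParser):
    """Collect paragraph text, ignoring common non-content containers.

    Hypothetical sketch: real sites each need their own extraction rules,
    which is why a production scraper keeps several methods on hand.
    """

    SKIP_TAGS = {"script", "style", "nav", "aside", "footer", "header"}

    def __init__(self):
        super().__init__()
        self.paragraphs = []   # finished paragraphs of clean text
        self._buffer = []      # text fragments for the paragraph in progress
        self._in_paragraph = False
        self._skip_depth = 0   # >0 while inside a skipped container

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP_TAGS:
            self._skip_depth += 1
        elif tag == "p" and self._skip_depth == 0:
            self._in_paragraph = True
            self._buffer = []

    def handle_endtag(self, tag):
        if tag in self.SKIP_TAGS and self._skip_depth > 0:
            self._skip_depth -= 1
        elif tag == "p" and self._in_paragraph:
            text = "".join(self._buffer).strip()
            if text:
                self.paragraphs.append(text)
            self._in_paragraph = False

    def handle_data(self, data):
        if self._in_paragraph and self._skip_depth == 0:
            self._buffer.append(data)


def extract_text(html: str) -> str:
    """Return the article body as plain text, one blank line per paragraph."""
    parser = ArticleTextExtractor()
    parser.feed(html)
    return "\n\n".join(parser.paragraphs)


sample = """
<html><body>
  <nav><p>Home | News | About</p></nav>
  <article>
    <p>First paragraph of the story.</p>
    <script>track();</script>
    <p>Second paragraph.</p>
  </article>
</body></html>
"""
print(extract_text(sample))
```

Running this prints the two story paragraphs and drops the navigation text, which is the essence of the "clean text-only format" step; the per-site variation the text mentions shows up as additional `SKIP_TAGS` entries or site-specific selectors.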