About this talk

Explore the history of .Net Core, and learn how to successfully develop and deploy on it with minimal manoeuvrings. In this talk you'll get two years of professional .NET Core experience distilled down into one juicy, brain-oxygenating session.

Transcript

- First of all, thank you to Jason and to Rick, who invited me to speak to you here today. It's a fantastic turnout. Actually it's awesome to see so many of you here, wanting to learn about .NET Core. I really believe that .NET Core is our chance to rejuvenate the .NET brand. For a long time, .NET has been seen as this legacy, enterprise platform, with people working on monolithic line-of-business applications that were ported over from VB6 back in 2002. The startups, the cool kids, the market disruptors aren't generally known for their use of .NET, Just Eat notwithstanding. I think .NET Core is our chance to change that, and I think in three, four, five years' time, when you're looking for a new role, the new, greenfield projects, the big re-architecting work, the sort of companies advertising for those roles are not really gonna be looking for people who only know traditional .NET. They're gonna be looking for people who know .NET Core. You might be working somewhere where moving to .NET Core is impractical for technical or commercial reasons, and I can't do anything about that. What I can do, and what I hopefully will do today, is give you an overview of .NET Core, give you an overview of what we at Sun Branding have been doing with it, and some of the challenges and opportunities we've found, and hopefully spark a bit of interest, and give you enough of an interest to either propose it at work or start playing around with it in your own time. So, first of all, who am I and why am I here talking about .NET Core, other than the fact there is free beer and free pizza, which is always a good reason to be anywhere? So, I'm the Development Manager at Sun Branding Solutions. Sun Branding Solutions, if you haven't heard of us, and, to be fair, probably none of you have heard of us, we're a bit like that legacy .NET platform that I just talked about. We've been around for ages. We were founded in 1893, I think it was.
We have been through many, many name changes, and we have been through many, many different technologies over the years. Originally we were founded for photo engraving and plate making, which involved people physically carving images or reproductions of photos into sheets of wax that were then used to create brass plates that would actually print the images on paper. And, over the years, that's evolved, and these days things are all digital. People work on Macs, MacBooks, iMacs, they use Adobe Illustrator, and the last physical printing press disappeared from the office back in 2007. And what that shows, if you needed to be told, is that change is a constant. The procedures and practices that we used back in 1893 aren't really of any value or interest now, unless you're a historian. In the same way, my digital business unit produces software for our clients. This software has been around since the year 2000 in its earliest form, and it started out in Visual Basic 6, Classic ASP, and COM+. There is a general perception in the industry, I think, that change equals risk and thus needs to be avoided. My counter to that is that that attitude meant we were still using VB6 and Classic ASP up to about the year 2013. At that point the entire project was deemed more or less incapable of being maintained and incapable of being upgraded. The risk of not changing, I felt, and we made the argument to the board, was greater than the risk of changing incrementally. We set out to rewrite the application, and we did so over the course of about two years. We wrote it using what was then the latest and greatest, MVC 5 and Azure Platform as a Service, but we also set out with the non-functional requirement of staying as up-to-date as feasible in all the technologies that we were using. So, as soon as new packages came out, we would upgrade, provided there were no critical breaking changes. If there were breaking changes, we would solve those and upgrade anyway.
So, we set out to devote a small portion of our time as developers to constantly keeping things upgraded, because not doing that would mean that in, say, 2023, we would be in the same position again, looking at another full rewrite, or struggling to find developers who still wanted to work on MVC 5. So, where does .NET Core come in? Back in 2015, all the buzz was around .NET Core. I personally like playing around with the latest-and-greatest stuff, and it looked interesting, so I started having a look into it. This was back in the early Project K days. It wasn't production ready; arguably it hasn't really been production ready until maybe late last year. We set out to do a black-ops style re-architecting of our project to take advantage of .NET Core. That ran in the background, in parallel with our regular day-to-day work, with a few of us working evenings and weekends, and the odd bit of time in the office when things were quiet. It taught us a lot of interesting things about our application, but it also meant that for the last two years or so we've been working pretty much day to day with .NET Core. By the time .NET Core Beta 8 came around, we started deploying some applications into production. I did not tell my CIO this; he would have had kittens. They weren't particularly important, but they worked, and they've been upgraded ever since. It's been painful. There was the move from the so-called alphas to the so-called betas, which were really alphas; God knows what the actual alphas were. Every time, something was broken. There have been things that have not worked, and things that have not worked with no documentation about it other than obscure tweets or blog posts. But doing it step by step has meant that by the time .NET Core 1.0 came around, we pretty much knew what we were doing, and we already had some applications in production using Preview 1 by then.
So, by no means do we know everything about .NET Core, but we've been using it for a while. We are using it in production and have been for well over a year now. So, really this is a chance for me to share some of the lessons we've learned and some of the insights we've gained. We're gonna talk a little bit about the history and the reasoning behind .NET Core, what it is and how it works. We're gonna look at a live demo of ASP.NET Core features. We're gonna do that after the break, and that's basically gonna be me with Visual Studio and an application running, and we'll merge that with a question-and-answer session because I'm sure you'll have questions as you're looking at the code. We'll look at some strategies for migrating to .NET Core, and then questions, conclusions, and so on. So, it's been a bit of a turbulent history with .NET Core, as you might have gathered if you've used any of the betas or previews, or followed David Fowler or Damian Edwards on Twitter. We start off with the Project K era, when what became ASP.NET Core was called ASP.NET 5, and it's all about creating a new ASP.NET. .NET isn't really in the picture yet; it's just about the web. It's really, in my opinion, a way for .NET to try and out-Node Node.js, because everyone's using Node.js at this point. You might still find blog posts and mentions of KRE and KVM. And then we move into the project.json era, with dnx (the .NET Execution Environment) and dnvm (the .NET Version Manager), and the focus starts to shift away from just ASP.NET and more towards the full .NET platform. It's still really only websites and command-line applications at this point, but the focus is shifting. Then the dotnet CLI comes out. Back in the dnx and dnvm era, those commands were actually just PowerShell scripts that got put on your path. The dotnet CLI, by contrast, is a full application that's installed on your path; it's more of a first-class citizen.
At this point, I believe, the ASP.NET and .NET teams actually merge, and they all come together under Scott Hunter, and they start looking at how to actually make what they call 1.NET. So, it's no longer ASP.NET on its own; it's now this is the way that .NET is going. That scuppers all their plans for going live at this point. So, they were getting a beta together to go live here, and then suddenly there's what I think is a big set of discussions going on behind the scenes at Microsoft, and a little bit of a change of direction. As evidenced by the whole saga of the CSPROJ, if anyone was following at that point: back over here, originally it was envisaged that ASP.NET Core would not use MSBuild, and would not use CSPROJ files. One of the great selling points was that, unlike Full .NET, you wouldn't have to add every single item in your website to your CSPROJ, and have to deal with the inevitable merge conflicts when your UX developer puts something in source control and doesn't add it to the CSPROJ file. And I think a lot of people started complaining then, because they had build pipelines and tooling that worked on MSBuild and CSPROJ files. And, so, it was a huge controversy. CSPROJ is back, but they've cleaned it up, so you don't have every individual file reference in there. You don't have hundreds and hundreds of arcane GUIDs in there anymore. It now looks sort of, almost, nice. Not quite as nice as project.json, but I can see why they went that way. And then, finally, we get the big 1.0 release sometime last year. Currently they're gearing up for the big 2.0 release. I was hoping that would have come and gone by now, and it hasn't, so I'm mainly gonna talk about 1.0. The latest .NET Rocks episode, which came out today (I saw it when I was at the station getting a train over here), is all about .NET Core 2. So, if you're interested in that, it is a good thing to go and listen to. It's got, I think, David Fowler on it, who is Mr. .NET Core.
For those of you who have been around for a while, you might remember this back in 2007: Scott Guthrie announcing ASP.NET MVC, the very first beta of that. Back in those days, I was a Web Forms developer, and I hated it. I still do hate Web Forms, although I hate them less than I used to. I'd originally come from a PHP background. I knew HTML, I knew CSS, I knew how HTTP worked. I did not like the fact that Web Forms took all that away from me and injected tables, and divs, and hidden script tags into the pages. I didn't like that I had to have all the associated tooling to set all that up. MVC was a breath of fresh air. Suddenly there was this nice, clean, new framework. You could have a nice, clean project structure. You had complete control over HTML, and there was a lot of controversy over that at the time. I remember a support developer in our company at the time going, "What, you can't find out what page it is just by looking at the path in the browser? How can we look up what the view is?" Another architect that I knew later on said, "No, MVC is never gonna take off; most shops will use Web Forms, and MVC will be used by a small minority who want to do complex stuff." If you look at the job postings that are around these days, and look at the number that are for Web Forms, and then of that number, look at the number that are for new projects, and of that, the number that are for exciting projects, you probably get to a very low number indeed. And I think that's shown that MVC pretty much won that argument. At the time, it was controversial. People were saying, "You are forking the .NET code base. Are you gonna be able to support both Web Forms and MVC?" And people were saying, "Why should I learn this stuff?" The answer is: because it will probably win, because, A, arguably it's better, and also developers like new and shiny stuff, so they will gravitate to stuff like this, and I certainly did.
You can argue, and I think I would argue, that MVC was a reaction to the popularity at the time of Ruby on Rails, which was the hot new thing back in 2007. In the same vein, I think .NET Core is really a reaction to the popularity of Node.js, and the fact that developers all want to use MacBooks these days, which restricts Microsoft's market. There are a lot of reasons behind it, Node being a big one. It is a better developer experience, I think, or it certainly makes you feel more of an elite hacker if you can go to the command line, create a new project, add packages to it, run it, and see all the console output scrolling down your window. It feels more like being a proper developer than going into Visual Studio and hitting F5. It's lightweight: you can use Sublime Text to edit your files, so you don't have to wait for VS to open and load all its plugins. Docker: suddenly the emphasis is on small applications that can be portable, that can run with minimal dependencies. Well, that ain't .NET. You've gotta have IIS installed, you've gotta configure IIS, you've gotta install the right version of the Full .NET Framework and the correct Windows security patches. And like I say, when ASP.NET was first written, it was written with the explicit goal that you could take a Classic ASP page, put an x on the end of the file name, and it would work as an ASP.NET page. All that interoperability, all that backwards compatibility that Microsoft is famous for, is still in there, and it's still being evaluated in the runtime. Visual Studio Code: Microsoft's answer to lightweight text editors like Sublime. Again, it doesn't have all the tooling that Visual Studio traditionally has. Azure and DevOps: we are not in a world anymore where we write the code and the IT admin deploys it on a special set of servers that only he or she knows how to admin.
The era of special snowflake servers, the ones you don't touch because they work, and if you touch them, they stop working, and we tried building another one, it doesn't work, we are never building another one, you deploy on these servers, please and thank you. That's going away, because why would you do that? Why would you rely on having an admin or a set of admins running all that when you can simply script it and automate it? Classic .NET is hard to automate. If you've ever tried setting up a build server without installing Visual Studio on the build server machine, you will know it's not easy. You've gotta have various SDKs around, and ultimately you end up having to install Visual Studio, most of the time. They needed a solution that was lightweight and scriptable, and that is pretty much what they've got to with .NET Core. So, what is it? It's not just one thing, that's the first thing to say. When I say .NET Core, I pretty much mean all of this. On the infrastructure layer, you've got things like Roslyn and MSBuild. Roslyn is a totally rewritten C# compiler, written in .NET, that compiles .NET. That bakes my noodle, I do not understand it. It is witchcraft, but it seems to work, and it looks very nice. MSBuild has been completely rewritten. It's now open source; arguably, should they have done that? I don't know, but it's open source. It's now the foundation for the project system in .NET Core and .NET. .NET Standard, we'll get on to that later in detail, but that's basically a way of standardising the .NET APIs across all platforms so that we can hopefully, finally, get to the goal of writing a library once and running it on any .NET. We've got the Core CLR. This is a cross-platform port of the .NET CLR, the one that's been around since 2002, the thing that takes the IL your code compiles down to and actually runs it. It works on Windows, Linux, and OS X, and it's fully open source. They do accept pull requests.
On top of that, we have the Core Framework, sometimes called the Framework Class Libraries. So, these are the things that you would normally reference as assemblies: System.IO, System.Text, System.Reflection, all that stuff that you would generally consider the .NET Framework. Again, all open source. Unlike in the past, each of those is now delivered as a separate package. So, with .NET Core, you don't just install .NET Core. You install .NET Core if you want it on your machine to develop on, but each of those libraries you can bring down as a separate package. If you think about the Full Framework, you don't really think about where the assemblies live. If you really thought about it, you would say it's in C:\Windows\Microsoft.NET\Framework\<version number>, and they're probably all in there somewhere. With .NET Core you're bringing down an explicit version of, say, System.Data, or System.Data.SqlClient. The reasoning behind that is that they can iterate faster over those packages. So, if there's a critical security flaw found in System.Data.SqlClient, for example, they can issue a patch as a new package version, and we as application developers can simply install it and deploy the application. We don't have to go through all the paperwork of getting the administrator to install the relevant security patch, go through the testing, and get it approved by all the different levels of policy that it usually has to go through. On top of all that, we have the application layer of things. ASP.NET Core: more or less a replacement for MVC and Web API. We have Entity Framework Core. A full talk on that would take way longer than we have today. It is very cool. If you're interested, a lady called Julie Lerman is very, very good on that, so follow her on Twitter or read her blog. The Kestrel web server: you do not need IIS to develop ASP.NET Core, and you don't technically need IIS to run or host .NET Core either. And config and logging abstractions.
These are replacements for things like System.Diagnostics and System.Configuration. These are great; they're much better than their predecessors. I'll get on to those in much more detail when we go into the live coding, but the beauty of them is that Microsoft is moving towards providing standard abstractions that people can plug into. So, now if you're starting a project and you say, what are we doing about logging, actually that becomes a decision you can defer until pretty much the end of the project, because you use the common abstractions, and then at the end you plug in where you actually want to write your logs to, which is a really nice model. So, a quick screenshot showing some of these Framework Class Libraries on NuGet, System.IO and so on; same as any other NuGet package, they're all up there, and the repositories are also available on GitHub, as we see here. This screenshot was taken a couple of months ago, when I first did this talk. Back then, a chap called Frans Bouma, who maintains an ORM called LLBLGen Pro, had just submitted, and had accepted, his first PR into System.Data.SqlClient. He's not affiliated with Microsoft, has no connection with them other than being a key .NET developer, but he proposed a fix. It was a one-line fix; he proposed it, and it was accepted. Anyone can do that if you're brave enough. It's actually quite interesting looking at the code behind some of this stuff. You can tell the age of it by looking at the coding style: the more like C++ it looks, the older it is. So, that leaves us with a few flavours of .NET to choose from these days. We've got .NET Core: cross-platform across Windows, Linux, and OS X. We have the Full .NET Framework: Windows only. Or Xamarin, for iOS and other mobile devices, running on the Mono framework. Across all of that, you have .NET Standard, giving you a common standard for APIs, and by APIs I mean things like: how do I define a regex? How does cryptography work? How does file IO work? Things like that.
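The defer-your-logging-decision model described above can be sketched roughly like this. This is my illustration, not the speaker's code: it assumes the Microsoft.Extensions.Logging and Microsoft.Extensions.DependencyInjection packages (2.x-era APIs), and `OrderService` is an invented example class.

```csharp
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;

// The service codes only against the ILogger<T> abstraction;
// it has no idea where log entries will end up.
public class OrderService
{
    private readonly ILogger<OrderService> _logger;

    public OrderService(ILogger<OrderService> logger) => _logger = logger;

    public void PlaceOrder(int id) =>
        _logger.LogInformation("Placing order {OrderId}", id);
}

public static class Program
{
    public static void Main()
    {
        // The concrete sink (console here) is plugged in at the
        // composition root -- a decision you can defer to the end
        // of the project and swap without touching OrderService.
        var services = new ServiceCollection()
            .AddLogging(builder => builder.AddConsole())
            .AddTransient<OrderService>()
            .BuildServiceProvider();

        services.GetRequiredService<OrderService>().PlaceOrder(42);
    }
}
```

Swapping console logging for, say, a file or Application Insights provider then means changing one line in `Main`, nothing else.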
And then the common infrastructure, MSBuild, Roslyn, et cetera, setting the foundation for all of that. It's changed a lot from the first days of .NET. Back then it was many languages, one platform. Nowadays it's one language, really. F# is kind of edging in there, but it's mainly C#, and many platforms. C# has more or less won the language war, although F# is interesting. But actually the challenge now is not making COBOL work with .NET; it's making .NET work with Linux, or SUSE Linux. So, .NET Standard. I said I'd talk about this in more detail. I'm not gonna go through each one of these table cells, don't worry. So, .NET Standard is, like I said, an attempt to define a stable API that is available on a given set of .NET platforms. The idea is that it is forward compatible and will not have breaking changes. So, for example, if you target, let's say, .NET Standard 1.0, that is supported by .NET Core 1.0, .NET Framework 4.5, Mono 4.6, Xamarin for iOS 10, Xamarin for Android 7, and so on. If you're targeting, say, 1.5, that's .NET Core 1.0 and .NET Framework 4.6.2, and so on all the way down. And if you're targeting Silverlight, then I feel very sorry for you. Find another job. Interestingly, Silverlight actually lives on in .NET Core. If you think of what Silverlight was, which was an attempt to let you write .NET that would run as a browser plugin, that was basically cross-platform .NET. When the team started writing the .NET Core class libraries and CLR, they took Silverlight as their base. So, although Silverlight itself is pretty much dead, its legacy lives on, if that makes you feel any better. And so on; the idea is that as new .NET Standards are introduced, those standards will be backwards compatible, and they're not gonna break the APIs as they move forward. There is some talk, I'm not sure how true it is, that they might break some in .NET Core 2. However, I think the ones they're going to break are ones where they broke the APIs for .NET Core 1.
So, to give you an example: in .NET Core 1, as it stands now, if you want to get information about a type, you no longer do everything through .GetType(). You have to do GetType().GetTypeInfo(), and bring in the System.Reflection package. That is going away in .NET Core 2, because understandably people got very annoyed by it. You end up having to do switches depending on which framework you're targeting. But, as far as I know, those are the only breaking changes they're considering putting into .NET Core 2. Everything else is gonna be standard. .NET Standard is pretty much Portable Class Libraries done right. I haven't had much experience with Portable Class Libraries; the experience I have had was unpleasant. The idea of Portable Class Libraries, PCLs, was that they were an intersection of features, so that you could say: I want to target the Full .NET Framework, or .NET for fridges, or .NET for Windows Phone, or whatever it might be. And actually this framework has this and this, but not that, and that framework has that and that, but not this. .NET Standard does away with all that, and says: this version of .NET Standard has X, Y, and Z, and this will compile for all these different runtimes. There are some things that will compile for certain runtimes but throw a runtime exception when they are run. For example, some of the cryptography libraries are available in .NET Core, but are tied in very heavily with the Windows cryptography APIs and X509 certificates. Other things, such as the registry, for example: you can use it in .NET Core, but very obviously you're not gonna be able to actually use that on OS X, and it would take a particularly sadistic Microsoft developer to try to emulate that on OS X and succeed. But what that actually gives you is this. A .NET Full Framework 4.whatever DLL: you can run that in a .NET Full Framework application, obviously, and you can run that in a .NET Core application if you're targeting the .NET Full runtime.
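The reflection change mentioned above looks like this in practice. A minimal sketch: the `GetTypeInfo()` hop is required on .NET Core 1.x, and works on the Full Framework too via the `System.Reflection` extension method.

```csharp
using System;
using System.Reflection; // brings in the GetTypeInfo() extension method

class Program
{
    static void Main()
    {
        // Full Framework style: members such as IsClass live directly
        // on Type. On .NET Core 1.x many of them moved to TypeInfo,
        // so you hop through GetTypeInfo() first.
        TypeInfo info = typeof(string).GetTypeInfo();

        Console.WriteLine(info.IsClass);
    }
}
```

In .NET Core 2 the members come back onto `Type` itself, so this indirection (and the `#if` switches it forced on multi-targeted code) disappears.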
So, you can make use of all the nice, new application layer features but still run on the Full Framework; what you can't do is run that DLL on the Core CLR, so you can't run it cross-platform. The .NET Standard DLL: if you compile to .NET Standard, whatever version, it will run on the Core CLR, it will run on the Full Framework under a .NET Core app, or it will run quite happily in an old-fashioned .NET 4.whatever application. This gives you quite a nice migration strategy, and this only really happened around the Preview 1 to 1.0 time. We'll go into more detail about that in a sec. And the way that works is something called multi-targeting. I mentioned before that CSPROJ files have been cleaned up quite substantially. This is a class library CSPROJ file as generated by the .NET CLI, and it is this. As I add files to it, this won't change. No GUIDs, no cryptic references; it's lovely. They do get more complex than this, but not too much. The changes I've made here, which I've highlighted in yellowy-orange, are the target frameworks. So, I can say that this class library targets .NET Standard 1.6 and .NET 4.5.2. I can then use the wonderful MSBuild conditional syntax (this was much nicer in project.json) to say: if I'm building for .NET 4.5.2, in this instance, I'm going to include this NuGet package, Microsoft.Azure.DocumentDB. That package does not support running on the Core CLR, so it's not cross-platform. However, if I'm building against .NET Standard, I'm gonna bring in Microsoft.Azure.DocumentDB.Core, which is a straight port of it, and which is compatible with .NET Core and cross-platform. And because they're equivalent in API, it will work. If they're not equivalent in API, you do have the option of using compiler directives, #if NET452 or #if NETSTANDARD1_6, to get around that.
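The multi-targeting CSPROJ described above looks roughly like this. A hedged sketch of the file on the slide: the package versions are illustrative, not the exact ones used in the talk.

```xml
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <!-- Note the plural: TargetFrameworks, semicolon-separated. -->
    <TargetFrameworks>netstandard1.6;net452</TargetFrameworks>
  </PropertyGroup>

  <!-- Only referenced when building the net452 flavour. -->
  <ItemGroup Condition="'$(TargetFramework)' == 'net452'">
    <PackageReference Include="Microsoft.Azure.DocumentDB" Version="1.*" />
  </ItemGroup>

  <!-- Only referenced when building the netstandard1.6 flavour. -->
  <ItemGroup Condition="'$(TargetFramework)' == 'netstandard1.6'">
    <PackageReference Include="Microsoft.Azure.DocumentDB.Core" Version="1.*" />
  </ItemGroup>

</Project>
```

One build then produces two DLLs, one per target framework, each with the package reference appropriate to it.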
But that basically means you can write a class library, or modify an existing class library, and it can still have .NET Full Framework dependencies, but you can actually gradually move it across to .NET Core. So, .NET Core doesn't have to be a big-bang upgrade, which might be fantastic news to your managers, or to you if you are a manager or an architect. When is .NET Core not .NET Core? The answer is: when you're running it on the Full .NET Framework. As I said, the .NET Standard libraries will run in an ASP.NET Core application targeting .NET 4.6. So, this is all the goodness of the new application layer, but I'm running it on, presumably, a Windows machine, running the Full .NET Framework, .NET 4.6.2, or .NET 4.7, or whatever it is. But I can also target those in an application that runs on the Core CLR, the cross-platform version. And, across both of those, I get the application layer goodness, I get ASP.NET Core MVC, I get the .NET CLI, and I can even, if I'm doing this transitional approach, bring in my existing .NET 4.6 libraries. In fact, I don't even need to change them. So, if you've got a large Visual Studio project with a load of existing class libraries, you can create a new .NET Core web or console application, target the .NET 4.6 Framework, and bring in the existing class libraries, and it will work, but you can also use some of the new stuff as well. And if you then gradually migrate your application across bit by bit, that new application gets closer and closer to being cross-platform and fully new, if you like. So, the important thing to remember is not to panic. It sounds really complex when I'm talking to you about this. It is kind of complex when you're doing it, but you get the hang of it pretty quickly, and it does start to make sense, hopefully. So, there are pluses and minuses, as you might expect. We gain a lot of flexibility, we gain a lot of portability, and we gain a fast release cycle.
Bundling the runtime: that's something we haven't mentioned yet, actually. With classic .NET, you deploy the application on a server, and .NET is installed on that server, therefore it runs. If the version of .NET that you want is not on that server, you can't run the application. The same is more or less true of .NET Core, in that you can install it machine-wide, and therefore it's there for all applications, but you can also bundle it. You can say to the .NET CLI: I want to bundle this application along with version X of the Core CLR runtime. That will then give me an executable that has the runtime, all the dependencies, all the .NET Framework class libraries, all of my class libraries, and that is completely stand-alone. I can run that from the command line on any machine, and it will work, which means your IT administrator doesn't have to get grumpy, doesn't have to fill in a load of forms, and you can probably use whatever framework version you like. We haven't actually done that one yet, because we manage our own servers, but it is one of the big selling points of .NET Core. It's also really useful if you're building tools that you want to redistribute. If you're building something like TeamCity or Octopus, and you want to run a nice self-contained web server with your application on clients' machines, you can do that. You bundle the whole thing up and deploy it. They don't have to have IIS installed, they don't have to go through any config. You can just run it from the command line, as a Windows Service, or whatever. There are some interesting possibilities when you get into that. And, of course, the shiny. It's new and shiny; we like that. They've improved a lot of things. Some people disagree, but broadly I think it's a nicer framework to work with. The downside is complexity.
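The bundling described above is driven by `dotnet publish`. A sketch of the two deployment styles, assuming the .NET Core 1.x SDK is installed and the project is set up for self-contained publishing (the runtime identifiers shown are examples):

```shell
# Framework-dependent deployment: small output, but the target
# server must already have a matching .NET Core runtime installed.
dotnet publish -c Release

# Self-contained deployment: the runtime identifier (-r) picks the
# platform, and the Core CLR plus all framework class libraries are
# copied into the output, so the target machine needs nothing
# pre-installed.
dotnet publish -c Release -r win10-x64
dotnet publish -c Release -r ubuntu.16.04-x64
```

The self-contained output folder can be zipped up and run anywhere that matches the runtime identifier, which is what makes the TeamCity/Octopus-style redistribution scenario possible.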
We've always had that complexity, but because the majority of .NET apps, I would venture to say, are written on Windows machines and deployed on Windows machines, we haven't really seen it. If you really think about the amount of code that went into creating a Hello World webpage on Web Forms, and you think of all the designer tooling that Visual Studio generated for you, there was actually a hell of a lot of that in there already, but it was hidden: hidden behind Visual Studio, hidden behind tooling. With .NET Core, a lot of that complexity comes to the fore, in that you have to think about it more. It's easier to manage; you can, as I showed with the CSPROJ example, manage this in a text editor quite easily, but you do have to think about it, and the tooling won't necessarily do that for you, although Visual Studio 2017 is doing a better job of .NET Core than it used to. And, like I say, if you're a package author, if you're James Newton-King and you're writing Json.NET, you have always had to deal with that complexity, and he will tweet for England about how difficult it is to get one class library that will run across anything and anywhere anyone wants to run it. .NET Core actually makes that a lot easier, but if you're an application author, there's maybe even more complexity there to think about than you've previously come across. So, what's the new shiny? The dotnet CLI, as I've mentioned. It's a command-line interface; I'll demo it in a bit. You can create .NET projects, you can add NuGet packages, you can add project references, you can run things, test things, and debug things, all from the command line. If you don't like the command line, fine, you don't have to use it; you can use Visual Studio. For very quick hacking, I do find it a hell of a lot easier and quicker to use. CSPROJ, and MSBuild in general, have been massively tidied up: much, much cleaner.
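The CLI workflow mentioned above looks roughly like this. A sketch using the csproj-era tooling (command shapes vary slightly between SDK versions, and the project and package names are illustrative):

```shell
# Scaffold a new console project in its own folder.
dotnet new console -o HelloCore
cd HelloCore

# Add a NuGet package reference straight from the command line.
dotnet add package Newtonsoft.Json

# Pull down dependencies (explicit in the 1.x tooling).
dotnet restore

# Compile and run in one step.
dotnet run

# Run tests, if this were a test project.
dotnet test
```

The same commands work on Windows, Linux, and OS X, which is what makes the scripted, Visual-Studio-free build server story possible.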
Everyone has had merge conflicts in CSPROJ files, especially with web projects, where there are dozens and dozens of images, CSS files, JS files, whatever files. If they get in the wrong order, if two people commit different things at different times, there are merge conflicts, and they are tricky to debug. That goes away with the new CSPROJ; it changes completely. .NET Standard is new. We've now got a supported and efficient way of doing multi-targeting for our class libraries. We have MVC Core. This combines MVC with Web API. They are no longer two separate libraries, and they are no longer two separate pipelines. So, you no longer have to wire up two separate dependency injection containers to target both; you can just do it all through one. You no longer have to have two separate sets of controllers if you're doing a mixed project. One of the wonderful controversies and difficulties of our current application is that it's still MVC and Web API, and every now and then someone will say, "Is this a Web API controller or an MVC controller? Because there are mainly views, but there are a few API-ish methods. So, should I do it in MVC, or API, or both?" That goes away with MVC Core. It's all just one pipeline. You can create a controller that does views, it can return you JSON, it can return you plain text, it can do posts, and gets, and puts, and deletes, and all that stuff, and it's just one file. Same dependency injection container, same pipeline. Entity Framework Core: again, far too big a topic to talk about today. The very quick summary is that it is a totally rewritten Entity Framework with no dependency whatsoever on SQL Server. The original Entity Framework, according to the EF team, had quite a few dependencies hidden in there on System.Data and System.Data.SqlClient. EF Core is now wholly targeted at the ORM aspect.
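The unified pipeline described above means one controller class can serve both views and API-style responses. A hedged sketch, assuming an ASP.NET Core MVC project; `OrdersController` and `OrderDto` are invented names:

```csharp
using Microsoft.AspNetCore.Mvc;

// One controller, one base class, one pipeline: no more choosing
// between an MVC controller and a Web API controller.
public class OrdersController : Controller
{
    // Returns an HTML view, like classic MVC.
    [HttpGet("/orders")]
    public IActionResult Index() => View();

    // Returns JSON, like classic Web API -- same class, same
    // dependency injection container, same routing.
    [HttpGet("/api/orders/{id}")]
    public IActionResult Get(int id) =>
        Json(new { Id = id, Status = "Shipped" });

    // Handles a POST from the very same controller.
    [HttpPost("/api/orders")]
    public IActionResult Create([FromBody] OrderDto order) =>
        CreatedAtAction(nameof(Get), new { id = 1 }, order);
}

public class OrderDto
{
    public string Product { get; set; }
}
```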
If you want to use SQL Server with Entity Framework, you pull in Entity Framework Core and you pull in EntityFrameworkCore.SqlServer, and then it works: you just say Entity Framework, use SQL Server, done. Or you say use MySQL, or use an in-memory database, or whatever it is, or you can write your own. Whether it will work or not depends on how well that provider works with LINQ statements, and how complex your LINQ statements are, but that's just the nature of the game. Caveat there: there are a few things that they haven't gotten working yet. Many-to-many relationships, for example, are still tricky. They are working on it. They've made slower progress than they wanted. It is coming along. Kestrel, so, if anyone remembers the OWIN set of libraries, OWIN, the Open Web Interface for .NET, and Katana, which is where Project K I think originally came from, that was an attempt to define a common set of interfaces that a web server running .NET should have. In practice what that ended up meaning was that you would have MVC stuff in Global.asax, and you would have Web API stuff in Startup.cs. Kestrel is a full-fledged web server. It does the absolute minimum it can. It's based on a library called libuv, which is the same one I think used by Node.js, and it serves webpages. You can run it from the command line. You don't need anything installed on your machine to run it, and if you want static file hosting or MVC, you plug those into the pipeline as middleware, and then you gradually build up the pipeline that you want, in much the same way that you do with Node.js. Middleware, I mentioned there. HTTP modules go away, as do IIS handlers, replaced by something called middleware. Middleware is just something you can plug into the request pipeline that will do some stuff, and then either pass on to the next middleware in the chain or cancel at that point. So, things like blocking certain browsers you can do in custom middleware.
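As a rough sketch of that browser-blocking idea, a custom middleware class looks something like this; the class name and the user-agent check are my own invented example:

```csharp
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;

// A middleware component either handles the request itself
// or passes it on to the next component in the chain.
public class BlockOldBrowsersMiddleware
{
    private readonly RequestDelegate _next;

    public BlockOldBrowsersMiddleware(RequestDelegate next)
    {
        _next = next;
    }

    public async Task Invoke(HttpContext context)
    {
        var userAgent = context.Request.Headers["User-Agent"].ToString();
        if (userAgent.Contains("MSIE 6.0"))
        {
            // Short-circuit: nothing further in the pipeline runs.
            context.Response.StatusCode = 403;
            await context.Response.WriteAsync("Browser not supported.");
            return;
        }

        await _next(context); // otherwise, carry on down the chain
    }
}

// Registered in Startup.Configure with:
//   app.UseMiddleware<BlockOldBrowsersMiddleware>();
```

The order in which middleware is registered is the order requests flow through it, which is exactly the "gradually build up the pipeline" model described above.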
MVC is middleware in and of itself, in that it intercepts the request and chooses what to do with the response. Authentication, as well, also runs as middleware. Razor, some massive improvements in Razor, especially around templating. A lot of the @Html.Stuff goes away, and is replaced by something called tag helpers that let you annotate normal HTML with additional attributes that tell Razor to get involved with that particular tag. It makes the HTML look a lot cleaner, and a lot more like HTML, rather than a sea of @ statements. So, Kestrel, again. I thought it was worth talking a bit more about this. If you were to host a .NET Core application today, you would typically debug it using Kestrel. You can use IIS Express, if you want. IIS Express is still only supported through Visual Studio. Kestrel you can use through Visual Studio or the command line. Typically when you deploy that, if you deploy it to Azure or to your own servers, you would put it behind a proxy, such as IIS or nginx. Microsoft do not currently support running Kestrel as an edge web server, as in a web server totally exposed to the public internet. As of .NET Core 2, they will, although they still don't recommend it. Basically what that means is that IIS is probably going to become less and less important, and Kestrel will become more and more important as time goes on. But conversely, if you're running .NET Core applications on Linux or OS X, you obviously don't have IIS, and the supported, or the recommended, way of doing it there is to use nginx as a front-end proxy that will basically take requests and forward them on to Kestrel running in the background. Again, you'll see much more of that when we get onto the live coding demo. The New Project System is great, frankly. So, that is a web project. If you use the .NET CLI to create a new project, that is the CSPROJ you get, and that's it. In fact, if you tell the CLI to generate an empty web project, you won't even get that, you won't even get ASP.NET.
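For the nginx-in-front-of-Kestrel setup, the proxy configuration is roughly this; the server name is a placeholder and port 5000 is just Kestrel's conventional default:

```nginx
# Forward public traffic to Kestrel listening on localhost:5000
server {
    listen 80;
    server_name example.com;

    location / {
        proxy_pass         http://localhost:5000;
        proxy_http_version 1.1;
        proxy_set_header   Upgrade $http_upgrade;
        proxy_set_header   Connection keep-alive;
        proxy_set_header   Host $host;
        proxy_set_header   X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```

nginx handles the edge concerns (TLS, slow clients, static caching) while Kestrel just runs your .NET code behind it.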
You have to say I actually want ASP.NET. But it's basically a very tiny CSPROJ that says I'm targeting netcoreapp, the cross-platform Core CLR, and these are my NuGet packages. Packages.config goes away. All the packages are defined in this file, as are any project references, as are any DLL references. So, everything goes into this one file, rather than having the awkwardness of removing something from packages.config and then having to find the particular place in the CSPROJ file where it's actually referenced. So, NuGet and the .NET tooling really unify at this point. And no GUIDs and no file references. So, as you add files, they will just appear in Visual Studio. And a lot of problems just go away. Assembly redirects go away completely. So, if you've ever had the version of assembly X does not match the referenced version, and you've had to hunt down the obscure assembly redirect in one of the config files, that just goes; .NET takes care of it. Transitive dependencies. If you've ever pulled down a package from NuGet, if it has five or six dependencies, suddenly your packages.config file has those five or six dependencies too, and then you try and clean it up and think, well, is that used? If I take it out, oh, no, it is used, that thing's using it. That goes away, as well. So, if you want package X, you will see package X. All of these have their own dependencies, but you don't see them because that's not the package you're bringing in. These are the packages you care about. Transitive dependencies, the dependencies that that package has, are handled for you by the .NET CLI and the .NET tooling, which is great when it comes to project-to-project references, because you can just say I depend on this library, which in turn depends on my core library, and my utility library, and so on, but actually I just want to reference this one library. Sharing code between MVC and Web API. As I've said, they are now one pipeline, so that problem goes away entirely.
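A complete new-style project file really is about this small; the project name and version numbers below are illustrative, not from the demo:

```xml
<!-- A whole .NET Core web project file: no GUIDs, no file
     lists, packages and project references all in one place. -->
<Project Sdk="Microsoft.NET.Sdk.Web">

  <PropertyGroup>
    <TargetFramework>netcoreapp1.1</TargetFramework>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="Microsoft.AspNetCore.Mvc" Version="1.1.3" />
    <PackageReference Include="Newtonsoft.Json" Version="10.0.3" />
  </ItemGroup>

  <ItemGroup>
    <ProjectReference Include="..\MyApp.Core\MyApp.Core.csproj" />
  </ItemGroup>

</Project>
```

Only direct dependencies appear here; transitive ones are resolved by the tooling, which is why merge conflicts largely disappear.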
Depending on what you do, that may or may not be a problem for you, but it goes away. Plugging in dependency injection. DI has never really been a first-class citizen in .NET. There are some fantastic libraries for doing it, Windsor, Ninject, Unity, and so on, but you have to plug them in. You have to create your own config file, or create your own code file, inject them into the pipeline, and basically tell your controllers how to locate dependencies. It's also quite difficult to then pass those dependencies back through the chain of your object model, your class libraries, et cetera. In .NET Core, dependency injection is very much a first-class citizen. You are expected to use DI. If you don't use DI, it looks at you funny, and says is there something wrong with you? If you take a .NET Core project, you can add dependencies straight away. There's no additional stuff to plug in. All the DI providers such as Windsor and Ninject have .NET Core equivalents, and you can use them, but if you just wanna play around with it, the built-in .NET dependency injection abstractions work absolutely fine. We have not found any problem with them so far. Local developer configuration, always a pain. With full .NET we have historically not checked our web config files into source control. We have checked in templates, and then developers have to copy that template, rename it, change the variables to suit their machine, and invariably someone checks theirs in anyway. In .NET Core there is a story for doing that. You can use something called User Secrets, where you can actually inject your own config values that exist only on your machine. Or, if you don't want to do that, you can inject them via command line or environment variables. Configuration is flexible and pluggable. You can get your config from wherever you like, which means that you can have a standard way of each developer checking out the code and configuring it for their machine.
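With the built-in container, registering dependencies is just a few lines in `Startup`; the service interfaces and implementations here are hypothetical examples, not from the talk:

```csharp
using Microsoft.Extensions.DependencyInjection;

public class Startup
{
    // Called by the runtime. Everything registered here can be
    // constructor-injected into controllers, middleware, etc.
    public void ConfigureServices(IServiceCollection services)
    {
        services.AddMvc();

        // Lifetimes are built in: transient, scoped, singleton.
        services.AddTransient<IEmailSender, SmtpEmailSender>();
        services.AddScoped<IPostRepository, SqlPostRepository>();
        services.AddSingleton<IClock, SystemClock>();
    }
}
```

A controller then just declares, say, `public PostsController(IPostRepository posts)` and the framework supplies it, no third-party container required.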
Consistent error handling between ASP.NET and IIS. If you've ever had to battle why you're not seeing a custom exception, or why you're seeing an IIS exception instead, or why you're seeing an IIS exception that says there was an exception with your custom exception, ASP.NET Core does away with all of that altogether. There is now no battle between IIS and .NET. Kestrel is just running .NET. So, the full error stack is in your hands. You can choose exactly what to do with errors. There is error-handling middleware provided for you, or you are free to write your own either way. There are some things that just go away that you might be a bit sad about. System.Web goes away completely. So, if you have code that references System.Web, which is basically Web Forms, that's not gonna work. You have to port that across, or you have to target the Full .NET Framework. Web Forms are not going to be supported on .NET Core. The word from Microsoft is that they never will be. WinForms, similarly, they have no interest in porting to .NET Core. WinForms is basically a wrapper around the Win32 API, and it's not cross-platform. They are working on WPF. They have announced a XAML Standard. I don't know exactly when that will land, or what the result of that will be, but apparently they are working on it. System.Configuration, if you used that, that goes away. So, anything in app settings, or if you created custom config elements, they go. You can bridge that by writing your own config provider, but if you're referencing that assembly, that's not gonna work with .NET Core. And, very obviously, if you're using DirectoryServices, that is Windows only and isn't gonna work cross-platform. Quick note about .NET Core 2. It's currently in preview. They have managed to get all the version numbers consistent. This is their great achievement. So, you are now on Core CLR 2, .NET Standard 2, MVC Core 2, and Entity Framework Core 2. I'm not gonna talk much more about .NET Core 2.
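The error-handling story described above typically looks like this in `Startup.Configure`; the `/Home/Error` route is a conventional placeholder:

```csharp
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;

public class Startup
{
    public void Configure(IApplicationBuilder app, IHostingEnvironment env)
    {
        if (env.IsDevelopment())
        {
            // Rich stack-trace page while developing...
            app.UseDeveloperExceptionPage();
        }
        else
        {
            // ...and your own error route in production. No IIS
            // custom-errors page fighting you for control.
            app.UseExceptionHandler("/Home/Error");
        }

        app.UseStaticFiles();
        app.UseMvcWithDefaultRoute();
    }
}
```

Because the error handler is itself just middleware at the top of the pipeline, the full error stack genuinely is in your hands.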
It's meant to be a fairly safe upgrade. ASP.NET Core 2 does have some breaking changes. .NET Rocks, as I said, is out today, with an interview with David Fowler about it. Listen to that; it will tell you a lot more than I can. Migrating applications to .NET Core. So, there are some strategies we can use here to do this. So, we start off, we've got our application. We've got class libraries written in the Full .NET Framework. We have an application on the Full .NET Framework, and we have external dependencies, probably NuGet, which target the Full .NET Framework, and we wanna get that across to .NET Core. One approach: move our class libraries across to .NET Standard. So, we take the pain of doing that, we hive off any dependencies that are Full .NET Framework dependencies into their own libraries, or packages, and we make all of our class libraries cross-platform. We can then split our application. We can have some of our application running the Full Framework, and using the class libraries, and we can have a cross-platform .NET Core application also using the same class libraries. That can pull in .NET Core external dependencies, and that can pull in Full .NET Framework dependencies. If you're working in an environment where you have multiple sites or services running off the same class library base, that might be a viable alternative, in that you do the core libraries first, and you take applications one by one and decide whether they are good candidates for converting. The other approach, or another approach, I'm sure there are many others: leave the class libraries where they are. If you can, try and modularize them, so that it will be easier to move things over to .NET Core in the future. So, where you've got a Full .NET Framework dependency, put that behind a service interface, put it into its own library, and then you can look at snipping that off and replacing it with a .NET Core replacement later.
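The "shared class libraries" approach usually means multi-targeting the library project so both old and new applications can consume it; a sketch, with target framework monikers chosen for the pre-.NET Standard 2.0 era this talk describes:

```xml
<!-- One class library compiled for both .NET Standard and the
     Full Framework, so Full Framework and .NET Core apps share it. -->
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <TargetFrameworks>netstandard1.6;net461</TargetFrameworks>
  </PropertyGroup>

  <!-- References only needed on one target can be conditional,
       which is how you hive off Full Framework dependencies. -->
  <ItemGroup Condition="'$(TargetFramework)' == 'net461'">
    <Reference Include="System.Configuration" />
  </ItemGroup>

</Project>
```

Note `TargetFrameworks` (plural): the tooling builds one output per moniker, and NuGet picks the right one per consumer.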
We can convert our application to run using the .NET Core Framework Class Library and tooling and still run it on the Full .NET Framework. So, it's the same application. We may need to make some tweaks to make it work in the tooling, but not a huge amount of pain. We haven't had to touch our class libraries, so it's still using the same ones. We can still use our external Full Framework dependencies, but we can start bringing in some external dependencies that target the Core Framework, and the idea is that as time goes on, we gradually snip more and more of these off, we gradually convert things to .NET Core, and then the whole thing becomes green. The pain points we've found so far. There are still packages that don't have .NET Core equivalents. Their package authors haven't ported them across; maybe it's a more or less abandoned package that was done five years ago, and no one has any interest in maintaining it, or they've just been too busy to do it. If that happens, the options are basically find an equivalent, plead with the package author to try and port it, or port it yourself and fork it. If you're using things like the DependencyResolver in MVC, System.Web.Mvc, that's gonna be a pain point. We have that a lot. We have a lot of class libraries that use that as a magic global to get the actual dependencies that they need. That's been our biggest sticking point so far: converting them to actually accept their dependencies through their own constructors and methods, rather than assuming things are global. In general, if you do have code that assumes that things are global, that will probably be a problem with .NET Core. It tends to nudge you down the path of making things injectable and pluggable by default. Same with HttpContext.Current. If you're using that, aside from the fact that System.Web goes away, it's more difficult to get to that in .NET Core. Using NuGet for client-side libraries sort of works.
If you're using NuGet for jQuery, for example, it will sort of work, but it won't really work with the folder structure that .NET Core expects you to have. You are better off moving those over to Bower or NPM, or hiving them off to a CDN, or whatever you want to do, and you'll get a better developer experience out of that anyway. I think starting with the next Visual Studio, or it might be the current patch release, Microsoft are gonna start flagging the client-side dependencies in NuGet to show you that you're not meant to be using them, that you're meant to be using NPM or Bower instead. So, they're pushing you away from using those. Image processing, there's not a huge amount of support for this in .NET Core. The System.Drawing.Image packages are all built on top of GDI, which is all very heavily tied in with the Win32 API, and they have not been ported. There is a guy who has ported the same API across to .NET Core. There is another chap who's building, I think it's ImageSharp, which is a fully managed image library for .NET written in C Sharp, and there is also ImageMagick, and maybe many others as well, but they aren't one-for-one equivalents of System.Drawing. They do, however, perform much better than System.Drawing, because System.Drawing does not have very good performance; it was designed to support WinForms on a single-user system. But if you're using it heavily, it will be a pain point to convert. And if you've got class libraries that depend on specific versions of MVC or System.Web, they are gonna be a problem. If you've got a .Web library that refers to controllers or action results, as we have, those are sometimes gonna be issues to move over. What we're gonna do now is just look at some coding examples, have a look at .NET Core, what it actually is, how it works, laugh at my bad code. I'm a manager, not a coder, so you are allowed to laugh at my code, I encourage it.
And I thought we'd start with the .NET CLI 'cause it's a very different way of working to the traditional load Visual Studio, File, New Project, or New Solution way that we're probably used to. So, the .NET CLI, the .NET Command Line Interface, is fantastic. I wrote this in the coffee shop just before I came along here. This is a little Windows batch file that automates the creation of a full .NET solution. So, what we're doing here, we're taking a parameter, %1. You could do this in PowerShell, and you probably should do this in PowerShell, but I did a batch file 'cause it was quick and dirty. We've taken our solution name, we're going to add a class library, a data access layer, and a command-line application. We're going to add those to our solution. We're going to add the references. So, we're gonna say that the CLI project references both data and core. And we're going to add Json.NET and Entity Framework Core into that. And that line's in the wrong place. That should be up there, but it doesn't matter. And then we're gonna restore the packages and we're gonna run it, and this is going to work 'cause it did in the coffee shop earlier. Boom! So, that's just demonstrating how easy it is to automate this stuff. If you've got a common template for solutions, if you make a lot of microservices or small services, if you're a consultant who creates solutions every day for clients, you can script a lot of that, and suddenly you've got an easier alternative to T4 templates for generating standard project templates. So, the CLI, to actually show you it in action, and, at the end of that, we've got hello world, which is what our console application actually outputs. If I open VS Code on that folder, we've got our projects and our solution. We've got our solution file in all its ancient, ugly glory, and I'm told that this is the next thing they want to clean up. And we have our lovely, little clean CSPROJ files that have been generated for us.
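Reconstructing that batch file as a rough sketch (shell syntax here rather than batch, and the solution, project, and package names are my own stand-ins for whatever the demo used):

```shell
#!/bin/sh
# Scaffold a whole solution from the command line.
# Usage: ./new-solution.sh MySolution
NAME=$1

dotnet new sln -n $NAME
dotnet new classlib -o $NAME.Core
dotnet new classlib -o $NAME.Data
dotnet new console  -o $NAME.Cli

dotnet sln $NAME.sln add $NAME.Core/$NAME.Core.csproj
dotnet sln $NAME.sln add $NAME.Data/$NAME.Data.csproj
dotnet sln $NAME.sln add $NAME.Cli/$NAME.Cli.csproj

# The CLI app references both class libraries.
dotnet add $NAME.Cli/$NAME.Cli.csproj reference $NAME.Core/$NAME.Core.csproj
dotnet add $NAME.Cli/$NAME.Cli.csproj reference $NAME.Data/$NAME.Data.csproj

# Pull in packages, restore, and run.
dotnet add $NAME.Core/$NAME.Core.csproj package Newtonsoft.Json
dotnet add $NAME.Data/$NAME.Data.csproj package Microsoft.EntityFrameworkCore
dotnet restore
dotnet run --project $NAME.Cli/$NAME.Cli.csproj
```

Every step is a plain `dotnet` command, which is what makes this kind of scaffolding so easy to script compared with T4 templates.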
So, note that in this one, we've got a project reference, and that's it, that's all the project references. No hint path, nothing like that. In Core we've got a package reference to Json.NET. And the data access layer doesn't have the reference to Json.NET because it's transitively given to us by the fact that we're referencing our core library, and then we get our actual programme, which does nothing exciting, it outputs hello world. That's the basic scaffold. Once you install .NET, you get access to the CLI. You can run it, it will give you lots of help, tell you how to use it. So, you can do dotnet new help, and it will tell you all the things you can create from scratch. So, I can create a console application, a class library, an MSTest application, xUnit, Web MVC, Web API, a new solution file, and I'm told they will add to this as time goes on. I think they're currently working on adding different SPA frameworks into this. So, you can create React or Knockout SPA applications. You get access to run your code. You can just run web projects or command-line applications. So, if I go into my CLI project, I can just do dotnet run. It will take a little while, as it always does, and you get hello world out of it. Incidentally, I'm told that in ASP.NET Core 2, sorry, .NET Core 2, they have massively improved the compile and startup time of this, back to something like where it was in the early days of the .NET Core betas. Currently there's a little bit of a delay between starting stuff and it actually running, which I'm told they've gotten down quite a lot. Test, as well: if you've got unit tests, you can simply run dotnet test. There aren't any here, so it'll just throw an error. You can add NuGet packages, so you don't have to use the Package Manager Console in Visual Studio or find where your NuGet EXE is. There we go. So, it's good stuff. So, I thought I'd just show you that, and show you how easy it is to script it.
What we're actually really gonna be looking at for most of this session is this thing of beauty here, which is, like I said, my horrible, horrible code, and I do not apologise. I'm not a full-time developer. I always struggle to think of a good subject for sample applications, so it's a very simple blog engine, and I'm gonna run it now, just to demonstrate again that you can script this. To run this I'm gonna use nothing but the command line, and I've created a very simple command-line script to do that, and all that does is set an environment variable that tells it that it's running in development mode. If I've passed a parameter through, it will then use that as the environment that it's running in. So, I can run it and say you're in production now. I'm gonna go into the website folder, and I'm just gonna go dotnet watch. Watch is a package that you can add to your .NET Core programmes, and it's basically a file watcher. As soon as you change anything that .NET deems compilable, like a .cs file, it will recompile and restart the application. So, you can set this to run, and you can develop away. As soon as you make changes, the changes will be recompiled, and your application will continue running. You can also do that with unit tests. So, you can have your unit tests continually running in the background in a separate command prompt window. I know you can do that now with Visual Studio live unit testing. This is another way to do it if you don't want to run Visual Studio, or indeed if you're on Linux or OS X. This starts up. There we go, so it's telling me I'm in the development environment. It's saying that it's using this content root path, and that it's listening on this port. So, there is no IIS on this server. It's not using IIS; it's just running Kestrel. What it's actually running is the dotnet EXE, which is then targeting my application's DLL.
Slightly interesting point: if you compile a .NET Core web application for the Full Framework, what you'll get in the bin folder, or your publish output, is an EXE, and it's simply a command-line application that you can just run. If you compile it for .NET Core, you will end up with a series of DLLs, and then you can put those behind IIS or nginx and use that as a proxy. Let's just make sure this is working. That's intentional, I meant to do it. The first thing to note is there is no database for this. So, this is using Entity Framework Core, and I have neglected to deploy the database. So, like I said, I'm not gonna go into the intricacies of EF Core 'cause it's a massive topic, but basically I've got my models folder here. I've created a model for a blog post, which is very simple, and I've created a model for a blog database context, which inherits from DbContext, which is Entity Framework, and it's broadly the same sort of thing as you'll have seen in classic EF. I'm just defining how my model maps into a database. The other neat thing about this, two neat things, in fact. First of all, I am getting all my telemetry and all my exceptions out to the console. So, I don't have to get the output window in VS and maximise it; I can see pretty much everything through here. All this is using the standard logging abstractions that .NET Core provides to you. So, you can plug into this, and you can direct this output to wherever you want, whether that be log4net, or Serilog, or the console, or a TraceListener, or whatever. So, I'm just gonna stop that from running. So, I've created another script, and all that is gonna do is build the database. The .NET CLI has commands available for automating Entity Framework. So, this command is saying, Entity Framework, update the database. If it doesn't exist, it will build it.
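The shapes involved look something like this; the names are my guesses at the demo's code rather than verbatim, but the pattern is standard EF Core:

```csharp
using Microsoft.EntityFrameworkCore;

public class BlogPost
{
    public int Id { get; set; }
    public string Slug { get; set; }
    public string Title { get; set; }
    public string Body { get; set; }
}

// The context maps the model to the database. Note there is no
// SQL Server dependency here; the provider is chosen at startup.
public class BlogDbContext : DbContext
{
    public BlogDbContext(DbContextOptions<BlogDbContext> options)
        : base(options) { }

    public DbSet<BlogPost> Posts { get; set; }

    protected override void OnModelCreating(ModelBuilder builder)
    {
        // URL slugs must be unique, which is what backs the
        // duplicate-slug validation shown later in the demo.
        builder.Entity<BlogPost>().HasIndex(p => p.Slug).IsUnique();
    }
}
```

The provider choice itself would live in `Startup`, e.g. `services.AddDbContext<BlogDbContext>(o => o.UseSqlServer(connectionString))`.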
I'm gonna run this, cross my fingers, and, in the meantime, I'm gonna load up SQL Management Studio, and by the time Management Studio loads, it should have built the database, that's the theory. So, again, it's all focused on automation and scriptability. We've now got our database with its one table, fantastic. And, again, the things that this script does, you can also run from code. So, you can also call that database update from C Sharp itself. So, you could have an application that essentially self-installs. By the way, if you do have questions, shout 'em out. This is more of an interactive question-and-answer session. Restart this, load up the page. That, by the way, is the standard error page that you get, and, as I mentioned before, there is now no conflict between IIS errors and .NET Core errors. That's the error-handling framework in developer mode. You can plug in whatever you like in release mode, as well. I'll show you how that works in a sec. Do not laugh at my CSS. It looks hideous, I didn't have a lot of time to do this. We have our fantastic blog post application, which has a home page, has an about page, and it lets me write a post. So, I'm gonna say... And I'm gonna use markdown for the actual body content, and I'm gonna put some tags on. So, demo, dotnetnorth, publish. There we go, the post is up. We get the tags displayed and we get the text displayed. If I go back home now, that now appears on the home page, and job's a good 'un. If we look at the console output, we can actually see, because we're running in development mode, it's producing verbose logs. It's showing us pretty much everything. Each request it's getting, it's echoing back to us. Each SQL command it's running, it's echoing back. We can pretty much see exactly what the application's doing here, and exceptions and stack traces will also show up here, when and if they occur. You can control exactly what log level you want to see here.
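The script is just wrapping the EF command-line tooling, and the same update can indeed be triggered from C#; a rough sketch of both, reusing the `BlogDbContext` name assumed above:

```csharp
// From the command line (what the script runs):
//   dotnet ef database update
//
// Or programmatically, say at application startup, so the
// application effectively self-installs its own schema:
using Microsoft.EntityFrameworkCore;

public static class DatabaseInitialiser
{
    public static void EnsureUpToDate(BlogDbContext context)
    {
        // Creates the database if it's missing and applies any
        // pending migrations, like `dotnet ef database update`.
        context.Database.Migrate();
    }
}
```

Calling this once from `Program.Main` or `Startup` is a common pattern for small apps, though for production databases many teams prefer running migrations as an explicit deployment step.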
You can control that either globally, or you can make it dependent on the type of environment it's running in. You're not restricted to just development and production. You can have as many as you like, and I can have as many toggles as I like. If I now write another post... Thank you very much. Very nicely demonstrates that it is a command-line application, and behaves exactly like command-line applications do, thank you. I'm gonna try and create a second post. I'm gonna give it the same URL slug as the first one. And that should, if I got that right, give me a validation error, which has gone off to check the database, and it's found that that already exists. So, that is pretty much the sum total of the application. So, let's actually have a look at what it is and how it's working. So, it looks broadly like an MVC or Web API application that you might have seen before. You've got controllers, models, views. There's a few differences. You've got this wwwroot folder. That comes as standard, and it's designed to actually separate your static, web-viewable files from your application code. In ASP.NET, especially in Web Forms, there's no separation. If you've got a class at the root level of your project, it just sits alongside robots.txt. This is meant to do away with that. So, your code lives one level above that, and in there we've just got the standard stuff. We've got a style sheet, robots.txt, and a favicon. I've only created one controller 'cause I was lazy. I'll go on to middleware in a second. Models are fairly standard. We've got a settings class. I think someone mentioned earlier they were interested in the configuration model of .NET Core. So, we've created a strongly typed class for all the settings of our application. So, we can configure the title of the blog. We can say how many posts to show on the home page. The HTML tag to use for our tag markup, so when we go back here and see the tags, that's using the HTML mark tag.
Looking at that nice, sexy yellow colour. And we're saying that this is our minimum page load time that we find acceptable. All of those are defined in our appsettings.json. So, app settings goes away as a concept, or rather it goes away as a section of your web config file. But, by default, when you create a new MVC project in .NET Core, you will get this JSON file. JSON is their default way of doing configuration. There are providers for XML, and providers for INI files, if you particularly want to go old school, or, again, you can write your own. So, you can plug it into a database, if you particularly want to, or a web service. And, on here, this again is standard, but this defines how the application should output logs. And, by default, i.e. in production, we don't want to output anything lower than warning level. A ConnectionString to the database, and then we've got our AppSettings, which are simply defined as a very basic JSON object. If we want to, there is no fixed structure, so we can do nested elements as many times as we like. There's no structure at all that's required. There are some conventions, such as Logging and ConnectionStrings, but, again, you don't have to adhere to them. You can get each setting out of this config file individually. What does that look like? So, the entry point for our application, just to begin there, is Program.cs. Again, this is a command-line application, and in here, again, this is just code. You can do whatever you like in here. As standard, it creates a web host builder, tells it to use Kestrel, gives it the content root, which is where wwwroot lives, and integrates with IIS. Again, you don't have to do that. It tells it what the startup class is, and, by default, it will try and call UseApplicationInsights. Again, you don't have to do that; all of these can be taken out. And then it runs it. Because this is just code, you can do anything you like.
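An appsettings.json in that shape looks roughly like this; the `Logging` and `ConnectionStrings` sections follow the framework's conventions, while the `AppSettings` keys are my guesses at the demo's settings class:

```json
{
  "Logging": {
    "LogLevel": {
      "Default": "Warning"
    }
  },
  "ConnectionStrings": {
    "DefaultConnection": "Server=(localdb)\\mssqllocaldb;Database=Blog;Trusted_Connection=True;"
  },
  "AppSettings": {
    "Title": "My Blog",
    "PostsPerPage": 10,
    "TagElement": "mark",
    "MinimumPageLoadTime": 200
  }
}
```

Nothing here is mandatory structure; any section can be read individually with a colon-separated key like `AppSettings:PostsPerPage`.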
If you've passed in a certain command-line argument, you can tell it to not use IIS and run behind nginx, or you can do any config. Again, entirely up to you. What we do with this: there's a module that you can plug in that will take the command-line arguments and turn them into config values. So, you can pass your entire config in on the command line. If you're writing an application that you want to be deployed to end users and maybe run as a Windows Service, that might be useful for you. So, you can deal with your config in your service configuration. Into the startup file. And here we are telling the application to use appsettings.json. We are also saying, in addition to that, that if there is an app settings file that has a suffix which is the same as our environment, i.e. development, production, staging, QA, UAT, whatever you want to call it, we will take those settings, and we will merge them with the first settings. And, in this model, the last one wins. So, the way they want you to use this is to say these are my base settings, and in development mode, I want to override the following settings. So, if we look at appsettings.Development.json, we're overriding the logging, and we're saying actually, in development, log everything, output everything. You could also change the connection string, you can change the app settings, you can do whatever you want. If you've used Azure Web Hosting, or Azure Websites, rather, you might have noticed that you can add application settings in the Azure Portal, and that they will override the ones in web config. It's the same model here. You define your base settings and you override them. You can plug in other providers here, as well. If you wanted to use this JSON file as a base, and then have a per-user config file, or config provider, you could have that over the top, and those settings will take precedence.
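That layered, last-one-wins setup is the standard ASP.NET Core 1.x `Startup` constructor, which looks something like this:

```csharp
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.Configuration;

public class Startup
{
    public Startup(IHostingEnvironment env)
    {
        var builder = new ConfigurationBuilder()
            .SetBasePath(env.ContentRootPath)
            // Base settings first...
            .AddJsonFile("appsettings.json", optional: false, reloadOnChange: true)
            // ...then environment-specific overrides: last one wins.
            .AddJsonFile($"appsettings.{env.EnvironmentName}.json", optional: true)
            // ...and finally environment variables trump the files,
            // which is exactly how Azure Portal settings override yours.
            .AddEnvironmentVariables();

        Configuration = builder.Build();
    }

    public IConfigurationRoot Configuration { get; }
}
```

Each `Add...` call is a provider; adding a per-user or command-line provider after these would give its values precedence, as described above.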
If the settings aren't defined in the override file, then the ones in the base file win, so it's last one wins. You've got your configuration object, and that's basically a dictionary. So, I can do that, and I can get a particular key from it. If I want to get a nested object, I just use colons to map the path down, the same way you do with slashes in XPath for XML or XSLT. And that will just get me a string, and that's the sort of very basic abstraction that configuration gives you, and everything else is built on top of that. There are some things to make that easier. We've got our AppSettings object here, and we're telling it to configure AppSettings based on a section of configuration called AppSettings, and I've used nameof 'cause I don't want to hard-code that as a literal string. So, that will go off, and it will look into AppSettings, it will find that key. It will take each of these properties, and it will try and find an equivalent public property on this class, and it will then put that class into our dependency injection framework. From then on, I can inject that into anything, and I will always be able to get the current settings of the application. What does that look like? If we go to, for example, let's see where I've used it, not there, here we go. Our custom middleware. In our constructor, there is a tiny abstraction you have to use, annoyingly, called IOptions of type. You give it a parameter, IOptions of type AppSettings. Dependency injection will go off and get that for you, and you can now use that settings o