If you’re like me, you might not have SCOM, SCCM, or some other 3rd party solution for tracking all of the systems in your environment, something you can pull up at a glance to see what software is installed on a server or what local accounts have been created on it. Fortunately, with the combined forces of PowerShell and SQL, we can build our own solution that provides this information. This is where I came up with a project to roll my own server inventory system.

This will be at least a three-part series in which I will go over each aspect of my build, covering the following topics:

This article will focus on the SQL database and table build. I consider this almost a living script, as I have added and even removed a few things since its inception. I opted to create a table for each section that will be inventoried. Currently, the sections are (in no particular order):

AdminShares – Holds the admin shares of a server

Drives – Holds the different drives on a server such as CD-ROMs and hard drives

General – Holds information that doesn’t really apply to the other tables

Groups – Holds local group information on the server

Memory – Holds the memory information of a server

Network – Holds networking configuration information of a server

OperatingSystem – Holds operating system information

Processor – Holds information about the processors on a server

ScheduledTasks – Lists all scheduled tasks on a server

ServerRoles – Lists all server roles, if applicable, on a server

Services – Lists all services on a server

Software – Lists all installed software on a server

Updates – Lists all installed updates on a server

Users – Lists all local users on a server

UserShares – Lists all user created shares on a server

Of course, this could change over time depending on various requirements, but I am hoping that things mostly stay the same and that the only updates needed will be to the contents of each table, such as adding or removing properties. Adding a new table is as easy as copying the code from an existing table-creation region and changing the names and properties to include.
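To give a feel for what one of those table-creation regions might look like, here is a minimal sketch using a here-string of T-SQL passed to Invoke-SqlCmd. The database, table, and column names below are illustrative only and are not the script's exact schema; see the linked script for the real definitions.

```powershell
# Hypothetical sketch of a single table-creation region.
# Server, database, table, and column names are illustrative.
$SQLServer = 'vSQL'
$Database  = 'ServerInventory'

$CreateMemoryTable = @"
IF NOT EXISTS (SELECT * FROM sys.tables WHERE name = 'Memory')
CREATE TABLE dbo.Memory (
    Computername    NVARCHAR(128) NOT NULL,
    DeviceID        NVARCHAR(64),
    CapacityGB      DECIMAL(10,2),
    Speed           INT,
    InventoryDate   DATETIME DEFAULT GETDATE()
)
"@

# Invoke-SqlCmd here is the cmdlet from the SqlServer module;
# the blog script bundles its own equivalent.
Invoke-SqlCmd -ServerInstance $SQLServer -Database $Database -Query $CreateMemoryTable
```

Adding a new table is then just a matter of copying a region like this one and swapping in the new table name and column list.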

One last requirement that I had was that this data only needs to last until the next run of the scheduled task which collects the data and sends it to SQL. This means that whatever data I had prior to the run will be replaced by the new data. Probably not the most efficient way to handle the data, but there is always room for improvement with this project.
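The replace-on-each-run approach can be sketched as a simple clear-then-load pattern: each inventory table is emptied before the freshly collected rows are inserted. The table names and database below are illustrative, not the script's exact values.

```powershell
# Hypothetical sketch: wipe each inventory table before reloading it.
# Table list and database name are illustrative.
$Tables = 'Memory', 'Processor', 'Software'
ForEach ($Table in $Tables) {
    Invoke-SqlCmd -ServerInstance 'vSQL' -Database 'ServerInventory' `
        -Query "TRUNCATE TABLE dbo.$Table"
}
# ...then insert the new rows collected by the scheduled task
```

TRUNCATE keeps this cheap, though it does mean there is no history between runs, which is exactly the trade-off described above.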

The script that I wrote to build not only the database, but also the tables where I will push all of my data, can be found at the following link: https://github.com/proxb/ServerInventoryReport/blob/master/Invoke-ServerInventorySQLBuild.ps1

I have defaulted the SQL server name to vSQL, as that is my server in my lab environment, so feel free to update it as needed, or just call the script with the -SQLServer parameter and it will do the rest of the work for you! You can also see that I have my own version of Invoke-SQLCmd here. There are much better versions of this out in the wild, so feel free to rip this one out and replace it with a different one if you wish.

As for the column names and types, I tried to pick values that would ‘just work’. The sizes came after some trial and error: I ran into errors when sending the data to SQL because a value such as a display name was longer than what the column allowed. This might still be an issue for some of you depending on what you have running, but in that case you can just adjust the code and re-build the database.

Running the code is simple: just make sure that you point it at a valid SQL server and let it run!

.\Invoke-ServerInventorySQLBuild.ps1
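If your SQL server is not named vSQL, the -SQLServer parameter mentioned earlier can be supplied instead; the server name below is just an example.

```powershell
# Build the database and tables against a different SQL server
# (server name is illustrative).
.\Invoke-ServerInventorySQLBuild.ps1 -SQLServer 'SQL01'
```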

It shouldn’t take too long to finish, and after that you can use SQL Server Management Studio (or whatever else you want to use) to validate that the database and tables have been created.
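If you would rather stay in PowerShell than open SSMS, a quick catalog query can confirm the build; this assumes the database is named ServerInventory and that an Invoke-SqlCmd-style command is available.

```powershell
# List the tables that the build script created (database name assumed).
Invoke-SqlCmd -ServerInstance 'vSQL' -Database 'ServerInventory' `
    -Query 'SELECT name FROM sys.tables ORDER BY name'
```

You should see one row per inventory section listed above, such as Memory, Processor, and Software.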

Part 2 will take us through performing the queries against remote servers and sending the collected data up to the SQL server using a script that I wrote which leverages my PoshRSJobs module for added multithreading.