In principle great. But: every program should have one single purpose! This program does two things in one: 1) scan directories for media files and write the extracted metadata into a DB, and 2) serve as a DLNA server. It would be much better to separate the two tasks into two programs. Why? If you run it on a PC or tablet with lots of computational power, the scanning process is done in a matter of minutes, or up to an hour if you have a large library. But on a slow device with low memory, like a WLAN router, this can really take a while. In that situation it would be great if you could use a fast PC to perform the scan and then save the result into the SQLite DB (maybe even with some GUI, or at least some logging). Once the scan is done, the DB would simply be copied to the device and used there. Now one could say that you can already do this on a Linux PC and then copy the DB over to the device, but that's just a workaround for a poor design.

Another design issue is that the media information is written for each file individually (metadata.c). That is SLOW: every file causes one I/O for reading and another I/O for writing to the SQL DB. It would be much faster to queue the SQL INSERT/UPDATE statements and then run, say, 100 of them at once in a single transaction/batch. Most router vendors shipping minidlna therefore put the DB in temporary memory (RAM), which makes it fast but limits the number of files in the library, as they quickly run out of memory.

But still, the software does a pretty good job. I had a quick look at the source code; my god, terrible programming style. The DB connection is a global variable... The code is not exactly overloaded with comments, and of course there is absolutely no spaghetti code. :-) Even in C one can split a function up into several sub-functions.
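The batching idea above can be sketched as follows. This is just an illustration of the principle in Python (minidlna itself is written in C against the SQLite C API); the table layout, batch size, and function name are all hypothetical, not anything from the minidlna source:

```python
import sqlite3

BATCH_SIZE = 100  # hypothetical batch size; tune for the target device

def insert_metadata_batched(db_path, rows):
    """Insert (path, title) rows in batches, one transaction per batch,
    instead of committing once per file."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS media (path TEXT PRIMARY KEY, title TEXT)"
    )
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) >= BATCH_SIZE:
            with conn:  # one transaction for the whole batch
                conn.executemany(
                    "INSERT OR REPLACE INTO media VALUES (?, ?)", batch
                )
            batch.clear()
    if batch:  # flush whatever is left over
        with conn:
            conn.executemany("INSERT OR REPLACE INTO media VALUES (?, ?)", batch)
    return conn

# Example: 250 fake media entries end up in the DB via 3 transactions
# (100 + 100 + 50) instead of 250 individual commits.
conn = insert_metadata_batched(
    ":memory:", ((f"/media/{i}.mp3", f"Track {i}") for i in range(250))
)
print(conn.execute("SELECT COUNT(*) FROM media").fetchone()[0])
```

The win comes from SQLite syncing to disk once per transaction rather than once per row; grouping 100 inserts per transaction cuts the write I/O roughly a hundredfold.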