In some sense, it is a UI convention with history that goes back all the way to 1984. Since Windows and X11 both postdate the original Mac GUI, one might say that Windows does it the Windows way "just to be different" rather than suggesting that the Mac is the oddball.

Back in the earliest days of the Macintosh, you could only run one application at a time. It was perfectly reasonable for an application to open with no windows because the application always had a visible menu bar at the top of the screen. When you closed all the windows of an application, it made sense to keep the application open because you could always use the menu bar to create a new document, or open an existing one. Exiting the process just because a window was closed didn't make any sense at the time, because there would have been no other process to yield focus to.

A few years on, the Macintosh of the late '80s advanced to the point where there was enough memory to have multiple applications open at once. Since the tools for doing this had to retain backwards compatibility with existing applications, they naturally weren't going to change the basic UI conventions and start killing applications that had no windows open. The result was a clean distinction in the UI between a visual GUI element (a window) and an abstract running process (the application).

Meanwhile, Microsoft had been developing Windows. By the early '90s, Microsoft had Windows 3.x working well, and Motif on X11 had been heavily inspired by Microsoft's work. While the Macintosh was built around presenting a UI of applications, Windows (as the name would suggest) was built around the philosophy that the window itself should be the fundamental unit of the UI, with the only concept of an application being in the form of MDI-style container windows. X11 also treated the application as largely unimportant from a UI standpoint. A single process could even open up windows on multiple displays connected to several machines across a (very new-fangled) local area network.

The trouble with the Windows-style approach was that some forms of user interaction, such as opening with just a menu bar, were impossible, and the user had no real guarantee that a process had actually exited when its windows were gone. A Macintosh user could easily switch to an application that was running without windows to quit it, or to use it, but Windows provided absolutely no way for the user to interact with such a process. (Except to notice it in the task manager, and kill it.) Also, a user couldn't choose to leave a process running so that they could get back to it without relaunching it, except by keeping some visible UI from the process cluttering up the screen and consuming (at the time, very limited) resources. While the Macintosh had an "Applications" menu for switching, Windows popularised a "task bar," which displayed all top-level windows without any regard for the process that had opened them. For heavy multitaskers, the "task bar soup" proved unwieldy. For more basic users, the unpredictability about what exactly qualified as a "top-level window" was sometimes confusing, as there was no learnable rule about exactly which windows would actually show up on the bar.

By the late '90s, Microsoft's GUI was the most commonly used. Most users had a Windows PC rather than a Macintosh or a UNIX X11 workstation. Consequently, as Linux grew in popularity over time, many developers were coming from a background of Windows UI conventions rather than UNIX UI conventions. That, combined with the history of early work like Motif drawing from Windows UI conventions, resulted in modern Linux desktop environments behaving much more like Windows than like classic X11 environments such as twm, or the Macintosh.

At this point, "classic" Mac OS had run its course with Mac OS 9, and the Macintosh became a Unix-powered machine with very different guts in the form of Mac OS X. With that, it inherited the NeXT UI concept of a Dock. The original NeXT machines used Display PostScript rather than X11, with a fairly distinctive set of widgets and UI conventions. Probably the most distinctive of them was the Dock, which was a sort of combination program launcher and task switcher. (The "multicolumn" open file dialog box known in OS X also came from NeXT, as well as some other visible things. The most significant changes in the OS X transition were all the invisible ones, though.) The Dock worked well with the Macintosh's concept of the application as the fundamental UI element: a user could see that an application was open by a mark on its Dock icon, and switch to it or launch it by clicking on it.

Since OS X now supported multitasking so much better than the classic Mac OS had, it suddenly made sense that a user might want all sorts of things running in the background: video conversion software cranking away, a screen recorder, VoIP software, Internet radio, a web server, something that speaks in response to a spoken command, and so on. None of that necessarily requires a visible window to still offer a sensible user experience; the menu bar was still separate from the windows at the top of the screen, and an application could have a menu directly on its Dock icon, so a user could always interact with a program that had no open windows. Consequently, ditching the existing convention of keeping an application open, just to be more like Windows, would have been seen by most Mac users as a horrible step in the wrong direction. It would make several modes of interaction impossible, with no real benefit.

Obviously, some users prefer the Windows convention, and neither is "provably correct." But migrating away from something that useful, without any good reason, would just make no sense. Hopefully, this tour through some of the history gives you a bit of context that you find useful.