Sites hosting open source projects provide an online viewer for browsing source code without checking it out with SVN/Git clients. Checking out an entire repository takes a long time, and with Git there is no straightforward way to check out only a particular directory: cloning downloads the entire repository to the local machine, and even with sparse checkout Git first fetches the whole repository. When bandwidth is a concern, checking out the entire repository is not practical.
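
For comparison, here is a rough sketch of the classic Git sparse-checkout recipe (the repository URL is a hypothetical placeholder). It restricts only the working tree; the initial clone still fetches the complete object database, which is why it does not help with bandwidth:

git clone http://example.com/project.git        # downloads the full repository
cd project
git config core.sparseCheckout true
echo src/main/java/ > .git/info/sparse-checkout
git read-tree -mu HEAD                          # materialises only the listed paths in the working tree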



One can use GNU Wget to recursively download files from such online code repositories. A Windows build can be downloaded from http://users.ugent.be/~bpuype/wget/



The command below recursively downloads a directory, its child directories, and all files inside them, excluding index.html. It will not download parent directories or files from external sites.



wget --cut-dirs=2 --level=15 --include-directories=src/main/java --recursive --no-parent --no-host-directories --reject=index.html <URL of the directory to download>
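
Here is what each option does:

- --recursive downloads linked pages and files recursively.
- --level=15 limits the recursion depth to 15 levels.
- --no-parent prevents wget from ascending into parent directories.
- --no-host-directories avoids creating a top-level directory named after the host.
- --include-directories=src/main/java restricts recursion to links under that directory.
- --cut-dirs=2 drops the first two path components (src/main) when saving files locally.
- --reject=index.html skips the auto-generated directory listing pages.

As a concrete sketch (the host and repository path below are hypothetical), downloading only src/main/java from a project browsed at http://svn.example.com/repo/trunk/src/main/java/ would look like:

wget --recursive --level=15 --no-parent --no-host-directories --include-directories=/repo/trunk/src/main/java --cut-dirs=4 --reject=index.html http://svn.example.com/repo/trunk/src/main/java/

With --cut-dirs=4 the leading repo/trunk/src/main components are stripped, so the downloaded files end up under a local java/ directory.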