This project was developed for a really old release of Synology's Disk Station Manager. It is no longer maintained and probably won't work on modern systems. Take a look at GitHub to find modern alternatives.
This documentation is kept online for historical reasons.
This tool provides access to the Synology Download Station on your Synology NAS device from the command line. You can add, remove, pause, resume, and list the download jobs.
The main motivation for developing this tool was the lack of a function to add multiple URLs in the web interface and the client from Synology, and the wish to control Download Station via SSH from remote locations without port forwarding. Moreover, there are people who like text-based interfaces better, so this is nice for those people, too.
Please note: If you do not care about controlling Download Station via SSH or prefer a GUI over the CLI, give SynoDownloader a try. Downloadstation CLI is mainly aimed at people with some experience with the CLI.
If you are interested now, you can download the program here.
Please keep in mind that this program does not support all features of Synology Download Station; only downloads via HTTP, FTP and BitTorrent are supported. Furthermore, I could only test it on my own Synology device (DS207+). So if you encounter any difficulties or unexpected behaviour, feel free to report them using the email address below.
It is possible to run the tool on a host other than the disk station. However, this requires modifying the configuration of the database running on the disk station. If there are people who are interested in doing this, I will write a tutorial on that topic in the future.
To use Downloadstation CLI, you will have to enable SSH or telnet on your DiskStation and install ipkg:
Enable SSH or telnet
Install ipkg
You need Python 2.4 and PostgreSQL installed to run the program. To do this, ssh (or telnet) into your DiskStation as root and type:
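The exact command is not preserved here; assuming the ipkg package names of that era (python24 and postgresql are guesses), it would have looked roughly like this:

```shell
# Package names are assumptions -- verify with "ipkg list | grep python".
ipkg install python24 postgresql
```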
After that, copy the file to your NAS. The location should be in the PATH (e.g. /opt/bin). Check that the file can be executed and then type
and hopefully, you will be greeted by interactive mode.
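Assuming the script file is named downloadstation and /opt/bin is the PATH directory you chose, the steps above amount to something like:

```shell
# Paths and file name are examples; adapt them to your setup.
cp downloadstation /opt/bin/
chmod +x /opt/bin/downloadstation
downloadstation        # no arguments -> interactive mode
```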
That is all you need to use most of the features of the script. The additional installation steps described below are optional and can be performed at a later time.
In newer firmware versions the download progress is only written to the database if a flag in a Unix shared memory segment is set. Since the script does not use the web interface, this flag is not set automatically.
The solution is a small C snippet based on the (GPL'd) source code of the Disk Station firmware. This program will set the required flag.
You need gcc installed to compile the program for your DS:
If gcc is installed properly and on the PATH, execute the following steps to compile the program and install:
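The original command sequence is not reproduced here; assuming the source file is named downloadstation_shm.c (a guess based on the binary name used below), compiling and installing would look roughly like:

```shell
# File name and install path are assumptions.
gcc -o downloadstation_shm downloadstation_shm.c
cp downloadstation_shm /opt/bin/
chmod +x /opt/bin/downloadstation_shm
```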
Alternatively, you can download an installation script that will execute those commands for you.
After installation a new command dls is available. This is a script that first calls downloadstation_shm to update the shared memory and then the downloadstation script. Use dls instead of downloadstation and your database will always hold up-to-date values for running downloads.
You can also use the downloadstation_shm program to manually read/write to the shared memory section:
If you want to use the script to extract zip and rar files it is necessary to install unzip and unrar and have them on the PATH.
For more information read the Download Packages section.
There are two methods to use the program. You can either type "downloadstation &lt;command&gt; &lt;options&gt; &lt;arguments&gt;" to perform a single operation or type "downloadstation &lt;options&gt;" to enter interactive mode. In this documentation we will focus on the former, as the commands in interactive mode work similarly. In the following text, each command is described by first providing one or more examples and then explaining what it does.
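For example, using list as a stand-in for any command:

```shell
downloadstation list    # single operation: runs one command, then exits
downloadstation         # no command given: enters interactive mode
```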
Prints a list of the current download jobs. The arguments are the properties you want displayed. When omitted, the defaults are task, filename, status.
Valid arguments are:
Same as list, but shows torrents only. Default displayed columns are task, filename, status, rate, upload_rate, connected_peers.
Accepts the same arguments as list does. Displays the table with download jobs and refreshes it every second.
Navigation is done with either the cursor keys or the VI movement keys (h, j, k, l). To exit, press q, Escape or Ctrl-C.
Same as monitor, but for torrents only.
Adds download jobs for the resources given as arguments.
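For instance (the URLs are placeholders):

```shell
downloadstation add http://example.com/a.iso http://example.com/b.iso
```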
It is also possible to parse URLs from text read from stdin. To do this, call add without arguments:
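The invocation presumably looked like this, reading someurls.txt from stdin:

```shell
downloadstation add < someurls.txt
```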
This will add all URLs occurring in someurls.txt.
But wait, there is more. Suppose you want to download some files listed on a web page, but the page also contains links to files you do not want. Using
would add every linked resource. To solve this, there is the option -e or --expression which lets you specify the URLs you want by a wildcard pattern:
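The two examples discussed below would look something like this (the exact rapidshare pattern is a guess):

```shell
downloadstation add -e '*.jpg' < somesite.html
downloadstation add --expression '*rapidshare*' < somesite.html
```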
The first example would add all URLs ending with .jpg, so you get all JPEG images embedded in somesite.html while the second would add all rapidshare links.
As of version 1.0 duplicates are filtered automatically when parsing URLs.
You can also use the -t (or --target) option to specify a subdirectory for the downloaded files. Note that this will result in an error if your version of the DS firmware does not support subdirectories for downloads. Using subdirectories in combination with Download Packages is not supported at the moment.
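A hypothetical example (the subdirectory name and URL are arbitrary):

```shell
downloadstation add -t movies http://example.com/movie.avi
```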
Creates download jobs from the supplied torrent files.
Removes all download jobs that are completed. Jobs that ended in an error are kept, so you can use this command without worrying about deleting something that still holds useful information.
Pauses specified downloads. You can specify the ids of the jobs you want to pause in the arguments. The keyword "all" pauses all incomplete download jobs. Useful if you have a rather large batch of jobs but need the bandwidth for the moment.
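For example (the ids are hypothetical; they are the values shown in the task column of list):

```shell
downloadstation pause 3 7    # pause jobs 3 and 7
downloadstation pause all    # pause every incomplete job
```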
Works the same as pause, but resumes paused jobs.
Works the same as pause, but restarts jobs that failed, i.e. that have a status of 101 (ERROR) or 107 (TIMEOUT).
Removes a download job. Same syntax as pause. Use with care!
| Short | Long | Meaning |
|---|---|---|
| -u | --user | Restrict all actions to jobs belonging to USER |
| -s | --host | Host of the Download Station database |
| -p | --port | Port of the Download Station database |
| -h | --help | Display help |
| | --version | Show version information |
As of version 1.5 it is possible to group a list of download tasks. This comes with two main features:
There are a few steps required before you can use this feature.
At the moment, the script assumes that your download directory is /volume1/downloads. If this is not the case, package management will not work unless you search the script for /volume1/downloads and replace it with the proper location for your system.
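If the script lives at /opt/bin/downloadstation (an assumption based on the install step above), the search-and-replace can be done with sed:

```shell
# Both paths are examples; adjust them to your install location and volume.
sed -i 's|/volume1/downloads|/volume2/downloads|g' /opt/bin/downloadstation
```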
For each package a file containing metadata will be created in a subfolder in your home directory (e.g. ~/.downloadstationcli/packages/mypackage). The file is removed when the package is cleaned.
Automatic extraction requires the appropriate binaries to be installed and on the PATH. Currently supported are rar and zip files. The required binaries are unrar and unzip. Install them with "ipkg install unrar unzip".
Packages are identified by their names. In the examples below, the package name "mypackage" is used.
When creating a package, files are subdivided into groups. Usually every file gets its own group. However, files of supported multi-part archives (currently *.partX.rar and *.rXX) are put into the same group. The grouping is necessary to avoid extracting a multi-part archive before all files are downloaded or calling unrar on each of the parts.
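The grouping idea can be sketched as follows (this is an illustration, not the script's actual code): strip the multi-part suffixes so that all parts of one archive map to the same group key.

```shell
# Map a filename to its group key: parts of a multi-part rar archive
# (*.partX.rar and *.rXX) all normalize to the same ".rar" name;
# anything else is its own group.
group_key() {
  printf '%s\n' "$1" \
    | sed -E -e 's/\.part[0-9]+\.rar$/.rar/' -e 's/\.r[0-9]{2}$/.rar/'
}

group_key "movie.part1.rar"   # -> movie.rar
group_key "movie.part2.rar"   # -> movie.rar (same group as part1)
group_key "archive.r00"       # -> archive.rar
group_key "photo.jpg"         # -> photo.jpg (its own group)
```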
The package functionality (especially the process and clean operations) is not intended to be used by different processes simultaneously. Doing so will have unexpected results. Don't do it!
This is a short description of the typical lifecycle of a package named "mypackage".
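The two list commands discussed below would look like this:

```shell
downloadstation pkg list              # first example: list all packages
downloadstation pkg list mypackage    # second example: tasks of mypackage
```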
The first example returns a list of all packages.
The second example does the same as "downloadstation list" but excludes all tasks that are not part of the given package.
Creates a new package with the specified name. URLs are read from stdin the same way as using "downloadstation add".
If the package contains supported encrypted files (currently rar/zip), the content of the last non-empty line is used as the password for extracting. Note that leading and trailing whitespace is stripped.
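A sketch of creating a package with a password on the last line (URLs and password are placeholders; any way of writing these lines to stdin works):

```shell
downloadstation pkg create mypackage <<'EOF'
http://example.com/archive.part1.rar
http://example.com/archive.part2.rar
secret
EOF
```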
Sets the password used to extract the files of a package. This is useful if the password was forgotten or mistyped during creation.
Pauses all download tasks that belong to the given package.
Resumes all download tasks that belong to the given package.
Processes all unprocessed files of the package. This means that a subfolder named after the package will be created (if it does not exist) and finished downloads are moved to the subfolder.
If the files are zip or rar archives, they will be extracted (to the subdirectory) instead of moved.
If you call process on a package that is still missing some files, the already downloaded files will be processed. If you call it again later, only files that were not processed before will be affected. You do not have to worry about files getting processed multiple times.
After a package is processed, it may be cleaned using this command. Cleaning will have the following effects:
This command does the same as clean, except it will remove a package even if it has not been processed yet. This means that if you use this command on an unprocessed package, it will delete the download tasks and delete the downloaded files. Use with care!
You should only use this command if you extracted files manually or moved/copied them somewhere else (or if you do not need them anymore).
A convenience command that first calls process and then clean on the specified package.
If you have multiple packages you may use the all keyword to apply an action to each of them. For instance, downloadstation pkg list all would list the download tasks of all packages.
It is especially convenient for processing and cleaning as you might forget how you named a package the day before. Note that it is always safe to call downloadstation pkg process all or downloadstation pkg clean all regardless of the state of the download. Files that are not downloaded will not be processed and files that are not processed will not be cleaned.
It would even be possible to fully automate processing of downloaded files by creating a cronjob that calls downloadstation pkg pac all every hour or so. (This would interfere with hibernation of the hard disks, though, so it is not recommended.)
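Such a crontab entry might look like this (the script path is an assumption, and again: not recommended if you care about disk hibernation):

```shell
# Hypothetical /etc/crontab line: run at minute 0 of every hour, as root.
0 * * * *   root   /opt/bin/downloadstation pkg pac all
```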
Because "all" is a keyword, trying to create a package named "all" will result in an error message.