While downloading on 2022-08-26, I found that the SSL certificate of the Akamai CDN used by the video source had expired, so the file could not be downloaded from the AriaNG panel.
After some research, I found that certificate verification cannot be skipped from the AriaNG panel, which may be related to the RPC connection method. I then added check-certificate=false to the aria2c config and found that it still did not work.
This article introduces several workarounds (covering the aria2c and wget command lines) for files that cannot be downloaded from a remote server.
Aria2c command line
Note that if you use the aria2.sh script together with the AriaNG panel, aria2 is already running and serving RPC. If you start a second instance from the same default configuration file on the default port, it will report that the address is already in use.
The recommended workaround is to stop aria2 with the aria2.sh script first. The default configuration file of the aria2 installed by this script is in the /root/.aria2c directory.
Specify the configuration file
aria2c --conf-path="/root/.aria2c/aria2.conf"
Then I tried adding the flags to skip certificate checking, and found that it still had no effect:
aria2c --conf-path="/root/.aria2c/aria2.conf" --check-certificate=false --disable-ipv6
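The same settings can also be made permanent in the configuration file itself; a minimal sketch of the relevant aria2.conf lines (the path follows the aria2.sh script's install location, and both option names are standard aria2 options):

```
# /root/.aria2c/aria2.conf (excerpt)
check-certificate=false
disable-ipv6=true
```

With these in place, every download started through RPC inherits them, so the flags do not need to be repeated on the command line.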
Download command example
aria2c "http://www.baidu.com/test.mp4" -d "/root/downloads" -o "baidu.mp4" --check-certificate=false -x 16 -s 64 -c
Note that this command runs in the foreground by default, so unlike a backgrounded wget you cannot simply close the terminal while it runs.
- -d followed by the download directory
- -o followed by the output file name
- -x followed by the maximum number of connections per server (aria2 caps this at 16)
- -s followed by the number of pieces the download is split into
- --check-certificate=false skips certificate verification
- -c resumes the download from a breakpoint
For other parameters, please refer to the official documentation https://aria2.github.io/manual/en/html/aria2c.html
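Because aria2c stays in the foreground, one common workaround is to detach it with nohup so the terminal can be closed; a sketch (here `sleep 2` stands in for the real aria2c download command, which would otherwise need network access):

```shell
# Detach a long-running download so it survives closing the terminal.
# 'sleep 2' stands in for the aria2c command shown above; swap in the
# real command, e.g.: nohup aria2c "http://..." -d /root/downloads ... &
nohup sleep 2 > /tmp/aria2.log 2>&1 &
echo "started PID $!"
```

The trailing & puts the job in the background, and nohup keeps it alive after the shell exits; progress can then be followed in the log file.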
Recently I came across a network-disk video download link that could not be downloaded normally in AriaNG, while wget and IDM worked fine.
The link format is like https://ift.tt/Q4Syk0s start/xxx/x/xx/xxxx-720p.mp4
I then found that the IP at the start of the link is the server's IPv6 address, even though the ASN also carries IPv4.
Meanwhile, the AriaNG panel forcibly disables IPv6 (at least the build I use cannot enable it), which may cause an error because the authorized IP and the download IP do not match. In this case, consider disabling IPv6 on the server, or blocking IPv6 at the DNS level.
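On a typical Linux server, IPv6 can be checked and disabled through the standard kernel sysctl interface; a sketch (reading needs no privileges, writing requires root):

```shell
# 0 = IPv6 enabled, 1 = disabled. Reading requires no privileges.
if [ -r /proc/sys/net/ipv6/conf/all/disable_ipv6 ]; then
    cat /proc/sys/net/ipv6/conf/all/disable_ipv6
else
    echo "IPv6 sysctl not present on this system"
fi
# To disable IPv6 (requires root; add to /etc/sysctl.conf to persist):
# sysctl -w net.ipv6.conf.all.disable_ipv6=1
# sysctl -w net.ipv6.conf.default.disable_ipv6=1
```

This only toggles the kernel side; if you prefer the DNS-blocking approach, leave IPv6 on and prevent AAAA resolution instead.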
aria2c may then report the following error (downloading directly in AriaNG simply shows no progress):
[ERROR] CUID#7 - Download aborted. URI=https://ift.tt/Tn6DjuN
Exception: [AbstractCommand.cc:351] errorCode=22 URI=https://ift.tt/Tn6DjuN
-> [HttpSkipResponseCommand.cc:269] errorCode=22 The response status is not successful. status=403
However, after changing https to http and retrying a few times, the download sometimes succeeded. Of course, it is best to fix the IPv6 problem itself.
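The https-to-http retry can be scripted rather than done by hand; a minimal sketch (the URL is a placeholder, and the commented line shows where the real aria2c invocation would go):

```shell
# Rewrite the URL scheme from https to http for a fallback attempt.
url="https://example.com/video.mp4"
fallback=$(printf '%s\n' "$url" | sed 's|^https:|http:|')
echo "$fallback"
# aria2c "$url" --check-certificate=false || aria2c "$fallback"
```

The || chain only tries the http variant when the https attempt fails, which matches the manual retry described above.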
WGET command line
With the -b option, wget keeps downloading in the background, so you can close the terminal directly. Note that wget is single-threaded, while IDM and aria2 download over multiple connections; if the remote server throttles each connection or the network is poor, the single-threaded speed can be painfully slow.
Download command example
wget -O "/root/downloads/baidu.mp4" "http://www.baidu.com/test.mp4" --no-check-certificate -c -T 30 -t 5 -d --user-agent="Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/22.214.171.124 Safari/537.36"
- -O followed by the output file name (usually given as an absolute path)
- --no-check-certificate does not check certificates
- -c resumes the download from a breakpoint
- -T followed by the timeout in seconds
- -t followed by the number of retries
- -d, --debug print debug output
- --user-agent followed by the UA string
For other command parameters, refer to the list below.
Wget parameters
-h, --help print syntax help
-b, --background go to background after startup
-e, --execute=COMMAND execute a `.wgetrc'-style command; see /etc/wgetrc or ~/.wgetrc for the wgetrc format
* Logging and input files
-o, --output-file=FILE write log messages to FILE
-a, --append-output=FILE append log messages to FILE
-d, --debug print debug output
-q, --quiet quiet mode (no output)
-v, --verbose verbose mode (this is the default)
-nv, --non-verbose turn off verbose mode, but not quiet mode
-i, --input-file=FILE download the URLs found in FILE
-F, --force-html treat the input file as HTML
-B, --base=URL prepend URL to relative links in the file specified by -F -i
--sslcertfile=FILE optional client certificate
--sslcertkey=KEYFILE optional client certificate key file
--egd-file=FILE file name of the EGD socket
--bind-address=ADDRESS bind to the local address ADDRESS (hostname or IP, for hosts with multiple IPs or names)
-t, --tries=NUMBER set the maximum number of connection attempts (0 means unlimited)
-O, --output-document=FILE write the document to FILE
-nc, --no-clobber don't overwrite existing files or use .# suffixes
-c, --continue resume getting a partially-downloaded file
--progress=TYPE select the progress bar type
-N, --timestamping don't re-download a file unless it is newer than the local copy
-S, --server-response print server responses
--spider don't download anything
-T, --timeout=SECONDS set the read timeout to SECONDS
-w, --wait=SECONDS wait SECONDS between retrievals
--waitretry=SECONDS wait 1...SECONDS between retries of a retrieval
--random-wait wait 0...2*WAIT seconds between retrievals
-Y, --proxy=on/off turn proxy on or off
-Q, --quota=NUMBER set the download quota to NUMBER
--limit-rate=RATE limit the download rate to RATE
* Directories
-nd, --no-directories don't create directories
-x, --force-directories force creation of directories
-nH, --no-host-directories don't create host directories
-P, --directory-prefix=PREFIX save files to PREFIX/...
--cut-dirs=NUMBER ignore NUMBER remote directory components
* HTTP options
--http-user=USER set the HTTP user name to USER
--http-passwd=PASS set the HTTP password to PASS
-C, --cache=on/off allow/disallow server-side data caching (normally allowed)
-E, --html-extension save all text/html documents with an .html extension
--ignore-length ignore the `Content-Length' header field
--header=STRING insert STRING among the request headers
--proxy-user=USER set the proxy user name to USER
--proxy-passwd=PASS set the proxy password to PASS
--referer=URL include a `Referer: URL' header in the HTTP request
-s, --save-headers save the HTTP headers to the file
-U, --user-agent=AGENT identify as AGENT instead of Wget/VERSION
--no-http-keep-alive disable HTTP keep-alive (persistent connections)
--load-cookies=FILE load cookies from FILE before the session
--save-cookies=FILE save cookies to FILE after the session
* FTP options
-nr, --dont-remove-listing don't remove `.listing' files
-g, --glob=on/off turn file-name globbing on or off
--passive-ftp use passive transfer mode (the default)
--active-ftp use active transfer mode
--retr-symlinks when recursing, retrieve linked-to files (not directories)
* Recursive download
-r, --recursive recursive download (use with care!)
-l, --level=NUMBER maximum recursion depth (inf or 0 for infinite)
--delete-after delete local files after downloading them
-k, --convert-links convert non-relative links to relative ones
-K, --backup-converted before converting file X, back it up as X.orig
-m, --mirror shortcut for -r -N -l inf -nr
-p, --page-requisites download all files needed to display the HTML page (images, etc.)
* Accept/reject in recursive download
-A, --accept=LIST comma-separated list of accepted extensions
-R, --reject=LIST comma-separated list of rejected extensions
-D, --domains=LIST comma-separated list of accepted domains
--exclude-domains=LIST comma-separated list of rejected domains
--follow-ftp follow FTP links from HTML documents
--follow-tags=LIST comma-separated list of HTML tags to follow
-G, --ignore-tags=LIST comma-separated list of HTML tags to ignore
-H, --span-hosts go to foreign hosts when recursing
-L, --relative follow relative links only
-I, --include-directories=LIST list of allowed directories
-X, --exclude-directories=LIST list of excluded directories
-np, --no-parent don't ascend to the parent directory
wget -S --spider URL downloads nothing and only shows the request process and server response.
If a username and password are required, use:
- --http-user=USER set the HTTP user
- --http-passwd=PASS set the HTTP password
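Put together, an authenticated download might look like this (URL and credentials are placeholders; the command is echoed here instead of executed, since the server does not exist):

```shell
# Hypothetical server and credentials -- substitute your own.
USER="myuser"
PASS="mypass"
echo wget --http-user="$USER" --http-passwd="$PASS" -c "https://example.com/protected/file.mp4"
```

Removing the echo runs the download for real; -c keeps it resumable as described above.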
If you need an HTTP proxy, first create a .wgetrc file in the current user's home directory and set the HTTP and FTP proxy servers in it. If the proxy server requires a username and password, use:
- --proxy-user=USER set the proxy user
- --proxy-passwd=PASS set the proxy password
Finally, use --proxy=on/off to enable or disable the proxy.
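A sketch of the .wgetrc settings described above (the proxy host is a placeholder; http_proxy, ftp_proxy, proxy_user, and proxy_password are standard wgetrc settings):

```
# ~/.wgetrc (excerpt)
http_proxy = http://proxy.example.com:8080/
ftp_proxy = http://proxy.example.com:8080/
# Credentials, if the proxy requires them:
# proxy_user = myuser
# proxy_password = mypass
```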
This article is reprinted from https://www.blueskyxn.com/202209/6510.html
This site republishes it for archiving only; the copyright belongs to the original author.