Hello,
Suppose I have access to a directory shared over HTTP (with directory listing enabled) — how could I compute the directory's total size, recursively?
Is there a tool or a script?
Thank you,
Mike
sum_bytes=0
# --spider checks the files without downloading them, -S prints the server's
# response headers (on stderr, hence the 2>&1); add -r to make wget recurse
# into subdirectories.
for content_length in $(wget --spider -S -q http://some-web-page/ 2>&1 | grep "Content-Length:" | grep -o "[0-9]\+")
do
    sum_bytes=$(($sum_bytes + $content_length))
done
# print the total in MiB
echo "scale=2; $sum_bytes/1024/1024" | bc
man wget wrote:
--ignore-length
Unfortunately, some HTTP servers (CGI programs, to be more precise) send out bogus "Content-Length" headers, which makes Wget go wild, as it thinks not all the document was retrieved. You can spot this syndrome if Wget retries getting the same document again and again, each time claiming that the (otherwise normal) connection has closed on the very same byte.
With this option, Wget will ignore the "Content-Length" header---as if it never existed.
[noneco@nyra2 ~]$ sum_bytes=0
[noneco@nyra2 ~]$ for content_length in $(wget --spider -S -q -r -l2 http://ftp.mandrivauser.de/magazin/ 2>&1 | grep Content-Length: | grep -o "[0-9]\+")
> do
> sum_bytes=$(($sum_bytes+$content_length))
> done
[noneco@nyra2 ~]$ echo "scale=2; $sum_bytes/1024/1024 " | bc
456.02
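The grep/loop/bc pipeline above can also be done in a single pass with awk, which matches the Content-Length lines, keeps a running sum, and formats the total in MiB itself. A sketch (the function name is mine; the commented-out invocation uses the URL and depth from the session above):

```shell
# Sum the Content-Length headers from `wget -S` output read on stdin
# and print the total in MiB.
spider_output_to_mib() {
    awk '/Content-Length:/ {sum += $2} END {printf "%.2f\n", sum/1024/1024}'
}

# Example invocation (needs network access):
# wget --spider -S -q -r -l2 http://ftp.mandrivauser.de/magazin/ 2>&1 | spider_output_to_mib
```

This avoids the shell loop and the extra bc process, but the result depends on the same Content-Length headers, so the --ignore-length caveat quoted above still applies.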
script.sh http://sample/
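A script.sh along those lines might look like the sketch below: it takes the URL as its first argument, spiders it recursively with wget, and prints the total in MiB. The helper function name and the default recursion depth of 2 are my assumptions, not something fixed by the thread.

```shell
#!/bin/sh
# Usage: script.sh <url>
# Spider <url> recursively and print the sum of the reported
# Content-Length headers in MiB.

# Read `wget -S` output on stdin and print the total number of bytes.
sum_content_lengths() {
    total=0
    for n in $(grep "Content-Length:" | grep -o "[0-9]\+"); do
        total=$((total + n))
    done
    echo "$total"
}

if [ -n "$1" ]; then
    # -l2 limits recursion depth to 2; raise it for deeper trees.
    bytes=$(wget --spider -S -q -r -l2 "$1" 2>&1 | sum_content_lengths)
    echo "scale=2; $bytes/1024/1024" | bc
fi
```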