wget --spider http://www.naver.com
--14:23:14--  http://www.naver.com/
           => `index.html'
Resolving www.naver.com... 222.122.195.6, 202.131.29.70
Connecting to www.naver.com|222.122.195.6|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
200 OK

The --spider option makes wget fetch and display only the response headers, without downloading the page body.

http://linux.die.net/man/1/wget
--spider
When invoked with this option, Wget will behave as a Web spider, which means that it will not download the pages, just check that they are there. For example, you can use Wget to check your bookmarks:
wget --spider --force-html -i bookmarks.html
This feature needs much more work for Wget to get close to the functionality of real web spiders.
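Building on the man page's bookmark-checking idea, the exit status of `wget --spider` (0 when the resource exists, non-zero otherwise) can be wrapped in a small helper for link checking. A minimal sketch; `check_url` is a hypothetical helper name, not part of wget itself:

```shell
#!/bin/sh
# check_url: succeed if the URL exists, without downloading the body.
# --spider makes wget only verify the resource; -q suppresses output.
# wget's exit status (0 = exists, non-zero = error) becomes ours.
check_url() {
    wget --spider -q "$1"
}

# Usage (commented out; requires network access):
# check_url http://www.naver.com && echo "alive" || echo "dead"
```

Looping this helper over a file of URLs gives a crude dead-link checker, which is roughly what the man page means by using wget on bookmarks.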

Posted by '김용환'