2013-05-22 13:46:52 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.continuent.com/).
2013-05-22 13:46:52 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-22 13:46:52 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 47
2013-05-22 13:46:52 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 34
2013-05-22 13:46:52 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (7) is25
2013-05-22 13:46:52 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 7 is dead, restarting it.
2013-05-22 13:46:52 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 7 started.
2013-05-22 13:46:52 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-22 13:46:52 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :8
2013-05-22 13:46:57 WARN  - [com.wsc.crawler.grabber.Grabber]:344 - 7 - error: org.apache.http.conn.ConnectionPoolTimeoutException: Timeout waiting for connection from pool
2013-05-22 13:47:02 DEBUG - [com.wsc.crawler.grabber.Grabber]:189 - Thread Count is less than numthreads In Thread :1
2013-05-22 13:47:02 INFO  - [com.wsc.crawler.grabber.Grabber]:222 - Queue is empty in for loop
2013-05-22 13:47:02 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-22 13:47:02 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-22 13:47:02 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-22 13:47:02 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-22 13:47:02 INFO  - [com.wsc.crawler.grabber.Grabber]:67 - Queue size in Grabber is 9
2013-05-22 13:47:02 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (0) is27
2013-05-22 13:47:02 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 0 is dead, restarting it.
2013-05-22 13:47:02 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 0 started.
2013-05-22 13:47:02 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :8
2013-05-22 13:47:02 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.google.co.in/, http://74.125.236.88/, null]
2013-05-22 13:47:02 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: PREF][value: ID=ddff6c6cd441a66b:FF=0:TM=1369244822:LM=1369244822:S=okc5e3rdadxZewJr][domain: .google.co.in][path: /][expiry: Fri May 22 13:47:02 EDT 2015]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.88"
2013-05-22 13:47:02 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: NID][value: 67=EKm2xCdOBmd-pBhTtPxL0dGWLSdd75x6oSpTx_BzWiKiQhE6FV6J28ICXtVw_jalVlPY6-Uj_dgG0OVbFsSiMMtYJXTr7awl5_juzI7Ocn1Q-NkbjdhMNurkPw5nujnl][domain: .google.co.in][path: /][expiry: Thu Nov 21 12:47:02 EST 2013]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.88"
2013-05-22 13:47:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.google.co.in/).
2013-05-22 13:47:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-22 13:47:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-22 13:47:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-22 13:47:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-22 13:47:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.google.co.in/).
2013-05-22 13:47:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-22 13:47:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.google.co.in/).
2013-05-22 13:47:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-22 13:47:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-22 13:47:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-22 13:47:02 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (1) is28
2013-05-22 13:47:02 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 1 is dead, restarting it.
2013-05-22 13:47:02 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 1 started.
2013-05-22 13:47:02 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :8
2013-05-22 13:47:02 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html, http://137.254.60.11/doc/refman/5.5/en/replication-options-slave.html, null]
2013-05-22 13:47:03 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: MySQL_S][value: g2qsjbfr8ul4kk86mudefgak916mhp0v][domain: mysql.com][path: /][expiry: null]". Illegal domain attribute "mysql.com". Domain of origin: "137.254.60.11"
2013-05-22 13:47:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html).
2013-05-22 13:47:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html).
2013-05-22 13:47:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html) is 325
2013-05-22 13:47:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html).
2013-05-22 13:47:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html) is 325
2013-05-22 13:47:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html).
2013-05-22 13:47:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html) is 0
2013-05-22 13:47:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html).
2013-05-22 13:47:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html) is 0
2013-05-22 13:47:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html) is 325
2013-05-22 13:47:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html) is 95
2013-05-22 13:47:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (2) is29
2013-05-22 13:47:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 2 is dead, restarting it.
2013-05-22 13:47:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 2 started.
2013-05-22 13:47:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :8
2013-05-22 13:47:06 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-22 13:47:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.pythian.com/).
2013-05-22 13:47:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-22 13:47:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-22 13:47:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-22 13:47:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-22 13:47:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-22 13:47:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-22 13:47:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-22 13:47:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-22 13:47:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.pythian.com/).
2013-05-22 13:47:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.pythian.com/) is 0
2013-05-22 13:47:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 132
2013-05-22 13:47:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 90
2013-05-22 13:47:08 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (3) is30
2013-05-22 13:47:08 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 3 is dead, restarting it.
2013-05-22 13:47:08 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 3 started.
2013-05-22 13:47:08 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :8
2013-05-22 13:47:08 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-22 13:47:13 WARN  - [com.wsc.crawler.grabber.Grabber]:344 - 3 - error: org.apache.http.conn.ConnectionPoolTimeoutException: Timeout waiting for connection from pool
2013-05-22 13:47:13 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (4) is31
2013-05-22 13:47:13 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 4 is dead, restarting it.
2013-05-22 13:47:13 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 4 started.
2013-05-22 13:47:13 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-22 13:47:13 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :8
2013-05-22 13:47:14 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: SESS217349ac89be5468506443781e6d5f6c][value: 093c69ce82475df5ad2b0b2fdc0d04c7][domain: .palominodb.com][path: /][expiry: Fri Jun 14 17:20:36 EDT 2013]". Illegal domain attribute "palominodb.com". Domain of origin: "173.236.53.234"
2013-05-22 13:47:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://palominodb.com/).
2013-05-22 13:47:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-22 13:47:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 132
2013-05-22 13:47:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-22 13:47:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 132
2013-05-22 13:47:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://palominodb.com/).
2013-05-22 13:47:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-22 13:47:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://palominodb.com/).
2013-05-22 13:47:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-22 13:47:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 132
2013-05-22 13:47:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 94
2013-05-22 13:47:15 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (5) is32
2013-05-22 13:47:15 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 5 is dead, restarting it.
2013-05-22 13:47:15 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 5 started.
2013-05-22 13:47:15 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-22 13:47:15 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :8
2013-05-22 13:47:16 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://effectivemysql.com/).
2013-05-22 13:47:16 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-22 13:47:16 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-22 13:47:16 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-22 13:47:16 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-22 13:47:16 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://effectivemysql.com/).
2013-05-22 13:47:16 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-22 13:47:16 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://effectivemysql.com/).
2013-05-22 13:47:16 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-22 13:47:16 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 102
2013-05-22 13:47:16 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 87
2013-05-22 13:47:16 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (6) is33
2013-05-22 13:47:16 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 6 is dead, restarting it.
2013-05-22 13:47:16 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 6 started.
2013-05-22 13:47:16 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-22 13:47:16 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :8
2013-05-22 13:47:19 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.continuent.com/).
2013-05-22 13:47:19 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-22 13:47:19 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 47
2013-05-22 13:47:19 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-22 13:47:19 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 47
2013-05-22 13:47:19 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.continuent.com/).
2013-05-22 13:47:19 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-22 13:47:19 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.continuent.com/).
2013-05-22 13:47:19 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-22 13:47:19 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 47
2013-05-22 13:47:19 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 34
2013-05-22 13:47:19 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (7) is34
2013-05-22 13:47:19 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 7 is dead, restarting it.
2013-05-22 13:47:19 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 7 started.
2013-05-22 13:47:19 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-22 13:47:19 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :8
2013-05-22 13:47:24 WARN  - [com.wsc.crawler.grabber.Grabber]:344 - 7 - error: org.apache.http.conn.ConnectionPoolTimeoutException: Timeout waiting for connection from pool
2013-05-22 13:51:48 DEBUG - [com.wsc.crawler.init.Initializer]:42 - Intializing crawler resources...
2013-05-22 13:51:48 INFO  - [com.wsc.crawler.init.Initializer]:93 - previous instance of crawler is forcebly stopped
2013-05-22 13:51:48 INFO  - [com.wsc.crawler.init.Initializer]:94 - Getting URLS from prev_frontier.xml from temp directory.
2013-05-22 13:51:48 DEBUG - [com.wsc.crawler.init.Initializer]:198 - Checking (prev_frontier.xml) existence in (./temp) directory
2013-05-22 13:51:48 WARN  - [com.wsc.crawler.init.Initializer]:207 - prev_frontier.xml is not found in ./temp
2013-05-22 13:51:48 INFO  - [com.wsc.crawler.init.Initializer]:209 - Trying get URLs from Default URL Source, Frontier Server
2013-05-22 13:51:48 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-22 13:51:48 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-22 13:51:48 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-22 13:51:48 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-22 13:51:48 DEBUG - [com.wsc.crawler.init.Initializer]:176 - Frontier Server returned a Frontier of size 9
2013-05-22 13:51:52 INFO  - [com.wsc.crawler.grabber.Grabber]:119 - number of resolved hosts are :8
2013-05-22 13:51:52 INFO  - [com.wsc.crawler.grabber.Grabber]:121 - number of Unresolved hosts are :1
2013-05-22 13:51:52 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is ::[http://www.google.co.in/, http://74.125.236.87/, null]
2013-05-22 13:51:52 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is ::[http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html, http://137.254.60.11/doc/refman/5.5/en/replication-options-slave.html, null]
2013-05-22 13:51:52 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is ::[http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-22 13:51:52 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is ::[http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-22 13:51:52 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is ::[http://palominodb.com/, http://173.236.53.234/, null]
2013-05-22 13:51:52 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is ::[http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-22 13:51:52 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is ::[http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-22 13:51:52 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is ::[http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-22 13:51:52 DEBUG - [com.wsc.crawler.grabber.Grabber]:165 - Threads length  is : 8
2013-05-22 13:51:52 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html, http://137.254.60.11/doc/refman/5.5/en/replication-options-slave.html, null]
2013-05-22 13:51:52 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.google.co.in/, http://74.125.236.87/, null]
2013-05-22 13:51:52 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-22 13:51:52 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-22 13:51:52 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-22 13:51:52 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-22 13:51:52 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-22 13:51:52 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-22 13:51:52 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: PREF][value: ID=d3e3cd6aae9f2727:FF=0:TM=1369245112:LM=1369245112:S=MlHb8d8_e4IJBJdY][domain: .google.co.in][path: /][expiry: Fri May 22 13:51:52 EDT 2015]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.87"
2013-05-22 13:51:52 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: NID][value: 67=Wr1TLPkOHccE9vKzzd_WpM5RFUHtHs9PqIvyFkY-Wh0Y8NAA6BlwiC-JIEYedd7EphQ0aEfLIKRafExB_rEh9k1ugi1yPqSYApSYy6FSsj7DvivJHV-dMuwhykJtMAy6][domain: .google.co.in][path: /][expiry: Thu Nov 21 12:51:52 EST 2013]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.87"
2013-05-22 13:51:52 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.google.co.in/).
2013-05-22 13:51:52 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-22 13:51:52 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-22 13:51:52 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-22 13:51:52 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-22 13:51:52 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.google.co.in/).
2013-05-22 13:51:52 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-22 13:51:52 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.google.co.in/).
2013-05-22 13:51:52 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-22 13:51:52 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-22 13:51:52 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-22 13:51:52 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-22 13:51:52 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-22 13:51:53 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: MySQL_S][value: na8nesc3a0guq9h2uq56h2rqvde5dm9e][domain: mysql.com][path: /][expiry: null]". Illegal domain attribute "mysql.com". Domain of origin: "137.254.60.11"
2013-05-22 13:51:53 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: SESS217349ac89be5468506443781e6d5f6c][value: 38f6cbb463ad200b8f8b47cba1b94062][domain: .palominodb.com][path: /][expiry: Fri Jun 14 17:25:15 EDT 2013]". Illegal domain attribute "palominodb.com". Domain of origin: "173.236.53.234"
2013-05-22 13:51:54 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://effectivemysql.com/).
2013-05-22 13:51:54 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-22 13:51:54 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-22 13:51:54 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-22 13:51:54 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-22 13:51:54 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://effectivemysql.com/).
2013-05-22 13:51:54 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-22 13:51:54 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://effectivemysql.com/).
2013-05-22 13:51:54 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-22 13:51:54 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 102
2013-05-22 13:51:54 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 87
2013-05-22 13:51:54 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-22 13:51:54 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-22 13:51:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://palominodb.com/).
2013-05-22 13:51:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-22 13:51:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 132
2013-05-22 13:51:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-22 13:51:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 132
2013-05-22 13:51:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://palominodb.com/).
2013-05-22 13:51:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-22 13:51:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://palominodb.com/).
2013-05-22 13:51:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-22 13:51:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 132
2013-05-22 13:51:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 94
2013-05-22 13:51:55 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-22 13:51:55 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-22 13:51:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.pythian.com/).
2013-05-22 13:51:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-22 13:51:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-22 13:51:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-22 13:51:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-22 13:51:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-22 13:51:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-22 13:51:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-22 13:51:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-22 13:51:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.pythian.com/).
2013-05-22 13:51:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.pythian.com/) is 0
2013-05-22 13:51:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 132
2013-05-22 13:51:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 90
2013-05-22 13:51:55 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-22 13:51:55 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-22 13:51:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.continuent.com/).
2013-05-22 13:51:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-22 13:51:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 47
2013-05-22 13:51:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-22 13:51:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 47
2013-05-22 13:51:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.continuent.com/).
2013-05-22 13:51:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-22 13:51:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.continuent.com/).
2013-05-22 13:51:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-22 13:51:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 47
2013-05-22 13:51:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 34
2013-05-22 13:51:56 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-22 13:51:56 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-22 13:51:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html).
2013-05-22 13:51:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html).
2013-05-22 13:51:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html) is 325
2013-05-22 13:51:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html).
2013-05-22 13:51:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html) is 325
2013-05-22 13:51:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html).
2013-05-22 13:51:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html) is 0
2013-05-22 13:51:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html).
2013-05-22 13:51:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html) is 0
2013-05-22 13:51:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html) is 325
2013-05-22 13:51:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html) is 95
2013-05-22 13:51:57 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-22 13:51:57 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-22 13:52:02 INFO  - [com.wsc.crawler.grabber.Grabber]:237 - Queue is Empty...!
2013-05-22 13:52:02 INFO  - [com.wsc.crawler.grabber.Grabber]:238 - Refilling Queue...!
2013-05-22 13:52:02 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-22 13:52:02 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-22 13:52:02 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-22 13:52:02 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-22 13:52:02 INFO  - [com.wsc.crawler.grabber.Grabber]:67 - Queue size in Grabber is 9
2013-05-22 13:52:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:189 - Thread Count is less than numthreads In Thread :1
2013-05-22 13:52:12 INFO  - [com.wsc.crawler.grabber.Grabber]:222 - Queue is empty in for loop
2013-05-22 13:52:12 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-22 13:52:12 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-22 13:52:12 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-22 13:52:12 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-22 13:52:12 INFO  - [com.wsc.crawler.grabber.Grabber]:67 - Queue size in Grabber is 9
2013-05-22 13:52:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (0) is 9
2013-05-22 13:52:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 0 is dead, restarting it.
2013-05-22 13:52:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 0 started.
2013-05-22 13:52:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :8
2013-05-22 13:52:12 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.google.co.in/, http://74.125.236.87/, null]
2013-05-22 13:52:12 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: PREF][value: ID=4b7a04a2e28d6b80:FF=0:TM=1369245132:LM=1369245132:S=YlLb83rMwluj48qa][domain: .google.co.in][path: /][expiry: Fri May 22 13:52:12 EDT 2015]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.87"
2013-05-22 13:52:12 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: NID][value: 67=s8xGp8_hdLk3YRQLkUMZY9nJA2m_6sbqe7ZFsP9TWR9tZ-4JkPAyEi2lAsb_TDW1WO9z91hj222Ec6idqhjqH8_gWVp-vHqSCBPmqvJyd5uL-j9FBWKGYa0CcVl9X-Gp][domain: .google.co.in][path: /][expiry: Thu Nov 21 12:52:12 EST 2013]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.87"
2013-05-22 13:52:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://www.google.co.in/).
2013-05-22 13:52:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-22 13:52:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-22 13:52:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-22 13:52:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-22 13:52:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.google.co.in/).
2013-05-22 13:52:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-22 13:52:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.google.co.in/).
2013-05-22 13:52:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-22 13:52:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-22 13:52:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-22 13:52:12 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-22 13:52:12 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-22 13:52:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (1) is 10
2013-05-22 13:52:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 1 is dead, restarting it.
2013-05-22 13:52:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 1 started.
2013-05-22 13:52:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :8
2013-05-22 13:52:12 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html, http://137.254.60.11/doc/refman/5.5/en/replication-options-slave.html, null]
2013-05-22 13:52:12 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: MySQL_S][value: rdh35hcng1pnbocj0qgkkb5d6jkfpv6g][domain: mysql.com][path: /][expiry: null]". Illegal domain attribute "mysql.com". Domain of origin: "137.254.60.11"
2013-05-22 14:01:29 DEBUG - [com.wsc.crawler.init.Initializer]:42 - Initializing crawler resources...
2013-05-22 14:01:29 INFO  - [com.wsc.crawler.init.Initializer]:93 - previous instance of crawler was forcibly stopped
2013-05-22 14:01:29 INFO  - [com.wsc.crawler.init.Initializer]:94 - Getting URLS from prev_frontier.xml from temp directory.
2013-05-22 14:01:29 DEBUG - [com.wsc.crawler.init.Initializer]:198 - Checking (prev_frontier.xml) existence in (./temp) directory
2013-05-22 14:01:29 WARN  - [com.wsc.crawler.init.Initializer]:207 - prev_frontier.xml is not found in ./temp
2013-05-22 14:01:29 INFO  - [com.wsc.crawler.init.Initializer]:209 - Trying to get URLs from Default URL Source, Frontier Server
2013-05-22 14:01:29 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-22 14:01:29 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-22 14:01:29 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-22 14:01:30 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-22 14:01:30 DEBUG - [com.wsc.crawler.init.Initializer]:176 - Frontier Server returned a Frontier of size 9
2013-05-22 14:01:32 INFO  - [com.wsc.crawler.grabber.Grabber]:119 - number of resolved hosts are :8
2013-05-22 14:01:32 INFO  - [com.wsc.crawler.grabber.Grabber]:121 - number of Unresolved hosts are :1
2013-05-22 14:01:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is ::[http://www.google.co.in/, http://74.125.236.87/, null]
2013-05-22 14:01:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is ::[http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html, http://137.254.60.11/doc/refman/5.5/en/replication-options-slave.html, null]
2013-05-22 14:01:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is ::[http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-22 14:01:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is ::[http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-22 14:01:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is ::[http://palominodb.com/, http://173.236.53.234/, null]
2013-05-22 14:01:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is ::[http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-22 14:01:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is ::[http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-22 14:01:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is ::[http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-22 14:01:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:165 - Threads length  is : 8
2013-05-22 14:01:32 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.google.co.in/, http://74.125.236.87/, null]
2013-05-22 14:01:32 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html, http://137.254.60.11/doc/refman/5.5/en/replication-options-slave.html, null]
2013-05-22 14:01:32 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-22 14:01:32 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-22 14:01:32 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-22 14:01:32 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-22 14:01:32 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-22 14:01:32 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-22 14:01:32 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: PREF][value: ID=02c23e15b63135a1:FF=0:TM=1369245692:LM=1369245692:S=JNtfdQhZhZeReC3j][domain: .google.co.in][path: /][expiry: Fri May 22 14:01:32 EDT 2015]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.87"
2013-05-22 14:01:32 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: NID][value: 67=ITvkd0Ap6vax4uAEKSh2ox8QYJclB3IHBt0Yc1lJ2ilFd7jLdAGx9OfI3IQOShICNGNzkezyKSstds84wv54YlSfCjYQexBStnx0G3I2E_nz-S9vAPXJpUeTV8A-ejet][domain: .google.co.in][path: /][expiry: Thu Nov 21 13:01:32 EST 2013]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.87"
2013-05-22 14:01:32 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://www.google.co.in/).
2013-05-22 14:01:32 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-22 14:01:32 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-22 14:01:32 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-22 14:01:32 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-22 14:01:32 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.google.co.in/).
2013-05-22 14:01:32 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-22 14:01:32 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.google.co.in/).
2013-05-22 14:01:32 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-22 14:01:32 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-22 14:01:32 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-22 14:01:32 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-22 14:01:32 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-22 14:01:32 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-22 14:01:32 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-22 14:01:32 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-22 14:01:32 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-22 14:01:32 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing POST connection after posting URLs
2013-05-22 14:01:32 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: MySQL_S][value: alags57c2uqtvj3vjg76onq6iu2o2jt8][domain: mysql.com][path: /][expiry: null]". Illegal domain attribute "mysql.com". Domain of origin: "137.254.60.11"
2013-05-22 14:01:33 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: SESS217349ac89be5468506443781e6d5f6c][value: 0a6dabdec51067606e17519bd73dd533][domain: .palominodb.com][path: /][expiry: Fri Jun 14 17:34:56 EDT 2013]". Illegal domain attribute "palominodb.com". Domain of origin: "173.236.53.234"
2013-05-22 14:01:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://www.pythian.com/).
2013-05-22 14:01:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-22 14:01:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-22 14:01:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-22 14:01:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-22 14:01:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-22 14:01:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-22 14:01:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-22 14:01:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-22 14:01:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.pythian.com/).
2013-05-22 14:01:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.pythian.com/) is 0
2013-05-22 14:01:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 132
2013-05-22 14:01:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 90
2013-05-22 14:01:33 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-22 14:01:33 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-22 14:01:33 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-22 14:01:33 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-22 14:01:33 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-22 14:01:33 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-22 14:01:33 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing POST connection after posting URLs
2013-05-22 14:01:34 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://www.continuent.com/).
2013-05-22 14:01:34 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-22 14:01:34 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 47
2013-05-22 14:01:34 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-22 14:01:34 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 47
2013-05-22 14:01:34 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.continuent.com/).
2013-05-22 14:01:34 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-22 14:01:34 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.continuent.com/).
2013-05-22 14:01:34 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-22 14:01:34 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 47
2013-05-22 14:01:34 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 34
2013-05-22 14:01:34 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-22 14:01:34 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-22 14:01:34 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-22 14:01:34 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-22 14:01:34 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-22 14:01:34 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-22 14:01:34 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing POST connection after posting URLs
2013-05-22 14:01:34 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://effectivemysql.com/).
2013-05-22 14:01:34 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-22 14:01:34 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-22 14:01:34 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-22 14:01:34 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-22 14:01:34 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://effectivemysql.com/).
2013-05-22 14:01:34 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-22 14:01:34 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://effectivemysql.com/).
2013-05-22 14:01:34 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-22 14:01:34 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 102
2013-05-22 14:01:34 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 87
2013-05-22 14:01:34 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-22 14:01:34 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-22 14:01:34 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-22 14:01:34 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-22 14:01:34 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-22 14:01:34 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-22 14:01:34 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing POST connection after posting URLs
2013-05-22 14:01:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://palominodb.com/).
2013-05-22 14:01:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-22 14:01:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 132
2013-05-22 14:01:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-22 14:01:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 132
2013-05-22 14:01:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://palominodb.com/).
2013-05-22 14:01:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-22 14:01:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://palominodb.com/).
2013-05-22 14:01:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-22 14:01:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 132
2013-05-22 14:01:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 94
2013-05-22 14:01:35 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-22 14:01:35 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-22 14:01:35 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-22 14:01:35 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-22 14:01:35 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-22 14:01:35 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-22 14:01:35 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing POST connection after posting URLs
2013-05-22 14:01:37 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html).
2013-05-22 14:01:37 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html).
2013-05-22 14:01:37 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html) is 325
2013-05-22 14:01:37 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html).
2013-05-22 14:01:37 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html) is 325
2013-05-22 14:01:37 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html).
2013-05-22 14:01:37 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html) is 0
2013-05-22 14:01:37 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html).
2013-05-22 14:01:37 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html) is 0
2013-05-22 14:01:37 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html) is 325
2013-05-22 14:01:37 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html) is 95
2013-05-22 14:01:37 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-22 14:01:37 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-22 14:01:37 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-22 14:01:37 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-22 14:01:37 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-22 14:01:37 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-22 14:01:37 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing POST connection after posting URLs
2013-05-22 14:01:42 INFO  - [com.wsc.crawler.grabber.Grabber]:237 - Queue is Empty...!
2013-05-22 14:01:42 INFO  - [com.wsc.crawler.grabber.Grabber]:238 - Refilling Queue...!
2013-05-22 14:01:42 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-22 14:01:42 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-22 14:01:42 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-22 14:01:42 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-22 14:01:42 INFO  - [com.wsc.crawler.grabber.Grabber]:67 - Queue size in Grabber is 9
2013-05-22 14:01:52 DEBUG - [com.wsc.crawler.grabber.Grabber]:189 - Thread Count is less than numthreads In Thread :1
2013-05-22 14:01:52 INFO  - [com.wsc.crawler.grabber.Grabber]:222 - Queue is empty in for loop
2013-05-22 14:01:52 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-22 14:01:52 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-22 14:01:52 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-22 14:01:52 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-22 14:01:52 INFO  - [com.wsc.crawler.grabber.Grabber]:67 - Queue size in Grabber is 9
2013-05-22 14:01:52 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (0) is 9
2013-05-22 14:01:52 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 0 is dead, restarting it.
2013-05-22 14:01:52 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 0 started.
2013-05-22 14:01:52 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :8
2013-05-22 14:01:52 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.google.co.in/, http://74.125.236.87/, null]
2013-05-22 14:01:52 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: PREF][value: ID=0543e5768a1b332b:FF=0:TM=1369245712:LM=1369245712:S=bTKJXn0pCLM-_LIQ][domain: .google.co.in][path: /][expiry: Fri May 22 14:01:52 EDT 2015]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.87"
2013-05-22 14:01:52 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: NID][value: 67=ZsOI2IvLXlslDGyCHsc6jEbWnb9H-W5lYFo-qKBdUcTOkGXO9H3WzD78Lja2UJONeO39sQ0wySg7njZg-HJ57TWT29wA_5ktYa3kdz3HS1xQC6zvy_GevMgh-yIDbg__][domain: .google.co.in][path: /][expiry: Thu Nov 21 13:01:52 EST 2013]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.87"
2013-05-22 14:01:52 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.google.co.in/).
2013-05-22 14:01:52 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-22 14:01:52 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-22 14:01:52 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-22 14:01:52 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-22 14:01:52 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.google.co.in/).
2013-05-22 14:01:52 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-22 14:01:52 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.google.co.in/).
2013-05-22 14:01:52 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-22 14:01:52 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-22 14:01:52 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-22 14:01:52 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-22 14:01:52 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-22 14:01:52 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-22 14:01:52 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-22 14:01:52 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-22 14:01:52 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-22 14:01:52 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-22 14:01:52 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (1) is10
2013-05-22 14:01:52 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 1 is dead, restarting it.
2013-05-22 14:01:52 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 1 started.
2013-05-22 14:01:52 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :8
2013-05-22 14:01:52 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html, http://137.254.60.11/doc/refman/5.5/en/replication-options-slave.html, null]
2013-05-22 14:01:53 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: MySQL_S][value: 36ohvm5c0d5gcvu5530366d3gqtaeouu][domain: mysql.com][path: /][expiry: null]". Illegal domain attribute "mysql.com". Domain of origin: "137.254.60.11"
2013-05-22 14:01:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html).
2013-05-22 14:01:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html).
2013-05-22 14:01:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html) is 325
2013-05-22 14:01:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html).
2013-05-22 14:01:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html) is 325
2013-05-22 14:01:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html).
2013-05-22 14:01:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html) is 0
2013-05-22 14:01:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html).
2013-05-22 14:01:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html) is 0
2013-05-22 14:01:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html) is 325
2013-05-22 14:01:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://dev.mysql.com/doc/refman/5.5/en/replication-options-slave.html) is 95
2013-05-22 14:01:56 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-22 14:01:56 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-22 14:01:56 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-22 14:01:56 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-22 14:01:56 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-22 14:01:56 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-22 14:01:56 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-22 14:01:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (2) is11
2013-05-22 14:01:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 2 is dead, restarting it.
2013-05-22 14:01:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 2 started.
2013-05-22 14:01:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :8
2013-05-22 14:01:56 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-22 14:01:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.pythian.com/).
2013-05-22 14:01:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-22 14:01:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-22 14:01:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-22 14:01:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-22 14:01:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-22 14:01:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-22 14:01:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-22 14:01:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-22 14:01:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.pythian.com/).
2013-05-22 14:01:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.pythian.com/) is 0
2013-05-22 14:01:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 132
2013-05-22 14:01:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 90
2013-05-22 14:01:57 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-22 14:01:57 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-22 14:01:57 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-22 14:01:57 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-22 14:01:57 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-22 14:01:57 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-22 14:01:57 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-22 14:01:57 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (3) is12
2013-05-22 14:01:57 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 3 is dead, restarting it.
2013-05-22 14:01:57 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 3 started.
2013-05-22 14:01:57 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :8
2013-05-22 14:01:57 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-22 14:01:58 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (4) is13
2013-05-22 14:01:58 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 4 is dead, restarting it.
2013-05-22 14:01:58 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 4 started.
2013-05-22 14:01:58 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-22 14:01:58 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :8
2013-05-22 14:01:59 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: SESS217349ac89be5468506443781e6d5f6c][value: e869b94b1bd9179de6df15c8221a9381][domain: .palominodb.com][path: /][expiry: Fri Jun 14 17:35:22 EDT 2013]". Illegal domain attribute "palominodb.com". Domain of origin: "173.236.53.234"
2013-05-22 14:02:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://palominodb.com/).
2013-05-22 14:02:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-22 14:02:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 132
2013-05-22 14:02:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-22 14:02:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 132
2013-05-22 14:02:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://palominodb.com/).
2013-05-22 14:02:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-22 14:02:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://palominodb.com/).
2013-05-22 14:02:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-22 14:02:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 132
2013-05-22 14:02:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 94
2013-05-22 14:02:01 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-22 14:02:01 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-22 14:02:01 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-22 14:02:01 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-22 14:02:01 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-22 14:02:01 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-22 14:02:01 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-22 14:02:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (5) is14
2013-05-22 14:02:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 5 is dead, restarting it.
2013-05-22 14:02:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 5 started.
2013-05-22 14:02:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :8
2013-05-22 14:02:01 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-22 14:02:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://effectivemysql.com/).
2013-05-22 14:02:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-22 14:02:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-22 14:02:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-22 14:02:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-22 14:02:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://effectivemysql.com/).
2013-05-22 14:02:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-22 14:02:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://effectivemysql.com/).
2013-05-22 14:02:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-22 14:02:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 102
2013-05-22 14:02:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 87
2013-05-22 14:02:02 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-22 14:02:02 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-22 14:02:02 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-22 14:02:02 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-22 14:02:02 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-22 14:02:02 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-22 14:02:02 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-22 14:02:02 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (6) is15
2013-05-22 14:02:02 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 6 is dead, restarting it.
2013-05-22 14:02:02 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 6 started.
2013-05-22 14:02:02 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :8
2013-05-22 14:02:02 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-22 14:02:04 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.continuent.com/).
2013-05-22 14:02:04 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-22 14:02:04 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 47
2013-05-22 14:02:04 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-22 14:02:04 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 47
2013-05-22 14:02:04 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.continuent.com/).
2013-05-22 14:02:04 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-22 14:02:04 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.continuent.com/).
2013-05-22 14:02:04 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-22 14:02:04 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 47
2013-05-22 14:02:04 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 34
2013-05-22 14:02:04 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-22 14:02:04 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-22 14:02:04 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-22 14:02:04 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-22 14:02:04 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-22 14:02:04 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-22 14:02:04 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-22 14:02:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (7) is16
2013-05-22 14:02:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 7 is dead, restarting it.
2013-05-22 14:02:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 7 started.
2013-05-22 14:02:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :8
2013-05-22 14:02:04 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-22 14:02:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:189 - Thread Count is less than numthreads In Thread :1
2013-05-22 14:02:11 INFO  - [com.wsc.crawler.grabber.Grabber]:222 - Queue is empty in for loop
2013-05-22 14:02:11 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-22 14:02:11 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-22 14:02:11 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-22 14:02:11 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - iOException is thrown while connecting (http://127.0.0.1:8080) MESG=Connection refused
2013-05-22 14:02:11 WARN  - [com.wsc.crawler.grabber.LocalClient]:133 - Frontier port is not up :http://127.0.0.1:8080
2013-05-22 14:02:11 DEBUG - [com.wsc.Crawler.WSCrawler]:82 - Cleaning Crawler resources.
2013-05-22 14:02:11 DEBUG - [com.wsc.Crawler.WSCrawler]:85 - Deleting Lock file
2013-05-22 14:02:11 DEBUG - [com.wsc.Crawler.WSCrawler]:87 - Lock file Deleted
2013-05-24 13:33:20 DEBUG - [com.wsc.crawler.init.Initializer]:42 - Intializing crawler resources...
2013-05-24 13:33:20 INFO  - [com.wsc.crawler.init.Initializer]:85 - previous instance of crawler is cleanly stopped
2013-05-24 13:33:20 DEBUG - [com.wsc.crawler.init.Initializer]:50 - Creating lock file.
2013-05-24 13:33:20 INFO  - [com.wsc.crawler.init.Initializer]:54 - Lockfile crawler.lock created successfully.
2013-05-24 13:33:20 INFO  - [com.wsc.crawler.init.Initializer]:87 - Getting URLS from Source, which is described in crawler-core.xml file
2013-05-24 13:33:20 INFO  - [com.wsc.crawler.init.Initializer]:110 - Getting Seed urls from Frontier Server
2013-05-24 13:33:20 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-24 13:33:20 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-24 13:33:20 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-24 13:33:20 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - iOException is thrown while connecting (http://127.0.0.1:8080) MESG=Connection refused
2013-05-24 13:33:20 WARN  - [com.wsc.crawler.grabber.LocalClient]:133 - Frontier port is not up :http://127.0.0.1:8080
2013-05-24 13:33:20 WARN  - [com.wsc.crawler.init.Initializer]:158 - Looks like Frontier server is not running
2013-05-24 13:33:20 DEBUG - [com.wsc.crawler.init.Initializer]:160 - Trying to connect Frontier Server...
2013-05-24 13:33:20 INFO  - [com.wsc.crawler.init.Initializer]:243 - Sleeping crawling for 30sec.
2013-05-24 13:33:50 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - iOException is thrown while connecting (http://127.0.0.1:8080) MESG=Connection refused
2013-05-24 13:33:50 DEBUG - [com.wsc.crawler.init.Initializer]:160 - Trying to connect Frontier Server...
2013-05-24 13:33:50 INFO  - [com.wsc.crawler.init.Initializer]:243 - Sleeping crawling for 30sec.
2013-05-24 13:34:20 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - iOException is thrown while connecting (http://127.0.0.1:8080) MESG=Connection refused
2013-05-24 13:34:20 DEBUG - [com.wsc.crawler.init.Initializer]:160 - Trying to connect Frontier Server...
2013-05-24 13:34:20 INFO  - [com.wsc.crawler.init.Initializer]:243 - Sleeping crawling for 30sec.
2013-05-24 13:34:50 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - iOException is thrown while connecting (http://127.0.0.1:8080) MESG=Connection refused
2013-05-24 13:34:50 DEBUG - [com.wsc.crawler.init.Initializer]:160 - Trying to connect Frontier Server...
2013-05-24 13:34:50 INFO  - [com.wsc.crawler.init.Initializer]:243 - Sleeping crawling for 30sec.
2013-05-24 13:35:20 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - iOException is thrown while connecting (http://127.0.0.1:8080) MESG=Connection refused
2013-05-24 13:35:20 DEBUG - [com.wsc.crawler.init.Initializer]:160 - Trying to connect Frontier Server...
2013-05-24 13:35:20 INFO  - [com.wsc.crawler.init.Initializer]:243 - Sleeping crawling for 30sec.
2013-05-24 13:35:50 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - iOException is thrown while connecting (http://127.0.0.1:8080) MESG=Connection refused
2013-05-24 13:35:50 ERROR - [com.wsc.crawler.init.Initializer]:170 - Unable to Connect Frontier Server because, whether it is not running or not reachable
2013-05-24 13:35:50 ERROR - [com.wsc.crawler.init.Initializer]:171 - Exiting Crawler
2013-05-24 14:04:00 DEBUG - [com.wsc.crawler.init.Initializer]:42 - Intializing crawler resources...
2013-05-24 14:04:00 INFO  - [com.wsc.crawler.init.Initializer]:93 - previous instance of crawler is forcebly stopped
2013-05-24 14:04:00 INFO  - [com.wsc.crawler.init.Initializer]:94 - Getting URLS from prev_frontier.xml from temp directory.
2013-05-24 14:04:00 DEBUG - [com.wsc.crawler.init.Initializer]:198 - Checking (prev_frontier.xml) existence in (./temp) directory
2013-05-24 14:04:00 WARN  - [com.wsc.crawler.init.Initializer]:207 - prev_frontier.xml is not found in ./temp
2013-05-24 14:04:00 INFO  - [com.wsc.crawler.init.Initializer]:209 - Trying get URLs from Default URL Source, Frontier Server
2013-05-24 14:04:00 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-24 14:04:00 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-24 14:04:00 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-24 14:04:00 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-24 14:04:00 DEBUG - [com.wsc.crawler.init.Initializer]:176 - Frontier Server returned a Frontier of size 9
2013-05-24 14:04:04 INFO  - [com.wsc.crawler.grabber.Grabber]:119 - number of resolved hosts are :7
2013-05-24 14:04:04 INFO  - [com.wsc.crawler.grabber.Grabber]:121 - number of Unresolved hosts are :2
2013-05-24 14:04:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is ::[http://www.google.co.in/, http://74.125.236.151/, null]
2013-05-24 14:04:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is ::[http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-24 14:04:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is ::[http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-24 14:04:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is ::[http://palominodb.com/, http://173.236.53.234/, null]
2013-05-24 14:04:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is ::[http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-24 14:04:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is ::[http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-24 14:04:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is ::[http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-24 14:04:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:165 - Threads length  is : 7
2013-05-24 14:04:04 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.google.co.in/, http://74.125.236.151/, null]
2013-05-24 14:04:04 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-24 14:04:04 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-24 14:04:04 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-24 14:04:04 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-24 14:04:04 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-24 14:04:04 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-24 14:04:04 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: PREF][value: ID=3f1bbed06ad93493:FF=0:TM=1369418644:LM=1369418644:S=8lFFGmLTFLrlbJK1][domain: .google.co.in][path: /][expiry: Sun May 24 14:04:04 EDT 2015]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.151"
2013-05-24 14:04:04 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: NID][value: 67=qIfJKW1hiStiM-9exBdvjfq_7WDr3spfz2haorD0zV0MGG-X9azWcdbADZwqwzmYq-uI8MIPpwBWEKOjxpzEUmeB3aQnwzAhQ2oNRcF4qGTJJ66e1WfhV5JO3bMqBS0o][domain: .google.co.in][path: /][expiry: Sat Nov 23 13:04:04 EST 2013]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.151"
2013-05-24 14:04:04 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.google.co.in/).
2013-05-24 14:04:04 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-24 14:04:04 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-24 14:04:04 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-24 14:04:04 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-24 14:04:04 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.google.co.in/).
2013-05-24 14:04:04 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-24 14:04:04 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.google.co.in/).
2013-05-24 14:04:04 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-24 14:04:04 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-24 14:04:04 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-24 14:04:04 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:04:04 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:04:04 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:04:04 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:04:04 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:04:04 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:04:04 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:04:05 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: SESS217349ac89be5468506443781e6d5f6c][value: ce8584217702765dc905de2bdd7490b2][domain: .palominodb.com][path: /][expiry: Sun Jun 16 17:37:31 EDT 2013]". Illegal domain attribute "palominodb.com". Domain of origin: "173.236.53.234"
2013-05-24 14:04:05 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://effectivemysql.com/).
2013-05-24 14:04:05 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-24 14:04:05 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-24 14:04:05 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-24 14:04:05 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-24 14:04:05 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://effectivemysql.com/).
2013-05-24 14:04:05 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-24 14:04:05 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://effectivemysql.com/).
2013-05-24 14:04:05 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-24 14:04:05 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 102
2013-05-24 14:04:05 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 87
2013-05-24 14:04:05 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:04:05 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:04:05 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:04:05 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:04:06 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:04:06 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:04:06 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-24 14:04:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://www.pythian.com/).
2013-05-24 14:04:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-24 14:04:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-24 14:04:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-24 14:04:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-24 14:04:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-24 14:04:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-24 14:04:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-24 14:04:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-24 14:04:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.pythian.com/).
2013-05-24 14:04:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.pythian.com/) is 0
2013-05-24 14:04:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 132
2013-05-24 14:04:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 90
2013-05-24 14:04:06 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:04:06 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:04:06 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:04:06 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:04:06 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:04:06 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:04:06 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-24 14:04:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://www.continuent.com/).
2013-05-24 14:04:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-24 14:04:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-24 14:04:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-24 14:04:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-24 14:04:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.continuent.com/).
2013-05-24 14:04:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-24 14:04:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.continuent.com/).
2013-05-24 14:04:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-24 14:04:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 45
2013-05-24 14:04:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 33
2013-05-24 14:04:06 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:04:06 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:04:06 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:04:06 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:04:06 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:04:06 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:04:06 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-24 14:04:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://palominodb.com/).
2013-05-24 14:04:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-24 14:04:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 132
2013-05-24 14:04:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-24 14:04:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 132
2013-05-24 14:04:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://palominodb.com/).
2013-05-24 14:04:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-24 14:04:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://palominodb.com/).
2013-05-24 14:04:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-24 14:04:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 132
2013-05-24 14:04:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 94
2013-05-24 14:04:08 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:04:08 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:04:08 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:04:08 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:04:08 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:04:08 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:04:08 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-24 14:04:13 INFO  - [com.wsc.crawler.grabber.Grabber]:237 - Queue is Empty...!
2013-05-24 14:04:13 INFO  - [com.wsc.crawler.grabber.Grabber]:238 - Refilling Queue...!
2013-05-24 14:04:13 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-24 14:04:13 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-24 14:04:13 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-24 14:04:13 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-24 14:04:13 INFO  - [com.wsc.crawler.grabber.Grabber]:67 - Queue size in Grabber is 9
2013-05-24 14:04:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:189 - Thread Count is less than numthreads In Thread :1
2013-05-24 14:04:23 INFO  - [com.wsc.crawler.grabber.Grabber]:222 - Queue is empty in for loop
2013-05-24 14:04:23 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-24 14:04:23 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-24 14:04:23 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-24 14:04:23 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-24 14:04:23 INFO  - [com.wsc.crawler.grabber.Grabber]:67 - Queue size in Grabber is 9
2013-05-24 14:04:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (0) is 9
2013-05-24 14:04:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 0 is dead, restarting it.
2013-05-24 14:04:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 0 started.
2013-05-24 14:04:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:04:23 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.google.co.in/, http://74.125.236.151/, null]
2013-05-24 14:04:23 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: PREF][value: ID=3ad86b3df7def09f:FF=0:TM=1369418663:LM=1369418663:S=y7YYvqgpdbhoIck4][domain: .google.co.in][path: /][expiry: Sun May 24 14:04:23 EDT 2015]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.151"
2013-05-24 14:04:23 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: NID][value: 67=dO4ZRdSeXuoddpmWBJRfM1fYHT23vAN7np79p_-ET3RdYrYSVvKzstHrtnLyMLFQYWBwEUX77MlmCN8VNn9RPr3d1yVHt-0920rZomjnzfsj0GdO86dNm1uUph6Sd-ip][domain: .google.co.in][path: /][expiry: Sat Nov 23 13:04:23 EST 2013]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.151"
2013-05-24 14:04:23 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://www.google.co.in/).
2013-05-24 14:04:23 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-24 14:04:23 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-24 14:04:23 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-24 14:04:23 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-24 14:04:23 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.google.co.in/).
2013-05-24 14:04:23 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-24 14:04:23 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.google.co.in/).
2013-05-24 14:04:23 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-24 14:04:23 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-24 14:04:23 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-24 14:04:23 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:04:23 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:04:23 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:04:23 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:04:23 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:04:23 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:04:23 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-24 14:04:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (1) is 10
2013-05-24 14:04:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 1 is dead, restarting it.
2013-05-24 14:04:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 1 started.
2013-05-24 14:04:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:04:23 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-24 14:04:26 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://www.pythian.com/).
2013-05-24 14:04:26 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-24 14:04:26 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-24 14:04:26 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-24 14:04:26 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-24 14:04:26 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-24 14:04:26 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-24 14:04:26 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-24 14:04:26 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-24 14:04:26 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.pythian.com/).
2013-05-24 14:04:26 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.pythian.com/) is 0
2013-05-24 14:04:26 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 132
2013-05-24 14:04:26 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 90
2013-05-24 14:04:26 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:04:26 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:04:26 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:04:26 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:04:26 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:04:26 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:04:26 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-24 14:04:26 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (2) is 11
2013-05-24 14:04:26 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 2 is dead, restarting it.
2013-05-24 14:04:26 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 2 started.
2013-05-24 14:04:26 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-24 14:04:26 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:04:27 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (3) is 12
2013-05-24 14:04:27 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 3 is dead, restarting it.
2013-05-24 14:04:27 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 3 started.
2013-05-24 14:04:27 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:04:27 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-24 14:04:29 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: SESS217349ac89be5468506443781e6d5f6c][value: ccfb3b43e391e8bce5a9045744b14007][domain: .palominodb.com][path: /][expiry: Sun Jun 16 17:37:55 EDT 2013]". Illegal domain attribute "palominodb.com". Domain of origin: "173.236.53.234"
2013-05-24 14:04:31 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://palominodb.com/).
2013-05-24 14:04:31 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-24 14:04:31 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 132
2013-05-24 14:04:31 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-24 14:04:31 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 132
2013-05-24 14:04:31 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://palominodb.com/).
2013-05-24 14:04:31 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-24 14:04:31 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://palominodb.com/).
2013-05-24 14:04:31 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-24 14:04:31 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 132
2013-05-24 14:04:31 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 94
2013-05-24 14:04:31 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:04:31 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:04:31 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:04:31 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:04:31 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:04:31 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:04:31 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-24 14:04:31 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (4) is 13
2013-05-24 14:04:31 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 4 is dead, restarting it.
2013-05-24 14:04:31 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 4 started.
2013-05-24 14:04:31 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:04:31 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-24 14:04:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://effectivemysql.com/).
2013-05-24 14:04:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-24 14:04:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-24 14:04:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-24 14:04:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-24 14:04:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://effectivemysql.com/).
2013-05-24 14:04:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-24 14:04:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://effectivemysql.com/).
2013-05-24 14:04:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-24 14:04:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 102
2013-05-24 14:04:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 87
2013-05-24 14:04:33 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:04:33 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:04:33 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:04:33 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:04:33 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:04:33 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:04:33 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-24 14:04:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (5) is 14
2013-05-24 14:04:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 5 is dead, restarting it.
2013-05-24 14:04:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 5 started.
2013-05-24 14:04:33 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-24 14:04:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:04:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://www.continuent.com/).
2013-05-24 14:04:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-24 14:04:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-24 14:04:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-24 14:04:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-24 14:04:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.continuent.com/).
2013-05-24 14:04:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-24 14:04:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.continuent.com/).
2013-05-24 14:04:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-24 14:04:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 45
2013-05-24 14:04:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 33
2013-05-24 14:04:35 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:04:35 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:04:35 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:04:35 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:04:35 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:04:35 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:04:35 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-24 14:04:35 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (6) is 15
2013-05-24 14:04:35 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 6 is dead, restarting it.
2013-05-24 14:04:35 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 6 started.
2013-05-24 14:04:35 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:04:35 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-24 14:04:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:189 - Thread Count is less than numthreads In Thread :1
2013-05-24 14:04:43 INFO  - [com.wsc.crawler.grabber.Grabber]:222 - Queue is empty in for loop
2013-05-24 14:04:43 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-24 14:04:43 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-24 14:04:43 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-24 14:04:43 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-24 14:04:43 INFO  - [com.wsc.crawler.grabber.Grabber]:67 - Queue size in Grabber is 9
2013-05-24 14:04:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (0) is 17
2013-05-24 14:04:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 0 is dead, restarting it.
2013-05-24 14:04:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 0 started.
2013-05-24 14:04:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:04:43 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.google.co.in/, http://74.125.236.152/, null]
2013-05-24 14:04:43 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: PREF][value: ID=490fa0a77f0d699f:FF=0:TM=1369418683:LM=1369418683:S=0ABjewF9Pq4UU8bW][domain: .google.co.in][path: /][expiry: Sun May 24 14:04:43 EDT 2015]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.152"
2013-05-24 14:04:43 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: NID][value: 67=v08qQtjlmk2FSqP1gYyl9hL306m5d-kMkK2e1VOdsworfy4npPFM2F6GWZ6wE_YovPSqdRnnEUFATOh7AW7QKh2KSEcKeKv1U_6QgztCv7rAAY-rgmO5DZ4DXDL-cais][domain: .google.co.in][path: /][expiry: Sat Nov 23 13:04:43 EST 2013]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.152"
2013-05-24 14:04:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://www.google.co.in/).
2013-05-24 14:04:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-24 14:04:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-24 14:04:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-24 14:04:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-24 14:04:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.google.co.in/).
2013-05-24 14:04:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-24 14:04:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.google.co.in/).
2013-05-24 14:04:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-24 14:04:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-24 14:04:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-24 14:04:43 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:04:43 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:04:43 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:04:43 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:04:43 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:04:43 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:04:43 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:04:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (1) is18
2013-05-24 14:04:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 1 is dead, restarting it.
2013-05-24 14:04:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 1 started.
2013-05-24 14:04:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:04:43 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-24 14:04:45 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.pythian.com/).
2013-05-24 14:04:45 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-24 14:04:45 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-24 14:04:45 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-24 14:04:45 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-24 14:04:45 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-24 14:04:45 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-24 14:04:45 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-24 14:04:45 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-24 14:04:45 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.pythian.com/).
2013-05-24 14:04:45 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.pythian.com/) is 0
2013-05-24 14:04:45 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 132
2013-05-24 14:04:45 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 90
2013-05-24 14:04:45 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:04:45 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:04:45 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:04:45 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:04:45 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:04:45 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:04:45 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:04:45 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (2) is19
2013-05-24 14:04:45 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 2 is dead, restarting it.
2013-05-24 14:04:45 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 2 started.
2013-05-24 14:04:45 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:04:45 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-24 14:04:50 WARN  - [com.wsc.crawler.grabber.Grabber]:344 - 2 - error: org.apache.http.conn.ConnectionPoolTimeoutException: Timeout waiting for connection from pool
2013-05-24 14:04:50 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (3) is20
2013-05-24 14:04:50 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 3 is dead, restarting it.
2013-05-24 14:04:50 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 3 started.
2013-05-24 14:04:50 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:04:50 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-24 14:04:51 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: SESS217349ac89be5468506443781e6d5f6c][value: caa6e2c1636dcab2db64d01ca249b289][domain: .palominodb.com][path: /][expiry: Sun Jun 16 17:38:17 EDT 2013]". Illegal domain attribute "palominodb.com". Domain of origin: "173.236.53.234"
2013-05-24 14:04:53 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://palominodb.com/).
2013-05-24 14:04:53 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-24 14:04:53 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 132
2013-05-24 14:04:53 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-24 14:04:53 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 132
2013-05-24 14:04:53 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://palominodb.com/).
2013-05-24 14:04:53 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-24 14:04:53 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://palominodb.com/).
2013-05-24 14:04:53 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-24 14:04:53 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 132
2013-05-24 14:04:53 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 94
2013-05-24 14:04:53 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:04:53 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:04:53 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:04:53 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:04:53 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:04:53 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:04:53 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:04:53 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (4) is21
2013-05-24 14:04:53 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 4 is dead, restarting it.
2013-05-24 14:04:53 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 4 started.
2013-05-24 14:04:53 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:04:53 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-24 14:04:54 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://effectivemysql.com/).
2013-05-24 14:04:54 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-24 14:04:54 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-24 14:04:54 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-24 14:04:54 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-24 14:04:54 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://effectivemysql.com/).
2013-05-24 14:04:54 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-24 14:04:54 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://effectivemysql.com/).
2013-05-24 14:04:54 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-24 14:04:54 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 102
2013-05-24 14:04:54 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 87
2013-05-24 14:04:54 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:04:54 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:04:54 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:04:54 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:04:54 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:04:54 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:04:54 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:04:54 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (5) is22
2013-05-24 14:04:54 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 5 is dead, restarting it.
2013-05-24 14:04:54 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 5 started.
2013-05-24 14:04:54 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:04:54 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-24 14:04:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.continuent.com/).
2013-05-24 14:04:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-24 14:04:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-24 14:04:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-24 14:04:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-24 14:04:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.continuent.com/).
2013-05-24 14:04:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-24 14:04:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.continuent.com/).
2013-05-24 14:04:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-24 14:04:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 45
2013-05-24 14:04:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 33
2013-05-24 14:04:56 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:04:56 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:04:56 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:04:56 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:04:56 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:04:56 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:04:56 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:04:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (6) is23
2013-05-24 14:04:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 6 is dead, restarting it.
2013-05-24 14:04:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 6 started.
2013-05-24 14:04:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:04:56 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-24 14:05:01 WARN  - [com.wsc.crawler.grabber.Grabber]:344 - 6 - error: org.apache.http.conn.ConnectionPoolTimeoutException: Timeout waiting for connection from pool
2013-05-24 14:05:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:189 - Thread Count is less than numthreads In Thread :1
2013-05-24 14:05:06 INFO  - [com.wsc.crawler.grabber.Grabber]:222 - Queue is empty in for loop
2013-05-24 14:05:06 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-24 14:05:06 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-24 14:05:06 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-24 14:05:06 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-24 14:05:06 INFO  - [com.wsc.crawler.grabber.Grabber]:67 - Queue size in Grabber is 9
2013-05-24 14:05:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (0) is25
2013-05-24 14:05:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 0 is dead, restarting it.
2013-05-24 14:05:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 0 started.
2013-05-24 14:05:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:05:06 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.google.co.in/, http://74.125.236.159/, null]
2013-05-24 14:05:06 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: PREF][value: ID=63a73e91e4bc21d6:FF=0:TM=1369418706:LM=1369418706:S=eLJ11mxUVHjRc8p7][domain: .google.co.in][path: /][expiry: Sun May 24 14:05:06 EDT 2015]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.159"
2013-05-24 14:05:06 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: NID][value: 67=IIL8Xek5TG2dFiX-GcAy92Px4hYbuJ-p6IIsQGX1-bybG6l9qOOuP36u_ytNG4LlbW6EoSotQLnj8ZkX69drDJlwan5fcqRNKlczwnFZvgd1oKfWYVw2D6d_uK4pshba][domain: .google.co.in][path: /][expiry: Sat Nov 23 13:05:06 EST 2013]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.159"
2013-05-24 14:05:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.google.co.in/).
2013-05-24 14:05:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-24 14:05:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-24 14:05:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-24 14:05:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-24 14:05:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.google.co.in/).
2013-05-24 14:05:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-24 14:05:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.google.co.in/).
2013-05-24 14:05:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-24 14:05:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-24 14:05:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-24 14:05:06 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:05:06 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:05:06 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:05:06 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:05:06 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:05:06 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:05:06 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:05:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (1) is26
2013-05-24 14:05:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 1 is dead, restarting it.
2013-05-24 14:05:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 1 started.
2013-05-24 14:05:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:05:06 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-24 14:05:09 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.pythian.com/).
2013-05-24 14:05:09 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-24 14:05:09 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-24 14:05:09 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-24 14:05:09 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-24 14:05:09 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-24 14:05:09 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-24 14:05:09 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-24 14:05:09 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-24 14:05:09 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.pythian.com/).
2013-05-24 14:05:09 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.pythian.com/) is 0
2013-05-24 14:05:09 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 132
2013-05-24 14:05:09 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 90
2013-05-24 14:05:09 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:05:09 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:05:09 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:05:09 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:05:09 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:05:09 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:05:09 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:05:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (2) is27
2013-05-24 14:05:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 2 is dead, restarting it.
2013-05-24 14:05:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 2 started.
2013-05-24 14:05:09 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-24 14:05:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:05:14 WARN  - [com.wsc.crawler.grabber.Grabber]:344 - 2 - error: org.apache.http.conn.ConnectionPoolTimeoutException: Timeout waiting for connection from pool
2013-05-24 14:05:14 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (3) is28
2013-05-24 14:05:14 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 3 is dead, restarting it.
2013-05-24 14:05:14 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 3 started.
2013-05-24 14:05:14 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:05:14 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-24 14:05:15 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: SESS217349ac89be5468506443781e6d5f6c][value: 4960a1de3e52f5cbc888a0381dd96d84][domain: .palominodb.com][path: /][expiry: Sun Jun 16 17:38:41 EDT 2013]". Illegal domain attribute "palominodb.com". Domain of origin: "173.236.53.234"
2013-05-24 14:05:17 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://palominodb.com/).
2013-05-24 14:05:17 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-24 14:05:17 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 132
2013-05-24 14:05:17 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-24 14:05:17 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 132
2013-05-24 14:05:17 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://palominodb.com/).
2013-05-24 14:05:17 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-24 14:05:17 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://palominodb.com/).
2013-05-24 14:05:17 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-24 14:05:17 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 132
2013-05-24 14:05:17 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 94
2013-05-24 14:05:17 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:05:17 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:05:17 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:05:17 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:05:17 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:05:17 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:05:17 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:05:17 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (4) is29
2013-05-24 14:05:17 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 4 is dead, restarting it.
2013-05-24 14:05:17 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 4 started.
2013-05-24 14:05:17 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:05:17 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-24 14:05:18 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://effectivemysql.com/).
2013-05-24 14:05:18 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-24 14:05:18 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-24 14:05:18 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-24 14:05:18 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-24 14:05:18 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://effectivemysql.com/).
2013-05-24 14:05:18 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-24 14:05:18 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://effectivemysql.com/).
2013-05-24 14:05:18 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-24 14:05:18 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 102
2013-05-24 14:05:18 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 87
2013-05-24 14:05:18 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:05:18 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:05:18 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:05:18 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:05:18 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:05:18 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:05:18 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:05:18 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (5) is30
2013-05-24 14:05:18 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 5 is dead, restarting it.
2013-05-24 14:05:18 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 5 started.
2013-05-24 14:05:18 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:05:18 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-24 14:05:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.continuent.com/).
2013-05-24 14:05:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-24 14:05:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-24 14:05:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-24 14:05:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-24 14:05:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.continuent.com/).
2013-05-24 14:05:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-24 14:05:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.continuent.com/).
2013-05-24 14:05:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-24 14:05:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 45
2013-05-24 14:05:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 33
2013-05-24 14:05:21 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:05:21 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:05:21 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:05:21 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:05:21 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:05:21 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:05:21 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:05:21 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (6) is31
2013-05-24 14:05:21 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 6 is dead, restarting it.
2013-05-24 14:05:21 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 6 started.
2013-05-24 14:05:21 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:05:21 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-24 14:05:26 WARN  - [com.wsc.crawler.grabber.Grabber]:344 - 6 - error: org.apache.http.conn.ConnectionPoolTimeoutException: Timeout waiting for connection from pool
2013-05-24 14:05:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:189 - Thread Count is less than numthreads In Thread :1
2013-05-24 14:05:32 INFO  - [com.wsc.crawler.grabber.Grabber]:222 - Queue is empty in for loop
2013-05-24 14:05:32 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-24 14:05:32 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-24 14:05:32 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-24 14:05:32 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-24 14:05:32 INFO  - [com.wsc.crawler.grabber.Grabber]:67 - Queue size in Grabber is 9
2013-05-24 14:05:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (0) is33
2013-05-24 14:05:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 0 is dead, restarting it.
2013-05-24 14:05:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 0 started.
2013-05-24 14:05:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:05:32 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.google.co.in/, http://74.125.236.151/, null]
2013-05-24 14:05:32 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: PREF][value: ID=0e31aec2091a31ff:FF=0:TM=1369418732:LM=1369418732:S=GHrNhRYqyAjwdJXJ][domain: .google.co.in][path: /][expiry: Sun May 24 14:05:32 EDT 2015]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.151"
2013-05-24 14:05:32 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: NID][value: 67=JeR4v4LyKVxUrbbNxgwlSFQ7aeHdgo_NrwdXBu_0d6R-FaUroPUzwrWDCi1RlLdo26Ii6Yef0GW1-Jz6u-pNY42fajrC0Ealk2VoNs7-UL928nJFST9_802Z6rawCxjJ][domain: .google.co.in][path: /][expiry: Sat Nov 23 13:05:32 EST 2013]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.151"
2013-05-24 14:05:32 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.google.co.in/).
2013-05-24 14:05:32 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-24 14:05:32 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-24 14:05:32 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-24 14:05:32 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-24 14:05:32 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.google.co.in/).
2013-05-24 14:05:32 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-24 14:05:32 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.google.co.in/).
2013-05-24 14:05:32 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-24 14:05:32 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-24 14:05:32 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-24 14:05:32 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:05:32 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:05:32 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:05:32 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:05:32 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:05:32 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:05:32 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:05:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (1) is34
2013-05-24 14:05:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 1 is dead, restarting it.
2013-05-24 14:05:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 1 started.
2013-05-24 14:05:32 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-24 14:05:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:05:34 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.pythian.com/).
2013-05-24 14:05:34 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-24 14:05:34 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-24 14:05:34 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-24 14:05:34 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-24 14:05:34 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-24 14:05:34 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-24 14:05:34 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-24 14:05:34 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-24 14:05:34 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.pythian.com/).
2013-05-24 14:05:34 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.pythian.com/) is 0
2013-05-24 14:05:34 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 132
2013-05-24 14:05:34 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 90
2013-05-24 14:05:34 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:05:34 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:05:34 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:05:34 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:05:34 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:05:34 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:05:34 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:05:34 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (2) is35
2013-05-24 14:05:34 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 2 is dead, restarting it.
2013-05-24 14:05:34 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 2 started.
2013-05-24 14:05:34 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:05:34 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-24 14:05:39 WARN  - [com.wsc.crawler.grabber.Grabber]:344 - 2 - error: org.apache.http.conn.ConnectionPoolTimeoutException: Timeout waiting for connection from pool
2013-05-24 14:05:39 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (3) is36
2013-05-24 14:05:39 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 3 is dead, restarting it.
2013-05-24 14:05:39 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 3 started.
2013-05-24 14:05:39 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:05:39 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-24 14:05:40 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: SESS217349ac89be5468506443781e6d5f6c][value: 38f376f8c75d823efc9a6cf88d0d1b8b][domain: .palominodb.com][path: /][expiry: Sun Jun 16 17:39:06 EDT 2013]". Illegal domain attribute "palominodb.com". Domain of origin: "173.236.53.234"
2013-05-24 14:05:42 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://palominodb.com/).
2013-05-24 14:05:42 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-24 14:05:42 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 132
2013-05-24 14:05:42 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-24 14:05:42 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 132
2013-05-24 14:05:42 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://palominodb.com/).
2013-05-24 14:05:42 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-24 14:05:42 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://palominodb.com/).
2013-05-24 14:05:42 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-24 14:05:42 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 132
2013-05-24 14:05:42 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 94
2013-05-24 14:05:42 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:05:42 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:05:42 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:05:42 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:05:42 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:05:42 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:05:42 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:05:42 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (4) is37
2013-05-24 14:05:42 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 4 is dead, restarting it.
2013-05-24 14:05:42 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 4 started.
2013-05-24 14:05:42 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:05:42 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-24 14:05:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://effectivemysql.com/).
2013-05-24 14:05:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-24 14:05:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-24 14:05:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-24 14:05:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-24 14:05:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://effectivemysql.com/).
2013-05-24 14:05:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-24 14:05:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://effectivemysql.com/).
2013-05-24 14:05:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-24 14:05:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 102
2013-05-24 14:05:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 87
2013-05-24 14:05:43 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:05:43 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:05:43 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:05:43 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:05:43 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:05:43 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:05:43 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:05:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (5) is38
2013-05-24 14:05:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 5 is dead, restarting it.
2013-05-24 14:05:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 5 started.
2013-05-24 14:05:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:05:43 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-24 14:05:46 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.continuent.com/).
2013-05-24 14:05:46 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-24 14:05:46 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-24 14:05:46 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-24 14:05:46 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-24 14:05:46 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.continuent.com/).
2013-05-24 14:05:46 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-24 14:05:46 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.continuent.com/).
2013-05-24 14:05:46 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-24 14:05:46 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 45
2013-05-24 14:05:46 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 33
2013-05-24 14:05:46 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:05:46 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:05:46 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:05:46 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:05:46 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:05:46 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:05:46 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:05:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (6) is39
2013-05-24 14:05:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 6 is dead, restarting it.
2013-05-24 14:05:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 6 started.
2013-05-24 14:05:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:05:46 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-24 14:05:51 WARN  - [com.wsc.crawler.grabber.Grabber]:344 - 6 - error: org.apache.http.conn.ConnectionPoolTimeoutException: Timeout waiting for connection from pool
2013-05-24 14:05:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:189 - Thread Count is less than numthreads In Thread :1
2013-05-24 14:05:56 INFO  - [com.wsc.crawler.grabber.Grabber]:222 - Queue is empty in for loop
2013-05-24 14:05:56 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-24 14:05:56 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-24 14:05:56 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-24 14:05:56 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-24 14:05:56 INFO  - [com.wsc.crawler.grabber.Grabber]:67 - Queue size in Grabber is 9
2013-05-24 14:05:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (0) is41
2013-05-24 14:05:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 0 is dead, restarting it.
2013-05-24 14:05:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 0 started.
2013-05-24 14:05:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:05:56 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.google.co.in/, http://74.125.236.152/, null]
2013-05-24 14:05:56 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: PREF][value: ID=c0a59035c1619d45:FF=0:TM=1369418756:LM=1369418756:S=to3z6uKQta__lH-p][domain: .google.co.in][path: /][expiry: Sun May 24 14:05:56 EDT 2015]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.152"
2013-05-24 14:05:56 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: NID][value: 67=FUDJhicvYADAMBlKiymkhT3njqtF7VrOdct3WSOrgK2eG1DIf7r2Ayz7SofmJgRHYJzHl8UN7Rs2BVNgF5POMsrQql6-O9_37rqZbMuXJe1qjXCIYS5FJGKDVFsJZ_eU][domain: .google.co.in][path: /][expiry: Sat Nov 23 13:05:56 EST 2013]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.152"
2013-05-24 14:05:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.google.co.in/).
2013-05-24 14:05:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-24 14:05:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-24 14:05:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-24 14:05:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-24 14:05:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.google.co.in/).
2013-05-24 14:05:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-24 14:05:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.google.co.in/).
2013-05-24 14:05:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-24 14:05:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-24 14:05:56 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-24 14:05:56 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:05:56 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:05:56 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:05:56 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:05:56 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:05:56 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:05:56 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:05:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (1) is42
2013-05-24 14:05:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 1 is dead, restarting it.
2013-05-24 14:05:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 1 started.
2013-05-24 14:05:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:05:56 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-24 14:05:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.pythian.com/).
2013-05-24 14:05:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-24 14:05:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-24 14:05:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-24 14:05:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-24 14:05:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-24 14:05:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-24 14:05:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-24 14:05:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-24 14:05:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.pythian.com/).
2013-05-24 14:05:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.pythian.com/) is 0
2013-05-24 14:05:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 132
2013-05-24 14:05:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 90
2013-05-24 14:05:58 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:05:58 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:05:58 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:05:58 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:05:58 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:05:58 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:05:58 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:05:58 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (2) is43
2013-05-24 14:05:58 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 2 is dead, restarting it.
2013-05-24 14:05:58 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 2 started.
2013-05-24 14:05:58 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:05:58 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-24 14:06:03 WARN  - [com.wsc.crawler.grabber.Grabber]:344 - 2 - error: org.apache.http.conn.ConnectionPoolTimeoutException: Timeout waiting for connection from pool
2013-05-24 14:06:03 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (3) is44
2013-05-24 14:06:03 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 3 is dead, restarting it.
2013-05-24 14:06:03 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 3 started.
2013-05-24 14:06:03 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:06:03 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-24 14:06:05 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: SESS217349ac89be5468506443781e6d5f6c][value: ce2cab2ba03ec3137378857a78f64f80][domain: .palominodb.com][path: /][expiry: Sun Jun 16 17:39:30 EDT 2013]". Illegal domain attribute "palominodb.com". Domain of origin: "173.236.53.234"
2013-05-24 14:06:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://palominodb.com/).
2013-05-24 14:06:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-24 14:06:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 132
2013-05-24 14:06:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-24 14:06:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 132
2013-05-24 14:06:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://palominodb.com/).
2013-05-24 14:06:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-24 14:06:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://palominodb.com/).
2013-05-24 14:06:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-24 14:06:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 132
2013-05-24 14:06:06 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 94
2013-05-24 14:06:06 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:06:06 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:06:06 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:06:06 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:06:06 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:06:06 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:06:06 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:06:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (4) is45
2013-05-24 14:06:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 4 is dead, restarting it.
2013-05-24 14:06:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 4 started.
2013-05-24 14:06:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:06:06 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-24 14:06:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://effectivemysql.com/).
2013-05-24 14:06:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-24 14:06:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-24 14:06:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-24 14:06:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-24 14:06:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://effectivemysql.com/).
2013-05-24 14:06:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-24 14:06:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://effectivemysql.com/).
2013-05-24 14:06:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-24 14:06:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 102
2013-05-24 14:06:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 87
2013-05-24 14:06:08 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:06:08 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:06:08 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:06:08 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:06:08 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:06:08 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:06:08 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:06:08 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (5) is46
2013-05-24 14:06:08 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 5 is dead, restarting it.
2013-05-24 14:06:08 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 5 started.
2013-05-24 14:06:08 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-24 14:06:08 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:06:10 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.continuent.com/).
2013-05-24 14:06:10 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-24 14:06:10 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-24 14:06:10 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-24 14:06:10 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-24 14:06:10 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.continuent.com/).
2013-05-24 14:06:10 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-24 14:06:10 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.continuent.com/).
2013-05-24 14:06:10 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-24 14:06:10 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 45
2013-05-24 14:06:10 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 33
2013-05-24 14:06:10 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:06:10 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:06:10 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:06:10 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:06:10 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:06:10 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:06:10 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:06:10 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (6) is47
2013-05-24 14:06:10 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 6 is dead, restarting it.
2013-05-24 14:06:10 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 6 started.
2013-05-24 14:06:10 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-24 14:06:10 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:06:15 WARN  - [com.wsc.crawler.grabber.Grabber]:344 - 6 - error: org.apache.http.conn.ConnectionPoolTimeoutException: Timeout waiting for connection from pool
2013-05-24 14:06:20 DEBUG - [com.wsc.crawler.grabber.Grabber]:189 - Thread Count is less than numthreads In Thread :1
2013-05-24 14:06:20 INFO  - [com.wsc.crawler.grabber.Grabber]:222 - Queue is empty in for loop
2013-05-24 14:06:20 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-24 14:06:20 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-24 14:06:20 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-24 14:06:20 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-24 14:06:20 INFO  - [com.wsc.crawler.grabber.Grabber]:67 - Queue size in Grabber is 9
2013-05-24 14:06:20 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (0) is49
2013-05-24 14:06:20 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 0 is dead, restarting it.
2013-05-24 14:06:20 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 0 started.
2013-05-24 14:06:20 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:06:20 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.google.co.in/, http://74.125.236.159/, null]
2013-05-24 14:06:20 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: PREF][value: ID=4224633fdd236cc5:FF=0:TM=1369418780:LM=1369418780:S=WFIjnQlgo_-PnRK5][domain: .google.co.in][path: /][expiry: Sun May 24 14:06:20 EDT 2015]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.159"
2013-05-24 14:06:20 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: NID][value: 67=Vcz-f2UYsu7KHz5g5HVdy5D3uPXCu4-M3XC3pHXfBNlCTkBLsbCW3QM6bGeiWnWcWLrV7auyVH7NTpNntWfeN_wGG_5_RwhH86mBRSyMZLQMnJfYHYH-8fqtIiIvKYTO][domain: .google.co.in][path: /][expiry: Sat Nov 23 13:06:20 EST 2013]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.159"
2013-05-24 14:06:20 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.google.co.in/).
2013-05-24 14:06:20 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-24 14:06:20 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-24 14:06:20 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-24 14:06:20 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-24 14:06:20 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.google.co.in/).
2013-05-24 14:06:20 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-24 14:06:20 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.google.co.in/).
2013-05-24 14:06:20 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-24 14:06:20 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-24 14:06:20 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-24 14:06:20 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:06:20 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:06:20 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:06:20 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:06:20 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:06:20 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:06:20 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:06:20 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (1) is50
2013-05-24 14:06:20 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 1 is dead, restarting it.
2013-05-24 14:06:20 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 1 started.
2013-05-24 14:06:20 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:06:20 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-24 14:06:22 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.pythian.com/).
2013-05-24 14:06:22 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-24 14:06:22 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-24 14:06:22 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-24 14:06:22 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-24 14:06:22 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-24 14:06:22 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-24 14:06:22 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-24 14:06:22 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-24 14:06:22 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.pythian.com/).
2013-05-24 14:06:22 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.pythian.com/) is 0
2013-05-24 14:06:22 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 132
2013-05-24 14:06:22 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 90
2013-05-24 14:06:22 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:06:22 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:06:22 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:06:22 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:06:22 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:06:22 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:06:22 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:06:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (2) is51
2013-05-24 14:06:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 2 is dead, restarting it.
2013-05-24 14:06:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 2 started.
2013-05-24 14:06:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:06:22 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-24 14:06:27 WARN  - [com.wsc.crawler.grabber.Grabber]:344 - 2 - error: org.apache.http.conn.ConnectionPoolTimeoutException: Timeout waiting for connection from pool
2013-05-24 14:06:27 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (3) is52
2013-05-24 14:06:27 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 3 is dead, restarting it.
2013-05-24 14:06:27 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 3 started.
2013-05-24 14:06:27 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-24 14:06:27 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:06:28 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: SESS217349ac89be5468506443781e6d5f6c][value: ab13e2d5fe4596e956e6988295f67413][domain: .palominodb.com][path: /][expiry: Sun Jun 16 17:39:54 EDT 2013]". Illegal domain attribute "palominodb.com". Domain of origin: "173.236.53.234"
2013-05-24 14:06:30 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://palominodb.com/).
2013-05-24 14:06:30 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-24 14:06:30 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 132
2013-05-24 14:06:30 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-24 14:06:30 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 132
2013-05-24 14:06:30 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://palominodb.com/).
2013-05-24 14:06:30 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-24 14:06:30 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://palominodb.com/).
2013-05-24 14:06:30 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-24 14:06:30 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 132
2013-05-24 14:06:30 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 94
2013-05-24 14:06:30 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:06:30 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:06:30 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:06:30 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:06:30 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:06:30 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:06:30 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:06:30 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (4) is53
2013-05-24 14:06:30 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 4 is dead, restarting it.
2013-05-24 14:06:30 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 4 started.
2013-05-24 14:06:30 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-24 14:06:30 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:06:31 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://effectivemysql.com/).
2013-05-24 14:06:31 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-24 14:06:31 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-24 14:06:31 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-24 14:06:31 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-24 14:06:31 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://effectivemysql.com/).
2013-05-24 14:06:31 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-24 14:06:31 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://effectivemysql.com/).
2013-05-24 14:06:31 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-24 14:06:31 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 102
2013-05-24 14:06:31 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 87
2013-05-24 14:06:31 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:06:31 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:06:31 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:06:31 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:06:31 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:06:31 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:06:31 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:06:31 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (5) is54
2013-05-24 14:06:31 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 5 is dead, restarting it.
2013-05-24 14:06:31 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 5 started.
2013-05-24 14:06:31 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:06:31 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-24 14:06:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.continuent.com/).
2013-05-24 14:06:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-24 14:06:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-24 14:06:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-24 14:06:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-24 14:06:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.continuent.com/).
2013-05-24 14:06:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-24 14:06:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.continuent.com/).
2013-05-24 14:06:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-24 14:06:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 45
2013-05-24 14:06:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 33
2013-05-24 14:06:33 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:06:33 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:06:33 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:06:33 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:06:33 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:06:33 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:06:33 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:06:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (6) is55
2013-05-24 14:06:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 6 is dead, restarting it.
2013-05-24 14:06:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 6 started.
2013-05-24 14:06:33 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-24 14:06:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:06:38 WARN  - [com.wsc.crawler.grabber.Grabber]:344 - 6 - error: org.apache.http.conn.ConnectionPoolTimeoutException: Timeout waiting for connection from pool
2013-05-24 14:06:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:189 - Thread Count is less than numthreads In Thread :1
2013-05-24 14:06:43 INFO  - [com.wsc.crawler.grabber.Grabber]:222 - Queue is empty in for loop
2013-05-24 14:06:43 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-24 14:06:43 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-24 14:06:43 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-24 14:06:43 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-24 14:06:43 INFO  - [com.wsc.crawler.grabber.Grabber]:67 - Queue size in Grabber is 9
2013-05-24 14:06:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (0) is57
2013-05-24 14:06:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 0 is dead, restarting it.
2013-05-24 14:06:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 0 started.
2013-05-24 14:06:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:06:43 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.google.co.in/, http://74.125.236.151/, null]
2013-05-24 14:06:43 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: PREF][value: ID=98a38facec9b3955:FF=0:TM=1369418803:LM=1369418803:S=L-dGt5eIb9smZRWJ][domain: .google.co.in][path: /][expiry: Sun May 24 14:06:43 EDT 2015]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.151"
2013-05-24 14:06:43 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: NID][value: 67=M_SnOdWBMDTe4EWutM7nJ8EGb0dmgiRPoE3OWZhRAItcD7V9cZUFR5OzsjseU-Tr0ZoUiWToeRe7-VGOclusOOt2WggIJEsliQD32Dc5EC0U_dogQoyVx3TavISNc89q][domain: .google.co.in][path: /][expiry: Sat Nov 23 13:06:43 EST 2013]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.151"
2013-05-24 14:06:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.google.co.in/).
2013-05-24 14:06:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-24 14:06:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-24 14:06:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-24 14:06:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-24 14:06:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.google.co.in/).
2013-05-24 14:06:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-24 14:06:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.google.co.in/).
2013-05-24 14:06:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-24 14:06:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-24 14:06:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-24 14:06:43 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:06:44 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:06:44 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:06:44 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:06:44 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:06:44 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:06:44 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:06:44 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (1) is58
2013-05-24 14:06:44 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 1 is dead, restarting it.
2013-05-24 14:06:44 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 1 started.
2013-05-24 14:06:44 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:06:44 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-24 14:06:46 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.pythian.com/).
2013-05-24 14:06:46 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-24 14:06:46 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-24 14:06:46 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-24 14:06:46 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-24 14:06:46 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-24 14:06:46 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-24 14:06:46 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-24 14:06:46 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-24 14:06:46 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.pythian.com/).
2013-05-24 14:06:46 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.pythian.com/) is 0
2013-05-24 14:06:46 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 132
2013-05-24 14:06:46 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 90
2013-05-24 14:06:46 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:06:46 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:06:46 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:06:46 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:06:46 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:06:46 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:06:46 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
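The 404 body above (`File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found`) shows why every POST to the frontier fails: the server at 127.0.0.1:8080 is concatenating its document root with the raw request target, query string included, and then looking for a file by that name. A sketch of the implied bug and the usual fix, splitting the query string off before resolving the path (the handler names here are illustrative, not from the source):

```python
from urllib.parse import urlsplit

DOC_ROOT = "/home/madhu/workspace/WSCrawler/temp"  # from the 404 body

def resolve_naive(request_target):
    # Behaviour implied by the log: "?operation=urlsposted" is kept,
    # so it becomes part of the filename and never matches a file.
    return DOC_ROOT + request_target

def resolve_fixed(request_target):
    # Separate path from query, then dispatch on the operation
    # instead of treating the whole target as a filename.
    parts = urlsplit(request_target)
    return DOC_ROOT + parts.path, parts.query

print(resolve_naive("/?operation=urlsposted"))
# -> /home/madhu/workspace/WSCrawler/temp/?operation=urlsposted
print(resolve_fixed("/?operation=urlsposted"))
# -> ('/home/madhu/workspace/WSCrawler/temp/', 'operation=urlsposted')
```

Note the crawler logs the 404 at DEBUG and keeps going, so the parsed URLs are silently dropped on every cycle.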
2013-05-24 14:06:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (2) is59
2013-05-24 14:06:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 2 is dead, restarting it.
2013-05-24 14:06:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 2 started.
2013-05-24 14:06:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:06:46 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-24 14:06:51 WARN  - [com.wsc.crawler.grabber.Grabber]:344 - 2 - error: org.apache.http.conn.ConnectionPoolTimeoutException: Timeout waiting for connection from pool
2013-05-24 14:06:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (3) is60
2013-05-24 14:06:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 3 is dead, restarting it.
2013-05-24 14:06:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 3 started.
2013-05-24 14:06:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:06:51 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-24 14:06:53 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: SESS217349ac89be5468506443781e6d5f6c][value: 6c91c46e72a4973d670afdb7ba9bbebf][domain: .palominodb.com][path: /][expiry: Sun Jun 16 17:40:19 EDT 2013]". Illegal domain attribute "palominodb.com". Domain of origin: "173.236.53.234"
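The cookie rejection above is a direct consequence of fetching by resolved IP address: the request went to `http://173.236.53.234/`, so the origin host is the IP, and a `Set-Cookie` with `Domain=.palominodb.com` cannot domain-match it. HttpClient discards such cookies as a spoofing defence. A simplified sketch of the domain-match rule involved (not HttpClient's actual implementation):

```python
def domain_match(origin_host, cookie_domain):
    """Simplified cookie domain-match: an IP-address origin only
    matches a cookie domain that is exactly that IP."""
    cookie_domain = cookie_domain.lstrip(".").lower()
    origin_host = origin_host.lower()
    if origin_host == cookie_domain:
        return True
    # A dotted-quad origin never suffix-matches a registrable domain.
    if all(part.isdigit() for part in origin_host.split(".")):
        return False
    return origin_host.endswith("." + cookie_domain)

# Fetched http://173.236.53.234/, server set Domain=.palominodb.com:
print(domain_match("173.236.53.234", ".palominodb.com"))   # False -> rejected
print(domain_match("www.palominodb.com", ".palominodb.com"))  # True
```

The same pattern recurs below for google.co.in; sending the original hostname (or at least a matching `Host` header) would make the cookies acceptable.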
2013-05-24 14:06:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://palominodb.com/).
2013-05-24 14:06:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-24 14:06:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 132
2013-05-24 14:06:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-24 14:06:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 132
2013-05-24 14:06:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://palominodb.com/).
2013-05-24 14:06:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-24 14:06:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://palominodb.com/).
2013-05-24 14:06:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-24 14:06:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 132
2013-05-24 14:06:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 94
2013-05-24 14:06:55 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:06:55 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:06:55 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:06:55 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:06:55 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:06:55 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:06:55 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:06:55 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (4) is61
2013-05-24 14:06:55 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 4 is dead, restarting it.
2013-05-24 14:06:55 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 4 started.
2013-05-24 14:06:55 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:06:55 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-24 14:06:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://effectivemysql.com/).
2013-05-24 14:06:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-24 14:06:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-24 14:06:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-24 14:06:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-24 14:06:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://effectivemysql.com/).
2013-05-24 14:06:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-24 14:06:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://effectivemysql.com/).
2013-05-24 14:06:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-24 14:06:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 102
2013-05-24 14:06:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 87
2013-05-24 14:06:57 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:06:57 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:06:57 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:06:57 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:06:57 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:06:57 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:06:57 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:06:57 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (5) is62
2013-05-24 14:06:57 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 5 is dead, restarting it.
2013-05-24 14:06:57 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 5 started.
2013-05-24 14:06:57 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:06:57 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-24 14:06:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.continuent.com/).
2013-05-24 14:06:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-24 14:06:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-24 14:06:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-24 14:06:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-24 14:06:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.continuent.com/).
2013-05-24 14:06:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-24 14:06:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.continuent.com/).
2013-05-24 14:06:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-24 14:06:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 45
2013-05-24 14:06:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 33
2013-05-24 14:06:59 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:06:59 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:06:59 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:06:59 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:06:59 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:06:59 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:06:59 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:06:59 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (6) is63
2013-05-24 14:06:59 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 6 is dead, restarting it.
2013-05-24 14:06:59 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 6 started.
2013-05-24 14:06:59 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:06:59 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-24 14:07:04 WARN  - [com.wsc.crawler.grabber.Grabber]:344 - 6 - error: org.apache.http.conn.ConnectionPoolTimeoutException: Timeout waiting for connection from pool
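`ConnectionPoolTimeoutException` above fires when every pooled connection is already leased and none is returned before the connection-request timeout (about 5 s, judging by the gap between the fetch attempt and the WARN). In Apache HttpClient this usually means responses are not being consumed or released; the threads that keep dying and restarting may be leaking their leases. A toy model of the exhaustion, under that assumption:

```python
import threading

class TinyPool:
    """Toy model of a bounded connection pool: leasing blocks until a
    slot is free or the request timeout elapses."""
    def __init__(self, size, timeout_s):
        self._slots = threading.Semaphore(size)
        self._timeout = timeout_s

    def lease(self):
        if not self._slots.acquire(timeout=self._timeout):
            raise TimeoutError("Timeout waiting for connection from pool")

    def release(self):
        self._slots.release()

pool = TinyPool(size=2, timeout_s=0.1)
pool.lease()
pool.lease()      # pool exhausted; nothing ever calls release()
try:
    pool.lease()  # mirrors the wait before the WARN above
except TimeoutError as e:
    print(e)      # Timeout waiting for connection from pool
```

In the real client the remedy is to consume the response entity (or abort the request) on every path, including error paths, so the connection returns to the pool.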
2013-05-24 14:07:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:189 - Thread Count is less than numthreads In Thread :1
2013-05-24 14:07:09 INFO  - [com.wsc.crawler.grabber.Grabber]:222 - Queue is empty in for loop
2013-05-24 14:07:09 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-24 14:07:09 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-24 14:07:09 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-24 14:07:09 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-24 14:07:09 INFO  - [com.wsc.crawler.grabber.Grabber]:67 - Queue size in Grabber is 9
2013-05-24 14:07:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (0) is65
2013-05-24 14:07:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 0 is dead, restarting it.
2013-05-24 14:07:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 0 started.
2013-05-24 14:07:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:07:09 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.google.co.in/, http://74.125.236.152/, null]
2013-05-24 14:07:09 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: PREF][value: ID=2e012dc5d100a595:FF=0:TM=1369418829:LM=1369418829:S=EI8np-bRrTbFpZkM][domain: .google.co.in][path: /][expiry: Sun May 24 14:07:09 EDT 2015]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.152"
2013-05-24 14:07:09 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: NID][value: 67=Yj6BBtzYMdLfxTRFD-9PCSGn5cOMyoawmSEr2jExZqkWdA4xLkysgw2rmc1pD4F-pzbPO56ERUDH0jn1cJ-4m-TzftcvlCJjDd1FATOD2qk1LRjTtPMz_rS44KuI5LOo][domain: .google.co.in][path: /][expiry: Sat Nov 23 13:07:09 EST 2013]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.152"
2013-05-24 14:07:09 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.google.co.in/).
2013-05-24 14:07:09 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-24 14:07:09 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-24 14:07:09 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-24 14:07:09 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-24 14:07:09 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.google.co.in/).
2013-05-24 14:07:09 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-24 14:07:09 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.google.co.in/).
2013-05-24 14:07:09 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-24 14:07:09 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-24 14:07:09 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-24 14:07:09 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:07:09 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:07:09 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:07:09 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:07:09 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:07:09 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:07:09 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:07:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (1) is66
2013-05-24 14:07:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 1 is dead, restarting it.
2013-05-24 14:07:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 1 started.
2013-05-24 14:07:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:07:09 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-24 14:07:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.pythian.com/).
2013-05-24 14:07:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-24 14:07:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-24 14:07:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-24 14:07:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-24 14:07:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-24 14:07:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-24 14:07:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-24 14:07:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-24 14:07:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.pythian.com/).
2013-05-24 14:07:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.pythian.com/) is 0
2013-05-24 14:07:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 132
2013-05-24 14:07:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 90
2013-05-24 14:07:12 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:07:12 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:07:12 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:07:12 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:07:12 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:07:12 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:07:12 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:07:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (2) is67
2013-05-24 14:07:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 2 is dead, restarting it.
2013-05-24 14:07:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 2 started.
2013-05-24 14:07:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:07:12 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-24 14:07:17 WARN  - [com.wsc.crawler.grabber.Grabber]:344 - 2 - error: org.apache.http.conn.ConnectionPoolTimeoutException: Timeout waiting for connection from pool
2013-05-24 14:07:17 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (3) is68
2013-05-24 14:07:17 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 3 is dead, restarting it.
2013-05-24 14:07:17 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 3 started.
2013-05-24 14:07:17 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:07:17 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-24 14:07:17 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: SESS217349ac89be5468506443781e6d5f6c][value: d36c0b58c271687e1fe0d0dcc8fcc495][domain: .palominodb.com][path: /][expiry: Sun Jun 16 17:40:44 EDT 2013]". Illegal domain attribute "palominodb.com". Domain of origin: "173.236.53.234"
2013-05-24 14:07:19 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://palominodb.com/).
2013-05-24 14:07:19 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-24 14:07:19 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 132
2013-05-24 14:07:19 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-24 14:07:19 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 132
2013-05-24 14:07:19 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://palominodb.com/).
2013-05-24 14:07:19 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-24 14:07:19 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://palominodb.com/).
2013-05-24 14:07:19 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-24 14:07:19 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 132
2013-05-24 14:07:19 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 94
2013-05-24 14:07:19 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:07:19 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:07:19 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:07:19 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:07:19 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:07:19 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:07:19 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:07:19 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (4) is69
2013-05-24 14:07:19 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 4 is dead, restarting it.
2013-05-24 14:07:19 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 4 started.
2013-05-24 14:07:19 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:07:19 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-24 14:07:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://effectivemysql.com/).
2013-05-24 14:07:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-24 14:07:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-24 14:07:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-24 14:07:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-24 14:07:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://effectivemysql.com/).
2013-05-24 14:07:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-24 14:07:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://effectivemysql.com/).
2013-05-24 14:07:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-24 14:07:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 102
2013-05-24 14:07:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 87
2013-05-24 14:07:21 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:07:21 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:07:21 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:07:21 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:07:21 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:07:21 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:07:21 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:07:21 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (5) is70
2013-05-24 14:07:21 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 5 is dead, restarting it.
2013-05-24 14:07:21 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 5 started.
2013-05-24 14:07:21 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:07:21 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-24 14:07:22 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.continuent.com/).
2013-05-24 14:07:22 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-24 14:07:22 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-24 14:07:22 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-24 14:07:22 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-24 14:07:22 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.continuent.com/).
2013-05-24 14:07:22 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-24 14:07:22 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.continuent.com/).
2013-05-24 14:07:22 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-24 14:07:22 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 45
2013-05-24 14:07:22 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 33
2013-05-24 14:07:22 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:07:22 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:07:22 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:07:22 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:07:22 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:07:22 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:07:22 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:07:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (6) is71
2013-05-24 14:07:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 6 is dead, restarting it.
2013-05-24 14:07:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 6 started.
2013-05-24 14:07:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:07:22 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-24 14:07:27 WARN  - [com.wsc.crawler.grabber.Grabber]:344 - 6 - error: org.apache.http.conn.ConnectionPoolTimeoutException: Timeout waiting for connection from pool
2013-05-24 14:07:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:189 - Thread Count is less than numthreads In Thread :1
2013-05-24 14:07:33 INFO  - [com.wsc.crawler.grabber.Grabber]:222 - Queue is empty in for loop
2013-05-24 14:07:33 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-24 14:07:33 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-24 14:07:33 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-24 14:07:33 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-24 14:07:33 INFO  - [com.wsc.crawler.grabber.Grabber]:67 - Queue size in Grabber is 9
2013-05-24 14:07:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (0) is73
2013-05-24 14:07:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 0 is dead, restarting it.
2013-05-24 14:07:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 0 started.
2013-05-24 14:07:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:07:33 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.google.co.in/, http://74.125.236.159/, null]
2013-05-24 14:07:33 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: PREF][value: ID=a3297efca775095b:FF=0:TM=1369418853:LM=1369418853:S=YTK8gO1Kx-lLYD1Q][domain: .google.co.in][path: /][expiry: Sun May 24 14:07:33 EDT 2015]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.159"
2013-05-24 14:07:33 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: NID][value: 67=EbJNdLm-Xg_qCmW5ig8PqzbtO128gYHKANx_rC-LJfu4aXxQAMTDB-EJqgJ5h_g_6KnZK1movuggMYPzaAVo2_qk9QFPNMSZvw3kd7LNqR4RmpEX6s-tpGJxYJ09fWK5][domain: .google.co.in][path: /][expiry: Sat Nov 23 13:07:33 EST 2013]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.159"
2013-05-24 14:07:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.google.co.in/).
2013-05-24 14:07:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-24 14:07:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-24 14:07:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-24 14:07:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-24 14:07:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.google.co.in/).
2013-05-24 14:07:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-24 14:07:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.google.co.in/).
2013-05-24 14:07:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-24 14:07:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-24 14:07:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-24 14:07:33 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:07:33 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:07:33 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:07:33 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:07:33 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:07:33 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:07:33 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:07:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (1) is74
2013-05-24 14:07:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 1 is dead, restarting it.
2013-05-24 14:07:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 1 started.
2013-05-24 14:07:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:07:33 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-24 14:07:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.pythian.com/).
2013-05-24 14:07:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-24 14:07:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-24 14:07:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-24 14:07:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-24 14:07:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-24 14:07:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-24 14:07:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-24 14:07:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-24 14:07:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.pythian.com/).
2013-05-24 14:07:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.pythian.com/) is 0
2013-05-24 14:07:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 132
2013-05-24 14:07:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 90
2013-05-24 14:07:35 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:07:35 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:07:35 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:07:35 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:07:35 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:07:35 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:07:35 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:07:35 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (2) is75
2013-05-24 14:07:35 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 2 is dead, restarting it.
2013-05-24 14:07:35 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 2 started.
2013-05-24 14:07:35 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:07:35 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-24 14:07:40 WARN  - [com.wsc.crawler.grabber.Grabber]:344 - 2 - error: org.apache.http.conn.ConnectionPoolTimeoutException: Timeout waiting for connection from pool
2013-05-24 14:07:40 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (3) is76
2013-05-24 14:07:40 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 3 is dead, restarting it.
2013-05-24 14:07:40 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 3 started.
2013-05-24 14:07:40 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:07:40 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-24 14:07:41 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: SESS217349ac89be5468506443781e6d5f6c][value: b86d5e8121ced3e5a51b09780cc28a5c][domain: .palominodb.com][path: /][expiry: Sun Jun 16 17:41:07 EDT 2013]". Illegal domain attribute "palominodb.com". Domain of origin: "173.236.53.234"
2013-05-24 14:07:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://palominodb.com/).
2013-05-24 14:07:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-24 14:07:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 132
2013-05-24 14:07:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-24 14:07:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 132
2013-05-24 14:07:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://palominodb.com/).
2013-05-24 14:07:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-24 14:07:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://palominodb.com/).
2013-05-24 14:07:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-24 14:07:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 132
2013-05-24 14:07:43 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 94
2013-05-24 14:07:43 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:07:43 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:07:43 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:07:43 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:07:43 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:07:43 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:07:43 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:07:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (4) is77
2013-05-24 14:07:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 4 is dead, restarting it.
2013-05-24 14:07:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 4 started.
2013-05-24 14:07:43 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-24 14:07:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:07:44 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://effectivemysql.com/).
2013-05-24 14:07:44 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-24 14:07:44 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-24 14:07:44 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-24 14:07:44 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-24 14:07:44 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://effectivemysql.com/).
2013-05-24 14:07:44 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-24 14:07:44 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://effectivemysql.com/).
2013-05-24 14:07:44 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-24 14:07:44 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 102
2013-05-24 14:07:44 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 87
2013-05-24 14:07:44 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:07:44 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:07:44 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:07:44 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:07:44 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:07:44 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:07:44 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:07:44 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (5) is78
2013-05-24 14:07:44 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 5 is dead, restarting it.
2013-05-24 14:07:44 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 5 started.
2013-05-24 14:07:44 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:07:44 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-24 14:07:46 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.continuent.com/).
2013-05-24 14:07:46 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-24 14:07:46 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-24 14:07:46 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-24 14:07:46 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-24 14:07:46 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.continuent.com/).
2013-05-24 14:07:46 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-24 14:07:46 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.continuent.com/).
2013-05-24 14:07:46 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-24 14:07:46 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 45
2013-05-24 14:07:46 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 33
2013-05-24 14:07:46 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:07:46 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:07:46 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:07:46 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:07:46 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:07:46 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:07:46 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:07:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (6) is79
2013-05-24 14:07:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 6 is dead, restarting it.
2013-05-24 14:07:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 6 started.
2013-05-24 14:07:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:07:46 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-24 14:07:51 WARN  - [com.wsc.crawler.grabber.Grabber]:344 - 6 - error: org.apache.http.conn.ConnectionPoolTimeoutException: Timeout waiting for connection from pool
2013-05-24 14:07:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:189 - Thread Count is less than numthreads In Thread :1
2013-05-24 14:07:56 INFO  - [com.wsc.crawler.grabber.Grabber]:222 - Queue is empty in for loop
2013-05-24 14:07:56 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-24 14:07:56 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-24 14:07:56 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-24 14:07:56 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-24 14:07:56 INFO  - [com.wsc.crawler.grabber.Grabber]:67 - Queue size in Grabber is 9
2013-05-24 14:07:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (0) is81
2013-05-24 14:07:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 0 is dead, restarting it.
2013-05-24 14:07:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 0 started.
2013-05-24 14:07:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:07:56 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.google.co.in/, http://74.125.236.151/, null]
2013-05-24 14:07:57 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: PREF][value: ID=26e5737d4bccd5db:FF=0:TM=1369418876:LM=1369418876:S=h5nM7bh0r7yRaCqb][domain: .google.co.in][path: /][expiry: Sun May 24 14:07:56 EDT 2015]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.151"
2013-05-24 14:07:57 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: NID][value: 67=IAKx3PQBH8pdWk8v1KyJhvqq_Wq_omeXd1PuN-K7bro7u0NiqIGHkiDW3iAczbOZnxtWfLan6k4SlIY7cYxhuxgNSvS4FPCwicmuWhEYdYxzJ9X9ixxrgR4APebUFPUc][domain: .google.co.in][path: /][expiry: Sat Nov 23 13:07:56 EST 2013]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.151"
2013-05-24 14:07:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.google.co.in/).
2013-05-24 14:07:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-24 14:07:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-24 14:07:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-24 14:07:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-24 14:07:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.google.co.in/).
2013-05-24 14:07:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-24 14:07:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.google.co.in/).
2013-05-24 14:07:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-24 14:07:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-24 14:07:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-24 14:07:57 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:07:57 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:07:57 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:07:57 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:07:57 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:07:57 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:07:57 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:07:57 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (1) is82
2013-05-24 14:07:57 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 1 is dead, restarting it.
2013-05-24 14:07:57 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 1 started.
2013-05-24 14:07:57 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:07:57 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-24 14:07:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.pythian.com/).
2013-05-24 14:07:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-24 14:07:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-24 14:07:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-24 14:07:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-24 14:07:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-24 14:07:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-24 14:07:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-24 14:07:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-24 14:07:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.pythian.com/).
2013-05-24 14:07:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.pythian.com/) is 0
2013-05-24 14:07:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 132
2013-05-24 14:07:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 90
2013-05-24 14:07:59 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:07:59 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:07:59 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:07:59 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:07:59 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:07:59 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:07:59 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:07:59 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (2) is83
2013-05-24 14:07:59 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 2 is dead, restarting it.
2013-05-24 14:07:59 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 2 started.
2013-05-24 14:07:59 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:07:59 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-24 14:08:04 WARN  - [com.wsc.crawler.grabber.Grabber]:344 - 2 - error: org.apache.http.conn.ConnectionPoolTimeoutException: Timeout waiting for connection from pool
2013-05-24 14:08:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (3) is84
2013-05-24 14:08:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 3 is dead, restarting it.
2013-05-24 14:08:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 3 started.
2013-05-24 14:08:04 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-24 14:08:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:08:05 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: SESS217349ac89be5468506443781e6d5f6c][value: 07d227bbfb8b55db4b6054a8132931cc][domain: .palominodb.com][path: /][expiry: Sun Jun 16 17:41:31 EDT 2013]". Illegal domain attribute "palominodb.com". Domain of origin: "173.236.53.234"
2013-05-24 14:08:07 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://palominodb.com/).
2013-05-24 14:08:07 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-24 14:08:07 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 132
2013-05-24 14:08:07 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-24 14:08:07 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 132
2013-05-24 14:08:07 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://palominodb.com/).
2013-05-24 14:08:07 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-24 14:08:07 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://palominodb.com/).
2013-05-24 14:08:07 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-24 14:08:07 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 132
2013-05-24 14:08:07 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 94
2013-05-24 14:08:07 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:08:07 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:08:07 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:08:07 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:08:07 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:08:07 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:08:07 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:08:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (4) is85
2013-05-24 14:08:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 4 is dead, restarting it.
2013-05-24 14:08:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 4 started.
2013-05-24 14:08:07 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-24 14:08:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:08:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://effectivemysql.com/).
2013-05-24 14:08:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-24 14:08:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-24 14:08:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-24 14:08:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-24 14:08:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://effectivemysql.com/).
2013-05-24 14:08:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-24 14:08:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://effectivemysql.com/).
2013-05-24 14:08:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-24 14:08:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 102
2013-05-24 14:08:08 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 87
2013-05-24 14:08:08 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:08:08 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:08:08 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:08:08 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:08:08 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:08:08 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:08:08 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:08:08 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (5) is86
2013-05-24 14:08:08 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 5 is dead, restarting it.
2013-05-24 14:08:08 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 5 started.
2013-05-24 14:08:08 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-24 14:08:08 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:08:11 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.continuent.com/).
2013-05-24 14:08:11 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-24 14:08:11 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-24 14:08:11 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-24 14:08:11 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-24 14:08:11 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.continuent.com/).
2013-05-24 14:08:11 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-24 14:08:11 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.continuent.com/).
2013-05-24 14:08:11 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-24 14:08:11 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 45
2013-05-24 14:08:11 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 33
2013-05-24 14:08:11 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:08:11 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:08:11 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:08:11 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:08:11 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:08:11 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:08:11 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:08:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (6) is87
2013-05-24 14:08:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 6 is dead, restarting it.
2013-05-24 14:08:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 6 started.
2013-05-24 14:08:11 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-24 14:08:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:08:16 WARN  - [com.wsc.crawler.grabber.Grabber]:344 - 6 - error: org.apache.http.conn.ConnectionPoolTimeoutException: Timeout waiting for connection from pool
2013-05-24 14:08:21 DEBUG - [com.wsc.crawler.grabber.Grabber]:189 - Thread Count is less than numthreads In Thread :1
2013-05-24 14:08:21 INFO  - [com.wsc.crawler.grabber.Grabber]:222 - Queue is empty in for loop
2013-05-24 14:08:21 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-24 14:08:21 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-24 14:08:21 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-24 14:08:21 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-24 14:08:21 INFO  - [com.wsc.crawler.grabber.Grabber]:67 - Queue size in Grabber is 9
2013-05-24 14:08:21 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (0) is89
2013-05-24 14:08:21 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 0 is dead, restarting it.
2013-05-24 14:08:21 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 0 started.
2013-05-24 14:08:21 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:08:21 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.google.co.in/, http://74.125.236.152/, null]
2013-05-24 14:08:21 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: PREF][value: ID=b41d2dc6683728ec:FF=0:TM=1369418901:LM=1369418901:S=nCxjPj84SVq72jp9][domain: .google.co.in][path: /][expiry: Sun May 24 14:08:21 EDT 2015]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.152"
2013-05-24 14:08:21 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: NID][value: 67=dhJ5cY0PqBg5LmOocBnET_t9-3POPF4HKS67Lotvfa2Ry2lqQqvdvyRUjs-AIT3HLX4fa5NAQ4f26ba-ikZUkxeohDl0dv6HPN08SejLVHFinoG0FEIcewnPHW8HqBOA][domain: .google.co.in][path: /][expiry: Sat Nov 23 13:08:21 EST 2013]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.152"
2013-05-24 14:08:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.google.co.in/).
2013-05-24 14:08:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-24 14:08:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-24 14:08:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-24 14:08:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-24 14:08:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.google.co.in/).
2013-05-24 14:08:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-24 14:08:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.google.co.in/).
2013-05-24 14:08:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-24 14:08:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-24 14:08:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-24 14:08:21 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:08:21 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:08:21 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:08:21 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:08:21 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:08:21 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:08:21 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:08:21 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (1) is90
2013-05-24 14:08:21 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 1 is dead, restarting it.
2013-05-24 14:08:21 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 1 started.
2013-05-24 14:08:21 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:08:21 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-24 14:08:24 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.pythian.com/).
2013-05-24 14:08:24 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-24 14:08:24 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-24 14:08:24 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-24 14:08:24 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-24 14:08:24 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-24 14:08:24 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-24 14:08:24 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-24 14:08:24 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-24 14:08:24 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.pythian.com/).
2013-05-24 14:08:24 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.pythian.com/) is 0
2013-05-24 14:08:24 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 132
2013-05-24 14:08:24 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 90
2013-05-24 14:08:24 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:08:24 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:08:24 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:08:24 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:08:24 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:08:24 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:08:24 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:08:24 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (2) is91
2013-05-24 14:08:24 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 2 is dead, restarting it.
2013-05-24 14:08:24 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 2 started.
2013-05-24 14:08:24 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:08:24 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-24 14:08:29 WARN  - [com.wsc.crawler.grabber.Grabber]:344 - 2 - error: org.apache.http.conn.ConnectionPoolTimeoutException: Timeout waiting for connection from pool
2013-05-24 14:08:29 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (3) is92
2013-05-24 14:08:29 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 3 is dead, restarting it.
2013-05-24 14:08:29 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 3 started.
2013-05-24 14:08:29 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:08:29 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-24 14:08:31 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: SESS217349ac89be5468506443781e6d5f6c][value: fa36f1bfaa3572c6ad98ef66030aeb7d][domain: .palominodb.com][path: /][expiry: Sun Jun 16 17:41:57 EDT 2013]". Illegal domain attribute "palominodb.com". Domain of origin: "173.236.53.234"
2013-05-24 14:08:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://palominodb.com/).
2013-05-24 14:08:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-24 14:08:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 132
2013-05-24 14:08:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-24 14:08:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 132
2013-05-24 14:08:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://palominodb.com/).
2013-05-24 14:08:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-24 14:08:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://palominodb.com/).
2013-05-24 14:08:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-24 14:08:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 132
2013-05-24 14:08:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 94
2013-05-24 14:08:33 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:08:33 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:08:33 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:08:33 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:08:33 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:08:33 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:08:33 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:08:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (4) is93
2013-05-24 14:08:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 4 is dead, restarting it.
2013-05-24 14:08:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 4 started.
2013-05-24 14:08:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:08:33 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-24 14:08:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://effectivemysql.com/).
2013-05-24 14:08:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-24 14:08:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-24 14:08:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-24 14:08:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-24 14:08:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://effectivemysql.com/).
2013-05-24 14:08:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-24 14:08:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://effectivemysql.com/).
2013-05-24 14:08:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-24 14:08:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 102
2013-05-24 14:08:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 87
2013-05-24 14:08:35 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:08:35 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:08:35 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:08:35 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:08:35 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:08:35 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:08:35 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:08:35 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (5) is94
2013-05-24 14:08:35 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 5 is dead, restarting it.
2013-05-24 14:08:35 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 5 started.
2013-05-24 14:08:35 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:08:35 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-24 14:08:37 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.continuent.com/).
2013-05-24 14:08:37 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-24 14:08:37 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-24 14:08:37 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-24 14:08:37 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-24 14:08:37 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.continuent.com/).
2013-05-24 14:08:37 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-24 14:08:37 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.continuent.com/).
2013-05-24 14:08:37 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-24 14:08:37 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 45
2013-05-24 14:08:37 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 33
2013-05-24 14:08:37 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:08:37 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:08:37 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:08:37 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:08:37 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:08:37 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:08:37 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:08:37 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (6) is95
2013-05-24 14:08:37 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 6 is dead, restarting it.
2013-05-24 14:08:37 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 6 started.
2013-05-24 14:08:37 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-24 14:08:37 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:08:42 WARN  - [com.wsc.crawler.grabber.Grabber]:344 - 6 - error: org.apache.http.conn.ConnectionPoolTimeoutException: Timeout waiting for connection from pool
2013-05-24 14:08:47 DEBUG - [com.wsc.crawler.grabber.Grabber]:189 - Thread Count is less than numthreads In Thread :1
2013-05-24 14:08:47 INFO  - [com.wsc.crawler.grabber.Grabber]:222 - Queue is empty in for loop
2013-05-24 14:08:47 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-24 14:08:47 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-24 14:08:47 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-24 14:08:47 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-24 14:08:47 INFO  - [com.wsc.crawler.grabber.Grabber]:67 - Queue size in Grabber is 9
2013-05-24 14:08:47 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (0) is97
2013-05-24 14:08:47 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 0 is dead, restarting it.
2013-05-24 14:08:47 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 0 started.
2013-05-24 14:08:47 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:08:47 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.google.co.in/, http://74.125.236.151/, null]
2013-05-24 14:08:47 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: PREF][value: ID=4a009c86ea0483cf:FF=0:TM=1369418927:LM=1369418927:S=JOcH9EfY2t88azVY][domain: .google.co.in][path: /][expiry: Sun May 24 14:08:47 EDT 2015]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.151"
2013-05-24 14:08:47 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: NID][value: 67=exGucRwqCGeXrGQWD_OfbX1LezQvokZDEatEhzVH6yEe27N1Vy9GNoCzZXXF1D6wXZHAObdRVbl-rm1D3eI8ZwVqTeIJXgP4M0zL0spmrc4_795B6Ylv6LBdKkerXPJk][domain: .google.co.in][path: /][expiry: Sat Nov 23 13:08:47 EST 2013]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.151"
2013-05-24 14:08:48 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.google.co.in/).
2013-05-24 14:08:48 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-24 14:08:48 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-24 14:08:48 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-24 14:08:48 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-24 14:08:48 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.google.co.in/).
2013-05-24 14:08:48 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-24 14:08:48 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.google.co.in/).
2013-05-24 14:08:48 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-24 14:08:48 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-24 14:08:48 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-24 14:08:48 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:08:48 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:08:48 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:08:48 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:08:48 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:08:48 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:08:48 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:08:48 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (1) is98
2013-05-24 14:08:48 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 1 is dead, restarting it.
2013-05-24 14:08:48 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 1 started.
2013-05-24 14:08:48 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:08:48 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-24 14:08:50 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.pythian.com/).
2013-05-24 14:08:50 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-24 14:08:50 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-24 14:08:50 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-24 14:08:50 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-24 14:08:50 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-24 14:08:50 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-24 14:08:50 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-24 14:08:50 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-24 14:08:50 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.pythian.com/).
2013-05-24 14:08:50 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.pythian.com/) is 0
2013-05-24 14:08:50 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 132
2013-05-24 14:08:50 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 90
2013-05-24 14:08:50 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:08:50 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:08:50 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:08:50 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:08:50 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:08:50 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:08:50 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-24 14:08:50 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (2) is99
2013-05-24 14:08:50 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 2 is dead, restarting it.
2013-05-24 14:08:50 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 2 started.
2013-05-24 14:08:50 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:08:50 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-24 14:08:55 WARN  - [com.wsc.crawler.grabber.Grabber]:344 - 2 - error: org.apache.http.conn.ConnectionPoolTimeoutException: Timeout waiting for connection from pool
2013-05-24 14:08:55 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (3) is100
2013-05-24 14:08:55 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 3 is dead, restarting it.
2013-05-24 14:08:55 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 3 started.
2013-05-24 14:08:55 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-24 14:08:55 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:08:56 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: SESS217349ac89be5468506443781e6d5f6c][value: bd84b6a59f454ebcbe67f6a3f557e09f][domain: .palominodb.com][path: /][expiry: Sun Jun 16 17:42:22 EDT 2013]". Illegal domain attribute "palominodb.com". Domain of origin: "173.236.53.234"
2013-05-24 14:08:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://palominodb.com/).
2013-05-24 14:08:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-24 14:08:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 132
2013-05-24 14:08:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-24 14:08:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 132
2013-05-24 14:08:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://palominodb.com/).
2013-05-24 14:08:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-24 14:08:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://palominodb.com/).
2013-05-24 14:08:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-24 14:08:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 132
2013-05-24 14:08:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 94
2013-05-24 14:08:58 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:08:58 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:08:58 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:08:58 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:08:58 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:08:58 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:08:58 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-24 14:08:58 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (4) is101
2013-05-24 14:08:58 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 4 is dead, restarting it.
2013-05-24 14:08:58 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 4 started.
2013-05-24 14:08:58 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-24 14:08:58 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:09:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://effectivemysql.com/).
2013-05-24 14:09:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-24 14:09:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-24 14:09:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-24 14:09:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-24 14:09:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://effectivemysql.com/).
2013-05-24 14:09:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-24 14:09:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://effectivemysql.com/).
2013-05-24 14:09:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-24 14:09:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 102
2013-05-24 14:09:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 87
2013-05-24 14:09:00 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:09:00 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:09:00 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:09:00 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:09:00 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:09:00 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:09:00 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-24 14:09:00 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (5) is102
2013-05-24 14:09:00 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 5 is dead, restarting it.
2013-05-24 14:09:00 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 5 started.
2013-05-24 14:09:00 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:09:00 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-24 14:09:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.continuent.com/).
2013-05-24 14:09:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-24 14:09:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-24 14:09:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-24 14:09:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-24 14:09:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.continuent.com/).
2013-05-24 14:09:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-24 14:09:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.continuent.com/).
2013-05-24 14:09:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-24 14:09:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 45
2013-05-24 14:09:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 33
2013-05-24 14:09:01 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:09:01 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:09:01 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:09:01 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:09:01 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:09:01 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:09:01 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-24 14:09:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (6) is103
2013-05-24 14:09:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 6 is dead, restarting it.
2013-05-24 14:09:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 6 started.
2013-05-24 14:09:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:09:01 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-24 14:09:06 WARN  - [com.wsc.crawler.grabber.Grabber]:344 - 6 - error: org.apache.http.conn.ConnectionPoolTimeoutException: Timeout waiting for connection from pool
2013-05-24 14:09:19 DEBUG - [com.wsc.crawler.grabber.Grabber]:189 - Thread Count is less than numthreads In Thread :1
2013-05-24 14:09:19 INFO  - [com.wsc.crawler.grabber.Grabber]:222 - Queue is empty in for loop
2013-05-24 14:09:19 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-24 14:09:19 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-24 14:09:19 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-24 14:09:19 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-24 14:09:19 INFO  - [com.wsc.crawler.grabber.Grabber]:67 - Queue size in Grabber is 9
2013-05-24 14:09:19 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (0) is105
2013-05-24 14:09:19 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 0 is dead, restarting it.
2013-05-24 14:09:19 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 0 started.
2013-05-24 14:09:19 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:09:19 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.google.co.in/, http://74.125.236.151/, null]
2013-05-24 14:09:19 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: PREF][value: ID=31cebcddb570f1a0:FF=0:TM=1369418959:LM=1369418959:S=5hW4dAurK7yumonP][domain: .google.co.in][path: /][expiry: Sun May 24 14:09:19 EDT 2015]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.151"
2013-05-24 14:09:19 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: NID][value: 67=qjWFlGJHVADKzKoozJKihnzwKrKcAeCEE9ACnf8TFFGlbqrjV5QacR1S-SXglaEGZP4QJkMQdmC3VtxnQa6uza-6roJHh6KoCUjrKvYsoO6T6ungQzR6pNkdBOUA3Z3K][domain: .google.co.in][path: /][expiry: Sat Nov 23 13:09:19 EST 2013]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.151"
2013-05-24 14:09:19 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.google.co.in/).
2013-05-24 14:09:19 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-24 14:09:19 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-24 14:09:19 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-24 14:09:19 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-24 14:09:19 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.google.co.in/).
2013-05-24 14:09:19 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-24 14:09:19 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.google.co.in/).
2013-05-24 14:09:19 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-24 14:09:19 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-24 14:09:19 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-24 14:09:19 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:09:19 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:09:19 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:09:19 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:09:19 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:09:19 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:09:19 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-24 14:09:19 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (1) is106
2013-05-24 14:09:19 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 1 is dead, restarting it.
2013-05-24 14:09:19 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 1 started.
2013-05-24 14:09:19 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:09:19 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-24 14:09:22 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.pythian.com/).
2013-05-24 14:09:22 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-24 14:09:22 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-24 14:09:22 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-24 14:09:22 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-24 14:09:22 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-24 14:09:22 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-24 14:09:22 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-24 14:09:22 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-24 14:09:22 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.pythian.com/).
2013-05-24 14:09:22 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.pythian.com/) is 0
2013-05-24 14:09:22 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 132
2013-05-24 14:09:22 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 90
2013-05-24 14:09:22 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-24 14:09:22 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-24 14:09:22 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-24 14:09:22 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-24 14:09:22 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 404
2013-05-24 14:09:22 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <html><body><h1>File/home/madhu/workspace/WSCrawler/temp/?operation=urlsposted not found</h1></body></html>
2013-05-24 14:09:22 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-24 14:09:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (2) is107
2013-05-24 14:09:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 2 is dead, restarting it.
2013-05-24 14:09:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 2 started.
2013-05-24 14:09:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-24 14:09:22 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-27 13:21:14 DEBUG - [com.wsc.Crawler.HeartBeatServer]:134 - Operation heart not yet implemented
2013-05-27 13:21:38 WARN  - [com.wsc.Crawler.HeartBeatServer]:140 - Unsupported operation operatiosn found in request.
2013-05-27 14:07:50 INFO  - [com.wsc.Crawler.HeartBeatServer]:248 - Listening on port 8080
2013-05-27 14:08:01 WARN  - [com.wsc.Crawler.HeartBeatServer]:168 - Unsupported operation (operatiosn) found in request.
2013-05-27 14:08:02 WARN  - [com.wsc.Crawler.HeartBeatServer]:325 - Client closed connection
org.apache.http.ConnectionClosedException: Client closed connection
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:94)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:320)
2013-05-27 14:08:42 WARN  - [com.wsc.Crawler.HeartBeatServer]:325 - Client closed connection
org.apache.http.ConnectionClosedException: Client closed connection
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:94)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:320)
2013-05-27 14:10:28 DEBUG - [com.wsc.crawler.init.Initializer]:42 - Initializing crawler resources...
2013-05-27 14:10:28 INFO  - [com.wsc.crawler.init.Initializer]:93 - previous instance of crawler was forcibly stopped
2013-05-27 14:10:28 INFO  - [com.wsc.crawler.init.Initializer]:94 - Getting URLs from prev_frontier.xml in temp directory.
2013-05-27 14:10:28 DEBUG - [com.wsc.crawler.init.Initializer]:198 - Checking (prev_frontier.xml) existence in (./temp) directory
2013-05-27 14:10:28 WARN  - [com.wsc.crawler.init.Initializer]:207 - prev_frontier.xml is not found in ./temp
2013-05-27 14:10:28 INFO  - [com.wsc.crawler.init.Initializer]:209 - Trying to get URLs from Default URL Source, Frontier Server
2013-05-27 14:10:28 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-27 14:10:28 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-27 14:10:28 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-27 14:10:28 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - IOException is thrown while connecting (http://127.0.0.1:8080) MESG=Connection refused
2013-05-27 14:10:28 WARN  - [com.wsc.crawler.grabber.LocalClient]:133 - Frontier port is not up :http://127.0.0.1:8080
2013-05-27 14:10:28 WARN  - [com.wsc.crawler.init.Initializer]:158 - Looks like Frontier server is not running
2013-05-27 14:10:28 DEBUG - [com.wsc.crawler.init.Initializer]:160 - Trying to connect Frontier Server...
2013-05-27 14:10:28 INFO  - [com.wsc.crawler.init.Initializer]:243 - Sleeping crawling for 30sec.
2013-05-27 14:10:58 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - IOException is thrown while connecting (http://127.0.0.1:8080) MESG=Connection refused
2013-05-27 14:10:58 DEBUG - [com.wsc.crawler.init.Initializer]:160 - Trying to connect Frontier Server...
2013-05-27 14:10:58 INFO  - [com.wsc.crawler.init.Initializer]:243 - Sleeping crawling for 30sec.
2013-05-27 14:11:28 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - IOException is thrown while connecting (http://127.0.0.1:8080) MESG=Connection refused
2013-05-27 14:11:28 DEBUG - [com.wsc.crawler.init.Initializer]:160 - Trying to connect Frontier Server...
2013-05-27 14:11:28 INFO  - [com.wsc.crawler.init.Initializer]:243 - Sleeping crawling for 30sec.
2013-05-27 14:11:58 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - IOException is thrown while connecting (http://127.0.0.1:8080) MESG=Connection refused
2013-05-27 14:11:58 DEBUG - [com.wsc.crawler.init.Initializer]:160 - Trying to connect Frontier Server...
2013-05-27 14:11:58 INFO  - [com.wsc.crawler.init.Initializer]:243 - Sleeping crawling for 30sec.
2013-05-27 14:12:28 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - IOException is thrown while connecting (http://127.0.0.1:8080) MESG=Connection refused
2013-05-27 14:12:28 DEBUG - [com.wsc.crawler.init.Initializer]:160 - Trying to connect Frontier Server...
2013-05-27 14:12:28 INFO  - [com.wsc.crawler.init.Initializer]:243 - Sleeping crawling for 30sec.
2013-05-27 14:12:58 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - IOException is thrown while connecting (http://127.0.0.1:8080) MESG=Connection refused
2013-05-27 14:12:58 ERROR - [com.wsc.crawler.init.Initializer]:170 - Unable to connect to Frontier Server: it is either not running or not reachable
2013-05-27 14:12:58 ERROR - [com.wsc.crawler.init.Initializer]:171 - Exiting Crawler
2013-05-27 14:13:58 INFO  - [com.wsc.Crawler.HeartBeatServer]:247 - Listening on port 8080
2013-05-27 14:14:08 DEBUG - [com.wsc.crawler.init.Initializer]:42 - Initializing crawler resources...
2013-05-27 14:14:08 INFO  - [com.wsc.crawler.init.Initializer]:93 - previous instance of crawler was forcibly stopped
2013-05-27 14:14:08 INFO  - [com.wsc.crawler.init.Initializer]:94 - Getting URLs from prev_frontier.xml in temp directory.
2013-05-27 14:14:08 DEBUG - [com.wsc.crawler.init.Initializer]:198 - Checking (prev_frontier.xml) existence in (./temp) directory
2013-05-27 14:14:08 WARN  - [com.wsc.crawler.init.Initializer]:207 - prev_frontier.xml is not found in ./temp
2013-05-27 14:14:08 INFO  - [com.wsc.crawler.init.Initializer]:209 - Trying to get URLs from Default URL Source, Frontier Server
2013-05-27 14:14:08 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-27 14:14:08 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-27 14:14:08 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-27 14:14:08 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-27 14:14:08 WARN  - [com.wsc.Crawler.HeartBeatServer]:324 - Client closed connection
org.apache.http.ConnectionClosedException: Client closed connection
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:94)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:319)
2013-05-27 14:14:08 DEBUG - [com.wsc.crawler.init.Initializer]:176 - Frontier Server returned a Frontier of size 9
2013-05-27 14:14:09 INFO  - [com.wsc.crawler.grabber.Grabber]:119 - number of resolved hosts are :0
2013-05-27 14:14:09 INFO  - [com.wsc.crawler.grabber.Grabber]:121 - number of Unresolved hosts are :9
2013-05-27 14:14:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:165 - Threads length  is : 0
2013-05-27 14:14:14 INFO  - [com.wsc.crawler.grabber.Grabber]:237 - Queue is Empty...!
2013-05-27 14:14:14 INFO  - [com.wsc.crawler.grabber.Grabber]:238 - Refilling Queue...!
2013-05-27 14:14:14 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-27 14:14:14 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-27 14:14:14 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-27 14:14:14 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-27 14:14:14 INFO  - [com.wsc.crawler.grabber.Grabber]:67 - Queue size in Grabber is 9
2013-05-27 14:14:14 WARN  - [com.wsc.Crawler.HeartBeatServer]:324 - Client closed connection
org.apache.http.ConnectionClosedException: Client closed connection
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:94)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:319)
2013-05-27 14:14:24 INFO  - [com.wsc.crawler.grabber.Grabber]:228 - Current Thread count is max count=:1
2013-05-27 14:14:34 INFO  - [com.wsc.crawler.grabber.Grabber]:237 - Queue is Empty...!
2013-05-27 14:14:34 INFO  - [com.wsc.crawler.grabber.Grabber]:238 - Refilling Queue...!
2013-05-27 14:14:34 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-27 14:14:34 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-27 14:14:34 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-27 14:14:34 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-27 14:14:34 INFO  - [com.wsc.crawler.grabber.Grabber]:67 - Queue size in Grabber is 9
2013-05-27 14:14:34 WARN  - [com.wsc.Crawler.HeartBeatServer]:324 - Client closed connection
org.apache.http.ConnectionClosedException: Client closed connection
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:94)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:319)
2013-05-27 14:14:44 INFO  - [com.wsc.crawler.grabber.Grabber]:228 - Current Thread count is max count=:1
2013-05-27 15:15:40 DEBUG - [com.wsc.crawler.init.Initializer]:55 - Initializing crawler resources...
2013-05-27 15:15:40 WARN  - [com.wsc.crawler.init.Initializer]:91 - It looks like another instance of the crawler is running, or the previous instance was forcibly stopped.
2013-05-27 15:15:40 WARN  - [com.wsc.crawler.init.Initializer]:93 - Unable to start a new instance of the WSCrawler
2013-05-27 15:15:40 INFO  - [com.wsc.crawler.init.Initializer]:94 - Delete the crawler.lock file manually if the previous instance was forcibly stopped.
2013-05-27 15:15:40 INFO  - [com.wsc.crawler.init.Initializer]:121 - previous instance of crawler was forcibly stopped
2013-05-27 15:15:40 INFO  - [com.wsc.crawler.init.Initializer]:122 - Getting URLs from prev_frontier.xml in temp directory.
2013-05-27 15:15:40 DEBUG - [com.wsc.crawler.init.Initializer]:251 - Checking (prev_frontier.xml) existence in (./temp) directory
2013-05-27 15:15:40 WARN  - [com.wsc.crawler.init.Initializer]:260 - prev_frontier.xml is not found in ./temp
2013-05-27 15:15:40 INFO  - [com.wsc.crawler.init.Initializer]:262 - Trying to get URLs from Default URL Source, Frontier Server
2013-05-27 15:15:40 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-27 15:15:40 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-27 15:15:40 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-27 15:15:40 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-27 15:15:40 WARN  - [com.wsc.Crawler.HeartBeatServer]:324 - Client closed connection
org.apache.http.ConnectionClosedException: Client closed connection
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:94)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:319)
2013-05-27 15:15:40 DEBUG - [com.wsc.crawler.init.Initializer]:229 - Frontier Server returned a Frontier of size 9
2013-05-27 15:15:40 INFO  - [com.wsc.crawler.grabber.Grabber]:119 - number of resolved hosts are :0
2013-05-27 15:15:40 INFO  - [com.wsc.crawler.grabber.Grabber]:121 - number of Unresolved hosts are :9
2013-05-27 15:15:40 DEBUG - [com.wsc.crawler.grabber.Grabber]:165 - Threads length  is : 0
2013-05-27 15:16:53 DEBUG - [com.wsc.crawler.init.Initializer]:55 - Initializing crawler resources...
2013-05-27 15:16:53 WARN  - [com.wsc.crawler.init.Initializer]:91 - It looks like another instance of the crawler is running, or the previous instance was forcibly stopped.
2013-05-27 15:16:53 WARN  - [com.wsc.crawler.init.Initializer]:93 - Unable to start a new instance of the WSCrawler
2013-05-27 15:16:53 INFO  - [com.wsc.crawler.init.Initializer]:94 - Delete the crawler.lock file manually if the previous instance was forcibly stopped.
2013-05-27 15:45:40 DEBUG - [com.wsc.crawler.init.Initializer]:74 - Initializing crawler resources...
2013-05-27 15:45:40 INFO  - [com.wsc.crawler.init.Initializer]:149 - crawler.running created successfully.
2013-05-27 15:45:40 INFO  - [com.wsc.crawler.init.Initializer]:201 - previous instance of crawler was forcibly stopped
2013-05-27 15:45:40 INFO  - [com.wsc.crawler.init.Initializer]:202 - Getting URLs from prev_frontier.xml in temp directory.
2013-05-27 15:45:40 DEBUG - [com.wsc.crawler.init.Initializer]:329 - Checking (prev_frontier.xml) existence in (./temp) directory
2013-05-27 15:45:40 WARN  - [com.wsc.crawler.init.Initializer]:338 - prev_frontier.xml is not found in ./temp
2013-05-27 15:45:40 INFO  - [com.wsc.crawler.init.Initializer]:340 - Trying to get URLs from Default URL Source, Frontier Server
2013-05-27 15:45:40 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-27 15:45:40 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-27 15:45:40 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-27 15:45:40 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-27 15:45:40 WARN  - [com.wsc.Crawler.HeartBeatServer]:324 - Client closed connection
org.apache.http.ConnectionClosedException: Client closed connection
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:94)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:319)
2013-05-27 15:45:40 DEBUG - [com.wsc.crawler.init.Initializer]:306 - Frontier Server returned a Frontier of size 9
2013-05-27 15:45:40 INFO  - [com.wsc.crawler.grabber.Grabber]:119 - number of resolved hosts is: 0
2013-05-27 15:45:40 INFO  - [com.wsc.crawler.grabber.Grabber]:121 - number of unresolved hosts is: 9
2013-05-27 15:45:40 DEBUG - [com.wsc.crawler.grabber.Grabber]:165 - Threads length is: 0
2013-05-27 15:45:45 INFO  - [com.wsc.crawler.grabber.Grabber]:237 - Queue is Empty...!
2013-05-27 15:45:45 INFO  - [com.wsc.crawler.grabber.Grabber]:238 - Refilling Queue...!
2013-05-27 15:45:45 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-27 15:45:45 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-27 15:45:45 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-27 15:45:45 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-27 15:45:45 INFO  - [com.wsc.crawler.grabber.Grabber]:67 - Queue size in Grabber is 9
2013-05-27 15:45:45 WARN  - [com.wsc.Crawler.HeartBeatServer]:324 - Client closed connection
org.apache.http.ConnectionClosedException: Client closed connection
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:94)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:319)
2013-05-27 15:45:56 INFO  - [com.wsc.crawler.grabber.Grabber]:228 - Current Thread count is at max count: 1
2013-05-27 15:46:06 INFO  - [com.wsc.crawler.grabber.Grabber]:237 - Queue is Empty...!
2013-05-27 15:46:06 INFO  - [com.wsc.crawler.grabber.Grabber]:238 - Refilling Queue...!
2013-05-27 15:46:06 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-27 15:46:06 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-27 15:46:06 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-27 15:46:06 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - IOException is thrown while connecting (http://127.0.0.1:8080) MESG=Connection refused
2013-05-27 15:46:06 WARN  - [com.wsc.crawler.grabber.LocalClient]:133 - Frontier port is not up: http://127.0.0.1:8080
2013-05-27 15:46:06 DEBUG - [com.wsc.Crawler.WSCrawler]:99 - Cleaning Crawler resources.
2013-05-27 15:46:06 DEBUG - [com.wsc.Crawler.WSCrawler]:107 - Deleting Lock file
2013-05-27 15:46:06 DEBUG - [com.wsc.Crawler.WSCrawler]:109 - Lock file Deleted
2013-05-27 15:46:25 DEBUG - [com.wsc.crawler.init.Initializer]:74 - Initializing crawler resources...
2013-05-27 15:46:25 DEBUG - [com.wsc.crawler.init.Initializer]:113 - Creating lock file.
2013-05-27 15:46:25 INFO  - [com.wsc.crawler.init.Initializer]:118 - Lockfile crawler.lock created successfully.
2013-05-27 15:46:25 INFO  - [com.wsc.crawler.init.Initializer]:189 - previous instance of crawler was cleanly stopped
2013-05-27 15:46:25 INFO  - [com.wsc.crawler.init.Initializer]:191 - Getting URLs from Source, which is described in crawler-core.xml file
2013-05-27 15:46:25 INFO  - [com.wsc.crawler.init.Initializer]:224 - Getting Seed URLs from Frontier Server
2013-05-27 15:46:25 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-27 15:46:25 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-27 15:46:25 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-27 15:46:25 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - IOException is thrown while connecting (http://127.0.0.1:8080) MESG=Connection refused
2013-05-27 15:46:25 WARN  - [com.wsc.crawler.grabber.LocalClient]:133 - Frontier port is not up: http://127.0.0.1:8080
2013-05-27 15:46:25 WARN  - [com.wsc.crawler.init.Initializer]:288 - Looks like Frontier server is not running
2013-05-27 15:46:25 DEBUG - [com.wsc.crawler.init.Initializer]:290 - Trying to connect to Frontier Server...
2013-05-27 15:46:25 INFO  - [com.wsc.crawler.init.Initializer]:374 - Sleeping crawler for 30 sec.
2013-05-27 15:48:11 DEBUG - [com.wsc.crawler.init.Initializer]:77 - Initializing crawler resources...
2013-05-27 15:48:11 INFO  - [com.wsc.crawler.init.Initializer]:205 - previous instance of crawler was forcibly stopped
2013-05-27 15:48:11 INFO  - [com.wsc.crawler.init.Initializer]:206 - Getting URLs from prev_frontier.xml in temp directory.
2013-05-27 15:48:11 DEBUG - [com.wsc.crawler.init.Initializer]:333 - Checking (prev_frontier.xml) existence in (./temp) directory
2013-05-27 15:48:11 WARN  - [com.wsc.crawler.init.Initializer]:342 - prev_frontier.xml is not found in ./temp
2013-05-27 15:48:11 INFO  - [com.wsc.crawler.init.Initializer]:344 - Trying to get URLs from Default URL Source, Frontier Server
2013-05-27 15:48:11 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-27 15:48:11 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-27 15:48:11 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-27 15:48:11 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - IOException is thrown while connecting (http://127.0.0.1:8080) MESG=Connection refused
2013-05-27 15:48:11 WARN  - [com.wsc.crawler.grabber.LocalClient]:133 - Frontier port is not up: http://127.0.0.1:8080
2013-05-27 15:48:11 WARN  - [com.wsc.crawler.init.Initializer]:292 - Looks like Frontier server is not running
2013-05-27 15:48:11 DEBUG - [com.wsc.crawler.init.Initializer]:294 - Trying to connect to Frontier Server...
2013-05-27 15:48:11 INFO  - [com.wsc.crawler.init.Initializer]:378 - Sleeping crawler for 30 sec.
2013-05-27 15:49:30 DEBUG - [com.wsc.crawler.init.Initializer]:77 - Initializing crawler resources...
2013-05-27 15:49:30 DEBUG - [com.wsc.crawler.init.Initializer]:96 - Another instance of WSCrawler is running.
2013-05-27 15:49:30 INFO  - [com.wsc.crawler.init.Initializer]:206 - previous instance of crawler was forcibly stopped
2013-05-27 15:49:30 INFO  - [com.wsc.crawler.init.Initializer]:207 - Getting URLs from prev_frontier.xml in temp directory.
2013-05-27 15:49:30 DEBUG - [com.wsc.crawler.init.Initializer]:334 - Checking (prev_frontier.xml) existence in (./temp) directory
2013-05-27 15:49:30 WARN  - [com.wsc.crawler.init.Initializer]:343 - prev_frontier.xml is not found in ./temp
2013-05-27 15:49:30 INFO  - [com.wsc.crawler.init.Initializer]:345 - Trying to get URLs from Default URL Source, Frontier Server
2013-05-27 15:49:30 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-27 15:49:30 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-27 15:49:30 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-27 15:49:30 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - IOException is thrown while connecting (http://127.0.0.1:8080) MESG=Connection refused
2013-05-27 15:49:30 WARN  - [com.wsc.crawler.grabber.LocalClient]:133 - Frontier port is not up: http://127.0.0.1:8080
2013-05-27 15:49:30 WARN  - [com.wsc.crawler.init.Initializer]:293 - Looks like Frontier server is not running
2013-05-27 15:49:30 DEBUG - [com.wsc.crawler.init.Initializer]:295 - Trying to connect to Frontier Server...
2013-05-27 15:49:30 INFO  - [com.wsc.crawler.init.Initializer]:379 - Sleeping crawler for 30 sec.
2013-05-27 15:50:06 DEBUG - [com.wsc.crawler.init.Initializer]:77 - Initializing crawler resources...
2013-05-27 15:50:06 DEBUG - [com.wsc.crawler.init.Initializer]:101 - Another instance of WSCrawler is running.
2013-05-27 15:50:06 DEBUG - [com.wsc.crawler.init.Initializer]:102 - Exiting Crawler.
2013-05-27 15:52:29 DEBUG - [com.wsc.crawler.init.Initializer]:77 - Initializing crawler resources...
2013-05-27 15:52:29 DEBUG - [com.wsc.crawler.init.Initializer]:101 - Another instance of WSCrawler is running.
2013-05-27 15:52:29 WARN  - [com.wsc.crawler.init.Initializer]:102 - Unable to start new instance of the WSCrawler
2013-05-27 15:52:29 INFO  - [com.wsc.crawler.init.Initializer]:103 - Delete crawler.running file manually.
2013-05-27 15:52:47 DEBUG - [com.wsc.crawler.init.Initializer]:77 - Initializing crawler resources...
2013-05-27 15:52:47 DEBUG - [com.wsc.crawler.init.Initializer]:101 - Another instance of WSCrawler is running.
2013-05-27 15:52:47 WARN  - [com.wsc.crawler.init.Initializer]:102 - Unable to start new instance of the WSCrawler
2013-05-27 15:52:47 INFO  - [com.wsc.crawler.init.Initializer]:103 - Delete crawler.running file manually.
2013-05-27 15:54:51 DEBUG - [com.wsc.crawler.init.Initializer]:77 - Initializing crawler resources...
2013-05-27 15:54:51 DEBUG - [com.wsc.crawler.init.Initializer]:106 - Another instance of WSCrawler is running.
2013-05-27 15:54:51 WARN  - [com.wsc.crawler.init.Initializer]:107 - Unable to start new instance of the WSCrawler
2013-05-27 15:54:51 INFO  - [com.wsc.crawler.init.Initializer]:108 - Delete crawler.running file manually.
2013-05-27 15:55:56 DEBUG - [com.wsc.crawler.init.Initializer]:77 - Initializing crawler resources...
2013-05-27 15:55:56 DEBUG - [com.wsc.crawler.init.Initializer]:106 - Another instance of WSCrawler is running.
2013-05-27 15:55:56 WARN  - [com.wsc.crawler.init.Initializer]:107 - Unable to start new instance of the WSCrawler
2013-05-27 15:55:56 INFO  - [com.wsc.crawler.init.Initializer]:108 - Delete crawler.running file manually.
2013-05-27 15:56:23 DEBUG - [com.wsc.crawler.init.Initializer]:77 - Initializing crawler resources...
2013-05-27 15:56:23 INFO  - [com.wsc.crawler.init.Initializer]:155 - crawler.running Created Successfully.
2013-05-27 15:56:23 INFO  - [com.wsc.crawler.init.Initializer]:207 - previous instance of crawler was forcibly stopped
2013-05-27 15:56:23 INFO  - [com.wsc.crawler.init.Initializer]:208 - Getting URLs from prev_frontier.xml in temp directory.
2013-05-27 15:56:24 DEBUG - [com.wsc.crawler.init.Initializer]:335 - Checking (prev_frontier.xml) existence in (./temp) directory
2013-05-27 15:56:24 WARN  - [com.wsc.crawler.init.Initializer]:344 - prev_frontier.xml is not found in ./temp
2013-05-27 15:56:24 INFO  - [com.wsc.crawler.init.Initializer]:346 - Trying to get URLs from Default URL Source, Frontier Server
2013-05-27 15:56:24 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-27 15:56:24 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-27 15:56:24 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-27 15:56:24 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - IOException is thrown while connecting (http://127.0.0.1:8080) MESG=Connection refused
2013-05-27 15:56:24 WARN  - [com.wsc.crawler.grabber.LocalClient]:133 - Frontier port is not up: http://127.0.0.1:8080
2013-05-27 15:56:24 WARN  - [com.wsc.crawler.init.Initializer]:294 - Looks like Frontier server is not running
2013-05-27 15:56:24 DEBUG - [com.wsc.crawler.init.Initializer]:296 - Trying to connect to Frontier Server...
2013-05-27 15:56:24 INFO  - [com.wsc.crawler.init.Initializer]:380 - Sleeping crawler for 30 sec.
2013-05-27 15:57:33 DEBUG - [com.wsc.crawler.init.Initializer]:77 - Initializing crawler resources...
2013-05-27 15:57:33 DEBUG - [com.wsc.crawler.init.Initializer]:180 - Deleting crawler.running file.
2013-05-27 15:57:33 DEBUG - [com.wsc.crawler.init.Initializer]:182 - crawler.running file is deleted.
2013-05-27 15:57:33 DEBUG - [com.wsc.crawler.init.Initializer]:108 - Another instance of WSCrawler is running.
2013-05-27 15:57:33 WARN  - [com.wsc.crawler.init.Initializer]:109 - Unable to start new instance of the WSCrawler
2013-05-27 15:57:33 INFO  - [com.wsc.crawler.init.Initializer]:110 - Delete crawler.running file manually.
2013-05-27 15:57:50 DEBUG - [com.wsc.crawler.init.Initializer]:77 - Initializing crawler resources...
2013-05-27 15:57:50 DEBUG - [com.wsc.crawler.init.Initializer]:184 - File crawler.running does not exist in ./temp
2013-05-27 15:57:50 INFO  - [com.wsc.crawler.init.Initializer]:157 - crawler.running Created Successfully.
2013-05-27 15:57:50 INFO  - [com.wsc.crawler.init.Initializer]:209 - previous instance of crawler was forcibly stopped
2013-05-27 15:57:50 INFO  - [com.wsc.crawler.init.Initializer]:210 - Getting URLs from prev_frontier.xml in temp directory.
2013-05-27 15:57:50 DEBUG - [com.wsc.crawler.init.Initializer]:337 - Checking (prev_frontier.xml) existence in (./temp) directory
2013-05-27 15:57:50 WARN  - [com.wsc.crawler.init.Initializer]:346 - prev_frontier.xml is not found in ./temp
2013-05-27 15:57:50 INFO  - [com.wsc.crawler.init.Initializer]:348 - Trying to get URLs from Default URL Source, Frontier Server
2013-05-27 15:57:50 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-27 15:57:50 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-27 15:57:50 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-27 15:57:50 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - IOException is thrown while connecting (http://127.0.0.1:8080) MESG=Connection refused
2013-05-27 15:57:50 WARN  - [com.wsc.crawler.grabber.LocalClient]:133 - Frontier port is not up: http://127.0.0.1:8080
2013-05-27 15:57:50 WARN  - [com.wsc.crawler.init.Initializer]:296 - Looks like Frontier server is not running
2013-05-27 15:57:50 DEBUG - [com.wsc.crawler.init.Initializer]:298 - Trying to connect to Frontier Server...
2013-05-27 15:57:50 INFO  - [com.wsc.crawler.init.Initializer]:382 - Sleeping crawler for 30 sec.
2013-05-27 15:58:05 DEBUG - [com.wsc.crawler.init.Initializer]:77 - Initializing crawler resources...
2013-05-27 15:58:05 DEBUG - [com.wsc.crawler.init.Initializer]:180 - Deleting crawler.running file.
2013-05-27 15:58:05 DEBUG - [com.wsc.crawler.init.Initializer]:182 - crawler.running file is deleted.
2013-05-27 15:58:05 DEBUG - [com.wsc.crawler.init.Initializer]:108 - Another instance of WSCrawler is running.
2013-05-27 15:58:05 WARN  - [com.wsc.crawler.init.Initializer]:109 - Unable to start new instance of the WSCrawler
2013-05-27 15:58:05 INFO  - [com.wsc.crawler.init.Initializer]:110 - Delete crawler.running file manually.
2013-05-27 15:58:20 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - IOException is thrown while connecting (http://127.0.0.1:8080) MESG=Connection refused
2013-05-27 15:58:20 DEBUG - [com.wsc.crawler.init.Initializer]:298 - Trying to connect to Frontier Server...
2013-05-27 15:58:20 INFO  - [com.wsc.crawler.init.Initializer]:382 - Sleeping crawler for 30 sec.
2013-05-27 15:58:50 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - IOException is thrown while connecting (http://127.0.0.1:8080) MESG=Connection refused
2013-05-27 15:58:50 DEBUG - [com.wsc.crawler.init.Initializer]:298 - Trying to connect to Frontier Server...
2013-05-27 15:58:50 INFO  - [com.wsc.crawler.init.Initializer]:382 - Sleeping crawler for 30 sec.
2013-05-28 07:24:52 DEBUG - [com.wsc.crawler.init.Initializer]:78 - Initializing crawler resources...
2013-05-28 07:24:52 INFO  - [com.wsc.crawler.init.Initializer]:159 - crawler.running Created Successfully.
2013-05-28 07:24:52 INFO  - [com.wsc.crawler.init.Initializer]:211 - previous instance of crawler was forcibly stopped
2013-05-28 07:24:52 INFO  - [com.wsc.crawler.init.Initializer]:212 - Getting URLs from prev_frontier.xml in temp directory.
2013-05-28 07:24:52 DEBUG - [com.wsc.crawler.init.Initializer]:339 - Checking (prev_frontier.xml) existence in (./temp) directory
2013-05-28 07:24:52 WARN  - [com.wsc.crawler.init.Initializer]:348 - prev_frontier.xml is not found in ./temp
2013-05-28 07:24:52 INFO  - [com.wsc.crawler.init.Initializer]:350 - Trying to get URLs from Default URL Source, Frontier Server
2013-05-28 07:24:52 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 07:24:52 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-28 07:24:52 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 07:24:52 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - IOException is thrown while connecting (http://127.0.0.1:8080) MESG=Connection refused
2013-05-28 07:24:52 WARN  - [com.wsc.crawler.grabber.LocalClient]:133 - Frontier port is not up: http://127.0.0.1:8080
2013-05-28 07:24:52 WARN  - [com.wsc.crawler.init.Initializer]:298 - Looks like Frontier server is not running
2013-05-28 07:24:52 DEBUG - [com.wsc.crawler.init.Initializer]:300 - Trying to connect to Frontier Server...
2013-05-28 07:24:52 INFO  - [com.wsc.crawler.init.Initializer]:384 - Sleeping crawler for 30 sec.
2013-05-28 07:25:22 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - IOException is thrown while connecting (http://127.0.0.1:8080) MESG=Connection refused
2013-05-28 07:25:22 DEBUG - [com.wsc.crawler.init.Initializer]:300 - Trying to connect to Frontier Server...
2013-05-28 07:25:22 INFO  - [com.wsc.crawler.init.Initializer]:384 - Sleeping crawler for 30 sec.
2013-05-28 07:26:39 DEBUG - [com.wsc.crawler.init.Initializer]:78 - Initializing crawler resources...
2013-05-28 07:26:39 DEBUG - [com.wsc.crawler.init.Initializer]:107 - Another instance of WSCrawler is running.
2013-05-28 07:26:39 WARN  - [com.wsc.crawler.init.Initializer]:108 - Unable to start new instance of the WSCrawler
2013-05-28 07:26:39 INFO  - [com.wsc.crawler.init.Initializer]:109 - Delete crawler.running file manually.
2013-05-28 07:26:39 DEBUG - [com.wsc.crawler.init.Initializer]:182 - Deleting crawler.running file.
2013-05-28 07:26:39 DEBUG - [com.wsc.crawler.init.Initializer]:184 - crawler.running file is deleted.
2013-05-28 07:27:29 DEBUG - [com.wsc.crawler.init.Initializer]:78 - Initializing crawler resources...
2013-05-28 07:27:29 INFO  - [com.wsc.crawler.init.Initializer]:159 - crawler.running Created Successfully.
2013-05-28 07:27:29 INFO  - [com.wsc.crawler.init.Initializer]:211 - previous instance of crawler was forcibly stopped
2013-05-28 07:27:29 INFO  - [com.wsc.crawler.init.Initializer]:212 - Getting URLs from prev_frontier.xml in temp directory.
2013-05-28 07:27:29 DEBUG - [com.wsc.crawler.init.Initializer]:339 - Checking (prev_frontier.xml) existence in (./temp) directory
2013-05-28 07:27:29 WARN  - [com.wsc.crawler.init.Initializer]:348 - prev_frontier.xml is not found in ./temp
2013-05-28 07:27:29 INFO  - [com.wsc.crawler.init.Initializer]:350 - Trying to get URLs from Default URL Source, Frontier Server
2013-05-28 07:27:29 INFO  - [com.wsc.Crawler.HeartBeatServer]:246 - Listening on port 8080
2013-05-28 07:27:29 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 07:27:29 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-28 07:27:29 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 07:27:29 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 07:27:29 DEBUG - [com.wsc.crawler.init.Initializer]:316 - Frontier Server returned a Frontier of size 9
2013-05-28 07:27:29 WARN  - [com.wsc.Crawler.HeartBeatServer]:323 - Client closed connection
org.apache.http.ConnectionClosedException: Client closed connection
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:94)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:318)
2013-05-28 07:27:33 INFO  - [com.wsc.crawler.grabber.Grabber]:119 - number of resolved hosts is: 7
2013-05-28 07:27:33 INFO  - [com.wsc.crawler.grabber.Grabber]:121 - number of unresolved hosts is: 2
2013-05-28 07:27:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is ::[http://www.google.co.in/, http://74.125.236.223/, null]
2013-05-28 07:27:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is ::[http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 07:27:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is ::[http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 07:27:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is ::[http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 07:27:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is ::[http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 07:27:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is ::[http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 07:27:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is ::[http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 07:27:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:165 - Threads length is: 7
2013-05-28 07:27:33 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.google.co.in/, http://74.125.236.223/, null]
2013-05-28 07:27:33 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 07:27:33 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 07:27:33 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 07:27:33 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 07:27:33 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 07:27:33 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 07:27:33 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: PREF][value: ID=d3a4ff098d0e9eb3:FF=0:TM=1369740453:LM=1369740453:S=tlK6hV_ngldprdc7][domain: .google.co.in][path: /][expiry: Thu May 28 07:27:33 EDT 2015]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.223"
2013-05-28 07:27:33 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: NID][value: 67=ci5yU_mmvz_5eWDV2BdCD918CIRW9VVR06_LZWmRe8nUBPCf4BN3r94z1N5tcufK9nPtBqmg9fOjhoWnye3nv2aBKQ0l3ORDFQfqAZ5Dt1s5U8k7XbO6lFhI5k7TYZnm][domain: .google.co.in][path: /][expiry: Wed Nov 27 06:27:33 EST 2013]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.223"
2013-05-28 07:27:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://www.google.co.in/).
2013-05-28 07:27:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-28 07:27:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-28 07:27:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-28 07:27:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-28 07:27:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.google.co.in/).
2013-05-28 07:27:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-28 07:27:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.google.co.in/).
2013-05-28 07:27:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed frame Source Links size in (http://www.google.co.in/) is 0
2013-05-28 07:27:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-28 07:27:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-28 07:27:33 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing XML file to post to frontier
2013-05-28 07:27:33 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 07:27:33 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 07:27:33 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 07:27:33 DEBUG - [com.wsc.Crawler.HeartBeatServer]:159 - Operation value (urlsposted) not yet implemented
2013-05-28 07:27:33 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 07:27:33 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is 
2013-05-28 07:27:33 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-28 07:27:34 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: SESS217349ac89be5468506443781e6d5f6c][value: d111d840c487e1001e8fee38ce0ce51d][domain: .palominodb.com][path: /][expiry: Thu Jun 20 11:00:58 EDT 2013]". Illegal domain attribute "palominodb.com". Domain of origin: "173.236.53.234"
2013-05-28 07:27:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://www.continuent.com/).
2013-05-28 07:27:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-28 07:27:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-28 07:27:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-28 07:27:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-28 07:27:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.continuent.com/).
2013-05-28 07:27:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-28 07:27:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.continuent.com/).
2013-05-28 07:27:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed frame Source Links size in (http://www.continuent.com/) is 0
2013-05-28 07:27:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 45
2013-05-28 07:27:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 33
2013-05-28 07:27:35 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing XML file to post to frontier
2013-05-28 07:27:35 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 07:27:35 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 07:27:35 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 07:27:35 DEBUG - [com.wsc.Crawler.HeartBeatServer]:159 - Operation value (urlsposted) not yet implemented
2013-05-28 07:27:35 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 07:27:35 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is 
2013-05-28 07:27:35 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-28 07:27:35 WARN  - [com.wsc.Crawler.HeartBeatServer]:323 - Client closed connection
org.apache.http.ConnectionClosedException: Client closed connection
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:94)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:318)
2013-05-28 07:27:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://effectivemysql.com/).
2013-05-28 07:27:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-28 07:27:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-28 07:27:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-28 07:27:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-28 07:27:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://effectivemysql.com/).
2013-05-28 07:27:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-28 07:27:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://effectivemysql.com/).
2013-05-28 07:27:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed frame Source Links size in (http://effectivemysql.com/) is 0
2013-05-28 07:27:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 102
2013-05-28 07:27:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 87
2013-05-28 07:27:35 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing XML file to post to frontier
2013-05-28 07:27:35 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 07:27:35 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 07:27:35 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 07:27:35 DEBUG - [com.wsc.Crawler.HeartBeatServer]:159 - Operation value (urlsposted) not yet implemented
2013-05-28 07:27:35 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 07:27:35 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is 
2013-05-28 07:27:35 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-28 07:27:36 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://palominodb.com/).
2013-05-28 07:27:36 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-28 07:27:36 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 128
2013-05-28 07:27:36 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-28 07:27:36 WARN  - [com.wsc.Crawler.HeartBeatServer]:323 - Client closed connection
org.apache.http.ConnectionClosedException: Client closed connection
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:94)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:318)
2013-05-28 07:27:36 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 128
2013-05-28 07:27:36 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://palominodb.com/).
2013-05-28 07:27:36 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-28 07:27:36 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://palominodb.com/).
2013-05-28 07:27:36 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-28 07:27:36 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 128
2013-05-28 07:27:36 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 91
2013-05-28 07:27:36 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 07:27:36 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 07:27:36 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 07:27:36 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 07:27:36 DEBUG - [com.wsc.Crawler.HeartBeatServer]:159 - Opeartion value (urlsposted) not yet Implemented
2013-05-28 07:27:36 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 07:27:36 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is 
2013-05-28 07:27:36 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-28 07:27:36 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.pythian.com/).
2013-05-28 07:27:36 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-28 07:27:36 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-28 07:27:36 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-28 07:27:36 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-28 07:27:36 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-28 07:27:36 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-28 07:27:36 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-28 07:27:36 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-28 07:27:36 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.pythian.com/).
2013-05-28 07:27:36 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.pythian.com/) is 0
2013-05-28 07:27:36 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 132
2013-05-28 07:27:36 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 90
2013-05-28 07:27:36 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 07:27:36 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 07:27:36 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 07:27:36 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 07:27:36 DEBUG - [com.wsc.Crawler.HeartBeatServer]:159 - Opeartion value (urlsposted) not yet Implemented
2013-05-28 07:27:36 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 07:27:36 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is 
2013-05-28 07:27:36 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-28 07:27:38 WARN  - [com.wsc.Crawler.HeartBeatServer]:327 - I/O error 
java.net.SocketTimeoutException: Read timed out
	at java.net.SocketInputStream.socketRead0(Native Method)
	at java.net.SocketInputStream.read(Unknown Source)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.fillBuffer(AbstractSessionInputBuffer.java:166)
	at org.apache.http.impl.io.SocketInputBuffer.fillBuffer(SocketInputBuffer.java:90)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.readLine(AbstractSessionInputBuffer.java:281)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:92)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:318)
2013-05-28 07:27:41 WARN  - [com.wsc.Crawler.HeartBeatServer]:327 - I/O error 
java.net.SocketTimeoutException: Read timed out
	at java.net.SocketInputStream.socketRead0(Native Method)
	at java.net.SocketInputStream.read(Unknown Source)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.fillBuffer(AbstractSessionInputBuffer.java:166)
	at org.apache.http.impl.io.SocketInputBuffer.fillBuffer(SocketInputBuffer.java:90)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.readLine(AbstractSessionInputBuffer.java:281)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:92)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:318)
2013-05-28 07:27:41 INFO  - [com.wsc.crawler.grabber.Grabber]:237 - Queue is Empty...!
2013-05-28 07:27:41 INFO  - [com.wsc.crawler.grabber.Grabber]:238 - Refilling Queue...!
2013-05-28 07:27:41 WARN  - [com.wsc.Crawler.HeartBeatServer]:327 - I/O error 
java.net.SocketTimeoutException: Read timed out
	at java.net.SocketInputStream.socketRead0(Native Method)
	at java.net.SocketInputStream.read(Unknown Source)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.fillBuffer(AbstractSessionInputBuffer.java:166)
	at org.apache.http.impl.io.SocketInputBuffer.fillBuffer(SocketInputBuffer.java:90)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.readLine(AbstractSessionInputBuffer.java:281)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:92)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:318)
2013-05-28 07:27:41 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 07:27:41 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-28 07:27:41 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 07:27:41 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 07:27:41 INFO  - [com.wsc.crawler.grabber.Grabber]:67 - Queue size in Grabber is 9
2013-05-28 07:27:41 WARN  - [com.wsc.Crawler.HeartBeatServer]:323 - Client closed connection
org.apache.http.ConnectionClosedException: Client closed connection
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:94)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:318)
2013-05-28 07:27:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:189 - Thread Count is less than numthreads In Thread :1
2013-05-28 07:27:51 INFO  - [com.wsc.crawler.grabber.Grabber]:222 - Queue is empty in for loop
2013-05-28 07:27:51 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 07:27:51 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-28 07:27:51 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 07:27:51 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 07:27:51 INFO  - [com.wsc.crawler.grabber.Grabber]:67 - Queue size in Grabber is 9
2013-05-28 07:27:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (0) is12
2013-05-28 07:27:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 0 is dead, restarting it.
2013-05-28 07:27:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 0 started.
2013-05-28 07:27:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-28 07:27:51 WARN  - [com.wsc.Crawler.HeartBeatServer]:323 - Client closed connection
org.apache.http.ConnectionClosedException: Client closed connection
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:94)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:318)
2013-05-28 07:27:51 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.google.co.in/, http://74.125.236.223/, null]
2013-05-28 07:27:51 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: PREF][value: ID=944fe60e5b0160ee:FF=0:TM=1369740471:LM=1369740471:S=ysXKLA2tv5B4abno][domain: .google.co.in][path: /][expiry: Thu May 28 07:27:51 EDT 2015]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.223"
2013-05-28 07:27:51 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: NID][value: 67=YXtNkPZzEfyGDxmK1Rv_KF5cR9MAuQNWq5NsCwozBY_tv4PS_sO3M15k25u9G0wwGx2TdntAyv0PRKV0oih7x88qfkcmYlMufo-ktwFtO7E2uClAQICentk4EF-W7VLG][domain: .google.co.in][path: /][expiry: Wed Nov 27 06:27:51 EST 2013]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.223"
2013-05-28 07:27:51 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.google.co.in/).
2013-05-28 07:27:51 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-28 07:27:51 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-28 07:27:51 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-28 07:27:51 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-28 07:27:51 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.google.co.in/).
2013-05-28 07:27:51 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-28 07:27:51 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.google.co.in/).
2013-05-28 07:27:51 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-28 07:27:51 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-28 07:27:51 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-28 07:27:51 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 07:27:51 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 07:27:51 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 07:27:51 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 07:27:51 DEBUG - [com.wsc.Crawler.HeartBeatServer]:159 - Opeartion value (urlsposted) not yet Implemented
2013-05-28 07:27:51 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 07:27:51 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is 
2013-05-28 07:27:51 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-28 07:27:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (1) is13
2013-05-28 07:27:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 1 is dead, restarting it.
2013-05-28 07:27:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 1 started.
2013-05-28 07:27:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-28 07:27:51 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 07:27:53 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.pythian.com/).
2013-05-28 07:27:53 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-28 07:27:53 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-28 07:27:53 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-28 07:27:53 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-28 07:27:53 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-28 07:27:53 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-28 07:27:53 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-28 07:27:53 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-28 07:27:53 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.pythian.com/).
2013-05-28 07:27:53 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.pythian.com/) is 0
2013-05-28 07:27:53 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 132
2013-05-28 07:27:53 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 90
2013-05-28 07:27:53 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 07:27:53 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 07:27:53 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 07:27:53 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 07:27:53 DEBUG - [com.wsc.Crawler.HeartBeatServer]:159 - Opeartion value (urlsposted) not yet Implemented
2013-05-28 07:27:53 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 07:27:53 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is 
2013-05-28 07:27:53 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-28 07:27:53 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (2) is14
2013-05-28 07:27:53 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 2 is dead, restarting it.
2013-05-28 07:27:53 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 2 started.
2013-05-28 07:27:53 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 07:27:53 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-28 07:27:54 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (3) is15
2013-05-28 07:27:54 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 3 is dead, restarting it.
2013-05-28 07:27:54 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 3 started.
2013-05-28 07:27:54 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-28 07:27:54 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 07:27:56 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: SESS217349ac89be5468506443781e6d5f6c][value: 3a86834b4e57fd1242eca75d749ae756][domain: .palominodb.com][path: /][expiry: Thu Jun 20 11:01:19 EDT 2013]". Illegal domain attribute "palominodb.com". Domain of origin: "173.236.53.234"
2013-05-28 07:27:56 WARN  - [com.wsc.Crawler.HeartBeatServer]:327 - I/O error 
java.net.SocketTimeoutException: Read timed out
	at java.net.SocketInputStream.socketRead0(Native Method)
	at java.net.SocketInputStream.read(Unknown Source)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.fillBuffer(AbstractSessionInputBuffer.java:166)
	at org.apache.http.impl.io.SocketInputBuffer.fillBuffer(SocketInputBuffer.java:90)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.readLine(AbstractSessionInputBuffer.java:281)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:92)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:318)
2013-05-28 07:27:57 WARN  - [com.wsc.Crawler.HeartBeatServer]:323 - Client closed connection
org.apache.http.ConnectionClosedException: Client closed connection
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:94)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:318)
2013-05-28 07:27:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://palominodb.com/).
2013-05-28 07:27:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-28 07:27:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 128
2013-05-28 07:27:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-28 07:27:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 128
2013-05-28 07:27:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://palominodb.com/).
2013-05-28 07:27:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-28 07:27:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://palominodb.com/).
2013-05-28 07:27:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-28 07:27:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 128
2013-05-28 07:27:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 91
2013-05-28 07:27:57 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 07:27:57 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 07:27:57 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 07:27:57 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 07:27:57 DEBUG - [com.wsc.Crawler.HeartBeatServer]:159 - Opeartion value (urlsposted) not yet Implemented
2013-05-28 07:27:57 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 07:27:57 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is 
2013-05-28 07:27:57 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-28 07:27:57 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (4) is16
2013-05-28 07:27:57 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 4 is dead, restarting it.
2013-05-28 07:27:57 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 4 started.
2013-05-28 07:27:57 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-28 07:27:57 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 07:27:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://effectivemysql.com/).
2013-05-28 07:27:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-28 07:27:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-28 07:27:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-28 07:27:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-28 07:27:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://effectivemysql.com/).
2013-05-28 07:27:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-28 07:27:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://effectivemysql.com/).
2013-05-28 07:27:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-28 07:27:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 102
2013-05-28 07:27:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 87
2013-05-28 07:27:58 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 07:27:58 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 07:27:58 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 07:27:58 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 07:27:58 DEBUG - [com.wsc.Crawler.HeartBeatServer]:159 - Opeartion value (urlsposted) not yet Implemented
2013-05-28 07:27:58 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 07:27:58 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is 
2013-05-28 07:27:58 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-28 07:27:58 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (5) is17
2013-05-28 07:27:58 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 5 is dead, restarting it.
2013-05-28 07:27:58 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 5 started.
2013-05-28 07:27:58 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-28 07:27:58 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 07:28:00 WARN  - [com.wsc.Crawler.HeartBeatServer]:323 - Client closed connection
org.apache.http.ConnectionClosedException: Client closed connection
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:94)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:318)
2013-05-28 07:28:00 WARN  - [com.wsc.Crawler.HeartBeatServer]:323 - Client closed connection
org.apache.http.ConnectionClosedException: Client closed connection
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:94)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:318)
2013-05-28 07:28:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.continuent.com/).
2013-05-28 07:28:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-28 07:28:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-28 07:28:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-28 07:28:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-28 07:28:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.continuent.com/).
2013-05-28 07:28:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-28 07:28:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.continuent.com/).
2013-05-28 07:28:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-28 07:28:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 45
2013-05-28 07:28:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 33
2013-05-28 07:28:00 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 07:28:00 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 07:28:00 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 07:28:00 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 07:28:00 DEBUG - [com.wsc.Crawler.HeartBeatServer]:159 - Opeartion value (urlsposted) not yet Implemented
2013-05-28 07:28:00 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 07:28:00 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is 
2013-05-28 07:28:00 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing POST connection after posting URLs
2013-05-28 07:28:00 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (6) is 18
2013-05-28 07:28:00 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 6 is dead, restarting it.
2013-05-28 07:28:00 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 6 started.
2013-05-28 07:28:00 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is: 7
2013-05-28 07:28:00 INFO  - [com.wsc.crawler.grabber.Grabber]:327 - About to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
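"Thread 6 is dead, restarting it" cannot literally mean calling start() on the same Thread again: a terminated java.lang.Thread is not restartable, so the Grabber must be replacing the dead entry with a fresh Thread. A sketch of that liveness sweep over the thread list (names are illustrative, not the Grabber's actual fields):

```java
import java.util.ArrayList;
import java.util.List;

public class ThreadSweeper {
    // Replaces any non-alive worker with a new Thread running the same task.
    // Calling start() on a terminated Thread would instead throw
    // IllegalThreadStateException.
    static void restartDead(List<Thread> threads, Runnable task) {
        for (int i = 0; i < threads.size(); i++) {
            if (!threads.get(i).isAlive()) {
                Thread fresh = new Thread(task, "worker-" + i);
                threads.set(i, fresh);
                fresh.start();
            }
        }
    }

    public static void main(String[] args) throws InterruptedException {
        List<Thread> threads = new ArrayList<>();
        Thread dead = new Thread(() -> {});
        dead.start();
        dead.join(); // now terminated, isAlive() == false
        threads.add(dead);
        restartDead(threads, () -> {});
        System.out.println(threads.get(0) != dead); // true: dead entry replaced
    }
}
```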
2013-05-28 07:28:05 WARN  - [com.wsc.Crawler.HeartBeatServer]:327 - I/O error 
java.net.SocketTimeoutException: Read timed out
	at java.net.SocketInputStream.socketRead0(Native Method)
	at java.net.SocketInputStream.read(Unknown Source)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.fillBuffer(AbstractSessionInputBuffer.java:166)
	at org.apache.http.impl.io.SocketInputBuffer.fillBuffer(SocketInputBuffer.java:90)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.readLine(AbstractSessionInputBuffer.java:281)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:92)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:318)
2013-05-28 07:28:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:189 - Thread count is less than numThreads in thread: 1
2013-05-28 07:28:07 INFO  - [com.wsc.crawler.grabber.Grabber]:222 - Queue is empty in for loop
2013-05-28 07:28:07 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - Pinging the host 127.0.0.1.
2013-05-28 07:28:07 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - Pinging the host 127.0.0.1 is successful.
2013-05-28 07:28:07 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 07:28:07 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - The host http://127.0.0.1:8080 is reachable.
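The LocalClient lines above describe a two-step probe: first a ping-style check of the address, then an HTTP-level check of the port. A hedged sketch of that pattern using only the JDK (these method names are illustrative, not LocalClient's actual API):

```java
import java.io.IOException;
import java.net.InetAddress;
import java.net.InetSocketAddress;
import java.net.Socket;

public class ReachabilityProbe {
    // Step 1: is the host up at all? isReachable uses ICMP when privileged,
    // otherwise a TCP echo probe.
    static boolean pingHost(String host, int timeoutMs) throws IOException {
        return InetAddress.getByName(host).isReachable(timeoutMs);
    }

    // Step 2: is anything listening on the port? A plain TCP connect is
    // enough to confirm "http://host:port is reachable".
    static boolean portOpen(String host, int port, int timeoutMs) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println("ping 127.0.0.1: " + pingHost("127.0.0.1", 1000));
        System.out.println("port 8080 open: " + portOpen("127.0.0.1", 8080, 1000));
    }
}
```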
2013-05-28 07:28:07 INFO  - [com.wsc.crawler.grabber.Grabber]:67 - Queue size in Grabber is 9
2013-05-28 07:28:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (0) is 27
2013-05-28 07:28:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 0 is dead, restarting it.
2013-05-28 07:28:07 WARN  - [com.wsc.Crawler.HeartBeatServer]:323 - Client closed connection
org.apache.http.ConnectionClosedException: Client closed connection
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:94)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:318)
2013-05-28 07:28:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 0 started.
2013-05-28 07:28:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is: 7
2013-05-28 07:28:07 INFO  - [com.wsc.crawler.grabber.Grabber]:327 - About to get something from [http://www.google.co.in/, http://74.125.236.215/, null]
2013-05-28 07:28:07 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: PREF][value: ID=b03a5cb3e6a8a4f5:FF=0:TM=1369740487:LM=1369740487:S=xb9H7zr3BozXyvic][domain: .google.co.in][path: /][expiry: Thu May 28 07:28:07 EDT 2015]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.215"
2013-05-28 07:28:07 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: NID][value: 67=vbv2GNW1XjqEn2V4xfVF_HEvctVO9CqfCz91LCCzXPJIA-JWhFK7rTEdH28Nx16fMLJ1vmdn1_yuHSXrJbXBcvc30SAbZrRY94ERD2VmE2E_yNsdP4GsS0PjTkJ_TRPC][domain: .google.co.in][path: /][expiry: Wed Nov 27 06:28:07 EST 2013]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.215"
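The two cookie warnings above are expected when fetching by raw IP: the crawler requested http://74.125.236.215/, but Google's Set-Cookie header specifies Domain=.google.co.in, and HttpClient's cookie policy rejects any cookie whose domain attribute does not match the origin host. A simplified, hypothetical version of that domain-match rule (in the spirit of RFC 6265, not HttpClient's implementation):

```java
public class CookieDomainCheck {
    // Simplified domain matching: a cookie domain like ".google.co.in"
    // matches "www.google.co.in", but an IP-literal origin such as
    // "74.125.236.215" must match the cookie domain exactly.
    static boolean domainMatches(String originHost, String cookieDomain) {
        String domain = cookieDomain.startsWith(".")
                ? cookieDomain.substring(1) : cookieDomain;
        if (originHost.matches("\\d{1,3}(\\.\\d{1,3}){3}")) {
            return originHost.equals(domain); // IP origin: exact match only
        }
        return originHost.equals(domain) || originHost.endsWith("." + domain);
    }

    public static void main(String[] args) {
        System.out.println(domainMatches("www.google.co.in", ".google.co.in")); // true
        System.out.println(domainMatches("74.125.236.215", ".google.co.in"));   // false
    }
}
```

Fetching by hostname instead of the resolved IP (or setting the target virtual host on the request) would avoid these rejections.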
2013-05-28 07:28:07 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are added to one list from (http://www.google.co.in/).
2013-05-28 07:28:07 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-28 07:28:07 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-28 07:28:07 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-28 07:28:07 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-28 07:28:07 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.google.co.in/).
2013-05-28 07:28:07 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-28 07:28:07 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.google.co.in/).
2013-05-28 07:28:07 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed frameSourceLinks size in (http://www.google.co.in/) is 0
2013-05-28 07:28:07 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Size of hyperLinks found in (http://www.google.co.in/) before duplicate elimination is 28
2013-05-28 07:28:07 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - Size of hyperLinks found in (http://www.google.co.in/) after duplicate elimination is 28
2013-05-28 07:28:07 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing XML file to post to frontier
2013-05-28 07:28:07 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 07:28:07 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 07:28:07 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from HTML to frontier Server is: http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 07:28:07 DEBUG - [com.wsc.Crawler.HeartBeatServer]:159 - Operation value (urlsposted) not yet implemented
2013-05-28 07:28:07 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 07:28:07 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is 
2013-05-28 07:28:07 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing POST connection after posting URLs
2013-05-28 07:28:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (1) is 29
2013-05-28 07:28:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 1 is dead, restarting it.
2013-05-28 07:28:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 1 started.
2013-05-28 07:28:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is: 7
2013-05-28 07:28:07 INFO  - [com.wsc.crawler.grabber.Grabber]:327 - About to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 07:28:09 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are added to one list from (http://www.pythian.com/).
2013-05-28 07:28:09 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-28 07:28:09 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-28 07:28:09 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-28 07:28:09 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-28 07:28:09 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-28 07:28:09 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-28 07:28:09 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-28 07:28:09 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-28 07:28:09 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.pythian.com/).
2013-05-28 07:28:09 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed frameSourceLinks size in (http://www.pythian.com/) is 0
2013-05-28 07:28:09 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Size of hyperLinks found in (http://www.pythian.com/) before duplicate elimination is 132
2013-05-28 07:28:09 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - Size of hyperLinks found in (http://www.pythian.com/) after duplicate elimination is 90
2013-05-28 07:28:09 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing XML file to post to frontier
2013-05-28 07:28:09 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 07:28:09 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 07:28:09 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from HTML to frontier Server is: http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 07:28:09 WARN  - [com.wsc.Crawler.HeartBeatServer]:323 - Client closed connection
org.apache.http.ConnectionClosedException: Client closed connection
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:94)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:318)
2013-05-28 07:28:09 DEBUG - [com.wsc.Crawler.HeartBeatServer]:159 - Operation value (urlsposted) not yet implemented
2013-05-28 07:28:09 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 07:28:09 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is 
2013-05-28 07:28:09 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing POST connection after posting URLs
2013-05-28 07:28:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (2) is 31
2013-05-28 07:28:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 2 is dead, restarting it.
2013-05-28 07:28:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 2 started.
2013-05-28 07:28:09 INFO  - [com.wsc.crawler.grabber.Grabber]:327 - About to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 07:28:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is: 7
2013-05-28 07:28:14 WARN  - [com.wsc.crawler.grabber.Grabber]:344 - 2 - error: org.apache.http.conn.ConnectionPoolTimeoutException: Timeout waiting for connection from pool
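A ConnectionPoolTimeoutException means the grabber thread waited longer than the pool's lease timeout for a free connection, typically because earlier responses were never released back to the pool. The failure mode can be reproduced with a plain semaphore standing in for the connection pool (a sketch of the mechanism, not HttpClient's implementation):

```java
import java.util.concurrent.Semaphore;
import java.util.concurrent.TimeUnit;

public class PoolTimeoutDemo {
    public static void main(String[] args) throws InterruptedException {
        // A "pool" with 2 connections.
        Semaphore pool = new Semaphore(2);

        // Two earlier requests leased connections and never released them
        // (e.g. a response entity that was never consumed or closed).
        pool.acquire();
        pool.acquire();

        // The next lease attempt times out -- the analogue of
        // "Timeout waiting for connection from pool" in the log.
        boolean gotConnection = pool.tryAcquire(500, TimeUnit.MILLISECONDS);
        System.out.println("got connection: " + gotConnection); // false
    }
}
```

The usual fix is to release every connection in a finally block, which matches the "Releasing POST connection after posting URLs" lines the crawler emits on its success path.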
2013-05-28 07:28:14 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (3) is 32
2013-05-28 07:28:14 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 3 is dead, restarting it.
2013-05-28 07:28:14 WARN  - [com.wsc.Crawler.HeartBeatServer]:327 - I/O error 
java.net.SocketTimeoutException: Read timed out
	at java.net.SocketInputStream.socketRead0(Native Method)
	at java.net.SocketInputStream.read(Unknown Source)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.fillBuffer(AbstractSessionInputBuffer.java:166)
	at org.apache.http.impl.io.SocketInputBuffer.fillBuffer(SocketInputBuffer.java:90)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.readLine(AbstractSessionInputBuffer.java:281)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:92)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:318)
2013-05-28 07:28:14 INFO  - [com.wsc.crawler.grabber.Grabber]:327 - About to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 07:28:14 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 3 started.
2013-05-28 07:28:14 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is: 7
2013-05-28 07:28:15 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: SESS217349ac89be5468506443781e6d5f6c][value: 2135dd4b734ad9eddc901aa77a16a0d3][domain: .palominodb.com][path: /][expiry: Thu Jun 20 11:01:38 EDT 2013]". Illegal domain attribute "palominodb.com". Domain of origin: "173.236.53.234"
2013-05-28 07:28:16 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are added to one list from (http://palominodb.com/).
2013-05-28 07:28:16 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-28 07:28:16 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 128
2013-05-28 07:28:16 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-28 07:28:16 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 128
2013-05-28 07:28:16 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://palominodb.com/).
2013-05-28 07:28:16 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-28 07:28:16 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://palominodb.com/).
2013-05-28 07:28:16 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed frameSourceLinks size in (http://palominodb.com/) is 0
2013-05-28 07:28:16 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Size of hyperLinks found in (http://palominodb.com/) before duplicate elimination is 128
2013-05-28 07:28:16 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - Size of hyperLinks found in (http://palominodb.com/) after duplicate elimination is 91
2013-05-28 07:28:16 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing XML file to post to frontier
2013-05-28 07:28:16 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 07:28:16 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 07:28:16 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from HTML to frontier Server is: http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 07:28:16 DEBUG - [com.wsc.Crawler.HeartBeatServer]:159 - Operation value (urlsposted) not yet implemented
2013-05-28 07:28:16 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 07:28:16 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is 
2013-05-28 07:28:16 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing POST connection after posting URLs
2013-05-28 07:28:16 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (4) is 34
2013-05-28 07:28:16 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 4 is dead, restarting it.
2013-05-28 07:28:16 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 4 started.
2013-05-28 07:28:16 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is: 7
2013-05-28 07:28:16 INFO  - [com.wsc.crawler.grabber.Grabber]:327 - About to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 07:28:17 WARN  - [com.wsc.Crawler.HeartBeatServer]:323 - Client closed connection
org.apache.http.ConnectionClosedException: Client closed connection
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:94)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:318)
2013-05-28 07:28:18 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are added to one list from (http://effectivemysql.com/).
2013-05-28 07:28:18 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-28 07:28:18 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-28 07:28:18 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-28 07:28:18 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-28 07:28:18 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://effectivemysql.com/).
2013-05-28 07:28:18 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-28 07:28:18 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://effectivemysql.com/).
2013-05-28 07:28:18 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed frameSourceLinks size in (http://effectivemysql.com/) is 0
2013-05-28 07:28:18 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Size of hyperLinks found in (http://effectivemysql.com/) before duplicate elimination is 102
2013-05-28 07:28:18 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - Size of hyperLinks found in (http://effectivemysql.com/) after duplicate elimination is 87
2013-05-28 07:28:18 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing XML file to post to frontier
2013-05-28 07:28:18 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 07:28:18 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 07:28:18 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from HTML to frontier Server is: http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 07:28:18 DEBUG - [com.wsc.Crawler.HeartBeatServer]:159 - Operation value (urlsposted) not yet implemented
2013-05-28 07:28:18 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 07:28:18 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is 
2013-05-28 07:28:18 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing POST connection after posting URLs
2013-05-28 07:28:18 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (5) is 36
2013-05-28 07:28:18 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 5 is dead, restarting it.
2013-05-28 07:28:18 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 5 started.
2013-05-28 07:28:18 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is: 7
2013-05-28 07:28:18 INFO  - [com.wsc.crawler.grabber.Grabber]:327 - About to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 07:28:20 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are added to one list from (http://www.continuent.com/).
2013-05-28 07:28:20 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-28 07:28:20 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-28 07:28:20 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-28 07:28:20 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-28 07:28:20 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.continuent.com/).
2013-05-28 07:28:20 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-28 07:28:20 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.continuent.com/).
2013-05-28 07:28:20 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed frameSourceLinks size in (http://www.continuent.com/) is 0
2013-05-28 07:28:20 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Size of hyperLinks found in (http://www.continuent.com/) before duplicate elimination is 45
2013-05-28 07:28:20 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - Size of hyperLinks found in (http://www.continuent.com/) after duplicate elimination is 33
2013-05-28 07:28:20 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing XML file to post to frontier
2013-05-28 07:28:20 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 07:28:20 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 07:28:20 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from HTML to frontier Server is: http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 07:28:20 DEBUG - [com.wsc.Crawler.HeartBeatServer]:159 - Operation value (urlsposted) not yet implemented
2013-05-28 07:28:20 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 07:28:20 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is 
2013-05-28 07:28:20 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing POST connection after posting URLs
2013-05-28 07:28:20 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (6) is 38
2013-05-28 07:28:20 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 6 is dead, restarting it.
2013-05-28 07:28:20 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 6 started.
2013-05-28 07:28:20 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is: 7
2013-05-28 07:28:20 INFO  - [com.wsc.crawler.grabber.Grabber]:327 - About to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 07:28:23 WARN  - [com.wsc.Crawler.HeartBeatServer]:327 - I/O error 
java.net.SocketTimeoutException: Read timed out
	at java.net.SocketInputStream.socketRead0(Native Method)
	at java.net.SocketInputStream.read(Unknown Source)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.fillBuffer(AbstractSessionInputBuffer.java:166)
	at org.apache.http.impl.io.SocketInputBuffer.fillBuffer(SocketInputBuffer.java:90)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.readLine(AbstractSessionInputBuffer.java:281)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:92)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:318)
2013-05-28 07:28:25 WARN  - [com.wsc.crawler.grabber.Grabber]:344 - 6 - error: org.apache.http.conn.ConnectionPoolTimeoutException: Timeout waiting for connection from pool
2013-05-28 07:28:25 WARN  - [com.wsc.Crawler.HeartBeatServer]:327 - I/O error 
java.net.SocketTimeoutException: Read timed out
	at java.net.SocketInputStream.socketRead0(Native Method)
	at java.net.SocketInputStream.read(Unknown Source)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.fillBuffer(AbstractSessionInputBuffer.java:166)
	at org.apache.http.impl.io.SocketInputBuffer.fillBuffer(SocketInputBuffer.java:90)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.readLine(AbstractSessionInputBuffer.java:281)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:92)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:318)
2013-05-28 07:38:21 DEBUG - [com.wsc.crawler.init.Initializer]:78 - Initializing crawler resources...
2013-05-28 07:38:21 DEBUG - [com.wsc.crawler.init.Initializer]:107 - Another instance of WSCrawler is running.
2013-05-28 07:38:21 WARN  - [com.wsc.crawler.init.Initializer]:108 - Unable to start a new instance of WSCrawler.
2013-05-28 07:38:21 INFO  - [com.wsc.crawler.init.Initializer]:109 - Delete the crawler.running file manually.
2013-05-28 07:38:21 DEBUG - [com.wsc.crawler.init.Initializer]:182 - Deleting crawler.running file.
2013-05-28 07:38:21 DEBUG - [com.wsc.crawler.init.Initializer]:184 - crawler.running file is deleted.
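The crawler.running file acts as a single-instance lock: if it already exists at startup, the Initializer refuses to start, and (as the lines above show) the stale file is then deleted so the next launch succeeds. A minimal sketch of that guard, assuming the marker lives in the working directory (hypothetical helper, not the Initializer's actual code):

```java
import java.io.File;
import java.io.IOException;

public class SingleInstanceGuard {
    // Returns true if this process won the lock, false if another instance
    // (or a stale marker left by a crash) holds it. createNewFile is atomic:
    // it fails if the file already exists.
    static boolean tryLock(File marker) throws IOException {
        return marker.createNewFile();
    }

    public static void main(String[] args) throws IOException {
        File marker = new File("crawler.running");
        if (!tryLock(marker)) {
            System.err.println("Another instance of WSCrawler is running.");
            return;
        }
        marker.deleteOnExit(); // best-effort cleanup on normal shutdown
        System.out.println("crawler.running created successfully.");
    }
}
```

As the 07:38:28 restart below shows, a crash leaves the marker behind, so this scheme still needs the manual or automatic stale-file cleanup the Initializer performs.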
2013-05-28 07:38:28 DEBUG - [com.wsc.crawler.init.Initializer]:78 - Initializing crawler resources...
2013-05-28 07:38:28 INFO  - [com.wsc.crawler.init.Initializer]:159 - crawler.running created successfully.
2013-05-28 07:38:28 INFO  - [com.wsc.crawler.init.Initializer]:211 - Previous instance of crawler was forcibly stopped.
2013-05-28 07:38:28 INFO  - [com.wsc.crawler.init.Initializer]:212 - Getting URLs from prev_frontier.xml in temp directory.
2013-05-28 07:38:28 DEBUG - [com.wsc.crawler.init.Initializer]:339 - Checking (prev_frontier.xml) existence in (./temp) directory
2013-05-28 07:38:28 WARN  - [com.wsc.crawler.init.Initializer]:348 - prev_frontier.xml is not found in ./temp
2013-05-28 07:38:28 INFO  - [com.wsc.crawler.init.Initializer]:350 - Trying to get URLs from the default URL source, the Frontier Server
2013-05-28 07:38:28 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - Pinging the host 127.0.0.1.
2013-05-28 07:38:28 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - Pinging the host 127.0.0.1 is successful.
2013-05-28 07:38:28 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 07:38:28 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - The host http://127.0.0.1:8080 is reachable.
2013-05-28 07:38:28 DEBUG - [com.wsc.crawler.init.Initializer]:316 - Frontier Server returned a Frontier of size 9
2013-05-28 07:38:28 INFO  - [com.wsc.Crawler.HeartBeatServer]:256 - Listening on port 8080
2013-05-28 07:38:28 WARN  - [com.wsc.Crawler.HeartBeatServer]:333 - Client closed connection
org.apache.http.ConnectionClosedException: Client closed connection
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:94)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:328)
2013-05-28 07:38:33 INFO  - [com.wsc.crawler.grabber.Grabber]:119 - Number of resolved hosts is: 7
2013-05-28 07:38:33 INFO  - [com.wsc.crawler.grabber.Grabber]:121 - Number of unresolved hosts is: 2
2013-05-28 07:38:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is: [http://www.google.co.in/, http://74.125.236.119/, null]
2013-05-28 07:38:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is: [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 07:38:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is: [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 07:38:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is: [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 07:38:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is: [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 07:38:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is: [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 07:38:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is: [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 07:38:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:165 - Threads length is: 7
2013-05-28 07:38:33 INFO  - [com.wsc.crawler.grabber.Grabber]:327 - About to get something from [http://www.google.co.in/, http://74.125.236.119/, null]
2013-05-28 07:38:33 INFO  - [com.wsc.crawler.grabber.Grabber]:327 - About to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 07:38:33 INFO  - [com.wsc.crawler.grabber.Grabber]:327 - About to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 07:38:33 INFO  - [com.wsc.crawler.grabber.Grabber]:327 - About to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 07:38:33 INFO  - [com.wsc.crawler.grabber.Grabber]:327 - About to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 07:38:33 INFO  - [com.wsc.crawler.grabber.Grabber]:327 - About to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 07:38:33 INFO  - [com.wsc.crawler.grabber.Grabber]:327 - About to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 07:38:33 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: PREF][value: ID=543fdb6526c16546:FF=0:TM=1369741113:LM=1369741113:S=81Taq67T4R1NsP88][domain: .google.co.in][path: /][expiry: Thu May 28 07:38:33 EDT 2015]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.119"
2013-05-28 07:38:33 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: NID][value: 67=FFbGupIE5JhnATZ7QeGFaV3N6qj2ElrxzMe3qTU_BQuNSQ2jOhnZV0U1CsVctDsX_kKEHRjHx91VPhmsky5tzphxi5q0wPG_GXZcSDZMPScyq84AWBXNFRUZPalXdGCR][domain: .google.co.in][path: /][expiry: Wed Nov 27 06:38:33 EST 2013]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.119"
2013-05-28 07:38:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are added to one list from (http://www.google.co.in/).
2013-05-28 07:38:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-28 07:38:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-28 07:38:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-28 07:38:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-28 07:38:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.google.co.in/).
2013-05-28 07:38:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-28 07:38:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.google.co.in/).
2013-05-28 07:38:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-28 07:38:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-28 07:38:33 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-28 07:38:33 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 07:38:33 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 07:38:33 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 07:38:33 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 07:38:33 DEBUG - [com.wsc.Crawler.HeartBeatServer]:169 - Operation value (urlsposted) not yet implemented
2013-05-28 07:38:33 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 07:38:33 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is 
2013-05-28 07:38:33 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-28 07:38:34 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: SESS217349ac89be5468506443781e6d5f6c][value: 2b91e491031aad60897f9a5289d067d9][domain: .palominodb.com][path: /][expiry: Thu Jun 20 11:11:58 EDT 2013]". Illegal domain attribute "palominodb.com". Domain of origin: "173.236.53.234"
2013-05-28 07:38:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://effectivemysql.com/).
2013-05-28 07:38:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-28 07:38:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-28 07:38:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-28 07:38:35 WARN  - [com.wsc.Crawler.HeartBeatServer]:333 - Client closed connection
org.apache.http.ConnectionClosedException: Client closed connection
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:94)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:328)
2013-05-28 07:38:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-28 07:38:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://effectivemysql.com/).
2013-05-28 07:38:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-28 07:38:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://effectivemysql.com/).
2013-05-28 07:38:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-28 07:38:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 102
2013-05-28 07:38:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 87
2013-05-28 07:38:35 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 07:38:35 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 07:38:35 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 07:38:35 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 07:38:35 DEBUG - [com.wsc.Crawler.HeartBeatServer]:169 - Operation value (urlsposted) not yet implemented
2013-05-28 07:38:35 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 07:38:35 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is 
2013-05-28 07:38:35 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-28 07:38:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://www.continuent.com/).
2013-05-28 07:38:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-28 07:38:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-28 07:38:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-28 07:38:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-28 07:38:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.continuent.com/).
2013-05-28 07:38:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-28 07:38:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.continuent.com/).
2013-05-28 07:38:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-28 07:38:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 45
2013-05-28 07:38:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 33
2013-05-28 07:38:35 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 07:38:35 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 07:38:35 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 07:38:35 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 07:38:35 DEBUG - [com.wsc.Crawler.HeartBeatServer]:169 - Operation value (urlsposted) not yet implemented
2013-05-28 07:38:35 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 07:38:35 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is 
2013-05-28 07:38:35 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-28 07:38:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://www.pythian.com/).
2013-05-28 07:38:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-28 07:38:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-28 07:38:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-28 07:38:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-28 07:38:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-28 07:38:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-28 07:38:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-28 07:38:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-28 07:38:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.pythian.com/).
2013-05-28 07:38:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.pythian.com/) is 0
2013-05-28 07:38:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 132
2013-05-28 07:38:35 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 90
2013-05-28 07:38:35 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 07:38:35 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 07:38:35 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 07:38:35 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 07:38:35 DEBUG - [com.wsc.Crawler.HeartBeatServer]:169 - Operation value (urlsposted) not yet implemented
2013-05-28 07:38:35 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 07:38:35 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is 
2013-05-28 07:38:35 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-28 07:38:36 WARN  - [com.wsc.Crawler.HeartBeatServer]:333 - Client closed connection
org.apache.http.ConnectionClosedException: Client closed connection
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:94)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:328)
2013-05-28 07:38:36 WARN  - [com.wsc.Crawler.HeartBeatServer]:333 - Client closed connection
org.apache.http.ConnectionClosedException: Client closed connection
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:94)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:328)
2013-05-28 07:38:36 WARN  - [com.wsc.Crawler.HeartBeatServer]:333 - Client closed connection
org.apache.http.ConnectionClosedException: Client closed connection
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:94)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:328)
2013-05-28 07:38:36 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://palominodb.com/).
2013-05-28 07:38:36 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-28 07:38:36 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 128
2013-05-28 07:38:36 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-28 07:38:36 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 128
2013-05-28 07:38:36 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://palominodb.com/).
2013-05-28 07:38:36 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-28 07:38:36 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://palominodb.com/).
2013-05-28 07:38:36 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-28 07:38:36 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 128
2013-05-28 07:38:36 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 91
2013-05-28 07:38:36 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 07:38:36 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 07:38:36 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 07:38:36 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 07:38:36 DEBUG - [com.wsc.Crawler.HeartBeatServer]:169 - Operation value (urlsposted) not yet implemented
2013-05-28 07:38:36 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 07:38:36 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is 
2013-05-28 07:38:36 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-28 07:38:41 INFO  - [com.wsc.crawler.grabber.Grabber]:237 - Queue is Empty...!
2013-05-28 07:38:41 WARN  - [com.wsc.Crawler.HeartBeatServer]:337 - I/O error 
java.net.SocketTimeoutException: Read timed out
	at java.net.SocketInputStream.socketRead0(Native Method)
	at java.net.SocketInputStream.read(Unknown Source)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.fillBuffer(AbstractSessionInputBuffer.java:166)
	at org.apache.http.impl.io.SocketInputBuffer.fillBuffer(SocketInputBuffer.java:90)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.readLine(AbstractSessionInputBuffer.java:281)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:92)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:328)
2013-05-28 07:38:41 INFO  - [com.wsc.crawler.grabber.Grabber]:238 - Refilling Queue...!
2013-05-28 07:38:41 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 07:38:41 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-28 07:38:41 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 07:38:41 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 07:38:41 INFO  - [com.wsc.crawler.grabber.Grabber]:67 - Queue size in Grabber is 9
2013-05-28 07:38:41 WARN  - [com.wsc.Crawler.HeartBeatServer]:333 - Client closed connection
org.apache.http.ConnectionClosedException: Client closed connection
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:94)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:328)
2013-05-28 07:38:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:189 - Thread Count is less than numthreads In Thread :1
2013-05-28 07:38:51 INFO  - [com.wsc.crawler.grabber.Grabber]:222 - Queue is empty in for loop
2013-05-28 07:38:51 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 07:38:51 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-28 07:38:51 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 07:38:51 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 07:38:51 INFO  - [com.wsc.crawler.grabber.Grabber]:67 - Queue size in Grabber is 9
2013-05-28 07:38:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (0) is12
2013-05-28 07:38:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 0 is dead, restarting it.
2013-05-28 07:38:51 WARN  - [com.wsc.Crawler.HeartBeatServer]:333 - Client closed connection
org.apache.http.ConnectionClosedException: Client closed connection
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:94)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:328)
2013-05-28 07:38:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 0 started.
2013-05-28 07:38:51 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.google.co.in/, http://74.125.236.119/, null]
2013-05-28 07:38:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-28 07:38:51 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: PREF][value: ID=1ce09f2fa7c2449c:FF=0:TM=1369741131:LM=1369741131:S=PxcAEHFathPtJ9dS][domain: .google.co.in][path: /][expiry: Thu May 28 07:38:51 EDT 2015]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.119"
2013-05-28 07:38:51 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: NID][value: 67=PDqSa3064sZ_QiZ3tcOcYBd7oicRL8PrykzBOwe6xtzfwKKfZ5ush1NHofKCDMUUNcg8GYKX-MCjGRhkcfBJjxvoeZsG8K3cx9YTQrGy9Y-OvedBog2-Y57GVb2tVy5l][domain: .google.co.in][path: /][expiry: Wed Nov 27 06:38:51 EST 2013]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.119"
2013-05-28 07:38:51 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://www.google.co.in/).
2013-05-28 07:38:51 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-28 07:38:51 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-28 07:38:51 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-28 07:38:51 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-28 07:38:51 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.google.co.in/).
2013-05-28 07:38:51 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-28 07:38:51 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.google.co.in/).
2013-05-28 07:38:51 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-28 07:38:51 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-28 07:38:51 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-28 07:38:51 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 07:38:51 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 07:38:51 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 07:38:51 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 07:38:51 DEBUG - [com.wsc.Crawler.HeartBeatServer]:169 - Operation value (urlsposted) not yet implemented
2013-05-28 07:38:51 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 07:38:51 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is 
2013-05-28 07:38:51 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-28 07:38:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (1) is13
2013-05-28 07:38:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 1 is dead, restarting it.
2013-05-28 07:38:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 1 started.
2013-05-28 07:38:51 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 07:38:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-28 07:38:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://www.pythian.com/).
2013-05-28 07:38:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-28 07:38:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-28 07:38:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-28 07:38:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-28 07:38:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-28 07:38:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-28 07:38:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-28 07:38:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-28 07:38:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.pythian.com/).
2013-05-28 07:38:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.pythian.com/) is 0
2013-05-28 07:38:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 132
2013-05-28 07:38:55 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 90
2013-05-28 07:38:55 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 07:38:55 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 07:38:55 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 07:38:55 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 07:38:55 DEBUG - [com.wsc.Crawler.HeartBeatServer]:169 - Operation value (urlsposted) not yet implemented
2013-05-28 07:38:55 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 07:38:55 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is 
2013-05-28 07:38:55 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-28 07:38:55 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (2) is14
2013-05-28 07:38:55 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 2 is dead, restarting it.
2013-05-28 07:38:55 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 2 started.
2013-05-28 07:38:55 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 07:38:55 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-28 07:38:55 DEBUG - [com.wsc.crawler.grabber.Grabber]:196 - Checking thread at (3) is15
2013-05-28 07:38:55 DEBUG - [com.wsc.crawler.grabber.Grabber]:199 - Thread 3 is dead, restarting it.
2013-05-28 07:38:55 DEBUG - [com.wsc.crawler.grabber.Grabber]:214 - Thread 3 started.
2013-05-28 07:38:55 DEBUG - [com.wsc.crawler.grabber.Grabber]:215 - Thread list size is :7
2013-05-28 07:38:55 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 07:38:56 WARN  - [com.wsc.Crawler.HeartBeatServer]:337 - I/O error 
java.net.SocketTimeoutException: Read timed out
	at java.net.SocketInputStream.socketRead0(Native Method)
	at java.net.SocketInputStream.read(Unknown Source)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.fillBuffer(AbstractSessionInputBuffer.java:166)
	at org.apache.http.impl.io.SocketInputBuffer.fillBuffer(SocketInputBuffer.java:90)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.readLine(AbstractSessionInputBuffer.java:281)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:92)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:328)
2013-05-28 07:38:56 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: SESS217349ac89be5468506443781e6d5f6c][value: ded89f474fca70bef020fa778be90672][domain: .palominodb.com][path: /][expiry: Thu Jun 20 11:12:20 EDT 2013]". Illegal domain attribute "palominodb.com". Domain of origin: "173.236.53.234"
2013-05-28 07:43:16 DEBUG - [com.wsc.crawler.init.Initializer]:78 - Initializing crawler resources...
2013-05-28 07:43:16 DEBUG - [com.wsc.crawler.init.Initializer]:107 - Another instance of WSCrawler is running.
2013-05-28 07:43:16 WARN  - [com.wsc.crawler.init.Initializer]:108 - Unable to start a new instance of the WSCrawler
2013-05-28 07:43:16 INFO  - [com.wsc.crawler.init.Initializer]:109 - Delete crawler.running file manually.
2013-05-28 07:43:16 DEBUG - [com.wsc.crawler.init.Initializer]:182 - Deleting crawler.running file.
2013-05-28 07:43:16 DEBUG - [com.wsc.crawler.init.Initializer]:184 - crawler.running file is deleted.
2013-05-28 07:43:20 DEBUG - [com.wsc.crawler.init.Initializer]:78 - Initializing crawler resources...
2013-05-28 07:43:20 INFO  - [com.wsc.crawler.init.Initializer]:159 - crawler.running Created Successfully.
2013-05-28 07:43:20 INFO  - [com.wsc.crawler.init.Initializer]:211 - previous instance of crawler was forcibly stopped
2013-05-28 07:43:20 INFO  - [com.wsc.crawler.init.Initializer]:212 - Getting URLS from prev_frontier.xml from temp directory.
2013-05-28 07:43:20 DEBUG - [com.wsc.crawler.init.Initializer]:339 - Checking (prev_frontier.xml) existence in (./temp) directory
2013-05-28 07:43:20 WARN  - [com.wsc.crawler.init.Initializer]:348 - prev_frontier.xml is not found in ./temp
2013-05-28 07:43:20 INFO  - [com.wsc.crawler.init.Initializer]:350 - Trying to get URLs from Default URL Source, Frontier Server
2013-05-28 07:43:20 INFO  - [com.wsc.Crawler.HeartBeatServer]:258 - Listening on port 8080
2013-05-28 07:43:20 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 07:43:20 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-28 07:43:20 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 07:43:20 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 07:43:20 DEBUG - [com.wsc.crawler.init.Initializer]:316 - Frontier Server returned a Frontier of size 9
2013-05-28 07:43:20 WARN  - [com.wsc.Crawler.HeartBeatServer]:335 - Client closed connection
org.apache.http.ConnectionClosedException: Client closed connection
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:94)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:330)
2013-05-28 07:43:21 INFO  - [com.wsc.crawler.grabber.Grabber]:119 - number of resolved hosts are :7
2013-05-28 07:43:21 INFO  - [com.wsc.crawler.grabber.Grabber]:121 - number of Unresolved hosts are :2
2013-05-28 07:43:21 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is ::[http://www.google.co.in/, http://173.194.38.159/, null]
2013-05-28 07:43:21 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is ::[http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 07:43:21 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is ::[http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 07:43:21 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is ::[http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 07:43:21 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is ::[http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 07:43:21 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is ::[http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 07:43:21 DEBUG - [com.wsc.crawler.grabber.Grabber]:156 - Crawling Bean is ::[http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 07:43:21 DEBUG - [com.wsc.crawler.grabber.Grabber]:165 - Threads length  is : 7
2013-05-28 07:43:21 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.google.co.in/, http://173.194.38.159/, null]
2013-05-28 07:43:21 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 07:43:21 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 07:43:21 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 07:43:21 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 07:43:21 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 07:43:21 INFO  - [com.wsc.crawler.grabber.Grabber]:327 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 07:43:21 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: PREF][value: ID=1901e918c085392a:FF=0:TM=1369741401:LM=1369741401:S=rRgVhsaexMqNgwvE][domain: .google.co.in][path: /][expiry: Thu May 28 07:43:21 EDT 2015]". Illegal domain attribute "google.co.in". Domain of origin: "173.194.38.159"
2013-05-28 07:43:21 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: NID][value: 67=jtmMfFVcpUGVZ0mFgEPg3XkBZfzyH_4B4IhpWCALQHk1msaMzlyjCCjwdwRnybOYfJMUn9a-ztq8_Zz8MJuQ_1hj0N4DKtw_pOLaLPLm31_90REq3xWV5HrPHWV5v1ix][domain: .google.co.in][path: /][expiry: Wed Nov 27 06:43:21 EST 2013]". Illegal domain attribute "google.co.in". Domain of origin: "173.194.38.159"
2013-05-28 07:43:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.google.co.in/).
2013-05-28 07:43:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-28 07:43:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-28 07:43:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-28 07:43:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-28 07:43:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.google.co.in/).
2013-05-28 07:43:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-28 07:43:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.google.co.in/).
2013-05-28 07:43:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-28 07:43:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-28 07:43:21 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-28 07:43:21 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 07:43:21 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 07:43:21 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 07:43:21 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 07:43:22 DEBUG - [com.wsc.Crawler.HeartBeatServer]:171 - Opeartion value (urlsposted) not yet Implemented
2013-05-28 07:43:22 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 07:43:22 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is 
2013-05-28 07:43:22 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-28 07:43:22 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: SESS217349ac89be5468506443781e6d5f6c][value: f010fa2af7f62103d673f2340c7788ae][domain: .palominodb.com][path: /][expiry: Thu Jun 20 11:16:46 EDT 2013]". Illegal domain attribute "palominodb.com". Domain of origin: "173.236.53.234"
2013-05-28 07:43:23 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://effectivemysql.com/).
2013-05-28 07:43:23 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-28 07:43:23 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-28 07:43:23 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-28 07:43:23 WARN  - [com.wsc.Crawler.HeartBeatServer]:335 - Client closed connection
org.apache.http.ConnectionClosedException: Client closed connection
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:94)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:330)
2013-05-28 07:43:23 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-28 07:43:23 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://effectivemysql.com/).
2013-05-28 07:43:23 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-28 07:43:23 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://effectivemysql.com/).
2013-05-28 07:43:23 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-28 07:43:23 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 102
2013-05-28 07:43:23 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 87
2013-05-28 07:43:23 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 07:43:23 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 07:43:23 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 07:43:23 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 07:43:23 DEBUG - [com.wsc.Crawler.HeartBeatServer]:171 - Opeartion value (urlsposted) not yet Implemented
2013-05-28 07:43:23 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 07:43:23 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is 
2013-05-28 07:43:23 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-28 07:43:23 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.continuent.com/).
2013-05-28 07:43:23 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-28 07:43:23 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-28 07:43:23 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-28 07:43:23 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-28 07:43:23 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.continuent.com/).
2013-05-28 07:43:23 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-28 07:43:23 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.continuent.com/).
2013-05-28 07:43:23 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-28 07:43:23 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 45
2013-05-28 07:43:23 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 33
2013-05-28 07:43:23 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 07:43:23 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 07:43:23 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 07:43:23 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 07:43:23 DEBUG - [com.wsc.Crawler.HeartBeatServer]:171 - Opeartion value (urlsposted) not yet Implemented
2013-05-28 07:43:23 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 07:43:23 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is 
2013-05-28 07:43:23 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-28 07:43:23 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.pythian.com/).
2013-05-28 07:43:24 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-28 07:43:24 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-28 07:43:24 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-28 07:43:24 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-28 07:43:24 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-28 07:43:24 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-28 07:43:24 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-28 07:43:24 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-28 07:43:24 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.pythian.com/).
2013-05-28 07:43:24 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.pythian.com/) is 0
2013-05-28 07:43:24 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 132
2013-05-28 07:43:24 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 90
2013-05-28 07:43:24 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 07:43:24 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 07:43:24 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 07:43:24 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 07:43:24 DEBUG - [com.wsc.Crawler.HeartBeatServer]:171 - Opeartion value (urlsposted) not yet Implemented
2013-05-28 07:43:24 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 07:43:24 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is 
2013-05-28 07:43:24 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-28 07:43:24 WARN  - [com.wsc.Crawler.HeartBeatServer]:335 - Client closed connection
org.apache.http.ConnectionClosedException: Client closed connection
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:94)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:330)
2013-05-28 07:43:24 WARN  - [com.wsc.Crawler.HeartBeatServer]:335 - Client closed connection
org.apache.http.ConnectionClosedException: Client closed connection
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:94)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:330)
2013-05-28 07:43:24 WARN  - [com.wsc.Crawler.HeartBeatServer]:335 - Client closed connection
org.apache.http.ConnectionClosedException: Client closed connection
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:94)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:330)
2013-05-28 07:43:24 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://palominodb.com/).
2013-05-28 07:43:24 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-28 07:43:24 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 128
2013-05-28 07:43:24 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-28 07:43:24 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 128
2013-05-28 07:43:24 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://palominodb.com/).
2013-05-28 07:43:24 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-28 07:43:24 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://palominodb.com/).
2013-05-28 07:43:24 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-28 07:43:24 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 128
2013-05-28 07:43:24 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 91
2013-05-28 07:43:24 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 07:43:24 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 07:43:24 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 07:43:24 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 07:43:24 DEBUG - [com.wsc.Crawler.HeartBeatServer]:171 - Opeartion value (urlsposted) not yet Implemented
2013-05-28 07:43:24 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 07:43:24 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is 
2013-05-28 07:43:24 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-28 08:06:16 DEBUG - [com.wsc.crawler.init.Initializer]:78 - Intializing crawler resources...
2013-05-28 08:06:16 DEBUG - [com.wsc.crawler.init.Initializer]:107 - An another instance os WSCrawler is running.
2013-05-28 08:06:16 WARN  - [com.wsc.crawler.init.Initializer]:108 - Unable to start new instance of the WSCrawler
2013-05-28 08:06:16 INFO  - [com.wsc.crawler.init.Initializer]:109 - Delete crawler.running file manually.
2013-05-28 08:06:16 DEBUG - [com.wsc.crawler.init.Initializer]:182 - Deleting crawler.running file.
2013-05-28 08:06:16 DEBUG - [com.wsc.crawler.init.Initializer]:184 - crawler.running file is deleted.
2013-05-28 08:06:22 DEBUG - [com.wsc.crawler.init.Initializer]:78 - Intializing crawler resources...
2013-05-28 08:06:22 INFO  - [com.wsc.crawler.init.Initializer]:159 - crawler.running Created Successfully.
2013-05-28 08:06:22 INFO  - [com.wsc.crawler.init.Initializer]:211 - previous instance of crawler is forcebly stopped
2013-05-28 08:06:22 INFO  - [com.wsc.crawler.init.Initializer]:212 - Getting URLS from prev_frontier.xml from temp directory.
2013-05-28 08:06:22 DEBUG - [com.wsc.crawler.init.Initializer]:339 - Checking (prev_frontier.xml) existence in (./temp) directory
2013-05-28 08:06:22 WARN  - [com.wsc.crawler.init.Initializer]:348 - prev_frontier.xml is not found in ./temp
2013-05-28 08:06:22 INFO  - [com.wsc.crawler.init.Initializer]:350 - Trying get URLs from Default URL Source, Frontier Server
2013-05-28 08:06:22 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 08:06:22 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-28 08:06:22 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:9991 is reachable.
2013-05-28 08:06:22 INFO  - [com.wsc.Crawler.HeartBeatServer]:274 - Listening on port 1234
2013-05-28 08:06:22 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - iOException is thrown while connecting (http://127.0.0.1:9991) MESG=Connection refused
2013-05-28 08:06:22 WARN  - [com.wsc.crawler.grabber.LocalClient]:133 - Frontier port is not up :http://127.0.0.1:9991
2013-05-28 08:06:22 WARN  - [com.wsc.crawler.init.Initializer]:298 - Looks like Frontier server is not running
2013-05-28 08:06:22 DEBUG - [com.wsc.crawler.init.Initializer]:300 - Trying to connect Frontier Server...
2013-05-28 08:06:22 INFO  - [com.wsc.crawler.init.Initializer]:384 - Sleeping crawling for 30sec.
2013-05-28 08:06:43 WARN  - [com.wsc.Crawler.HeartBeatServer]:355 - I/O error 
java.net.SocketTimeoutException: Read timed out
	at java.net.SocketInputStream.socketRead0(Native Method)
	at java.net.SocketInputStream.read(Unknown Source)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.fillBuffer(AbstractSessionInputBuffer.java:166)
	at org.apache.http.impl.io.SocketInputBuffer.fillBuffer(SocketInputBuffer.java:90)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.readLine(AbstractSessionInputBuffer.java:281)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:92)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:346)
2013-05-28 08:06:43 WARN  - [com.wsc.Crawler.HeartBeatServer]:355 - I/O error 
java.net.SocketTimeoutException: Read timed out
	at java.net.SocketInputStream.socketRead0(Native Method)
	at java.net.SocketInputStream.read(Unknown Source)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.fillBuffer(AbstractSessionInputBuffer.java:166)
	at org.apache.http.impl.io.SocketInputBuffer.fillBuffer(SocketInputBuffer.java:90)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.readLine(AbstractSessionInputBuffer.java:281)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:92)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:346)
2013-05-28 08:06:52 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - iOException is thrown while connecting (http://127.0.0.1:9991) MESG=Connection refused
2013-05-28 08:06:52 DEBUG - [com.wsc.crawler.init.Initializer]:300 - Trying to connect Frontier Server...
2013-05-28 08:06:52 INFO  - [com.wsc.crawler.init.Initializer]:384 - Sleeping crawling for 30sec.
2013-05-28 08:07:00 WARN  - [com.wsc.Crawler.HeartBeatServer]:355 - I/O error 
java.net.SocketTimeoutException: Read timed out
	at java.net.SocketInputStream.socketRead0(Native Method)
	at java.net.SocketInputStream.read(Unknown Source)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.fillBuffer(AbstractSessionInputBuffer.java:166)
	at org.apache.http.impl.io.SocketInputBuffer.fillBuffer(SocketInputBuffer.java:90)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.readLine(AbstractSessionInputBuffer.java:281)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:92)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:346)
2013-05-28 08:07:02 DEBUG - [com.wsc.Crawler.HeartBeatServer]:187 - Opeartion value (heartbeat4) not yet Implemented
2013-05-28 08:07:10 WARN  - [com.wsc.Crawler.HeartBeatServer]:355 - I/O error 
java.net.SocketTimeoutException: Read timed out
	at java.net.SocketInputStream.socketRead0(Native Method)
	at java.net.SocketInputStream.read(Unknown Source)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.fillBuffer(AbstractSessionInputBuffer.java:166)
	at org.apache.http.impl.io.SocketInputBuffer.fillBuffer(SocketInputBuffer.java:90)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.readLine(AbstractSessionInputBuffer.java:281)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:92)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:346)
2013-05-28 08:07:22 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - iOException is thrown while connecting (http://127.0.0.1:9991) MESG=Connection refused
2013-05-28 08:07:22 DEBUG - [com.wsc.crawler.init.Initializer]:300 - Trying to connect Frontier Server...
2013-05-28 08:07:22 INFO  - [com.wsc.crawler.init.Initializer]:384 - Sleeping crawling for 30sec.
2013-05-28 08:07:52 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - iOException is thrown while connecting (http://127.0.0.1:9991) MESG=Connection refused
2013-05-28 08:07:52 DEBUG - [com.wsc.crawler.init.Initializer]:300 - Trying to connect Frontier Server...
2013-05-28 08:07:52 INFO  - [com.wsc.crawler.init.Initializer]:384 - Sleeping crawling for 30sec.
2013-05-28 08:08:22 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - iOException is thrown while connecting (http://127.0.0.1:9991) MESG=Connection refused
2013-05-28 08:08:22 DEBUG - [com.wsc.crawler.init.Initializer]:300 - Trying to connect Frontier Server...
2013-05-28 08:08:22 INFO  - [com.wsc.crawler.init.Initializer]:384 - Sleeping crawling for 30sec.
2013-05-28 08:08:52 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - iOException is thrown while connecting (http://127.0.0.1:9991) MESG=Connection refused
2013-05-28 08:08:52 ERROR - [com.wsc.crawler.init.Initializer]:310 - Unable to Connect Frontier Server because, whether it is not running or not reachable
2013-05-28 08:08:52 ERROR - [com.wsc.crawler.init.Initializer]:311 - Exiting Crawler
2013-05-28 08:18:32 DEBUG - [com.wsc.crawler.init.Initializer]:78 - Intializing crawler resources...
2013-05-28 08:18:32 DEBUG - [com.wsc.crawler.init.Initializer]:107 - An another instance os WSCrawler is running.
2013-05-28 08:18:32 WARN  - [com.wsc.crawler.init.Initializer]:108 - Unable to start new instance of the WSCrawler
2013-05-28 08:18:32 INFO  - [com.wsc.crawler.init.Initializer]:109 - Delete crawler.running file manually.
2013-05-28 08:18:32 DEBUG - [com.wsc.crawler.init.Initializer]:182 - Deleting crawler.running file.
2013-05-28 08:18:32 DEBUG - [com.wsc.crawler.init.Initializer]:184 - crawler.running file is deleted.
2013-05-28 08:18:38 DEBUG - [com.wsc.crawler.init.Initializer]:78 - Intializing crawler resources...
2013-05-28 08:18:38 INFO  - [com.wsc.crawler.init.Initializer]:159 - crawler.running Created Successfully.
2013-05-28 08:18:38 INFO  - [com.wsc.crawler.init.Initializer]:211 - previous instance of crawler is forcebly stopped
2013-05-28 08:18:38 INFO  - [com.wsc.crawler.init.Initializer]:212 - Getting URLS from prev_frontier.xml from temp directory.
2013-05-28 08:18:38 DEBUG - [com.wsc.crawler.init.Initializer]:339 - Checking (prev_frontier.xml) existence in (./temp) directory
2013-05-28 08:18:38 WARN  - [com.wsc.crawler.init.Initializer]:348 - prev_frontier.xml is not found in ./temp
2013-05-28 08:18:38 INFO  - [com.wsc.crawler.init.Initializer]:350 - Trying get URLs from Default URL Source, Frontier Server
2013-05-28 08:18:38 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 08:18:38 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-28 08:18:38 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:9991 is reachable.
2013-05-28 08:18:38 INFO  - [com.wsc.Crawler.HeartBeatServer]:274 - Listening on port 1234
2013-05-28 08:18:38 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - iOException is thrown while connecting (http://127.0.0.1:9991) MESG=Connection refused
2013-05-28 08:18:38 WARN  - [com.wsc.crawler.grabber.LocalClient]:133 - Frontier port is not up :http://127.0.0.1:9991
2013-05-28 08:18:38 WARN  - [com.wsc.crawler.init.Initializer]:298 - Looks like Frontier server is not running
2013-05-28 08:18:38 DEBUG - [com.wsc.crawler.init.Initializer]:300 - Trying to connect Frontier Server...
2013-05-28 08:18:38 INFO  - [com.wsc.crawler.init.Initializer]:384 - Sleeping crawling for 30sec.
2013-05-28 08:18:56 WARN  - [com.wsc.Crawler.HeartBeatServer]:355 - I/O error 
java.net.SocketTimeoutException: Read timed out
	at java.net.SocketInputStream.socketRead0(Native Method)
	at java.net.SocketInputStream.read(Unknown Source)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.fillBuffer(AbstractSessionInputBuffer.java:166)
	at org.apache.http.impl.io.SocketInputBuffer.fillBuffer(SocketInputBuffer.java:90)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.readLine(AbstractSessionInputBuffer.java:281)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:92)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:346)
2013-05-28 08:19:08 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - iOException is thrown while connecting (http://127.0.0.1:9991) MESG=Connection refused
2013-05-28 08:19:08 DEBUG - [com.wsc.crawler.init.Initializer]:300 - Trying to connect Frontier Server...
2013-05-28 08:19:08 INFO  - [com.wsc.crawler.init.Initializer]:384 - Sleeping crawling for 30sec.
2013-05-28 08:19:22 DEBUG - [com.wsc.Crawler.HeartBeatServer]:187 - Opeartion value (heartbea) not yet Implemented
2013-05-28 08:19:27 WARN  - [com.wsc.Crawler.HeartBeatServer]:355 - I/O error 
java.net.SocketTimeoutException: Read timed out
	at java.net.SocketInputStream.socketRead0(Native Method)
	at java.net.SocketInputStream.read(Unknown Source)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.fillBuffer(AbstractSessionInputBuffer.java:166)
	at org.apache.http.impl.io.SocketInputBuffer.fillBuffer(SocketInputBuffer.java:90)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.readLine(AbstractSessionInputBuffer.java:281)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:92)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:346)
2013-05-28 08:19:31 DEBUG - [com.wsc.Crawler.HeartBeatServer]:187 - Operation value (=heartbeat) not yet implemented
2013-05-28 08:19:38 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - IOException is thrown while connecting (http://127.0.0.1:9991) MESG=Connection refused
2013-05-28 08:19:38 DEBUG - [com.wsc.crawler.init.Initializer]:300 - Trying to connect to Frontier Server...
2013-05-28 08:19:38 INFO  - [com.wsc.crawler.init.Initializer]:384 - Sleeping crawler for 30 sec.
2013-05-28 08:19:47 WARN  - [com.wsc.Crawler.HeartBeatServer]:355 - I/O error 
java.net.SocketTimeoutException: Read timed out
	at java.net.SocketInputStream.socketRead0(Native Method)
	at java.net.SocketInputStream.read(Unknown Source)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.fillBuffer(AbstractSessionInputBuffer.java:166)
	at org.apache.http.impl.io.SocketInputBuffer.fillBuffer(SocketInputBuffer.java:90)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.readLine(AbstractSessionInputBuffer.java:281)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:92)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:346)
2013-05-28 13:13:22 DEBUG - [com.wsc.crawler.init.Initializer]:78 - Initializing crawler resources...
2013-05-28 13:13:22 DEBUG - [com.wsc.crawler.init.Initializer]:107 - Another instance of WSCrawler is running.
2013-05-28 13:13:22 WARN  - [com.wsc.crawler.init.Initializer]:108 - Unable to start new instance of the WSCrawler
2013-05-28 13:13:22 INFO  - [com.wsc.crawler.init.Initializer]:109 - Delete crawler.running file manually.
2013-05-28 13:13:22 DEBUG - [com.wsc.crawler.init.Initializer]:182 - Deleting crawler.running file.
2013-05-28 13:13:22 DEBUG - [com.wsc.crawler.init.Initializer]:184 - crawler.running file is deleted.
2013-05-28 13:13:27 DEBUG - [com.wsc.crawler.init.Initializer]:78 - Initializing crawler resources...
2013-05-28 13:13:27 INFO  - [com.wsc.crawler.init.Initializer]:159 - crawler.running created successfully.
2013-05-28 13:13:27 INFO  - [com.wsc.crawler.init.Initializer]:211 - Previous instance of crawler was forcibly stopped.
2013-05-28 13:13:28 INFO  - [com.wsc.crawler.init.Initializer]:212 - Getting URLs from prev_frontier.xml in temp directory.
2013-05-28 13:13:28 DEBUG - [com.wsc.crawler.init.Initializer]:339 - Checking (prev_frontier.xml) existence in (./temp) directory
2013-05-28 13:13:28 WARN  - [com.wsc.crawler.init.Initializer]:348 - prev_frontier.xml is not found in ./temp
2013-05-28 13:13:28 INFO  - [com.wsc.crawler.init.Initializer]:350 - Trying to get URLs from the default URL source, Frontier Server
2013-05-28 13:13:28 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 13:13:28 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-28 13:13:28 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:9991 is reachable.
2013-05-28 13:13:28 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - IOException is thrown while connecting (http://127.0.0.1:9991) MESG=Connection refused
2013-05-28 13:13:28 WARN  - [com.wsc.crawler.grabber.LocalClient]:133 - Frontier port is not up: http://127.0.0.1:9991
2013-05-28 13:13:28 WARN  - [com.wsc.crawler.init.Initializer]:298 - Looks like Frontier server is not running
2013-05-28 13:13:28 DEBUG - [com.wsc.crawler.init.Initializer]:300 - Trying to connect to Frontier Server...
2013-05-28 13:13:28 INFO  - [com.wsc.crawler.init.Initializer]:384 - Sleeping crawler for 30 sec.
2013-05-28 13:13:28 INFO  - [com.wsc.Crawler.HeartBeatServer]:274 - Listening on port 1234
2013-05-28 13:13:58 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - IOException is thrown while connecting (http://127.0.0.1:9991) MESG=Connection refused
2013-05-28 13:13:58 DEBUG - [com.wsc.crawler.init.Initializer]:300 - Trying to connect to Frontier Server...
2013-05-28 13:13:58 INFO  - [com.wsc.crawler.init.Initializer]:384 - Sleeping crawler for 30 sec.
2013-05-28 13:14:18 WARN  - [com.wsc.Crawler.HeartBeatServer]:355 - I/O error 
java.net.SocketTimeoutException: Read timed out
	at java.net.SocketInputStream.socketRead0(Native Method)
	at java.net.SocketInputStream.read(Unknown Source)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.fillBuffer(AbstractSessionInputBuffer.java:166)
	at org.apache.http.impl.io.SocketInputBuffer.fillBuffer(SocketInputBuffer.java:90)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.readLine(AbstractSessionInputBuffer.java:281)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:92)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:346)
2013-05-28 13:14:24 DEBUG - [com.wsc.Crawler.HeartBeatServer]:187 - Operation value (=heartbeat) not yet implemented
2013-05-28 13:14:28 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - IOException is thrown while connecting (http://127.0.0.1:9991) MESG=Connection refused
2013-05-28 13:14:28 DEBUG - [com.wsc.crawler.init.Initializer]:300 - Trying to connect to Frontier Server...
2013-05-28 13:14:28 INFO  - [com.wsc.crawler.init.Initializer]:384 - Sleeping crawler for 30 sec.
2013-05-28 13:14:30 WARN  - [com.wsc.Crawler.HeartBeatServer]:355 - I/O error 
java.net.SocketTimeoutException: Read timed out
	at java.net.SocketInputStream.socketRead0(Native Method)
	at java.net.SocketInputStream.read(Unknown Source)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.fillBuffer(AbstractSessionInputBuffer.java:166)
	at org.apache.http.impl.io.SocketInputBuffer.fillBuffer(SocketInputBuffer.java:90)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.readLine(AbstractSessionInputBuffer.java:281)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:92)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:346)
2013-05-28 13:14:37 WARN  - [com.wsc.Crawler.HeartBeatServer]:355 - I/O error 
java.net.SocketTimeoutException: Read timed out
	at java.net.SocketInputStream.socketRead0(Native Method)
	at java.net.SocketInputStream.read(Unknown Source)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.fillBuffer(AbstractSessionInputBuffer.java:166)
	at org.apache.http.impl.io.SocketInputBuffer.fillBuffer(SocketInputBuffer.java:90)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.readLine(AbstractSessionInputBuffer.java:281)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:92)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:346)
2013-05-28 13:14:53 WARN  - [com.wsc.Crawler.HeartBeatServer]:355 - I/O error 
java.net.SocketTimeoutException: Read timed out
	at java.net.SocketInputStream.socketRead0(Native Method)
	at java.net.SocketInputStream.read(Unknown Source)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.fillBuffer(AbstractSessionInputBuffer.java:166)
	at org.apache.http.impl.io.SocketInputBuffer.fillBuffer(SocketInputBuffer.java:90)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.readLine(AbstractSessionInputBuffer.java:281)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:92)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:346)
2013-05-28 13:14:58 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - IOException is thrown while connecting (http://127.0.0.1:9991) MESG=Connection refused
2013-05-28 13:14:58 DEBUG - [com.wsc.crawler.init.Initializer]:300 - Trying to connect to Frontier Server...
2013-05-28 13:14:58 INFO  - [com.wsc.crawler.init.Initializer]:384 - Sleeping crawler for 30 sec.
2013-05-28 13:15:05 WARN  - [com.wsc.Crawler.HeartBeatServer]:355 - I/O error 
java.net.SocketTimeoutException: Read timed out
	at java.net.SocketInputStream.socketRead0(Native Method)
	at java.net.SocketInputStream.read(Unknown Source)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.fillBuffer(AbstractSessionInputBuffer.java:166)
	at org.apache.http.impl.io.SocketInputBuffer.fillBuffer(SocketInputBuffer.java:90)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.readLine(AbstractSessionInputBuffer.java:281)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:92)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:346)
2013-05-28 13:15:19 WARN  - [com.wsc.Crawler.HeartBeatServer]:355 - I/O error 
java.net.SocketTimeoutException: Read timed out
	at java.net.SocketInputStream.socketRead0(Native Method)
	at java.net.SocketInputStream.read(Unknown Source)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.fillBuffer(AbstractSessionInputBuffer.java:166)
	at org.apache.http.impl.io.SocketInputBuffer.fillBuffer(SocketInputBuffer.java:90)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.readLine(AbstractSessionInputBuffer.java:281)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:92)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:346)
2013-05-28 13:15:28 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - IOException is thrown while connecting (http://127.0.0.1:9991) MESG=Connection refused
2013-05-28 13:15:28 DEBUG - [com.wsc.crawler.init.Initializer]:300 - Trying to connect to Frontier Server...
2013-05-28 13:15:28 INFO  - [com.wsc.crawler.init.Initializer]:384 - Sleeping crawler for 30 sec.
2013-05-28 13:15:32 WARN  - [com.wsc.Crawler.HeartBeatServer]:355 - I/O error 
java.net.SocketTimeoutException: Read timed out
	at java.net.SocketInputStream.socketRead0(Native Method)
	at java.net.SocketInputStream.read(Unknown Source)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.fillBuffer(AbstractSessionInputBuffer.java:166)
	at org.apache.http.impl.io.SocketInputBuffer.fillBuffer(SocketInputBuffer.java:90)
	at org.apache.http.impl.io.AbstractSessionInputBuffer.readLine(AbstractSessionInputBuffer.java:281)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:92)
	at org.apache.http.impl.io.DefaultHttpRequestParser.parseHead(DefaultHttpRequestParser.java:59)
	at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:254)
	at org.apache.http.impl.AbstractHttpServerConnection.receiveRequestHeader(AbstractHttpServerConnection.java:247)
	at org.apache.http.protocol.HttpService.handleRequest(HttpService.java:246)
	at com.wsc.Crawler.HeartBeatServer$WorkerThread.run(HeartBeatServer.java:346)
2013-05-28 13:15:58 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - IOException is thrown while connecting (http://127.0.0.1:9991) MESG=Connection refused
2013-05-28 13:15:58 ERROR - [com.wsc.crawler.init.Initializer]:310 - Unable to connect to Frontier Server: it is either not running or not reachable
2013-05-28 13:15:58 ERROR - [com.wsc.crawler.init.Initializer]:311 - Exiting Crawler
2013-05-28 13:49:34 DEBUG - [clientserver.AsyncHTTPServer]:290 - Request received for operation heartbeat
2013-05-28 13:49:34 DEBUG - [clientserver.AsyncHTTPServer]:301 - Request Served Successfully
2013-05-28 13:49:40 DEBUG - [clientserver.AsyncHTTPServer]:290 - Request received for operation heartbeat
2013-05-28 13:49:40 DEBUG - [clientserver.AsyncHTTPServer]:301 - Request Served Successfully
2013-05-28 13:49:41 DEBUG - [clientserver.AsyncHTTPServer]:290 - Request received for operation heartbeat
2013-05-28 13:49:41 DEBUG - [clientserver.AsyncHTTPServer]:301 - Request Served Successfully
2013-05-28 13:49:42 DEBUG - [clientserver.AsyncHTTPServer]:290 - Request received for operation heartbeat
2013-05-28 13:49:42 DEBUG - [clientserver.AsyncHTTPServer]:301 - Request Served Successfully
2013-05-28 13:49:45 DEBUG - [clientserver.AsyncHTTPServer]:309 - Operation value (hartbeat) not yet implemented
2013-05-28 13:49:50 DEBUG - [clientserver.AsyncHTTPServer]:309 - Operation value (hartbeat) not yet implemented
2013-05-28 13:49:55 WARN  - [clientserver.AsyncHTTPServer]:316 - Unsupported operation (operatiojn) found in request.
2013-05-28 13:50:00 WARN  - [clientserver.AsyncHTTPServer]:316 - Unsupported operation (operatiojn) found in request.
2013-05-28 13:50:02 WARN  - [clientserver.AsyncHTTPServer]:316 - Unsupported operation (operatiojn) found in request.
2013-05-28 13:52:01 WARN  - [clientserver.AsyncHTTPServer]:316 - Unsupported operation (operatiojn) found in request.
2013-05-28 13:52:08 DEBUG - [clientserver.AsyncHTTPServer]:309 - Operation value (hartbeat) not yet implemented
2013-05-28 13:52:10 DEBUG - [clientserver.AsyncHTTPServer]:309 - Operation value (hartbeat) not yet implemented
2013-05-28 13:53:23 DEBUG - [clientserver.AsyncHTTPServer]:309 - Operation value (hartbeat) not yet implemented
2013-05-28 13:53:29 DEBUG - [clientserver.AsyncHTTPServer]:290 - Request received for operation heartbeat
2013-05-28 13:53:29 DEBUG - [clientserver.AsyncHTTPServer]:301 - Request Served Successfully
2013-05-28 13:53:32 DEBUG - [clientserver.AsyncHTTPServer]:290 - Request received for operation heartbeat
2013-05-28 13:53:32 DEBUG - [clientserver.AsyncHTTPServer]:301 - Request Served Successfully
2013-05-28 13:53:35 DEBUG - [clientserver.AsyncHTTPServer]:309 - Operation value (heartbleat) not yet implemented
2013-05-28 13:53:39 WARN  - [clientserver.AsyncHTTPServer]:326 - Unsupported operation (operatiokln) found in request.
2013-05-28 13:53:42 DEBUG - [clientserver.AsyncHTTPServer]:309 - Operation value (heartbleat) not yet implemented
2013-05-28 13:53:45 DEBUG - [clientserver.AsyncHTTPServer]:290 - Request received for operation heartbeat
2013-05-28 13:53:45 DEBUG - [clientserver.AsyncHTTPServer]:301 - Request Served Successfully
2013-05-28 13:53:51 DEBUG - [clientserver.AsyncHTTPServer]:290 - Request received for operation heartbeat
2013-05-28 13:53:51 DEBUG - [clientserver.AsyncHTTPServer]:301 - Request Served Successfully
2013-05-28 13:53:52 DEBUG - [clientserver.AsyncHTTPServer]:290 - Request received for operation heartbeat
2013-05-28 13:53:52 DEBUG - [clientserver.AsyncHTTPServer]:301 - Request Served Successfully
2013-05-28 13:53:53 DEBUG - [clientserver.AsyncHTTPServer]:290 - Request received for operation heartbeat
2013-05-28 13:53:53 DEBUG - [clientserver.AsyncHTTPServer]:301 - Request Served Successfully
2013-05-28 14:01:32 DEBUG - [com.wsc.crawler.init.Initializer]:80 - Initializing crawler resources...
2013-05-28 14:01:32 DEBUG - [com.wsc.crawler.init.Initializer]:109 - Another instance of WSCrawler is running.
2013-05-28 14:01:32 WARN  - [com.wsc.crawler.init.Initializer]:110 - Unable to start new instance of the WSCrawler
2013-05-28 14:01:32 INFO  - [com.wsc.crawler.init.Initializer]:111 - Delete crawler.running file manually.
2013-05-28 14:01:32 DEBUG - [com.wsc.crawler.init.Initializer]:184 - Deleting crawler.running file.
2013-05-28 14:01:32 DEBUG - [com.wsc.crawler.init.Initializer]:186 - crawler.running file is deleted.
2013-05-28 14:01:51 DEBUG - [com.wsc.crawler.init.Initializer]:80 - Initializing crawler resources...
2013-05-28 14:01:51 INFO  - [com.wsc.crawler.init.Initializer]:161 - crawler.running created successfully.
2013-05-28 14:01:51 INFO  - [com.wsc.crawler.init.Initializer]:213 - Previous instance of crawler was forcibly stopped.
2013-05-28 14:01:52 INFO  - [com.wsc.crawler.init.Initializer]:214 - Getting URLs from prev_frontier.xml in temp directory.
2013-05-28 14:01:52 DEBUG - [com.wsc.crawler.init.Initializer]:341 - Checking (prev_frontier.xml) existence in (./temp) directory
2013-05-28 14:01:52 WARN  - [com.wsc.crawler.init.Initializer]:350 - prev_frontier.xml is not found in ./temp
2013-05-28 14:01:52 INFO  - [com.wsc.crawler.init.Initializer]:352 - Trying to get URLs from the default URL source, Frontier Server
2013-05-28 14:01:52 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:01:52 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-28 14:01:52 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:9991 is reachable.
2013-05-28 14:01:52 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - IOException is thrown while connecting (http://127.0.0.1:9991) MESG=Connection refused
2013-05-28 14:01:52 WARN  - [com.wsc.crawler.grabber.LocalClient]:133 - Frontier port is not up: http://127.0.0.1:9991
2013-05-28 14:01:52 WARN  - [com.wsc.crawler.init.Initializer]:300 - Looks like Frontier server is not running
2013-05-28 14:01:52 DEBUG - [com.wsc.crawler.init.Initializer]:302 - Trying to connect to Frontier Server...
2013-05-28 14:01:52 INFO  - [com.wsc.crawler.init.Initializer]:386 - Sleeping crawler for 30 sec.
2013-05-28 14:01:55 DEBUG - [clientserver.AsyncHTTPServer]:290 - Request received for operation heartbeat
2013-05-28 14:01:55 DEBUG - [clientserver.AsyncHTTPServer]:301 - Request Served Successfully
2013-05-28 14:01:57 DEBUG - [clientserver.AsyncHTTPServer]:290 - Request received for operation heartbeat
2013-05-28 14:01:57 DEBUG - [clientserver.AsyncHTTPServer]:301 - Request Served Successfully
2013-05-28 14:01:58 DEBUG - [clientserver.AsyncHTTPServer]:290 - Request received for operation heartbeat
2013-05-28 14:01:58 DEBUG - [clientserver.AsyncHTTPServer]:301 - Request Served Successfully
2013-05-28 14:01:59 DEBUG - [clientserver.AsyncHTTPServer]:290 - Request received for operation heartbeat
2013-05-28 14:01:59 DEBUG - [clientserver.AsyncHTTPServer]:301 - Request Served Successfully
2013-05-28 14:02:00 DEBUG - [clientserver.AsyncHTTPServer]:290 - Request received for operation heartbeat
2013-05-28 14:02:00 DEBUG - [clientserver.AsyncHTTPServer]:301 - Request Served Successfully
2013-05-28 14:02:01 DEBUG - [clientserver.AsyncHTTPServer]:290 - Request received for operation heartbeat
2013-05-28 14:02:01 DEBUG - [clientserver.AsyncHTTPServer]:301 - Request Served Successfully
2013-05-28 14:02:02 DEBUG - [clientserver.AsyncHTTPServer]:290 - Request received for operation heartbeat
2013-05-28 14:02:02 DEBUG - [clientserver.AsyncHTTPServer]:301 - Request Served Successfully
2013-05-28 14:02:02 DEBUG - [clientserver.AsyncHTTPServer]:301 - Request Served Successfully
2013-05-28 14:02:02 DEBUG - [clientserver.AsyncHTTPServer]:290 - Request recieved for operation heartbeat
2013-05-28 14:02:02 DEBUG - [clientserver.AsyncHTTPServer]:301 - Request Served Successfully
2013-05-28 14:02:02 DEBUG - [clientserver.AsyncHTTPServer]:290 - Request recieved for operation heartbeat
2013-05-28 14:02:02 DEBUG - [clientserver.AsyncHTTPServer]:301 - Request Served Successfully
2013-05-28 14:02:03 DEBUG - [clientserver.AsyncHTTPServer]:290 - Request recieved for operation heartbeat
2013-05-28 14:02:03 DEBUG - [clientserver.AsyncHTTPServer]:301 - Request Served Successfully
2013-05-28 14:02:03 DEBUG - [clientserver.AsyncHTTPServer]:290 - Request recieved for operation heartbeat
2013-05-28 14:02:03 DEBUG - [clientserver.AsyncHTTPServer]:301 - Request Served Successfully
2013-05-28 14:02:05 DEBUG - [clientserver.AsyncHTTPServer]:309 - Operation value (heartbeatm) not yet implemented
2013-05-28 14:02:09 DEBUG - [clientserver.AsyncHTTPServer]:290 - Request recieved for operation heartbeat
2013-05-28 14:02:09 DEBUG - [clientserver.AsyncHTTPServer]:301 - Request Served Successfully
2013-05-28 14:02:13 WARN  - [clientserver.AsyncHTTPServer]:326 - Unsupported operation (operationj) found in request.
2013-05-28 14:02:22 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - IOException thrown while connecting to (http://127.0.0.1:9991) MESG=Connection refused
2013-05-28 14:02:22 DEBUG - [com.wsc.crawler.init.Initializer]:302 - Trying to connect to Frontier Server...
2013-05-28 14:02:22 INFO  - [com.wsc.crawler.init.Initializer]:386 - Sleeping crawler for 30 sec.
2013-05-28 14:02:32 WARN  - [clientserver.AsyncHTTPServer]:326 - Unsupported operation (operationj) found in request.
2013-05-28 14:02:33 WARN  - [clientserver.AsyncHTTPServer]:326 - Unsupported operation (operationj) found in request.
2013-05-28 14:02:33 WARN  - [clientserver.AsyncHTTPServer]:326 - Unsupported operation (operationj) found in request.
2013-05-28 14:02:33 WARN  - [clientserver.AsyncHTTPServer]:326 - Unsupported operation (operationj) found in request.
2013-05-28 14:02:33 WARN  - [clientserver.AsyncHTTPServer]:326 - Unsupported operation (operationj) found in request.
2013-05-28 14:02:52 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - IOException thrown while connecting to (http://127.0.0.1:9991) MESG=Connection refused
2013-05-28 14:02:52 DEBUG - [com.wsc.crawler.init.Initializer]:302 - Trying to connect to Frontier Server...
2013-05-28 14:02:52 INFO  - [com.wsc.crawler.init.Initializer]:386 - Sleeping crawler for 30 sec.
2013-05-28 14:03:22 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - IOException thrown while connecting to (http://127.0.0.1:9991) MESG=Connection refused
2013-05-28 14:03:22 DEBUG - [com.wsc.crawler.init.Initializer]:302 - Trying to connect to Frontier Server...
2013-05-28 14:03:22 INFO  - [com.wsc.crawler.init.Initializer]:386 - Sleeping crawler for 30 sec.
2013-05-28 14:03:52 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - IOException thrown while connecting to (http://127.0.0.1:9991) MESG=Connection refused
2013-05-28 14:03:52 DEBUG - [com.wsc.crawler.init.Initializer]:302 - Trying to connect to Frontier Server...
2013-05-28 14:03:52 INFO  - [com.wsc.crawler.init.Initializer]:386 - Sleeping crawler for 30 sec.
2013-05-28 14:04:22 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - IOException thrown while connecting to (http://127.0.0.1:9991) MESG=Connection refused
2013-05-28 14:04:22 ERROR - [com.wsc.crawler.init.Initializer]:312 - Unable to connect to Frontier Server: it is either not running or not reachable
2013-05-28 14:04:22 ERROR - [com.wsc.crawler.init.Initializer]:313 - Exiting Crawler
2013-05-28 14:20:15 DEBUG - [com.wsc.crawler.init.Initializer]:80 - Initializing crawler resources...
2013-05-28 14:20:15 DEBUG - [com.wsc.crawler.init.Initializer]:109 - Another instance of WSCrawler is running.
2013-05-28 14:20:15 WARN  - [com.wsc.crawler.init.Initializer]:110 - Unable to start a new instance of WSCrawler
2013-05-28 14:20:15 INFO  - [com.wsc.crawler.init.Initializer]:111 - Delete the crawler.running file manually.
2013-05-28 14:20:15 DEBUG - [com.wsc.crawler.init.Initializer]:184 - Deleting crawler.running file.
2013-05-28 14:20:15 DEBUG - [com.wsc.crawler.init.Initializer]:186 - crawler.running file is deleted.
2013-05-28 14:20:19 DEBUG - [com.wsc.crawler.init.Initializer]:80 - Initializing crawler resources...
2013-05-28 14:20:19 INFO  - [com.wsc.crawler.init.Initializer]:161 - crawler.running created successfully.
2013-05-28 14:20:19 INFO  - [com.wsc.crawler.init.Initializer]:213 - Previous instance of crawler was forcibly stopped
2013-05-28 14:20:19 INFO  - [com.wsc.crawler.init.Initializer]:214 - Getting URLs from prev_frontier.xml in temp directory.
2013-05-28 14:20:19 DEBUG - [com.wsc.crawler.init.Initializer]:341 - Checking (prev_frontier.xml) existence in (./temp) directory
2013-05-28 14:20:19 WARN  - [com.wsc.crawler.init.Initializer]:350 - prev_frontier.xml is not found in ./temp
2013-05-28 14:20:19 INFO  - [com.wsc.crawler.init.Initializer]:352 - Trying to get URLs from default URL source, Frontier Server
2013-05-28 14:20:19 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - Pinging the host 127.0.0.1.
2013-05-28 14:20:19 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - Pinging the host 127.0.0.1 was successful.
2013-05-28 14:20:19 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:9991 is reachable.
2013-05-28 14:20:19 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - IOException thrown while connecting to (http://127.0.0.1:9991) MESG=Connection refused
2013-05-28 14:20:19 WARN  - [com.wsc.crawler.grabber.LocalClient]:133 - Frontier port is not up: http://127.0.0.1:9991
2013-05-28 14:20:19 WARN  - [com.wsc.crawler.init.Initializer]:300 - Looks like Frontier Server is not running
2013-05-28 14:20:19 DEBUG - [com.wsc.crawler.init.Initializer]:302 - Trying to connect to Frontier Server...
2013-05-28 14:20:19 INFO  - [com.wsc.crawler.init.Initializer]:386 - Sleeping crawler for 30 sec.
2013-05-28 14:20:49 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - IOException thrown while connecting to (http://127.0.0.1:9991) MESG=Connection refused
2013-05-28 14:20:49 DEBUG - [com.wsc.crawler.init.Initializer]:302 - Trying to connect to Frontier Server...
2013-05-28 14:20:49 INFO  - [com.wsc.crawler.init.Initializer]:386 - Sleeping crawler for 30 sec.
2013-05-28 14:21:19 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - IOException thrown while connecting to (http://127.0.0.1:9991) MESG=Connection refused
2013-05-28 14:21:19 DEBUG - [com.wsc.crawler.init.Initializer]:302 - Trying to connect to Frontier Server...
2013-05-28 14:21:19 INFO  - [com.wsc.crawler.init.Initializer]:386 - Sleeping crawler for 30 sec.
2013-05-28 14:21:49 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - IOException thrown while connecting to (http://127.0.0.1:9991) MESG=Connection refused
2013-05-28 14:21:49 DEBUG - [com.wsc.crawler.init.Initializer]:302 - Trying to connect to Frontier Server...
2013-05-28 14:21:49 INFO  - [com.wsc.crawler.init.Initializer]:386 - Sleeping crawler for 30 sec.
2013-05-28 14:22:08 DEBUG - [com.wsc.crawler.init.Initializer]:80 - Initializing crawler resources...
2013-05-28 14:22:08 DEBUG - [com.wsc.crawler.init.Initializer]:109 - Another instance of WSCrawler is running.
2013-05-28 14:22:08 WARN  - [com.wsc.crawler.init.Initializer]:110 - Unable to start a new instance of WSCrawler
2013-05-28 14:22:08 INFO  - [com.wsc.crawler.init.Initializer]:111 - Delete the crawler.running file manually.
2013-05-28 14:22:08 DEBUG - [com.wsc.crawler.init.Initializer]:184 - Deleting crawler.running file.
2013-05-28 14:22:08 DEBUG - [com.wsc.crawler.init.Initializer]:186 - crawler.running file is deleted.
2013-05-28 14:22:19 INFO  - [com.wsc.crawler.utils.CheckHost]:87 - IOException thrown while connecting to (http://127.0.0.1:9991) MESG=Connection refused
2013-05-28 14:22:19 DEBUG - [com.wsc.crawler.init.Initializer]:302 - Trying to connect to Frontier Server...
2013-05-28 14:22:19 INFO  - [com.wsc.crawler.init.Initializer]:386 - Sleeping crawler for 30 sec.
2013-05-28 14:23:01 DEBUG - [com.wsc.crawler.init.Initializer]:80 - Initializing crawler resources...
2013-05-28 14:23:01 INFO  - [com.wsc.crawler.init.Initializer]:161 - crawler.running created successfully.
2013-05-28 14:23:01 INFO  - [com.wsc.crawler.init.Initializer]:213 - Previous instance of crawler was forcibly stopped
2013-05-28 14:23:01 INFO  - [com.wsc.crawler.init.Initializer]:214 - Getting URLs from prev_frontier.xml in temp directory.
2013-05-28 14:23:01 DEBUG - [com.wsc.crawler.init.Initializer]:341 - Checking (prev_frontier.xml) existence in (./temp) directory
2013-05-28 14:23:01 WARN  - [com.wsc.crawler.init.Initializer]:350 - prev_frontier.xml is not found in ./temp
2013-05-28 14:23:01 INFO  - [com.wsc.crawler.init.Initializer]:352 - Trying to get URLs from default URL source, Frontier Server
2013-05-28 14:23:01 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - Pinging the host 127.0.0.1.
2013-05-28 14:23:01 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - Pinging the host 127.0.0.1 was successful.
2013-05-28 14:23:01 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:23:01 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:23:01 DEBUG - [com.wsc.crawler.init.Initializer]:318 - Frontier Server returned a Frontier of size 9
2013-05-28 14:23:10 INFO  - [com.wsc.crawler.grabber.Grabber]:195 - Number of resolved hosts is: 7
2013-05-28 14:23:10 INFO  - [com.wsc.crawler.grabber.Grabber]:197 - Number of unresolved hosts is: 2
2013-05-28 14:23:10 DEBUG - [com.wsc.crawler.grabber.Grabber]:231 - Crawling Bean  ::[http://www.google.co.in/, http://74.125.236.216/, null]
2013-05-28 14:23:10 DEBUG - [com.wsc.crawler.grabber.Grabber]:231 - Crawling Bean  ::[http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:23:10 DEBUG - [com.wsc.crawler.grabber.Grabber]:231 - Crawling Bean  ::[http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:23:10 DEBUG - [com.wsc.crawler.grabber.Grabber]:231 - Crawling Bean  ::[http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:23:10 DEBUG - [com.wsc.crawler.grabber.Grabber]:231 - Crawling Bean  ::[http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:23:10 DEBUG - [com.wsc.crawler.grabber.Grabber]:231 - Crawling Bean  ::[http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:23:10 DEBUG - [com.wsc.crawler.grabber.Grabber]:231 - Crawling Bean  ::[http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:23:10 DEBUG - [com.wsc.crawler.grabber.Grabber]:249 - Number of Threads is: 7
2013-05-28 14:23:10 INFO  - [com.wsc.crawler.grabber.Grabber]:473 -  - about to get something from [http://www.google.co.in/, http://74.125.236.216/, null]
2013-05-28 14:23:10 INFO  - [com.wsc.crawler.grabber.Grabber]:473 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:23:10 INFO  - [com.wsc.crawler.grabber.Grabber]:473 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:23:10 INFO  - [com.wsc.crawler.grabber.Grabber]:473 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:23:10 INFO  - [com.wsc.crawler.grabber.Grabber]:473 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:23:10 INFO  - [com.wsc.crawler.grabber.Grabber]:473 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:23:10 INFO  - [com.wsc.crawler.grabber.Grabber]:473 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:23:10 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: PREF][value: ID=8b6bc5c018092080:FF=0:TM=1369765390:LM=1369765390:S=miBzLmfBKufxg88G][domain: .google.co.in][path: /][expiry: Thu May 28 14:23:10 EDT 2015]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.216"
2013-05-28 14:23:10 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: NID][value: 67=hIv6mDRMT1XMMI1omN-FdNnj7syiHC0ES9LebmNsfcH8rEKeM-4qVmTJkYVkc0hpwo1_RbjGOlH0hPiFEDwamQRC7U3kdRqqy4f-r_fPA8IkKwXw7mkGSei1itFA6E5r][domain: .google.co.in][path: /][expiry: Wed Nov 27 13:23:10 EST 2013]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.216"
2013-05-28 14:23:10 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks from (http://www.google.co.in/) are being added to one list.
2013-05-28 14:23:10 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-28 14:23:10 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-28 14:23:10 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-28 14:23:10 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-28 14:23:10 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.google.co.in/).
2013-05-28 14:23:10 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-28 14:23:10 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.google.co.in/).
2013-05-28 14:23:10 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed frame Source Links size in (http://www.google.co.in/) is 0
2013-05-28 14:23:10 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Size of hyperLinks found in (http://www.google.co.in/) before duplicate elimination is 28
2013-05-28 14:23:10 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - Size of hyperLinks found in (http://www.google.co.in/) after duplicate elimination is 28
2013-05-28 14:23:10 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing XML file to post to frontier
2013-05-28 14:23:10 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 14:23:10 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 14:23:10 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from HTML to Frontier Server is: http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 14:23:11 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 14:23:11 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <!--
  ~ JBoss, Home of Professional Open Source.
  ~ Copyright (c) 2011, Red Hat, Inc., and individual contributors
  ~ as indicated by the @author tags. See the copyright.txt file in the
  ~ distribution for a full listing of individual contributors.
  ~
  ~ This is free software; you can redistribute it and/or modify it
  ~ under the terms of the GNU Lesser General Public License as
  ~ published by the Free Software Foundation; either version 2.1 of
  ~ the License, or (at your option) any later version.
  ~
  ~ This software is distributed in the hope that it will be useful,
  ~ but WITHOUT ANY WARRANTY; without even the implied warranty of
  ~ MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
  ~ Lesser General Public License for more details.
  ~
  ~ You should have received a copy of the GNU Lesser General Public
  ~ License along with this software; if not, write to the Free
  ~ Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
  ~ 02110-1301 USA, or see the FSF site: http://www.fsf.org.
  -->
<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">

<html>
<head>
  <title>Welcome to JBoss Application Server 7</title>
  <link rel="shortcut icon" href="favicon.ico" type="image/x-icon">
  <link rel="StyleSheet" href="as7_style.css" type="text/css">
</head>

<body>
  <div class="wrapper">
    <div class="as7">
      <img src="as7_logo.png" alt="JBoss Application Server 7... it's here." width="195" height="228" border="0">
    </div>

    <div class="content">
      <h1>Welcome to AS 7</h1>

      <h3>Your JBoss Application Server 7 is running.</h3>

      <p><a href="documentation.html">Documentation</a> | <a href="https://docs.jboss.org/author/display/AS71/Quickstarts">Quickstarts</a> | <a href="/console">Administration Console</a> <br/>

      <a href="http://www.jboss.org/jbossas"><br>
      JBoss AS Project</a> | <a href="http://community.jboss.org/en/jbossas/as7_users?view=all">User
      Forum</a> | <a href=
      "https://issues.jboss.org/browse/AS7">Report an issue</a></p>

      <p class="logos"><a href="http://jboss.org"><img src="jboss_community.png" alt="JBoss and JBoss Community" width=
      "254" height="31" border="0"></a></p>

      <p class="note">To replace this page set "enable-welcome-root" to false in your server configuration and deploy
      your own war with / as its context path.</p>
    </div>
  </div>
</body>
</html>

2013-05-28 14:23:11 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing POST connection after posting URLs
2013-05-28 14:23:11 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: SESS217349ac89be5468506443781e6d5f6c][value: b30abfa835d03966e2b6be5d973ca1a1][domain: .palominodb.com][path: /][expiry: Thu Jun 20 17:56:35 EDT 2013]". Illegal domain attribute "palominodb.com". Domain of origin: "173.236.53.234"
2013-05-28 14:23:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks from (http://effectivemysql.com/) are being added to one list.
2013-05-28 14:23:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-28 14:23:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-28 14:23:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-28 14:23:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-28 14:23:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://effectivemysql.com/).
2013-05-28 14:23:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-28 14:23:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://effectivemysql.com/).
2013-05-28 14:23:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed frame Source Links size in (http://effectivemysql.com/) is 0
2013-05-28 14:23:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Size of hyperLinks found in (http://effectivemysql.com/) before duplicate elimination is 102
2013-05-28 14:23:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - Size of hyperLinks found in (http://effectivemysql.com/) after duplicate elimination is 87
2013-05-28 14:23:12 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing XML file to post to frontier
2013-05-28 14:23:12 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 14:23:12 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 14:23:12 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from HTML to Frontier Server is: http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 14:23:12 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 14:23:12 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is the JBoss AS 7 welcome page (identical to the response body logged at 14:23:11 above)
2013-05-28 14:23:12 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing POST connection after posting URLs
2013-05-28 14:23:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks from (http://www.pythian.com/) are being added to one list.
2013-05-28 14:23:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-28 14:23:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-28 14:23:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-28 14:23:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-28 14:23:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-28 14:23:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-28 14:23:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-28 14:23:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-28 14:23:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.pythian.com/).
2013-05-28 14:23:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed frame Source Links size in (http://www.pythian.com/) is 0
2013-05-28 14:23:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Size of hyperLinks found in (http://www.pythian.com/) before duplicate elimination is 132
2013-05-28 14:23:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - Size of hyperLinks found in (http://www.pythian.com/) after duplicate elimination is 90
2013-05-28 14:23:12 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing XML file to post to frontier
2013-05-28 14:23:12 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 14:23:12 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 14:23:12 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from HTML to Frontier Server is: http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 14:23:12 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 14:23:12 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is the JBoss AS 7 welcome page (identical to the response body logged at 14:23:11 above)
2013-05-28 14:23:12 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing POST connection after posting URLs
2013-05-28 14:23:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks from (http://palominodb.com/) are being added to one list.
2013-05-28 14:23:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-28 14:23:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 129
2013-05-28 14:23:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-28 14:23:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 129
2013-05-28 14:23:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://palominodb.com/).
2013-05-28 14:23:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-28 14:23:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://palominodb.com/).
2013-05-28 14:23:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed frame Source Links size in (http://palominodb.com/) is 0
2013-05-28 14:23:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Size of hyperLinks found in (http://palominodb.com/) before duplicate elimination is 129
2013-05-28 14:23:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - Size of hyperLinks found in (http://palominodb.com/) after duplicate elimination is 92
2013-05-28 14:23:12 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing XML file to post to frontier
2013-05-28 14:23:12 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 14:23:12 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 14:23:12 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from HTML to Frontier Server is: http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 14:23:12 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 14:23:12 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <!--
  ~ JBoss, Home of Professional Open Source.
  ~ Copyright (c) 2011, Red Hat, Inc., and individual contributors
  ~ as indicated by the @author tags. See the copyright.txt file in the
  ~ distribution for a full listing of individual contributors.
  ~
  ~ This is free software; you can redistribute it and/or modify it
  ~ under the terms of the GNU Lesser General Public License as
  ~ published by the Free Software Foundation; either version 2.1 of
  ~ the License, or (at your option) any later version.
  ~
  ~ This software is distributed in the hope that it will be useful,
  ~ but WITHOUT ANY WARRANTY; without even the implied warranty of
  ~ MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
  ~ Lesser General Public License for more details.
  ~
  ~ You should have received a copy of the GNU Lesser General Public
  ~ License along with this software; if not, write to the Free
  ~ Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
  ~ 02110-1301 USA, or see the FSF site: http://www.fsf.org.
  -->
<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">

<html>
<head>
  <title>Welcome to JBoss Application Server 7</title>
  <link rel="shortcut icon" href="favicon.ico" type="image/x-icon">
  <link rel="StyleSheet" href="as7_style.css" type="text/css">
</head>

<body>
  <div class="wrapper">
    <div class="as7">
      <img src="as7_logo.png" alt="JBoss Application Server 7... it's here." width="195" height="228" border="0">
    </div>

    <div class="content">
      <h1>Welcome to AS 7</h1>

      <h3>Your JBoss Application Server 7 is running.</h3>

      <p><a href="documentation.html">Documentation</a> | <a href="https://docs.jboss.org/author/display/AS71/Quickstarts">Quickstarts</a> | <a href="/console">Administration Console</a> <br/>

      <a href="http://www.jboss.org/jbossas"><br>
      JBoss AS Project</a> | <a href="http://community.jboss.org/en/jbossas/as7_users?view=all">User
      Forum</a> | <a href=
      "https://issues.jboss.org/browse/AS7">Report an issue</a></p>

      <p class="logos"><a href="http://jboss.org"><img src="jboss_community.png" alt="JBoss and JBoss Community" width=
      "254" height="31" border="0"></a></p>

      <p class="note">To replace this page set "enable-welcome-root" to false in your server configuration and deploy
      your own war with / as its context path.</p>
    </div>
  </div>
</body>
</html>

2013-05-28 14:23:12 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
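Each ConstructXML/postXML cycle above builds an XML list of the parsed URLs and POSTs it to the frontier at http://127.0.0.1:8080/?operation=urlsposted. A minimal sketch of the payload-building half; the `<urls>`/`<url>` element names are assumptions, since the log never shows the actual schema:

```java
import java.util.List;

// Hypothetical sketch of building the URL list that ConstructXML posts
// to the frontier. The <urls>/<url> element names are assumed, not
// taken from the crawler's real schema.
public class FrontierPayloadSketch {
    public static String buildXml(List<String> urls) {
        StringBuilder sb = new StringBuilder("<urls>");
        for (String url : urls) {
            sb.append("<url>").append(url).append("</url>");
        }
        return sb.append("</urls>").toString();
    }

    public static void main(String[] args) {
        System.out.println(buildXml(List.of("http://palominodb.com/")));
    }
}
```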
2013-05-28 14:23:13 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://www.continuent.com/).
2013-05-28 14:23:13 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-28 14:23:13 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-28 14:23:13 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.continuent.com/).
2013-05-28 14:23:13 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-28 14:23:13 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.continuent.com/).
2013-05-28 14:23:13 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed frame Source Links size in (http://www.continuent.com/) is 0
2013-05-28 14:23:13 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 45
2013-05-28 14:23:13 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 33
2013-05-28 14:23:13 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 14:23:13 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 14:23:13 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 14:23:13 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is: http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 14:23:13 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 14:23:13 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is the JBoss AS 7 welcome page (identical to the response body logged above; duplicate HTML omitted)
2013-05-28 14:23:13 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-28 14:23:13 DEBUG - [com.wsc.Crawler.WSCrawler]:108 - Cleaning Crawler resources.
2013-05-28 14:23:13 DEBUG - [com.wsc.Crawler.WSCrawler]:116 - Deleting Lock file
2013-05-28 14:23:13 DEBUG - [com.wsc.Crawler.WSCrawler]:118 - Lock file Deleted
2013-05-28 14:25:55 DEBUG - [com.wsc.crawler.init.Initializer]:80 - Initializing crawler resources...
2013-05-28 14:25:55 DEBUG - [com.wsc.crawler.init.Initializer]:109 - Another instance of WSCrawler is running.
2013-05-28 14:25:55 WARN  - [com.wsc.crawler.init.Initializer]:110 - Unable to start new instance of the WSCrawler
2013-05-28 14:25:55 INFO  - [com.wsc.crawler.init.Initializer]:111 - Delete crawler.running file manually.
2013-05-28 14:25:55 DEBUG - [com.wsc.crawler.init.Initializer]:184 - Deleting crawler.running file.
2013-05-28 14:25:55 DEBUG - [com.wsc.crawler.init.Initializer]:186 - crawler.running file is deleted.
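The crawler.running episode above is a lock-file guard: a new instance refuses to start while the file exists, and the stale lock must be deleted before a restart succeeds. A minimal sketch of such a guard, assuming only the file name from the log (the class and method names are illustrative):

```java
import java.io.File;
import java.io.IOException;

// Sketch of a lock-file guard like the crawler.running check logged
// above. createNewFile() is atomic: it returns false when the lock
// already exists, which is how a second instance detects the first.
public class LockFileSketch {
    public static boolean tryLock(File lockFile) {
        try {
            return lockFile.createNewFile();
        } catch (IOException e) {
            return false;
        }
    }

    public static void releaseLock(File lockFile) {
        lockFile.delete();
    }

    public static void main(String[] args) {
        File lock = new File(System.getProperty("java.io.tmpdir"), "crawler.running");
        lock.delete(); // clear any stale lock from a previous run
        System.out.println(tryLock(lock));  // true: lock acquired
        System.out.println(tryLock(lock));  // false: a second instance is blocked
        releaseLock(lock);
    }
}
```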
2013-05-28 14:25:59 DEBUG - [com.wsc.crawler.init.Initializer]:80 - Initializing crawler resources...
2013-05-28 14:25:59 INFO  - [com.wsc.crawler.init.Initializer]:161 - crawler.running Created Successfully.
2013-05-28 14:25:59 INFO  - [com.wsc.crawler.init.Initializer]:213 - Previous instance of crawler was forcibly stopped
2013-05-28 14:25:59 INFO  - [com.wsc.crawler.init.Initializer]:214 - Getting URLs from prev_frontier.xml in temp directory.
2013-05-28 14:25:59 DEBUG - [com.wsc.crawler.init.Initializer]:341 - Checking (prev_frontier.xml) existence in (./temp) directory
2013-05-28 14:25:59 WARN  - [com.wsc.crawler.init.Initializer]:350 - prev_frontier.xml is not found in ./temp
2013-05-28 14:25:59 INFO  - [com.wsc.crawler.init.Initializer]:352 - Trying to get URLs from the default URL source, the Frontier Server
2013-05-28 14:25:59 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:25:59 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-28 14:25:59 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:25:59 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:25:59 DEBUG - [com.wsc.crawler.init.Initializer]:318 - Frontier Server returned a Frontier of size 9
2013-05-28 14:26:00 INFO  - [com.wsc.crawler.grabber.Grabber]:195 - Number of resolved hosts is: 7
2013-05-28 14:26:00 INFO  - [com.wsc.crawler.grabber.Grabber]:197 - Number of unresolved hosts is: 2
2013-05-28 14:26:00 DEBUG - [com.wsc.crawler.grabber.Grabber]:231 - Crawling Bean  ::[http://www.google.co.in/, http://74.125.236.215/, null]
2013-05-28 14:26:00 DEBUG - [com.wsc.crawler.grabber.Grabber]:231 - Crawling Bean  ::[http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:26:00 DEBUG - [com.wsc.crawler.grabber.Grabber]:231 - Crawling Bean  ::[http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:26:00 DEBUG - [com.wsc.crawler.grabber.Grabber]:231 - Crawling Bean  ::[http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:26:00 DEBUG - [com.wsc.crawler.grabber.Grabber]:231 - Crawling Bean  ::[http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:26:00 DEBUG - [com.wsc.crawler.grabber.Grabber]:231 - Crawling Bean  ::[http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:26:00 DEBUG - [com.wsc.crawler.grabber.Grabber]:231 - Crawling Bean  ::[http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
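Each "Crawling Bean" above pairs the original URL with an IP-based twin, and hosts that fail DNS resolution feed the "unresolved hosts" count. A sketch of how that pairing might be produced; the method name and the null-on-failure convention are assumptions, not taken from Grabber:

```java
import java.net.InetAddress;
import java.net.MalformedURLException;
import java.net.URL;
import java.net.UnknownHostException;

// Illustrative sketch of building the IP-based twin of a URL, as seen
// in the "Crawling Bean" log entries above. Returns null when DNS
// resolution fails, a hypothetical path into the "unresolved" count.
public class ResolveSketch {
    public static String toIpUrl(String url) {
        try {
            URL u = new URL(url);
            String ip = InetAddress.getByName(u.getHost()).getHostAddress();
            return u.getProtocol() + "://" + ip + u.getFile();
        } catch (MalformedURLException | UnknownHostException e) {
            return null;
        }
    }

    public static void main(String[] args) {
        System.out.println(toIpUrl("http://localhost/index.html"));
    }
}
```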
2013-05-28 14:26:00 DEBUG - [com.wsc.crawler.grabber.Grabber]:249 - Number of threads is: 7
2013-05-28 14:26:00 INFO  - [com.wsc.crawler.grabber.Grabber]:473 -  - about to get something from [http://www.google.co.in/, http://74.125.236.215/, null]
2013-05-28 14:26:00 INFO  - [com.wsc.crawler.grabber.Grabber]:473 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:26:00 INFO  - [com.wsc.crawler.grabber.Grabber]:473 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:26:00 INFO  - [com.wsc.crawler.grabber.Grabber]:473 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:26:00 INFO  - [com.wsc.crawler.grabber.Grabber]:473 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:26:00 INFO  - [com.wsc.crawler.grabber.Grabber]:473 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:26:00 INFO  - [com.wsc.crawler.grabber.Grabber]:473 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:26:00 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: PREF][value: ID=65f02cf565585db5:FF=0:TM=1369765560:LM=1369765560:S=C4VLSnYQKccDs4zQ][domain: .google.co.in][path: /][expiry: Thu May 28 14:26:00 EDT 2015]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.215"
2013-05-28 14:26:00 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: NID][value: 67=b0QhU1r9uySRQ5-YXc07RXGvs8fUX0r284InW0-6boAR0pwxcP0MfP0igeYdiNNULXTqqXj9kjivZZMzADo_ZSYOzlvftQgXB78LLReEUaVzSz4JijgGx1jZmPQXnyH3][domain: .google.co.in][path: /][expiry: Wed Nov 27 13:26:00 EST 2013]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.215"
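The two "Cookie rejected" warnings above follow directly from fetching google.co.in by its IP: HttpClient validates the cookie's Domain attribute against the request's origin host, and ".google.co.in" can never suffix-match "74.125.236.215". A simplified version of that domain-match rule (not HttpClient's actual implementation):

```java
// Simplified domain-match rule behind the "Cookie rejected" warnings:
// a Domain attribute must equal the origin host or be a dot-suffix of
// it. An IP origin like 74.125.236.215 never matches ".google.co.in".
public class CookieMatchSketch {
    public static boolean domainMatches(String cookieDomain, String originHost) {
        String d = cookieDomain.startsWith(".") ? cookieDomain.substring(1) : cookieDomain;
        return originHost.equals(d) || originHost.endsWith("." + d);
    }

    public static void main(String[] args) {
        System.out.println(domainMatches(".google.co.in", "www.google.co.in")); // true
        System.out.println(domainMatches(".google.co.in", "74.125.236.215"));  // false
    }
}
```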
2013-05-28 14:26:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://www.google.co.in/).
2013-05-28 14:26:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-28 14:26:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-28 14:26:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.google.co.in/).
2013-05-28 14:26:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-28 14:26:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.google.co.in/).
2013-05-28 14:26:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed frame Source Links size in (http://www.google.co.in/) is 0
2013-05-28 14:26:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-28 14:26:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-28 14:26:00 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 14:26:00 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 14:26:00 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 14:26:00 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is: http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 14:26:00 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 14:26:00 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is the JBoss AS 7 welcome page (identical to the response body logged above; duplicate HTML omitted)
2013-05-28 14:26:00 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-28 14:26:01 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: SESS217349ac89be5468506443781e6d5f6c][value: a3c6db5c3bc6457d9cea384941b821b1][domain: .palominodb.com][path: /][expiry: Thu Jun 20 17:59:25 EDT 2013]". Illegal domain attribute "palominodb.com". Domain of origin: "173.236.53.234"
2013-05-28 14:26:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://www.continuent.com/).
2013-05-28 14:26:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-28 14:26:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-28 14:26:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.continuent.com/).
2013-05-28 14:26:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-28 14:26:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.continuent.com/).
2013-05-28 14:26:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed frame Source Links size in (http://www.continuent.com/) is 0
2013-05-28 14:26:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 45
2013-05-28 14:26:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 33
2013-05-28 14:26:01 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 14:26:01 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 14:26:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://effectivemysql.com/).
2013-05-28 14:26:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-28 14:26:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-28 14:26:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://effectivemysql.com/).
2013-05-28 14:26:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-28 14:26:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://effectivemysql.com/).
2013-05-28 14:26:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed frame Source Links size in (http://effectivemysql.com/) is 0
2013-05-28 14:26:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 102
2013-05-28 14:26:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 87
2013-05-28 14:26:01 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 14:26:01 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 14:26:01 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 14:26:01 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is: http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 14:26:01 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 14:26:01 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is: http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 14:26:01 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 14:26:01 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is the JBoss AS 7 welcome page (identical to the response body logged above; duplicate HTML omitted)
2013-05-28 14:26:01 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-28 14:26:01 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 14:26:01 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is the JBoss AS 7 welcome page (identical to the response body logged above; duplicate HTML omitted)
2013-05-28 14:26:01 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-28 14:26:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://www.pythian.com/).
2013-05-28 14:26:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-28 14:26:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-28 14:26:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-28 14:26:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-28 14:26:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.pythian.com/).
2013-05-28 14:26:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed frame Source Links size in (http://www.pythian.com/) is 0
2013-05-28 14:26:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 132
2013-05-28 14:26:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 90
2013-05-28 14:26:02 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 14:26:02 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 14:26:02 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 14:26:02 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is: http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 14:26:02 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 14:26:02 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is the JBoss AS 7 welcome page (identical to the response body logged above; duplicate HTML omitted)
2013-05-28 14:26:02 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing POST connection after posting URLs
2013-05-28 14:26:03 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://palominodb.com/).
2013-05-28 14:26:03 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-28 14:26:03 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 129
2013-05-28 14:26:03 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-28 14:26:03 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 129
2013-05-28 14:26:03 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://palominodb.com/).
2013-05-28 14:26:03 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-28 14:26:03 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://palominodb.com/).
2013-05-28 14:26:03 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed frame Source Links size in (http://palominodb.com/) is 0
2013-05-28 14:26:03 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 129
2013-05-28 14:26:03 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 92
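The before/after duplicate-elimination counts above (129 parsed, 92 kept) imply the parser drops repeated URLs while keeping first-seen order. A minimal sketch of such a step, assuming order-preserving deduplication; the class and method names are illustrative, not the actual HyperLinkParser code:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.LinkedHashSet;
import java.util.List;

public class DedupSketch {
    // Drop repeated URLs but keep first-seen order, matching the
    // "Before/After duplicate Elimination" counts in the log.
    static List<String> eliminateDuplicates(List<String> links) {
        return new ArrayList<>(new LinkedHashSet<>(links));
    }

    public static void main(String[] args) {
        List<String> links = Arrays.asList("/blog", "/about", "/blog", "/contact", "/about");
        System.out.println(eliminateDuplicates(links)); // [/blog, /about, /contact]
    }
}
```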
2013-05-28 14:26:03 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 14:26:03 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 14:26:03 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 14:26:03 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 14:26:03 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 14:26:03 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is the JBoss AS 7 welcome page (identical to the response body shown above).
2013-05-28 14:26:03 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing POST connection after posting URLs
2013-05-28 14:26:03 DEBUG - [com.wsc.Crawler.WSCrawler]:108 - Cleaning Crawler resources.
2013-05-28 14:26:03 DEBUG - [com.wsc.Crawler.WSCrawler]:116 - Deleting Lock file
2013-05-28 14:26:03 DEBUG - [com.wsc.Crawler.WSCrawler]:118 - Lock file Deleted
2013-05-28 14:30:39 DEBUG - [com.wsc.crawler.init.Initializer]:80 - Initializing crawler resources...
2013-05-28 14:30:39 DEBUG - [com.wsc.crawler.init.Initializer]:131 - Creating lock file.
2013-05-28 14:30:39 INFO  - [com.wsc.crawler.init.Initializer]:136 - Lockfile crawler.lock created successfully.
2013-05-28 14:30:39 DEBUG - [com.wsc.crawler.init.Initializer]:109 - Another instance of WSCrawler is running.
2013-05-28 14:30:39 WARN  - [com.wsc.crawler.init.Initializer]:110 - Unable to start a new instance of WSCrawler
2013-05-28 14:30:39 INFO  - [com.wsc.crawler.init.Initializer]:111 - Delete crawler.running file manually.
2013-05-28 14:30:39 DEBUG - [com.wsc.crawler.init.Initializer]:184 - Deleting crawler.running file.
2013-05-28 14:30:39 DEBUG - [com.wsc.crawler.init.Initializer]:186 - crawler.running file is deleted.
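The Initializer entries above describe a single-instance guard: a crawler.running file is created at startup, and its presence signals that another (or a crashed) instance holds the lock. A minimal sketch of that pattern, assuming `File.createNewFile()` as the atomic check; the class and method names are illustrative, not the actual Initializer code:

```java
import java.io.File;
import java.io.IOException;

public class LockFileSketch {
    // Atomic single-instance check: createNewFile() returns false if
    // the lock file already exists, i.e. another crawler instance is
    // running (or crashed without cleaning up, as in the log above).
    static boolean tryAcquire(File lock) {
        try {
            return lock.createNewFile();
        } catch (IOException e) {
            return false; // treat I/O failure as "could not acquire"
        }
    }

    public static void main(String[] args) {
        File lock = new File(System.getProperty("java.io.tmpdir"), "crawler.running.demo");
        lock.delete(); // start clean for the demo
        System.out.println(tryAcquire(lock)); // true  - first instance wins
        System.out.println(tryAcquire(lock)); // false - second instance refused
        lock.delete(); // clean up, like the log's "Deleting crawler.running file" step
    }
}
```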
2013-05-28 14:30:43 DEBUG - [com.wsc.crawler.init.Initializer]:80 - Initializing crawler resources...
2013-05-28 14:30:43 INFO  - [com.wsc.crawler.init.Initializer]:161 - crawler.running created successfully.
2013-05-28 14:30:43 INFO  - [com.wsc.crawler.init.Initializer]:213 - Previous instance of crawler was forcibly stopped.
2013-05-28 14:30:43 INFO  - [com.wsc.crawler.init.Initializer]:214 - Getting URLs from prev_frontier.xml in the temp directory.
2013-05-28 14:30:43 DEBUG - [com.wsc.crawler.init.Initializer]:341 - Checking (prev_frontier.xml) existence in (./temp) directory
2013-05-28 14:30:43 WARN  - [com.wsc.crawler.init.Initializer]:350 - prev_frontier.xml is not found in ./temp
2013-05-28 14:30:43 INFO  - [com.wsc.crawler.init.Initializer]:352 - Trying to get URLs from the default URL source, the Frontier Server
2013-05-28 14:30:43 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:30:43 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-28 14:30:43 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:30:43 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:30:43 DEBUG - [com.wsc.crawler.init.Initializer]:318 - Frontier Server returned a Frontier of size 9
2013-05-28 14:30:47 INFO  - [com.wsc.crawler.grabber.Grabber]:195 - Number of resolved hosts is: 7
2013-05-28 14:30:47 INFO  - [com.wsc.crawler.grabber.Grabber]:197 - Number of unresolved hosts is: 2
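The 9-entry frontier splits into 7 resolved and 2 unresolved hosts, which suggests a DNS lookup pass before crawling (the Crawling Bean entries below pair each URL with its resolved IP). A minimal sketch of that pass, assuming `InetAddress.getByName` as the resolver; the class and method names are illustrative, not the actual Grabber code:

```java
import java.net.InetAddress;
import java.net.UnknownHostException;

public class ResolveSketch {
    // Resolve a frontier host to its IP, or return null so the caller
    // can count it among the unresolved hosts.
    static String resolveOrNull(String host) {
        try {
            return InetAddress.getByName(host).getHostAddress();
        } catch (UnknownHostException e) {
            return null; // counted as an unresolved host
        }
    }

    public static void main(String[] args) {
        // localhost resolves via the hosts file, without network access.
        System.out.println(resolveOrNull("localhost") != null);      // true
        // The .invalid TLD is reserved and never resolves.
        System.out.println(resolveOrNull("no-such-host.invalid"));   // null
    }
}
```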
2013-05-28 14:30:47 DEBUG - [com.wsc.crawler.grabber.Grabber]:231 - Crawling Bean  ::[http://www.google.co.in/, http://173.194.38.159/, null]
2013-05-28 14:30:47 DEBUG - [com.wsc.crawler.grabber.Grabber]:231 - Crawling Bean  ::[http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:30:47 DEBUG - [com.wsc.crawler.grabber.Grabber]:231 - Crawling Bean  ::[http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:30:47 DEBUG - [com.wsc.crawler.grabber.Grabber]:231 - Crawling Bean  ::[http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:30:47 DEBUG - [com.wsc.crawler.grabber.Grabber]:231 - Crawling Bean  ::[http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:30:47 DEBUG - [com.wsc.crawler.grabber.Grabber]:231 - Crawling Bean  ::[http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:30:47 DEBUG - [com.wsc.crawler.grabber.Grabber]:231 - Crawling Bean  ::[http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:30:47 DEBUG - [com.wsc.crawler.grabber.Grabber]:249 - Number of threads is: 7
2013-05-28 14:30:47 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.google.co.in/, http://173.194.38.159/, null]
2013-05-28 14:30:47 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:30:47 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:30:48 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:30:48 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:30:48 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:30:48 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:30:48 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: PREF][value: ID=a421219164e7a128:FF=0:TM=1369765848:LM=1369765848:S=BS6gUEykNIeCk90w][domain: .google.co.in][path: /][expiry: Thu May 28 14:30:48 EDT 2015]". Illegal domain attribute "google.co.in". Domain of origin: "173.194.38.159"
2013-05-28 14:30:48 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: NID][value: 67=A59rud9UDwrRVY-VJlVu1wrlXIV0GgrO-w4jUyFtWHpVpVLi7A5qUervt3tVpecabraj3cKlTyxhH8HHdOR2T4FT5Ab-VpIrbXjU3EMvdiV6G8sHEMTF1jRfNrMU80_T][domain: .google.co.in][path: /][expiry: Wed Nov 27 13:30:48 EST 2013]". Illegal domain attribute "google.co.in". Domain of origin: "173.194.38.159"
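The two "Cookie rejected" warnings are expected: the Grabber fetches pages by resolved IP (173.194.38.159), but the Set-Cookie headers declare domain=.google.co.in, and cookie domain matching never accepts an IP origin against a hostname domain. A minimal sketch of the tail-match rule that fails here; this is a simplified illustration, not the actual HttpClient cookie-spec implementation:

```java
public class CookieDomainCheck {
    // Simplified tail-match rule: a cookie domain like ".google.co.in"
    // matches the origin host only if the origin equals the domain or
    // is a subdomain of it. An IP origin can never tail-match.
    static boolean domainMatches(String originHost, String cookieDomain) {
        String d = cookieDomain.startsWith(".") ? cookieDomain.substring(1) : cookieDomain;
        return originHost.equals(d) || originHost.endsWith("." + d);
    }

    public static void main(String[] args) {
        System.out.println(domainMatches("www.google.co.in", ".google.co.in")); // true
        System.out.println(domainMatches("173.194.38.159", ".google.co.in"));   // false - hence the warnings
    }
}
```

Fetching by hostname instead of raw IP would make the origin match and let the cookies through.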
2013-05-28 14:30:48 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://www.google.co.in/).
2013-05-28 14:30:48 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-28 14:30:48 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-28 14:30:48 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-28 14:30:48 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-28 14:30:48 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.google.co.in/).
2013-05-28 14:30:48 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-28 14:30:48 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.google.co.in/).
2013-05-28 14:30:48 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed frame Source Links size in (http://www.google.co.in/) is 0
2013-05-28 14:30:48 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-28 14:30:48 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-28 14:30:48 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 14:30:48 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 14:30:48 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 14:30:48 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 14:30:48 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 14:30:48 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is the JBoss AS 7 welcome page (identical to the response body shown above).
2013-05-28 14:30:48 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing POST connection after posting URLs
2013-05-28 14:30:49 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: SESS217349ac89be5468506443781e6d5f6c][value: 43b2cbc0e1ea1d12ddd977d8cda3186b][domain: .palominodb.com][path: /][expiry: Thu Jun 20 18:04:13 EDT 2013]". Illegal domain attribute "palominodb.com". Domain of origin: "173.236.53.234"
2013-05-28 14:30:49 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://www.continuent.com/).
2013-05-28 14:30:49 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-28 14:30:49 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-28 14:30:49 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-28 14:30:49 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-28 14:30:49 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.continuent.com/).
2013-05-28 14:30:49 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-28 14:30:49 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.continuent.com/).
2013-05-28 14:30:49 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed frame Source Links size in (http://www.continuent.com/) is 0
2013-05-28 14:30:49 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 45
2013-05-28 14:30:49 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 33
2013-05-28 14:30:49 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 14:30:49 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 14:30:49 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 14:30:49 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 14:30:49 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 14:30:49 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is the JBoss AS 7 welcome page (identical to the response body shown above).
2013-05-28 14:30:49 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing POST connection after posting URLs
2013-05-28 14:30:50 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://www.pythian.com/).
2013-05-28 14:30:50 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-28 14:30:50 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-28 14:30:50 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-28 14:30:50 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-28 14:30:50 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-28 14:30:50 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-28 14:30:50 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-28 14:30:50 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-28 14:30:50 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.pythian.com/).
2013-05-28 14:30:50 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed frame Source Links size in (http://www.pythian.com/) is 0
2013-05-28 14:30:50 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 132
2013-05-28 14:30:50 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 90
2013-05-28 14:30:50 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 14:30:50 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 14:30:50 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 14:30:50 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 14:30:50 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 14:30:50 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is the JBoss AS 7 welcome page (identical to the response body shown above).
2013-05-28 14:30:50 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing POST connection after posting URLs
2013-05-28 14:30:50 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://effectivemysql.com/).
2013-05-28 14:30:50 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-28 14:30:50 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-28 14:30:50 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-28 14:30:50 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-28 14:30:50 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://effectivemysql.com/).
2013-05-28 14:30:50 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-28 14:30:50 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://effectivemysql.com/).
2013-05-28 14:30:50 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed frame Source Links size in (http://effectivemysql.com/) is 0
2013-05-28 14:30:50 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 102
2013-05-28 14:30:50 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 87
2013-05-28 14:30:50 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 14:30:50 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 14:30:50 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 14:30:50 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 14:30:50 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 14:30:50 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is the JBoss AS 7 welcome page (identical to the response body shown above).
2013-05-28 14:30:50 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing POST connection after posting URLs
2013-05-28 14:30:52 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://palominodb.com/).
2013-05-28 14:30:52 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-28 14:30:52 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 129
2013-05-28 14:30:52 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-28 14:30:52 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 129
2013-05-28 14:30:52 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://palominodb.com/).
2013-05-28 14:30:52 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-28 14:30:52 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://palominodb.com/).
2013-05-28 14:30:52 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-28 14:30:52 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 129
2013-05-28 14:30:52 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 92
2013-05-28 14:30:52 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 14:30:52 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 14:30:52 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 14:30:52 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 14:30:52 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 14:30:52 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <!--
  ~ JBoss, Home of Professional Open Source.
  ~ Copyright (c) 2011, Red Hat, Inc., and individual contributors
  ~ as indicated by the @author tags. See the copyright.txt file in the
  ~ distribution for a full listing of individual contributors.
  ~
  ~ This is free software; you can redistribute it and/or modify it
  ~ under the terms of the GNU Lesser General Public License as
  ~ published by the Free Software Foundation; either version 2.1 of
  ~ the License, or (at your option) any later version.
  ~
  ~ This software is distributed in the hope that it will be useful,
  ~ but WITHOUT ANY WARRANTY; without even the implied warranty of
  ~ MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
  ~ Lesser General Public License for more details.
  ~
  ~ You should have received a copy of the GNU Lesser General Public
  ~ License along with this software; if not, write to the Free
  ~ Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
  ~ 02110-1301 USA, or see the FSF site: http://www.fsf.org.
  -->
<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">

<html>
<head>
  <title>Welcome to JBoss Application Server 7</title>
  <link rel="shortcut icon" href="favicon.ico" type="image/x-icon">
  <link rel="StyleSheet" href="as7_style.css" type="text/css">
</head>

<body>
  <div class="wrapper">
    <div class="as7">
      <img src="as7_logo.png" alt="JBoss Application Server 7... it's here." width="195" height="228" border="0">
    </div>

    <div class="content">
      <h1>Welcome to AS 7</h1>

      <h3>Your JBoss Application Server 7 is running.</h3>

      <p><a href="documentation.html">Documentation</a> | <a href="https://docs.jboss.org/author/display/AS71/Quickstarts">Quickstarts</a> | <a href="/console">Administration Console</a> <br/>

      <a href="http://www.jboss.org/jbossas"><br>
      JBoss AS Project</a> | <a href="http://community.jboss.org/en/jbossas/as7_users?view=all">User
      Forum</a> | <a href=
      "https://issues.jboss.org/browse/AS7">Report an issue</a></p>

      <p class="logos"><a href="http://jboss.org"><img src="jboss_community.png" alt="JBoss and JBoss Community" width=
      "254" height="31" border="0"></a></p>

      <p class="note">To replace this page set "enable-welcome-root" to false in your server configuration and deploy
      your own war with / as its context path.</p>
    </div>
  </div>
</body>
</html>

2013-05-28 14:30:52 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-28 14:30:52 DEBUG - [com.wsc.Crawler.WSCrawler]:108 - Cleaning Crawler resources.
2013-05-28 14:30:52 DEBUG - [com.wsc.Crawler.WSCrawler]:116 - Deleting Lock file
2013-05-28 14:30:52 DEBUG - [com.wsc.Crawler.WSCrawler]:118 - Lock file Deleted
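Editor's note on the POST above: the frontier returns status 200, but the body is the stock JBoss AS 7 welcome page rather than any frontier acknowledgement — the request to http://127.0.0.1:8080/?operation=urlsposted is almost certainly being served by the container's welcome root, not the frontier application (the page itself says to set "enable-welcome-root" to false and deploy a war at the / context path). A minimal sketch of a body check that would catch this; the class name, method name, and marker string are illustrative assumptions, not part of the crawler:

```java
// Hypothetical response check (names assumed, not taken from the crawler):
// a 200 whose body is the JBoss welcome page means the frontier app never
// actually handled the POST.
public class FrontierResponseCheck {

    static boolean isFrontierAck(int status, String body) {
        // The welcome root also answers 200 with its own HTML, so the status
        // code alone cannot confirm delivery; reject welcome-page bodies.
        return status == 200
                && !body.contains("Welcome to JBoss Application Server 7");
    }

    public static void main(String[] args) {
        String welcome = "<title>Welcome to JBoss Application Server 7</title>";
        System.out.println(isFrontierAck(200, welcome));   // false: welcome page
        System.out.println(isFrontierAck(200, "<ack/>"));  // true
    }
}
```

Validating the body alongside the status code in WSCrawler's postXML() path would surface this misrouting instead of logging the POST as successful.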
2013-05-28 14:30:57 INFO  - [com.wsc.crawler.grabber.Grabber]:353 - Queue is Empty...!
2013-05-28 14:30:57 INFO  - [com.wsc.crawler.grabber.Grabber]:354 - Refilling Queue...!
2013-05-28 14:30:57 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:30:57 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-28 14:30:57 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:30:57 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:30:57 INFO  - [com.wsc.crawler.grabber.Grabber]:108 - Queue is Set in Grabber is 9
2013-05-28 14:31:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:289 - Thread Count is less than numthreads In Thread :20
2013-05-28 14:31:07 INFO  - [com.wsc.crawler.grabber.Grabber]:333 - Queue is empty in for loop
2013-05-28 14:31:07 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:31:07 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-28 14:31:07 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:31:07 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:31:07 INFO  - [com.wsc.crawler.grabber.Grabber]:108 - Queue is Set in Grabber is 9
2013-05-28 14:31:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (0) is13
2013-05-28 14:31:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 0 is dead, restarting it.
2013-05-28 14:31:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 0 started.
2013-05-28 14:31:07 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.google.co.in/, http://173.194.38.159/, null]
2013-05-28 14:31:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:07 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 0 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (1) is14
2013-05-28 14:31:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 1 is dead, restarting it.
2013-05-28 14:31:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 1 started.
2013-05-28 14:31:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:07 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:31:07 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 1 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (2) is15
2013-05-28 14:31:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 2 is dead, restarting it.
2013-05-28 14:31:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 2 started.
2013-05-28 14:31:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:07 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:31:07 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 2 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (3) is16
2013-05-28 14:31:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 3 is dead, restarting it.
2013-05-28 14:31:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 3 started.
2013-05-28 14:31:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:07 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:31:07 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 3 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (4) is17
2013-05-28 14:31:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 4 is dead, restarting it.
2013-05-28 14:31:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 4 started.
2013-05-28 14:31:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:07 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:31:07 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 4 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (5) is18
2013-05-28 14:31:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 5 is dead, restarting it.
2013-05-28 14:31:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 5 started.
2013-05-28 14:31:07 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:31:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:07 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 5 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (6) is19
2013-05-28 14:31:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 6 is dead, restarting it.
2013-05-28 14:31:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 6 started.
2013-05-28 14:31:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:07 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:31:07 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 6 - error: java.lang.IllegalStateException: Connection pool shut down
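Editor's note on the repeated failures above: `java.lang.IllegalStateException: Connection pool shut down` is what Apache HttpClient throws when a request executes against a client whose pooling connection manager has already been shut down. Since WSCrawler logged "Cleaning Crawler resources" just before this burst, the likely sequence is that cleanup shut down a connection manager the Grabber threads still share — so restarting dead threads cannot help; every new thread leases from the same dead pool. A minimal model of that failure mode (this class stands in for the pooling manager; its names are illustrative):

```java
// Minimal model (not Apache HttpClient itself) of the failure mode in the
// log: once a shared connection pool is shut down, every later lease attempt
// throws IllegalStateException, no matter how often workers are restarted.
public class SharedPoolModel {
    private volatile boolean shutDown = false;

    // Lease a connection; fails permanently after shutdown().
    public String lease() {
        if (shutDown) {
            throw new IllegalStateException("Connection pool shut down");
        }
        return "connection";
    }

    public void shutdown() { shutDown = true; }

    public static void main(String[] args) {
        SharedPoolModel pool = new SharedPoolModel();
        pool.lease();        // fine before cleanup
        pool.shutdown();     // corresponds to WSCrawler's resource cleanup
        try {
            pool.lease();    // what every restarted Grabber thread hits
        } catch (IllegalStateException e) {
            System.out.println("error: " + e.getMessage());
        }
    }
}
```

Until the Grabber gets a fresh client (or the shared manager is shut down only after all workers have stopped), each restarted thread will hit the same exception, which matches the loop repeating in the log.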
2013-05-28 14:31:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:289 - Thread Count is less than numthreads In Thread :20
2013-05-28 14:31:12 INFO  - [com.wsc.crawler.grabber.Grabber]:333 - Queue is empty in for loop
2013-05-28 14:31:12 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:31:12 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-28 14:31:12 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:31:12 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:31:12 INFO  - [com.wsc.crawler.grabber.Grabber]:108 - Queue is Set in Grabber is 9
2013-05-28 14:31:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (0) is23
2013-05-28 14:31:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 0 is dead, restarting it.
2013-05-28 14:31:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 0 started.
2013-05-28 14:31:12 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.google.co.in/, http://173.194.38.151/, null]
2013-05-28 14:31:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:12 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 0 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (1) is24
2013-05-28 14:31:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 1 is dead, restarting it.
2013-05-28 14:31:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 1 started.
2013-05-28 14:31:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:12 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:31:12 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 1 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (2) is25
2013-05-28 14:31:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 2 is dead, restarting it.
2013-05-28 14:31:12 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:31:12 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 2 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 2 started.
2013-05-28 14:31:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (3) is26
2013-05-28 14:31:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 3 is dead, restarting it.
2013-05-28 14:31:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 3 started.
2013-05-28 14:31:12 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:31:12 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 3 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (4) is27
2013-05-28 14:31:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 4 is dead, restarting it.
2013-05-28 14:31:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 4 started.
2013-05-28 14:31:12 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:31:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:12 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 4 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (5) is28
2013-05-28 14:31:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 5 is dead, restarting it.
2013-05-28 14:31:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 5 started.
2013-05-28 14:31:12 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:31:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:12 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 5 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (6) is29
2013-05-28 14:31:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 6 is dead, restarting it.
2013-05-28 14:31:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 6 started.
2013-05-28 14:31:12 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:31:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:12 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 6 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:18 DEBUG - [com.wsc.crawler.grabber.Grabber]:289 - Thread Count is less than numthreads In Thread :20
2013-05-28 14:31:18 INFO  - [com.wsc.crawler.grabber.Grabber]:333 - Queue is empty in for loop
2013-05-28 14:31:18 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:31:18 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-28 14:31:18 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:31:18 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:31:18 INFO  - [com.wsc.crawler.grabber.Grabber]:108 - Queue is Set in Grabber is 9
2013-05-28 14:31:18 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (0) is31
2013-05-28 14:31:18 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 0 is dead, restarting it.
2013-05-28 14:31:18 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 0 started.
2013-05-28 14:31:18 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:18 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.google.co.in/, http://173.194.38.152/, null]
2013-05-28 14:31:18 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 0 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:18 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (1) is32
2013-05-28 14:31:18 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 1 is dead, restarting it.
2013-05-28 14:31:18 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 1 started.
2013-05-28 14:31:18 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:18 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:31:18 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 1 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:18 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (2) is33
2013-05-28 14:31:18 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 2 is dead, restarting it.
2013-05-28 14:31:18 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 2 started.
2013-05-28 14:31:18 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:18 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:31:18 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 2 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:18 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (3) is34
2013-05-28 14:31:18 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 3 is dead, restarting it.
2013-05-28 14:31:18 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 3 started.
2013-05-28 14:31:18 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:18 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:31:18 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 3 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:18 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (4) is35
2013-05-28 14:31:18 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 4 is dead, restarting it.
2013-05-28 14:31:18 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 4 started.
2013-05-28 14:31:18 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:18 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:31:18 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 4 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:18 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (5) is36
2013-05-28 14:31:18 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 5 is dead, restarting it.
2013-05-28 14:31:18 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 5 started.
2013-05-28 14:31:18 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:18 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:31:18 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 5 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:18 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (6) is37
2013-05-28 14:31:18 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 6 is dead, restarting it.
2013-05-28 14:31:18 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 6 started.
2013-05-28 14:31:18 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:18 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:31:18 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 6 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:289 - Thread Count is less than numthreads In Thread :20
2013-05-28 14:31:23 INFO  - [com.wsc.crawler.grabber.Grabber]:333 - Queue is empty in for loop
2013-05-28 14:31:23 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:31:23 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-28 14:31:23 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:31:23 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:31:23 INFO  - [com.wsc.crawler.grabber.Grabber]:108 - Queue is Set in Grabber is 9
2013-05-28 14:31:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (0) is39
2013-05-28 14:31:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 0 is dead, restarting it.
2013-05-28 14:31:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 0 started.
2013-05-28 14:31:23 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.google.co.in/, http://173.194.38.159/, null]
2013-05-28 14:31:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:23 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 0 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (1) is40
2013-05-28 14:31:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 1 is dead, restarting it.
2013-05-28 14:31:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 1 started.
2013-05-28 14:31:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:23 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:31:23 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 1 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (2) is41
2013-05-28 14:31:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 2 is dead, restarting it.
2013-05-28 14:31:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 2 started.
2013-05-28 14:31:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:23 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:31:23 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 2 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (3) is42
2013-05-28 14:31:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 3 is dead, restarting it.
2013-05-28 14:31:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 3 started.
2013-05-28 14:31:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:23 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:31:23 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 3 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (4) is43
2013-05-28 14:31:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 4 is dead, restarting it.
2013-05-28 14:31:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 4 started.
2013-05-28 14:31:23 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:31:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:23 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 4 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (5) is44
2013-05-28 14:31:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 5 is dead, restarting it.
2013-05-28 14:31:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 5 started.
2013-05-28 14:31:23 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:31:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:23 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 5 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (6) is45
2013-05-28 14:31:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 6 is dead, restarting it.
2013-05-28 14:31:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 6 started.
2013-05-28 14:31:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:23 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:31:23 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 6 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:26 DEBUG - [clientserver.AsyncHTTPServer]:291 - Request recieved for operation heartbeat
2013-05-28 14:31:26 DEBUG - [clientserver.AsyncHTTPServer]:302 - Request Served Successfully
2013-05-28 14:31:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:289 - Thread Count is less than numthreads In Thread :20
2013-05-28 14:31:28 INFO  - [com.wsc.crawler.grabber.Grabber]:333 - Queue is empty in for loop
2013-05-28 14:31:28 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:31:28 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-28 14:31:28 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:31:28 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:31:28 INFO  - [com.wsc.crawler.grabber.Grabber]:108 - Queue is Set in Grabber is 9
2013-05-28 14:31:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (0) is47
2013-05-28 14:31:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 0 is dead, restarting it.
2013-05-28 14:31:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 0 started.
2013-05-28 14:31:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:28 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.google.co.in/, http://173.194.38.151/, null]
2013-05-28 14:31:28 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 0 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (1) is48
2013-05-28 14:31:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 1 is dead, restarting it.
2013-05-28 14:31:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 1 started.
2013-05-28 14:31:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:28 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:31:28 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 1 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (2) is49
2013-05-28 14:31:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 2 is dead, restarting it.
2013-05-28 14:31:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 2 started.
2013-05-28 14:31:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:28 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:31:28 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 2 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (3) is50
2013-05-28 14:31:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 3 is dead, restarting it.
2013-05-28 14:31:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 3 started.
2013-05-28 14:31:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:28 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:31:28 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 3 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (4) is51
2013-05-28 14:31:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 4 is dead, restarting it.
2013-05-28 14:31:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 4 started.
2013-05-28 14:31:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:28 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:31:28 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 4 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (5) is52
2013-05-28 14:31:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 5 is dead, restarting it.
2013-05-28 14:31:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 5 started.
2013-05-28 14:31:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:28 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:31:28 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 5 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (6) is53
2013-05-28 14:31:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 6 is dead, restarting it.
2013-05-28 14:31:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 6 started.
2013-05-28 14:31:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:28 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:31:28 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 6 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:289 - Thread Count is less than numthreads In Thread :20
2013-05-28 14:31:33 INFO  - [com.wsc.crawler.grabber.Grabber]:333 - Queue is empty in for loop
2013-05-28 14:31:33 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:31:33 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-28 14:31:33 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:31:33 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:31:33 INFO  - [com.wsc.crawler.grabber.Grabber]:108 - Queue is Set in Grabber is 9
2013-05-28 14:31:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (0) is55
2013-05-28 14:31:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 0 is dead, restarting it.
2013-05-28 14:31:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 0 started.
2013-05-28 14:31:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:33 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.google.co.in/, http://173.194.38.152/, null]
2013-05-28 14:31:33 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 0 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (1) is56
2013-05-28 14:31:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 1 is dead, restarting it.
2013-05-28 14:31:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 1 started.
2013-05-28 14:31:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:33 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:31:33 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 1 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (2) is57
2013-05-28 14:31:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 2 is dead, restarting it.
2013-05-28 14:31:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 2 started.
2013-05-28 14:31:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:33 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:31:33 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 2 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (3) is58
2013-05-28 14:31:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 3 is dead, restarting it.
2013-05-28 14:31:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 3 started.
2013-05-28 14:31:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:33 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:31:33 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 3 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (4) is59
2013-05-28 14:31:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 4 is dead, restarting it.
2013-05-28 14:31:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 4 started.
2013-05-28 14:31:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:33 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:31:33 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 4 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (5) is60
2013-05-28 14:31:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 5 is dead, restarting it.
2013-05-28 14:31:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 5 started.
2013-05-28 14:31:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:33 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:31:33 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 5 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (6) is61
2013-05-28 14:31:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 6 is dead, restarting it.
2013-05-28 14:31:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 6 started.
2013-05-28 14:31:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:33 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:31:33 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 6 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:38 DEBUG - [com.wsc.crawler.grabber.Grabber]:289 - Thread Count is less than numthreads In Thread :20
2013-05-28 14:31:38 INFO  - [com.wsc.crawler.grabber.Grabber]:333 - Queue is empty in for loop
2013-05-28 14:31:38 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:31:38 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-28 14:31:38 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:31:38 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:31:38 INFO  - [com.wsc.crawler.grabber.Grabber]:108 - Queue is Set in Grabber is 9
2013-05-28 14:31:38 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (0) is63
2013-05-28 14:31:38 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 0 is dead, restarting it.
2013-05-28 14:31:38 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 0 started.
2013-05-28 14:31:38 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.google.co.in/, http://173.194.38.159/, null]
2013-05-28 14:31:38 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:38 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 0 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:38 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (1) is64
2013-05-28 14:31:38 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 1 is dead, restarting it.
2013-05-28 14:31:38 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:31:38 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 1 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:38 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 1 started.
2013-05-28 14:31:38 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:38 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (2) is65
2013-05-28 14:31:38 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 2 is dead, restarting it.
2013-05-28 14:31:38 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 2 started.
2013-05-28 14:31:38 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:31:38 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 2 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:38 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:38 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (3) is66
2013-05-28 14:31:38 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 3 is dead, restarting it.
2013-05-28 14:31:38 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 3 started.
2013-05-28 14:31:38 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:38 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:31:38 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 3 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:38 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (4) is67
2013-05-28 14:31:38 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 4 is dead, restarting it.
2013-05-28 14:31:38 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 4 started.
2013-05-28 14:31:38 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:38 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:31:38 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 4 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:38 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (5) is68
2013-05-28 14:31:38 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 5 is dead, restarting it.
2013-05-28 14:31:38 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 5 started.
2013-05-28 14:31:38 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:38 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:31:38 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 5 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:38 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (6) is69
2013-05-28 14:31:38 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 6 is dead, restarting it.
2013-05-28 14:31:38 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 6 started.
2013-05-28 14:31:38 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:38 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:31:38 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 6 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:289 - Thread Count is less than numthreads In Thread :20
2013-05-28 14:31:43 INFO  - [com.wsc.crawler.grabber.Grabber]:333 - Queue is empty in for loop
2013-05-28 14:31:43 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:31:43 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-28 14:31:43 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:31:43 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:31:43 INFO  - [com.wsc.crawler.grabber.Grabber]:108 - Queue is Set in Grabber is 9
2013-05-28 14:31:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (0) is71
2013-05-28 14:31:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 0 is dead, restarting it.
2013-05-28 14:31:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 0 started.
2013-05-28 14:31:43 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.google.co.in/, http://173.194.38.151/, null]
2013-05-28 14:31:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:43 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 0 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (1) is72
2013-05-28 14:31:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 1 is dead, restarting it.
2013-05-28 14:31:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 1 started.
2013-05-28 14:31:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:43 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:31:43 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 1 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (2) is73
2013-05-28 14:31:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 2 is dead, restarting it.
2013-05-28 14:31:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 2 started.
2013-05-28 14:31:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:43 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:31:43 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 2 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (3) is74
2013-05-28 14:31:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 3 is dead, restarting it.
2013-05-28 14:31:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 3 started.
2013-05-28 14:31:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:43 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:31:43 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 3 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (4) is75
2013-05-28 14:31:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 4 is dead, restarting it.
2013-05-28 14:31:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 4 started.
2013-05-28 14:31:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:43 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:31:43 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 4 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (5) is76
2013-05-28 14:31:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 5 is dead, restarting it.
2013-05-28 14:31:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 5 started.
2013-05-28 14:31:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:43 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:31:43 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 5 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (6) is77
2013-05-28 14:31:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 6 is dead, restarting it.
2013-05-28 14:31:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 6 started.
2013-05-28 14:31:43 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:43 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:31:43 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 6 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:49 DEBUG - [com.wsc.crawler.grabber.Grabber]:289 - Thread Count is less than numthreads In Thread :20
2013-05-28 14:31:49 INFO  - [com.wsc.crawler.grabber.Grabber]:333 - Queue is empty in for loop
2013-05-28 14:31:49 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:31:49 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-28 14:31:49 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:31:49 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:31:49 INFO  - [com.wsc.crawler.grabber.Grabber]:108 - Queue is Set in Grabber is 9
2013-05-28 14:31:49 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (0) is79
2013-05-28 14:31:49 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 0 is dead, restarting it.
2013-05-28 14:31:49 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 0 started.
2013-05-28 14:31:49 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:49 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.google.co.in/, http://173.194.38.152/, null]
2013-05-28 14:31:49 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 0 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:49 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (1) is80
2013-05-28 14:31:49 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 1 is dead, restarting it.
2013-05-28 14:31:49 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 1 started.
2013-05-28 14:31:49 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:49 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:31:49 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 1 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:49 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (2) is81
2013-05-28 14:31:49 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 2 is dead, restarting it.
2013-05-28 14:31:49 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 2 started.
2013-05-28 14:31:49 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:31:49 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:49 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 2 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:49 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (3) is82
2013-05-28 14:31:49 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 3 is dead, restarting it.
2013-05-28 14:31:49 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 3 started.
2013-05-28 14:31:49 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:49 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:31:49 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 3 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:49 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (4) is83
2013-05-28 14:31:49 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 4 is dead, restarting it.
2013-05-28 14:31:49 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 4 started.
2013-05-28 14:31:49 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:49 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:31:49 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 4 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:49 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (5) is84
2013-05-28 14:31:49 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 5 is dead, restarting it.
2013-05-28 14:31:49 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 5 started.
2013-05-28 14:31:49 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:49 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:31:49 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 5 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:49 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (6) is85
2013-05-28 14:31:49 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 6 is dead, restarting it.
2013-05-28 14:31:49 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 6 started.
2013-05-28 14:31:49 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:49 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:31:49 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 6 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:50 INFO  - [clientserver.AsyncHTTPServer]:306 - Server Recieved a Stop Request...
2013-05-28 14:31:50 INFO  - [clientserver.AsyncHTTPServer]:307 - Trying to stop crawler...
2013-05-28 14:31:54 DEBUG - [com.wsc.crawler.grabber.Grabber]:289 - Thread Count is less than numthreads In Thread :20
2013-05-28 14:31:54 INFO  - [com.wsc.crawler.grabber.Grabber]:333 - Queue is empty in for loop
2013-05-28 14:31:54 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:31:54 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-28 14:31:54 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:31:54 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:31:54 INFO  - [com.wsc.crawler.grabber.Grabber]:108 - Queue is Set in Grabber is 9
2013-05-28 14:31:54 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (0) is87
2013-05-28 14:31:54 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 0 is dead, restarting it.
2013-05-28 14:31:54 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 0 started.
2013-05-28 14:31:54 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.google.co.in/, http://173.194.38.159/, null]
2013-05-28 14:31:54 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:54 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 0 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:54 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (1) is88
2013-05-28 14:31:54 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 1 is dead, restarting it.
2013-05-28 14:31:54 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 1 started.
2013-05-28 14:31:54 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:54 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:31:54 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 1 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:54 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (2) is89
2013-05-28 14:31:54 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 2 is dead, restarting it.
2013-05-28 14:31:54 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 2 started.
2013-05-28 14:31:54 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:54 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:31:54 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 2 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:54 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (3) is90
2013-05-28 14:31:54 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 3 is dead, restarting it.
2013-05-28 14:31:54 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 3 started.
2013-05-28 14:31:54 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:54 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:31:54 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 3 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:54 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (4) is91
2013-05-28 14:31:54 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 4 is dead, restarting it.
2013-05-28 14:31:54 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 4 started.
2013-05-28 14:31:54 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:31:54 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:54 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 4 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:54 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (5) is92
2013-05-28 14:31:54 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 5 is dead, restarting it.
2013-05-28 14:31:54 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 5 started.
2013-05-28 14:31:54 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:54 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:31:54 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 5 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:54 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (6) is93
2013-05-28 14:31:54 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 6 is dead, restarting it.
2013-05-28 14:31:54 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 6 started.
2013-05-28 14:31:54 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:31:54 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:54 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 6 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:59 DEBUG - [com.wsc.crawler.grabber.Grabber]:289 - Thread Count is less than numthreads In Thread :20
2013-05-28 14:31:59 INFO  - [com.wsc.crawler.grabber.Grabber]:333 - Queue is empty in for loop
2013-05-28 14:31:59 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:31:59 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-28 14:31:59 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:31:59 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:31:59 INFO  - [com.wsc.crawler.grabber.Grabber]:108 - Queue is Set in Grabber is 9
2013-05-28 14:31:59 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (0) is95
2013-05-28 14:31:59 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 0 is dead, restarting it.
2013-05-28 14:31:59 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 0 started.
2013-05-28 14:31:59 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:59 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.google.co.in/, http://173.194.38.151/, null]
2013-05-28 14:31:59 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 0 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:59 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (1) is96
2013-05-28 14:31:59 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 1 is dead, restarting it.
2013-05-28 14:31:59 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 1 started.
2013-05-28 14:31:59 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:59 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:31:59 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 1 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:59 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (2) is97
2013-05-28 14:31:59 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 2 is dead, restarting it.
2013-05-28 14:31:59 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 2 started.
2013-05-28 14:31:59 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:59 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:31:59 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 2 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:59 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (3) is98
2013-05-28 14:31:59 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 3 is dead, restarting it.
2013-05-28 14:31:59 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 3 started.
2013-05-28 14:31:59 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:59 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:31:59 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 3 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:59 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (4) is99
2013-05-28 14:31:59 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 4 is dead, restarting it.
2013-05-28 14:31:59 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 4 started.
2013-05-28 14:31:59 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:59 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:31:59 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 4 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:59 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (5) is100
2013-05-28 14:31:59 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 5 is dead, restarting it.
2013-05-28 14:31:59 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 5 started.
2013-05-28 14:31:59 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:59 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:31:59 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 5 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:31:59 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (6) is101
2013-05-28 14:31:59 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 6 is dead, restarting it.
2013-05-28 14:31:59 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 6 started.
2013-05-28 14:31:59 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:31:59 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:31:59 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 6 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:289 - Thread Count is less than numthreads In Thread :20
2013-05-28 14:32:04 INFO  - [com.wsc.crawler.grabber.Grabber]:333 - Queue is empty in for loop
2013-05-28 14:32:04 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:32:04 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-28 14:32:04 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:32:04 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:32:04 INFO  - [com.wsc.crawler.grabber.Grabber]:108 - Queue is Set in Grabber is 9
2013-05-28 14:32:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (0) is103
2013-05-28 14:32:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 0 is dead, restarting it.
2013-05-28 14:32:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 0 started.
2013-05-28 14:32:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:04 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.google.co.in/, http://173.194.38.152/, null]
2013-05-28 14:32:04 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 0 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (1) is104
2013-05-28 14:32:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 1 is dead, restarting it.
2013-05-28 14:32:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 1 started.
2013-05-28 14:32:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:04 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:32:04 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 1 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (2) is105
2013-05-28 14:32:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 2 is dead, restarting it.
2013-05-28 14:32:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 2 started.
2013-05-28 14:32:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:04 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:32:04 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 2 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (3) is106
2013-05-28 14:32:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 3 is dead, restarting it.
2013-05-28 14:32:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 3 started.
2013-05-28 14:32:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:04 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:32:04 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 3 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (4) is107
2013-05-28 14:32:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 4 is dead, restarting it.
2013-05-28 14:32:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 4 started.
2013-05-28 14:32:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:04 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:32:04 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 4 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (5) is108
2013-05-28 14:32:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 5 is dead, restarting it.
2013-05-28 14:32:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 5 started.
2013-05-28 14:32:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:04 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:32:04 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 5 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (6) is109
2013-05-28 14:32:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 6 is dead, restarting it.
2013-05-28 14:32:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 6 started.
2013-05-28 14:32:04 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:04 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:32:04 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 6 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:289 - Thread Count is less than numthreads In Thread :20
2013-05-28 14:32:09 INFO  - [com.wsc.crawler.grabber.Grabber]:333 - Queue is empty in for loop
2013-05-28 14:32:09 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:32:09 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-28 14:32:09 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:32:09 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:32:09 INFO  - [com.wsc.crawler.grabber.Grabber]:108 - Queue is Set in Grabber is 9
2013-05-28 14:32:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (0) is111
2013-05-28 14:32:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 0 is dead, restarting it.
2013-05-28 14:32:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 0 started.
2013-05-28 14:32:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:09 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.google.co.in/, http://173.194.38.151/, null]
2013-05-28 14:32:09 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 0 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (1) is112
2013-05-28 14:32:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 1 is dead, restarting it.
2013-05-28 14:32:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 1 started.
2013-05-28 14:32:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:09 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:32:09 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 1 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (2) is113
2013-05-28 14:32:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 2 is dead, restarting it.
2013-05-28 14:32:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 2 started.
2013-05-28 14:32:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:09 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:32:09 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 2 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (3) is114
2013-05-28 14:32:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 3 is dead, restarting it.
2013-05-28 14:32:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 3 started.
2013-05-28 14:32:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:09 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:32:09 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 3 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (4) is115
2013-05-28 14:32:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 4 is dead, restarting it.
2013-05-28 14:32:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 4 started.
2013-05-28 14:32:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:09 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:32:09 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 4 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (5) is116
2013-05-28 14:32:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 5 is dead, restarting it.
2013-05-28 14:32:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 5 started.
2013-05-28 14:32:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:09 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:32:09 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 5 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (6) is117
2013-05-28 14:32:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 6 is dead, restarting it.
2013-05-28 14:32:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 6 started.
2013-05-28 14:32:09 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:09 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:32:09 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 6 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:14 DEBUG - [com.wsc.crawler.grabber.Grabber]:289 - Thread Count is less than numthreads In Thread :20
2013-05-28 14:32:14 INFO  - [com.wsc.crawler.grabber.Grabber]:333 - Queue is empty in for loop
2013-05-28 14:32:14 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:32:14 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-28 14:32:14 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:32:14 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:32:14 INFO  - [com.wsc.crawler.grabber.Grabber]:108 - Queue is Set in Grabber is 9
2013-05-28 14:32:14 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (0) is119
2013-05-28 14:32:14 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 0 is dead, restarting it.
2013-05-28 14:32:14 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 0 started.
2013-05-28 14:32:14 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.google.co.in/, http://173.194.38.151/, null]
2013-05-28 14:32:14 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:14 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 0 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:14 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (1) is120
2013-05-28 14:32:14 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 1 is dead, restarting it.
2013-05-28 14:32:14 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 1 started.
2013-05-28 14:32:14 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:14 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:32:14 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 1 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:14 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (2) is121
2013-05-28 14:32:14 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 2 is dead, restarting it.
2013-05-28 14:32:14 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 2 started.
2013-05-28 14:32:14 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:32:14 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:14 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 2 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:14 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (3) is122
2013-05-28 14:32:14 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 3 is dead, restarting it.
2013-05-28 14:32:14 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 3 started.
2013-05-28 14:32:14 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:14 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:32:14 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 3 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:14 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (4) is123
2013-05-28 14:32:14 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 4 is dead, restarting it.
2013-05-28 14:32:14 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 4 started.
2013-05-28 14:32:14 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:32:14 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:14 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 4 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:14 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (5) is124
2013-05-28 14:32:14 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 5 is dead, restarting it.
2013-05-28 14:32:14 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 5 started.
2013-05-28 14:32:14 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:14 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:32:14 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 5 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:14 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (6) is125
2013-05-28 14:32:14 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 6 is dead, restarting it.
2013-05-28 14:32:14 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 6 started.
2013-05-28 14:32:14 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:14 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:32:14 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 6 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:289 - Thread Count is less than numthreads In Thread :20
2013-05-28 14:32:22 INFO  - [com.wsc.crawler.grabber.Grabber]:333 - Queue is empty in for loop
2013-05-28 14:32:22 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:32:22 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-28 14:32:22 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:32:22 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:32:22 INFO  - [com.wsc.crawler.grabber.Grabber]:108 - Queue is Set in Grabber is 9
2013-05-28 14:32:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (0) is127
2013-05-28 14:32:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 0 is dead, restarting it.
2013-05-28 14:32:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 0 started.
2013-05-28 14:32:22 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.google.co.in/, http://173.194.38.152/, null]
2013-05-28 14:32:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:22 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 0 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (1) is128
2013-05-28 14:32:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 1 is dead, restarting it.
2013-05-28 14:32:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 1 started.
2013-05-28 14:32:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:22 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:32:22 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 1 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (2) is129
2013-05-28 14:32:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 2 is dead, restarting it.
2013-05-28 14:32:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 2 started.
2013-05-28 14:32:22 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:32:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:22 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 2 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (3) is130
2013-05-28 14:32:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 3 is dead, restarting it.
2013-05-28 14:32:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 3 started.
2013-05-28 14:32:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:22 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:32:22 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 3 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (4) is131
2013-05-28 14:32:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 4 is dead, restarting it.
2013-05-28 14:32:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 4 started.
2013-05-28 14:32:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:22 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:32:22 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 4 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (5) is132
2013-05-28 14:32:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 5 is dead, restarting it.
2013-05-28 14:32:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 5 started.
2013-05-28 14:32:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:22 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:32:22 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 5 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (6) is133
2013-05-28 14:32:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 6 is dead, restarting it.
2013-05-28 14:32:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 6 started.
2013-05-28 14:32:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:22 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:32:22 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 6 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:23 INFO  - [clientserver.AsyncHTTPServer]:306 - Server Received a Stop Request...
2013-05-28 14:32:23 INFO  - [clientserver.AsyncHTTPServer]:307 - Trying to stop crawler...
2013-05-28 14:32:27 DEBUG - [com.wsc.crawler.grabber.Grabber]:289 - Thread Count is less than numthreads In Thread :20
2013-05-28 14:32:27 INFO  - [com.wsc.crawler.grabber.Grabber]:333 - Queue is empty in for loop
2013-05-28 14:32:27 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:32:27 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-28 14:32:27 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:32:27 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:32:27 INFO  - [com.wsc.crawler.grabber.Grabber]:108 - Queue is Set in Grabber is 9
2013-05-28 14:32:27 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (0) is135
2013-05-28 14:32:27 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 0 is dead, restarting it.
2013-05-28 14:32:27 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 0 started.
2013-05-28 14:32:27 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:27 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.google.co.in/, http://173.194.38.159/, null]
2013-05-28 14:32:27 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 0 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:27 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (1) is136
2013-05-28 14:32:27 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 1 is dead, restarting it.
2013-05-28 14:32:27 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 1 started.
2013-05-28 14:32:27 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:27 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:32:27 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 1 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:27 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (2) is137
2013-05-28 14:32:27 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 2 is dead, restarting it.
2013-05-28 14:32:27 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 2 started.
2013-05-28 14:32:27 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:27 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:32:27 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 2 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:27 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (3) is138
2013-05-28 14:32:27 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 3 is dead, restarting it.
2013-05-28 14:32:27 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 3 started.
2013-05-28 14:32:27 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:27 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:32:27 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 3 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:27 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (4) is139
2013-05-28 14:32:27 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 4 is dead, restarting it.
2013-05-28 14:32:27 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 4 started.
2013-05-28 14:32:27 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:32:27 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:27 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 4 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:27 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (5) is140
2013-05-28 14:32:27 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 5 is dead, restarting it.
2013-05-28 14:32:27 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 5 started.
2013-05-28 14:32:27 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:27 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:32:27 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 5 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:27 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (6) is141
2013-05-28 14:32:27 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 6 is dead, restarting it.
2013-05-28 14:32:27 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 6 started.
2013-05-28 14:32:27 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:27 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:32:27 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 6 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:289 - Thread Count is less than numthreads In Thread :20
2013-05-28 14:32:32 INFO  - [com.wsc.crawler.grabber.Grabber]:333 - Queue is empty in for loop
2013-05-28 14:32:32 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:32:32 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-28 14:32:32 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:32:32 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:32:32 INFO  - [com.wsc.crawler.grabber.Grabber]:108 - Queue is Set in Grabber is 9
2013-05-28 14:32:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (0) is143
2013-05-28 14:32:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 0 is dead, restarting it.
2013-05-28 14:32:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 0 started.
2013-05-28 14:32:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:32 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.google.co.in/, http://173.194.38.151/, null]
2013-05-28 14:32:32 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 0 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (1) is144
2013-05-28 14:32:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 1 is dead, restarting it.
2013-05-28 14:32:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 1 started.
2013-05-28 14:32:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:32 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:32:32 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 1 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (2) is145
2013-05-28 14:32:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 2 is dead, restarting it.
2013-05-28 14:32:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 2 started.
2013-05-28 14:32:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:32 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:32:32 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 2 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (3) is146
2013-05-28 14:32:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 3 is dead, restarting it.
2013-05-28 14:32:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 3 started.
2013-05-28 14:32:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:32 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:32:32 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 3 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (4) is147
2013-05-28 14:32:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 4 is dead, restarting it.
2013-05-28 14:32:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 4 started.
2013-05-28 14:32:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:32 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:32:32 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 4 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (5) is148
2013-05-28 14:32:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 5 is dead, restarting it.
2013-05-28 14:32:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 5 started.
2013-05-28 14:32:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:32 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:32:32 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 5 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:32:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (6) is149
2013-05-28 14:32:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 6 is dead, restarting it.
2013-05-28 14:32:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 6 started.
2013-05-28 14:32:32 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:32:32 INFO  - [com.wsc.crawler.grabber.Grabber]:475 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:32:32 WARN  - [com.wsc.crawler.grabber.Grabber]:492 - 6 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:37:18 DEBUG - [com.wsc.crawler.init.Initializer]:80 - Initializing crawler resources...
2013-05-28 14:37:18 DEBUG - [com.wsc.crawler.init.Initializer]:131 - Creating lock file.
2013-05-28 14:37:18 INFO  - [com.wsc.crawler.init.Initializer]:136 - Lockfile crawler.lock created successfully.
2013-05-28 14:37:18 DEBUG - [com.wsc.crawler.init.Initializer]:109 - Another instance of WSCrawler is running.
2013-05-28 14:37:18 WARN  - [com.wsc.crawler.init.Initializer]:110 - Unable to start a new instance of WSCrawler.
2013-05-28 14:37:18 INFO  - [com.wsc.crawler.init.Initializer]:111 - Delete crawler.running file manually.
2013-05-28 14:37:18 DEBUG - [com.wsc.crawler.init.Initializer]:184 - Deleting crawler.running file.
2013-05-28 14:37:18 DEBUG - [com.wsc.crawler.init.Initializer]:186 - crawler.running file is deleted.
2013-05-28 14:37:22 DEBUG - [com.wsc.crawler.init.Initializer]:80 - Initializing crawler resources...
2013-05-28 14:37:22 INFO  - [com.wsc.crawler.init.Initializer]:161 - crawler.running Created Successfully.
2013-05-28 14:37:22 INFO  - [com.wsc.crawler.init.Initializer]:213 - Previous instance of crawler was forcibly stopped.
2013-05-28 14:37:22 INFO  - [com.wsc.crawler.init.Initializer]:214 - Getting URLS from prev_frontier.xml from temp directory.
2013-05-28 14:37:22 DEBUG - [com.wsc.crawler.init.Initializer]:341 - Checking (prev_frontier.xml) existence in (./temp) directory
2013-05-28 14:37:22 WARN  - [com.wsc.crawler.init.Initializer]:350 - prev_frontier.xml is not found in ./temp
2013-05-28 14:37:22 INFO  - [com.wsc.crawler.init.Initializer]:352 - Trying to get URLs from default URL source, Frontier Server.
2013-05-28 14:37:22 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:37:23 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-28 14:37:23 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:37:23 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:37:23 DEBUG - [com.wsc.crawler.init.Initializer]:318 - Frontier Server returned a Frontier of size 9
2013-05-28 14:37:28 INFO  - [com.wsc.crawler.grabber.Grabber]:195 - Number of resolved hosts are :7
2013-05-28 14:37:28 INFO  - [com.wsc.crawler.grabber.Grabber]:197 - Number of Unresolved hosts are :2
2013-05-28 14:37:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:231 - Crawling Bean  ::[http://www.google.co.in/, http://173.194.38.152/, null]
2013-05-28 14:37:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:231 - Crawling Bean  ::[http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:37:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:231 - Crawling Bean  ::[http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:37:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:231 - Crawling Bean  ::[http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:37:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:231 - Crawling Bean  ::[http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:37:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:231 - Crawling Bean  ::[http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:37:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:231 - Crawling Bean  ::[http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:37:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:249 - Number of Threads are : 7
2013-05-28 14:37:28 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://www.google.co.in/, http://173.194.38.152/, null]
2013-05-28 14:37:28 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:37:28 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:37:28 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:37:28 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:37:28 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:37:28 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:37:28 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: PREF][value: ID=40eec2ca2c9eea46:FF=0:TM=1369766248:LM=1369766248:S=WnM-Sse37dHgbZni][domain: .google.co.in][path: /][expiry: Thu May 28 14:37:28 EDT 2015]". Illegal domain attribute "google.co.in". Domain of origin: "173.194.38.152"
2013-05-28 14:37:28 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: NID][value: 67=Hk9Mp6qewgfErMpSyvNpk-81jrnCLTEM3UllMbI95bjBg3LOhaBbViEJyg0_0QYYoG6-BxrO_tSA2TxzPjGCPgUmY3hYMqKLeybv5QvDOYtOcddMCNx4C41XXgW_Z30C][domain: .google.co.in][path: /][expiry: Wed Nov 27 13:37:28 EST 2013]". Illegal domain attribute "google.co.in". Domain of origin: "173.194.38.152"
2013-05-28 14:37:28 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://www.google.co.in/).
2013-05-28 14:37:28 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-28 14:37:28 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-28 14:37:28 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-28 14:37:28 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-28 14:37:28 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.google.co.in/).
2013-05-28 14:37:28 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-28 14:37:28 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.google.co.in/).
2013-05-28 14:37:28 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-28 14:37:28 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-28 14:37:28 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-28 14:37:28 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 14:37:28 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 14:37:28 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 14:37:28 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 14:37:28 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 14:37:28 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <!--
  ~ JBoss, Home of Professional Open Source.
  ~ Copyright (c) 2011, Red Hat, Inc., and individual contributors
  ~ as indicated by the @author tags. See the copyright.txt file in the
  ~ distribution for a full listing of individual contributors.
  ~
  ~ This is free software; you can redistribute it and/or modify it
  ~ under the terms of the GNU Lesser General Public License as
  ~ published by the Free Software Foundation; either version 2.1 of
  ~ the License, or (at your option) any later version.
  ~
  ~ This software is distributed in the hope that it will be useful,
  ~ but WITHOUT ANY WARRANTY; without even the implied warranty of
  ~ MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
  ~ Lesser General Public License for more details.
  ~
  ~ You should have received a copy of the GNU Lesser General Public
  ~ License along with this software; if not, write to the Free
  ~ Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
  ~ 02110-1301 USA, or see the FSF site: http://www.fsf.org.
  -->
<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">

<html>
<head>
  <title>Welcome to JBoss Application Server 7</title>
  <link rel="shortcut icon" href="favicon.ico" type="image/x-icon">
  <link rel="StyleSheet" href="as7_style.css" type="text/css">
</head>

<body>
  <div class="wrapper">
    <div class="as7">
      <img src="as7_logo.png" alt="JBoss Application Server 7... it's here." width="195" height="228" border="0">
    </div>

    <div class="content">
      <h1>Welcome to AS 7</h1>

      <h3>Your JBoss Application Server 7 is running.</h3>

      <p><a href="documentation.html">Documentation</a> | <a href="https://docs.jboss.org/author/display/AS71/Quickstarts">Quickstarts</a> | <a href="/console">Administration Console</a> <br/>

      <a href="http://www.jboss.org/jbossas"><br>
      JBoss AS Project</a> | <a href="http://community.jboss.org/en/jbossas/as7_users?view=all">User
      Forum</a> | <a href=
      "https://issues.jboss.org/browse/AS7">Report an issue</a></p>

      <p class="logos"><a href="http://jboss.org"><img src="jboss_community.png" alt="JBoss and JBoss Community" width=
      "254" height="31" border="0"></a></p>

      <p class="note">To replace this page set "enable-welcome-root" to false in your server configuration and deploy
      your own war with / as its context path.</p>
    </div>
  </div>
</body>
</html>

2013-05-28 14:37:28 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-28 14:37:29 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://effectivemysql.com/).
2013-05-28 14:37:29 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-28 14:37:29 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-28 14:37:29 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-28 14:37:29 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-28 14:37:29 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://effectivemysql.com/).
2013-05-28 14:37:29 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-28 14:37:29 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://effectivemysql.com/).
2013-05-28 14:37:29 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-28 14:37:29 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 102
2013-05-28 14:37:29 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 87
2013-05-28 14:37:29 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 14:37:29 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 14:37:29 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 14:37:29 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 14:37:29 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 14:37:29 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-28 14:37:29 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: SESS217349ac89be5468506443781e6d5f6c][value: bd7a5e3d11724e199ed8290b28aaf33d][domain: .palominodb.com][path: /][expiry: Thu Jun 20 18:10:53 EDT 2013]". Illegal domain attribute "palominodb.com". Domain of origin: "173.236.53.234"
2013-05-28 14:37:29 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://www.continuent.com/).
2013-05-28 14:37:29 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-28 14:37:29 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-28 14:37:29 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-28 14:37:29 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-28 14:37:29 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.continuent.com/).
2013-05-28 14:37:29 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-28 14:37:29 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.continuent.com/).
2013-05-28 14:37:29 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-28 14:37:29 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 45
2013-05-28 14:37:29 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 33
2013-05-28 14:37:29 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 14:37:29 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 14:37:29 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 14:37:29 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 14:37:29 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 14:37:29 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-28 14:37:30 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://www.pythian.com/).
2013-05-28 14:37:30 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-28 14:37:30 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-28 14:37:30 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-28 14:37:30 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-28 14:37:30 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-28 14:37:30 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-28 14:37:30 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-28 14:37:30 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-28 14:37:30 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.pythian.com/).
2013-05-28 14:37:30 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.pythian.com/) is 0
2013-05-28 14:37:30 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 132
2013-05-28 14:37:30 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 90
2013-05-28 14:37:30 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 14:37:30 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 14:37:30 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 14:37:30 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 14:37:30 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 14:37:30 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-28 14:37:31 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://palominodb.com/).
2013-05-28 14:37:31 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-28 14:37:31 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 129
2013-05-28 14:37:31 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-28 14:37:31 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 129
2013-05-28 14:37:31 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://palominodb.com/).
2013-05-28 14:37:31 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-28 14:37:31 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://palominodb.com/).
2013-05-28 14:37:31 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-28 14:37:31 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 129
2013-05-28 14:37:31 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 92
2013-05-28 14:37:31 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 14:37:31 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 14:37:31 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 14:37:31 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 14:37:31 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 14:37:31 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-28 14:37:31 DEBUG - [com.wsc.Crawler.WSCrawler]:108 - Cleaning Crawler resources.
2013-05-28 14:37:31 DEBUG - [com.wsc.Crawler.WSCrawler]:116 - Deleting Lock file
2013-05-28 14:37:31 DEBUG - [com.wsc.Crawler.WSCrawler]:118 - Lock file Deleted
2013-05-28 14:37:36 INFO  - [com.wsc.crawler.grabber.Grabber]:353 - Queue is Empty...!
2013-05-28 14:37:36 INFO  - [com.wsc.crawler.grabber.Grabber]:354 - Refilling Queue...!
2013-05-28 14:37:36 DEBUG - [com.wsc.crawler.grabber.Grabber]:357 - IsStopped =false
2013-05-28 14:37:36 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:37:36 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-28 14:37:36 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:37:36 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:37:36 INFO  - [com.wsc.crawler.grabber.Grabber]:108 - Queue is Set in Grabber is 9
2013-05-28 14:37:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:289 - Thread Count is less than numthreads In Thread :20
2013-05-28 14:37:46 INFO  - [com.wsc.crawler.grabber.Grabber]:333 - Queue is empty in for loop
2013-05-28 14:37:46 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:37:46 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-28 14:37:46 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:37:46 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:37:46 INFO  - [com.wsc.crawler.grabber.Grabber]:108 - Queue is Set in Grabber is 9
2013-05-28 14:37:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (0) is13
2013-05-28 14:37:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 0 is dead, restarting it.
2013-05-28 14:37:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 0 started.
2013-05-28 14:37:46 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://www.google.co.in/, http://173.194.38.152/, null]
2013-05-28 14:37:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:37:46 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 0 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:37:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (1) is14
2013-05-28 14:37:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 1 is dead, restarting it.
2013-05-28 14:37:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 1 started.
2013-05-28 14:37:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:37:46 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:37:46 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 1 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:37:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (2) is15
2013-05-28 14:37:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 2 is dead, restarting it.
2013-05-28 14:37:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 2 started.
2013-05-28 14:37:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:37:46 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:37:46 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 2 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:37:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (3) is16
2013-05-28 14:37:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 3 is dead, restarting it.
2013-05-28 14:37:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 3 started.
2013-05-28 14:37:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:37:46 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:37:46 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 3 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:37:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (4) is17
2013-05-28 14:37:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 4 is dead, restarting it.
2013-05-28 14:37:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 4 started.
2013-05-28 14:37:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:37:46 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:37:46 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 4 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:37:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (5) is18
2013-05-28 14:37:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 5 is dead, restarting it.
2013-05-28 14:37:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 5 started.
2013-05-28 14:37:46 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:37:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:37:46 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 5 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:37:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (6) is19
2013-05-28 14:37:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 6 is dead, restarting it.
2013-05-28 14:37:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 6 started.
2013-05-28 14:37:46 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:37:46 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:37:46 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 6 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:37:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:289 - Thread Count is less than numthreads In Thread :20
2013-05-28 14:37:51 INFO  - [com.wsc.crawler.grabber.Grabber]:333 - Queue is empty in for loop
2013-05-28 14:37:51 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:37:51 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-28 14:37:51 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:37:51 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:37:51 INFO  - [com.wsc.crawler.grabber.Grabber]:108 - Queue is Set in Grabber is 9
2013-05-28 14:37:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (0) is23
2013-05-28 14:37:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 0 is dead, restarting it.
2013-05-28 14:37:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 0 started.
2013-05-28 14:37:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:37:51 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://www.google.co.in/, http://173.194.38.159/, null]
2013-05-28 14:37:51 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 0 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:37:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (1) is24
2013-05-28 14:37:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 1 is dead, restarting it.
2013-05-28 14:37:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 1 started.
2013-05-28 14:37:51 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:37:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:37:51 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 1 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:37:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (2) is25
2013-05-28 14:37:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 2 is dead, restarting it.
2013-05-28 14:37:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 2 started.
2013-05-28 14:37:51 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:37:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:37:51 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 2 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:37:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (3) is26
2013-05-28 14:37:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 3 is dead, restarting it.
2013-05-28 14:37:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 3 started.
2013-05-28 14:37:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:37:51 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:37:51 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 3 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:37:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (4) is27
2013-05-28 14:37:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 4 is dead, restarting it.
2013-05-28 14:37:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 4 started.
2013-05-28 14:37:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:37:51 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:37:51 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 4 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:37:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (5) is28
2013-05-28 14:37:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 5 is dead, restarting it.
2013-05-28 14:37:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 5 started.
2013-05-28 14:37:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:37:51 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:37:51 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 5 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:37:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (6) is29
2013-05-28 14:37:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 6 is dead, restarting it.
2013-05-28 14:37:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 6 started.
2013-05-28 14:37:51 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:37:51 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:37:51 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 6 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:37:54 INFO  - [clientserver.AsyncHTTPServer]:306 - Server Recieved a Stop Request...
2013-05-28 14:37:54 INFO  - [clientserver.AsyncHTTPServer]:307 - Trying to stop crawler...
2013-05-28 14:37:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:289 - Thread Count is less than numthreads In Thread :20
2013-05-28 14:37:56 INFO  - [com.wsc.crawler.grabber.Grabber]:333 - Queue is empty in for loop
2013-05-28 14:37:56 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:37:56 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-28 14:37:56 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:37:56 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:37:56 INFO  - [com.wsc.crawler.grabber.Grabber]:108 - Queue is Set in Grabber is 9
2013-05-28 14:37:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (0) is31
2013-05-28 14:37:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 0 is dead, restarting it.
2013-05-28 14:37:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 0 started.
2013-05-28 14:37:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:37:56 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://www.google.co.in/, http://173.194.38.151/, null]
2013-05-28 14:37:56 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 0 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:37:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (1) is32
2013-05-28 14:37:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 1 is dead, restarting it.
2013-05-28 14:37:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 1 started.
2013-05-28 14:37:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:37:56 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:37:56 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 1 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:37:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (2) is33
2013-05-28 14:37:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 2 is dead, restarting it.
2013-05-28 14:37:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 2 started.
2013-05-28 14:37:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:37:56 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:37:56 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 2 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:37:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (3) is34
2013-05-28 14:37:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 3 is dead, restarting it.
2013-05-28 14:37:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 3 started.
2013-05-28 14:37:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:37:56 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:37:56 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 3 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:37:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (4) is35
2013-05-28 14:37:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 4 is dead, restarting it.
2013-05-28 14:37:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 4 started.
2013-05-28 14:37:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:37:56 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:37:56 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 4 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:37:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (5) is36
2013-05-28 14:37:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 5 is dead, restarting it.
2013-05-28 14:37:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 5 started.
2013-05-28 14:37:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:37:56 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:37:56 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 5 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:37:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (6) is37
2013-05-28 14:37:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 6 is dead, restarting it.
2013-05-28 14:37:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 6 started.
2013-05-28 14:37:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:37:56 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:37:56 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 6 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:38:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:289 - Thread Count is less than numthreads In Thread :20
2013-05-28 14:38:01 INFO  - [com.wsc.crawler.grabber.Grabber]:333 - Queue is empty in for loop
2013-05-28 14:38:01 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:38:01 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-28 14:38:01 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:38:01 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:38:01 INFO  - [com.wsc.crawler.grabber.Grabber]:108 - Queue is Set in Grabber is 9
2013-05-28 14:38:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (0) is39
2013-05-28 14:38:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 0 is dead, restarting it.
2013-05-28 14:38:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 0 started.
2013-05-28 14:38:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:38:01 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://www.google.co.in/, http://173.194.38.152/, null]
2013-05-28 14:38:01 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 0 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:38:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (1) is40
2013-05-28 14:38:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 1 is dead, restarting it.
2013-05-28 14:38:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 1 started.
2013-05-28 14:38:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:38:01 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:38:01 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 1 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:38:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (2) is41
2013-05-28 14:38:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 2 is dead, restarting it.
2013-05-28 14:38:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 2 started.
2013-05-28 14:38:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:38:01 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:38:01 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 2 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:38:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (3) is42
2013-05-28 14:38:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 3 is dead, restarting it.
2013-05-28 14:38:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 3 started.
2013-05-28 14:38:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:38:01 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:38:01 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 3 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:38:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (4) is43
2013-05-28 14:38:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 4 is dead, restarting it.
2013-05-28 14:38:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 4 started.
2013-05-28 14:38:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:38:01 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:38:01 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 4 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:38:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (5) is44
2013-05-28 14:38:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 5 is dead, restarting it.
2013-05-28 14:38:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 5 started.
2013-05-28 14:38:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:38:01 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:38:01 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 5 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:38:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (6) is45
2013-05-28 14:38:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 6 is dead, restarting it.
2013-05-28 14:38:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 6 started.
2013-05-28 14:38:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:38:01 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:38:01 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 6 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:38:02 INFO  - [clientserver.AsyncHTTPServer]:306 - Server Recieved a Stop Request...
2013-05-28 14:38:02 INFO  - [clientserver.AsyncHTTPServer]:307 - Trying to stop crawler...
2013-05-28 14:38:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:289 - Thread Count is less than numthreads In Thread :20
2013-05-28 14:38:06 INFO  - [com.wsc.crawler.grabber.Grabber]:333 - Queue is empty in for loop
2013-05-28 14:38:06 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:38:06 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-28 14:38:06 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:38:06 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:38:06 INFO  - [com.wsc.crawler.grabber.Grabber]:108 - Queue is Set in Grabber is 9
2013-05-28 14:38:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (0) is47
2013-05-28 14:38:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 0 is dead, restarting it.
2013-05-28 14:38:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 0 started.
2013-05-28 14:38:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:38:06 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://www.google.co.in/, http://173.194.38.159/, null]
2013-05-28 14:38:06 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 0 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:38:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (1) is48
2013-05-28 14:38:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 1 is dead, restarting it.
2013-05-28 14:38:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 1 started.
2013-05-28 14:38:06 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:38:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:38:06 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 1 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:38:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (2) is49
2013-05-28 14:38:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 2 is dead, restarting it.
2013-05-28 14:38:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 2 started.
2013-05-28 14:38:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:38:06 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:38:06 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 2 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:38:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (3) is50
2013-05-28 14:38:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 3 is dead, restarting it.
2013-05-28 14:38:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 3 started.
2013-05-28 14:38:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:38:06 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:38:06 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 3 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:38:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (4) is51
2013-05-28 14:38:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 4 is dead, restarting it.
2013-05-28 14:38:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 4 started.
2013-05-28 14:38:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:38:06 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:38:06 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 4 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:38:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (5) is52
2013-05-28 14:38:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 5 is dead, restarting it.
2013-05-28 14:38:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 5 started.
2013-05-28 14:38:06 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:38:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:38:06 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 5 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:38:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (6) is53
2013-05-28 14:38:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 6 is dead, restarting it.
2013-05-28 14:38:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 6 started.
2013-05-28 14:38:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:38:06 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:38:06 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 6 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:38:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:289 - Thread Count is less than numthreads In Thread :20
2013-05-28 14:38:11 INFO  - [com.wsc.crawler.grabber.Grabber]:333 - Queue is empty in for loop
2013-05-28 14:38:11 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:38:11 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-28 14:38:11 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:38:11 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:38:11 INFO  - [com.wsc.crawler.grabber.Grabber]:108 - Queue is Set in Grabber is 9
2013-05-28 14:38:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (0) is55
2013-05-28 14:38:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 0 is dead, restarting it.
2013-05-28 14:38:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 0 started.
2013-05-28 14:38:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:38:11 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://www.google.co.in/, http://173.194.38.151/, null]
2013-05-28 14:38:11 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 0 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:38:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (1) is56
2013-05-28 14:38:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 1 is dead, restarting it.
2013-05-28 14:38:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 1 started.
2013-05-28 14:38:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is :7
2013-05-28 14:38:11 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:38:11 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 1 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:38:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (2) is 57
2013-05-28 14:38:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 2 is dead, restarting it.
2013-05-28 14:38:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 2 started.
2013-05-28 14:38:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is: 7
2013-05-28 14:38:11 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:38:11 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 2 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:38:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (3) is 58
2013-05-28 14:38:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 3 is dead, restarting it.
2013-05-28 14:38:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 3 started.
2013-05-28 14:38:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is: 7
2013-05-28 14:38:11 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:38:11 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 3 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:38:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (4) is 59
2013-05-28 14:38:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 4 is dead, restarting it.
2013-05-28 14:38:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 4 started.
2013-05-28 14:38:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is: 7
2013-05-28 14:38:11 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:38:11 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 4 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:38:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (5) is 60
2013-05-28 14:38:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 5 is dead, restarting it.
2013-05-28 14:38:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 5 started.
2013-05-28 14:38:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is: 7
2013-05-28 14:38:11 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:38:11 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 5 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:38:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (6) is 61
2013-05-28 14:38:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 6 is dead, restarting it.
2013-05-28 14:38:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 6 started.
2013-05-28 14:38:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is: 7
2013-05-28 14:38:11 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:38:11 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 6 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:38:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:289 - Thread count is less than numthreads in thread: 20
2013-05-28 14:38:22 INFO  - [com.wsc.crawler.grabber.Grabber]:333 - Queue is empty in for loop
2013-05-28 14:38:22 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:38:22 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-28 14:38:22 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:38:22 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:38:22 INFO  - [com.wsc.crawler.grabber.Grabber]:108 - Queue set in Grabber has size 9
2013-05-28 14:38:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (0) is 63
2013-05-28 14:38:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 0 is dead, restarting it.
2013-05-28 14:38:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 0 started.
2013-05-28 14:38:22 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://www.google.co.in/, http://173.194.38.152/, null]
2013-05-28 14:38:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is: 7
2013-05-28 14:38:22 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 0 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:38:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (1) is 64
2013-05-28 14:38:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 1 is dead, restarting it.
2013-05-28 14:38:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 1 started.
2013-05-28 14:38:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is: 7
2013-05-28 14:38:22 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:38:22 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 1 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:38:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (2) is 65
2013-05-28 14:38:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 2 is dead, restarting it.
2013-05-28 14:38:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 2 started.
2013-05-28 14:38:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is: 7
2013-05-28 14:38:22 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:38:22 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 2 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:38:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (3) is 66
2013-05-28 14:38:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 3 is dead, restarting it.
2013-05-28 14:38:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 3 started.
2013-05-28 14:38:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is: 7
2013-05-28 14:38:22 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:38:22 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 3 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:38:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (4) is 67
2013-05-28 14:38:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 4 is dead, restarting it.
2013-05-28 14:38:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 4 started.
2013-05-28 14:38:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is: 7
2013-05-28 14:38:22 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:38:22 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 4 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:38:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (5) is 68
2013-05-28 14:38:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 5 is dead, restarting it.
2013-05-28 14:38:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 5 started.
2013-05-28 14:38:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is: 7
2013-05-28 14:38:22 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:38:22 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 5 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:38:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:296 - Checking thread at (6) is 69
2013-05-28 14:38:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:299 - Thread 6 is dead, restarting it.
2013-05-28 14:38:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:320 - Thread 6 started.
2013-05-28 14:38:22 INFO  - [com.wsc.crawler.grabber.Grabber]:478 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:38:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:321 - Thread list size is: 7
2013-05-28 14:38:22 WARN  - [com.wsc.crawler.grabber.Grabber]:495 - 6 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:40:08 DEBUG - [com.wsc.crawler.init.Initializer]:80 - Initializing crawler resources...
2013-05-28 14:40:08 DEBUG - [com.wsc.crawler.init.Initializer]:131 - Creating lock file.
2013-05-28 14:40:08 INFO  - [com.wsc.crawler.init.Initializer]:136 - Lockfile crawler.lock created successfully.
2013-05-28 14:40:08 DEBUG - [com.wsc.crawler.init.Initializer]:109 - Another instance of WSCrawler is running.
2013-05-28 14:40:08 WARN  - [com.wsc.crawler.init.Initializer]:110 - Unable to start a new instance of WSCrawler.
2013-05-28 14:40:08 INFO  - [com.wsc.crawler.init.Initializer]:111 - Delete the crawler.running file manually.
2013-05-28 14:40:08 DEBUG - [com.wsc.crawler.init.Initializer]:184 - Deleting crawler.running file.
2013-05-28 14:40:08 DEBUG - [com.wsc.crawler.init.Initializer]:186 - crawler.running file is deleted.
2013-05-28 14:40:12 DEBUG - [com.wsc.crawler.init.Initializer]:80 - Initializing crawler resources...
2013-05-28 14:40:12 INFO  - [com.wsc.crawler.init.Initializer]:161 - crawler.running created successfully.
2013-05-28 14:40:12 INFO  - [com.wsc.crawler.init.Initializer]:213 - Previous instance of crawler was forcibly stopped.
2013-05-28 14:40:12 INFO  - [com.wsc.crawler.init.Initializer]:214 - Getting URLs from prev_frontier.xml in the temp directory.
2013-05-28 14:40:12 DEBUG - [com.wsc.crawler.init.Initializer]:341 - Checking (prev_frontier.xml) existence in (./temp) directory
2013-05-28 14:40:12 WARN  - [com.wsc.crawler.init.Initializer]:350 - prev_frontier.xml is not found in ./temp
2013-05-28 14:40:12 INFO  - [com.wsc.crawler.init.Initializer]:352 - Trying to get URLs from the default URL source, the Frontier Server
2013-05-28 14:40:12 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:40:12 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-28 14:40:12 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:40:12 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:40:12 DEBUG - [com.wsc.crawler.init.Initializer]:318 - Frontier Server returned a Frontier of size 9
2013-05-28 14:40:13 INFO  - [com.wsc.crawler.grabber.Grabber]:201 - Number of resolved hosts is: 7
2013-05-28 14:40:13 INFO  - [com.wsc.crawler.grabber.Grabber]:203 - Number of unresolved hosts is: 2
2013-05-28 14:40:13 DEBUG - [com.wsc.crawler.grabber.Grabber]:237 - Crawling Bean  ::[http://www.google.co.in/, http://173.194.38.152/, null]
2013-05-28 14:40:13 DEBUG - [com.wsc.crawler.grabber.Grabber]:237 - Crawling Bean  ::[http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:40:13 DEBUG - [com.wsc.crawler.grabber.Grabber]:237 - Crawling Bean  ::[http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:40:13 DEBUG - [com.wsc.crawler.grabber.Grabber]:237 - Crawling Bean  ::[http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:40:13 DEBUG - [com.wsc.crawler.grabber.Grabber]:237 - Crawling Bean  ::[http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:40:13 DEBUG - [com.wsc.crawler.grabber.Grabber]:237 - Crawling Bean  ::[http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:40:13 DEBUG - [com.wsc.crawler.grabber.Grabber]:237 - Crawling Bean  ::[http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:40:13 DEBUG - [com.wsc.crawler.grabber.Grabber]:255 - Number of Threads are : 7
2013-05-28 14:40:13 INFO  - [com.wsc.crawler.grabber.Grabber]:480 -  - about to get something from [http://www.google.co.in/, http://173.194.38.152/, null]
2013-05-28 14:40:13 INFO  - [com.wsc.crawler.grabber.Grabber]:480 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:40:13 INFO  - [com.wsc.crawler.grabber.Grabber]:480 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:40:13 INFO  - [com.wsc.crawler.grabber.Grabber]:480 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:40:13 INFO  - [com.wsc.crawler.grabber.Grabber]:480 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:40:13 INFO  - [com.wsc.crawler.grabber.Grabber]:480 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:40:13 INFO  - [com.wsc.crawler.grabber.Grabber]:480 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:40:13 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: PREF][value: ID=266a2b02ae8583f0:FF=0:TM=1369766413:LM=1369766413:S=nu6N6DStb5vkee4O][domain: .google.co.in][path: /][expiry: Thu May 28 14:40:13 EDT 2015]". Illegal domain attribute "google.co.in". Domain of origin: "173.194.38.152"
2013-05-28 14:40:13 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: NID][value: 67=dldJZNTv-uHFAYtptAjIx_p13xFAMtPsgsk-LkLmbo3knU-9C951Nls8RnySHg0uOwhqlqVl3K1dZwAeWDF_XBctOFlKIaxOXImkNU7lFAmEIgLmt0YAu4QjGeygoDFk][domain: .google.co.in][path: /][expiry: Wed Nov 27 13:40:13 EST 2013]". Illegal domain attribute "google.co.in". Domain of origin: "173.194.38.152"
2013-05-28 14:40:13 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://www.google.co.in/).
2013-05-28 14:40:13 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-28 14:40:13 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-28 14:40:13 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-28 14:40:13 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-28 14:40:13 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.google.co.in/).
2013-05-28 14:40:13 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-28 14:40:13 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.google.co.in/).
2013-05-28 14:40:13 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed frame source link size in (http://www.google.co.in/) is 0
2013-05-28 14:40:13 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-28 14:40:13 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-28 14:40:13 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing XML file to post to frontier
2013-05-28 14:40:13 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 14:40:13 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 14:40:13 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from HTML to the Frontier Server is: http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 14:40:13 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 14:40:13 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <!--
  ~ JBoss, Home of Professional Open Source.
  ~ Copyright (c) 2011, Red Hat, Inc., and individual contributors
  ~ as indicated by the @author tags. See the copyright.txt file in the
  ~ distribution for a full listing of individual contributors.
  ~
  ~ This is free software; you can redistribute it and/or modify it
  ~ under the terms of the GNU Lesser General Public License as
  ~ published by the Free Software Foundation; either version 2.1 of
  ~ the License, or (at your option) any later version.
  ~
  ~ This software is distributed in the hope that it will be useful,
  ~ but WITHOUT ANY WARRANTY; without even the implied warranty of
  ~ MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
  ~ Lesser General Public License for more details.
  ~
  ~ You should have received a copy of the GNU Lesser General Public
  ~ License along with this software; if not, write to the Free
  ~ Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
  ~ 02110-1301 USA, or see the FSF site: http://www.fsf.org.
  -->
<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">

<html>
<head>
  <title>Welcome to JBoss Application Server 7</title>
  <link rel="shortcut icon" href="favicon.ico" type="image/x-icon">
  <link rel="StyleSheet" href="as7_style.css" type="text/css">
</head>

<body>
  <div class="wrapper">
    <div class="as7">
      <img src="as7_logo.png" alt="JBoss Application Server 7... it's here." width="195" height="228" border="0">
    </div>

    <div class="content">
      <h1>Welcome to AS 7</h1>

      <h3>Your JBoss Application Server 7 is running.</h3>

      <p><a href="documentation.html">Documentation</a> | <a href="https://docs.jboss.org/author/display/AS71/Quickstarts">Quickstarts</a> | <a href="/console">Administration Console</a> <br/>

      <a href="http://www.jboss.org/jbossas"><br>
      JBoss AS Project</a> | <a href="http://community.jboss.org/en/jbossas/as7_users?view=all">User
      Forum</a> | <a href=
      "https://issues.jboss.org/browse/AS7">Report an issue</a></p>

      <p class="logos"><a href="http://jboss.org"><img src="jboss_community.png" alt="JBoss and JBoss Community" width=
      "254" height="31" border="0"></a></p>

      <p class="note">To replace this page set "enable-welcome-root" to false in your server configuration and deploy
      your own war with / as its context path.</p>
    </div>
  </div>
</body>
</html>

2013-05-28 14:40:13 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing POST connection after posting URLs
2013-05-28 14:40:14 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: SESS217349ac89be5468506443781e6d5f6c][value: c25333a02c17d51836bb124ee744f849][domain: .palominodb.com][path: /][expiry: Thu Jun 20 18:13:38 EDT 2013]". Illegal domain attribute "palominodb.com". Domain of origin: "173.236.53.234"
2013-05-28 14:40:14 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://effectivemysql.com/).
2013-05-28 14:40:14 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-28 14:40:14 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-28 14:40:14 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-28 14:40:14 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-28 14:40:14 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://effectivemysql.com/).
2013-05-28 14:40:14 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-28 14:40:14 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://effectivemysql.com/).
2013-05-28 14:40:14 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed frame source link size in (http://effectivemysql.com/) is 0
2013-05-28 14:40:14 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 102
2013-05-28 14:40:14 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 87
2013-05-28 14:40:14 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing XML file to post to frontier
2013-05-28 14:40:14 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 14:40:14 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 14:40:14 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from HTML to the Frontier Server is: http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 14:40:14 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 14:40:14 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is the JBoss AS 7 welcome page (identical to the response body logged above)

2013-05-28 14:40:14 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing POST connection after posting URLs
2013-05-28 14:40:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://www.pythian.com/).
2013-05-28 14:40:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-28 14:40:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-28 14:40:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-28 14:40:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-28 14:40:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-28 14:40:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-28 14:40:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-28 14:40:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-28 14:40:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.pythian.com/).
2013-05-28 14:40:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed frame source link size in (http://www.pythian.com/) is 0
2013-05-28 14:40:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 132
2013-05-28 14:40:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 90
2013-05-28 14:40:15 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing XML file to post to frontier
2013-05-28 14:40:15 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 14:40:15 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 14:40:15 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from HTML to the Frontier Server is: http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 14:40:15 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 14:40:15 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is the JBoss AS 7 welcome page (identical to the response body logged above)

2013-05-28 14:40:15 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing POST connection after posting URLs
2013-05-28 14:40:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://palominodb.com/).
2013-05-28 14:40:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-28 14:40:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 129
2013-05-28 14:40:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-28 14:40:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 129
2013-05-28 14:40:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://palominodb.com/).
2013-05-28 14:40:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-28 14:40:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://palominodb.com/).
2013-05-28 14:40:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed frame source link size in (http://palominodb.com/) is 0
2013-05-28 14:40:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 129
2013-05-28 14:40:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 92
2013-05-28 14:40:15 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing XML file to post to frontier
2013-05-28 14:40:15 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 14:40:15 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 14:40:15 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from HTML to the Frontier Server is: http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 14:40:15 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 14:40:15 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is the JBoss AS 7 welcome page (identical to the response body logged above)

2013-05-28 14:40:15 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-28 14:40:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.continuent.com/).
2013-05-28 14:40:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-28 14:40:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-28 14:40:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-28 14:40:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-28 14:40:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.continuent.com/).
2013-05-28 14:40:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-28 14:40:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.continuent.com/).
2013-05-28 14:40:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-28 14:40:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 45
2013-05-28 14:40:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 33
2013-05-28 14:40:15 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 14:40:15 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 14:40:15 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 14:40:15 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 14:40:15 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 14:40:15 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is [JBoss AS 7 welcome page HTML, identical to the first response body above]
2013-05-28 14:40:15 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-28 14:40:15 DEBUG - [com.wsc.Crawler.WSCrawler]:108 - Cleaning Crawler resources.
2013-05-28 14:40:15 DEBUG - [com.wsc.Crawler.WSCrawler]:116 - Deleting Lock file
2013-05-28 14:40:15 DEBUG - [com.wsc.Crawler.WSCrawler]:118 - Lock file Deleted
2013-05-28 14:40:20 INFO  - [com.wsc.crawler.grabber.Grabber]:359 - Queue is Empty...!
2013-05-28 14:40:20 INFO  - [com.wsc.crawler.grabber.Grabber]:360 - Refilling Queue...!
2013-05-28 14:40:20 DEBUG - [com.wsc.crawler.grabber.Grabber]:363 - IsStopped =false
2013-05-28 14:40:20 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:40:20 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-28 14:40:20 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:40:20 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:40:20 INFO  - [com.wsc.crawler.grabber.Grabber]:110 - Queue set in Grabber; size is 9
2013-05-28 14:40:30 DEBUG - [com.wsc.crawler.grabber.Grabber]:295 - Thread Count is less than numthreads In Thread :20
2013-05-28 14:40:30 INFO  - [com.wsc.crawler.grabber.Grabber]:339 - Queue is empty in for loop
2013-05-28 14:40:30 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:40:30 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-28 14:40:30 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:40:30 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:40:30 INFO  - [com.wsc.crawler.grabber.Grabber]:110 - Queue set in Grabber; size is 9
2013-05-28 14:40:30 DEBUG - [com.wsc.crawler.grabber.Grabber]:302 - Checking thread at (0) is13
2013-05-28 14:40:30 DEBUG - [com.wsc.crawler.grabber.Grabber]:305 - Thread 0 is dead, restarting it.
2013-05-28 14:40:30 DEBUG - [com.wsc.crawler.grabber.Grabber]:326 - Thread 0 started.
2013-05-28 14:40:30 INFO  - [com.wsc.crawler.grabber.Grabber]:480 -  - about to get something from [http://www.google.co.in/, http://173.194.38.152/, null]
2013-05-28 14:40:30 DEBUG - [com.wsc.crawler.grabber.Grabber]:327 - Thread list size is :7
2013-05-28 14:40:30 WARN  - [com.wsc.crawler.grabber.Grabber]:497 - 0 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:40:30 DEBUG - [com.wsc.crawler.grabber.Grabber]:302 - Checking thread at (1) is14
2013-05-28 14:40:30 DEBUG - [com.wsc.crawler.grabber.Grabber]:305 - Thread 1 is dead, restarting it.
2013-05-28 14:40:30 DEBUG - [com.wsc.crawler.grabber.Grabber]:326 - Thread 1 started.
2013-05-28 14:40:30 DEBUG - [com.wsc.crawler.grabber.Grabber]:327 - Thread list size is :7
2013-05-28 14:40:30 INFO  - [com.wsc.crawler.grabber.Grabber]:480 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:40:30 WARN  - [com.wsc.crawler.grabber.Grabber]:497 - 1 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:40:30 DEBUG - [com.wsc.crawler.grabber.Grabber]:302 - Checking thread at (2) is15
2013-05-28 14:40:30 DEBUG - [com.wsc.crawler.grabber.Grabber]:305 - Thread 2 is dead, restarting it.
2013-05-28 14:40:30 DEBUG - [com.wsc.crawler.grabber.Grabber]:326 - Thread 2 started.
2013-05-28 14:40:30 DEBUG - [com.wsc.crawler.grabber.Grabber]:327 - Thread list size is :7
2013-05-28 14:40:30 INFO  - [com.wsc.crawler.grabber.Grabber]:480 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:40:30 WARN  - [com.wsc.crawler.grabber.Grabber]:497 - 2 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:40:30 DEBUG - [com.wsc.crawler.grabber.Grabber]:302 - Checking thread at (3) is16
2013-05-28 14:40:30 DEBUG - [com.wsc.crawler.grabber.Grabber]:305 - Thread 3 is dead, restarting it.
2013-05-28 14:40:30 DEBUG - [com.wsc.crawler.grabber.Grabber]:326 - Thread 3 started.
2013-05-28 14:40:30 DEBUG - [com.wsc.crawler.grabber.Grabber]:327 - Thread list size is :7
2013-05-28 14:40:30 INFO  - [com.wsc.crawler.grabber.Grabber]:480 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:40:30 WARN  - [com.wsc.crawler.grabber.Grabber]:497 - 3 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:40:30 DEBUG - [com.wsc.crawler.grabber.Grabber]:302 - Checking thread at (4) is17
2013-05-28 14:40:30 DEBUG - [com.wsc.crawler.grabber.Grabber]:305 - Thread 4 is dead, restarting it.
2013-05-28 14:40:30 DEBUG - [com.wsc.crawler.grabber.Grabber]:326 - Thread 4 started.
2013-05-28 14:40:30 DEBUG - [com.wsc.crawler.grabber.Grabber]:327 - Thread list size is :7
2013-05-28 14:40:30 INFO  - [com.wsc.crawler.grabber.Grabber]:480 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:40:30 WARN  - [com.wsc.crawler.grabber.Grabber]:497 - 4 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:40:30 DEBUG - [com.wsc.crawler.grabber.Grabber]:302 - Checking thread at (5) is18
2013-05-28 14:40:30 DEBUG - [com.wsc.crawler.grabber.Grabber]:305 - Thread 5 is dead, restarting it.
2013-05-28 14:40:30 DEBUG - [com.wsc.crawler.grabber.Grabber]:326 - Thread 5 started.
2013-05-28 14:40:30 DEBUG - [com.wsc.crawler.grabber.Grabber]:327 - Thread list size is :7
2013-05-28 14:40:30 INFO  - [com.wsc.crawler.grabber.Grabber]:480 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:40:30 WARN  - [com.wsc.crawler.grabber.Grabber]:497 - 5 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:40:30 DEBUG - [com.wsc.crawler.grabber.Grabber]:302 - Checking thread at (6) is19
2013-05-28 14:40:30 DEBUG - [com.wsc.crawler.grabber.Grabber]:305 - Thread 6 is dead, restarting it.
2013-05-28 14:40:30 DEBUG - [com.wsc.crawler.grabber.Grabber]:326 - Thread 6 started.
2013-05-28 14:40:30 DEBUG - [com.wsc.crawler.grabber.Grabber]:327 - Thread list size is :7
2013-05-28 14:40:30 INFO  - [com.wsc.crawler.grabber.Grabber]:480 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:40:30 WARN  - [com.wsc.crawler.grabber.Grabber]:497 - 6 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:40:35 INFO  - [clientserver.AsyncHTTPServer]:306 - Server Received a Stop Request...
2013-05-28 14:40:35 INFO  - [clientserver.AsyncHTTPServer]:307 - Trying to stop crawler...
2013-05-28 14:40:35 DEBUG - [com.wsc.crawler.grabber.Grabber]:295 - Thread Count is less than numthreads In Thread :20
2013-05-28 14:40:35 INFO  - [com.wsc.crawler.grabber.Grabber]:339 - Queue is empty in for loop
2013-05-28 14:42:16 DEBUG - [com.wsc.crawler.init.Initializer]:80 - Initializing crawler resources...
2013-05-28 14:42:17 DEBUG - [com.wsc.crawler.init.Initializer]:131 - Creating lock file.
2013-05-28 14:42:17 INFO  - [com.wsc.crawler.init.Initializer]:136 - Lockfile crawler.lock created successfully.
2013-05-28 14:42:17 DEBUG - [com.wsc.crawler.init.Initializer]:109 - Another instance of WSCrawler is running.
2013-05-28 14:42:17 WARN  - [com.wsc.crawler.init.Initializer]:110 - Unable to start new instance of the WSCrawler
2013-05-28 14:42:17 INFO  - [com.wsc.crawler.init.Initializer]:111 - Delete crawler.running file manually.
2013-05-28 14:42:17 DEBUG - [com.wsc.crawler.init.Initializer]:184 - Deleting crawler.running file.
2013-05-28 14:42:17 DEBUG - [com.wsc.crawler.init.Initializer]:186 - crawler.running file is deleted.
2013-05-28 14:42:21 DEBUG - [com.wsc.crawler.init.Initializer]:80 - Initializing crawler resources...
2013-05-28 14:42:21 INFO  - [com.wsc.crawler.init.Initializer]:161 - crawler.running Created Successfully.
2013-05-28 14:42:21 INFO  - [com.wsc.crawler.init.Initializer]:213 - previous instance of crawler was forcibly stopped
2013-05-28 14:42:21 INFO  - [com.wsc.crawler.init.Initializer]:214 - Getting URLS from prev_frontier.xml from temp directory.
2013-05-28 14:42:21 DEBUG - [com.wsc.crawler.init.Initializer]:341 - Checking (prev_frontier.xml) existence in (./temp) directory
2013-05-28 14:42:21 WARN  - [com.wsc.crawler.init.Initializer]:350 - prev_frontier.xml is not found in ./temp
2013-05-28 14:42:21 INFO  - [com.wsc.crawler.init.Initializer]:352 - Trying to get URLs from Default URL Source, Frontier Server
2013-05-28 14:42:21 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:42:21 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-28 14:42:21 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:42:21 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:42:21 DEBUG - [com.wsc.crawler.init.Initializer]:318 - Frontier Server returned a Frontier of size 9
2013-05-28 14:42:25 INFO  - [com.wsc.crawler.grabber.Grabber]:203 - Number of resolved hosts are :7
2013-05-28 14:42:25 INFO  - [com.wsc.crawler.grabber.Grabber]:205 - Number of Unresolved hosts are :2
2013-05-28 14:42:25 DEBUG - [com.wsc.crawler.grabber.Grabber]:239 - Crawling Bean  ::[http://www.google.co.in/, http://173.194.38.152/, null]
2013-05-28 14:42:25 DEBUG - [com.wsc.crawler.grabber.Grabber]:239 - Crawling Bean  ::[http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:42:25 DEBUG - [com.wsc.crawler.grabber.Grabber]:239 - Crawling Bean  ::[http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:42:25 DEBUG - [com.wsc.crawler.grabber.Grabber]:239 - Crawling Bean  ::[http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:42:25 DEBUG - [com.wsc.crawler.grabber.Grabber]:239 - Crawling Bean  ::[http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:42:25 DEBUG - [com.wsc.crawler.grabber.Grabber]:239 - Crawling Bean  ::[http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:42:25 DEBUG - [com.wsc.crawler.grabber.Grabber]:239 - Crawling Bean  ::[http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:42:25 DEBUG - [com.wsc.crawler.grabber.Grabber]:257 - Number of Threads are : 7
2013-05-28 14:42:25 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.google.co.in/, http://173.194.38.152/, null]
2013-05-28 14:42:25 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:42:25 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:42:25 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:42:25 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:42:25 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:42:25 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:42:25 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: PREF][value: ID=ef12a832f8ccff89:FF=0:TM=1369766545:LM=1369766545:S=2Qzf1I2l7uvLirgP][domain: .google.co.in][path: /][expiry: Thu May 28 14:42:25 EDT 2015]". Illegal domain attribute "google.co.in". Domain of origin: "173.194.38.152"
2013-05-28 14:42:25 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: NID][value: 67=tEV9PCPjLZq3Vfs9z4zOdoAfQMBOOFHiHsLyRGsoEKydHgNrWNEDeG3kjT2yytuwm5HgzuFjuQoVaF7q12emJ3SQXJbFnBQdIzj_2tIQ3e3wfSC9fcIAyRzxIDsacymA][domain: .google.co.in][path: /][expiry: Wed Nov 27 13:42:25 EST 2013]". Illegal domain attribute "google.co.in". Domain of origin: "173.194.38.152"
2013-05-28 14:42:26 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.google.co.in/).
2013-05-28 14:42:26 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-28 14:42:26 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-28 14:42:26 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-28 14:42:26 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-28 14:42:26 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.google.co.in/).
2013-05-28 14:42:26 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-28 14:42:26 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.google.co.in/).
2013-05-28 14:42:26 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-28 14:42:26 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-28 14:42:26 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-28 14:42:26 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 14:42:26 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 14:42:26 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 14:42:26 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 14:42:26 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 14:42:26 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is [JBoss AS 7 welcome page HTML, identical to the first response body above]
2013-05-28 14:42:26 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-28 14:42:26 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: SESS217349ac89be5468506443781e6d5f6c][value: 8cca974e157a3e6b2dc7bd636309e0de][domain: .palominodb.com][path: /][expiry: Thu Jun 20 18:15:50 EDT 2013]". Illegal domain attribute "palominodb.com". Domain of origin: "173.236.53.234"
2013-05-28 14:42:27 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://effectivemysql.com/).
2013-05-28 14:42:27 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-28 14:42:27 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-28 14:42:27 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-28 14:42:27 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-28 14:42:27 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://effectivemysql.com/).
2013-05-28 14:42:27 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-28 14:42:27 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://effectivemysql.com/).
2013-05-28 14:42:27 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-28 14:42:27 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 102
2013-05-28 14:42:27 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 87
2013-05-28 14:42:27 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 14:42:27 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 14:42:27 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 14:42:27 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 14:42:27 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 14:42:27 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <!--
  ~ JBoss, Home of Professional Open Source.
  ~ Copyright (c) 2011, Red Hat, Inc., and individual contributors
  ~ as indicated by the @author tags. See the copyright.txt file in the
  ~ distribution for a full listing of individual contributors.
  ~
  ~ This is free software; you can redistribute it and/or modify it
  ~ under the terms of the GNU Lesser General Public License as
  ~ published by the Free Software Foundation; either version 2.1 of
  ~ the License, or (at your option) any later version.
  ~
  ~ This software is distributed in the hope that it will be useful,
  ~ but WITHOUT ANY WARRANTY; without even the implied warranty of
  ~ MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
  ~ Lesser General Public License for more details.
  ~
  ~ You should have received a copy of the GNU Lesser General Public
  ~ License along with this software; if not, write to the Free
  ~ Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
  ~ 02110-1301 USA, or see the FSF site: http://www.fsf.org.
  -->
<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">

<html>
<head>
  <title>Welcome to JBoss Application Server 7</title>
  <link rel="shortcut icon" href="favicon.ico" type="image/x-icon">
  <link rel="StyleSheet" href="as7_style.css" type="text/css">
</head>

<body>
  <div class="wrapper">
    <div class="as7">
      <img src="as7_logo.png" alt="JBoss Application Server 7... it's here." width="195" height="228" border="0">
    </div>

    <div class="content">
      <h1>Welcome to AS 7</h1>

      <h3>Your JBoss Application Server 7 is running.</h3>

      <p><a href="documentation.html">Documentation</a> | <a href="https://docs.jboss.org/author/display/AS71/Quickstarts">Quickstarts</a> | <a href="/console">Administration Console</a> <br/>

      <a href="http://www.jboss.org/jbossas"><br>
      JBoss AS Project</a> | <a href="http://community.jboss.org/en/jbossas/as7_users?view=all">User
      Forum</a> | <a href=
      "https://issues.jboss.org/browse/AS7">Report an issue</a></p>

      <p class="logos"><a href="http://jboss.org"><img src="jboss_community.png" alt="JBoss and JBoss Community" width=
      "254" height="31" border="0"></a></p>

      <p class="note">To replace this page set "enable-welcome-root" to false in your server configuration and deploy
      your own war with / as its context path.</p>
    </div>
  </div>
</body>
</html>

2013-05-28 14:42:27 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-28 14:42:27 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.continuent.com/).
2013-05-28 14:42:27 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-28 14:42:27 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-28 14:42:27 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-28 14:42:27 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-28 14:42:27 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.continuent.com/).
2013-05-28 14:42:27 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-28 14:42:27 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.continuent.com/).
2013-05-28 14:42:27 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-28 14:42:27 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 45
2013-05-28 14:42:27 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 33
2013-05-28 14:42:27 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 14:42:27 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 14:42:27 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 14:42:27 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 14:42:27 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 14:42:27 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is [JBoss AS 7 welcome page HTML, identical to the first response body above]
2013-05-28 14:42:27 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-28 14:42:27 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.pythian.com/).
2013-05-28 14:42:27 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-28 14:42:27 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-28 14:42:27 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-28 14:42:27 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-28 14:42:27 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-28 14:42:27 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-28 14:42:27 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-28 14:42:27 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-28 14:42:27 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.pythian.com/).
2013-05-28 14:42:27 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.pythian.com/) is 0
2013-05-28 14:42:27 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 132
2013-05-28 14:42:27 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 90
2013-05-28 14:42:27 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 14:42:27 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 14:42:27 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 14:42:27 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 14:42:27 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 14:42:27 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is the same JBoss AS 7 welcome page as above [duplicate response body omitted]
2013-05-28 14:42:27 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-28 14:42:29 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://palominodb.com/).
2013-05-28 14:42:29 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-28 14:42:29 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 129
2013-05-28 14:42:29 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-28 14:42:29 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 129
2013-05-28 14:42:29 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://palominodb.com/).
2013-05-28 14:42:29 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-28 14:42:29 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://palominodb.com/).
2013-05-28 14:42:29 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-28 14:42:29 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 129
2013-05-28 14:42:29 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 92
2013-05-28 14:42:29 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 14:42:29 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 14:42:29 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 14:42:29 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 14:42:29 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 14:42:29 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is the same JBoss AS 7 welcome page as above [duplicate response body omitted]
2013-05-28 14:42:29 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-28 14:42:29 DEBUG - [com.wsc.Crawler.WSCrawler]:108 - Cleaning Crawler resources.
2013-05-28 14:42:29 DEBUG - [com.wsc.Crawler.WSCrawler]:116 - Deleting Lock file
2013-05-28 14:42:29 DEBUG - [com.wsc.Crawler.WSCrawler]:118 - Lock file Deleted
2013-05-28 14:42:34 INFO  - [com.wsc.crawler.grabber.Grabber]:361 - Queue is Empty...!
2013-05-28 14:42:34 INFO  - [com.wsc.crawler.grabber.Grabber]:362 - Refilling Queue...!
2013-05-28 14:42:34 DEBUG - [com.wsc.crawler.grabber.Grabber]:365 - IsStopped =false
2013-05-28 14:42:34 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:42:34 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-28 14:42:34 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:42:34 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:42:34 INFO  - [com.wsc.crawler.grabber.Grabber]:110 - Queue is Set in Grabber is 9
2013-05-28 14:42:45 DEBUG - [com.wsc.crawler.grabber.Grabber]:297 - Thread Count is less than numthreads In Thread :20
2013-05-28 14:42:45 INFO  - [com.wsc.crawler.grabber.Grabber]:341 - Queue is empty in for loop
2013-05-28 14:42:45 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:42:45 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-28 14:42:45 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:42:45 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:42:45 INFO  - [com.wsc.crawler.grabber.Grabber]:110 - Queue is Set in Grabber is 9
2013-05-28 14:42:45 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (0) is13
2013-05-28 14:42:45 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 0 is dead, restarting it.
2013-05-28 14:42:45 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 0 started.
2013-05-28 14:42:45 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.google.co.in/, http://173.194.38.152/, null]
2013-05-28 14:42:45 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:42:45 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 0 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:42:45 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (1) is14
2013-05-28 14:42:45 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 1 is dead, restarting it.
2013-05-28 14:42:45 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 1 started.
2013-05-28 14:42:45 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:42:45 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:42:45 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 1 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:42:45 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (2) is15
2013-05-28 14:42:45 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 2 is dead, restarting it.
2013-05-28 14:42:45 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 2 started.
2013-05-28 14:42:45 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:42:45 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:42:45 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 2 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:42:45 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (3) is16
2013-05-28 14:42:45 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 3 is dead, restarting it.
2013-05-28 14:42:45 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 3 started.
2013-05-28 14:42:45 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:42:45 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:42:45 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 3 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:42:45 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (4) is17
2013-05-28 14:42:45 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 4 is dead, restarting it.
2013-05-28 14:42:45 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 4 started.
2013-05-28 14:42:45 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:42:45 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:42:45 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 4 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:42:45 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (5) is18
2013-05-28 14:42:45 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 5 is dead, restarting it.
2013-05-28 14:42:45 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 5 started.
2013-05-28 14:42:45 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:42:45 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:42:45 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 5 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:42:45 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (6) is19
2013-05-28 14:42:45 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 6 is dead, restarting it.
2013-05-28 14:42:45 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 6 started.
2013-05-28 14:42:45 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:42:45 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:42:45 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 6 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:42:50 DEBUG - [com.wsc.crawler.grabber.Grabber]:297 - Thread Count is less than numthreads In Thread :20
2013-05-28 14:42:50 INFO  - [com.wsc.crawler.grabber.Grabber]:341 - Queue is empty in for loop
2013-05-28 14:42:50 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:42:50 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-28 14:42:50 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:42:50 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:42:50 INFO  - [com.wsc.crawler.grabber.Grabber]:110 - Queue is Set in Grabber is 9
2013-05-28 14:42:50 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (0) is23
2013-05-28 14:42:50 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 0 is dead, restarting it.
2013-05-28 14:42:50 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 0 started.
2013-05-28 14:42:50 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.google.co.in/, http://173.194.38.159/, null]
2013-05-28 14:42:50 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:42:50 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 0 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:42:50 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (1) is24
2013-05-28 14:42:50 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 1 is dead, restarting it.
2013-05-28 14:42:50 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 1 started.
2013-05-28 14:42:50 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:42:50 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:42:50 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 1 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:42:50 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (2) is25
2013-05-28 14:42:50 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 2 is dead, restarting it.
2013-05-28 14:42:50 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 2 started.
2013-05-28 14:42:50 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:42:50 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:42:50 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 2 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:42:50 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (3) is26
2013-05-28 14:42:50 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 3 is dead, restarting it.
2013-05-28 14:42:50 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 3 started.
2013-05-28 14:42:50 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:42:50 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:42:50 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 3 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:42:50 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (4) is27
2013-05-28 14:42:50 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 4 is dead, restarting it.
2013-05-28 14:42:50 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 4 started.
2013-05-28 14:42:50 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:42:50 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:42:50 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 4 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:42:50 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (5) is28
2013-05-28 14:42:50 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 5 is dead, restarting it.
2013-05-28 14:42:50 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 5 started.
2013-05-28 14:42:50 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:42:50 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:42:50 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 5 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:42:50 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (6) is29
2013-05-28 14:42:50 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 6 is dead, restarting it.
2013-05-28 14:42:50 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 6 started.
2013-05-28 14:42:50 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:42:50 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:42:50 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 6 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:42:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:297 - Thread Count is less than numthreads In Thread :20
2013-05-28 14:42:56 INFO  - [com.wsc.crawler.grabber.Grabber]:341 - Queue is empty in for loop
2013-05-28 14:42:56 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:42:56 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-28 14:42:56 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:42:56 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:42:56 INFO  - [com.wsc.crawler.grabber.Grabber]:110 - Queue is Set in Grabber is 9
2013-05-28 14:42:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (0) is31
2013-05-28 14:42:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 0 is dead, restarting it.
2013-05-28 14:42:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 0 started.
2013-05-28 14:42:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:42:56 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.google.co.in/, http://173.194.38.151/, null]
2013-05-28 14:42:56 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 0 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:42:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (1) is32
2013-05-28 14:42:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 1 is dead, restarting it.
2013-05-28 14:42:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 1 started.
2013-05-28 14:42:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:42:56 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:42:56 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 1 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:42:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (2) is33
2013-05-28 14:42:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 2 is dead, restarting it.
2013-05-28 14:42:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 2 started.
2013-05-28 14:42:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:42:56 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:42:56 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 2 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:42:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (3) is34
2013-05-28 14:42:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 3 is dead, restarting it.
2013-05-28 14:42:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 3 started.
2013-05-28 14:42:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:42:56 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:42:56 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 3 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:42:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (4) is35
2013-05-28 14:42:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 4 is dead, restarting it.
2013-05-28 14:42:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 4 started.
2013-05-28 14:42:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:42:56 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:42:56 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 4 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:42:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (5) is36
2013-05-28 14:42:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 5 is dead, restarting it.
2013-05-28 14:42:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 5 started.
2013-05-28 14:42:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:42:56 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:42:56 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 5 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:42:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (6) is37
2013-05-28 14:42:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 6 is dead, restarting it.
2013-05-28 14:42:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 6 started.
2013-05-28 14:42:56 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:42:56 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:42:56 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 6 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:43:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:297 - Thread Count is less than numthreads In Thread :20
2013-05-28 14:43:01 INFO  - [com.wsc.crawler.grabber.Grabber]:341 - Queue is empty in for loop
2013-05-28 14:43:01 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:43:01 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-28 14:43:01 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:43:01 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:43:01 INFO  - [com.wsc.crawler.grabber.Grabber]:110 - Queue is Set in Grabber is 9
2013-05-28 14:43:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (0) is39
2013-05-28 14:43:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 0 is dead, restarting it.
2013-05-28 14:43:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 0 started.
2013-05-28 14:43:01 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.google.co.in/, http://173.194.38.152/, null]
2013-05-28 14:43:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:43:01 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 0 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:43:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (1) is40
2013-05-28 14:43:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 1 is dead, restarting it.
2013-05-28 14:43:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 1 started.
2013-05-28 14:43:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:43:01 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:43:01 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 1 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:43:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (2) is41
2013-05-28 14:43:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 2 is dead, restarting it.
2013-05-28 14:43:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 2 started.
2013-05-28 14:43:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:43:01 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:43:01 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 2 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:43:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (3) is42
2013-05-28 14:43:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 3 is dead, restarting it.
2013-05-28 14:43:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 3 started.
2013-05-28 14:43:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:43:01 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:43:01 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 3 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:43:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (4) is43
2013-05-28 14:43:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 4 is dead, restarting it.
2013-05-28 14:43:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 4 started.
2013-05-28 14:43:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:43:01 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:43:01 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 4 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:43:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (5) is44
2013-05-28 14:43:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 5 is dead, restarting it.
2013-05-28 14:43:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 5 started.
2013-05-28 14:43:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:43:01 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:43:01 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 5 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:43:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (6) is45
2013-05-28 14:43:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 6 is dead, restarting it.
2013-05-28 14:43:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 6 started.
2013-05-28 14:43:01 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:43:01 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:43:01 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 6 - error: java.lang.IllegalStateException: Connection pool shut down
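Every worker in the cycle above dies with "java.lang.IllegalStateException: Connection pool shut down", and restarting the threads (lines 307/328) does not help: this exception is what Apache HttpClient throws when a request is attempted through a connection manager that has already been closed, so each restarted thread fails again as long as it reuses the same client instance. A minimal pure-Java sketch of that failure mode (SimplePool is a hypothetical stand-in for the pooling connection manager, not the crawler's actual code):

```java
// Sketch of the failure mode in the log: once a pool is shut down, every
// subsequent acquire fails, no matter how often the calling thread is
// restarted. SimplePool is a hypothetical stand-in, not WSCrawler code.
import java.util.ArrayDeque;
import java.util.Deque;

class SimplePool {
    private final Deque<String> connections = new ArrayDeque<>();
    private volatile boolean shutDown = false;

    SimplePool(int size) {
        for (int i = 0; i < size; i++) connections.push("conn-" + i);
    }

    synchronized String acquire() {
        if (shutDown) {
            // Mirrors the IllegalStateException seen in the log above.
            throw new IllegalStateException("Connection pool shut down");
        }
        return connections.pop();
    }

    synchronized void shutdown() { shutDown = true; }
}

public class PoolShutdownDemo {
    public static void main(String[] args) {
        SimplePool pool = new SimplePool(2);
        System.out.println(pool.acquire()); // succeeds while the pool is open
        pool.shutdown();                    // e.g. the client was closed elsewhere
        try {
            pool.acquire();                 // every retry now fails identically
        } catch (IllegalStateException e) {
            System.out.println("error: " + e.getMessage());
        }
    }
}
```

The general remedy is to recreate the client (or its connection manager) when restarting workers, rather than restarting only the threads.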
2013-05-28 14:43:01 DEBUG - [clientserver.AsyncHTTPServer]:291 - Request received for operation heartbeat
2013-05-28 14:43:01 DEBUG - [clientserver.AsyncHTTPServer]:302 - Request Served Successfully
2013-05-28 14:43:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:297 - Thread Count is less than numthreads In Thread :20
2013-05-28 14:43:06 INFO  - [com.wsc.crawler.grabber.Grabber]:341 - Queue is empty in for loop
2013-05-28 14:43:06 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:43:06 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-28 14:43:06 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:43:06 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:43:06 INFO  - [com.wsc.crawler.grabber.Grabber]:110 - Queue size set in Grabber is 9
2013-05-28 14:43:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (0) is47
2013-05-28 14:43:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 0 is dead, restarting it.
2013-05-28 14:43:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 0 started.
2013-05-28 14:43:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:43:06 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.google.co.in/, http://173.194.38.159/, null]
2013-05-28 14:43:06 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 0 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:43:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (1) is48
2013-05-28 14:43:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 1 is dead, restarting it.
2013-05-28 14:43:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 1 started.
2013-05-28 14:43:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:43:06 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:43:06 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 1 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:43:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (2) is49
2013-05-28 14:43:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 2 is dead, restarting it.
2013-05-28 14:43:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 2 started.
2013-05-28 14:43:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:43:06 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:43:06 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 2 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:43:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (3) is50
2013-05-28 14:43:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 3 is dead, restarting it.
2013-05-28 14:43:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 3 started.
2013-05-28 14:43:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:43:06 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:43:06 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 3 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:43:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (4) is51
2013-05-28 14:43:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 4 is dead, restarting it.
2013-05-28 14:43:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 4 started.
2013-05-28 14:43:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:43:06 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:43:06 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 4 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:43:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (5) is52
2013-05-28 14:43:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 5 is dead, restarting it.
2013-05-28 14:43:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 5 started.
2013-05-28 14:43:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:43:06 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:43:06 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 5 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:43:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (6) is53
2013-05-28 14:43:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 6 is dead, restarting it.
2013-05-28 14:43:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 6 started.
2013-05-28 14:43:06 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:43:06 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:43:06 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 6 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:43:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:297 - Thread Count is less than numthreads In Thread :20
2013-05-28 14:43:11 INFO  - [com.wsc.crawler.grabber.Grabber]:341 - Queue is empty in for loop
2013-05-28 14:43:11 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:43:11 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-28 14:43:11 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:43:11 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:43:11 INFO  - [com.wsc.crawler.grabber.Grabber]:110 - Queue size set in Grabber is 9
2013-05-28 14:43:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (0) is55
2013-05-28 14:43:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 0 is dead, restarting it.
2013-05-28 14:43:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 0 started.
2013-05-28 14:43:11 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.google.co.in/, http://173.194.38.151/, null]
2013-05-28 14:43:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:43:11 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 0 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:43:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (1) is56
2013-05-28 14:43:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 1 is dead, restarting it.
2013-05-28 14:43:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 1 started.
2013-05-28 14:43:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:43:11 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:43:11 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 1 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:43:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (2) is57
2013-05-28 14:43:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 2 is dead, restarting it.
2013-05-28 14:43:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 2 started.
2013-05-28 14:43:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:43:11 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:43:11 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 2 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:43:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (3) is58
2013-05-28 14:43:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 3 is dead, restarting it.
2013-05-28 14:43:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 3 started.
2013-05-28 14:43:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:43:11 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:43:11 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 3 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:43:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (4) is59
2013-05-28 14:43:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 4 is dead, restarting it.
2013-05-28 14:43:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 4 started.
2013-05-28 14:43:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:43:11 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:43:11 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 4 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:43:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (5) is60
2013-05-28 14:43:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 5 is dead, restarting it.
2013-05-28 14:43:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 5 started.
2013-05-28 14:43:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:43:11 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:43:11 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 5 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:43:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (6) is61
2013-05-28 14:43:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 6 is dead, restarting it.
2013-05-28 14:43:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 6 started.
2013-05-28 14:43:11 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:43:11 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:43:11 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 6 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:43:15 INFO  - [clientserver.AsyncHTTPServer]:306 - Server Received a Stop Request...
2013-05-28 14:43:15 INFO  - [clientserver.AsyncHTTPServer]:307 - Trying to stop crawler...
2013-05-28 14:43:16 DEBUG - [com.wsc.crawler.grabber.Grabber]:297 - Thread Count is less than numthreads In Thread :20
2013-05-28 14:43:16 INFO  - [com.wsc.crawler.grabber.Grabber]:341 - Queue is empty in for loop
2013-05-28 14:43:16 INFO  - [com.wsc.crawler.grabber.Grabber]:122 - Crawler Received A Stop Signal.
2013-05-28 14:43:16 INFO  - [com.wsc.crawler.grabber.Grabber]:123 - Stopping Crawler
2013-05-28 14:44:54 DEBUG - [com.wsc.crawler.init.Initializer]:80 - Initializing crawler resources...
2013-05-28 14:44:55 DEBUG - [com.wsc.crawler.init.Initializer]:131 - Creating lock file.
2013-05-28 14:44:55 INFO  - [com.wsc.crawler.init.Initializer]:136 - Lockfile crawler.lock created successfully.
2013-05-28 14:44:55 DEBUG - [com.wsc.crawler.init.Initializer]:109 - Another instance of WSCrawler is running.
2013-05-28 14:44:55 WARN  - [com.wsc.crawler.init.Initializer]:110 - Unable to start a new instance of WSCrawler
2013-05-28 14:44:55 INFO  - [com.wsc.crawler.init.Initializer]:111 - Delete crawler.running file manually.
2013-05-28 14:44:55 DEBUG - [com.wsc.crawler.init.Initializer]:184 - Deleting crawler.running file.
2013-05-28 14:44:55 DEBUG - [com.wsc.crawler.init.Initializer]:186 - crawler.running file is deleted.
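The Initializer lines above implement a single-instance guard with marker files (crawler.lock / crawler.running), and the log shows its weakness: after a forced stop, a stale crawler.running file must be deleted manually. A common alternative is an OS-level file lock, which the kernel releases automatically when the process dies. A generic sketch of that pattern (this is an assumption about one possible fix, not WSCrawler's actual implementation):

```java
// Single-instance guard via an OS-held file lock. The lock is released
// automatically when the JVM exits, so no stale marker file can survive a
// crash. Generic sketch, not WSCrawler's actual Initializer code.
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.channels.FileLock;
import java.nio.file.Files;
import java.nio.file.Path;

public class SingleInstanceGuard {
    // Returns the held lock, or null if another process already holds it.
    public static FileLock tryLock(Path lockFile) throws IOException {
        RandomAccessFile raf = new RandomAccessFile(lockFile.toFile(), "rw");
        FileLock lock = raf.getChannel().tryLock();
        if (lock == null) {
            raf.close(); // another instance owns the lock; give up the handle
        }
        return lock;
    }

    public static void main(String[] args) throws IOException {
        Path lockFile = Files.createTempFile("crawler", ".lock");
        FileLock lock = tryLock(lockFile);
        System.out.println(lock != null ? "lock acquired" : "another instance is running");
    }
}
```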
2013-05-28 14:44:58 DEBUG - [com.wsc.crawler.init.Initializer]:80 - Initializing crawler resources...
2013-05-28 14:44:58 INFO  - [com.wsc.crawler.init.Initializer]:161 - crawler.running Created Successfully.
2013-05-28 14:44:58 INFO  - [com.wsc.crawler.init.Initializer]:213 - Previous instance of crawler was forcibly stopped
2013-05-28 14:44:58 INFO  - [com.wsc.crawler.init.Initializer]:214 - Getting URLs from prev_frontier.xml in the temp directory.
2013-05-28 14:44:58 DEBUG - [com.wsc.crawler.init.Initializer]:341 - Checking (prev_frontier.xml) existence in (./temp) directory
2013-05-28 14:44:58 WARN  - [com.wsc.crawler.init.Initializer]:350 - prev_frontier.xml is not found in ./temp
2013-05-28 14:44:58 INFO  - [com.wsc.crawler.init.Initializer]:352 - Trying to get URLs from the default URL source, the Frontier Server
2013-05-28 14:44:58 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:44:58 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-28 14:44:58 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:44:58 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:44:58 DEBUG - [com.wsc.crawler.init.Initializer]:318 - Frontier Server returned a Frontier of size 9
2013-05-28 14:45:00 INFO  - [com.wsc.crawler.grabber.Grabber]:203 - Number of resolved hosts are :7
2013-05-28 14:45:00 INFO  - [com.wsc.crawler.grabber.Grabber]:205 - Number of Unresolved hosts are :2
2013-05-28 14:45:00 DEBUG - [com.wsc.crawler.grabber.Grabber]:239 - Crawling Bean  ::[http://www.google.co.in/, http://173.194.38.159/, null]
2013-05-28 14:45:00 DEBUG - [com.wsc.crawler.grabber.Grabber]:239 - Crawling Bean  ::[http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:45:00 DEBUG - [com.wsc.crawler.grabber.Grabber]:239 - Crawling Bean  ::[http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:45:00 DEBUG - [com.wsc.crawler.grabber.Grabber]:239 - Crawling Bean  ::[http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:45:00 DEBUG - [com.wsc.crawler.grabber.Grabber]:239 - Crawling Bean  ::[http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:45:00 DEBUG - [com.wsc.crawler.grabber.Grabber]:239 - Crawling Bean  ::[http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:45:00 DEBUG - [com.wsc.crawler.grabber.Grabber]:239 - Crawling Bean  ::[http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:45:00 DEBUG - [com.wsc.crawler.grabber.Grabber]:257 - Number of Threads are : 7
2013-05-28 14:45:00 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.google.co.in/, http://173.194.38.159/, null]
2013-05-28 14:45:00 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:45:00 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:45:00 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:45:00 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:45:00 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:45:00 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:45:00 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: PREF][value: ID=4b8430c0830e541d:FF=0:TM=1369766700:LM=1369766700:S=3bwolukQKAfq0DGB][domain: .google.co.in][path: /][expiry: Thu May 28 14:45:00 EDT 2015]". Illegal domain attribute "google.co.in". Domain of origin: "173.194.38.159"
2013-05-28 14:45:00 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: NID][value: 67=DebMTbd2geESJqL0jAFlTeZpCVBrXgru-liQUxN3PCusIkMmll92FW3oM4MP6NTUuRuliRTX1UrjZnmKq0OJRw9GKFcEIRmY9rjDq-I9kp5cCDqDgGfDyst3PhLnPwnz][domain: .google.co.in][path: /][expiry: Wed Nov 27 13:45:00 EST 2013]". Illegal domain attribute "google.co.in". Domain of origin: "173.194.38.159"
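The two "Cookie rejected" warnings above are a direct consequence of the crawler fetching by resolved IP address: the request goes to 173.194.38.159, but the Set-Cookie header declares domain=.google.co.in, and a cookie's Domain attribute must domain-match the origin host, which an IP literal can never do against a dotted domain suffix. A simplified sketch of that matching rule (hypothetical helper, not HttpClient's actual validator):

```java
// Simplified cookie domain-matching, illustrating why the warnings above
// occur. Hypothetical helper for illustration only; real clients follow
// RFC 6265's full rules.
public class CookieDomainCheck {
    static boolean domainMatches(String originHost, String cookieDomain) {
        // Normalize a leading dot, as in "domain=.google.co.in".
        String d = cookieDomain.startsWith(".") ? cookieDomain.substring(1) : cookieDomain;
        if (originHost.matches("\\d{1,3}(\\.\\d{1,3}){3}")) {
            return originHost.equals(d); // an IP literal only matches exactly
        }
        return originHost.equals(d) || originHost.endsWith("." + d);
    }

    public static void main(String[] args) {
        System.out.println(domainMatches("www.google.co.in", ".google.co.in")); // true
        System.out.println(domainMatches("173.194.38.159", ".google.co.in"));   // false: rejected
    }
}
```

Connecting by pre-resolved IP also defeats virtual hosting; keeping the hostname in the request URI (or setting the Host header explicitly) avoids both problems.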
2013-05-28 14:45:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://www.google.co.in/).
2013-05-28 14:45:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-28 14:45:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-28 14:45:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-28 14:45:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-28 14:45:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.google.co.in/).
2013-05-28 14:45:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-28 14:45:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.google.co.in/).
2013-05-28 14:45:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed frame Source Links size in (http://www.google.co.in/) is 0
2013-05-28 14:45:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-28 14:45:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-28 14:45:00 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 14:45:00 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 14:45:00 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 14:45:00 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 14:45:00 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 14:45:00 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <!--
  ~ JBoss, Home of Professional Open Source.
  ~ Copyright (c) 2011, Red Hat, Inc., and individual contributors
  ~ as indicated by the @author tags. See the copyright.txt file in the
  ~ distribution for a full listing of individual contributors.
  ~
  ~ This is free software; you can redistribute it and/or modify it
  ~ under the terms of the GNU Lesser General Public License as
  ~ published by the Free Software Foundation; either version 2.1 of
  ~ the License, or (at your option) any later version.
  ~
  ~ This software is distributed in the hope that it will be useful,
  ~ but WITHOUT ANY WARRANTY; without even the implied warranty of
  ~ MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
  ~ Lesser General Public License for more details.
  ~
  ~ You should have received a copy of the GNU Lesser General Public
  ~ License along with this software; if not, write to the Free
  ~ Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
  ~ 02110-1301 USA, or see the FSF site: http://www.fsf.org.
  -->
<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">

<html>
<head>
  <title>Welcome to JBoss Application Server 7</title>
  <link rel="shortcut icon" href="favicon.ico" type="image/x-icon">
  <link rel="StyleSheet" href="as7_style.css" type="text/css">
</head>

<body>
  <div class="wrapper">
    <div class="as7">
      <img src="as7_logo.png" alt="JBoss Application Server 7... it's here." width="195" height="228" border="0">
    </div>

    <div class="content">
      <h1>Welcome to AS 7</h1>

      <h3>Your JBoss Application Server 7 is running.</h3>

      <p><a href="documentation.html">Documentation</a> | <a href="https://docs.jboss.org/author/display/AS71/Quickstarts">Quickstarts</a> | <a href="/console">Administration Console</a> <br/>

      <a href="http://www.jboss.org/jbossas"><br>
      JBoss AS Project</a> | <a href="http://community.jboss.org/en/jbossas/as7_users?view=all">User
      Forum</a> | <a href=
      "https://issues.jboss.org/browse/AS7">Report an issue</a></p>

      <p class="logos"><a href="http://jboss.org"><img src="jboss_community.png" alt="JBoss and JBoss Community" width=
      "254" height="31" border="0"></a></p>

      <p class="note">To replace this page set "enable-welcome-root" to false in your server configuration and deploy
      your own war with / as its context path.</p>
    </div>
  </div>
</body>
</html>

2013-05-28 14:45:00 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
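Note what the POST above actually returned: status 200, but the body is JBoss AS 7's default welcome page, which means the request reached the container root (the "enable-welcome-root" page the HTML itself mentions) rather than the frontier application. A 200 status alone therefore does not prove the URLs were accepted. A sketch of a sanity check on the response (hypothetical helper, not WSCrawler code):

```java
// Treat a 200 whose body is the container's default welcome page as a
// failed post, since it means the frontier webapp never saw the request.
// Hypothetical helper for illustration, not part of WSCrawler.
public class FrontierResponseCheck {
    static boolean looksLikeContainerWelcomePage(String body) {
        return body != null && body.contains("Welcome to JBoss Application Server 7");
    }

    static boolean postAccepted(int status, String body) {
        return status == 200 && !looksLikeContainerWelcomePage(body);
    }

    public static void main(String[] args) {
        String jbossBody = "<html><title>Welcome to JBoss Application Server 7</title></html>";
        System.out.println(postAccepted(200, jbossBody)); // false: hit the container root
    }
}
```

Checking the deployed webapp's context path in the POST URI (here the crawler posts to the server root, http://127.0.0.1:8080/?operation=urlsposted) would likely resolve this.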
2013-05-28 14:45:01 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: SESS217349ac89be5468506443781e6d5f6c][value: 311980756fe133fa511bcd4efcb9640a][domain: .palominodb.com][path: /][expiry: Thu Jun 20 18:18:25 EDT 2013]". Illegal domain attribute "palominodb.com". Domain of origin: "173.236.53.234"
2013-05-28 14:45:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://effectivemysql.com/).
2013-05-28 14:45:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-28 14:45:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-28 14:45:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-28 14:45:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-28 14:45:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://effectivemysql.com/).
2013-05-28 14:45:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-28 14:45:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://effectivemysql.com/).
2013-05-28 14:45:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed frame Source Links size in (http://effectivemysql.com/) is 0
2013-05-28 14:45:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 102
2013-05-28 14:45:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 87
2013-05-28 14:45:01 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 14:45:01 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 14:45:01 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 14:45:01 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 14:45:01 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 14:45:01 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is the JBoss AS 7 default welcome page (identical to the response body logged above)
2013-05-28 14:45:01 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-28 14:45:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://www.continuent.com/).
2013-05-28 14:45:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-28 14:45:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-28 14:45:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-28 14:45:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-28 14:45:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.continuent.com/).
2013-05-28 14:45:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-28 14:45:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.continuent.com/).
2013-05-28 14:45:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed frame Source Links size in (http://www.continuent.com/) is 0
2013-05-28 14:45:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 45
2013-05-28 14:45:01 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 33
2013-05-28 14:45:01 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 14:45:01 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 14:45:01 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 14:45:01 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 14:45:01 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 14:45:01 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <!--
  ~ JBoss, Home of Professional Open Source.
  ~ Copyright (c) 2011, Red Hat, Inc., and individual contributors
  ~ as indicated by the @author tags. See the copyright.txt file in the
  ~ distribution for a full listing of individual contributors.
  ~
  ~ This is free software; you can redistribute it and/or modify it
  ~ under the terms of the GNU Lesser General Public License as
  ~ published by the Free Software Foundation; either version 2.1 of
  ~ the License, or (at your option) any later version.
  ~
  ~ This software is distributed in the hope that it will be useful,
  ~ but WITHOUT ANY WARRANTY; without even the implied warranty of
  ~ MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
  ~ Lesser General Public License for more details.
  ~
  ~ You should have received a copy of the GNU Lesser General Public
  ~ License along with this software; if not, write to the Free
  ~ Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
  ~ 02110-1301 USA, or see the FSF site: http://www.fsf.org.
  -->
<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">

<html>
<head>
  <title>Welcome to JBoss Application Server 7</title>
  <link rel="shortcut icon" href="favicon.ico" type="image/x-icon">
  <link rel="StyleSheet" href="as7_style.css" type="text/css">
</head>

<body>
  <div class="wrapper">
    <div class="as7">
      <img src="as7_logo.png" alt="JBoss Application Server 7... it's here." width="195" height="228" border="0">
    </div>

    <div class="content">
      <h1>Welcome to AS 7</h1>

      <h3>Your JBoss Application Server 7 is running.</h3>

      <p><a href="documentation.html">Documentation</a> | <a href="https://docs.jboss.org/author/display/AS71/Quickstarts">Quickstarts</a> | <a href="/console">Administration Console</a> <br/>

      <a href="http://www.jboss.org/jbossas"><br>
      JBoss AS Project</a> | <a href="http://community.jboss.org/en/jbossas/as7_users?view=all">User
      Forum</a> | <a href=
      "https://issues.jboss.org/browse/AS7">Report an issue</a></p>

      <p class="logos"><a href="http://jboss.org"><img src="jboss_community.png" alt="JBoss and JBoss Community" width=
      "254" height="31" border="0"></a></p>

      <p class="note">To replace this page set "enable-welcome-root" to false in your server configuration and deploy
      your own war with / as its context path.</p>
    </div>
  </div>
</body>
</html>

2013-05-28 14:45:01 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-28 14:45:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://www.pythian.com/).
2013-05-28 14:45:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-28 14:45:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-28 14:45:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-28 14:45:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-28 14:45:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.pythian.com/).
2013-05-28 14:45:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed frame Source Links size in (http://www.pythian.com/) is 0
2013-05-28 14:45:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 132
2013-05-28 14:45:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 90
2013-05-28 14:45:02 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing XML file to post to frontier
2013-05-28 14:45:02 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 14:45:02 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 14:45:02 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 14:45:02 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 14:45:02 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is the same JBoss AS 7 welcome page shown above.
2013-05-28 14:45:02 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-28 14:45:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are being added to one list from (http://palominodb.com/).
2013-05-28 14:45:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-28 14:45:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 129
2013-05-28 14:45:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://palominodb.com/).
2013-05-28 14:45:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-28 14:45:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://palominodb.com/).
2013-05-28 14:45:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed frame Source Links size in (http://palominodb.com/) is 0
2013-05-28 14:45:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 129
2013-05-28 14:45:02 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 92
2013-05-28 14:45:02 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing XML file to post to frontier
2013-05-28 14:45:02 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 14:45:02 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 14:45:02 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 14:45:02 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 14:45:02 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is the same JBoss AS 7 welcome page shown above.
2013-05-28 14:45:02 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-28 14:45:02 DEBUG - [com.wsc.Crawler.WSCrawler]:108 - Cleaning Crawler resources.
2013-05-28 14:45:02 DEBUG - [com.wsc.Crawler.WSCrawler]:116 - Deleting Lock file
2013-05-28 14:45:02 DEBUG - [com.wsc.Crawler.WSCrawler]:118 - Lock file Deleted
2013-05-28 14:45:07 INFO  - [com.wsc.crawler.grabber.Grabber]:361 - Queue is Empty...!
2013-05-28 14:45:07 INFO  - [com.wsc.crawler.grabber.Grabber]:362 - Refilling Queue...!
2013-05-28 14:45:07 DEBUG - [com.wsc.crawler.grabber.Grabber]:365 - IsStopped =false
2013-05-28 14:45:07 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:45:07 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-28 14:45:07 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:45:07 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:45:07 INFO  - [com.wsc.crawler.grabber.Grabber]:110 - Queue size set in Grabber is 9
2013-05-28 14:45:17 DEBUG - [com.wsc.crawler.grabber.Grabber]:297 - Thread count is less than numThreads in thread: 20
2013-05-28 14:45:17 INFO  - [com.wsc.crawler.grabber.Grabber]:341 - Queue is empty in for loop
2013-05-28 14:45:17 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:45:17 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-28 14:45:17 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:45:17 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:45:17 INFO  - [com.wsc.crawler.grabber.Grabber]:110 - Queue size set in Grabber is 9
2013-05-28 14:45:17 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (0) is13
2013-05-28 14:45:17 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 0 is dead, restarting it.
2013-05-28 14:45:17 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 0 started.
2013-05-28 14:45:17 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:45:17 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.google.co.in/, http://173.194.38.159/, null]
2013-05-28 14:45:17 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 0 - error: java.lang.IllegalStateException: Connection pool shut down
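The "Thread N is dead, restarting it" sweep above replaces workers whose run() has finished. A Java Thread cannot be restarted once it terminates, so the only option is to swap in a fresh one. A hedged sketch of that liveness check (ThreadWatchdog and the worker naming are hypothetical, not the Grabber's actual code):

```java
import java.util.ArrayList;
import java.util.List;

public class ThreadWatchdog {
    // Replace any worker whose run() has completed; a terminated Thread
    // object cannot be started again, so a dead one is swapped for a
    // fresh Thread wrapping the same job.
    static void checkAndRestart(List<Thread> workers, Runnable job) {
        for (int i = 0; i < workers.size(); i++) {
            if (!workers.get(i).isAlive()) {
                Thread fresh = new Thread(job, "worker-" + i);
                workers.set(i, fresh);
                fresh.start();
            }
        }
    }

    public static void main(String[] args) throws InterruptedException {
        List<Thread> workers = new ArrayList<>();
        Thread worker = new Thread(() -> {});
        worker.start();
        worker.join();                 // worker finished: isAlive() is now false
        workers.add(worker);
        Thread before = workers.get(0);
        checkAndRestart(workers, () -> {});
        System.out.println(workers.get(0) != before); // prints true
    }
}
```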
2013-05-28 14:45:17 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (1) is14
2013-05-28 14:45:17 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 1 is dead, restarting it.
2013-05-28 14:45:17 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 1 started.
2013-05-28 14:45:17 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:45:17 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:45:17 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 1 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:45:17 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (2) is15
2013-05-28 14:45:17 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 2 is dead, restarting it.
2013-05-28 14:45:17 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 2 started.
2013-05-28 14:45:17 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:45:17 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:45:17 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 2 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:45:17 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (3) is16
2013-05-28 14:45:17 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 3 is dead, restarting it.
2013-05-28 14:45:17 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 3 started.
2013-05-28 14:45:17 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:45:17 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:45:17 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 3 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:45:17 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (4) is17
2013-05-28 14:45:17 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 4 is dead, restarting it.
2013-05-28 14:45:17 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 4 started.
2013-05-28 14:45:17 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:45:17 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:45:17 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 4 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:45:17 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (5) is18
2013-05-28 14:45:17 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 5 is dead, restarting it.
2013-05-28 14:45:17 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 5 started.
2013-05-28 14:45:17 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:45:17 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:45:17 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 5 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:45:17 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (6) is19
2013-05-28 14:45:17 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 6 is dead, restarting it.
2013-05-28 14:45:17 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 6 started.
2013-05-28 14:45:17 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:45:17 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:45:17 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 6 - error: java.lang.IllegalStateException: Connection pool shut down
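Every restarted worker above fails immediately with "IllegalStateException: Connection pool shut down": the shared connection manager was closed, but the restart loop keeps handing new threads the stale client. A minimal stand-in for that lifecycle, assuming pool-per-client semantics (the ConnectionPool class here is illustrative, not Apache HttpClient itself):

```java
import java.util.concurrent.atomic.AtomicBoolean;

public class PoolLifecycle {
    // Illustrative stand-in for a pooling connection manager.
    static class ConnectionPool {
        private final AtomicBoolean down = new AtomicBoolean(false);
        void shutdown() { down.set(true); }
        String lease() {
            if (down.get()) {
                throw new IllegalStateException("Connection pool shut down");
            }
            return "connection";
        }
    }

    public static void main(String[] args) {
        ConnectionPool shared = new ConnectionPool();
        shared.shutdown();            // a stop request shuts the shared pool
        try {
            shared.lease();           // restarted worker still holds the old pool
        } catch (IllegalStateException e) {
            System.out.println(e.getMessage()); // prints: Connection pool shut down
        }
        // The fix is to build a fresh pool (or client) after shutdown,
        // rather than restarting threads against the closed one.
        ConnectionPool fresh = new ConnectionPool();
        System.out.println(fresh.lease());      // prints: connection
    }
}
```

In Apache HttpClient terms, once the connection manager has been shut down the client built on it is unusable for good; restarting the worker threads alone cannot revive it.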
2013-05-28 14:45:19 DEBUG - [clientserver.AsyncHTTPServer]:317 - Operation value (dsa) not yet implemented
2013-05-28 14:45:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:297 - Thread count is less than numThreads in thread: 20
2013-05-28 14:45:22 INFO  - [com.wsc.crawler.grabber.Grabber]:341 - Queue is empty in for loop
2013-05-28 14:45:22 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:45:22 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-28 14:45:22 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:45:22 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:45:22 INFO  - [com.wsc.crawler.grabber.Grabber]:110 - Queue size set in Grabber is 9
2013-05-28 14:45:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (0) is23
2013-05-28 14:45:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 0 is dead, restarting it.
2013-05-28 14:45:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 0 started.
2013-05-28 14:45:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:45:22 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.google.co.in/, http://173.194.38.151/, null]
2013-05-28 14:45:22 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 0 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:45:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (1) is24
2013-05-28 14:45:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 1 is dead, restarting it.
2013-05-28 14:45:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 1 started.
2013-05-28 14:45:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:45:22 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:45:22 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 1 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:45:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (2) is25
2013-05-28 14:45:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 2 is dead, restarting it.
2013-05-28 14:45:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 2 started.
2013-05-28 14:45:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:45:22 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:45:22 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 2 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:45:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (3) is26
2013-05-28 14:45:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 3 is dead, restarting it.
2013-05-28 14:45:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 3 started.
2013-05-28 14:45:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:45:22 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:45:22 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 3 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:45:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (4) is27
2013-05-28 14:45:22 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 4 is dead, restarting it.
2013-05-28 14:45:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 4 started.
2013-05-28 14:45:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:45:23 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:45:23 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 4 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:45:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (5) is28
2013-05-28 14:45:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 5 is dead, restarting it.
2013-05-28 14:45:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 5 started.
2013-05-28 14:45:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:45:23 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:45:23 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 5 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:45:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (6) is29
2013-05-28 14:45:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 6 is dead, restarting it.
2013-05-28 14:45:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 6 started.
2013-05-28 14:45:23 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:45:23 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:45:23 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 6 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:45:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:297 - Thread count is less than numThreads in thread: 20
2013-05-28 14:45:28 INFO  - [com.wsc.crawler.grabber.Grabber]:341 - Queue is empty in for loop
2013-05-28 14:45:28 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 14:45:28 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-28 14:45:28 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 14:45:28 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 14:45:28 INFO  - [com.wsc.crawler.grabber.Grabber]:110 - Queue size set in Grabber is 9
2013-05-28 14:45:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (0) is31
2013-05-28 14:45:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 0 is dead, restarting it.
2013-05-28 14:45:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 0 started.
2013-05-28 14:45:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:45:28 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.google.co.in/, http://173.194.38.152/, null]
2013-05-28 14:45:28 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 0 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:45:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (1) is32
2013-05-28 14:45:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 1 is dead, restarting it.
2013-05-28 14:45:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 1 started.
2013-05-28 14:45:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:45:28 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 14:45:28 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 1 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:45:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (2) is33
2013-05-28 14:45:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 2 is dead, restarting it.
2013-05-28 14:45:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 2 started.
2013-05-28 14:45:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:45:28 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 14:45:28 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 2 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:45:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (3) is34
2013-05-28 14:45:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 3 is dead, restarting it.
2013-05-28 14:45:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 3 started.
2013-05-28 14:45:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:45:28 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 14:45:28 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 3 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:45:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (4) is35
2013-05-28 14:45:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 4 is dead, restarting it.
2013-05-28 14:45:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 4 started.
2013-05-28 14:45:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:45:28 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 14:45:28 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 4 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:45:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (5) is36
2013-05-28 14:45:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 5 is dead, restarting it.
2013-05-28 14:45:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 5 started.
2013-05-28 14:45:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:45:28 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 14:45:28 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 5 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:45:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:304 - Checking thread at (6) is37
2013-05-28 14:45:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:307 - Thread 6 is dead, restarting it.
2013-05-28 14:45:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:328 - Thread 6 started.
2013-05-28 14:45:28 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 14:45:28 DEBUG - [com.wsc.crawler.grabber.Grabber]:329 - Thread list size is :7
2013-05-28 14:45:28 WARN  - [com.wsc.crawler.grabber.Grabber]:499 - 6 - error: java.lang.IllegalStateException: Connection pool shut down
2013-05-28 14:45:30 INFO  - [clientserver.AsyncHTTPServer]:306 - Server Received a Stop Request...
2013-05-28 14:45:30 INFO  - [clientserver.AsyncHTTPServer]:307 - Trying to stop crawler...
2013-05-28 14:45:33 DEBUG - [com.wsc.crawler.grabber.Grabber]:297 - Thread count is less than numThreads in thread: 20
2013-05-28 14:45:33 INFO  - [com.wsc.crawler.grabber.Grabber]:341 - Queue is empty in for loop
2013-05-28 14:45:33 INFO  - [com.wsc.crawler.grabber.Grabber]:122 - Crawler Received a Stop Signal.
2013-05-28 14:45:33 INFO  - [com.wsc.crawler.grabber.Grabber]:123 - Stopping Crawler
2013-05-28 15:05:40 DEBUG - [com.wsc.crawler.init.Initializer]:80 - Initializing crawler resources...
2013-05-28 15:05:40 DEBUG - [com.wsc.crawler.init.Initializer]:131 - Creating lock file.
2013-05-28 15:05:40 INFO  - [com.wsc.crawler.init.Initializer]:136 - Lockfile crawler.lock created successfully.
2013-05-28 15:05:40 DEBUG - [com.wsc.crawler.init.Initializer]:109 - Another instance of WSCrawler is running.
2013-05-28 15:05:40 WARN  - [com.wsc.crawler.init.Initializer]:110 - Unable to start new instance of the WSCrawler
2013-05-28 15:05:40 INFO  - [com.wsc.crawler.init.Initializer]:111 - Delete crawler.running file manually.
2013-05-28 15:05:40 DEBUG - [com.wsc.crawler.init.Initializer]:184 - Deleting crawler.running file.
2013-05-28 15:05:40 DEBUG - [com.wsc.crawler.init.Initializer]:186 - crawler.running file is deleted.
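The Initializer's single-instance guard above hinges on exclusive creation of a lock file. A hedged sketch of that pattern using only the standard library (the file names follow the log; the acquire helper is hypothetical):

```java
import java.io.File;
import java.io.IOException;

public class CrawlerLock {
    // File.createNewFile() is atomic: it returns false when the lock file
    // already exists, i.e. another crawler instance is (or appears to be)
    // running.
    static boolean acquire(File lock) throws IOException {
        return lock.createNewFile();
    }

    public static void main(String[] args) throws IOException {
        File lock = new File(System.getProperty("java.io.tmpdir"), "crawler.lock");
        lock.delete();                       // start clean for the demo
        System.out.println(acquire(lock));   // prints true: first instance
        System.out.println(acquire(lock));   // prints false: already locked
        lock.delete();                       // release on clean shutdown
    }
}
```

One known weakness of this scheme, visible in the log itself: after a crash the stale lock survives, and the operator must remove it by hand, which is exactly what the "Delete crawler.running file manually" message asks for.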
2013-05-28 15:05:52 DEBUG - [com.wsc.crawler.init.Initializer]:80 - Initializing crawler resources...
2013-05-28 15:05:52 INFO  - [com.wsc.crawler.init.Initializer]:161 - crawler.running Created Successfully.
2013-05-28 15:05:52 INFO  - [com.wsc.crawler.init.Initializer]:212 - previous instance of crawler was forcibly stopped
2013-05-28 15:05:52 INFO  - [com.wsc.crawler.init.Initializer]:213 - Getting URLs from prev_frontier.xml in the temp directory.
2013-05-28 15:05:52 DEBUG - [com.wsc.crawler.init.Initializer]:340 - Checking (prev_frontier.xml) existence in (./temp) directory
2013-05-28 15:05:52 WARN  - [com.wsc.crawler.init.Initializer]:349 - prev_frontier.xml is not found in ./temp
2013-05-28 15:05:52 INFO  - [com.wsc.crawler.init.Initializer]:351 - Trying get URLs from Default URL Source, Frontier Server
2013-05-28 15:05:52 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 15:05:52 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-28 15:05:52 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 15:05:52 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 15:05:52 DEBUG - [com.wsc.crawler.init.Initializer]:317 - Frontier Server returned a Frontier of size 9
2013-05-28 15:05:57 INFO  - [com.wsc.crawler.grabber.Grabber]:203 - Number of resolved hosts are :7
2013-05-28 15:05:57 INFO  - [com.wsc.crawler.grabber.Grabber]:205 - Number of Unresolved hosts are :2
2013-05-28 15:05:57 DEBUG - [com.wsc.crawler.grabber.Grabber]:239 - Crawling Bean  ::[http://www.google.co.in/, http://74.125.236.223/, null]
2013-05-28 15:05:57 DEBUG - [com.wsc.crawler.grabber.Grabber]:239 - Crawling Bean  ::[http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 15:05:57 DEBUG - [com.wsc.crawler.grabber.Grabber]:239 - Crawling Bean  ::[http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 15:05:57 DEBUG - [com.wsc.crawler.grabber.Grabber]:239 - Crawling Bean  ::[http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 15:05:57 DEBUG - [com.wsc.crawler.grabber.Grabber]:239 - Crawling Bean  ::[http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 15:05:57 DEBUG - [com.wsc.crawler.grabber.Grabber]:239 - Crawling Bean  ::[http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 15:05:57 DEBUG - [com.wsc.crawler.grabber.Grabber]:239 - Crawling Bean  ::[http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 15:05:57 DEBUG - [com.wsc.crawler.grabber.Grabber]:257 - Number of Threads are : 7
2013-05-28 15:05:57 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.google.co.in/, http://74.125.236.223/, null]
2013-05-28 15:05:57 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 15:05:57 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 15:05:57 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 15:05:57 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 15:05:57 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 15:05:57 INFO  - [com.wsc.crawler.grabber.Grabber]:482 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 15:05:57 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: PREF][value: ID=08cda204e266b554:FF=0:TM=1369767957:LM=1369767957:S=LSW6JiFZOSQSXg7E][domain: .google.co.in][path: /][expiry: Thu May 28 15:05:57 EDT 2015]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.223"
2013-05-28 15:05:57 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: NID][value: 67=uFj3Tv8zuw-xExN2Z2ksnZAkrK6zG6rsnZWx0W2unkJ5V5TED3jg2VBnXEXFrMa3LidlcOm0DmuBJTE-S9pUlr4s0Egg82AIP39rjZ38RXVrFgqwPyqySAs-80nfEyt7][domain: .google.co.in][path: /][expiry: Wed Nov 27 14:05:57 EST 2013]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.223"
2013-05-28 15:05:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.google.co.in/).
2013-05-28 15:05:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-28 15:05:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-28 15:05:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-28 15:05:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-28 15:05:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.google.co.in/).
2013-05-28 15:05:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-28 15:05:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.google.co.in/).
2013-05-28 15:05:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-28 15:05:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-28 15:05:57 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-28 15:05:57 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 15:05:57 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 15:05:57 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 15:05:57 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 15:05:57 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 15:05:57 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <!--
  ~ JBoss, Home of Professional Open Source.
  ~ Copyright (c) 2011, Red Hat, Inc., and individual contributors
  ~ as indicated by the @author tags. See the copyright.txt file in the
  ~ distribution for a full listing of individual contributors.
  ~
  ~ This is free software; you can redistribute it and/or modify it
  ~ under the terms of the GNU Lesser General Public License as
  ~ published by the Free Software Foundation; either version 2.1 of
  ~ the License, or (at your option) any later version.
  ~
  ~ This software is distributed in the hope that it will be useful,
  ~ but WITHOUT ANY WARRANTY; without even the implied warranty of
  ~ MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
  ~ Lesser General Public License for more details.
  ~
  ~ You should have received a copy of the GNU Lesser General Public
  ~ License along with this software; if not, write to the Free
  ~ Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
  ~ 02110-1301 USA, or see the FSF site: http://www.fsf.org.
  -->
<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">

<html>
<head>
  <title>Welcome to JBoss Application Server 7</title>
  <link rel="shortcut icon" href="favicon.ico" type="image/x-icon">
  <link rel="StyleSheet" href="as7_style.css" type="text/css">
</head>

<body>
  <div class="wrapper">
    <div class="as7">
      <img src="as7_logo.png" alt="JBoss Application Server 7... it's here." width="195" height="228" border="0">
    </div>

    <div class="content">
      <h1>Welcome to AS 7</h1>

      <h3>Your JBoss Application Server 7 is running.</h3>

      <p><a href="documentation.html">Documentation</a> | <a href="https://docs.jboss.org/author/display/AS71/Quickstarts">Quickstarts</a> | <a href="/console">Administration Console</a> <br/>

      <a href="http://www.jboss.org/jbossas"><br>
      JBoss AS Project</a> | <a href="http://community.jboss.org/en/jbossas/as7_users?view=all">User
      Forum</a> | <a href=
      "https://issues.jboss.org/browse/AS7">Report an issue</a></p>

      <p class="logos"><a href="http://jboss.org"><img src="jboss_community.png" alt="JBoss and JBoss Community" width=
      "254" height="31" border="0"></a></p>

      <p class="note">To replace this page set "enable-welcome-root" to false in your server configuration and deploy
      your own war with / as its context path.</p>
    </div>
  </div>
</body>
</html>

2013-05-28 15:05:57 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-28 15:05:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://effectivemysql.com/).
2013-05-28 15:05:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-28 15:05:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-28 15:05:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-28 15:05:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-28 15:05:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://effectivemysql.com/).
2013-05-28 15:05:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-28 15:05:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://effectivemysql.com/).
2013-05-28 15:05:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-28 15:05:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 102
2013-05-28 15:05:58 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 87
2013-05-28 15:05:58 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 15:05:58 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 15:05:58 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 15:05:58 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 15:05:58 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 15:05:58 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is [... same JBoss AS 7 welcome page as above ...]
2013-05-28 15:05:58 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-28 15:05:59 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: SESS217349ac89be5468506443781e6d5f6c][value: 16f740b28e6ff99b6889a28b10ebce82][domain: .palominodb.com][path: /][expiry: Thu Jun 20 18:39:22 EDT 2013]". Illegal domain attribute "palominodb.com". Domain of origin: "173.236.53.234"
2013-05-28 15:05:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.pythian.com/).
2013-05-28 15:05:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-28 15:05:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-28 15:05:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-28 15:05:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-28 15:05:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-28 15:05:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-28 15:05:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-28 15:05:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-28 15:05:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.pythian.com/).
2013-05-28 15:05:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.pythian.com/) is 0
2013-05-28 15:05:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 132
2013-05-28 15:05:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 90
2013-05-28 15:05:59 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 15:05:59 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 15:05:59 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 15:05:59 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 15:05:59 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 15:05:59 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is [... same JBoss AS 7 welcome page as above ...]
2013-05-28 15:05:59 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-28 15:05:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.continuent.com/).
2013-05-28 15:05:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-28 15:05:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-28 15:05:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-28 15:05:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-28 15:05:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.continuent.com/).
2013-05-28 15:05:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-28 15:05:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.continuent.com/).
2013-05-28 15:05:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-28 15:05:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 45
2013-05-28 15:05:59 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 33
2013-05-28 15:05:59 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 15:05:59 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 15:05:59 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 15:05:59 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 15:05:59 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 15:05:59 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is [... same JBoss AS 7 welcome page as above ...]
2013-05-28 15:05:59 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-28 15:06:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://palominodb.com/).
2013-05-28 15:06:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-28 15:06:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 129
2013-05-28 15:06:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-28 15:06:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 129
2013-05-28 15:06:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://palominodb.com/).
2013-05-28 15:06:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-28 15:06:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://palominodb.com/).
2013-05-28 15:06:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-28 15:06:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 129
2013-05-28 15:06:00 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 92
2013-05-28 15:06:00 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 15:06:00 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 15:06:00 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 15:06:00 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 15:06:00 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 15:06:00 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is [... same JBoss AS 7 welcome page as above ...]
2013-05-28 15:06:00 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connetion after posting URLs
2013-05-28 15:06:00 DEBUG - [com.wsc.Crawler.WSCrawler]:109 - Cleaning Crawler resources.
2013-05-28 15:06:00 DEBUG - [com.wsc.Crawler.WSCrawler]:117 - Deleting Lock file
2013-05-28 15:06:00 DEBUG - [com.wsc.Crawler.WSCrawler]:119 - Lock file Deleted
2013-05-28 15:06:05 INFO  - [com.wsc.crawler.grabber.Grabber]:361 - Queue is Empty...!
2013-05-28 15:06:05 INFO  - [com.wsc.crawler.grabber.Grabber]:362 - Refilling Queue...!
2013-05-28 15:06:05 DEBUG - [com.wsc.crawler.grabber.Grabber]:365 - IsStopped =false
2013-05-28 15:06:05 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 15:06:05 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successfull.
2013-05-28 15:06:05 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 15:06:05 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 15:06:05 INFO  - [com.wsc.crawler.grabber.Grabber]:110 - Queue is Set in Grabber is 9
2013-05-28 15:06:08 DEBUG - [clientserver.AsyncHTTPServer]:291 - Request recieved for operation heartbeat
2013-05-28 15:06:08 DEBUG - [clientserver.AsyncHTTPServer]:302 - Request Served Successfully
2013-05-28 15:06:15 INFO  - [clientserver.AsyncHTTPServer]:306 - Server Recieved a Stop Request...
2013-05-28 15:06:15 INFO  - [clientserver.AsyncHTTPServer]:307 - Trying to stop crawler...
2013-05-28 15:06:15 DEBUG - [com.wsc.crawler.grabber.Grabber]:297 - Thread Count is less than numthreads In Thread :20
2013-05-28 15:06:15 INFO  - [com.wsc.crawler.grabber.Grabber]:341 - Queue is empty in for loop
2013-05-28 15:06:15 INFO  - [com.wsc.crawler.grabber.Grabber]:122 - Crawler Recieved A Stop Signal.
2013-05-28 15:06:15 INFO  - [com.wsc.crawler.grabber.Grabber]:123 - Stopping Crawler
2013-05-28 15:07:50 DEBUG - [com.wsc.crawler.init.Initializer]:80 - Intializing crawler resources...
2013-05-28 15:07:50 DEBUG - [com.wsc.crawler.init.Initializer]:131 - Creating lock file.
2013-05-28 15:07:50 INFO  - [com.wsc.crawler.init.Initializer]:136 - Lockfile crawler.lock created successfully.
2013-05-28 15:07:50 DEBUG - [com.wsc.crawler.init.Initializer]:109 - An another instance os WSCrawler is running.
2013-05-28 15:07:50 WARN  - [com.wsc.crawler.init.Initializer]:110 - Unable to start new instance of the WSCrawler
2013-05-28 15:07:50 INFO  - [com.wsc.crawler.init.Initializer]:111 - Delete crawler.running file manually.
2013-05-28 15:07:50 DEBUG - [com.wsc.crawler.init.Initializer]:184 - Deleting crawler.running file.
2013-05-28 15:07:50 DEBUG - [com.wsc.crawler.init.Initializer]:186 - crawler.running file is deleted.
2013-05-28 15:08:10 DEBUG - [com.wsc.crawler.init.Initializer]:80 - Intializing crawler resources...
2013-05-28 15:08:11 INFO  - [com.wsc.crawler.init.Initializer]:161 - crawler.running Created Successfully.
2013-05-28 15:08:11 INFO  - [com.wsc.crawler.init.Initializer]:212 - previous instance of crawler is forcebly stopped
2013-05-28 15:08:11 INFO  - [com.wsc.crawler.init.Initializer]:213 - Getting URLS from prev_frontier.xml from temp directory.
2013-05-28 15:08:11 DEBUG - [com.wsc.crawler.init.Initializer]:340 - Checking (prev_frontier.xml) existence in (./temp) directory
2013-05-28 15:08:11 WARN  - [com.wsc.crawler.init.Initializer]:349 - prev_frontier.xml is not found in ./temp
2013-05-28 15:08:11 INFO  - [com.wsc.crawler.init.Initializer]:351 - Trying to get URLs from Default URL Source, Frontier Server
2013-05-28 15:08:11 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 15:08:11 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-28 15:08:11 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 15:08:11 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
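The LocalClient lines above do two checks: a host-level "ping" and an HTTP-level reachability probe of the frontier URL. A dependency-free sketch, assuming illustrative method names and timeouts:

```java
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.InetAddress;
import java.net.URL;

// Sketch of the two LocalClient checks: InetAddress.isReachable for the
// host ping, then an HTTP connect against the frontier URL. Timeout values
// are illustrative, not taken from the real LocalClient.
public class LocalClientSketch {
    /** Host-level ping (ICMP where permitted, TCP fallback otherwise). */
    public static boolean ping(String host) throws IOException {
        return InetAddress.getByName(host).isReachable(2000);
    }

    /** HTTP-level probe: can we open a connection to the frontier URL? */
    public static boolean isHttpReachable(String url) {
        try {
            HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
            conn.setConnectTimeout(2000);
            conn.connect();
            conn.disconnect();
            return true;
        } catch (IOException e) {
            return false;
        }
    }
}
```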
2013-05-28 15:08:11 DEBUG - [com.wsc.crawler.init.Initializer]:317 - Frontier Server returned a Frontier of size 9
2013-05-28 15:08:12 INFO  - [com.wsc.crawler.grabber.Grabber]:208 - Number of resolved hosts are :7
2013-05-28 15:08:12 INFO  - [com.wsc.crawler.grabber.Grabber]:210 - Number of Unresolved hosts are :2
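The resolved/unresolved counts above (7 of the 9 frontier URLs resolved) come from DNS lookups on each URL's host; the Crawling Bean lines that follow pair each URL with its resolved IP. A sketch of that partitioning step, with illustrative names:

```java
import java.net.InetAddress;
import java.net.UnknownHostException;
import java.util.ArrayList;
import java.util.List;

// Sketch of the host-resolution step behind "Number of resolved hosts
// are :7 / Number of Unresolved hosts are :2": look up each host and
// partition by lookup success. Class and field names are assumptions.
public class HostResolver {
    public static final List<String> resolved = new ArrayList<>();
    public static final List<String> unresolved = new ArrayList<>();

    public static void resolveAll(List<String> hosts) {
        for (String host : hosts) {
            try {
                InetAddress.getByName(host);   // triggers a DNS lookup
                resolved.add(host);
            } catch (UnknownHostException e) {
                unresolved.add(host);          // e.g. a typo'd or dead hostname
            }
        }
    }
}
```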
2013-05-28 15:08:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:244 - Crawling Bean  ::[http://www.google.co.in/, http://74.125.236.215/, null]
2013-05-28 15:08:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:244 - Crawling Bean  ::[http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 15:08:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:244 - Crawling Bean  ::[http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 15:08:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:244 - Crawling Bean  ::[http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 15:08:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:244 - Crawling Bean  ::[http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 15:08:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:244 - Crawling Bean  ::[http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 15:08:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:244 - Crawling Bean  ::[http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
2013-05-28 15:08:12 DEBUG - [com.wsc.crawler.grabber.Grabber]:262 - Number of Threads are : 7
2013-05-28 15:08:12 INFO  - [com.wsc.crawler.grabber.Grabber]:490 -  - about to get something from [http://www.google.co.in/, http://74.125.236.215/, null]
2013-05-28 15:08:12 INFO  - [com.wsc.crawler.grabber.Grabber]:490 -  - about to get something from [http://www.pythian.com/, http://50.56.240.252/, null]
2013-05-28 15:08:12 INFO  - [com.wsc.crawler.grabber.Grabber]:490 -  - about to get something from [http://www.bluegecko.net/gfgggg+tyhrt+ht/, http://199.195.218.39/gfgggg+tyhrt+ht/, null]
2013-05-28 15:08:12 INFO  - [com.wsc.crawler.grabber.Grabber]:490 -  - about to get something from [http://palominodb.com/, http://173.236.53.234/, null]
2013-05-28 15:08:12 INFO  - [com.wsc.crawler.grabber.Grabber]:490 -  - about to get something from [http://effectivemysql.com/, http://174.37.245.168/, null]
2013-05-28 15:08:12 INFO  - [com.wsc.crawler.grabber.Grabber]:490 -  - about to get something from [http://www.continuent.com/, http://50.16.211.10/, null]
2013-05-28 15:08:12 INFO  - [com.wsc.crawler.grabber.Grabber]:490 -  - about to get something from [http://www.ngs.noaa.gov/PUBS_LIB/FundSPCSys.pdf, http://205.156.4.133/PUBS_LIB/FundSPCSys.pdf, null]
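The seven "about to get something" lines show one worker per crawl bean draining a shared queue. The real Grabber manages raw threads and restarts dead ones (see the "Thread 7 is dead, restarting it" entries earlier in this log); the executor-based sketch below simplifies that, and the CrawlBean shape and fetch() body are placeholders:

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// Simplified fan-out: one task per crawl bean pulled off a shared queue.
// CrawlBean and fetch() are illustrative stand-ins for the real Grabber's
// thread management and HTTP fetch + HyperLinkParser pipeline.
public class GrabberSketch {
    public static class CrawlBean {
        final String url;
        final String ip;
        public CrawlBean(String url, String ip) { this.url = url; this.ip = ip; }
    }

    /** Drains the queue into a fixed pool; returns how many beans were submitted. */
    public static int crawlAll(BlockingQueue<CrawlBean> queue, int numThreads)
            throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(numThreads);
        int submitted = 0;
        CrawlBean bean;
        while ((bean = queue.poll()) != null) {
            final CrawlBean b = bean;
            pool.submit(() -> fetch(b));   // "about to get something from ..."
            submitted++;
        }
        pool.shutdown();
        pool.awaitTermination(30, TimeUnit.SECONDS);
        return submitted;
    }

    static void fetch(CrawlBean b) {
        // placeholder for the HTTP GET and link-extraction work
    }
}
```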
2013-05-28 15:08:12 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: PREF][value: ID=bfe83c9fa4bb3a8b:FF=0:TM=1369768092:LM=1369768092:S=eHrxZeSBG0X5Gycg][domain: .google.co.in][path: /][expiry: Thu May 28 15:08:12 EDT 2015]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.215"
2013-05-28 15:08:12 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: NID][value: 67=ffb0mwv6hJw7aG1yIk_nVd5B1DZOISrhs5m9G2kVyV5mVmw3InDuziXa7tY5gd260cQcSAl5lrFiflybrfdjDQhY-g3nvk-V-QZcNuu-5gPRkWVSW2jSgT1pzzEKlbPm][domain: .google.co.in][path: /][expiry: Wed Nov 27 14:08:12 EST 2013]". Illegal domain attribute "google.co.in". Domain of origin: "74.125.236.215"
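The two cookie rejections above are expected: the request was sent to the raw IP 74.125.236.215, so the cookie's `Domain=.google.co.in` attribute cannot domain-match the origin host, and HttpClient drops it. A minimal version of that check (simplified, ignoring public-suffix rules):

```java
// Simplified RFC 6265-style cookie domain matching, to show why a cookie
// scoped to .google.co.in is rejected when the origin is an IP literal.
// This is an illustrative sketch, not HttpClient's actual implementation.
public class CookieDomainMatch {
    public static boolean domainMatches(String cookieDomain, String originHost) {
        String d = cookieDomain.startsWith(".") ? cookieDomain.substring(1) : cookieDomain;
        return originHost.equals(d) || originHost.endsWith("." + d);
    }
}
```

Fetching by IP (as the crawler does, using its pre-resolved addresses) therefore loses any cookies the site tries to set for its hostname.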
2013-05-28 15:08:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.google.co.in/).
2013-05-28 15:08:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-28 15:08:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-28 15:08:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.google.co.in/).
2013-05-28 15:08:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.google.co.in/) is 28
2013-05-28 15:08:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.google.co.in/).
2013-05-28 15:08:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.google.co.in/) is 0
2013-05-28 15:08:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.google.co.in/).
2013-05-28 15:08:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed frame Source Links size in (http://www.google.co.in/) is 0
2013-05-28 15:08:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
2013-05-28 15:08:12 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.google.co.in/) is 28
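The "Before/After duplicate Elimination" counts above suggest a simple order-preserving de-dup over the combined list of parsed links. A one-liner sketch of that step (names are illustrative, not the real parser's):

```java
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;

// Order-preserving duplicate elimination: a LinkedHashSet keeps the first
// occurrence of each link in parse order. Illustrative, not the real code.
public class LinkDedup {
    public static List<String> eliminateDuplicates(List<String> links) {
        return new ArrayList<>(new LinkedHashSet<>(links));
    }
}
```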
2013-05-28 15:08:12 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 15:08:12 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 15:08:12 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 15:08:12 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 15:08:12 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 15:08:12 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is <!--
  ~ JBoss, Home of Professional Open Source.
  ~ Copyright (c) 2011, Red Hat, Inc., and individual contributors
  ~ as indicated by the @author tags. See the copyright.txt file in the
  ~ distribution for a full listing of individual contributors.
  ~
  ~ This is free software; you can redistribute it and/or modify it
  ~ under the terms of the GNU Lesser General Public License as
  ~ published by the Free Software Foundation; either version 2.1 of
  ~ the License, or (at your option) any later version.
  ~
  ~ This software is distributed in the hope that it will be useful,
  ~ but WITHOUT ANY WARRANTY; without even the implied warranty of
  ~ MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
  ~ Lesser General Public License for more details.
  ~
  ~ You should have received a copy of the GNU Lesser General Public
  ~ License along with this software; if not, write to the Free
  ~ Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
  ~ 02110-1301 USA, or see the FSF site: http://www.fsf.org.
  -->
<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">

<html>
<head>
  <title>Welcome to JBoss Application Server 7</title>
  <link rel="shortcut icon" href="favicon.ico" type="image/x-icon">
  <link rel="StyleSheet" href="as7_style.css" type="text/css">
</head>

<body>
  <div class="wrapper">
    <div class="as7">
      <img src="as7_logo.png" alt="JBoss Application Server 7... it's here." width="195" height="228" border="0">
    </div>

    <div class="content">
      <h1>Welcome to AS 7</h1>

      <h3>Your JBoss Application Server 7 is running.</h3>

      <p><a href="documentation.html">Documentation</a> | <a href="https://docs.jboss.org/author/display/AS71/Quickstarts">Quickstarts</a> | <a href="/console">Administration Console</a> <br/>

      <a href="http://www.jboss.org/jbossas"><br>
      JBoss AS Project</a> | <a href="http://community.jboss.org/en/jbossas/as7_users?view=all">User
      Forum</a> | <a href=
      "https://issues.jboss.org/browse/AS7">Report an issue</a></p>

      <p class="logos"><a href="http://jboss.org"><img src="jboss_community.png" alt="JBoss and JBoss Community" width=
      "254" height="31" border="0"></a></p>

      <p class="note">To replace this page set "enable-welcome-root" to false in your server configuration and deploy
      your own war with / as its context path.</p>
    </div>
  </div>
</body>
</html>

2013-05-28 15:08:12 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
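The postXML() sequence above sends the constructed XML to the frontier at `/?operation=urlsposted`, logs the status code and body, then releases the connection. A sketch of that POST, using the JDK's HttpURLConnection instead of the Apache HttpClient the crawler actually uses, purely to keep the example dependency-free:

```java
import java.io.IOException;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

// Sketch of postXML(): POST the URL list as XML to the frontier server and
// return the response status. Endpoint and content type are assumptions
// based on the log; the real code uses Apache HttpClient.
public class XmlPoster {
    public static int postXml(String endpoint, String xml) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) new URL(endpoint).openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "text/xml; charset=UTF-8");
        try (OutputStream out = conn.getOutputStream()) {
            out.write(xml.getBytes(StandardCharsets.UTF_8));
        }
        int status = conn.getResponseCode();   // e.g. the 200 logged above
        conn.disconnect();                     // "Releasing Post connection"
        return status;
    }
}
```

Note that the frontier's 200 body here is just the JBoss AS 7 welcome page, i.e. the POST hits the server root rather than a deployed handler that acknowledges the URLs; that is worth verifying against the frontier's deployment.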
2013-05-28 15:08:13 WARN  - [org.apache.http.client.protocol.ResponseProcessCookies]:127 - Cookie rejected: "[version: 0][name: SESS217349ac89be5468506443781e6d5f6c][value: e42bfe7010528ce48410a0e0b65a522b][domain: .palominodb.com][path: /][expiry: Thu Jun 20 18:41:37 EDT 2013]". Illegal domain attribute "palominodb.com". Domain of origin: "173.236.53.234"
2013-05-28 15:08:13 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://effectivemysql.com/).
2013-05-28 15:08:13 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-28 15:08:13 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-28 15:08:13 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://effectivemysql.com/).
2013-05-28 15:08:13 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://effectivemysql.com/) is 102
2013-05-28 15:08:13 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://effectivemysql.com/).
2013-05-28 15:08:13 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://effectivemysql.com/) is 0
2013-05-28 15:08:13 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://effectivemysql.com/).
2013-05-28 15:08:13 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed frame Source Links size in (http://effectivemysql.com/) is 0
2013-05-28 15:08:13 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 102
2013-05-28 15:08:13 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://effectivemysql.com/) is 87
2013-05-28 15:08:13 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 15:08:13 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 15:08:13 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 15:08:13 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 15:08:13 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 15:08:13 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is the JBoss AS 7 welcome page (identical to the response body logged at 15:08:12 above)
2013-05-28 15:08:13 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-28 15:08:14 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.pythian.com/).
2013-05-28 15:08:14 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-28 15:08:14 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-28 15:08:14 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.pythian.com/).
2013-05-28 15:08:14 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.pythian.com/) is 131
2013-05-28 15:08:14 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-28 15:08:14 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-28 15:08:14 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.pythian.com/).
2013-05-28 15:08:14 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.pythian.com/) is 1
2013-05-28 15:08:14 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.pythian.com/).
2013-05-28 15:08:14 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed frame Source Links size in (http://www.pythian.com/) is 0
2013-05-28 15:08:14 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 132
2013-05-28 15:08:14 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.pythian.com/) is 90
2013-05-28 15:08:14 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 15:08:14 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 15:08:14 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 15:08:14 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 15:08:14 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 15:08:14 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is the JBoss AS 7 welcome page (identical to the response body logged at 15:08:12 above)
2013-05-28 15:08:14 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-28 15:08:14 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://www.continuent.com/).
2013-05-28 15:08:14 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-28 15:08:14 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-28 15:08:14 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://www.continuent.com/).
2013-05-28 15:08:14 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://www.continuent.com/) is 45
2013-05-28 15:08:14 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://www.continuent.com/).
2013-05-28 15:08:14 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://www.continuent.com/) is 0
2013-05-28 15:08:14 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://www.continuent.com/).
2013-05-28 15:08:14 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed frame Source Links size in (http://www.continuent.com/) is 0
2013-05-28 15:08:14 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 45
2013-05-28 15:08:14 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://www.continuent.com/) is 33
2013-05-28 15:08:14 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 15:08:14 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 15:08:14 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 15:08:14 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 15:08:14 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 15:08:14 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is the JBoss AS 7 welcome page (identical to the response body logged at 15:08:12 above)
2013-05-28 15:08:14 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-28 15:08:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:101 - All hyperLinks are adding to one list from (http://palominodb.com/).
2013-05-28 15:08:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-28 15:08:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 129
2013-05-28 15:08:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:43 - Parsing HyperLinks in (http://palominodb.com/).
2013-05-28 15:08:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:56 - Parsed hyperLinks size in (http://palominodb.com/) is 129
2013-05-28 15:08:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:61 - Parsing IframeLinks in (http://palominodb.com/).
2013-05-28 15:08:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:74 - Parsed IframeLinks size in (http://palominodb.com/) is 0
2013-05-28 15:08:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:81 - Parsing frame Source Links in (http://palominodb.com/).
2013-05-28 15:08:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:95 - Parsed frame Source Links size in (http://palominodb.com/) is 0
2013-05-28 15:08:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:108 - Before duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 129
2013-05-28 15:08:15 DEBUG - [com.wsc.crawler.grabber.HyperLinkParser]:114 - After duplicate Elimination size of hyperLinks found in (http://palominodb.com/) is 92
2013-05-28 15:08:15 INFO  - [com.wsc.postcrawling.ConstructXML]:35 - Constructing Xml file to post to frontier
2013-05-28 15:08:15 DEBUG - [com.wsc.postcrawling.ConstructXML]:79 - XML file Construction is completed.
2013-05-28 15:08:15 DEBUG - [com.wsc.Crawler.WSCrawler]:53 - postXML() is called.
2013-05-28 15:08:15 DEBUG - [com.wsc.Crawler.WSCrawler]:60 - URI to post URLs parsed from html to frontier Server is :http://127.0.0.1:8080/?operation=urlsposted
2013-05-28 15:08:15 DEBUG - [com.wsc.Crawler.WSCrawler]:88 - Response status code from frontier after POST URLs as XML is: 200
2013-05-28 15:08:15 DEBUG - [com.wsc.Crawler.WSCrawler]:89 - Response body from frontier after POST URLs as XML is the JBoss AS 7 welcome page (identical to the response body logged at 15:08:12 above)
2013-05-28 15:08:15 DEBUG - [com.wsc.Crawler.WSCrawler]:94 - Releasing Post connection after posting URLs
2013-05-28 15:08:15 DEBUG - [com.wsc.Crawler.WSCrawler]:109 - Cleaning Crawler resources.
2013-05-28 15:08:15 DEBUG - [com.wsc.Crawler.WSCrawler]:117 - Deleting Lock file
2013-05-28 15:08:15 DEBUG - [com.wsc.Crawler.WSCrawler]:119 - Lock file Deleted
2013-05-28 15:08:20 INFO  - [com.wsc.crawler.grabber.Grabber]:366 - Queue is Empty...!
2013-05-28 15:08:20 INFO  - [com.wsc.crawler.grabber.Grabber]:367 - Refilling Queue...!
2013-05-28 15:08:20 DEBUG - [com.wsc.crawler.grabber.Grabber]:370 - IsStopped =false
2013-05-28 15:08:20 INFO  - [com.wsc.crawler.grabber.LocalClient]:33 - pinging the host 127.0.0.1.
2013-05-28 15:08:20 INFO  - [com.wsc.crawler.grabber.LocalClient]:35 - pinging the host 127.0.0.1 is successful.
2013-05-28 15:08:20 INFO  - [com.wsc.crawler.grabber.LocalClient]:37 - Checking whether http://127.0.0.1:8080 is reachable.
2013-05-28 15:08:20 INFO  - [com.wsc.crawler.grabber.LocalClient]:39 - the host http://127.0.0.1:8080 is reachable.
2013-05-28 15:08:20 INFO  - [com.wsc.crawler.grabber.Grabber]:112 - Queue is Set in Grabber is 9
