
Web Sorrow Tool/scanner



I started a project on Google Code. I'd love your feedback! Contribute if you wish.

http://code.google.com/p/web-sorrow/

A Perl-based tool for checking a web server for misconfigurations, version detection, enumeration, and server information. I will add more functionality in the future. What it's NOT: a vulnerability scanner, inspection proxy, DDoS tool, or exploitation framework. It's entirely focused on enumeration and collecting info on the target server.

CURRENT functionality:

-S - stands for standard. Runs a set of standard tests: directory indexing checks, banner grabbing, language detection, robots.txt retrieval, 200-response testing (some servers send a 200 OK for every request), and Thumbs.db scanning

-auth - looks for login pages and admin consoles using a list of the most common login files and directories. The list doesn't need to be very big, because what else are you going to name it? notAlogin.php?

-Cp - scans with a huge list of plugin directories. The Drupal and WordPress plugin databases are now current, but sorry, Joomla's is still a bit old

-I - searches the responses for interesting strings

-Ws - looks for web services such as the hosting provider and blogging services, plus favicon fingerprinting and CMS version info

-Fd - looks for things people generally don't want you to see. The list is generated from a TON of robots.txt files, so whatever it finds should be interesting.

-ninja - a lightweight, stealthy scan that uses bits and pieces from the other scans

-R - uses HTTP Range headers to make scans faster

-Shadow - uses the Google cache instead of requesting pages from the target host

-Sd - brute-forces subdomains

-Db - brute-forces directories with the big DirBuster database

-ua - sets a custom User-Agent. PUT THE UA IN QUOTES if it contains spaces

-proxy - sends all HTTP requests through a proxy, for example: 255.255.255.254:8080

-e - runs all of the scans in the tool

Web-Sorrow also has false-positive checking on most of its requests (it's pretty accurate, but not perfect).
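For reference, a typical run looks something like this (just a sketch; the script name and the -host flag are from memory, so check the README, and example.com is only a placeholder):

perl Wsorrow.pl -host example.com -S -Ws -I

That would run the standard tests, the web-services checks, and the interesting-strings search against the target.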

Edited by flyingpoptartcat

Just wondering if you would be able to upload a video demonstrating how it works.



I'd love to. I'm constantly adding features and fixing bugs, so stay up to date! v1.2 is coming soon.


Whenever I try to edit the footage of the tool, it comes out looking like cr4p, and it's like 200 MB. Do I need a special encoder, or am I missing a setting?

  • 2 months later...

The major thing that gives away a web scan is that they don't really throttle the scan. I drop most web scans just by putting a SYN rate limit in my iptables rules; it works against port scans and SYN floods too. I would look into HTTP/1.1 Keep-Alive, as it lets you keep the same connection open to check multiple pages. That would stop you from getting picked up by the same rules that keep other easily avoidable attacks left out in the cold.
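For what it's worth, Keep-Alive is close to a one-line change in Perl. A minimal sketch, assuming LWP::UserAgent (not necessarily what Web-Sorrow actually does); the keep_alive option makes the client cache and reuse one TCP connection across requests:

use LWP::UserAgent;

# keep_alive => 1 enables connection caching, so every request below
# rides on the same TCP connection instead of opening a new one
my $ua = LWP::UserAgent->new( keep_alive => 1, agent => 'Mozilla/5.0' );

for my $path ('/robots.txt', '/login.php', '/admin/') {
    my $res = $ua->get("http://example.com$path");   # example.com is a placeholder target
    printf "%s %s\n", $res->code, $path;
}

With that, the whole scan rides on a single connection, so a SYN rate limit only ever sees one handshake.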



If you want stealth, try -ninja.

Also, Web-Sorrow uses connection caching, aka everything on one socket. Don't forget to update; I posted this thread a while ago.



I hadn't tried it out; I was just suggesting based on how I detect scans. The fact that yours does that puts it above most. I see scans from w00tw00t all the time, to the point where I started blocking the IPs that scan my server for vulnerabilities in iptables, and made taunting 404 messages. :P



Well, thank you. BTW, custom 404 pages are always fun to see.



So I got to rethinking how I block web scanners and had an idea for putting some code in my custom 404 page. Here is the rundown of my idea.

http://pastebin.com/Nf2YyAGe

It blocks any IP that visits those pages (if they don't exist, that is). I'm going to try your tool against that idea and let you know the results. I'm not sure whether PHP code will run on a HEAD request, but if it doesn't, that could be one way to bypass my idea.

Edit: The custom 404/403 method does work against your tool. This includes ninja mode.
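For anyone who wants to test the HEAD question themselves, here is a minimal sketch, assuming LWP::UserAgent; the trapped path is made up. If the 404 handler fires on the HEAD, the follow-up GET should already come back blocked:

use LWP::UserAgent;

my $ua   = LWP::UserAgent->new;
my $base = 'http://example.com';                      # placeholder target
my $trap = "$base/this-should-not-exist.php";         # hypothetical trapped path

my $head = $ua->head($trap);                          # does the 404 handler run on HEAD?
my $get  = $ua->get("$base/");                        # if it did, this GET may now be refused

printf "HEAD %s -> %s\n", $trap, $head->status_line;
printf "GET  %s -> %s\n", "$base/", $get->status_line;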

Edited by bwall


Neat idea! If you want a good list of things to block, go to Web-Sorrow_v(version number)/DB/small-tests.db and open it in a text editor. In Web-Sorrow, -ninja does NOT make the other scans stealthy; it is itself a scan that uses very few requests. BTW, I've just updated Web-Sorrow to v1.3.7.



Hey,

I tried using -R, and it's still only sending GETs, not HEADs. I'm not sure whether the command-line argument is being picked up or not.


OK, I figured -R would send HEAD requests, as that's a good way to check whether a file exists. I see what you are doing with the Range header though; that is pretty neat. That way only bytes 0 through 1 get returned. It does still process the whole PHP page, though, meaning the 404/403 script will block it. Would it be difficult to add a flag to send HEADs instead of GETs?

http://www.w3.org/Protocols/rfc2616/rfc2616-sec9.html (section 9.4)
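For anyone following along, the Range trick looks roughly like this from Perl; a minimal sketch assuming LWP::UserAgent, not Web-Sorrow's actual code:

use LWP::UserAgent;

my $ua  = LWP::UserAgent->new;
# ask for only the first two bytes; a server that honors the Range header
# answers 206 Partial Content, which is enough to confirm the file exists
my $res = $ua->get('http://example.com/robots.txt', 'Range' => 'bytes=0-1');

printf "%s (%d body bytes)\n", $res->status_line, length($res->content);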



Well, I would have to make a subroutine for making requests and check whether the flag is set every time a request is made, so yes, it would be hard. Also, you might want to use a regex when checking the URL, to avoid evasions like ././././././passwords.txt?tgvfjhgvjhg=rgtrf444
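Roughly the shape of that subroutine, as a sketch only (the flag name here is made up, and this isn't Web-Sorrow's actual code):

# one place every scan would call instead of hitting $ua->get() directly;
# $opts{head} stands in for a hypothetical "send HEADs" command-line flag
sub make_request {
    my ($ua, $url, %opts) = @_;
    return $opts{head} ? $ua->head($url) : $ua->get($url);
}

# usage: my $res = make_request($ua, "http://example.com/admin/", head => 1);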



Good idea. I was planning on adding to the methods I use to sanitize URLs before checking them. Right now it removes an extra / at the beginning, but I think I'm going to loop that and change how the values are stored. I'll post the next version.

I also want to change everything over to a startsWith check, so excess stuff like GET arguments will be ignored.
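The same idea sketched in Perl, just to illustrate; the real script is the PHP one on Pastebin, and the example path is made up:

sub normalize_path {
    my ($path) = @_;
    $path =~ s/\?.*$//;            # drop GET arguments before matching
    $path =~ s{(\./)+}{}g;         # collapse ././ evasion attempts
    1 while $path =~ s{^//}{/};    # loop: strip repeated leading slashes
    return $path;
}

# "/././admin/login.php?x=1" and "//admin/login.php" both come out as "/admin/login.php"
print normalize_path('/././admin/login.php?x=1'), "\n";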



Double posting, sorry.

So I did some work with netcat and Apache.

Apache seems to automatically remove the "./", but I'll keep that functionality in for other web servers. Also, HEADs do run PHP code, so using a HEAD will not avoid detection.

Here is my current version.

http://pastebin.com/h7SPzftp

I'm going to add more locations to it later.

