
blackwidow's Introduction


ABOUT:

BlackWidow is a Python-based web application spider that gathers subdomains, URLs, dynamic parameters, email addresses, and phone numbers from a target website. The project also includes the Inject-X fuzzer, which scans dynamic URLs for common OWASP vulnerabilities.

DEMO VIDEO:

BlackWidow Demo

FEATURES:

  • Automatically collect all URLs from a target website
  • Automatically collect all dynamic URLs and parameters from a target website
  • Automatically collect all subdomains from a target website
  • Automatically collect all phone numbers from a target website
  • Automatically collect all email addresses from a target website
  • Automatically collect all form URLs from a target website
  • Automatically scan/fuzz for common OWASP Top 10 vulnerabilities
  • Automatically save all data into sorted text files

LINUX INSTALL:

sudo bash install.sh

USAGE:

blackwidow -u https://target.com - crawl target.com with 3 levels of depth.
blackwidow -d target.com -l 5 -v y - crawl the domain target.com with 5 levels of depth and verbose logging enabled.
blackwidow -d target.com -l 5 -c 'test=test' - crawl the domain target.com with 5 levels of depth using the cookie 'test=test'.
blackwidow -d target.com -l 5 -s y -v y - crawl the domain target.com with 5 levels of depth, fuzz all unique parameters for OWASP vulnerabilities, with verbose logging enabled.
injectx.py -u 'https://test.com/users.php?user=1&admin=true' -v y - fuzz all GET parameters for common OWASP vulnerabilities with verbose logging enabled.

SAMPLE REPORT:

(screenshot: sample BlackWidow report)

DOCKER:

git clone https://github.com/1N3/BlackWidow.git
cd BlackWidow
docker build -t blackwidow .
docker run -it blackwidow # Defaults to --help

LICENSE:

You may modify and redistribute this software as long as the project name "BlackWidow", the credit to the author "xer0dayz", and the website URL "https://sn1persecurity.com" are NOT modified. Doing so breaks the license agreement, and a takedown notice will be issued.

DISCLAIMER:

This program is intended for educational and ethical purposes only. I take no responsibility for any damage caused by using this program. By downloading and using this software, you agree that you take full responsibility for any damages and liability.

blackwidow's People

Contributors

1n3, delirious-lettuce, ifly53e, khast3x, rdxr10, tdebatty, xer0dayz, zishanadthandar


blackwidow's Issues

add port?

Does it make sense to add the port for non standard http? For example:

parser.add_option('-p', '--port',
                  action="store", dest="port",
                  help="Port for the URL", default="80")

port = str(options.port)

if len(str(domain)) > 4:
    target = "http://" + domain + ":" + port

etc...

README example, how to get to result files?

I ran the example on Debian 11 (Bullseye):

git clone https://github.com/1N3/BlackWidow.git
cd BlackWidow
docker build -t blackwidow .
docker run -it blackwidow -h htttps://egbert.net/

Seems to build and run fine. Got a bunch of analyzed reports made.

And got the following ending:

__________________________________________________________________________________________________

[+] Loot Saved To: 
/usr/share/blackwidow/egbert.net_80/
__________________________________________________________________________________________________

Then back in the main shell:

~/work/github# cd /usr/share/blackwidow/egbert.net_80
-bash: cd: /usr/share/blackwidow/egbert.net_80: No such file or directory

But there is no way to access the reports. Did I misread the README?
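One way to get at the reports is to keep the loot directory on the host rather than inside the container. This is a sketch, not from the README; the container path /usr/share/blackwidow is taken from the output above, and the container name `bw` is our own:

```shell
# Bind-mount a host directory over the container's loot path so the
# reports survive after the container exits.
mkdir -p ./loot
docker run -it -v "$(pwd)/loot:/usr/share/blackwidow" blackwidow -u https://example.com/

# Alternatively, copy the reports out of a finished, named container.
docker run --name bw blackwidow -u https://example.com/
docker cp bw:/usr/share/blackwidow ./loot
```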

Python error on -d option

When running blackwidow with the -d option, a python error very similar to #10 occurs. While that issue used the -u option, this issue concerns the -d option.

All generated loot files are empty.

blackwidow -d hackerone.com


                _.._
              .'    '.
             /   __   \ 
          ,  |   ><   |  ,
         . \  \      /  / .
          \_'--`(  )'--'_/
            .--'/()'--.
     1N3   /  /` '' `\  \ 
             |        |
              \      /


 + -- --=[https://crowdshield.com
 + -- --=[blackwidow v1.0

Failed to parse: hackerone.com:80:80
local variable 'domain' referenced before assignment


                _.._
              .'    '.
             /   __   \ 
          ,  |   ><   |  ,
         . \  \      /  / .
          \_'--`(  )'--'_/
            .--'/()'--.
     1N3   /  /` '' `\  \ 
             |        |
              \      /


 + -- --=[https://crowdshield.com
 + -- --=[blackwidow v1.0

[+] URL's Discovered: 
/usr/share/blackwidow/hackerone.com_80/hackerone.com_80-urls-sorted.txt
__________________________________________________________________________________________________

[+] Dynamic URL's Discovered: 
/usr/share/blackwidow/hackerone.com_80/hackerone.com_80-dynamic-sorted.txt
__________________________________________________________________________________________________

[+] Form URL's Discovered: 
/usr/share/blackwidow/hackerone.com_80/hackerone.com_80-forms-sorted.txt
__________________________________________________________________________________________________

[+] Unique Dynamic Parameters Discovered: 
/usr/share/blackwidow/hackerone.com_80/hackerone.com_80-dynamic-unique.txt
__________________________________________________________________________________________________

[+] Sub-domains Discovered: 
/usr/share/blackwidow/hackerone.com_80/hackerone.com_80-subdomains-sorted.txt
__________________________________________________________________________________________________

[+] Emails Discovered: 
/usr/share/blackwidow/hackerone.com_80/hackerone.com_80-emails-sorted.txt
__________________________________________________________________________________________________

[+] Phones Discovered: 
/usr/share/blackwidow/hackerone.com_80/hackerone.com_80-phones-sorted.txt
__________________________________________________________________________________________________

[+] Loot Saved To: 
/usr/share/blackwidow/hackerone.com_80/
__________________________________________________________________________________________________

 HACK THE PLANET!!!!!
**************************************************************************************************
If you haven't already, please donate to this project using the addresses below.
This will help facilitate improved features and ongoing support.

[+] BTC 1Fav36btfmdrYpCAR65XjKHhxuJJwFyKum
[+] ETH 0x20bB09273702eaBDFbEE9809473Fd04b969a794d
[+] LTC LQ6mPewec3xeLBYMdRP4yzeta6b9urqs2f
[+] XMR 4JUdGzvrMFDWrUUwY3toJATSeNwjn54LkCnKBPRzDuhzi5vSepHfUckJNxRL2gjkNrSqtCoRUrEDAgRwsQvVCjZbS3EN24xprAQ1Z5Sy5s
[+] ZCASH t1fsizsk2cqqJAjRoUmXJSyoVa9utYucXt7

1N3@CrowdShield
https://crowdshield.com
**************************************************************************************************
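The "Failed to parse: hackerone.com:80:80" line above suggests the port gets appended to a value that already carries one. A defensive parse along these lines would avoid the double port (a sketch; `normalize_target` is an illustrative name, not a function in BlackWidow):

```python
from urllib.parse import urlparse

def normalize_target(value, default_port=80):
    """Accept 'host', 'host:port', or a full URL and return (host, port),
    so a port is never appended twice."""
    if "://" not in value:
        value = "http://" + value  # urlparse needs a scheme to split host:port
    parsed = urlparse(value)
    return parsed.hostname, parsed.port or default_port
```

With this, `normalize_target("hackerone.com")` and `normalize_target("hackerone.com:80")` both yield `("hackerone.com", 80)`.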

"blackwidow -d target.com -l 5 -s y - crawl" Command not working

Dear author,

I just started using your tool to crawl a site and fuzz it for OWASP vulnerabilities (which your tool can do with the command I mentioned in the issue title):

blackwidow -d target.com -l 5 -s y

The .txt files created by blackwidow contained nothing. Can you please check whether it is working fine on your end?

errors when running in vm

(screenshot: Kali Linux 2018.1 running in Oracle VM VirtualBox)

Linux kali 4.15.0-kali3-amd64 #1 SMP Debian 4.15.17-1kali1 (2018-04-25) x86_64 GNU/Linux

cp blackwidow /usr/bin/blackwidow
cp injectx.py /usr/bin/injectx.py
pip install -r requirements.txt

All three of the above completed without error. Any ideas?

I did something wrong, I know...

I cloned the repo with git clone, ran pip install -r requirements.txt, and scanned a website (something like blackwidow website.com -d). Everything was good until I went to open the /usr/share/blackwidow/80 folder: there were no results. I got upset and ran sudo rm -rf blackwidow from /usr/share/, and now every time I remove and re-clone this tool, it never makes the folder. I thought it would create a new one... What is the problem?!??

python blackwidow --help does not work

VMbox - Kali linux

When I type python3 blackwidow, this pops up:

File "blackwidow", line 45
print ""
^
SyntaxError: Missing parentheses in call to 'print'. Did you mean print("")?

How do I fix this?

Also, when I run ./install.sh and/or sudo ./install.sh, I get this error:

./install.sh: line 33: pip: command not found

What is the fix for that?

RequestsDependencyWarning: urllib3 (1.22) or chardet (2.3.0)

I installed all the requirements but I am facing this warning:

/home/darklord/.local/lib/python2.7/site-packages/requests/__init__.py:80: RequestsDependencyWarning: urllib3 (1.22) or chardet (2.3.0) doesn't match a supported version!
  RequestsDependencyWarning)

onion url

Does this tool work with onion URLs, and if so, what would the command look like?

local variable 'domain' referenced before assignment

blackwidow -u https://hackerone.com

            _.._
          .'    '.
         /   __   \ 
      ,  |   ><   |  ,
     . \  \      /  / .
      \_'--`(  )'--'_/
        .--'/()'--.
 1N3   /  /` '' `\  \ 
         |        |
          \      /
 + -- --=[https://crowdshield.com
 + -- --=[blackwidow v1.0

[Errno 0] Error
local variable 'domain' referenced before assignment

            _.._
          .'    '.
         /   __   \ 
      ,  |   ><   |  ,
     . \  \      /  / .
      \_'--`(  )'--'_/
        .--'/()'--.
 1N3   /  /` '' `\  \ 
         |        |
          \      /
 + -- --=[https://crowdshield.com
 + -- --=[blackwidow v1.0

[+] URL's Discovered:
/usr/share/blackwidow/hackerone.com/hackerone.com-urls-sorted.txt


[+] Dynamic URL's Discovered:
/usr/share/blackwidow/hackerone.com/hackerone.com-dynamic-sorted.txt


[+] Form URL's Discovered:
/usr/share/blackwidow/hackerone.com/hackerone.com-forms-sorted.txt


[+] Unique Dynamic Parameters Discovered:
/usr/share/blackwidow/hackerone.com/hackerone.com-dynamic-unique.txt


[+] Sub-domains Discovered:
/usr/share/blackwidow/hackerone.com/hackerone.com-subdomains-sorted.txt


[+] Emails Discovered:
/usr/share/blackwidow/hackerone.com/hackerone.com-emails-sorted.txt


[+] Phones Discovered:
/usr/share/blackwidow/hackerone.com/hackerone.com-phones-sorted.txt


[+] Loot Saved To:
/usr/share/blackwidow/hackerone.com/


HACK THE PLANET!!!!!


If you haven't already, please donate to this project using the addresses below.
This will help facilitate improved features and ongoing support.

[+] BTC 1Fav36btfmdrYpCAR65XjKHhxuJJwFyKum
[+] ETH 0x20bB09273702eaBDFbEE9809473Fd04b969a794d
[+] LTC LQ6mPewec3xeLBYMdRP4yzeta6b9urqs2f
[+] XMR 4JUdGzvrMFDWrUUwY3toJATSeNwjn54LkCnKBPRzDuhzi5vSepHfUckJNxRL2gjkNrSqtCoRUrEDAgRwsQvVCjZbS3EN24xprAQ1Z5Sy5s
[+] ZCASH t1fsizsk2cqqJAjRoUmXJSyoVa9utYucXt7

1N3@CrowdShield
https://crowdshield.com


Docker build name must be lowercase on OS X

MacBook-Pro-de-akita:BlackWidow ak1t4$ docker build -t BlackWidow .
invalid argument "BlackWidow" for "-t, --tag" flag: invalid reference format: repository name must be lowercase

must be:

MacBook-Pro-de-akita:BlackWidow ak1t4$ docker build -t blackwidow .
Sending build context to Docker daemon  606.7kB

error "BeautifulSoup"

Traceback (most recent call last):
  File "/usr/bin/blackwidow", line 6, in <module>
    from bs4 import BeautifulSoup
ImportError: No module named bs4

I need help with this.

I want to understand

What does it mean when it says "Reflected value detected"? I don't understand...
(screenshot: "Reflected value detected" message)

Error opening in kali 2020

Hi, I have an error opening the program with kali 2020.

Commands used to install it:
pip3 install -r requirements.txt
sudo ./install.sh

First way to open it:
blackwidow
Error:
Traceback (most recent call last):
  File "/usr/bin/blackwidow", line 6, in <module>
    from bs4 import BeautifulSoup
ImportError: No module named bs4

Second way to open it:
sudo python3 blackwidow
Error:
File "blackwidow", line 45
print ""
^
SyntaxError: Missing parentheses in call to 'print'. Did you mean print("")?

Video:
https://www.dropbox.com/s/gom4oyx8yojs17v/0b6a74OYy3.mp4?dl=0

Save the Injectx output file

Hello, how are you doing?

Could you please change the code to save the output of Injectx?

Something like this:
os.system('for a in `cat ' + save_dir + domain + '_' + port + '-dynamic-unique.txt`; do python3 /usr/bin/injectx.py -u "$a" | tee -a ' + save_dir + domain + '_' + port + '-injectx.txt; done;')

Thanks.
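Pending a change in the tool itself, the same idea can be expressed in Python instead of a shell one-liner. This is a sketch: `fuzz_saved_params` is our name, and the injectx path mirrors the issue above.

```python
import subprocess
import sys
from pathlib import Path

def fuzz_saved_params(unique_file, out_file, injectx="/usr/bin/injectx.py"):
    """Run injectx against each saved parameterized URL and append its
    stdout to a single report file."""
    urls = [u.strip() for u in Path(unique_file).read_text().splitlines() if u.strip()]
    with open(out_file, "a") as report:
        for url in urls:
            # sys.executable keeps the same interpreter as the caller
            result = subprocess.run(
                [sys.executable, injectx, "-u", url],
                capture_output=True, text=True)
            report.write(result.stdout)
    return urls
```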

Add support for list of domains

Hi @1N3

Can you please add support for a list of domains? The "subdomain module" is not working as expected, so it would be a good feature to have an option for feeding in a list of domains.

Thanks.

error in docker build

$ docker build -t blackwidow .
Sending build context to Docker daemon  528.4kB
Step 1/7 : FROM alpine:edge
 ---> 7eacb6761fa1
Step 2/7 : RUN apk --update add --no-cache python2 py2-requests py2-pip py2-lxml py2-requests openssl ca-certificates
 ---> Running in ea5b005b3661
fetch http://dl-cdn.alpinelinux.org/alpine/edge/main/x86_64/APKINDEX.tar.gz
fetch http://dl-cdn.alpinelinux.org/alpine/edge/community/x86_64/APKINDEX.tar.gz
ERROR: unsatisfiable constraints:
  py2-lxml (missing):
    required by: world[py2-lxml]
The command '/bin/sh -c apk --update add --no-cache python2 py2-requests py2-pip py2-lxml py2-requests openssl ca-certificates' returned a non-zero code: 1
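The failure is apk on alpine:edge no longer shipping py2-lxml. One workaround is to pin an older base image instead of tracking edge. This is an untested sketch; whether a given pinned release still carries the py2- packages is an assumption to verify:

```dockerfile
FROM alpine:3.8
RUN apk --update add --no-cache python2 py2-requests py2-pip py2-lxml openssl ca-certificates
```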

Directory nonexistent

I've installed all the requirements. It looks like there's a problem with the path to the directory.

Traceback (most recent call last):
  File "/usr/bin/blackwidow", line 308, in <module>
    urls_saved = open(urls_saved_file,"w+")
IOError: [Errno 2] No such file or directory: '/usr/share/blackwidow//-urls.txt'
sh: 1: cannot create /usr/share/blackwidow//-urls-sorted.txt: Directory nonexistent
sh: 1: cannot create /usr/share/blackwidow//-forms-sorted.txt: Directory nonexistent
sh: 1: cannot create /usr/share/blackwidow//-dynamic-sorted.txt: Directory nonexistent
touch: cannot touch '/usr/share/blackwidow//-dynamic-unique.txt': No such file or directory
cat: /usr/share/blackwidow//-dynamic-sorted.txt: No such file or directory
sh: 1: cannot create /usr/share/blackwidow//-subdomains-sorted.txt: Directory nonexistent
sh: 1: cannot create /usr/share/blackwidow//-emails-sorted.txt: Directory nonexistent
sh: 1: cannot create /usr/share/blackwidow//-phones-sorted.txt: Directory nonexistent
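The doubled slash and empty filename ('/usr/share/blackwidow//-urls.txt') show the domain variable was empty when the report files were opened. Guarding against that and creating the directory before opening would fail loudly instead of scattering shell errors (a sketch; `open_report` is an illustrative helper, not BlackWidow code):

```python
import os

def open_report(save_dir, domain, port, suffix):
    """Open a loot file like <save_dir>/<domain>_<port>/<domain>_<port>-<suffix>,
    creating the directory first and refusing an empty domain."""
    if not domain:
        raise ValueError("domain was not parsed; refusing to write under " + save_dir)
    loot_dir = os.path.join(save_dir, "%s_%s" % (domain, port))
    os.makedirs(loot_dir, exist_ok=True)  # avoids 'Directory nonexistent'
    return open(os.path.join(loot_dir, "%s_%s-%s" % (domain, port, suffix)), "w+")
```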

Choice of the output file path

Can you add an option to choose the output directory of the script? It could be useful for integration with other tools :-)

Unreachable code

if link.get('href')[:4] == "http":

To get inside this if statement, link.get('href') must start with "http", but then it can never start with "#", "tel:", or "mailto:".

Since link.get('href') is used so frequently in this function, a better option would be to assign it to a variable, e.g. current_link = link.get('href'), and to use Python's str.startswith instead of slices:

>>> link = {'href': 'http://www.google.com'}
>>> current_link = link.get('href')
>>> current_link[:4] == 'http'
True
>>> current_link.startswith('http')
True
>>> current_link.startswith(('#', 'tel:', 'mailto:'))
False

I commented out these two seemingly unreachable sections to highlight them.

if link.get('href')[:4] == "http":
  # SAME ORIGIN
  if domain in link.get('href'):
    # IF URL IS DYNAMIC
    if "?" in link.get('href'):
      print OKRED + "[+] Dynamic URL found! " + link.get('href') + " " + RESET
      urls.write(link.get('href') + "\n")
      urls_saved.write(link.get('href') + "\n")
      dynamic_saved.write(link.get('href') + "\n")
    # # DOM BASED LINK
    # elif link.get('href')[:1] == "#":
    #   print OKBLUE + "[i] DOM based link found! " + link.get('href') + " " + RESET
    # # TELEPHONE
    # elif link.get('href')[:4] == "tel:":
    #   s = link.get('href')
    #   phonenum = s.split(':')[1]
    #   print OKORANGE + "[i] Telephone # found! " + phonenum + " " + RESET
    #   phones_saved.write(phonenum + "\n")
    # # EMAIL
    # elif link.get('href')[:7] == "mailto:":
    #   s = link.get('href')
    #   email = s.split(':')[1]
    #   print OKORANGE + "[i] Email found! " + email + " " + RESET
    #   emails_saved.write(email + "\n")

    # FULL URI OF SAME ORIGIN
    else:
      print link.get('href')
      urls.write(link.get('href') + "\n")
      urls_saved.write(link.get('href') + "\n")
  # EXTERNAL LINK FOUND
  else:
    # IF URL IS DYNAMIC
    if "?" in link.get('href'):
      print COLOR2 + "[+] External Dynamic URL found! " + link.get('href') + " " + RESET
    # # DOM BASED LINK
    # elif link.get('href')[:1] == "#":
    #   print COLOR2 + "[i] External DOM based link found! " + link.get('href') + " " + RESET
    # # TELEPHONE
    # elif link.get('href')[:4] == "tel:":
    #   s = link.get('href')
    #   phonenum = s.split(':')[1]
    #   print OKORANGE + "[i] External Telephone # found! " + phonenum + " " + RESET
    # # EMAIL
    # elif link.get('href')[:7] == "mailto:":
    #   s = link.get('href')
    #   email = s.split(':')[1]
    #   print OKORANGE + "[i] External Email found! " + email + " " + RESET

    # FULL URI OF EXTERNAL ORIGIN
    else:
      print COLOR2 + "[i] External link found! " + link.get('href') + " " + RESET
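Reordering the checks so the specific prefixes are tested before the generic "http" one makes every branch reachable. A condensed sketch of that control flow (`classify_link` and the returned labels are illustrative, not BlackWidow's actual code):

```python
def classify_link(href, domain):
    """Return a label for a crawled href; specific schemes are checked
    before the generic 'http' prefix so no branch is shadowed."""
    if href.startswith('#'):
        return 'dom'
    if href.startswith('tel:'):
        return 'phone'
    if href.startswith('mailto:'):
        return 'email'
    if href.startswith('http'):
        if domain in href:
            return 'dynamic' if '?' in href else 'same-origin'
        return 'external-dynamic' if '?' in href else 'external'
    return 'other'
```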
