
datasploit's Introduction

ToolsWatch Best Tools | Arsenal 2017 (EU / US / ASIA) | Arsenal 2016 (EU / US) | DEFCON 25 ReconVillage | DEFCON 24 Demolabs

Join Datasploit Slack

Follow Datasploit on Twitter

Overview of the tool:

  • Performs OSINT on a domain / email / username / phone number and finds information from different sources.
  • Correlates the results and presents them in a consolidated manner.
  • Tries to find credentials, API keys, tokens, subdomains, domain history, legacy portals, and more related to the target.
  • Run a specific script, or launch the automated OSINT run for consolidated data.
  • Performs active scans on collected data.
  • Generates HTML and JSON reports along with text files.

Basic Usage:


	  ____/ /____ _ / /_ ____ _ _____ ____   / /____  (_)/ /_
	  / __  // __ `// __// __ `// ___// __ \ / // __ \ / // __/
	 / /_/ // /_/ // /_ / /_/ /(__  )/ /_/ // // /_/ // // /_  
	 \__,_/ \__,_/ \__/ \__,_//____// .___//_/ \____//_/ \__/  
	                               /_/                        
						
         	   Open Source Assistant for #OSINT            
                 website: www.datasploit.info               
	
Usage: domainOsint.py [options]

Options:
  -h,           --help              show this help message and exit
  -d DOMAIN,    --domain=DOMAIN     Domain name against which automated OSINT
                                    is to be performed.

Required Setup:

  • Python 2.7 (a number of dependencies do not yet support Python 3)
  • Several Python libraries (listed in requirements.txt)
  • On Kali Linux, install the requirements with pip install --upgrade --force-reinstall -r requirements.txt

Detailed Tool Documentation:

https://datasploit.github.io/datasploit/

Lead Developers

Social Media

datasploit's People

Contributors

ameygat, anandtiwarics, anantshri, benichmt1, chan9390, denrat, dvopsway, ishamfazal, khasmek, kunalaggarwal, meme-lord, nkpanda, spmedia, sudhanshuc, upgoingstar


datasploit's Issues

Check Shodan/Censys/Zoomeye for nslookup

Perform a quick nslookup on the accumulated IPs, so we can verify that dynamic services (AWS, etc.) have not changed the mapping.

Also, accessing an IP directly is often disabled, so finding a hostname related to it will be a quick help.
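
A minimal sketch of such a reverse-lookup pass, using only the Python standard library (the IPs in the example are placeholders):

    import socket

    def reverse_lookup(ip_addresses):
        """Try a reverse DNS lookup for each collected IP and keep any hostname found."""
        mapping = {}
        for ip in ip_addresses:
            try:
                hostname, _aliases, _addrs = socket.gethostbyaddr(ip)
                mapping[ip] = hostname
            except (socket.herror, socket.gaierror):
                mapping[ip] = None  # no PTR record, or the lookup was refused
        return mapping

    if __name__ == "__main__":
        for ip, host in reverse_lookup(["8.8.8.8"]).items():  # placeholder IP
            print("%s -> %s" % (ip, host or "no hostname found"))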

Performing passive SSL Scan

Performing passive SSL Scan

Traceback (most recent call last):
  File "domainOsint.py", line 435, in <module>
    main()
  File "domainOsint.py", line 431, in main
    do_everything(domain)
  File "domainOsint.py", line 311, in do_everything
    print "OverAll Rating: %s" % results['GRADE']
KeyError: 'GRADE'

Whois Error

Hello!

Not sure what's happening, but I'm getting an error during the whois process:

Traceback (most recent call last):
  File "domainOsint.py", line 432, in <module>
    main()
  File "domainOsint.py", line 428, in main
    do_everything(domain)
  File "domainOsint.py", line 122, in do_everything
    whoisdata = whoisnew(domain)
  File "/home/sdavenport/Tools/datasploit/domain_whois.py", line 15, in whoisnew
    w = whois.whois(domain)
  File "/usr/local/lib/python2.7/dist-packages/whois/__init__.py", line 38, in whois
    return WhoisEntry.load(domain, text)
  File "/usr/local/lib/python2.7/dist-packages/whois/parser.py", line 170, in load
    return WhoisCom(domain, text)
  File "/usr/local/lib/python2.7/dist-packages/whois/parser.py", line 244, in __init__
    raise PywhoisError(text)
whois.parser.PywhoisError: 
Whois Server Version 2.0

I verified that I have the most up to date pywhois version. Am I missing something? Thanks!

ImportError: No module named whois

Hi, I am using Ubuntu 15 and Python 3, and I have an issue when running
python domainOsint.py -d domain.org
The answer is:

Traceback (most recent call last):
  File "domainOsint.py", line 30, in <module>
    import whois
ImportError: No module named whois

Can you help me? What am I doing wrong?

BSON Overflow

Running the program against a test domain I ran into this error:

Traceback (most recent call last):
  File "domainOsint.py", line 432, in <module>
    main()
  File "domainOsint.py", line 428, in main
    do_everything(domain)
  File "domainOsint.py", line 391, in do_everything
    result = db.domaindata.insert(dict_to_apend, check_keys=False)
  File "/usr/local/lib/python2.7/dist-packages/pymongo/collection.py", line 2212, in insert
    check_keys, manipulate, write_concern)
  File "/usr/local/lib/python2.7/dist-packages/pymongo/collection.py", line 535, in _insert
    check_keys, manipulate, write_concern, op_id, bypass_doc_val)
  File "/usr/local/lib/python2.7/dist-packages/pymongo/collection.py", line 516, in _insert_one
    check_keys=check_keys)
  File "/usr/local/lib/python2.7/dist-packages/pymongo/pool.py", line 244, in command
    self._raise_connection_failure(error)
  File "/usr/local/lib/python2.7/dist-packages/pymongo/pool.py", line 372, in _raise_connection_failure
    raise error
OverflowError: BSON can only handle up to 8-byte ints

This happened right after the script searched through Shodan; I will try to see if it's reproducible.

Duplicate Python scripts

The python scripts are both in the root directory and in core/osint.

It would be easier if both could be stored in a single place.

Add builtwith.com

I feel http://builtwith.com/ should be added, as it has the following features:

  1. Free sign-up with 50 credits (more credits can be purchased)
  2. API access
  3. XML and JSON output formats
  4. Advanced lookup features, along with the list of widgets used on a website (helps find any recent vulnerabilities related to those widgets)
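
As a rough illustration only, a wrapper module might look like the sketch below; the endpoint path, parameter names and BUILTWITH_API_KEY placeholder are assumptions and should be checked against BuiltWith's API documentation:

    import requests

    BUILTWITH_API_KEY = "your-api-key-here"  # placeholder, not a real key

    def builtwith_lookup(domain):
        # NOTE: endpoint and parameter names are assumptions; verify against the BuiltWith API docs
        url = "https://api.builtwith.com/free1/api.json"
        params = {"KEY": BUILTWITH_API_KEY, "LOOKUP": domain}
        resp = requests.get(url, params=params, timeout=30)
        resp.raise_for_status()
        return resp.json()

    if __name__ == "__main__":
        print(builtwith_lookup("example.com"))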

where to find the GUI

Hey guys,

I desperately searched for the GUI in the tool :/ but I couldn't find it. Is it already out there? At least it says it is available in GUI and console form.

If not, where can I access or export the results of the tool?

I would be glad for some advice, thanks!

Error : Finding User Information

---> Finding User Information

Traceback (most recent call last):
  File "domainOsint.py", line 435, in <module>
    main()
  File "domainOsint.py", line 431, in main
    do_everything(domain)
  File "domainOsint.py", line 222, in do_everything
    print_emailosint(x)
  File "/home/seamus/datasploit/emailOsint.py", line 82, in print_emailosint
    data = fullcontact(email)
  File "/home/seamus/datasploit/email_fullcontact.py", line 14, in fullcontact
    req = requests.get("https://api.fullcontact.com/v2/person.json?email=%s&apiKey=%s" % (email, cfg.fullcontact_api))
  File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 71, in get
    return request('get', url, params=params, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 57, in request
    return session.request(method=method, url=url, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 475, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 585, in send
    r = adapter.send(request, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/adapters.py", line 453, in send
    raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: ('Connection aborted.', BadStatusLine("''",))

Check osintframework.com

The site http://osintframework.com/ has lots of good resources for OSINT. I think most of them could be included in datasploit.

[ The project is also open source; you can find the source code of the website at https://github.com/lockfale/OSINT-Framework ]

Problem with installation of requirements on Ubuntu 14.04

Hey, while trying to install, I'm getting the following error message:

$ sudo pip install -r requirements.txt
The directory '/home/wklm/.cache/pip/http' or its parent directory is not owned by the current user and the cache has been disabled. Please check the permissions and owner of that directory. If executing pip with sudo, you may want sudo's -H flag.
You are using pip version 7.1.0, however version 8.1.2 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
The directory '/home/wklm/.cache/pip/http' or its parent directory is not owned by the current user and the cache has been disabled. Please check the permissions and owner of that directory. If executing pip with sudo, you may want sudo's -H flag.
Collecting amqp==1.4.9 (from -r requirements.txt (line 1))
Downloading amqp-1.4.9-py2.py3-none-any.whl (51kB)
100% |████████████████████████████████| 53kB 717kB/s
Collecting anyjson==0.3.3 (from -r requirements.txt (line 2))
Downloading anyjson-0.3.3.tar.gz
Collecting BeautifulSoup==3.2.1 (from -r requirements.txt (line 3))
Downloading BeautifulSoup-3.2.1.tar.gz
Complete output from command python setup.py egg_info:
Traceback (most recent call last):
  File "<string>", line 20, in <module>
  File "/tmp/pip-build-t9x02kuf/BeautifulSoup/setup.py", line 22
    print "Unit tests have failed!"
                                  ^
SyntaxError: Missing parentheses in call to 'print'

----------------------------------------

Command "python setup.py egg_info" failed with error code 1 in /tmp/pip-build-t9x02kuf/BeautifulSoup

How can I deal with that?
cheers, wklm

Installation error (inotify is not available on macosx-10.11-intel)

Collecting pyinotify==0.9.6 (from -r requirements.txt (line 19))
  Using cached pyinotify-0.9.6.tar.gz
    Complete output from command python setup.py egg_info:
    inotify is not available on macosx-10.11-intel
    ---------------------
Command "python setup.py egg_info" failed with error code 1 in /private/var/folders/dj/m0m8d52n03v4wfd7zk35fwnm0000gn/T/pip-build-DfhEUn/pyinotify/

'cookies' is not defined

Hi, it seems you forgot to update how cookies are handled at line 154 of domain_subdomains.py.

req2 = requests.get(url, cookies = cookies)

should be now

req2 = requests.get(url)

Right?

Error while Performing passive SSL Scan

---> Performing passive SSL Scan

Picking up One IP from bunch of IPs returned: xxx.xxx.xxx.xxx
Traceback (most recent call last):
  File "domainOsint.py", line 435, in <module>
    main()
  File "domainOsint.py", line 431, in main
    do_everything(domain)
  File "domainOsint.py", line 299, in do_everything
    results_new = check_ssl_htbsecurity(results['MULTIPLE_IPS'][0])
  File "/root/datasploit-master/domain_sslinfo.py", line 15, in check_ssl_htbsecurity
    req = requests.post('https://www.htbridge.com/ssl/chssl/1451425590.html', headers=headers , data=data)
  File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 111, in post
    return request('post', url, data=data, json=json, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 57, in request
    return session.request(method=method, url=url, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 475, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 585, in send
    r = adapter.send(request, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/adapters.py", line 403, in send
    timeout=timeout
  File "/usr/local/lib/python2.7/dist-packages/requests/packages/urllib3/connectionpool.py", line 578, in urlopen
    chunked=chunked)
  File "/usr/local/lib/python2.7/dist-packages/requests/packages/urllib3/connectionpool.py", line 362, in _make_request
    conn.request(method, url, **httplib_request_kw)
  File "/usr/lib/python2.7/httplib.py", line 1001, in request
    self._send_request(method, url, body, headers)
  File "/usr/lib/python2.7/httplib.py", line 1035, in _send_request
    self.endheaders(body)
  File "/usr/lib/python2.7/httplib.py", line 997, in endheaders
    self._send_output(message_body)
  File "/usr/lib/python2.7/httplib.py", line 854, in _send_output
    self.send(message_body)
  File "/usr/lib/python2.7/httplib.py", line 826, in send
    self.sock.sendall(data)
  File "/usr/local/lib/python2.7/dist-packages/requests/packages/urllib3/contrib/pyopenssl.py", line 253, in sendall
    sent = self._send_until_done(data[total_sent:total_sent + SSL_WRITE_BLOCKSIZE])
  File "/usr/local/lib/python2.7/dist-packages/requests/packages/urllib3/contrib/pyopenssl.py", line 242, in _send_until_done
    return self.connection.send(data)
  File "/usr/lib/python2.7/dist-packages/OpenSSL/SSL.py", line 947, in send
    raise TypeError("data must be a byte string")
TypeError: data must be a byte string

Getting random "Please check if Email_id exist or not?"

Sometimes, when running for example python emailOsint.py [email protected], I get
[-] Error Occured - Encountered Status Code: 404. Please check if Email_id exist or not? or
[-] Error Occured - Encountered Status Code: 202. Please check if Email_id exist or not?

Why does this happen? Sometimes it runs, sometimes it doesn't.

Netcraft : KeyError: "name='csrftoken', domain=None, path=None"

This has been turning up of late, anyone else seeing this?

---> Searching Domain history in Netcraft

Traceback (most recent call last):
  File "domainOsint.py", line 435, in <module>
    main()
  File "domainOsint.py", line 431, in main
    do_everything(domain)
  File "domainOsint.py", line 249, in do_everything
    subdomains(domain)
  File "/home/seamus/datasploit/domain_subdomains.py", line 30, in subdomains
    cookies['csrftoken'] = r.cookies['csrftoken']
  File "/usr/local/lib/python2.7/dist-packages/requests/cookies.py", line 293, in __getitem__
    return self._find_no_duplicates(name)
  File "/usr/local/lib/python2.7/dist-packages/requests/cookies.py", line 351, in _find_no_duplicates
    raise KeyError('name=%r, domain=%r, path=%r' % (name, domain, path))
KeyError: "name='csrftoken', domain=None, path=None"

SSL Scan issue

As per domain_sslinfo.py, the passive SSL scan is conducted with the help of the site www.htbridge.com.
But the website currently has a certificate error when viewed in a browser. Chrome displays the following:

This server could not prove that it is www.htbridge.com; its security certificate is from htbridge.ch.

Anyway, there is a way to skip verification in Python. When we modify line 25 in domain_sslinfo.py to:

req = requests.post('https://www.htbridge.com/ssl/chssl/1451425590.html', headers=headers , data=data, verify=False)

we get the response as:

<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head>
<title>404 Not Found</title>
</head><body>
<h1>Not Found</h1>
<p>The requested URL /chssl/1451425590.html was not found on this server.</p>
</body></html>

which is HTML, not JSON, so the "No JSON object could be decoded" issue arises (Issue #67).
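
A hedged sketch of how the request in domain_sslinfo.py could both skip certificate verification and fail gracefully when the response body is not JSON (the URL is the one quoted above; the function shape is an assumption, not the project's actual code):

    import json
    import requests

    def check_ssl_htbsecurity(post_data, headers=None):
        # URL taken from this issue; the endpoint may have moved or been removed entirely
        url = 'https://www.htbridge.com/ssl/chssl/1451425590.html'
        req = requests.post(url, headers=headers, data=post_data, verify=False)
        try:
            return json.loads(req.content)
        except ValueError:
            # The endpoint returned HTML (e.g. a 404 page) instead of JSON
            print("[-] Passive SSL scan returned no usable JSON (HTTP %s)" % req.status_code)
            return {}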

Get rid of OS-exclusive dependencies

pyinotify
Currently pyinotify is available on Linux only.

pip install pyinotify==0.9.6
Collecting pyinotify==0.9.6
  Using cached pyinotify-0.9.6.tar.gz
    Complete output from command python setup.py egg_info:
    inotify is not available on win32

OS: Windows 10 14393.82 x64
Python: 3.5.2
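
One possible way to keep the code importable on non-Linux platforms is to treat pyinotify as an optional dependency, roughly as in this sketch (not the project's current behaviour):

    import sys

    # pyinotify only builds on Linux, so import it defensively
    try:
        import pyinotify
    except ImportError:
        pyinotify = None

    def can_watch_files():
        """Report whether inotify-based file watching is available on this platform."""
        return pyinotify is not None and sys.platform.startswith("linux")

    if not can_watch_files():
        print("[-] pyinotify unavailable; file-watching features will be skipped")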

passive SSL : ValueError: No JSON object could be decoded

---> Performing passive SSL Scan

Traceback (most recent call last):
  File "domainOsint.py", line 435, in <module>
    main()
  File "domainOsint.py", line 431, in main
    do_everything(domain)
  File "domainOsint.py", line 291, in do_everything
    results = check_ssl_htbsecurity(domain)
  File "/home/seamus/datasploit/domain_sslinfo.py", line 16, in check_ssl_htbsecurity
    results = json.loads(req.content)
  File "/usr/lib/python2.7/json/__init__.py", line 338, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python2.7/json/decoder.py", line 366, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python2.7/json/decoder.py", line 384, in raw_decode
    raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded

JigSaw API Key

I have a data.com username / password, with available tokens ... How do I retrieve my API key?

Error in domain_censys.py

Parsed and collected results from page 1
Traceback (most recent call last):
  File "domain_censys.py", line 70, in <module>
    main()
  File "domain_censys.py", line 64, in main
    censys_search(domain)
  File "domain_censys.py", line 15, in censys_search
    res = requests.post("https://www.censys.io/api/v1/search/ipv4", json = params, auth = (cfg.censysio_id, cfg.censysio_secret))
  File "/usr/lib/python2.7/dist-packages/requests/api.py", line 88, in post
    return request('post', url, data=data, **kwargs)
  File "/usr/lib/python2.7/dist-packages/requests/api.py", line 44, in request
    return session.request(method=method, url=url, **kwargs)
TypeError: request() got an unexpected keyword argument 'json'
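
The json= keyword only exists in newer requests releases (it arrived around requests 2.4.x), so on the older requests shipped by some distributions the body can be serialised manually instead. A hedged sketch (the payload shape is illustrative; cfg mirrors the config import used by domain_censys.py):

    import json
    import requests
    import config as cfg  # assumed to expose censysio_id / censysio_secret, as in the traceback

    def censys_search(domain):
        params = {"query": domain}  # illustrative payload
        headers = {"Content-Type": "application/json"}
        # data=json.dumps(...) works even where the json= keyword is unsupported
        res = requests.post("https://www.censys.io/api/v1/search/ipv4",
                            data=json.dumps(params),
                            headers=headers,
                            auth=(cfg.censysio_id, cfg.censysio_secret))
        return res.json()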

ImportError: No module named celery

Anyone know of a fix for this?

Debian 8 Python 2.7.9

root@OTX:~/datasploit# C_FORCE_ROOT=root celery -A core worker -l info --concurrency 20
Traceback (most recent call last):
  File "/usr/local/bin/celery", line 9, in <module>
    load_entry_point('celery==3.1.23', 'console_scripts', 'celery')()
  File "/usr/local/lib/python2.7/dist-packages/celery/__main__.py", line 30, in main
    main()
  File "/usr/local/lib/python2.7/dist-packages/celery/bin/celery.py", line 81, in main
    cmd.execute_from_commandline(argv)
  File "/usr/local/lib/python2.7/dist-packages/celery/bin/celery.py", line 793, in execute_from_commandline
    super(CeleryCommand, self).execute_from_commandline(argv)))
  File "/usr/local/lib/python2.7/dist-packages/celery/bin/base.py", line 309, in execute_from_commandline
    argv = self.setup_app_from_commandline(argv)
  File "/usr/local/lib/python2.7/dist-packages/celery/bin/base.py", line 469, in setup_app_from_commandline
    self.app = self.find_app(app)
  File "/usr/local/lib/python2.7/dist-packages/celery/bin/base.py", line 489, in find_app
    return find_app(app, symbol_by_name=self.symbol_by_name)
  File "/usr/local/lib/python2.7/dist-packages/celery/app/utils.py", line 254, in find_app
    symbol_by_name=symbol_by_name, imp=imp,
  File "/usr/local/lib/python2.7/dist-packages/celery/app/utils.py", line 238, in find_app
    sym = imp(app)
  File "/usr/local/lib/python2.7/dist-packages/celery/utils/imports.py", line 101, in import_from_cwd
    return imp(module, package=package)
  File "/usr/lib/python2.7/importlib/__init__.py", line 37, in import_module
    __import__(name)
ImportError: No module named celery

Handling commandline arguments

There is no check whether the required arguments are given by the user, and this is not handled in the other modular scripts either (a minimal guard is sketched after the example below).

For example,

datasploit@datasploit $ python domainOsint.py
Traceback (most recent call last):
  File "domainOsint.py", line 72, in <module>
    from emailOsint import print_emailosint
  File "/home/datasploit/emailOsint.py", line 17, in <module>
    email = sys.argv[1]
IndexError: list index out of range
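
A minimal sketch of the kind of guard each standalone script could carry (the usage text is illustrative):

    import sys

    def main():
        if len(sys.argv) < 2:
            print("Usage: python emailOsint.py <email>")
            sys.exit(1)
        email = sys.argv[1]
        # ... run the individual OSINT checks against `email` here ...

    if __name__ == "__main__":
        main()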

celery related issue (no such table: celery_taskmeta)

[2016-10-27 10:47:25,109: INFO/Worker-15] Starting new HTTPS connection (1): search.wikileaks.org
[2016-10-27 10:47:25,128: CRITICAL/MainProcess] Task osint.domain_shodan.shodandomainsearch[d945d4d1-f0b8-4f7a-978a-e10193ee2d87] INTERNAL ERROR: OperationalError('no such table: celery_taskmeta',)
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/celery/app/trace.py", line 253, in trace_task
    I, R, state, retval = on_error(task_request, exc, uuid)
  File "/usr/local/lib/python2.7/dist-packages/celery/app/trace.py", line 201, in on_error
    R = I.handle_error_state(task, eager=eager)
  File "/usr/local/lib/python2.7/dist-packages/celery/app/trace.py", line 85, in handle_error_state
    }[self.state](task, store_errors=store_errors)
  File "/usr/local/lib/python2.7/dist-packages/celery/app/trace.py", line 118, in handle_failure
    req.id, exc, einfo.traceback, request=req,
  File "/usr/local/lib/python2.7/dist-packages/celery/backends/base.py", line 133, in mark_as_failure
    traceback=traceback, request=request)
  File "/usr/local/lib/python2.7/dist-packages/celery/backends/base.py", line 271, in store_result
    request=request, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/djcelery/backends/database.py", line 29, in _store_result
    traceback=traceback, children=self.current_task_children(request),
  File "/usr/local/lib/python2.7/dist-packages/djcelery/managers.py", line 42, in _inner
    return fun(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/djcelery/managers.py", line 181, in store_result
    'meta': {'children': children}})
  File "/usr/local/lib/python2.7/dist-packages/djcelery/managers.py", line 87, in update_or_create
    return get_queryset(self).update_or_create(**kwargs)
  File "/usr/local/lib/python2.7/dist-packages/djcelery/managers.py", line 70, in update_or_create
    obj, created = self.get_or_create(**kwargs)
  File "/usr/local/lib/python2.7/dist-packages/django/db/models/query.py", line 465, in get_or_create
    return self.get(**lookup), False
  File "/usr/local/lib/python2.7/dist-packages/django/db/models/query.py", line 381, in get
    num = len(clone)
  File "/usr/local/lib/python2.7/dist-packages/django/db/models/query.py", line 240, in __len__
    self._fetch_all()
  File "/usr/local/lib/python2.7/dist-packages/django/db/models/query.py", line 1074, in _fetch_all
    self._result_cache = list(self.iterator())
  File "/usr/local/lib/python2.7/dist-packages/django/db/models/query.py", line 52, in __iter__
    results = compiler.execute_sql()
  File "/usr/local/lib/python2.7/dist-packages/django/db/models/sql/compiler.py", line 848, in execute_sql
    cursor.execute(sql, params)
  File "/usr/local/lib/python2.7/dist-packages/django/db/backends/utils.py", line 79, in execute
    return super(CursorDebugWrapper, self).execute(sql, params)
  File "/usr/local/lib/python2.7/dist-packages/django/db/backends/utils.py", line 64, in execute
    return self.cursor.execute(sql, params)
  File "/usr/local/lib/python2.7/dist-packages/django/db/utils.py", line 95, in __exit__
    six.reraise(dj_exc_type, dj_exc_value, traceback)
  File "/usr/local/lib/python2.7/dist-packages/django/db/backends/utils.py", line 64, in execute
    return self.cursor.execute(sql, params)
  File "/usr/local/lib/python2.7/dist-packages/django/db/backends/sqlite3/base.py", line 323, in execute
    return Database.Cursor.execute(self, query, params)
OperationalError: no such table: celery_taskmeta

Initialization error with emailOsint.py

Running Mint 18 and getting strange issues on initialization:

python domainOsint.py
Traceback (most recent call last):
  File "domainOsint.py", line 72, in <module>
    from emailOsint import print_emailosint
  File "/home/foo/datasploit/emailOsint.py", line 17, in <module>
    email = sys.argv[1]
IndexError: list index out of range

running python 2.7.12

Thanks.

punkspider not working

Traceback (most recent call last):
  File "domainOsint.py", line 435, in <module>
    main()
  File "domainOsint.py", line 431, in main
    do_everything(domain)
  File "domainOsint.py", line 148, in do_everything
    res = checkpunkspider(reversed_domain)
  File "/home/cyborg/datasploit/domain_checkpunkspider.py", line 16, in checkpunkspider
    req = requests.post("http://www.punkspider.org/service/search/detail/" + reversed_domain, verify=False)
  File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 111, in post
    return request('post', url, data=data, json=json, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 57, in request
    return session.request(method=method, url=url, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 475, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 585, in send
    r = adapter.send(request, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/adapters.py", line 465, in send
    raise ProxyError(e, request=request)
requests.exceptions.ProxyError: HTTPConnectionPool(host='127.0.0.1', port=8080): Max retries exceeded with url: http://www.punkspider.org/service/search/detail/com.domain.www (Caused by ProxyError('Cannot connect to proxy.', NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7f6cc15ba490>: Failed to establish a new connection: [Errno 111] Connection refused',)))

Error in domain_github.py

There is no check to see whether split is giving proper data (a defensive sketch follows the traceback below):

Traceback (most recent call last):
  File "domainOsint.py", line 252, in <module>
    main()
  File "domainOsint.py", line 124, in main
    print github_search(domain, 'Code')
  File "/home/a/datasploit/domain_github.py", line 13, in github_search
    return "%s Results found in github Codes. \nExplore results manually: %s" % (str(mydivs[0]).split(">")[1].split("<")[0], endpoint_git)
IndexError: list index out of range
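
A defensive sketch of that return statement; the scraping itself is unchanged, only the indexing is guarded (mydivs and endpoint_git are the names from the traceback):

    def format_github_result(mydivs, endpoint_git):
        # Guard against pages where the expected <div> is missing or shaped differently
        if not mydivs:
            return "No results found in GitHub code search. Explore manually: %s" % endpoint_git
        parts = str(mydivs[0]).split(">")
        count = parts[1].split("<")[0] if len(parts) > 1 else "Some"
        return "%s results found in GitHub code search.\nExplore results manually: %s" % (count, endpoint_git)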

Finding Pagelinks: List index out of range

Hello,

When running datasploit on a domain, I'm getting the following error in the Finding Pagelinks module/section:

---> Finding Pagelinks:
<removed list of links to save space> 

Traceback (most recent call last):
  File "domainOsint.py", line 435, in <module>
    main()
  File "domainOsint.py", line 431, in main
    do_everything(domain)
  File "domainOsint.py", line 265, in do_everything
    subdomains_from_netcraft(domain)
  File "/datasploit/domain_subdomains.py", line 150, in subdomains_from_netcraft
    if num_subdomains[0] != str(0):
IndexError: list index out of range

Any thoughts?

https redirection on punkspider triggers exception

In the domain_checkpunkspider.py file, data is fetched from the punkspider website over plain HTTP:
req= requests.post("http://www.punkspider.org/service/search/detail/" + reversed_domain)

unfortunately, the site can only be reached on https:
$ curl -I http://www.punkspider.org/service/search/detail/
HTTP/1.1 301 Moved Permanently
Date: Mon, 20 Jun 2016 14:50:43 GMT
Server: Apache/2.4.7 (Ubuntu)
Location: https://www.punkspider.org/service/search/detail/
Content-Type: text/html; charset=iso-8859-1

An error is triggered, and in order to use this module you have to add verify=False to the post request.
Why? The punkspider.org website uses a certificate whose chain of trust is incomplete, which prevents the Python module from querying it.

ValueError: No JSON object could be decoded

Hi, I'm having a JSON problem when I try to run this tool on my own domain. I have included the complete log here:

chriskennard@mint ~/Desktop/tools/datasploit $ python domainOsint.py -d chriskennard.com

  ____/ /____ _ / /_ ____ _ _____ ____   / /____  (_)/ /_
  / __  // __ `// __// __ `// ___// __ \ / // __ \ / // __/
 / /_/ // /_/ // /_ / /_/ /(__  )/ /_/ // // /_/ // // /_  
 \__,_/ \__,_/ \__/ \__,_//____// .___//_/ \____//_/ \__/  
                               /_/                        

           Open Source Assistant for #OSINT            
                 website: www.datasploit.info               

No earlier scans found for chriskennard.com, Launching fresh scan in 3, 2, 1..

---> Finding Whois Information.
{
"updated_date": [
"2016-07-28 00:00:00",
"2016-07-28 07:00:00"
],
"status": [
"clientTransferProhibited https://icann.org/epp#clientTransferProhibited",
"clientTransferProhibited https://www.icann.org/epp#clientTransferProhibited"
],
"name": "Contact Privacy Inc. Customer 12435752",
"dnssec": "unsigned",
"city": "Toronto",
"expiration_date": [
"2017-01-26 00:00:00",
"2017-01-26 08:00:00"
],
"zipcode": "M4K 3K1",
"domain_name": [
"CHRISKENNARD.COM",
"chriskennard.com"
],
"country": "CA",
"whois_server": "whois.google.com",
"state": "ON",
"registrar": "Google, Inc.",
"referral_url": "http://domains.google.com",
"address": "96 Mowat Ave",
"name_servers": [
"NS-CLOUD-E1.GOOGLEDOMAINS.COM",
"NS-CLOUD-E2.GOOGLEDOMAINS.COM",
"NS-CLOUD-E3.GOOGLEDOMAINS.COM",
"NS-CLOUD-E4.GOOGLEDOMAINS.COM"
],
"org": "Contact Privacy Inc. Customer 12435752",
"creation_date": [
"2015-01-26 00:00:00",
"2015-01-26 08:00:00"
],
"emails": [
"[email protected]",
"[email protected]"
]
}
---> Finding Whois Information.

A Records
216.239.32.21
216.239.34.21
216.239.36.21
216.239.38.21
MX Records
5 gmr-smtp-in.l.google.com.
10 alt1.gmr-smtp-in.l.google.com.
20 alt2.gmr-smtp-in.l.google.com.
30 alt3.gmr-smtp-in.l.google.com.
40 alt4.gmr-smtp-in.l.google.com.
SOA Records
ns-cloud-e1.googledomains.com. dns-admin.google.com. 14 21600 3600 1209600 300
Name Server Records
ns-cloud-e1.googledomains.com.
ns-cloud-e2.googledomains.com.
ns-cloud-e3.googledomains.com.
ns-cloud-e4.googledomains.com.
TXT Records
No Records Found
CNAME Records
No Records Found
AAAA Records
2001:4860:4802:32::15
2001:4860:4802:34::15
2001:4860:4802:36::15
2001:4860:4802:38::15

---> Trying luck with PunkSpider

[-] No Vulnerabilities found on PunkSpider

---> Wappalyzing web pages

->Trying Wapalyzer on HTTP:
[-] HTTP connection was unavailable
->Trying Wapalyzer on HTTPS:
[-] HTTP connection was unavailable

---> Searching Github for domain results

Sad! Nothing found on github

---> Harvesting Email Addresses:.

[u'status', u'domain', u'pattern', u'results', u'webmail', u'offset', u'emails']

Do you want to launch osint check for these emails? [(Y)es/(N)o/(S)pecificEmail]: y

---> Searching Domain history in Netcraft

216.239.36.21: Google Inc

---> Finding Pagelinks:

no links found
zero subdomains found here
---> Finding subdomains:

---> Searching through WikiLeaks

Total
0 results

For all results, visit: https://search.wikileaks.org/?query=&exact_phrase=chriskennard.com&include_external_sources=True&order_by=newest_document_date

---> Gathering links from Forums:

---> Performing passive SSL Scan

Traceback (most recent call last):
  File "domainOsint.py", line 432, in <module>
    main()
  File "domainOsint.py", line 428, in main
    do_everything(domain)
  File "domainOsint.py", line 291, in do_everything
    results = check_ssl_htbsecurity(domain)
  File "/home/cnkennar/Desktop/tools/datasploit/domain_sslinfo.py", line 16, in check_ssl_htbsecurity
    results = json.loads(req.content)
  File "/usr/lib/python2.7/json/__init__.py", line 339, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python2.7/json/decoder.py", line 364, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python2.7/json/decoder.py", line 382, in raw_decode
    raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded

Extensive use of tab space \t

Most of the scripts have \t\t\t prefixed to the display messages. When executed on Ubuntu (bash), the output messages look quite odd because of the tabs.

Below is a screenshot of the domain_emailhunter.py module on an Ubuntu terminal:
[screenshot]

I would suggest using spaces instead of tabs.

Error Finding hosts from ZoomEye

I have followed the instructions correctly and tried with 2 different email accounts.

I keep getting "please enter correct verify code" on the telnet404 site, and then this when I run domainOsint.py:

---> Finding hosts from ZoomEye

Traceback (most recent call last):
  File "domainOsint.py", line 435, in <module>
    main()
  File "domainOsint.py", line 431, in main
    do_everything(domain)
  File "domainOsint.py", line 337, in do_everything
    zoomeye_results = search_zoomeye(domain)
  File "/root/datasploit-master/domain_zoomeye.py", line 25, in search_zoomeye
    zoomeye_token = get_accesstoken_zoomeye(domain)
  File "/root/datasploit-master/domain_zoomeye.py", line 18, in get_accesstoken_zoomeye
    access_token1 = responsedata['access_token']
KeyError: 'access_token'
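
The KeyError means the ZoomEye login response carried no access_token, typically a failed login or a captcha requirement. A hedged sketch of a guard around that lookup (responsedata mirrors the parsed JSON variable in the traceback; the original function takes the domain instead):

    def extract_zoomeye_token(responsedata):
        # responsedata is the parsed JSON returned by the ZoomEye login endpoint
        token = responsedata.get('access_token')
        if not token:
            print("[-] ZoomEye login failed: %s" % responsedata.get('error', 'no access_token in response'))
            return None
        return token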

Feature: Use the data from WhatsMyName

I maintain the WhatsMyName repo (https://github.com/WebBreacher/WhatsMyName) of user enumeration data. With the JSON file in my repo, you can find out which of over 150 sites have a specific username as an account.

I noticed at least one (probably more) module in datasploit that could possibly be converted to use my JSON file. I've done this successfully in the Recon-NG framework in the Profiler script (https://bitbucket.org/LaNMaSteR53/recon-ng/src/4e3cc65adfdd79cfa366b64dcce68e5ab9b2dd8c/modules/recon/profiles-profiles/profiler.py?at=master&fileviewer=file-view-default).

At runtime, a call is made to the Git repo to pull the most recent content; the script iterates through it, and when it finds an entry marked "valid=True" it uses the data in my project to find valid user names. Check out https://webbreacher.com/2014/12/11/recon-ng-profiler-module/ for a run-through.

Just a suggestion to augment your awesome project. While my WhatsMyName JSON project will find valid usernames, it does not scrape content from the pages of the destination sites. This is a gap-area that your project (if you choose) could fill.
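
As a rough illustration of the suggested integration, a module could pull the JSON file straight from the WhatsMyName repo and test a username against every entry marked valid. The file path and field names below are assumptions based on the repo at the time and may have changed:

    import requests

    WMN_URL = ("https://raw.githubusercontent.com/WebBreacher/"
               "WhatsMyName/master/web_accounts_list.json")  # assumed path

    def check_username(username):
        sites = requests.get(WMN_URL, timeout=30).json().get("sites", [])
        found = []
        for site in sites:
            if not site.get("valid", False):
                continue
            url = site["check_uri"].replace("{account}", username)
            try:
                resp = requests.get(url, timeout=15)
            except requests.RequestException:
                continue
            # Each entry carries the status code expected when the account exists
            if resp.status_code == int(site.get("account_existence_code", 200)):
                found.append(site.get("name"))
        return found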

MongoDB Error

Traceback (most recent call last):
  File "domainOsint.py", line 432, in <module>
    main()
  File "domainOsint.py", line 428, in main
    do_everything(domain)
  File "domainOsint.py", line 391, in do_everything
    result = db.domaindata.insert(dict_to_apend, check_keys=False)
  File "/usr/local/lib/python2.7/dist-packages/pymongo/collection.py", line 2212, in insert
    check_keys, manipulate, write_concern)
  File "/usr/local/lib/python2.7/dist-packages/pymongo/collection.py", line 535, in _insert
    check_keys, manipulate, write_concern, op_id, bypass_doc_val)
  File "/usr/local/lib/python2.7/dist-packages/pymongo/collection.py", line 516, in _insert_one
    check_keys=check_keys)
  File "/usr/local/lib/python2.7/dist-packages/pymongo/pool.py", line 244, in command
    self._raise_connection_failure(error)
  File "/usr/local/lib/python2.7/dist-packages/pymongo/pool.py", line 372, in _raise_connection_failure
    raise error
OverflowError: MongoDB can only handle up to 8-byte ints
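
The overflow comes from an integer somewhere in the collected data that does not fit into BSON's 8-byte (64-bit) integer type. One hedged workaround is to coerce oversized integers to strings before inserting; a sketch (dict_to_apend and db match the names in the traceback, and the long type assumes Python 2 as used by the project):

    BSON_INT_MAX = 2 ** 63 - 1
    BSON_INT_MIN = -2 ** 63

    def sanitize_for_bson(value):
        """Recursively convert ints that do not fit into 8 bytes to strings."""
        if isinstance(value, dict):
            return dict((k, sanitize_for_bson(v)) for k, v in value.items())
        if isinstance(value, list):
            return [sanitize_for_bson(v) for v in value]
        if isinstance(value, (int, long)) and not (BSON_INT_MIN <= value <= BSON_INT_MAX):
            return str(value)
        return value

    # usage sketch:
    # result = db.domaindata.insert(sanitize_for_bson(dict_to_apend), check_keys=False)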

Wappalyzer issue - no module named Wappalyzer

Ian@Box:~/Downloads/datasploit-master$ cat requirements.txt | grep Wapp
python-Wappalyzer==0.2.2
Ian@Box:~/Downloads/datasploit-master$ pip list | grep Wapp
python-Wappalyzer (0.2.2)
Ian@Box:~/Downloads/datasploit-master$
Ian@Box:~/Downloads/datasploit-master$ python domainOsint.py google.com
Traceback (most recent call last):
  File "domainOsint.py", line 35, in <module>
    from Wappalyzer import Wappalyzer, WebPage
ImportError: No module named Wappalyzer
Ian@Box:~/Downloads/datasploit-master$

Tinkering a bit, I can pull down the wappalyzer-python-0.1.2 package and start renaming your imports in domainOsint.py to call that version, but I run into other errors. Am I missing something that prevents this from working with the properly referenced python-Wappalyzer version?

error in domain_emailhunter.py

When running the tool against the domain, I'm getting this error:
AttributeError: 'module' object has no attribute 'emailhunter'

It looks like it hits the error when trying to fetch information from Github.com.

Errors while using web UI

1st: I get the following errors about shodan and emailhunter:

[2016-09-04 12:03:22,068: ERROR/MainProcess] Task osint.domain_emailhunter.emailhunter[80edd1d7-c720-4094-8f82-bfa7a5cc0ae7] raised unexpected: AttributeError("'module' object has no attribute 'emailhunter'",)
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/celery/app/trace.py", line 240, in trace_task
    R = retval = fun(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/celery/app/trace.py", line 438, in __protected_call__
    return self.run(*args, **kwargs)
  File "/root/datasploit/core/osint/domain_emailhunter.py", line 12, in emailhunter
    if config.emailhunter:
AttributeError: 'module' object has no attribute 'emailhunter'
[2016-09-04 12:03:22,097: ERROR/MainProcess] Task osint.domain_shodan.shodandomainsearch[0b00e584-feb3-4c43-9ecb-48ff88bfa632] raised unexpected: AttributeError("'module' object has no attribute 'shodan_api'",)
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/celery/app/trace.py", line 240, in trace_task
    R = retval = fun(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/celery/app/trace.py", line 438, in __protected_call__
    return self.run(*args, **kwargs)
  File "/root/datasploit/core/osint/domain_shodan.py", line 15, in shodandomainsearch
    if config.shodan_api:
AttributeError: 'module' object has no attribute 'shodan_api'

2nd: It misreports the number of subdomains and then just freezes :)
[2016-09-04 12:03:30,069: WARNING/Worker-6] zero subdomains found here
[2016-09-04 12:03:30,085: INFO/MainProcess] Task osint.domain_subdomains.run[d0093775-9432-4d23-bb35-0590dc727bb9] succeeded in 8.03900062s: [u'gp.somedomain', u'gp-ss.somadomain', u'vpn.somedomain', u'familyhistory.somedomain', u'somedomain',...
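
Both tracebacks point at modules reading keys straight off the config module, so a key that was never copied over from config_sample.py raises AttributeError. A hedged sketch of a guard using getattr (attribute names taken from the tracebacks above):

    import config

    def get_api_key(name):
        """Return the configured API key, or None if it is missing from config.py."""
        key = getattr(config, name, None)
        if not key:
            print("[-] Skipping module: '%s' is not set in config.py" % name)
        return key

    shodan_api = get_api_key('shodan_api')
    emailhunter_key = get_api_key('emailhunter')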

After installing all the missing requirements, WhoIs error

I get the following error:

---> Finding Whois Information.
Traceback (most recent call last):
  File "domainOsint.py", line 432, in <module>
    main()
  File "domainOsint.py", line 428, in main
    do_everything(domain)
  File "domainOsint.py", line 122, in do_everything
    whoisdata = whoisnew(domain)
  File "/home/XXXXXXXX/Apps/datasploit/domain_whois.py", line 15, in whoisnew
    w = whois.whois(domain)
AttributeError: 'module' object has no attribute 'whois'

Error installing requirements

I saw the other issue, but since this one is a different error, I thought I should post it on its own. This could easily boil down to me not setting something up correctly.

Pretty fresh install of Ubuntu 16.04.1 LTS
Python version: Python 2.7.12
Also, I saw the error asking whether libxml2 is installed... it is:
libxml2 is already the newest version (2.9.3+dfsg1-1ubuntu0.1).

Error:
running build_ext
building 'lxml.etree' extension
creating build/temp.linux-x86_64-2.7
creating build/temp.linux-x86_64-2.7/src
creating build/temp.linux-x86_64-2.7/src/lxml
x86_64-linux-gnu-gcc -pthread -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fno-strict-aliasing -Wdate-time -D_FORTIFY_SOURCE=2 -g -fstack-protector-strong -Wformat -Werror=format-security -fPIC -Isrc/lxml/includes -I/usr/include/python2.7 -c src/lxml/lxml.etree.c -o build/temp.linux-x86_64-2.7/src/lxml/lxml.etree.o -w
In file included from src/lxml/lxml.etree.c:320:0:
src/lxml/includes/etree_defs.h:14:31: fatal error: libxml/xmlversion.h: No such file or directory
compilation terminated.
Compile failed: command 'x86_64-linux-gnu-gcc' failed with exit status 1
cc -I/usr/include/libxml2 -c /tmp/xmlXPathInitE7ECqO.c -o tmp/xmlXPathInitE7ECqO.o
/tmp/xmlXPathInitE7ECqO.c:1:26: fatal error: libxml/xpath.h: No such file or directory
compilation terminated.
*********************************************************************************
Could not find function xmlCheckVersion in library libxml2. Is libxml2 installed?
*********************************************************************************
error: command 'x86_64-linux-gnu-gcc' failed with exit status 1

----------------------------------------

Command "/usr/bin/python -u -c "import setuptools, tokenize;file='/tmp/pip-build-BMJ17D/lxml/setup.py';exec(compile(getattr(tokenize, 'open', open)(file).read().replace('\r\n', '\n'), file, 'exec'))" install --record /tmp/pip-B3t7DG-record/install-record.txt --single-version-externally-managed --compile" failed with error code 1 in /tmp/pip-build-BMJ17D/lxml/

Overall bugs in datasploit

Did a source code analysis and found the following bugs in Datasploit :

active_default_file_check.py

  • host is blindly prefixed with http:// without checking whether it already contains http://
  • No check for Redirection
  • Line 35 is not necessary as there is no check for 403 Forbidden or any other specific error codes

Logical Errors:

  • If status_code for a non-existent page is 200, then all the existing pages will be rejected
  • The returned status_code could be a 301 redirect, but it will be stored as base_statuscode, i.e. the code returned for a non-existent page

domain_censys.py

  • As far as the output is concerned, results are given out as a list to the user.

domain_checkpunkspider.py

  • Python connection warnings are suppressed and requests is explicitly called with verify=False

Suggestions:

  • Give an output if the module didn't work.

domain_dnsrecords.py


domain_emailhunter.py


domain_forumsearch.py


domain_github.py

Not developed fully


domain_GooglePDF.py

Not working - not developed fully


domain_history.py


domain_pagelinks.py


domain_shodan.py


domain_sslinfo.py

  • HTML mishandled as JSON when the output is Not Found ( Issue #68 )
  • The URL used in line 15 returns 404 Not Found, which makes the script non-functional ( Issue #54 )

domain_subdomains.py

  • find_subdomains_from_wolfram(domain) at line 50 is not working as of now; needs development

Others:

  • Interesting issue #66
  • Line 172 prints "zero subdomains found here". It would be better to check whether there are subdomains other than the main domain before displaying the list. For example, if no subdomains are found, the output will be:

---> Finding subdomains, will be back soon with list.
zero subdomains found here
List of subdomains found
REDACTED_DOMAIN.com


domain_wappalyzer.py


domain_whois.py


domain_wikileaks.py


domain_zoomeye.py


domainOsint.py


email_fullcontact.py


emailOsint.py


facebook_user_details.py

  • Handle the JSON and display the specific user information (development required)
  • Check whether the username is present and handle the error if it is not

generate_passwords.py

  • No bugs, development required
  • NOT RECOMMENDED

git_searcher.py

  • Check if the user is valid
  • Output relevant information, not yet fully developed (Development Required)

instaUsernameOsint.py

  • Line 12 uses cfg.instagram_token, whereas instagram_token is not present in config_sample.py; instead, instagram_api is present in config_sample.py

ip_shodan.py

  • Handle JSON response
  • Not used by any script

ip_to_neighboursites.py

  • No bugs
  • Not used by any script

usernameOsint.py

  • Line 35 explicitly uses verify=False, which means SSL certificate errors are silently ignored

Others:

  • Not completely developed. Explicitly used GitHub actions against a user (@anantshri) as an example in comments

Overall bugs

  • No checking if sys.argv[1] is present
  • Colored output is not used in all scripts
  • No checking for API keys which are mandatory for some modules to work.

All these bugs will be removed / reduced in the upcoming pull requests

[ NOTE : Scripts in core directory were not checked ]
