cloud_enum's People

Contributors

codingo, davidmcduffie, gpxlnx, initstring, jamesconlan96, n7wera, nalauder, octaviovg, orduan, sg3-141-592, shiftlefter, six2dez, t1826749

cloud_enum's Issues

Improved wordlists

The wordlists for brute-forcing DNS and container names are quite short; I created them manually just to get a working tool going.

Longer wordlists mean more results, of course, but I want to be selective and not just import something random from another tool. Long wordlists also mean longer runtimes and a higher chance of detection, bans, etc.

Need to investigate this.

Required permissions and address

Hi,
I would like to use the tool, but I have some questions that I could not resolve from the Git description:

  1. What permissions are required to run the tool against each platform (AWS, Azure, GCP)?
  2. Which address should I run it against on each of the platforms?

Thanks ahead.

Speed up DNS brute forcing

Need to improve DNS lookups, and also allow the user to specify a DNS server.

Possibly, could use subprocess to queue OS commands to handle this in batches. Need to test.
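The batching idea above could be sketched like this; `batch` and `lookup_batch` are hypothetical helper names, and the snippet assumes the external `host` utility is on the PATH:

```python
# Sketch only: queue `host` lookups as OS subprocesses in fixed-size
# batches, reaping each batch before launching the next.
import subprocess
from itertools import islice

def batch(iterable, size):
    """Yield successive chunks of `size` items from an iterable."""
    it = iter(iterable)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            return
        yield chunk

def lookup_batch(names, nameserver="8.8.8.8"):
    """Launch one `host` process per name, then wait on them together.

    `host` exits 0 when the name resolves, non-zero otherwise.
    """
    pending = {name: subprocess.Popen(["host", name, nameserver],
                                      stdout=subprocess.DEVNULL,
                                      stderr=subprocess.DEVNULL)
               for name in names}
    return [name for name, proc in pending.items() if proc.wait() == 0]

# Usage: for chunk in batch(candidates, 15): valid += lookup_batch(chunk)
```

Letting the caller pass `nameserver` through would also cover the "specify a DNS server" part of this issue.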

"Skip" key

Look into viability of allowing the user to interactively skip running tests by pressing 'S' or something.

Logging

Need to implement logging. Currently, using something like tee to manually create logs is workable, but all the sys.stdout.write / flush / unicode color escapes make re-reading those logs a bit messy.

Probably will log only found items.

Move to MIT License

I don't think a GPL license makes sense for this project, MIT seems a better fit.

I would like to change it, but want to make sure everyone who has contributed so far is ok with that change, as it was GPL when you committed code.

I'll leave this issue open for 30 days, if no one disagrees I will make the change in the next release.

If I tagged you and you are ok with this, it would be great if you could 👍 this.

Thanks!

@sg3-141-592 @codingo @gpxlnx @OrDuan @nalauder @octaviovg @N7WEra @ShiftLefter

Capital letters bork GCP enumeration

For some reason, GCP does not like bucket requests with capital letters.
I should either:

  • sub and clean all keywords at argparse time (do other services care about case?)
  • look into which individual checks need to be cleaned
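The first option could look roughly like the sketch below; whether stripping characters beyond lowercasing is right for every service still needs checking:

```python
# Sketch only: normalize keywords once, at argparse time. GCP bucket
# names must be lowercase; lowercasing should also be harmless for S3
# and Azure, but that is an assumption worth verifying per service.
import re

def clean_keyword(keyword):
    """Lowercase and drop characters invalid in bucket/DNS names."""
    keyword = keyword.lower()
    return re.sub(r"[^a-z0-9.\-]", "", keyword)

# clean_keyword("Some_Company!") -> "somecompany"
```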

Unnecessary return of a variable, or is some logic missing?

In the enum_tools directory, see azure_checks.py. It contains this call:
utils.fast_dns_lookup(candidates, nameserver, callback=print_website_response, threads=threads)
Nothing is done with the valid_names value that fast_dns_lookup returns.
Is it needed for some later processing, or is the return accidental?

Disabled Storage Account

I have just started using this tool. It seems to be a nice one, but I wonder why it reports a Disabled Storage Account. I don't know if we can do anything with this. Please let me know the attack vector so that I can learn more about exploiting this.

Disabled Storage Account: http://abc.blob.core.windows.net/

[Feature request] Improve output customisation

Hello!

Description
I am trying to access the output of this tool programmatically, and it is slightly more complicated than it could be due to all the logging and colouring in the output. I've dropped here a possible solution I've managed to come up with; let me know your thoughts!

Possible solution

  1. Add a --colourless flag that makes the tool output everything without colours.
  2. Add a --silent flag that makes the tool not output anything to stdout other than the results. This would imply the output being colourless too.
  3. Consider --logfile - or -l - to mean "output to stdout". This is common behaviour defined in the POSIX utility syntax guidelines (for utilities that use operands to represent files to be opened for either reading or writing, the '-' operand should be used only to mean standard input (or standard output when it is clear from context that an output file is being specified)). If such a value was specified, then the output should be silent too.
  4. Add a --format flag which allows selection of the output format desired (e.g.: --format=jsonlines, --format=csv, etc...). In the former, the tool could return a format such as {"platform": "aws", "status":"protected", "url":"xxxx.xxx.xxx"} for every result. Formatted results would then be output to the specified logfile. Note that I've used those formats as examples on purpose, as they would allow outputting results "on the go".
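For point 4, the `jsonlines` case could be as simple as the following sketch (`format_jsonline` is a hypothetical name; the field names mirror the example above):

```python
# Sketch only: one JSON object per finding, so consumers can
# stream-parse results as they are produced.
import json

def format_jsonline(platform, status, url):
    """Serialise a single finding as one line of JSON."""
    return json.dumps({"platform": platform, "status": status, "url": url})
```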

Thank you!

Cannot start brute-forcing on Azure checks

Hi,

Whatever I try to search, the Amazon checks work, but once it starts the Azure checks I get the following message:

[+] Checking for S3 buckets

Elapsed time: 00:02:15

++++++++++++++++++++++++++
azure checks
++++++++++++++++++++++++++

[+] Checking for Azure Storage Accounts
[*] Brute-forcing a list of 445 possible DNS names
Traceback (most recent call last):
  File "./cloud_enum.py", line 218, in <module>
    main()
  File "./cloud_enum.py", line 205, in main
    azure_checks.run_all(names, args)
  File "/cloud_enum/enum_tools/azure_checks.py", line 286, in run_all
    valid_accounts = check_storage_accounts(names, args.threads,
  File "/cloud_enum/enum_tools/azure_checks.py", line 77, in check_storage_accounts
    valid_names = utils.fast_dns_lookup(candidates, nameserver)
  File "/cloud_enum/enum_tools/utils.py", line 129, in fast_dns_lookup
    batch_pending[name] = subprocess.Popen(cmd,
  File "/usr/lib/python3.8/subprocess.py", line 854, in __init__
    self._execute_child(args, executable, preexec_fn, close_fds,
  File "/usr/lib/python3.8/subprocess.py", line 1702, in _execute_child
    raise child_exception_type(errno_num, err_msg, err_filename)
FileNotFoundError: [Errno 2] No such file or directory: 'host'

Please advise.
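For context: the FileNotFoundError means the external `host` utility is missing; the DNS lookups shell out to it, per the traceback. Installing it (commonly packaged as `bind9-host` or `bind-utils`, though package names vary by distro) resolves the crash. A defensive pre-flight check could fail with a clearer message; `require_host_binary` is a hypothetical helper:

```python
# Sketch only: abort early with a readable error if `host` is absent.
import shutil
import sys

def require_host_binary():
    """Exit with a friendly message when `host` is not on the PATH."""
    if shutil.which("host") is None:
        sys.exit("[!] The 'host' utility is required for DNS checks. "
                 "Install it and re-run (e.g. package 'bind9-host' or "
                 "'bind-utils').")
```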

Exception Handling

Add common exception handling. Priority on recovering from failed HTTP connections.
Initial release is living life on the wild side, which will become problematic when using with larger wordlists and sketchy Internet connections.

Test writeable S3 buckets

Received a request in a security-focused Slack channel to add bucket write checks for S3.

I'll take a look at this for both S3 and GCS (and eventually Azure too). I don't plan to ever add functionality that requires keys or credentials, so it will be dependent on whether or not those actions are possible in a totally pre-auth manner.
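A pre-auth write probe might look like the sketch below: attempt an anonymous PUT of a tiny object and inspect the response. `build_write_probe`, `s3_bucket_writable`, and the object key are hypothetical, and a real check should delete any object it manages to create.

```python
# Sketch only: unauthenticated S3 write probe. HTTP 200 on an
# anonymous PUT suggests the bucket is world-writable; 403 means
# writes are blocked.
import urllib.error
import urllib.request

def build_write_probe(bucket):
    """Build an anonymous PUT request for a tiny test object."""
    url = f"http://{bucket}.s3.amazonaws.com/cloud_enum_write_test.txt"
    return urllib.request.Request(url, data=b"write-test", method="PUT")

def s3_bucket_writable(bucket, timeout=10):
    """Return True only when the anonymous PUT is accepted."""
    try:
        with urllib.request.urlopen(build_write_probe(bucket),
                                    timeout=timeout) as resp:
            return resp.status == 200
    except urllib.error.URLError:
        return False
```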

requirements.txt invocation to be changed in readme.md

Hi,

Currently in the setup part of readme.md, the method to install from requirements.txt is given as follows:

pip3 install -r ./requirements.txt

This should be changed to

pip3 install -r requirements.txt

Thanks for the awesome work,
Kiran

Called from another script

When I execute cloud_enum from inside another script it sometimes won't exit out of the script. I just see the "All done, happy hacking!" message and I have to ctrl-c out of the script.

JSON output

Is it possible to develop/implement a JSON output feature?
Thanks

Error

Getting an error while running:
python cloud_enum.py -k xxx
[!] Cannot access mutations file: /enum_tools/fuzz.txt

[Azure] The system cannot find the file specified

When running the Azure enumerations, I get a FileNotFoundError: [WinError 2] The system cannot find the file specified error. However, I have verified that the AWS and GCP enumerations run just fine.

I have attempted to fix the default file paths within cloud_enum.py/parse_arguements() to fit Windows conventions, but that did not change anything.

   # Use included mutations file by default, or let the user provide one
    parser.add_argument('-m', '--mutations', type=str, action='store',
                        default=script_path + '\\enum_tools\\fuzz.txt',
                        help='Mutations. Default: enum_tools/fuzz.txt')
    # Use include container brute-force or let the user provide one
    parser.add_argument('-b', '--brute', type=str, action='store',
                        default=script_path + '\\enum_tools\\fuzz.txt',
                        help='List to brute-force Azure container names.'
                        '  Default: enum_tools/fuzz.txt')

Running on Windows 10

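A portable fix for the defaults shown above is to build them with os.path.join, which picks the right separator on every platform (`default_wordlist` is a hypothetical helper for illustration):

```python
# Sketch only: platform-neutral default path, no '\\' vs '/' guessing.
import os

def default_wordlist(script_path):
    """Return the bundled fuzz.txt path using native separators."""
    return os.path.join(script_path, "enum_tools", "fuzz.txt")

# parser.add_argument('-m', '--mutations', type=str, action='store',
#                     default=default_wordlist(script_path), ...)
```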

TypeError on the open() function

The following is what it output before and after the crash...

root@csi-analyst:/home/csi/cloud_enum# python cloud_enum.py -k binance

##########################
cloud_enum
github.com/initstring
##########################

Keywords: binance
Mutations: /home/csi/cloud_enum/enum_tools/fuzz.txt
Brute-list: /home/csi/cloud_enum/enum_tools/fuzz.txt

Traceback (most recent call last):
  File "cloud_enum.py", line 234, in <module>
    main()
  File "cloud_enum.py", line 213, in main
    mutations = read_mutations(args.mutations)
  File "cloud_enum.py", line 152, in read_mutations
    with open(mutations_file, encoding="utf8", errors="ignore") as infile:
TypeError: 'errors' is an invalid keyword argument for this function
root@csi-analyst:/home/csi/cloud_enum# python cloud_enum.py -k binance -ns binance.com

##########################

Google Cloud Functions checks fail when the candidate subdomain is too long.

I have a domain I'm trying to cloud_enum. Let's say this is "preprod-second-hand-elastic-standalone-abcdefghi-abcdefgh.REDCTcloud.com"

This is an acceptable length for a subdomain, and it does resolve. But, adding the fuzz to it makes it too long, and thus fails.

Perhaps a length check on subdomain + fuzz strings before attempting the check? If any component is too long, then skip as there's no way it'd be a positive result?
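The proposed pre-check could be a simple validity filter: DNS limits each dot-separated label to 63 characters and the full name to 253, and the region prefix (us-central1-) is what pushes this example over. A sketch, where `dns_name_valid` is a hypothetical helper:

```python
# Sketch only: drop candidates that can never resolve before making
# any request.
def dns_name_valid(name):
    """Check DNS size limits: labels <= 63 chars, whole name <= 253."""
    if len(name) > 253:
        return False
    return all(0 < len(label) <= 63 for label in name.split("."))

# dns_name_valid("us-central1-" + "a" * 60 + ".cloudfunctions.net") -> False
```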

[+] Checking for project/zones with Google Cloud Functions.
[*] Testing across 1 regions defined in the config file
Traceback (most recent call last):
  File "/home/dnx/3rdparty/cloud_enum/cloud_enum.py", line 255, in <module>
    main()
  File "/home/dnx/3rdparty/cloud_enum/cloud_enum.py", line 244, in main
    gcp_checks.run_all(names, args)
  File "/home/dnx/3rdparty/cloud_enum/enum_tools/gcp_checks.py", line 390, in run_all
    check_functions(names, args.brute, args.quickscan, args.threads)
  File "/home/dnx/3rdparty/cloud_enum/enum_tools/gcp_checks.py", line 338, in check_functions
    utils.get_url_batch(candidates, use_ssl=False,
  File "/home/dnx/3rdparty/cloud_enum/enum_tools/utils.py", line 88, in get_url_batch
    batch_results[url] = batch_pending[url].result(timeout=30)
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib64/python3.11/concurrent/futures/_base.py", line 449, in result
    return self.__get_result()
           ^^^^^^^^^^^^^^^^^^^
  File "/usr/lib64/python3.11/concurrent/futures/_base.py", line 401, in __get_result
    raise self._exception
  File "/usr/lib64/python3.11/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/dnx/venv/lib64/python3.11/site-packages/requests/sessions.py", line 589, in request
    resp = self.send(prep, **send_kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/dnx/venv/lib64/python3.11/site-packages/requests/sessions.py", line 703, in send
    r = adapter.send(request, **kwargs)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/dnx/venv/lib64/python3.11/site-packages/requests/adapters.py", line 486, in send
    resp = conn.urlopen(
           ^^^^^^^^^^^^^
  File "/home/dnx/venv/lib64/python3.11/site-packages/urllib3/connectionpool.py", line 790, in urlopen
    response = self._make_request(
               ^^^^^^^^^^^^^^^^^^^
  File "/home/dnx/venv/lib64/python3.11/site-packages/urllib3/connectionpool.py", line 496, in _make_request
    conn.request(
  File "/home/dnx/venv/lib64/python3.11/site-packages/urllib3/connection.py", line 395, in request
    self.endheaders()
  File "/usr/lib64/python3.11/http/client.py", line 1281, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/usr/lib64/python3.11/http/client.py", line 1041, in _send_output
    self.send(msg)
  File "/usr/lib64/python3.11/http/client.py", line 979, in send
    self.connect()
  File "/home/dnx/venv/lib64/python3.11/site-packages/urllib3/connection.py", line 243, in connect
    self.sock = self._new_conn()
                ^^^^^^^^^^^^^^^^
  File "/home/dnx/venv/lib64/python3.11/site-packages/urllib3/connection.py", line 203, in _new_conn
    sock = connection.create_connection(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/dnx/venv/lib64/python3.11/site-packages/urllib3/util/connection.py", line 58, in create_connection
    raise LocationParseError(f"'{host}', label empty or too long") from None
urllib3.exceptions.LocationParseError: Failed to parse: 'us-central1-preprod-second-hand-elastic-standalone-abcdefghi-abcdefgh.REDCTcloud.com.cloudfunctions.net', label empty or too long

Breaks when given a very large fuzz list

First of all, very nice tool. I used the commonspeak subdomain wordlist to test it, but it breaks when run:
Mutations: /home/sanath/tools/juicy/cloud_enum/enum_tools/fuzz.txt
Brute-list: /home/sanath/tools/juicy/cloud_enum/enum_tools/fuzz.txt

[+] Mutations list imported: 484943 items
[+] Mutated results: 2909659 items

++++++++++++++++++++++++++
amazon checks
++++++++++++++++++++++++++

[+] Checking for S3 buckets
Traceback (most recent call last):
  File "/usr/lib/python3.8/encodings/idna.py", line 165, in encode
    raise UnicodeError("label empty or too long")
UnicodeError: label empty or too long

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "cloud_enum.py", line 243, in <module>
    main()
  File "cloud_enum.py", line 228, in main
    aws_checks.run_all(names, args)
  File "/home/sanath/tools/juicy/cloud_enum/enum_tools/aws_checks.py", line 130, in run_all
    check_s3_buckets(names, args.threads)
  File "/home/sanath/tools/juicy/cloud_enum/enum_tools/aws_checks.py", line 84, in check_s3_buckets
    utils.get_url_batch(candidates, use_ssl=False,
  File "/home/sanath/tools/juicy/cloud_enum/enum_tools/utils.py", line 81, in get_url_batch
    batch_results[url] = batch_pending[url].result(timeout=30)
  File "/usr/lib/python3.8/concurrent/futures/_base.py", line 432, in result
    return self.__get_result()
  File "/usr/lib/python3.8/concurrent/futures/_base.py", line 388, in __get_result
    raise self._exception
  File "/usr/lib/python3.8/concurrent/futures/thread.py", line 57, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/home/sanath/.local/lib/python3.8/site-packages/requests/sessions.py", line 533, in request
    resp = self.send(prep, **send_kwargs)
  File "/home/sanath/.local/lib/python3.8/site-packages/requests/sessions.py", line 646, in send
    r = adapter.send(request, **kwargs)
  File "/home/sanath/.local/lib/python3.8/site-packages/requests/adapters.py", line 439, in send
    resp = conn.urlopen(
  File "/home/sanath/.local/lib/python3.8/site-packages/urllib3/connectionpool.py", line 665, in urlopen
    httplib_response = self._make_request(
  File "/home/sanath/.local/lib/python3.8/site-packages/urllib3/connectionpool.py", line 387, in _make_request
    conn.request(method, url, **httplib_request_kw)
  File "/usr/lib/python3.8/http/client.py", line 1255, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1301, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1250, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1010, in _send_output
    self.send(msg)
  File "/usr/lib/python3.8/http/client.py", line 950, in send
    self.connect()
  File "/home/sanath/.local/lib/python3.8/site-packages/urllib3/connection.py", line 184, in connect
    conn = self._new_conn()
  File "/home/sanath/.local/lib/python3.8/site-packages/urllib3/connection.py", line 156, in _new_conn
    conn = connection.create_connection(
  File "/home/sanath/.local/lib/python3.8/site-packages/urllib3/util/connection.py", line 61, in create_connection
    for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
  File "/usr/lib/python3.8/socket.py", line 918, in getaddrinfo
    for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
