
aws-auto-inventory's People

Contributors

dependabot[bot], elgamala, github-actions[bot], sleonov, stephenb87, valter-silva-au

aws-auto-inventory's Issues

CloudQuery Source Plugin Migration?

Hi team, hopefully this is the right place to ask; if not, I'd appreciate it if you could direct me.

I'm the founder of cloudquery.io, a high-performance open-source ELT framework.

We maintain an AWS plugin, which is widely used: users sync AWS APIs to a database or data lake to build asset inventories, "operational data lakes", and other use cases.

As we have limited capacity to maintain all plugins, we usually look for the official vendor to help maintain them (similar to a Terraform provider).

I was curious whether this would be an interesting collaboration, where we help with the initial version (already implemented) and you help maintain it going forward.

This would give your users the ability to sync/ELT AWS APIs to any of their data lakes, data warehouses, or databases easily, using any of the growing list of CQ destination plugins.

Best,
Yevgeny

running on Ubuntu 18.04, glibc requirement and version issue

Hi,

Upon trying to run the Linux binary on Ubuntu 18.04, aws-auto-inventory appears to require glibc 2.29 or newer:

$ aws-auto-inventory 
[24646] Error loading Python lib '/tmp/_MEIXX3OpN/libpython3.9.so.1.0': dlopen: /lib/x86_64-linux-gnu/libm.so.6: version `GLIBC_2.29' not found (required by /tmp/_MEIXX3OpN/libpython3.9.so.1.0)

While I did install glibc-source, the package version available in the Ubuntu 18.04 repositories is 2.27:

Package: glibc-source
Status: install ok installed
Priority: optional
Section: devel
Installed-Size: 23845
Maintainer: Ubuntu Developers <[email protected]>
Architecture: all
Multi-Arch: foreign
Source: glibc
Version: 2.27-3ubuntu1.4
Replaces: eglibc-source
Recommends: xz-utils
Conflicts: eglibc-source
Description: GNU C Library: sources
 This package contains the sources and patches which are needed to
 build glibc.
Homepage: https://www.gnu.org/software/libc/libc.html
Original-Maintainer: GNU Libc Maintainers <[email protected]>
Original-Vcs-Browser: https://salsa.debian.org/glibc-team/glibc
Original-Vcs-Git: https://salsa.debian.org/glibc-team/glibc.git

Any chance of packaging the required dependencies into the binary itself, the way aws-cli v2 does?
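
In the meantime, a runtime check could at least fail with a clearer message. A minimal sketch (hypothetical, not part of the current code base), assuming the binary targets glibc 2.29+ as the dlopen error above indicates:

import platform

# Hypothetical preflight check: detect the host glibc version before the
# PyInstaller bootloader fails with a cryptic dlopen error.
libc, version = platform.libc_ver()
if libc == "glibc" and tuple(map(int, version.split("."))) < (2, 29):
    raise SystemExit(
        f"This binary requires glibc >= 2.29, found {version}; "
        "consider building on an older glibc baseline or shipping a self-contained bundle."
    )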

Document `config.yaml` arguments

Congrats on the tool! It seems very powerful; however, some parameters I need are not documented. Would it be possible to document all supported arguments of the configuration YAML?

service output json file should contain both service/function names

The code in the main function that stores service results should use both the service name and the function name to create unique file names, since the same service can be scanned with several different functions and the results would otherwise overwrite each other.
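
A minimal sketch of what this could look like (a hypothetical helper, not the actual scan.py code):

import os

def result_path(output_dir, service, function):
    # Combine service and function so that, e.g., eks/list_clusters and
    # eks/describe_cluster no longer collide on the same file name.
    return os.path.join(output_dir, f"{service}-{function}.json")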

  • Version: Latest
  • Local OS Version: MacOS 13.6.2
  • Local chip architecture: Apple M1
  • Reproduces in: NA

Steps to Reproduce:

Run a scan with a config file containing multiple functions of the same service.

Allow Outputs of Other Sheets/Commands

I would like to document my inventory better; not all boto3 client actions return full details of my resources. It should be possible to use the output of one sheet as input to another, generating sheets with more details of AWS resources (see the boto3 sketch after the sample config below).

Possible config.yaml

---
Sheets: &sheets
  - name: EKSClusterNames
    service: eks
    function: list_clusters

  - name: EKSClusterDetails
    service: eks
    function: describe_cluster
    output_parameter: EKSClusterNames # loop the client action over the EKS cluster name list

inventories:
  - name: aws-test
    aws:
      profile: aws-test
      region:
        - us-east-1
        - sa-east-1
    excel:
      transpose: false
    sheets: *sheets
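
For reference, a minimal boto3 sketch of the chaining this config would enable (list_clusters and describe_cluster are real boto3 EKS calls; the chaining itself is the hypothetical new behaviour):

import boto3

# Feed each name returned by list_clusters into describe_cluster.
client = boto3.client("eks", region_name="us-east-1")
names = client.list_clusters()["clusters"]
details = [client.describe_cluster(name=n)["cluster"] for n in names]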

Scanning GovCloud for all Services Ends in JSON Serialization Error, Truncated Output

  • Version: 1.1.1
  • Local OS Version: Ubuntu 20.04.5 LTS
  • Local chip architecture: x86_64
  • Reproduces in: 432896750513

Steps to Reproduce:

  1. python3 scan.py -s scan/sample/all_services.json -r us-gov-west-1

output:

INFO:__main__:Finished processing for region: us-gov-west-1
ERROR:__main__:'us-gov-west-1' generated an exception: Object of type bytes is not JSON serializable
ERROR:__main__:Traceback (most recent call last):
  File "scan.py", line 334, in main
    json.dump(service_result["result"], f, cls=DateTimeEncoder)
  File "/usr/lib/python3.8/json/__init__.py", line 179, in dump
    for chunk in iterable:
  File "/usr/lib/python3.8/json/encoder.py", line 431, in _iterencode
    yield from _iterencode_dict(o, _current_indent_level)
  File "/usr/lib/python3.8/json/encoder.py", line 405, in _iterencode_dict
    yield from chunks
  File "/usr/lib/python3.8/json/encoder.py", line 325, in _iterencode_list
    yield from chunks
  File "/usr/lib/python3.8/json/encoder.py", line 405, in _iterencode_dict
    yield from chunks
  File "/usr/lib/python3.8/json/encoder.py", line 438, in _iterencode
    o = _default(o)
  File "scan.py", line 44, in default
    return super().default(o)
  File "/usr/lib/python3.8/json/encoder.py", line 179, in default
    raise TypeError(f'Object of type {o.__class__.__name__} '
TypeError: Object of type bytes is not JSON serializable

Total elapsed time for scanning: 0h:31m:1s
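
One possible fix, sketched below: extend the encoder used by scan.py so bytes values are base64-encoded instead of raising. (The datetime branch is an assumption about what the existing DateTimeEncoder already does; only the bytes branch is new.)

import base64
import datetime
import json

class DateTimeEncoder(json.JSONEncoder):
    def default(self, o):
        if isinstance(o, datetime.datetime):
            return o.isoformat()  # assumed existing behaviour
        if isinstance(o, bytes):
            # New: encode raw bytes as base64 so json.dump never fails.
            return base64.b64encode(o).decode("ascii")
        return super().default(o)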

Output dir error: The directory name is invalid

  • Version: Latest
  • Local OS Version: Windows 10 22H2
  • Local chip architecture:
  • Reproduces in: <environment, AWS Account ID>

Steps to Reproduce:
  1. Run .\aws-auto-inventory-windows.exe -s .\services\running_ec2.json --output_dir c:\Apps\aws-auto-inventory\output\
  2. The following output is produced:

ERROR:__main__:'us-east-1' generated an exception: [WinError 267] The directory name is invalid: 'c:\\Apps\\aws-auto-inventory\\output\\2023-08-25T11:08'
ERROR:__main__:Traceback (most recent call last):
  File "scan.py", line 317, in main
  File "os.py", line 215, in makedirs
  File "os.py", line 225, in makedirs
NotADirectoryError: [WinError 267] The directory name is invalid: 'c:\\Apps\\aws-auto-inventory\\output\\2023-08-25T11:08'

Total elapsed time for scanning: 0h:0m:2s
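
The timestamp in the failing path ('2023-08-25T11:08') contains colons, which Windows forbids in directory names. A minimal sketch of one possible fix (hypothetical; the real code lives around scan.py line 317):

import datetime
import os

# Use '-' instead of ':' in the timestamp so the path is valid on Windows.
timestamp = datetime.datetime.now().strftime("%Y-%m-%dT%H-%M")
output_dir = os.path.join(r"c:\Apps\aws-auto-inventory\output", timestamp)
os.makedirs(output_dir, exist_ok=True)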

make init build can't find cli.py

 ~/d/aws-auto-inventory-0.2.0 $ make init build 
python3 -m venv .venv
Requirement already satisfied: pip in ./.venv/lib/python3.9/site-packages (22.0.4)
Collecting pip
  Using cached pip-22.3.1-py3-none-any.whl (2.1 MB)
Installing collected packages: pip
  Attempting uninstall: pip
    Found existing installation: pip 22.0.4
    Uninstalling pip-22.0.4:
      Successfully uninstalled pip-22.0.4
Successfully installed pip-22.3.1
Collecting pyinstaller
  Using cached pyinstaller-5.7.0-py3-none-macosx_10_13_universal2.whl (923 kB)
Requirement already satisfied: setuptools>=42.0.0 in ./.venv/lib/python3.9/site-packages (from pyinstaller) (58.1.0)
Collecting altgraph
  Using cached altgraph-0.17.3-py2.py3-none-any.whl (21 kB)
Collecting macholib>=1.8
  Using cached macholib-1.16.2-py2.py3-none-any.whl (38 kB)
Collecting pyinstaller-hooks-contrib>=2021.4
  Using cached pyinstaller_hooks_contrib-2022.14-py2.py3-none-any.whl (252 kB)
Installing collected packages: altgraph, pyinstaller-hooks-contrib, macholib, pyinstaller
Successfully installed altgraph-0.17.3 macholib-1.16.2 pyinstaller-5.7.0 pyinstaller-hooks-contrib-2022.14
Collecting altgraph==0.17.2
  Using cached altgraph-0.17.2-py2.py3-none-any.whl (21 kB)
Collecting astroid==2.11.7
  Using cached astroid-2.11.7-py3-none-any.whl (251 kB)
Collecting boto3==1.24.50
  Using cached boto3-1.24.50-py3-none-any.whl (132 kB)
Collecting botocore==1.27.50
  Using cached botocore-1.27.50-py3-none-any.whl (9.0 MB)
Collecting cfgv==3.3.1
  Using cached cfgv-3.3.1-py2.py3-none-any.whl (7.3 kB)
Collecting confuse==2.0.0
  Using cached confuse-2.0.0-py3-none-any.whl (24 kB)
Collecting dill==0.3.5.1
  Using cached dill-0.3.5.1-py2.py3-none-any.whl (95 kB)
Collecting distlib==0.3.5
  Using cached distlib-0.3.5-py2.py3-none-any.whl (466 kB)
Collecting et-xmlfile==1.1.0
  Using cached et_xmlfile-1.1.0-py3-none-any.whl (4.7 kB)
Collecting filelock==3.8.0
  Using cached filelock-3.8.0-py3-none-any.whl (10 kB)
Collecting identify==2.5.3
  Using cached identify-2.5.3-py2.py3-none-any.whl (98 kB)
Collecting isort==5.10.1
  Using cached isort-5.10.1-py3-none-any.whl (103 kB)
Collecting jmespath==1.0.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting lazy-object-proxy==1.7.1
  Using cached lazy_object_proxy-1.7.1-cp39-cp39-macosx_12_0_arm64.whl
Collecting mccabe==0.7.0
  Using cached mccabe-0.7.0-py2.py3-none-any.whl (7.3 kB)
Collecting nodeenv==1.7.0
  Using cached nodeenv-1.7.0-py2.py3-none-any.whl (21 kB)
Collecting numpy==1.23.1
  Using cached numpy-1.23.1-cp39-cp39-macosx_11_0_arm64.whl (13.3 MB)
Collecting openpyxl==3.0.10
  Using cached openpyxl-3.0.10-py2.py3-none-any.whl (242 kB)
Collecting pandas==1.4.3
  Using cached pandas-1.4.3-cp39-cp39-macosx_11_0_arm64.whl (10.5 MB)
Collecting platformdirs==2.5.2
  Using cached platformdirs-2.5.2-py3-none-any.whl (14 kB)
Collecting pre-commit==2.20.0
  Using cached pre_commit-2.20.0-py2.py3-none-any.whl (199 kB)
Collecting pyinstaller==5.3
  Using cached pyinstaller-5.3-py3-none-macosx_10_13_universal2.whl (829 kB)
Collecting pyinstaller-hooks-contrib==2022.8
  Using cached pyinstaller_hooks_contrib-2022.8-py2.py3-none-any.whl (239 kB)
Collecting pylint==2.14.5
  Using cached pylint-2.14.5-py3-none-any.whl (488 kB)
Collecting python-dateutil==2.8.2
  Using cached python_dateutil-2.8.2-py2.py3-none-any.whl (247 kB)
Collecting pytz==2022.2
  Using cached pytz-2022.2-py2.py3-none-any.whl (504 kB)
Collecting PyYAML==6.0
  Using cached PyYAML-6.0-cp39-cp39-macosx_11_0_arm64.whl (173 kB)
Collecting s3transfer==0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting six==1.16.0
  Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)
Collecting toml==0.10.2
  Using cached toml-0.10.2-py2.py3-none-any.whl (16 kB)
Collecting tomli==2.0.1
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Collecting tomlkit==0.11.3
  Using cached tomlkit-0.11.3-py3-none-any.whl (35 kB)
Collecting urllib3==1.26.11
  Using cached urllib3-1.26.11-py2.py3-none-any.whl (139 kB)
Collecting virtualenv==20.16.3
  Using cached virtualenv-20.16.3-py2.py3-none-any.whl (8.8 MB)
Collecting wrapt==1.14.1
  Using cached wrapt-1.14.1-cp39-cp39-macosx_11_0_arm64.whl (35 kB)
Collecting XlsxWriter==3.0.3
  Using cached XlsxWriter-3.0.3-py3-none-any.whl (149 kB)
Collecting typing-extensions>=3.10
  Using cached typing_extensions-4.4.0-py3-none-any.whl (26 kB)
Requirement already satisfied: setuptools>=20.0 in ./.venv/lib/python3.9/site-packages (from astroid==2.11.7->-r requirements.txt (line 2)) (58.1.0)
Requirement already satisfied: macholib>=1.8 in ./.venv/lib/python3.9/site-packages (from pyinstaller==5.3->-r requirements.txt (line 22)) (1.16.2)
Installing collected packages: pytz, distlib, altgraph, XlsxWriter, wrapt, urllib3, typing-extensions, tomlkit, tomli, toml, six, PyYAML, pyinstaller-hooks-contrib, platformdirs, numpy, nodeenv, mccabe, lazy-object-proxy, jmespath, isort, identify, filelock, et-xmlfile, dill, cfgv, virtualenv, python-dateutil, pyinstaller, openpyxl, confuse, astroid, pylint, pre-commit, pandas, botocore, s3transfer, boto3
  Attempting uninstall: altgraph
    Found existing installation: altgraph 0.17.3
    Uninstalling altgraph-0.17.3:
      Successfully uninstalled altgraph-0.17.3
  Attempting uninstall: pyinstaller-hooks-contrib
    Found existing installation: pyinstaller-hooks-contrib 2022.14
    Uninstalling pyinstaller-hooks-contrib-2022.14:
      Successfully uninstalled pyinstaller-hooks-contrib-2022.14
  Attempting uninstall: pyinstaller
    Found existing installation: pyinstaller 5.7.0
    Uninstalling pyinstaller-5.7.0:
      Successfully uninstalled pyinstaller-5.7.0
Successfully installed PyYAML-6.0 XlsxWriter-3.0.3 altgraph-0.17.2 astroid-2.11.7 boto3-1.24.50 botocore-1.27.50 cfgv-3.3.1 confuse-2.0.0 dill-0.3.5.1 distlib-0.3.5 et-xmlfile-1.1.0 filelock-3.8.0 identify-2.5.3 isort-5.10.1 jmespath-1.0.1 lazy-object-proxy-1.7.1 mccabe-0.7.0 nodeenv-1.7.0 numpy-1.23.1 openpyxl-3.0.10 pandas-1.4.3 platformdirs-2.5.2 pre-commit-2.20.0 pyinstaller-5.3 pyinstaller-hooks-contrib-2022.8 pylint-2.14.5 python-dateutil-2.8.2 pytz-2022.2 s3transfer-0.6.0 six-1.16.0 toml-0.10.2 tomli-2.0.1 tomlkit-0.11.3 typing-extensions-4.4.0 urllib3-1.26.11 virtualenv-20.16.3 wrapt-1.14.1
make: *** [build] Error 1
 ~/d/aws-auto-inventory-0.2.0 $ cat build.txt                       
26 INFO: PyInstaller: 5.3
26 INFO: Python: 3.9.16
43 INFO: Platform: macOS-13.1-arm64-arm-64bit
44 INFO: wrote /Volumes/Development/aws-auto-inventory-0.2.0/aws-auto-inventory-darwin-arm64.spec
44 DEBUG: Testing for UPX ...
47 INFO: UPX is not available.
47 INFO: Removing temporary files and cleaning cache in /Users/stephenb87/Library/Application Support/pyinstaller
script '/Volumes/Development/aws-auto-inventory-0.2.0/cli.py' not found

Add option to output information as a CSV file or JSON instead of in a spreadsheet

The company I work for is looking at using this tool to create an inventory of our AWS resources. We would like the option to export the information gathered by the tool as CSV or JSON.

We have considered converting the file after the fact but we believe that others would benefit from this feature too.

Given that the tool uses pandas to write the output file, it should be simple to add other file formats.
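
For example, a minimal sketch (the DataFrame contents and file names here are made up for illustration):

import pandas as pd

df = pd.DataFrame([{"InstanceId": "i-0123456789abcdef0", "State": "running"}])
df.to_excel("inventory.xlsx", index=False)      # current behaviour
df.to_csv("inventory.csv", index=False)         # requested: CSV
df.to_json("inventory.json", orient="records")  # requested: JSON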

I would be happy to take on this task.

Result key should support jq based fields

Currently, the result_key in the scan JSON file only supports selecting a single top-level key from the response returned by the AWS service function (the rest, such as "ResponseMetadata", is discarded). Supporting jq-based expressions in the result_key option would allow more granular and shortened results.

Continue to support key-based filtering from the response:

[
  {
    "function": "list_clusters",
    "service": "emr",
    "parameters": {
      "ClusterStates": ["WAITING"]
    },
    "result_key": "Clusters"
  }
]

Also, support a jq-based filter query:

[
  {
    "function": "list_clusters",
    "service": "emr",
    "parameters": {
      "ClusterStates": ["WAITING"]
    },
    "result_key": ".Clusters[]|.Id, .Name"
  }
]
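
A minimal sketch of how the jq variant could be evaluated, using the PyPI jq package (one possible implementation, not necessarily what the maintainers would pick; the response data is made up):

import jq

response = {"Clusters": [{"Id": "j-1", "Name": "etl"}, {"Id": "j-2", "Name": "ml"}]}
# Plain key-based filtering stays trivial; jq expressions go through jq.compile().
result = jq.compile(".Clusters[]|.Id, .Name").input(response).all()
print(result)  # ['j-1', 'etl', 'j-2', 'ml']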
