pwn0sec / PwnXSS
PwnXSS: Vulnerability (XSS) scanner exploit
License: MIT License
Hey, I really like your project, but please add auto-saving of valid XSS outputs, or a function to save output for long scans.
Thanks!
Hello, I decided to use your tool to help me hunt for XSS vulnerabilities. Long story short: I installed everything as needed and attempted a simple scan on my target. It works, but it stops after only the INFO lines and then returns to the prompt (kali@kali:~/PwnXSS). What could be the reason it doesn't scan my target successfully? Is there something else I need to install to make this tool work?
How can I save the results into a text file? The output is overly verbose.
python3 pwnxss.py -u http://site.com 1
Traceback (most recent call last):
  File "/home/kali/PwnXSS/pwnxss.py", line 7, in <module>
    from lib.helper.helper import *
ModuleNotFoundError: No module named 'lib.helper'
Version: Python 3.9.9
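This error usually means Python cannot see the bundled `lib/` package, which happens when the script is launched from outside the cloned PwnXSS directory. A minimal sketch of a workaround (the clone path below is a hypothetical example, adjust it to your setup):

```python
import os
import sys

# `from lib.helper.helper import *` resolves against sys.path, which
# normally starts with the script's own directory. If pwnxss.py is run
# from elsewhere, prepend the clone's root first (hypothetical path):
sys.path.insert(0, os.path.expanduser("~/PwnXSS"))

# After this, `from lib.helper.helper import *` can resolve, provided
# the clone actually contains lib/helper/helper.py.
```

Simpler still: run from the repository root (`cd PwnXSS && python3 pwnxss.py ...`) and no code change is needed.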
I used the commands below, but the tool isn't working with a proxy.
Let me know what I am doing wrong.
The scan sometimes fails with an exception.
Traceback (most recent call last):
  File "pwnxss.py", line 73, in <module>
    start()
  File "pwnxss.py", line 52, in start
    core.main(getopt.u,getopt.proxy,getopt.user_agent,check(getopt),getopt.cookie,getopt.method)
  File "[hidden]/PwnXSS/lib/core.py", line 165, in main
    self.get_method()
  File "[hidden]/PwnXSS/lib/core.py", line 135, in get_method
    _respon=self.session.get(test)
  File "/usr/local/lib/python3.6/dist-packages/requests/sessions.py", line 546, in get
    return self.request('GET', url, **kwargs)
  File "/usr/local/lib/python3.6/dist-packages/requests/sessions.py", line 533, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/local/lib/python3.6/dist-packages/requests/sessions.py", line 668, in send
    history = [resp for resp in gen] if allow_redirects else []
  File "/usr/local/lib/python3.6/dist-packages/requests/sessions.py", line 668, in <listcomp>
    history = [resp for resp in gen] if allow_redirects else []
  File "/usr/local/lib/python3.6/dist-packages/requests/sessions.py", line 165, in resolve_redirects
    raise TooManyRedirects('Exceeded %s redirects.' % self.max_redirects, response=resp)
requests.exceptions.TooManyRedirects: Exceeded 30 redirects.
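The crawler follows redirects with requests' defaults, so a single redirect loop kills the whole run after 30 hops. One possible mitigation (a sketch, not PwnXSS's actual code): cap the redirect limit and treat the exception as a per-URL failure instead of a fatal one.

```python
import requests

session = requests.Session()
session.max_redirects = 5  # give up on redirect loops quickly

def safe_get(sess: requests.Session, url: str):
    """Fetch one URL; return None instead of crashing on redirect loops."""
    try:
        return sess.get(url)
    except requests.exceptions.TooManyRedirects:
        return None
```

A crawler built around `safe_get` would log and skip the looping URL rather than abort the scan.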
thanks for contributing
Sites containing a tel:// link break the process:
[20:53:08] [INFO] Checking connection to: tel://00000000
[20:53:08] [CRITICAL] Internal error: No connection adapters were found for 'tel://00000000'
Traceback (most recent call last):
  File "pwnxss.py", line 73, in <module>
    start()
  File "pwnxss.py", line 54, in start
    crawler.crawl(getopt.u,int(getopt.depth),getopt.proxy,getopt.user_agent,check(getopt),getopt.method,getopt.cookie)
  File "/home//temp/PwnXSS/lib/crawler/crawler.py", line 52, in crawl
    self.crawl(url,depth-1,base,proxy,level,method,cookie)
  File "/home//temp/PwnXSS/lib/crawler/crawler.py", line 44, in crawl
    urls=self.getLinks(base,proxy,headers,cookie)
  File "/home/******/temp/PwnXSS/lib/crawler/crawler.py", line 19, in getLinks
    text=conn.get(base).text
  File "/usr/local/lib/python3.8/site-packages/requests/sessions.py", line 537, in get
    return self.request('GET', url, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/requests/sessions.py", line 524, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/local/lib/python3.8/site-packages/requests/sessions.py", line 631, in send
    adapter = self.get_adapter(url=request.url)
  File "/usr/local/lib/python3.8/site-packages/requests/sessions.py", line 722, in get_adapter
    raise InvalidSchema("No connection adapters were found for '%s'" % url)
requests.exceptions.InvalidSchema: No connection adapters were found for 'tel://00000000'
How can I get around this? For example, with some "do not check certificate" type of option?
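A likely root cause: the crawler feeds every discovered `href` straight to requests, and requests only has connection adapters for http/https. A crawler-side guard could skip non-fetchable schemes (a sketch, not the project's actual code):

```python
from urllib.parse import urlparse

FETCHABLE_SCHEMES = {"http", "https"}

def is_fetchable(url: str) -> bool:
    """True only for URLs that requests can actually open."""
    return urlparse(url).scheme.lower() in FETCHABLE_SCHEMES

# tel: and mailto: links get filtered out before any GET is attempted.
links = ["https://example.com/contact", "tel:0800006272", "mailto:a@b.example"]
print([u for u in links if is_fetchable(u)])  # only the https link survives
```

Applying a filter like this before `conn.get(...)` in `getLinks` would prevent the InvalidSchema crash entirely.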
I don't know how to use --cookie with this tool.
I tried --cookie="" and --cookie='', but neither worked.
Hello,
I get an error when the crawler checks a link tag that contains a phone number (a "phone contact" link):
[21:15:28] [CRITICAL] Internal error: No connection adapters were found for 'tel:0800006272'
Traceback (most recent call last):
  File "PwnXSS-master/pwnxss.py", line 73, in <module>
    start()
  File "PwnXSS-master/pwnxss.py", line 54, in start
    crawler.crawl(getopt.u,int(getopt.depth),getopt.proxy,getopt.user_agent,check(getopt),getopt.method,getopt.cookie)
  File "PwnXSS-master/lib/crawler/crawler.py", line 50, in crawl
    self.crawl(url,depth-1,base,proxy,level,method,cookie)
  File "PwnXSS-master/lib/crawler/crawler.py", line 50, in crawl
    self.crawl(url,depth-1,base,proxy,level,method,cookie)
  File "PwnXSS-master/lib/crawler/crawler.py", line 42, in crawl
    urls=self.getLinks(base,proxy,headers,cookie)
  File "PwnXSS-master/lib/crawler/crawler.py", line 19, in getLinks
    text=conn.get(base).text
  File "/usr/local/lib/python3.9/site-packages/requests/sessions.py", line 555, in get
    return self.request('GET', url, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/requests/sessions.py", line 542, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/local/lib/python3.9/site-packages/requests/sessions.py", line 649, in send
    adapter = self.get_adapter(url=request.url)
  File "/usr/local/lib/python3.9/site-packages/requests/sessions.py", line 742, in get_adapter
    raise InvalidSchema("No connection adapters were found for {!r}".format(url))
requests.exceptions.InvalidSchema: No connection adapters were found for 'tel:0800006272'
Traceback (most recent call last):
  File "C:\Users\DELL\3D Objects\PwnXSS-master\pwnxss.py", line 73, in <module>
    start()
  File "C:\Users\DELL\3D Objects\PwnXSS-master\pwnxss.py", line 52, in start
    core.main(getopt.u,getopt.proxy,getopt.user_agent,check(getopt),getopt.cookie,getopt.method)
  File "C:\Users\DELL\3D Objects\PwnXSS-master\lib\core.py", line 179, in main
    self.post_method()
  File "C:\Users\DELL\3D Objects\PwnXSS-master\lib\core.py", line 68, in post_method
    req=self.session.post(urljoin(self.url,action),data=keys)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\DELL\anaconda3\Lib\site-packages\requests\sessions.py", line 635, in post
    return self.request("POST", url, data=data, json=json, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\DELL\anaconda3\Lib\site-packages\requests\sessions.py", line 587, in request
    resp = self.send(prep, **send_kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\DELL\anaconda3\Lib\site-packages\requests\sessions.py", line 723, in send
    history = [resp for resp in gen]
              ^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\DELL\anaconda3\Lib\site-packages\requests\sessions.py", line 723, in <listcomp>
    history = [resp for resp in gen]
              ^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\DELL\anaconda3\Lib\site-packages\requests\sessions.py", line 191, in resolve_redirects
    raise TooManyRedirects(
requests.exceptions.TooManyRedirects: Exceeded 30 redirects.
Describe the bug
To Reproduce
Steps to reproduce the behavior:
When I run the script --> python3 pwnxss.py -u https://mysite.com
I see this error:
Traceback (most recent call last):
  File "/opt/PwnXSS/pwnxss.py", line 7, in <module>
    from lib.helper.helper import *
ModuleNotFoundError: No module named 'lib.helper'
A greeting and thanks
Awesome accurate script! Thanks
Describe the bug
it just exits after printing the following log:
[02:38:16] [INFO] Starting PwnXSS...
[02:38:16] [INFO] Checking connection to:
[02:38:16] [INFO] Connection estabilished 200
To Reproduce
python3 pwnxss.py -u https://the-url/ --user-agent "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:82.0) Gecko/20100101 Firefox/82.0"
Expected behavior
crawling the website
Traceback (most recent call last):
  File "pwnxss.py", line 7, in <module>
    from lib.helper.helper import *
ModuleNotFoundError: No module named 'lib.helper'
Is your feature request related to a problem? Please describe.
I would love to have the script log every XSS it finds, to keep track of where it found them.
Describe the solution you'd like
I would love to have the script log every XSS it finds, to keep track of where it found them.
Describe alternatives you've considered
Piping through tee xss.txt works, but the output is too verbose.
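A sketch of the kind of findings log this request describes (a hypothetical helper, not part of PwnXSS): append one line per confirmed hit to a results file, independent of the verbose console output.

```python
from datetime import datetime

def log_finding(url: str, param: str, payload: str,
                path: str = "xss_found.txt") -> None:
    """Append one confirmed XSS finding to a plain-text results file."""
    stamp = datetime.now().strftime("%H:%M:%S")
    with open(path, "a", encoding="utf-8") as fh:
        fh.write(f"[{stamp}] {url} param={param} payload={payload}\n")
```

Calling a helper like this at the point where the scanner confirms a payload would give exactly the "where did I find it" record the request asks for, without the noise of the full log.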
Is your feature request related to a problem? Please describe.
I want to work with my best friends.
Describe the solution you'd like
Max is an amazing guy, a super top developer, and my best friend. Max loves C.
@github developer maintainer. Full-Time Open-Sourcerer. Full-stack programmer. In the path of a real computer engineer... (Compiler enthusiast)
Describe alternatives you've considered
There is no alternative to git push --force
Additional context
Everyone can join our GitHub Organization just open a new issue and follow the templates.
One (onelang) is an open-source system programming language that makes it easy to build reliable, efficient and performant software.
Line 44 in 4f75e6a
When I try to run this from the CLI in Kali, the proxy gets handled as a string by the underlying requests module and throws an error:
/lib/python3.7/site-packages/requests/sessions.py", line 290, in rebuild_proxies
new_proxies = proxies.copy()
This appears to be because the parameter arrives as a string, even when passed as --proxy={'https':'https://localhost:8080'}.
Maybe separate parameters for https, http, and no-proxy would get around this? Am I missing something simple?
Describe the bug
Error: AttributeError: 'str' object has no attribute 'copy'
To Reproduce
Screenshots
[PwnXSS ASCII-art banner] {v0.5 Final}
<<<<<<< STARTING >>>>>>>
[10:38:31] [INFO] Starting PwnXSS...
***************
[10:38:31] [INFO] Checking connection to: https://webdemo.cloud.invgate.net
[10:38:32] [INFO] Connection estabilished 200
[10:38:32] [WARNING] Target have form with POST method: https://webdemo.cloud.invgate.net/auth/login/type/dummy
[10:38:32] [INFO] Collecting form input key.....
[10:38:32] [INFO] Internal error: 'name'
[10:38:32] [INFO] Form key name: next value: <script>console.log(5000/3000)</script>
[10:38:32] [INFO] Sending payload (POST) method...
[10:38:33] [INFO] Parameter page using (POST) payloads but not 100% yet...
[10:38:33] [WARNING] Target have form with POST method: https://webdemo.cloud.invgate.net/auth/login/type/servicedesk
[10:38:33] [INFO] Collecting form input key.....
[10:38:33] [INFO] Form key name: value value: <script>console.log(5000/3000)</script>
[10:38:33] [INFO] Form key name: password value: <script>console.log(5000/3000)</script>
[10:38:33] [INFO] Internal error: 'name'
[10:38:33] [INFO] Form key name: next value: <script>console.log(5000/3000)</script>
[10:38:33] [INFO] Form key name: CSRFToken value: <script>console.log(5000/3000)</script>
[10:38:33] [INFO] Sending payload (POST) method...
[10:38:33] [INFO] Parameter page using (POST) payloads but not 100% yet...
***************
[10:38:35] [INFO] Checking connection to: https://webdemo.cloud.invgate.net/password-reset/trigger
[10:38:36] [INFO] Connection estabilished 200
[10:38:36] [WARNING] Target have form with POST method: https://webdemo.cloud.invgate.net/password-reset/trigger
[10:38:36] [INFO] Collecting form input key.....
[10:38:36] [INFO] Form key name: email value: <script>prompt(5000/200)</script>
[10:38:36] [INFO] Form key name: submit value: <Submit Confirm>
[10:38:36] [INFO] Form key name: CSRFToken value: <script>prompt(5000/200)</script>
[10:38:36] [INFO] Sending payload (POST) method...
[10:38:36] [INFO] Parameter page using (POST) payloads but not 100% yet...
Traceback (most recent call last):
  File "/Users/faguirre/Desktop/PwnXSS/pwnxss.py", line 73, in <module>
    start()
  File "/Users/faguirre/Desktop/PwnXSS/pwnxss.py", line 54, in start
    crawler.crawl(getopt.u,int(getopt.depth),getopt.proxy,getopt.user_agent,check(getopt),getopt.method,getopt.cookie)
  File "/Users/faguirre/Desktop/PwnXSS/lib/crawler/crawler.py", line 52, in crawl
    self.crawl(url,depth-1,base,proxy,level,method,cookie)
  File "/Users/faguirre/Desktop/PwnXSS/lib/crawler/crawler.py", line 44, in crawl
    urls=self.getLinks(base,proxy,headers,cookie)
  File "/Users/faguirre/Desktop/PwnXSS/lib/crawler/crawler.py", line 19, in getLinks
    text=conn.get(base).text
  File "/usr/local/lib/python3.9/site-packages/requests/sessions.py", line 555, in get
    return self.request('GET', url, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/requests/sessions.py", line 542, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/local/lib/python3.9/site-packages/requests/sessions.py", line 636, in send
    kwargs.setdefault('proxies', self.rebuild_proxies(request, self.proxies))
  File "/usr/local/lib/python3.9/site-packages/requests/sessions.py", line 289, in rebuild_proxies
    new_proxies = proxies.copy()
AttributeError: 'str' object has no attribute 'copy'
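The traceback confirms the diagnosis: the CLI delivers --proxy as a plain string, while requests expects a mapping, so `proxies.copy()` fails. One way the tool could bridge the gap (a sketch, under the assumption that the flag carries a JSON-style value):

```python
import json

# What the CLI hands over: a string, even if it looks like a dict.
raw = '{"https": "https://localhost:8080"}'

# What requests needs: an actual mapping with a .copy() method.
proxies = json.loads(raw)
assert isinstance(proxies, dict)
print(proxies.copy())  # dict.copy() now succeeds
```

Note the double quotes: `json.loads` would reject the single-quoted `{'https':'https://localhost:8080'}` form shown in the report.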
I haven't used the tool yet; I just want to enquire whether it supports locally hosted sites.
Your cookie support works fine! However, the documentation you provide is wrong: if one follows it, they will run into an error from the JSON parser.
To Reproduce
If I run
python3 -u myhost.com --cookie {'ID':'1094200543'}
you will run into the following error:
Traceback (most recent call last):
  File "/home/kali/Desktop/MasterTools/PwnXSS/pwnxss.py", line 73, in <module>
    start()
  File "/home/kali/Desktop/MasterTools/PwnXSS/pwnxss.py", line 52, in start
    core.main(getopt.u,getopt.proxy,getopt.user_agent,check(getopt),getopt.cookie,getopt.method)
  File "/home/kali/Desktop/MasterTools/PwnXSS/lib/core.py", line 163, in main
    self.session=session(proxy,headers,cookie)
  File "/home/kali/Desktop/MasterTools/PwnXSS/lib/helper/helper.py", line 26, in session
    r.cookies.update(json.loads(cookie))
  File "/usr/lib/python3.9/json/__init__.py", line 346, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python3.9/json/decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python3.9/json/decoder.py", line 353, in raw_decode
    obj, end = self.scan_once(s, idx)
json.decoder.JSONDecodeError: Expecting property name enclosed in double quotes: line 1 column 2 (char 1)
That error is expected, since the JSON parser requires property names in double quotes; single quotes are not valid. Furthermore, json.loads requires a string to be passed, while your documentation guides the user into providing the JSON object itself, which is wrong.
When the command is changed to take the parser's actual behaviour into account, the request works:
python3 pwnxss.py -u myhost.com --cookie "{\"ID\":\"1094200543\"}"
Expected behavior
There are two ways you could fix this.
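The quoting rules can be demonstrated with the standard-library parser alone:

```python
import json

# Single-quoted property names (what the docs show) are rejected:
try:
    json.loads("{'ID':'1094200543'}")
except json.JSONDecodeError as exc:
    print("rejected:", exc.msg)

# Double-quoted property names (the shell-escaped form above) parse fine:
cookie = json.loads('{"ID":"1094200543"}')
print(cookie["ID"])
```

So the fix is either to correct the documented example to the escaped double-quote form, or to have the tool accept a more forgiving cookie syntax and build the dict itself.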