noir-cr / noir
Attack surface detector that identifies endpoints by static analysis
License: MIT License
OAS2
$ noir -b .
.__
____ ____ |__|______
/ \ / _ \| \_ __ \
| | ( <♠️> ) || | \/
|___| /\____/|__||__|
\/ v0.7.2
[*] Detecting technologies to base directory.
[I] Detected 2 technologies.
go_echo
oas2
[*] Initiate code analysis based on the detected technology.
[*] Starting analysis of endpoints.
17 Analyzers initialized
Analysis to 2 technologies
Unhandled exception: Missing hash key: "basePath" (KeyError)
from /opt/homebrew/Cellar/noir/0.7.2/bin/noir in 'raise<KeyError>:NoReturn'
from /opt/homebrew/Cellar/noir/0.7.2/bin/noir in 'JSON::Any#[]<String>:JSON::Any'
from /opt/homebrew/Cellar/noir/0.7.2/bin/noir in '~procProc(Hash(Symbol, String), Array(Endpoint))@src/analyzer/analyzer.cr:17'
from /opt/homebrew/Cellar/noir/0.7.2/bin/noir in '__crystal_main'
from /opt/homebrew/Cellar/noir/0.7.2/bin/noir in 'main'
As per the readme: https://github.com/hahwul/noir#from-sources
# Install Crystal-lang
# https://crystal-lang.org/install/
# Clone this repo
git clone https://github.com/hahwul/noir
cd noir
# Install Dependencies
shards install
# Build
shards build --release --no-debug
# Copy binary
cp ./bin/noir /usr/bin/
I hardly know anything about Crystal, but couldn't we use GitHub Actions to auto-release static binaries?
Maybe something like this:
name: Build Static noir Binaries

on:
  workflow_dispatch:
  schedule:
    - cron: "0 */12 * * *"

env:
  # https://i.redd.it/o6xypg00uac91.png
  USER_AGENT: "Mozilla/5.0 (iPhone; CPU iPhone OS 16_6 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) CriOS/115.0.5790.160 Mobile/15E148 Safari/604.1"
  GITHUB_TOKEN: ${{ secrets.STATIC_NOIR_RELEASER }}

jobs:
  build:
    runs-on: ubuntu-latest
    permissions:
      contents: write
    steps:
      - name: Checkout repository
        uses: actions/checkout@v3
        with:
          path: main

      - name: Check Version
        id: check_version
        shell: bash
        run: |
          # Get the latest released version
          export VERSION=$(curl -qfsSL "https://api.github.com/repos/hahwul/noir/releases/latest" | jq -r '.tag_name')
          # If we get rate-limited, fall back to cloning the repo
          if [ -z "$VERSION" ]; then
            cd $(mktemp -d) && git clone https://github.com/hahwul/noir && cd noir
            export VERSION=$(git tag --sort=-creatordate | head -n 1)
          fi
          # Export it to the environment
          echo "VERSION=$VERSION" >> $GITHUB_ENV
          # Compare against the stored version
          export STORED_VERSION=$(cat $GITHUB_WORKSPACE/main/noir/version.txt)
          if [ "$VERSION" == "$STORED_VERSION" ]; then
            echo "Version $VERSION is already fetched & updated"
            echo "versions_same=true" >> $GITHUB_ENV
          else
            echo "Fetching... --> $VERSION (from <-- $STORED_VERSION)"
            echo "versions_same=false" >> $GITHUB_ENV
          fi

      - name: Update Stored Version
        if: env.versions_same != 'true'
        run: |
          # Update version.txt with the latest version
          echo $VERSION > $GITHUB_WORKSPACE/main/noir/version.txt
          echo "Updated version.txt with the latest version $VERSION"

      - name: Download Latest Source Code (GitHub API)
        if: env.versions_same != 'true'
        continue-on-error: true
        run: |
          # Get the latest source code
          curl -qfLJ $(curl -qfsSL "https://api.github.com/repos/hahwul/noir/releases/latest" | jq -r '.zipball_url') -o "$HOME/noir.zip"

      - name: Clone Repository If curl Fails
        if: env.versions_same != 'true'
        continue-on-error: false
        run: |
          if [ ! -f "$HOME/noir.zip" ]; then
            cd $HOME && git clone https://github.com/hahwul/noir.git && cd noir && git checkout "$VERSION"
            zip -ro "$HOME/noir.zip" "$HOME/noir"
          fi

      - name: Install Build Dependencies
        if: env.versions_same != 'true'
        continue-on-error: false
        run: |
          sudo apt-get update -y
          sudo apt-get install -y --no-install-recommends bison build-essential ca-certificates flex file jq pkg-config qemu-user-static wget

      - name: Install Crystal
        if: env.versions_same != 'true'
        continue-on-error: false
        run: |
          # https://crystal-lang.org/install/on_ubuntu/
          curl -fsSL https://crystal-lang.org/install.sh | sudo bash

      # Linux
      - name: Build noir for x86_64-unknown-linux-gnu
        if: env.versions_same != 'true'
        continue-on-error: false
        run: |
          # Unpack the zip
          cd $(mktemp -d) && cp "$HOME/noir.zip" .
          find . -type f -name '*.zip*' -exec unzip -o {} \;
          cd $(find . -maxdepth 1 -type d | grep -v '^.$')
          # Build a static release binary
          shards install
          shards build --release --no-debug --production --static
          # Strip & print metadata
          strip "./bin/noir"
          file "./bin/noir" && ldd "./bin/noir" ; sha256sum "./bin/noir"
          # Prepare the release
          mkdir -p "/tmp/releases/"
          cp "./bin/noir" "/tmp/releases/noir-x86_64-unknown-linux-gnu"
          # Tar it
          tar -czvf noir-x86_64-unknown-linux-gnu.tar.gz -C ./bin noir
          cp "./noir-x86_64-unknown-linux-gnu.tar.gz" "/tmp/releases"

      # Create a new release & publish
      # Repo: https://github.com/softprops/action-gh-release
      # Marketplace: https://github.com/marketplace/actions/gh-release
      - name: Releaser
        if: env.versions_same != 'true'
        uses: softprops/action-gh-release@v1
        with:
          name: "noir ${{ env.VERSION }}"
          tag_name: "noir_${{ env.VERSION }}"
          prerelease: false
          draft: false
          generate_release_notes: false
          token: "${{ secrets.GITHUB_TOKEN }}"
          body: |
            `Changelog`: _https://github.com/hahwul/noir/releases/tag/${{ env.VERSION }}_
          files: |
            /tmp/releases/*
--include-evidence
--using-evidence
[
...
{
"headers": [],
"method": "POST",
"params": [
{
"name": "article_slug",
"param_type": "json",
"value": ""
},
{
"name": "title",
"param_type": "json",
"value": ""
},
{
"name": "id",
"param_type": "json",
"value": ""
}
],
"protocol": "http",
"url": "https://testapp.internal.domains/comments",
"evidence": "e.POST('/comments', commentHandler)"
}
]
Example:
#%RAML 1.0
title: Hello world # required title
/greeting: # optional resource
  get: # HTTP method declaration
    responses: # declare a response
      200: # HTTP status code
        body: # declare content of response
          application/json: # media type
            # structural definition of a response (schema or type)
            type: object
            properties:
              message: string
            example: # example of what a response looks like
              message: "Hello world"
Sometimes we only want to deliver URLs that match a particular pattern.
noir -b . -u https://testapp.internal.domains
.__
____ ____ |__|______
/ \ / _ \| \_ __ \
| | ( <♠> ) || | \/
|___| /\____/|__||__|
\/ v0.7.1
[*] Detecting technologies to base directory.
[I] Detected 16 technologies.
php_pure
c#-aspnet-mvc
python_django
python_flask
go_echo
go_gin
java_armeria
java_spring
java_jsp
crystal_kemal
kotlin_spring
oas3
ruby_sinatra
oas2
ruby_rails
raml
[*] Initiate code analysis based on the detected technology.
[*] Starting analysis of endpoints.
17 Analyzers initialized
Analysis to 16 technologies
Unhandled exception: Regex match error: UTF-8 error: code points greater than 0x10ffff are not defined (ArgumentError)
from noir in '??'
from noir in '??'
from noir in '??'
from noir in '??'
from noir in '__crystal_main'
from noir in 'main'
from /lib/x86_64-linux-gnu/libc.so.6 in '__libc_start_main'
from noir in '_start'
from ???
In Django, FastAPI, and Flask projects, you can receive cookie parameters, and cookies act as one of the input vectors for the API. Therefore, let's add support for cookie parameter types to the src/models/endpoint.cr model.
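As a sketch of what that could look like (a Python stand-in for the Crystal model; the class and field names below mirror noir's JSON output but are otherwise illustrative):

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative mirror of noir's endpoint model; "cookie" is added alongside
# the existing param types ("query", "json", "form", "header").
@dataclass
class Param:
    name: str
    value: str
    param_type: str  # "query" | "json" | "form" | "header" | "cookie"

@dataclass
class Endpoint:
    url: str
    method: str
    params: List[Param] = field(default_factory=list)

ep = Endpoint("/dashboard", "GET")
ep.params.append(Param("sessionid", "", "cookie"))
print(ep.params[0].param_type)  # cookie
```

With this, the Django/FastAPI/Flask analyzers could emit cookie inputs the same way they already emit query and JSON parameters.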
Unhandled exception in spawn: Error opening file with mode 'r': './app/logs': No such file or directory (File::NotFoundError)
from /opt/homebrew/Cellar/noir/0.5.0/bin/noir in 'raise<File::Error+>:NoReturn'
from /opt/homebrew/Cellar/noir/0.5.0/bin/noir in 'Crystal::System::File::open<String, String, File::Permissions>:Int32'
from /opt/homebrew/Cellar/noir/0.5.0/bin/noir in 'File::new<String, String, File::Permissions, Nil, Nil>:File'
from /opt/homebrew/Cellar/noir/0.5.0/bin/noir in '~proc12Proc(Nil)@src/detector/detector.cr:21'
from /opt/homebrew/Cellar/noir/0.5.0/bin/noir in 'Fiber#run:(IO::FileDescriptor | Nil)'
Hey, I love this tool. It would be great if it supported C#.
C# routes are often defined in the MVC construct. https://www.tutorialsteacher.com/mvc/routing-in-mvc
MVC routes would probably be the easiest pattern to identify and report on.
Additionally, routes can also be defined as web services exposed via ASMX or SVC files, or inside the web.config file.
.NET has the concept of HttpModules, which are triggered on every request, and also ISAPI DLL extensions, which are typically executed when hitting the specific DLL file in the web root. All of these would be valuable to detect, and easy to find by reading the web.config file.
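For instance, handler registrations in web.config are plain XML, so a detector could surface them with a few lines (a rough sketch; the element names follow the standard system.webServer schema, and the sample config is made up):

```python
import xml.etree.ElementTree as ET

# Hypothetical web.config fragment with a custom handler mapped to a path.
config = """<configuration>
  <system.webServer>
    <handlers>
      <add name="ReportHandler" path="reports.axd" verb="GET,POST" type="App.ReportHandler" />
    </handlers>
  </system.webServer>
</configuration>"""

root = ET.fromstring(config)
# Each <add> under <handlers> maps a URL path to a handler type.
endpoints = [(h.get("path"), h.get("verb"))
             for h in root.find("system.webServer/handlers")]
print(endpoints)  # [('reports.axd', 'GET,POST')]
```

The same walk would cover httpModules and ASMX/SVC service registrations, since they live in sibling sections of the same file.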
{
"openapi": "3.0.0",
"info": {
"title": "Pet Store API",
"version": "1.0.0",
"description": "Sample API for managing pets in a pet store."
},
"servers": [
{
"url": "https://api.example.com/v1"
}
],
"paths": {
"/pets": {
"get": {
"summary": "Get a list of pets",
"responses": {
"200": {
"description": "A list of pets",
"content": {
"application/json": {
"example": [
{
"id": 1,
"name": "Fluffy"
},
{
"id": 2,
"name": "Fido"
}
]
}
}
}
}
},
"post": {
"summary": "Add a new pet",
"requestBody": {
"required": true,
"content": {
"application/json": {
"schema": {
"type": "object",
"properties": {
"name": {
"type": "string"
}
},
"required": ["name"]
}
}
}
},
"responses": {
"201": {
"description": "Pet added successfully"
}
}
}
},
"/pets/{petId}": {
"get": {
"summary": "Get information about a specific pet",
"parameters": [
{
"name": "petId",
"in": "path",
"required": true,
"schema": {
"type": "integer"
}
}
],
"responses": {
"200": {
"description": "Information about the pet",
"content": {
"application/json": {
"example": {
"id": 1,
"name": "Fluffy",
"breed": "Cat"
}
}
}
}
}
},
"put": {
"summary": "Update information about a specific pet",
"parameters": [
{
"name": "petId",
"in": "path",
"required": true,
"schema": {
"type": "integer"
}
}
],
"requestBody": {
"required": true,
"content": {
"application/json": {
"schema": {
"type": "object",
"properties": {
"name": {
"type": "string"
},
"breed": {
"type": "string"
}
},
"required": ["name"]
}
}
}
},
"responses": {
"200": {
"description": "Pet information updated successfully"
}
}
}
}
}
}
openapi: 3.0.0
info:
title: Pet Store API
version: 1.0.0
description: Sample API for managing pets in a pet store.
servers:
- url: https://api.example.com/v1
paths:
/pets:
get:
summary: Get a list of pets
responses:
'200':
description: A list of pets
content:
application/json:
example:
- id: 1
name: Fluffy
- id: 2
name: Fido
post:
summary: Add a new pet
requestBody:
required: true
content:
application/json:
schema:
type: object
properties:
name:
type: string
required:
- name
responses:
'201':
description: Pet added successfully
/pets/{petId}:
get:
summary: Get information about a specific pet
parameters:
- name: petId
in: path
required: true
schema:
type: integer
responses:
'200':
description: Information about the pet
content:
application/json:
example:
id: 1
name: Fluffy
breed: Cat
put:
summary: Update information about a specific pet
parameters:
- name: petId
in: path
required: true
schema:
type: integer
requestBody:
required: true
content:
application/json:
schema:
type: object
properties:
name:
type: string
breed:
type: string
required:
- name
responses:
'200':
description: Pet information updated successfully
I believe this issue can be resolved with the following code.
# Code1
locator = CodeLocator.instance
locator.set("rails-home", dirname)
# Code2
puts locator.get("rails-home")
Therefore, it would be beneficial to use a singleton key-value store, with keys in the format techname-home, for each technology.
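In Python terms the idea is roughly this (a minimal stand-in for the Crystal CodeLocator; method names mirror the snippet above):

```python
class CodeLocator:
    """Process-wide key-value store for per-technology base paths."""
    _instance = None

    @classmethod
    def instance(cls):
        # Lazily create the single shared instance.
        if cls._instance is None:
            cls._instance = cls()
        return cls._instance

    def __init__(self):
        self._store = {}

    def set(self, key, value):
        self._store[key] = value

    def get(self, key):
        return self._store.get(key)

locator = CodeLocator.instance()
locator.set("rails-home", "/path/to/rails/app")  # "<techname>-home" convention
print(CodeLocator.instance().get("rails-home"))  # /path/to/rails/app
```

Because every detector and analyzer resolves the same instance, a value set during detection (Code1) is visible later during analysis (Code2) without threading it through call arguments.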
@hahwul
It came to mind, so I gave it a try and wrote some code myself~ (the code is pretty messy though, sorry!)
https://github.com/ksg97031/noir/tree/support-django
It reads the ROOT_URLCONF global variable to locate the base urls.py. (search_root_django_urls_list)
It then searches that file for additionally included urls.py files. (only the include() form is supported)
The final URL endpoints are built by propagating the parent URL prefix down through the search.
I also wrote code to fetch the view files, but the HTTP method (GET, POST) isn't always obvious from the code, so this needs more investigation.
It seems the default is GET, or the method is determined by the function name, so we can just add what's feasible~
I tested against the two projects below and the results look decent, so I'm sharing them~
https://github.com/django/djangoproject.com
https://github.com/liangliangyy/DjangoBlog
https://github.com/tokio-rs/axum
use axum::{routing::{get, post}, Router};

#[tokio::main]
async fn main() {
    // initialize tracing
    tracing_subscriber::fmt::init();
    // build our application with a route
    let app = Router::new()
        // `GET /` goes to `root`
        .route("/", get(root))
        // `POST /users` goes to `create_user`
        .route("/users", post(create_user));
    // run our app with hyper, listening globally on port 3000
    let listener = tokio::net::TcpListener::bind("0.0.0.0:3000").await.unwrap();
    axum::serve(listener, app).await.unwrap();
}

// minimal handler stubs so the routes above compile
async fn root() -> &'static str { "Hello, World!" }
async fn create_user() -> &'static str { "user created" }
Old
@map : Hash(String, String)
New
@s_map : Hash(String, String)
@a_map : Hash(String, Array(String))
Sometimes custom headers are required when using Deliver, e.g. Authorization.
--with-headers
--with-custom-headers
--custom-headers
--send-with-headers
# e.g
Deliver:
--send-req Send the results to the web request
--send-proxy http://proxy.. Send the results to the web request via http proxy
--with-headers
As part of our ongoing efforts to improve the performance and coverage of our code analyzer, we are planning to introduce structural changes based on test-driven development principles. These changes aim to enhance the overall efficiency and effectiveness of our code analysis process.
Currently, some analyzers use this method, and in the future we want both the analyzers and the detectors to adopt it to increase accuracy.
The current analyzer relies on basic string-parsing logic (regular expressions, string splitting, etc.), which means it is not guaranteed to be 100% accurate: some characters, such as double quotes, can be interpreted as special characters by the analyzer.
For example, the following Python code:
from django.urls import path
from . import views

urlpatterns = [
    path("example\"'route", views.app2_index, name="index"),
]
will not be parsed correctly by the analyzer, because the double quote (") is interpreted as part of the path variable.
Similarly, the following Go code:
e.GET("/pet,comma", func(c echo.Context) error {
    return c.String(http.StatusOK, "Hello, Pet!")
})
will also not be parsed correctly, because the comma (,) is interpreted as a delimiter.
As this doesn't represent a universal scenario, I'm not sure whether to keep this as a known issue or to implement a shared lexer and parser to handle these cases more comprehensively.
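A minimal demonstration of the failure mode (the regex below is an illustrative simplification, not noir's actual pattern):

```python
import re

# Naive pattern: grab everything between the first pair of double quotes
# after `path(`. An escaped quote inside the route string breaks it.
naive = re.compile(r'path\("([^"]*)"')

# The raw source line from the Django example above.
line = 'path("example\\"\'route", views.app2_index, name="index"),'
match = naive.search(line)
print(match.group(1))  # prints example\ -- truncated at the escaped quote
```

A lexer that tracks quoting state would return the full route, at the cost of maintaining per-language tokenizers.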
Split the output functions that currently live in the noir model into separate classes, for easier extension.
Jekyll!!
I was going to just make a PR but then realized that I wasn't 100% sure what you were going for:
"
♠️ Noir is an attack surface detector from source code.
or
♠️ Noir is an attack surface detector for source code.
https://spring.io/blog/2016/09/22/new-in-spring-5-functional-web-framework
RouterFunction<?> route =
    route(GET("/hello-world"),
          request -> Response.ok().body(fromObject("Hello World")))
    .and(route(GET("/the-answer"),
               request -> Response.ok().body(fromObject(42))));
https://docs.spring.io/spring-framework/reference/web/webflux-functional.html
RouterFunction<ServerResponse> route = route()
    .GET("/person/{id}", accept(APPLICATION_JSON), handler::getPerson)
    .GET("/person", accept(APPLICATION_JSON), handler::listPeople)
    .POST("/person", handler::createPerson)
    .build();
I'm a big fan of keeping brew releases up to date with GitHub releases. Your recent changes to add OAS3 support as part of issue #15 are NOT properly reflected in the current v0.4.0 release.
An enhancement suggestion would be to automate your CI/CD pipeline with a GitHub Action that pushes to Homebrew when merging into the main branch. Articles like this one, or marketplace actions like this one, might go a long way toward keeping what is in brew in sync with what is actually in GitHub.
Just a suggestion. But I'd rather use homebrew to manage noir than have to pull and build when the versioning isn't already matching up.
HTH.
-f oas3 (in output group)
--generate-oas3 (in deliver group)
Hi,
Firstly, it is a great tool, thanks for open-sourcing.
The issue is that the --with-headers flag removes the character 's' in the header value.
--with-headers 'Accept-Version:test'
ends up as Accept-Version: tet.
Best,
~ hb
Identifies the API Endpoint in the .kt files
Hi, I cannot solve this issue; I tried updating/reinstalling both noir and Crystal, but neither helped.
Any idea how I can solve this?
Analysis to 2 technologies
Unhandled exception: Cast from Nil to Hash(K, V) failed, at /home/linuxbrew/.linuxbrew/Cellar/crystal/1.9.2/share/crystal/src/yaml/any.cr:286:5:286 (TypeCastError)
from noir in '??'
from noir in '??'
from noir in '??'
from noir in '__crystal_main'
from noir in 'main'
from /lib/x86_64-linux-gnu/libc.so.6 in '??'
from /lib/x86_64-linux-gnu/libc.so.6 in '__libc_start_main'
from noir in '_start'
from ???
I'm running Noir against a TypeScript project. I know it doesn't currently support TypeScript, and it's fine that those files are ignored, but it detects OAS3 and throws an unhandled exception:
.__
____ ____ |__|______
/ \ / _ \| \_ __ \
| | ( <♠> ) || | \/
|___| /\____/|__||__|
\/ v0.7.0
[*] Detecting technologies to base directory.
[I] Detected 1 technologies.
oas3
[*] Initiate code analysis based on the detected technology.
[*] Starting analysis of endpoints.
17 Analyzers initialized
Analysis to 1 technologies
Unhandled exception: Missing hash key: "servers" (KeyError)
from noir in '??'
from noir in '??'
from noir in '??'
from noir in '__crystal_main'
from noir in 'main'
from /lib/x86_64-linux-gnu/libc.so.6 in '??'
from /lib/x86_64-linux-gnu/libc.so.6 in '__libc_start_main'
from noir in '_start'
from ???
I'm using Kali Linux 2023.1 (kali-rolling). Is this behavior expected?
I tried using the following command line to see the --send-proxy flag in action:
noir -b . -u http://crapi.apisec.ai --send-proxy http://localhost:8080
The output showed as follows:
❯ noir -b . -u http://crapi.apisec.ai --send-proxy http://localhost:8080
.__
____ ____ |__|______
/ \ / _ \| \_ __ \
| | ( <♠️> ) || | \/
|___| /\____/|__||__|
\/ v0.4.0
[*] Detecting technologies to base directory.
[I] Detected 2 technologies.
python_django
java_spring
[*] Initiate code analysis based on the detected technology.
[*] Starting analysis of endpoints.
10 Analyzers initialized
Analysis to 2 technologies
27 endpoints found
[*] Optimizing endpoints.
[*] Sending requests with proxy http://localhost:8080
[I] Finally identified 27 endpoints.
[*] Generating Report.
GET http://crapi.apisec.ai/identity/api/v2/user/videos/{video_id}
POST http://crapi.apisec.ai/identity/api/v2/user/pictures
POST http://crapi.apisec.ai/identity/api/v2/user/videos
PUT http://crapi.apisec.ai/identity/api/v2/user/videos/{video_id}
DELETE http://crapi.apisec.ai/identity/api/v2/user/videos/{video_id}
DELETE http://crapi.apisec.ai/identity/api/v2/admin/videos/{video_id}
GET http://crapi.apisec.ai/identity
POST http://crapi.apisec.ai/identity/api/auth/login
POST http://crapi.apisec.ai/identity/api/auth/signup
POST http://crapi.apisec.ai/identity/api/auth/verify
GET http://crapi.apisec.ai/identity/api/auth/jwks.json
POST http://crapi.apisec.ai/identity/api/auth/forget-password
POST http://crapi.apisec.ai/identity/api/auth/v2/check-otp
POST http://crapi.apisec.ai/identity/api/auth/v3/check-otp
POST http://crapi.apisec.ai/identity/api/auth/v4.0/user/login-with-token
POST http://crapi.apisec.ai/identity/api/auth/v2.7/user/login-with-token
POST http://crapi.apisec.ai/identity/api/auth/reset-test-users
POST http://crapi.apisec.ai/identity/api/v2/vehicle/add_vehicle
POST http://crapi.apisec.ai/identity/api/v2/vehicle/resend_email
GET http://crapi.apisec.ai/identity/api/v2/vehicle/vehicles
GET http://crapi.apisec.ai/identity/api/v2/vehicle/vehiclespublicResponseEntity<?>
GET http://crapi.apisec.ai/identity/api/v2/vehicle/{carId}/location
GET http://crapi.apisec.ai/identity/health_check
POST http://crapi.apisec.ai/identity/api/v2/user/change-email
POST http://crapi.apisec.ai/identity/api/v2/user/verify-email-token
GET http://crapi.apisec.ai/identity/api/v2/user/dashboard
POST http://crapi.apisec.ai/identity/api/v2/user/reset-password
However, nothing reaches the Burp Suite proxy.
I verified that the proxy is set up correctly by using curl against the /identity/health_check endpoint:
curl -x http://localhost:8080 http://crapi.apisec.ai/identity/health_check
This works, and I can see the request in Burp.
So I am guessing Noir isn't delivering to the proxy correctly.
OS: macOS Ventura 13.4.1
Chip: Apple M2
Noir v: 0.4.0
Let me know if there is anything additional you need to debug this.
Your recent commit for #15 definitely supports OAS3, but may be too stringent.
Supporting OAS3 should, at a very minimum, also cover minor updates (i.e. v3.0.1, v3.0.2, etc.). In fact, the regex should really accept all of 3.0.x. OpenAPI is already publishing 3.1, so you might want to consider including major updates too; very little changes for URL path extraction from the JSON/YAML.
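Something along these lines would accept any 3.x.y version string (a sketch; noir's actual check may differ):

```python
import re

# Accept any OpenAPI 3.x.y version instead of only the literal "3.0.0".
OAS3_VERSION = re.compile(r"^3\.\d+\.\d+$")

for v in ["3.0.0", "3.0.2", "3.1.0"]:
    assert OAS3_VERSION.match(v)
assert not OAS3_VERSION.match("2.0")
print("all version checks passed")
```

Anchoring only on the major version keeps the detector future-proof, since path extraction is essentially unchanged across 3.x releases.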
As of right now, noir can't really detect any up-to-date OAS spec file.
As a test, I tried to force a downgrade of the version to 3.0.0 for OWASP's crAPI spec doc. An updated analysis with noir will throw the following exception:
Love the work you are doing; it has a lot of potential. For finding the attack surface of APIs, accounting for more than the base OAS3 spec would be very beneficial. It might require some tweaking of the detector and analyzer.
HTH.
The "--send-proxy" flag sends actual web requests to the target server.
In contrast, the new flags (--send-zap, --send-caido) communicate directly with ZAP or Caido without initiating web requests.
Currently, we are combining the Endpoint and URL in the Analyzer. However, this approach allows for disparate handling in each Analyzer, so I plan to centralize URL handling as common logic.
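The common logic could be as simple as a single join helper shared by all analyzers (a hedged sketch; `join_url` is a hypothetical name, not an existing noir function):

```python
from urllib.parse import urljoin

def join_url(base: str, endpoint_path: str) -> str:
    """Combine a base URL and an endpoint path consistently across analyzers."""
    # Ensure the base has a trailing slash so relative paths append cleanly.
    if not base.endswith("/"):
        base += "/"
    return urljoin(base, endpoint_path.lstrip("/"))

print(join_url("https://testapp.internal.domains", "/comments"))
# https://testapp.internal.domains/comments
```

Centralizing this removes the per-analyzer quirks around leading/trailing slashes that currently produce inconsistent URLs.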