===========================================
skipfish - web application security scanner
===========================================

  http://code.google.com/p/skipfish/

  * Written and maintained by:

      Michal Zalewski <[email protected]>
      Niels Heinen <[email protected]>
      Sebastian Roschke <[email protected]>

  * Copyright 2009 - 2012 Google Inc, rights reserved.

  * Released under terms and conditions of the Apache License, version 2.0.

--------------------
1. What is skipfish?
--------------------

Skipfish is an active web application security reconnaissance tool. It
prepares an interactive sitemap for the targeted site by carrying out a
recursive crawl and dictionary-based probes. The resulting map is then
annotated with the output from a number of active (but hopefully
non-disruptive) security checks. The final report generated by the tool is
meant to serve as a foundation for professional web application security
assessments.

-------------------------------------------------
2. Why should I bother with this particular tool?
-------------------------------------------------

A number of commercial and open source tools with analogous functionality are
readily available (e.g., Nikto, Nessus); stick to the one that suits you
best. That said, skipfish tries to address some of the common problems
associated with web security scanners. Specific advantages include:

  * High performance: 500+ requests per second against responsive Internet
    targets, 2000+ requests per second on LAN / MAN networks, and 7000+ requests
    against local instances have been observed, with a very modest CPU, network,
    and memory footprint. This can be attributed to:

    * A multiplexing, single-threaded, fully asynchronous network I/O and data
      processing model that eliminates the memory management, scheduling, and
      IPC inefficiencies present in some multi-threaded clients.

    * Advanced HTTP/1.1 features such as range requests, content compression,
      and keep-alive connections, as well as forced response size limiting, to
      keep network-level overhead in check.

    * Smart response caching and advanced server behavior heuristics to
      minimize unnecessary traffic.

    * Performance-oriented, pure C implementation, including a custom
      HTTP stack.

  * Ease of use: skipfish is highly adaptive and reliable. The scanner features:

    * Heuristic recognition of obscure path- and query-based parameter handling
      schemes.

    * Graceful handling of multi-framework sites where certain paths obey
      completely different semantics, or are subject to different filtering
      rules.

    * Automatic wordlist construction based on site content analysis.

    * Probabilistic scanning features to allow periodic, time-bound assessments
      of arbitrarily complex sites.

  * Well-designed security checks: the tool is meant to provide accurate
    and meaningful results:

    * Handcrafted dictionaries offer excellent coverage and permit thorough
      $keyword.$extension testing in a reasonable timeframe.

    * Three-step differential probes are preferred to signature checks for
      detecting vulnerabilities.

    * Ratproxy-style logic is used to spot subtle security problems:
      cross-site request forgery, cross-site script inclusion, mixed content
      issues, MIME- and charset mismatches, incorrect caching directives, etc.

    * Bundled security checks are designed to handle tricky scenarios:
      stored XSS (path, parameters, headers), blind SQL or XML injection,
      or blind shell injection.

    * Snort-style content signatures that highlight server errors,
      information leaks, or potentially dangerous web applications.

    * Report post-processing drastically reduces the noise caused by any
      remaining false positives or server gimmicks by identifying repetitive
      patterns.

That said, skipfish is not a silver bullet, and may be unsuitable for certain
purposes. For example, it does not satisfy most of the requirements outlined
in the WASC Web Application Security Scanner Evaluation Criteria (some of them on
purpose, some out of necessity); and unlike most other projects of this type,
it does not come with an extensive database of known vulnerabilities for
banner-type checks.

-----------------------------------------------------
3. Most curious! What specific tests are implemented?
-----------------------------------------------------

A rough list of the security checks offered by the tool is outlined below.

  * High risk flaws (potentially leading to system compromise):

    * Server-side query injection (including blind vectors, numerical parameters).
    * Explicit SQL-like syntax in GET or POST parameters.
    * Server-side shell command injection (including blind vectors).
    * Server-side XML / XPath injection (including blind vectors).
    * Format string vulnerabilities.
    * Integer overflow vulnerabilities.
    * Locations accepting HTTP PUT.

  * Medium risk flaws (potentially leading to data compromise):

    * Stored and reflected XSS vectors in document body (minimal JS XSS support).
    * Stored and reflected XSS vectors via HTTP redirects.
    * Stored and reflected XSS vectors via HTTP header splitting.
    * Directory traversal / LFI / RFI (including constrained vectors).
    * Assorted file POIs (server-side sources, configs, etc).
    * Attacker-supplied script and CSS inclusion vectors (stored and reflected).
    * External untrusted script and CSS inclusion vectors.
    * Mixed content problems on script and CSS resources (optional).
    * Password forms submitting from or to non-SSL pages (optional).
    * Incorrect or missing MIME types on renderables.
    * Generic MIME types on renderables.
    * Incorrect or missing charsets on renderables.
    * Conflicting MIME / charset info on renderables.
    * Bad caching directives on cookie setting responses.

  * Low risk issues (limited impact or low specificity):

    * Directory listing bypass vectors.
    * Redirection to attacker-supplied URLs (stored and reflected).
    * Attacker-supplied embedded content (stored and reflected).
    * External untrusted embedded content.
    * Mixed content on non-scriptable subresources (optional).
    * HTTPS -> HTTP submission of HTML forms (optional).
    * HTTP credentials in URLs.
    * Expired or not-yet-valid SSL certificates.
    * HTML forms with no XSRF protection.
    * Self-signed SSL certificates.
    * SSL certificate host name mismatches.
    * Bad caching directives on less sensitive content.

  * Internal warnings:

    * Failed resource fetch attempts.
    * Exceeded crawl limits.
    * Failed 404 behavior checks.
    * IPS filtering detected.
    * Unexpected response variations.
    * Seemingly misclassified crawl nodes.

  * Non-specific informational entries:

    * General SSL certificate information.
    * Significantly changing HTTP cookies.
    * Changing Server, Via, or X-... headers.
    * New 404 signatures.
    * Resources that cannot be accessed.
    * Resources requiring HTTP authentication.
    * Broken links.
    * Server errors.
    * All external links not classified otherwise (optional).
    * All external e-mails (optional).
    * All external URL redirectors (optional).
    * Links to unknown protocols.
    * Form fields that could not be autocompleted.
    * Password entry forms (for external brute-force).
    * File upload forms.
    * Other HTML forms (not classified otherwise).
    * Numerical file names (for external brute-force).
    * User-supplied links otherwise rendered on a page.
    * Incorrect or missing MIME type on less significant content.
    * Generic MIME type on less significant content.
    * Incorrect or missing charset on less significant content.
    * Conflicting MIME / charset information on less significant content.
    * OGNL-like parameter passing conventions.

Along with a list of identified issues, skipfish also provides summary
overviews of document types and issue types found; and an interactive
sitemap, with nodes discovered through brute-force denoted in a distinctive
way.

NOTE: As a conscious design decision, skipfish will not redundantly complain
about highly non-specific issues, including but not limited to:

  * Non-httponly or non-secure cookies,
  * Non-HTTPS or autocomplete-enabled forms,
  * HTML comments detected on a page,
  * Filesystem path disclosure in error messages,
  * Server or framework version disclosure,
  * Servers supporting TRACE or OPTIONS requests,
  * Mere presence of certain technologies, such as WebDAV.

Most of these aspects are easy to inspect in a report if so desired - for
example, all the HTML forms are listed separately, as are new cookies and
interesting HTTP headers - and the expectation is that the auditor may opt to
make certain design recommendations based on this data where appropriate.
That said, these occurrences are not highlighted as specific security flaws.

-----------------------------------------------------------
4. All right, I want to try it out. What do I need to know?
-----------------------------------------------------------

First and foremost, please do not be evil. Use skipfish only against services
you own, or have permission to test.

Keep in mind that all types of security testing can be disruptive. Although
the scanner is designed not to carry out malicious attacks, it may
accidentally interfere with the operations of the site. You must accept the
risk, and plan accordingly. Run the scanner against test instances where
feasible, and be prepared to deal with the consequences if things go wrong.

Also note that the tool is meant to be used by security professionals, and is
experimental in nature. It may return false positives or miss obvious
security problems - and even when it operates perfectly, it is simply not
meant to be a point-and-click application. Do not take its output at face
value.

Running the tool against vendor-supplied demo sites is not a good way to
evaluate it, as they usually approximate vulnerabilities very imperfectly; we
made no effort to accommodate these cases.

Lastly, the scanner is simply not designed for dealing with rogue and
misbehaving HTTP servers - and offers no guarantees of safe (or sane)
behavior there.

--------------------------
5. How to run the scanner?
--------------------------

To compile it, simply unpack the archive and try make. Chances are, you will
need to install libidn first.

Next, you need to read the instructions provided in doc/dictionaries.txt
to select the right dictionary file and configure it correctly. This step has a 
profound impact on the quality of scan results later on, so don't skip it.

Once you have the dictionary selected, you can use -S to load that dictionary,
and -W to specify an initially empty file for any newly learned site-specific
keywords (which will come in handy in future assessments):

$ touch new_dict.wl
$ ./skipfish -o output_dir -S existing_dictionary.wl -W new_dict.wl \
  http://www.example.com/some/starting/path.txt

You can use -W- if you don't want to store auto-learned keywords anywhere.

Note that you can provide more than one starting URL if so desired; all of
them will be crawled. It is also possible to read URLs from a file, using
the following syntax:

$ ./skipfish [...other options...] @../path/to/url_list.txt

The tool will display some helpful stats while the scan is in progress. You
can also switch to a list of in-flight HTTP requests by pressing return.

In the example above, skipfish will scan the entire www.example.com
(including services on other ports, if linked to from the main page), and
write a report to output_dir/index.html. You can then view this report with
your favorite browser (JavaScript must be enabled; and because of recent
file:/// security improvements in certain browsers, you might need to access
results over HTTP). The index.html file is static; actual results are stored
as a hierarchy of JSON files, suitable for machine processing or different
presentation frontends if need be. In addition, a list of all the discovered
URLs will be saved to a single file, pivots.txt, for easy postprocessing.

A simple companion script, sfscandiff, can be used to compute a delta for
two scans executed against the same target with the same flags. The newer
report will be non-destructively annotated by adding red background to all
new or changed nodes; and blue background to all new or changed issues
found.

Some sites may require authentication; skipfish's support for this is
described in doc/authentication.txt. In most cases, you will want to use the
form authentication method, which is capable of detecting broken sessions
and re-authenticating.

Once authenticated, certain URLs on the site may log out your session;
you can combat this in two ways: by using the -N option, which causes
the scanner to reject attempts to set or delete cookies; or with the -X
parameter, which prevents matching URLs from being fetched:

$ ./skipfish -X /logout/logout.aspx ...other parameters...

The -X option is also useful for speeding up your scans by excluding /icons/,
/doc/, /manuals/, and other standard, mundane locations along these lines. In
general, you can use -X and -I (only spider URLs matching a substring) to
limit the scope of a scan any way you like - including restricting it only to
a specific protocol and port:

$ ./skipfish -I http://example.com:1234/ ...other parameters...

A related function, -K, allows you to specify parameter names not to fuzz
(useful for applications that put session IDs in the URL, to minimize noise).

Another useful scoping option is -D - allowing you to specify additional
hosts or domains to consider in-scope for the test. By default, all hosts
appearing in the command-line URLs are added to the list - but you can use -D
to broaden these rules, for example:

$ ./skipfish -D test2.example.com -o output-dir http://test1.example.com/

...or, for a domain wildcard match, use:

$ ./skipfish -D .example.com -o output-dir http://test1.example.com/

In some cases, you do not want to actually crawl a third-party domain, but
you trust the owner of that domain enough not to worry about cross-domain
content inclusion from that location. To suppress warnings, you can use the
-B option, for example:

$ ./skipfish -B .google-analytics.com -B .googleapis.com ...other
parameters...

By default, skipfish sends minimalistic HTTP headers to reduce the amount of
data exchanged over the wire; some sites examine User-Agent strings or header
ordering to reject unsupported clients, however. In such a case, you can use
-b ie, -b ffox, or -b phone to mimic one of two popular browsers, or an
iPhone.

When it comes to customizing your HTTP requests, you can also use the -H
option to insert any additional, non-standard headers; or -F to define a
custom mapping between a host and an IP (bypassing the resolver). The latter
feature is particularly useful for not-yet-launched or legacy services.

Some sites may be too big to scan in a reasonable timeframe. If the site
features well-defined tarpits - for example, 100,000 nearly identical user
profiles as a part of a social network - these specific locations can be
excluded with -X or -S. In other cases, you may need to resort to other
settings: -d limits crawl depth to a specified number of subdirectories; -c
limits the number of children per directory; -x limits the total number of
descendants per crawl tree branch; and -r limits the total number of requests
to send in a scan.

An interesting option is available for repeated assessments: -p. By
specifying a percentage between 1 and 100%, it is possible to tell the
crawler to follow fewer than 100% of all links, and try fewer than 100% of
all dictionary entries. This - naturally - limits the completeness of a scan,
but unlike most other settings, it does so in a balanced, non-deterministic
manner. It is extremely useful when you are setting up time-bound, but
periodic assessments of your infrastructure. Another related option is -q,
which sets the initial random seed for the crawler to a specified value. This
can be used to exactly reproduce a previous scan to compare results.
Randomness is relied upon most heavily in the -p mode, but also for making a
couple of other scan management decisions elsewhere.

Some particularly complex (or broken) services may involve a very high number
of identical or nearly identical pages. Although these occurrences are by
default grayed out in the report, they still use up some screen estate and
take a while to process at the JavaScript level. In such extreme cases, you may
use the -Q option to suppress reporting of duplicate nodes altogether, before
the report is written. This may give you a less comprehensive understanding
of how the site is organized, but has no impact on test coverage.

In certain quick assessments, you might also have no interest in paying any
particular attention to the desired functionality of the site - hoping to
explore non-linked secrets only. In such a case, you may specify -P to
inhibit all HTML parsing. This limits the coverage and takes away the ability
for the scanner to learn new keywords by looking at the HTML, but speeds up
the test dramatically. Another similarly crippling option that reduces the
risk of persistent effects of a scan is -O, which inhibits all form parsing
and submission steps.

Some sites that handle sensitive user data care about SSL - and about getting
it right. Skipfish may optionally assist you in figuring out problematic
mixed content or password submission scenarios - use the -M option to enable
this. The scanner will complain about situations such as http:// scripts
being loaded on https:// pages - but will disregard non-risk scenarios such
as images.

Likewise, certain pedantic sites may care about cases where caching is
restricted on the HTTP/1.1 level, but no explicit HTTP/1.0 caching directive
is given. Specifying -E on the command line causes skipfish to log all such
cases carefully.

On some occasions, you may want to limit the number of requests per second
to reduce the load on the target server (or possibly to bypass DoS
protection). The -l flag sets this limit; the value given is the maximum
number of requests per second you want skipfish to send.

Scans typically should not take weeks. In many cases, you will probably
want to limit the scan duration so that it fits within a certain time
window. This can be done with the -k flag, which accepts a duration in
H:M:S (hours, minutes, seconds) format. Note that using this flag can
affect scan coverage if the timeout occurs before all pages are tested.

Lastly, in some assessments that involve self-contained sites without
extensive user content, the auditor may care about any external e-mails or
HTTP links seen, even if they have no immediate security impact. Use the -U
option to have these logged.

Dictionary management is a special topic, and - as mentioned - is covered in
more detail in doc/dictionaries.txt. Please read that file before
proceeding. Some of the relevant options include -S and -W (covered earlier),
-L to suppress auto-learning, -G to limit the keyword guess jar size, -R to
drop old dictionary entries, and -Y to inhibit expensive $keyword.$extension
fuzzing.

Skipfish also features a form auto-completion mechanism in order to maximize
scan coverage. The values should be non-malicious, as they are not meant to
implement security checks - but rather, to get past input validation logic.
You can define additional rules, or override existing ones, with the -T
option (-T form_field_name=field_value, e.g. -T login=test123 -T
password=test321 - although note that -C and -A are a much better method of
logging in).

There is also a handful of performance-related options. Use -g to set the
maximum number of connections to maintain, globally, to all targets (it is
sensible to keep this under 50 or so to avoid overwhelming the TCP/IP stack
on your system or on the nearby NAT / firewall devices); and -m to set the
per-IP limit (experiment a bit: 2-4 is usually good for localhost, 4-8 for
local networks, 10-20 for external targets, 30+ for really lagged or
non-keep-alive hosts). You can also use -w to set the I/O timeout (i.e.,
skipfish will wait only so long for an individual read or write), and -t to
set the total request timeout, to account for really slow or really fast
sites.

Lastly, -f controls the maximum number of consecutive HTTP errors you are
willing to see before aborting the scan; and -s sets the maximum length of a
response to fetch and parse (longer responses will be truncated).

When scanning large, multimedia-heavy sites, you may also want to specify -e.
This prevents binary documents from being kept in memory for reporting
purposes, and frees up a lot of RAM.

Further rate-limiting is available through third-party user mode tools such
as trickle, or kernel-level traffic shaping.

Oh, and real-time scan statistics can be suppressed with -u.

--------------------------------
6. But seriously, how to run it?
--------------------------------

A standard, authenticated scan of a well-designed and self-contained site
(warns about all external links, e-mails, mixed content, and caching header
issues), including gentle brute-force:

$ touch new_dict.wl
$ ./skipfish -MEU -S dictionaries/minimal.wl -W new_dict.wl \
  -C "AuthCookie=value" -X /logout.aspx -o output_dir \
  http://www.example.com/

Five-connection crawl, but no brute-force; pretending to be MSIE and
trusting example.com content:

$ ./skipfish -m 5 -L -W- -o output_dir -b ie -B example.com \
  http://www.example.com/

Heavy brute force only (no HTML link extraction), limited to a single
directory and timing out after 5 seconds:

$ touch new_dict.wl
$ ./skipfish -S dictionaries/complete.wl -W new_dict.wl \
  -P -o output_dir -t 5 -I http://www.example.com/dir1/

For a short list of all command-line options, try ./skipfish -h.

----------------------------------------------------
7. How to interpret and address the issues reported?
----------------------------------------------------

Most of the problems reported by skipfish should be self-explanatory, assuming
you have a good grasp of the fundamentals of web security. If you need a quick
refresher on some of the more complicated topics, such as MIME sniffing, you
may enjoy our comprehensive Browser Security Handbook as a starting point:

  http://code.google.com/p/browsersec/

If you still need assistance, there are several organizations that put a
considerable effort into documenting and explaining many of the common web
security threats, and advising the public on how to address them. I encourage
you to refer to the materials published by OWASP and Web Application Security
Consortium, amongst others:

  * http://www.owasp.org/index.php/Category:Principle
  * http://www.owasp.org/index.php/Category:OWASP_Guide_Project
  * http://www.webappsec.org/projects/articles/

Although I am happy to diagnose problems with the scanner itself, I regrettably
cannot offer any assistance with the inner workings of third-party web
applications.

---------------------------------------
8. Known limitations / feature wishlist
---------------------------------------

Below is a list of features currently missing in skipfish. If you wish to
improve the tool by contributing code in one of these areas, please let me
know:

  * Buffer overflow checks: after careful consideration, I suspect there is
    no reliable way to test for buffer overflows remotely. Much like the actual
    fault condition we are looking for, proper buffer size checks may also
    result in uncaught exceptions, 500 messages, etc. I would love to be proved
    wrong, though.

  * Fully-fledged JavaScript XSS detection: several rudimentary checks are
    present in the code, but there is no proper script engine to evaluate
    expressions and DOM access built in.

  * Variable length encoding character consumption / injection bugs: these
    problems seem to be largely addressed on browser level at this point, so
    they were much lower priority at the time of this writing.

  * Security checks and link extraction for third-party, plugin-based
    content (Flash, Java, PDF, etc).

  * Password brute-force and numerical filename brute-force probes.

  * Search engine integration (vhosts, starting paths).

  * VIEWSTATE decoding.

  * NTLM and digest authentication.

  * More specific PHP tests (eval injection, RFI).

  * Proxy support: experimental HTTP proxy support is available through
    a #define directive in config.h. Adding support for HTTPS proxying is
    more complicated, and still in the works.

  * Scan resume option, better runtime info.

  * Standalone installation (make install) support.

  * Scheduling and management web UI.

-------------------------------------
9. Oy! Something went horribly wrong!
-------------------------------------

There is no web crawler so good that there wouldn't be a web framework to one
day set it on fire. If you encounter what appears to be bad behavior (e.g., a
scan that takes forever and generates too many requests, completely bogus
nodes in scan output, or outright crashes), please first check our known
issues page:

  http://code.google.com/p/skipfish/wiki/KnownIssues

If you can't find a satisfactory answer there, recompile the scanner with:

$ make clean debug

...and re-run it this way:

$ ./skipfish [...previous options...] 2>logfile.txt

You can then inspect logfile.txt to get an idea what went wrong; if it looks
like a scanner problem, please scrub any sensitive information from the log
file and send it to the author.

If the scanner crashed, please recompile it as indicated above, and then type:

$ ulimit -c unlimited
$ ./skipfish [...previous options...] 2>logfile.txt
$ gdb --batch -ex back ./skipfish core

...and be sure to send the author the output of that last command as well.

------------------------
10. Credits and feedback
------------------------

Skipfish is made possible thanks to the contributions of, and valuable
feedback from, Google's information security engineering team.

If you have any bug reports, questions, suggestions, or concerns regarding
the application, the primary author can be reached at [email protected].

http_client.c:1859: error: ‘struct conn_entry’ has no member named
‘srv_ctx’
http_client.c:1861: error: ‘struct conn_entry’ has no member named
‘srv_ssl’
http_client.c:1862: error: ‘struct conn_entry’ has no member named
‘srv_ctx’
http_client.c:1866: warning: implicit declaration of function 
‘SSL_set_fd’
http_client.c:1866: error: ‘struct conn_entry’ has no member named
‘srv_ssl’
http_client.c:1867: warning: implicit declaration of function
‘SSL_set_connect_state’
http_client.c:1867: error: ‘struct conn_entry’ has no member named
‘srv_ssl’
http_client.c:1873: error: ‘struct conn_entry’ has no member named
‘next’
http_client.c:1875: error: ‘struct conn_entry’ has no member named
‘next’
http_client.c:1875: error: ‘struct conn_entry’ has no member named
‘next’
http_client.c:1881: error: ‘struct conn_entry’ has no member named 
‘q’
http_client.c:1886: error: ‘struct conn_entry’ has no member named
‘write_buf’
http_client.c:1887: error: ‘struct conn_entry’ has no member named
‘write_len’
http_client.c:1887: error: ‘struct conn_entry’ has no member named
‘write_buf’
http_client.c: In function ‘next_from_queue’:
http_client.c:1915: error: ‘struct conn_entry’ has no member named
‘write_len’
http_client.c:1915: error: ‘struct conn_entry’ has no member named
‘write_off’


Original issue reported on code.google.com by [email protected] on 20 Mar 2010 at 7:33

bash color issue after error

What steps will reproduce the problem?
1. run skipfish without parameter
2. See the error : [-] PROGRAM ABORT : Scan target not specified (try -h
for help).
    Stop location : main(), skipfish.c:379
3. verify the color of your shell, it should be slightly different


Attached is a patch that should correct the issue. I named the variable
holding the "normal color" cRST, for ReSeT (being black, white, or whatever
the user normally uses).
You might also want to correct the #define cNOR "\x1b[0;37m" variable, but
I'm not sure where that one is used.


Original issue reported on code.google.com by [email protected] on 20 Mar 2010 at 4:00

Attachments:

Misspelling in assets/index.html

$ diff index.html.orig index.html
265c265
<   "40301": "Incorrect or missing MIME type (higher rirsk)",

---
>   "40301": "Incorrect or missing MIME type (higher risk)",

Original issue reported on code.google.com by [email protected] on 22 Mar 2010 at 7:29

False positive MIME warnings on xhtml files served as application/xhtml+xml

The type sniffer looks for an "<?xml" string and assumes any HTML without it
is text/html, but this declaration is optional and the W3C actually
recommends against using it in XHTML
(http://www.w3.org/TR/xhtml1/guidelines.html).

I've hacked up my local copy with this patch, which seems to improve things.

--- skipfish-1.07b/analysis.c   2010-03-20 02:47:46.000000000 +0000
+++ b/analysis.c    2010-03-21 19:44:45.623787778 +0000
@@ -1944,7 +1944,13 @@
         inl_strcasestr(sniffbuf, (u8*)"<h1") ||
         inl_strcasestr(sniffbuf, (u8*)"<li") ||
         inl_strcasestr(sniffbuf, (u8*)"href=")) {
-      res->sniff_mime_id = MIME_ASC_HTML;
+      if (inl_strcasestr(sniffbuf, (u8*)" xmlns=\"http://www.w3.org/1999/xhtml\"") &&
+          inl_strcasestr(sniffbuf, (u8*)" xml:lang=") &&
+          !inl_strcasestr(sniffbuf, (u8*)" lang="))
+        res->sniff_mime_id = MIME_XML_XHTML;
+      else
+        res->sniff_mime_id = MIME_ASC_HTML;
+
       return;
     }

Original issue reported on code.google.com by flussence on 21 Mar 2010 at 8:04

add a manpage

skipfish is distributed without a manpage, which is quite a popular format
of documentation. One of the Debian developers created such a manpage and it
is available in Debian's BTS:

http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=575596

It would be great if you could include it in your tarball.

regards
fEnIo

Original issue reported on code.google.com by [email protected] on 28 Mar 2010 at 10:08

no compile on Mac OS X

I got this error when trying to compile skipfish:

http_client.c:39:18: error: idna.h: No such file or directory

Where is it supposed to find idna.h?

thx a lot.

Original issue reported on code.google.com by [email protected] on 22 Mar 2010 at 9:50

URLs outside the include pattern/string are checked.

Using skipfish version 1.25b on Debian Testing.

I wanted to check only a subdirectory, so I used the -I option. However,
Skipfish still checks the root (and some directories above the subdirectory).

I used this commandline:
./skipfish -o somedir  -I /~username/dir/ajaxhelper.php
http://example.org/~username/dir/ajaxhelper.php

I notice that I get reports for http://example.org/,
http://example.org/~username/ and http://example.org/~username/dir/ Have I
misunderstood the usage of the -I option?

Original issue reported on code.google.com by [email protected] on 25 Mar 2010 at 10:43

Can't compile on FreeBSD 6.1

Added -I/usr/local/include/ and -L/usr/local/lib/ to the Makefile, then ran
gmake:
cc skipfish.c -o skipfish -Wall -funsigned-char -g -ggdb -
D_FORTIFY_SOURCE=0 -I/usr/local/include/ -O3 -Wno-format http_client.c 
database.c crawler.c analysis.c report.c -lcrypto -lssl -lidn -lz -L/usr/
local/lib/
In file included from skipfish.c:36:
alloc-inl.h: In function `__DFL_ck_alloc':
alloc-inl.h:69: warning: implicit declaration of function 
`malloc_usable_size'
In file included from http_client.c:42:
alloc-inl.h: In function `__DFL_ck_alloc':
alloc-inl.h:69: warning: implicit declaration of function 
`malloc_usable_size'
http_client.c: In function `destroy_unlink_conn':
http_client.c:1618: warning: implicit declaration of function `close'
http_client.c: In function `next_from_queue':
http_client.c:2013: warning: implicit declaration of function `read'
http_client.c:2091: warning: implicit declaration of function `write'
In file included from http_client.h:30,
                 from database.c:33:
alloc-inl.h: In function `__DFL_ck_alloc':
alloc-inl.h:69: warning: implicit declaration of function 
`malloc_usable_size'
In file included from http_client.h:30,
                 from crawler.c:30:
alloc-inl.h: In function `__DFL_ck_alloc':
alloc-inl.h:69: warning: implicit declaration of function 
`malloc_usable_size'
In file included from http_client.h:30,
                 from analysis.c:28:
alloc-inl.h: In function `__DFL_ck_alloc':
alloc-inl.h:69: warning: implicit declaration of function 
`malloc_usable_size'
In file included from http_client.h:30,
                 from report.c:33:
alloc-inl.h: In function `__DFL_ck_alloc':
alloc-inl.h:69: warning: implicit declaration of function 
`malloc_usable_size'
report.c: In function `copy_static_code':
report.c:744: warning: passing arg 3 of `scandir' from incompatible 
pointer type
/var/tmp//ccmILv91.o(.text+0x590): In function `main':
/home/zm/skipfish/skipfish/alloc-inl.h:86: undefined reference to 
`malloc_usable_size'
/var/tmp//ccmILv91.o(.text+0x5c1):/home/zm/skipfish/skipfish/alloc-
inl.h:92: undefined reference to `malloc_usable_size'
/var/tmp//ccmILv91.o(.text+0x67f):/home/zm/skipfish/skipfish/alloc-
inl.h:86: undefined reference to `malloc_usable_size'
/var/tmp//ccmILv91.o(.text+0x6b0):/home/zm/skipfish/skipfish/alloc-
inl.h:92: undefined reference to `malloc_usable_size'
/var/tmp//ccmILv91.o(.text+0x7ac):/home/zm/skipfish/skipfish/alloc-
inl.h:86: undefined reference to `malloc_usable_size'
/var/tmp//ccmILv91.o(.text+0x7dd):/home/zm/skipfish/skipfish/alloc-
inl.h:92: more undefined references to `malloc_usable_size' follow
gmake: *** [skipfish] Error 1

Original issue reported on code.google.com by [email protected] on 20 Mar 2010 at 10:50

skipfish fails with "out of memory" against DVWA

I am trying skipfish against DVWA (http://www.dvwa.co.uk/) on a Ubuntu 9.04
VM with 2 GB RAM. Unfortunately it consistently fails with the following:

skipfish version 1.25b by <[email protected]>

Scan statistics
---------------

       Scan time : 3:10:39.0369
   HTTP requests : 1155023 sent (101.04/s), 837781.75 kB in, 565251.44 kB
out (122.65 kB/s)  
     Compression : 177901.86 kB in, 766857.50 kB out (62.34% gain)    
 HTTP exceptions : 1 net errors, 0 proto errors, 0 retried, 0 drops
 TCP connections : 11447 total (101.48 req/conn)  
  TCP exceptions : 0 failures, 1 timeouts, 7 purged
  External links : 38219 skipped
    Reqs pending : 6609        

Database statistics
-------------------

          Pivots : 44832 total, 29585 done (65.99%)    
     In progress : 15015 pending, 169 init, 37 attacks, 26 dict     
   Missing nodes : 275 spotted
      Node types : 1 serv, 3228 dir, 12413 file, 1792 pinfo, 12773 unkn,
14625 par, 0 val
    Issues found : 4546 info, 8977 warn, 14540 low, 1 medium, 2320 high impact
       Dict size : 223 words (23 new), 8 extensions, 231 candidates

[-] PROGRAM ABORT : out of memory: can't allocate 14226 bytes
    Stop location : __DFL_ck_alloc(), alloc-inl.h:69

Any idea?

Original issue reported on code.google.com by [email protected] on 30 Mar 2010 at 7:07

Abort trap when run under OSX 10.6.2

What steps will reproduce the problem?
1. Downloaded and compiled libidn-1.9
2. Compiled skipfish 1.0
3. Using the default.wl dictionary, ran ./skipfish -o google.com
http://google.com

What is the expected output? What do you see instead?
skipfish version 1.00b by <[email protected]>
Abort trap

What version of the product are you using? On what operating system?
1.00b OSX 10.6.2

Please provide any additional information below.

Original issue reported on code.google.com by [email protected] on 19 Mar 2010 at 8:49

Can't run make on Mac OS X 10.5.8

cc -L/usr/local/lib/ -L/opt/local/lib skipfish.c -o skipfish -O3
-Wno-format -Wall -funsigned-char -g -ggdb -D_FORTIFY_SOURCE=0
-I/usr/local/include/ -I/opt/local/include/  \
          http_client.c database.c crawler.c analysis.c report.c -lcrypto
-lssl -lidn -lz
http_client.c:39:18: error: idna.h: No such file or directory
http_client.c: In function ‘parse_url’:
http_client.c:277: warning: implicit declaration of function 
‘idna_to_ascii_8z’
http_client.c:277: error: ‘IDNA_SUCCESS’ undeclared (first use in this
function)
http_client.c:277: error: (Each undeclared identifier is reported only once
http_client.c:277: error: for each function it appears in.)
report.c: In function ‘copy_static_code’:
report.c:744: warning: passing argument 3 of ‘scandir’ from incompatible
pointer type
make: *** [skipfish] Error 1
Note: before submitting, check:
http://code.google.com/p/skipfish/wiki/KnownIssues

Original issue reported on code.google.com by [email protected] on 24 Mar 2010 at 4:19

Abort error

I don't think this is error number 6, since this is with the latest version
of the code.

When I tried ./skipfish -o /var/tmp/out -W dictionaries/complete.wl
http://192.168.1.1

I got this error:

skipfish version 1.19b by <[email protected]>
*** glibc detected *** ./skipfish: realloc(): invalid pointer:
0x0000000002101420 ***
======= Backtrace: =========
/lib/libc.so.6[0x7f75d490ed16]
/lib/libc.so.6[0x7f75d49150c5]
./skipfish[0x40bff2]
./skipfish[0x40e0bb]
./skipfish[0x40e28a]
./skipfish[0x403123]
/lib/libc.so.6(__libc_start_main+0xfd)[0x7f75d48bcabd]
./skipfish[0x402369]
======= Memory map: ========
00400000-00429000 r-xp 00000000 08:12 21200950                          
/home/njh/Download/skipfish/skipfish
00628000-00629000 rw-p 00028000 08:12 21200950                          
/home/njh/Download/skipfish/skipfish
00629000-0062a000 rw-p 00000000 00:00 0
02100000-02121000 rw-p 00000000 00:00 0                                  [heap]
7f75d4484000-7f75d449a000 r-xp 00000000 08:06 3407913                   
/lib/libgcc_s.so.1
7f75d449a000-7f75d4699000 ---p 00016000 08:06 3407913                   
/lib/libgcc_s.so.1
7f75d4699000-7f75d469a000 rw-p 00015000 08:06 3407913                   
/lib/libgcc_s.so.1
7f75d469a000-7f75d469c000 r-xp 00000000 08:06 3408113                   
/lib/libdl-2.10.2.so
7f75d469c000-7f75d489c000 ---p 00002000 08:06 3408113                   
/lib/libdl-2.10.2.so
7f75d489c000-7f75d489d000 r--p 00002000 08:06 3408113                   
/lib/libdl-2.10.2.so
7f75d489d000-7f75d489e000 rw-p 00003000 08:06 3408113                   
/lib/libdl-2.10.2.so
7f75d489e000-7f75d49e8000 r-xp 00000000 08:06 3408094                   
/lib/libc-2.10.2.so
7f75d49e8000-7f75d4be8000 ---p 0014a000 08:06 3408094                   
/lib/libc-2.10.2.so
7f75d4be8000-7f75d4bec000 r--p 0014a000 08:06 3408094                   
/lib/libc-2.10.2.so
7f75d4bec000-7f75d4bed000 rw-p 0014e000 08:06 3408094                   
/lib/libc-2.10.2.so
7f75d4bed000-7f75d4bf2000 rw-p 00000000 00:00 0
7f75d4bf2000-7f75d4c09000 r-xp 00000000 08:06 4180650                   
/usr/lib/libz.so.1.2.3.4
7f75d4c09000-7f75d4e08000 ---p 00017000 08:06 4180650                   
/usr/lib/libz.so.1.2.3.4
7f75d4e08000-7f75d4e09000 rw-p 00016000 08:06 4180650                   
/usr/lib/libz.so.1.2.3.4
7f75d4e09000-7f75d4e3a000 r-xp 00000000 08:06 4181738                   
/usr/lib/libidn.so.11.6.1
7f75d4e3a000-7f75d503a000 ---p 00031000 08:06 4181738                   
/usr/lib/libidn.so.11.6.1
7f75d503a000-7f75d503b000 rw-p 00031000 08:06 4181738                   
/usr/lib/libidn.so.11.6.1
7f75d503b000-7f75d5089000 r-xp 00000000 08:06 4186090                   
/usr/lib/libssl.so.0.9.8
7f75d5089000-7f75d5289000 ---p 0004e000 08:06 4186090                   
/usr/lib/libssl.so.0.9.8
7f75d5289000-7f75d5290000 rw-p 0004e000 08:06 4186090                   
/usr/lib/libssl.so.0.9.8
7f75d5290000-7f75d5404000 r-xp 00000000 08:06 4184592                   
/usr/lib/libcrypto.so.0.9.8
7f75d5404000-7f75d5604000 ---p 00174000 08:06 4184592                   
/usr/lib/libcrypto.so.0.9.8
7f75d5604000-7f75d562c000 rw-p 00174000 08:06 4184592                   
/usr/lib/libcrypto.so.0.9.8
7f75d562c000-7f75d5630000 rw-p 00000000 00:00 0
7f75d5630000-7f75d564d000 r-xp 00000000 08:06 3407962                   
/lib/ld-2.10.2.so
7f75d5829000-7f75d582d000 rw-p 00000000 00:00 0
7f75d5847000-7f75d584c000 rw-p 00000000 00:00 0
7f75d584c000-7f75d584d000 r--p 0001c000 08:06 3407962                   
/lib/ld-2.10.2.so
7f75d584d000-7f75d584e000 rw-p 0001d000 08:06 3407962                   
/lib/ld-2.10.2.so
7fffb41b7000-7fffb41cd000 rw-p 00000000 00:00 0                         
[stack]
7fffb41ff000-7fffb4200000 r-xp 00000000 00:00 0                          [vdso]
ffffffffff600000-ffffffffff601000 r-xp 00000000 00:00 0                 
[vsyscall]
Aborted (core dumped)
njh@packard:~/Download/skipfish$

The gdb backtrace is:

#0  0x00007f75d48cff45 in *__GI_raise (sig=<value optimized out>)
   at ../nptl/sysdeps/unix/sysv/linux/raise.c:64
#1  0x00007f75d48d2d80 in *__GI_abort () at abort.c:88
#2  0x00007f75d490554d in __libc_message (do_abort=2,
   fmt=0x7fffb41c9c90 ' ' <repeats 23 times>,
"[stack]\n7fffb41ff000-7fffb4200000 r-xp 00000000 00:00 0", ' ' <repeats 26
times>, "[vdso]\nffffffffff600000-ffffffffff601000 r-xp 00000000 00:00 0",
' ' <repeats 18 times>, "[vsyscall]\n:06 4"...) at
../sysdeps/unix/sysv/linux/libc_fatal.c:173
#3  0x00007f75d490ed16 in malloc_printerr (action=3,
   str=0x7f75d49b6baf "realloc(): invalid pointer", ptr=<value optimized out>)
   at malloc.c:6239
#4  0x00007f75d49150c5 in realloc_check (oldmem=0x2101420, bytes=16,
   caller=<value optimized out>) at hooks.c:330
#5  0x000000000040bff2 in __DFL_ck_realloc (orig=0x2101420, size=5665)
   at alloc-inl.h:91
#6  0x000000000040e0bb in wordlist_confirm_single (text=<value optimized out>,
   is_ext=<value optimized out>, add_hits=<value optimized out>, total_age=2,
   last_age=2) at database.c:841
#7  0x000000000040e28a in load_keywords (fname=<value optimized out>,
   purge_age=0) at database.c:976
#8  0x0000000000403123 in main (argc=6, argv=0x7fffb41ca758) at skipfish.c:398

Original issue reported on code.google.com by fumeoftheday on 24 Mar 2010 at 12:14

Skipfish cannot find skipfish.wl

Tested:
skipfish-1.11b.tgz

command:
bash-3.2$ ./skipfish -o test123abc http://10.0.0.1

Output:
skipfish version 1.11b by <[email protected]>
[-]  SYSTEM ERROR : Unable to open wordlist 'skipfish.wl'
    Stop location : load_keywords(), database.c:968
       OS message : No such file or directory

No problem if run like:
./skipfish -W /dev/null -o test123abc http://10.0.0.1

Skipfish is run from within the directory it was compiled.

bash-3.2$ find ./ -name *.wl
.//dictionaries/complete.wl
.//dictionaries/default.wl
.//dictionaries/extensions-only.wl
.//dictionaries/minimal.wl

Original issue reported on code.google.com by Louwrentius on 22 Mar 2010 at 8:22

segfault

1.08b segfaults after 

make clean debug
ulimit -c unlimited
./skipfish -p 25% -g 5000 -m 1000 -o www.unosoft.hu http://www.unosoft.hu/ 
2>logfile.txt

gdb --batch -ex back ./skipfish core >gdb.out

Any suggestions?

Tamás Gulácsi
[email protected]

Original issue reported on code.google.com by [email protected] on 21 Mar 2010 at 9:22

Attachments:

False positives for PUT request

I'm using version 1.25 beta. The server I'm testing treats a PUT request as
a GET request, and hence Skipfish reports this as a PUT issue. Is there some
way Skipfish can be improved to avoid false positives for PUT (by testing
whether the PUT actually put a file there)?

Original issue reported on code.google.com by [email protected] on 25 Mar 2010 at 10:46

No build in OS X 10.6.2

cc skipfish.c -o skipfish -Wall -funsigned-char -g -ggdb -U_FORTIFY_SOURCE -O3 
-Wno-
format http_client.c database.c crawler.c analysis.c report.c -lcrypto -lssl 
-lidn -lz

Results in:

http_client.c:38:18: error: idna.h: No such file or directory
http_client.c: In function ‘parse_url’:
http_client.c:275: warning: implicit declaration of function 
‘idna_to_ascii_8z’
http_client.c:275: error: ‘IDNA_SUCCESS’ undeclared (first use in this 
function)
http_client.c:275: error: (Each undeclared identifier is reported only once
http_client.c:275: error: for each function it appears in.)
report.c: In function ‘copy_static_code’:
report.c:744: warning: passing argument 3 of ‘scandir’ from incompatible 
pointer type

Original issue reported on code.google.com by [email protected] on 19 Mar 2010 at 10:48

tarball (is not built from and) does not unpack into a versioned directory

What steps will reproduce the problem?

pull the tarball and unpack it

What is the expected output? What do you see instead?

$ tar zxf skipfish-1.05b.tgz
$ ls

and we see: 
README  skipfish  skipfish-1.05b.tgz

rather than a nice versioned directory, permitting easier 'diff'ing,
deltas, and avoidance of error from being in the wrong place

We should have seen:

README  skipfish-1.05b/  skipfish-1.05b.tgz

perhaps

What version of the product are you using? On what operating system?

As above on a CentOS 5 unit

Please provide any additional information below.

Please consider this as it greatly simplifies source code management for
packaging systems.

Original issue reported on code.google.com by herrold on 21 Mar 2010 at 2:28

Forms w/o an action attribute are not submitted

Hi Michal,

I noticed that if there is a form on a webpage which looks like
<form method="post">
  ... form content ...
</form>
then skipfish completely ignores this form and doesn't even try to submit
anything, even when I used the auto-complete rules for this form.

Once I added the action attribute, it started submitting this form.

I guess many forms omit action on the assumption that the form will be
submitted to the current location.

Original issue reported on code.google.com by [email protected] on 31 Mar 2010 at 10:54

http_client.c - line 863

In the source http_client.c, line 863.

By looking at the code, I have seen :

ASD("Connction: keep-alive\r\n");

instead of :

ASD("Connection: keep-alive\r\n");

the 'e' character is missing.

Original issue reported on code.google.com by [email protected] on 22 Mar 2010 at 5:31

Make fails on Ubuntu 9.10

Have installed idn and all other prerequisites. Attempting to run make generates:
http_client.c:39:18: error: idna.h: No such file or directory

Original issue reported on code.google.com by [email protected] on 26 Mar 2010 at 3:57

Interrupting the scan

I have been scanning a host using the complete dictionary and the scan
already ran for more than 5 hours. How can I stop this scan without losing
the data gathered by the scanner?

Original issue reported on code.google.com by [email protected] on 20 Mar 2010 at 12:30

eats way too much memory

After 1h it ate up over 2 GB of RAM. That's too much, especially since I
used minimal.wl. To reproduce it, scanning a page I administer is enough. I
can provide its URL, but privately.

skipfish version 1.06b by <[email protected]>

Scan statistics
---------------

       Scan time : 1:04:41.0108
   HTTP requests : 6467537 sent (1666.70/s), 4322432.00 kB in, 2537108.25
kB out (1767.42 kB/s)  
     Compression : 2239519.75 kB in, 6157672.50 kB out (46.66% gain)    
 HTTP exceptions : 2 net errors, 0 proto errors, 1 retried, 0 drops
 TCP connections : 64060 total (101.09 req/conn)  
  TCP exceptions : 0 failures, 2 timeouts, 1 purged
  External links : 1643732 skipped
    Reqs pending : 8367         

Database statistics
-------------------

          Pivots : 5486 total, 5279 done (96.23%)    
     In progress : 123 pending, 71 init, 3 attacks, 10 dict    
   Missing nodes : 100 spotted
      Node types : 1 serv, 88 dir, 4899 file, 123 pinfo, 241 unkn, 104 par,
30 val
    Issues found : 120 info, 6 warn, 3840 low, 8149 medium, 0 high impact
       Dict size : 2894 words (1010 new), 46 extensions, 256 candidates

Original issue reported on code.google.com by [email protected] on 21 Mar 2010 at 5:34

Fail to build on Mac OS X 10.6

When building from the tarball I got the following:

make all
cc skipfish.c -o skipfish -Wall -funsigned-char -g -ggdb -D_FORTIFY_SOURCE=0 
-O3 -Wno-format 
http_client.c database.c crawler.c analysis.c report.c -lcrypto -lssl -lidn -lz
http_client.c:38:18: error: idna.h: No such file or directory
http_client.c: In function ‘parse_url’:
http_client.c:275: warning: implicit declaration of function 
‘idna_to_ascii_8z’
http_client.c:275: error: ‘IDNA_SUCCESS’ undeclared (first use in this 
function)
http_client.c:275: error: (Each undeclared identifier is reported only once
http_client.c:275: error: for each function it appears in.)
report.c: In function ‘copy_static_code’:
report.c:744: warning: passing argument 3 of ‘scandir’ from incompatible 
pointer type
make: *** [skipfish] Error 1


Also, it looks like SVN for this project is mostly empty?

Andrei

Original issue reported on code.google.com by [email protected] on 21 Mar 2010 at 3:56

skipfish segfaulted while scanning


skipfish segfaulted while scanning an application.

Core was generated by 
`./skipfish -C AUDSSESSION 1d5282c6539eb1d7480d2b7b4ee107ec -N -o /tmp/auds
-g'.

The directory was empty so no logs can be provided.  A corefile can be
provided on request. 

gdb output:


Core was generated by `./skipfish -C AUDSSESSION
1d5282c6539eb1d7480d2b7b4ee107ec -N -o /tmp/auds3 -g'.
Program terminated with signal 11, Segmentation fault.
#0  0x00007f653967a7f4 in strcasecmp () from /lib/libc.so.6
(gdb) bt
#0  0x00007f653967a7f4 in strcasecmp () from /lib/libc.so.6
#1  0x00000000004052c6 in set_value (type=<value optimized out>, 
    name=0x491a1f0 "form", val=<value optimized out>, 
    offset=<value optimized out>, par=0x49295e8) at http_client.c:139
#2  0x000000000041ede2 in collect_form_data (req=<value optimized out>, 
    res=<value optimized out>) at analysis.c:545
#3  scrape_response (req=<value optimized out>, res=<value optimized out>)
    at analysis.c:789
#4  0x00000000004133e3 in par_dict_callback (req=0x4916410, res=0x4917600)
    at crawler.c:1894
#5  0x000000000040bbae in next_from_queue () at http_client.c:2038
#6  0x00000000004033b6 in main (argc=<value optimized out>, 
    argv=<value optimized out>) at skipfish.c:419
(gdb) u
The program is not running.
(gdb) up
#1  0x00000000004052c6 in set_value (type=<value optimized out>, 
    name=0x491a1f0 "form", val=<value optimized out>, 
    offset=<value optimized out>, par=0x49295e8) at http_client.c:139
139       if (name && strcasecmp((char*)par->n[i], (char*)name)) continue;
(gdb) print name
$1 = (u8 *) 0x491a1f0 "form"
(gdb) print par->n[i]
$2 = (u8 *) 0x0
(gdb) print i
$3 = 1

Original issue reported on code.google.com by florian.streibelt on 22 Mar 2010 at 10:56

building issue on ubuntu lucid

On Ubuntu Lucid beta1, after installing libidn11-dev, it is still not
possible to compile skipfish 1.03b.

We need to install the libssl-dev package to make it compile, but the README
does not contain information about this.

What steps will reproduce the problem?
1. try make
2. You'll get an "openssl/ssl.h: No such file or directory" error.
3. Install libssl-dev, and now it compiles.


Original issue reported on code.google.com by [email protected] on 20 Mar 2010 at 1:54

Improve compilation documentation

I just compiled Skipfish in Windows with Cygwin and I had to install the
following packages:

- libidn
- libidn-devel
- openssl-devel

In the docs, only "libidn" is mentioned, but they should include the other packages as well.

Regards.

Original issue reported on code.google.com by [email protected] on 1 Apr 2010 at 12:16

No source code in svn repository

I've tried to checkout source code, but can't find it.
$svn checkout http://skipfish.googlecode.com/svn/trunk/ skipfish-read-only
$ls skipfish-read-only -lha
total 12K
drwxr-xr-x  3 dr dr 4,0K Mar 26 11:59 ./
drwx------ 28 dr dr 4,0K Mar 26 11:59 ../
drwxr-xr-x  6 dr dr 4,0K Mar 26 12:00 .svn/

Is this a problem? 

Original issue reported on code.google.com by [email protected] on 26 Mar 2010 at 10:08

  • Merged into: #15

False positives -- caused by insignificant HTML differences?

I just ran Skipfish against a website on our staging server.  The scan took
a couple of days to run (10 million hits on a DB-driven site -- should have
used a smaller dictionary!) and came up with 3000-odd issues.

Scanning through, almost all issues are of the form:

#  http://xxxx/numbers/letters.html/-2147483649 
Memo: response to -(2^31-1) different than to -12345

I believe these to be false positives, as the responses seem to be the
same, except for three things:

1) A comment in the footer of the page source that looks like this:
<!-- Generated by Messiah Ltd. on Fri, 26 Mar 2010 08:31:15 +1300 in 23.7
milliseconds. -->

2) An image tag that is randomly selected (server-side) for each page view.

3) A small quote, also randomly selected per page view.

The scan was run using Skipfish version 1.11b.  It only just finished, and
I haven't yet scanned again with a newer version.





Original issue reported on code.google.com by [email protected] on 25 Mar 2010 at 7:38

Build error: idna.h: No such file or directory

What steps will reproduce the problem?
1. Download skipfish-1.01b.tgz
2. Extract
3. cd skipfish
4. make

What is the expected output? What do you see instead?

➜  skipfish ✗ make
cc skipfish.c -o skipfish -Wall -funsigned-char -g -ggdb -U_FORTIFY_SOURCE -O3 
-Wno-
format http_client.c database.c crawler.c analysis.c report.c -lcrypto -lssl 
-lidn -lz
http_client.c:38:18: error: idna.h: No such file or directory
http_client.c: In function ‘parse_url’:
http_client.c:275: warning: implicit declaration of function 
‘idna_to_ascii_8z’
http_client.c:275: error: ‘IDNA_SUCCESS’ undeclared (first use in this 
function)
http_client.c:275: error: (Each undeclared identifier is reported only once
http_client.c:275: error: for each function it appears in.)
report.c: In function ‘copy_static_code’:
report.c:744: warning: passing argument 3 of ‘scandir’ from incompatible 
pointer type
make: *** [skipfish] Error 1
➜  skipfish ✗ ls
COPYING       analysis.c    crawler.c     debug.h       report.c      
string-inl.h
Makefile      analysis.h    crawler.h     dictionaries  report.h      types.h
README        assets        database.c    http_client.c same_test.c
alloc-inl.h   config.h      database.h    http_client.h skipfish.c

What version of the product are you using? On what operating system?
1.01b
OSX 10.6.2

➜  skipfish ✗ cc --version
i686-apple-darwin10-gcc-4.2.1 (GCC) 4.2.1 (Apple Inc. build 5646) (dot 1)
Copyright (C) 2007 Free Software Foundation, Inc.
This is free software; see the source for copying conditions.  There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.

Original issue reported on code.google.com by [email protected] on 20 Mar 2010 at 12:06

  • Merged into: #2

Limit bandwidth

Is it possible to limit the bandwidth? I tried skipfish against a domain and
caused a DoS :S

Original issue reported on code.google.com by diegors on 28 Mar 2010 at 11:53

ignore ssl certificate errors

Would it be possible to have a flag for ignoring:
1) self signed ssl certificates
2) out of date ssl certificates

Many thanks

Original issue reported on code.google.com by [email protected] on 24 Mar 2010 at 12:17

Makefile, LDFLAGS

Sorry, I think we had a bit of a misunderstanding here.

1.15b's Makefile has:

LDFLAGS   += -lcrypto -lssl -lidn -lz -L/usr/local/lib/ -L/opt/local/lib

$(CC) $(PROGNAME).c -o $(PROGNAME) $(CFLAGS_OPT) $(OBJFILES) $(LDFLAGS)

This won't work because LDFLAGS are appended too late, and as such, are
ignored.

It should be:

LIBS       = -lcrypto -lssl -lidn -lz -L/usr/local/lib/ -L/opt/local/lib

$(CC) $(LDFLAGS) $(PROGNAME).c -o $(PROGNAME) $(CFLAGS_OPT) \
$(OBJFILES) $(LIBS)

In order to respect the environment's LDFLAGS, they can't be mixed with -l
flags... and even when not respecting them, passing `make LDFLAGS=""` as an
argument would be broken.

(That's what I really tried to say on the previous bug, and got a bit
sidetracked by other vars there, I guess, sorry...)

Attached is the way I would write it...

Then you can:

export CFLAGS="-march=core2 -pipe -O2"
export LDFLAGS="-Wl,--as-needed"

make debug          -> -g -ggdb is only appended then, and above is used

make                -> build with upstream optimization, -O3, above is
prepended to this too

make OPT_CFLAGS=""  -> use only your own above environment flags
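The proposed split can be sketched as a minimal Makefile fragment (the file
list is taken from the build lines quoted in other reports here; variable
names are illustrative, not skipfish's actual Makefile):

```make
# LDFLAGS stays reserved for the environment (e.g. -Wl,--as-needed),
# while libraries live in LIBS and are passed last on the link line.
CC       ?= cc
CFLAGS   ?= -O3 -Wall -funsigned-char
LIBS      = -L/usr/local/lib/ -L/opt/local/lib -lcrypto -lssl -lidn -lz
OBJFILES  = http_client.c database.c crawler.c analysis.c report.c

skipfish: skipfish.c $(OBJFILES)
	$(CC) $(LDFLAGS) $(CFLAGS) skipfish.c -o skipfish $(OBJFILES) $(LIBS)
```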

Original issue reported on code.google.com by [email protected] on 23 Mar 2010 at 10:27

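The proposed split can be sketched as a minimal Makefile fragment (library list taken from the 1.15b Makefile quoted above; the rule layout is illustrative, not skipfish's actual build file):

```makefile
# -l flags and library search paths live in LIBS, so the LDFLAGS
# variable stays free for the user's environment (e.g. -Wl,--as-needed).
PROGNAME   = skipfish
OBJFILES   = http_client.c database.c crawler.c analysis.c report.c
LIBS       = -lcrypto -lssl -lidn -lz -L/usr/local/lib/ -L/opt/local/lib

# LDFLAGS comes early on the link line so user-supplied linker options
# are honored instead of being appended after everything else.
$(PROGNAME): $(PROGNAME).c $(OBJFILES)
	$(CC) $(LDFLAGS) $(PROGNAME).c -o $(PROGNAME) $(CFLAGS_OPT) \
	  $(OBJFILES) $(LIBS)
```

With this split, `make LDFLAGS=""` simply drops the user's linker options without losing the required libraries.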
Can't make on Mac OS X 10.6.2

Instead of compiling, I see this:

verigon:skipfish test$ sudo make
Password:
cc skipfish.c -o skipfish -Wall -funsigned-char -g -ggdb
-D_FORTIFY_SOURCE=0 -O3 -Wno-format http_client.c database.c crawler.c
analysis.c report.c -lcrypto -lssl -lidn -lz
http_client.c:38:18: error: idna.h: No such file or directory
http_client.c: In function ‘parse_url’:
http_client.c:275: warning: implicit declaration of function 
‘idna_to_ascii_8z’
http_client.c:275: error: ‘IDNA_SUCCESS’ undeclared (first use in this
function)
http_client.c:275: error: (Each undeclared identifier is reported only once
http_client.c:275: error: for each function it appears in.)
report.c: In function ‘copy_static_code’:
report.c:744: warning: passing argument 3 of ‘scandir’ from incompatible
pointer type
make: *** [skipfish] Error 1


What version of the product are you using? On what operating system?

Developer Information:

  Version:  3.2 (10M2003)
  Location: /Developer
  Applications:
  Xcode:    3.2.1 (1613)
  Interface Builder:    3.2.1 (740)
  Instruments:  2.0.1 (1096)
  Dashcode: 3.0 (328)
  SDKs:
  Mac OS X:
  10.5: (9J61)
  10.6: (10M2003)


Original issue reported on code.google.com by [email protected] on 20 Mar 2010 at 10:44

SVN Checkout: No files!

% svn checkout http://skipfish.googlecode.com/svn/trunk/ skipfish-read-only
Checked out revision 74.

% ls -a skipfish-read-only
.  ..  .svn

No other files to work with!

Original issue reported on code.google.com by dan.rasmussen on 23 Mar 2010 at 4:16

Html report not working in Chrome

What steps will reproduce the problem?
1. run a scan
2. open generated index.html in google chrome (I use 5.0.342.3)
3.

What is the expected output? What do you see instead?

Using Firefox, report parts appear, are clickable and expand properly. 

With Chrome, no results appear (section titles such as "Crawl results - 
click to expand:" are visible, but without content). Please see attached 
screenshot !


What version of the product are you using? On what operating system?

- OS is Linux, Debian "squeeze"
- skipfish 1.05b
- google chrome 5.0.342.3

Please provide any additional information below.

Original issue reported on code.google.com by [email protected] on 20 Mar 2010 at 10:22


Support OpenBSD

I can't compile it on OpenBSD:

tmp//ccitPTK6.o(.text+0x1482): In function `serialize_path':
/home/dbd/Downloads/skipfish/http_client.c:441: warning: sprintf() is often
misused, please use snprintf()
/tmp//ccMD7G7q.o(.text+0xeb3): In function `__DFL_ck_alloc':
/home/dbd/Downloads/skipfish/alloc-inl.h:71: undefined reference to
`malloc_usable_size'
/tmp//ccMD7G7q.o(.text+0xf6a): In function `__DFL_ck_realloc':
/home/dbd/Downloads/skipfish/alloc-inl.h:88: undefined reference to
`malloc_usable_size'
/tmp//ccMD7G7q.o(.text+0xf95):Downloads/skipfish/alloc-inl.h:94: undefined
reference to `malloc_usable_size'
/tmp//ccMD7G7q.o(.text+0x1093): In function `__DFL_ck_strdup':
Downloads/skipfish/alloc-inl.h:116: undefined reference to `malloc_usable_size'
/tmp//ccitPTK6.o(.text+0x11a6): In function `serialize_path':
Downloads/skipfish/http_client.c:635: undefined reference to
`malloc_usable_size'
/tmp//ccitPTK6.o(.text+0x128e):Downloads/skipfish/http_client.c:668: more
undefined references to `malloc_usable_size' follow
collect2: ld returned 1 exit status
*** Error code 1

I also get a warning:

/usr/include/malloc.h:4:2: warning: #warning "<malloc.h> is obsolete, use
<stdlib.h>"



Original issue reported on code.google.com by [email protected] on 22 Mar 2010 at 9:21

source code is empty

The wiki advises downloading the source code via:

    svn checkout http://skipfish.googlecode.com/svn/trunk/ skipfish-read-only

But this command results only in an empty directory.

Original issue reported on code.google.com by [email protected] on 21 Mar 2010 at 6:01

Application Run Directory Not Configurable

It looks like the application must be run from the same directory that has 
the assets/* files.

[-]  SYSTEM ERROR : Unable to access 'assets/index.html' - wrong directory?
    Stop location : main(), skipfish.c:374
       OS message : No such file or directory

It would be nice if one could supply the assets path via the CLI and/or 
configure it to be a specific location.

Original issue reported on code.google.com by [email protected] on 22 Mar 2010 at 7:50

DoS attack possible

It would be highly desirable to implement a DoS (denial of service) blocker.

The easiest way would be to require a simple key file,
 like "http://example.com/skipfish_[sha_digest_hex].html"


Original issue reported on code.google.com by [email protected] on 21 Mar 2010 at 5:05

no report on firefox

No report is shown using Firefox 3.0.11.
This seems to be the same behaviour as the Chrome-related issue.
(see image)

Original issue reported on code.google.com by [email protected] on 23 Mar 2010 at 10:55
