
squidanalyzer's People

Contributors

0xflotus, atlhon, atorrillasmat, badfiles, brenard, dalibot, darold, joseh-henrique, sathieu, slashdoom, tachtler, uow-dmurrell

squidanalyzer's Issues

Weird top-domain result

Hello again,

I am currently trying to figure out why my 60 GB of youtube.com downloads are not appearing anywhere on my top-domain page. The total amount of daily downloaded data is 350 GB: the 1st domain is only at 1,735.30 MB and the 30th one only at 1.18 MB. Given my traffic, those figures do not make sense.

By the way my "last visite" field (13:05:15) is older than my "first visite" (16:57:45).

Regards,

TAU

Bug in week statistic

In the following code:
https://github.com/darold/squidanalyzer/blob/master/SquidAnalyzer.pm#L630
up to L642.
There is an error in the logic: squidanalyzer assumes that only one week has been read, but a single run over an access.log can capture two or more weeks of logs, so it should assume that multiple weeks are read.
If the source access.log file has two records, one from, say, June 8th (week 23) and one from June 9th (week 24), then the directory structure and statistics are not maintained for week 23 and squidanalyzer ends uncleanly with the error message:
ERROR: Unable to open /www-test/2014/week23/index.html. No such file or directory.

This logfile causes this behaviour:
1402223172.211 0 10.121.32.197 TCP_IMS_HIT/304 354 GET http://crl.microsoft.com/pki/crl/products/MicCodSigPCA_08-31-2010.crl - NONE/- application/pkix-crl
1402309570.503 0 10.121.32.197 TCP_IMS_HIT/304 354 GET http://crl.microsoft.com/pki/crl/products/MicCodSigPCA_08-31-2010.crl - NONE/- application/pkix-crl
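
For illustration, a minimal sketch of the expected behaviour (this is not SquidAnalyzer's actual code, and it uses strftime's %W week numbering, which may differ from SquidAnalyzer's own convention): collect every distinct week present in the input instead of assuming a single one.

use strict;
use warnings;
use POSIX qw(strftime);

# Group log lines by year/week directory, e.g. "2014/week23".
my %weeks_seen;
while (my $line = <DATA>) {
    my ($epoch) = $line =~ /^(\d+)\./ or next;
    my $week_dir = strftime('%Y/week%W', localtime($epoch));
    $weeks_seen{$week_dir}++;
}
# Every week seen here needs its directory and index.html maintained.
print "$_: $weeks_seen{$_} line(s)\n" for sort keys %weeks_seen;

__DATA__
1402223172.211 0 10.121.32.197 TCP_IMS_HIT/304 354 GET http://crl.microsoft.com/pki/crl/products/MicCodSigPCA_08-31-2010.crl - NONE/- application/pkix-crl
1402309570.503 0 10.121.32.197 TCP_IMS_HIT/304 354 GET http://crl.microsoft.com/pki/crl/products/MicCodSigPCA_08-31-2010.crl - NONE/- application/pkix-crl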

Feature request: Include Users

Hi,

Are there any plans to add an include_users file?

It would be very useful because it could be used to script per-group reports.
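
For illustration, a sketch of how such a (hypothetical) include_users file could work, with one username per line acting as a whitelist while parsing:

use strict;
use warnings;

# Load the hypothetical include_users file if it exists.
my %include;
if (open my $fh, '<', 'include_users') {
    chomp(my @users = <$fh>);
    @include{@users} = ();
}

# While parsing the access log (simulated here with a fixed list),
# skip any login not in the whitelist.
for my $login (qw(alice bob mallory)) {
    next if %include && !exists $include{$login};
    print "counting $login\n";
}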

Thanks

filtering on header and mimetype basis

Hi,

I have set up this tool and it's quite wonderful and fast.

I have one requirement, and I'm not sure whether it is implemented or not:

  1. I would like to track only GET requests, not HEAD requests
  2. I would like to track only some specific MIME types, like text/plain or text/html, not all of them

I also noticed one issue:

On the Users tab it displays info about the users, but it only shows HIT requests. I have verified from the squid log that a user shows TCP_MISS, but in the Users data it always shows only HIT; I don't know why :(.

Please suggest and thanks in advance.

Please let me know if you need more info.

Regards
Ajay

Bug in code while using DNS

Hi,
The current code is not working:

if ( ($id eq $client) && $self->{UseClientDNSName}) {
    if ($client =~ /^\d+.\d+.\d+.\d+$/) {
        my $dnsname = $self->_gethostbyaddr($client);
        if ($dnsname) {
            $id = $dnsname;
        }
    }
}

Hence, if I enable UseClientDNSName in the config it is not working. It should check the value as below.

if ( ($id eq $client) && $self->{UseClientDNSName} eq 0) {
    if ($client =~ /^\d+.\d+.\d+.\d+$/) {
        my $dnsname = $self->_gethostbyaddr($client);
        if ($dnsname) {
            $id = $dnsname;
        }
    }
}

weekly bug after upgrade

I upgraded from the previous version to what's out now. The upgrade went well, but I'm unable to get the weekly feature working; I get the following:

Not Found

The requested URL /squidreport//2014/week27 was not found on this server.

Any ideas? The version I had previously did not have the weekly feature.

Julian

Bug with week statistic

Hi,

I have a problem with the graph of the week statistic.
As you can see in the attached images, when a week spans two months, like 31 May and 1 June for week 22, the graph is incorrect because the day number of the last day is lower than that of the previous month.
Do you think it is possible to do something easily?

The first picture shows the original bug.
For the second picture, I changed the graph by removing the 01 data in the HTML and replacing the 01 with 32.

Thank you.
(screenshots: 2014-06-02_15h27_05, 2014-06-02_15h27_14)

Squidreport site forbidden

Hi,
I have created a CentOS 6.4 server to run Squid Proxy and SquidAnalyzer, but I cannot get the website to load. When I try, all I get is:
Forbidden: you do not have permission to access /squidreport/ on this server.
I can access the server's root website and the Tomcat website, but not squidreport.
I followed the instructions on the website but it is still not working.
Can anyone help?
Thanks,
alamb200

System Rebuild

Hi,
Thank you again for the issues I opened previously.
This time I just have suggestions about the rebuild action and reporting.

Firstly, I have 3 proxies, and to manage logs I send them via rsyslog to a log server. This server executes a script every morning to extract the previous day's logs from a database.
Then I rebuild squidanalyzer with this file (only yesterday's logs are in it).
But when I rebuild, the script rebuilds every day/week/month.
If you have time, could you look at a way to avoid rebuilding every day/week/month?

Secondly, my boss likes to have reports in PDF format.
To do that I installed "wkhtmltoimage" to convert HTML to an image, and then "imagemagick" to convert the image to PDF. I know I could install "wkhtmltopdf" and it would be faster, but the CSS is not completely supported by that program.
I would like to have a button in the web browser for creating a report; do you think it's possible to do that?

Thank you again and again for your work.

spec file : requires squid needed ?

Hi,

I'm working with the provided spec file to build a custom RPM for a client.
I noticed the original spec file requires "squid" as a dependency.
Does that mean squidanalyzer can't work without squid installed on the same server? I don't think so.
In that case, the spec file should not list squid in "Requires" as a dependency.

Regards.

Badly ordered check on squid log file to parse

When you use the command line to specify a squid log file, the check on the log file is not done properly.
As far as I can tell, the code first reads your squidanalyzer.conf file and looks for the Logfile param. It checks whether that file is present and, if not, throws an error (sub parse_config / line 2851).
But if you pass the "-l" parameter with a logfile path, it is only used in sub _init / line 402, just after the call to parse_config.

This does not make sense, since you have to enter a value in the conf file even though you specify one on the command line, and in the end it is the command-line one that is used.
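
For illustration, a sketch of the ordering I would expect (variable and sub names here are illustrative, not SquidAnalyzer's actual ones): read the -l option first, fall back to the config value, and only then validate.

use strict;
use warnings;
use Getopt::Long;

# Minimal stand-in for SquidAnalyzer's config parser: "Key value" lines.
sub parse_config {
    my $file = shift;
    my %conf;
    open my $fh, '<', $file or die "FATAL: cannot open $file: $!\n";
    while (<$fh>) {
        next if /^\s*#/ || /^\s*$/;
        $conf{$1} = $2 if /^(\w+)\s+(.+?)\s*$/;
    }
    return %conf;
}

my ($configfile, $logfile) = ('/etc/squidanalyzer/squidanalyzer.conf', '');
GetOptions('c|configfile=s' => \$configfile, 'l|logfile=s' => \$logfile);

my %conf = parse_config($configfile);   # must not die just because Logfile is unset
$logfile ||= $conf{Logfile} // '';      # command line wins; config is the fallback
die "FATAL: no readable log file specified\n" unless $logfile && -r $logfile;
print "would parse: $logfile\n";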

Network-aliases improvement

Hello,

I am currently trying SquidAnalyzer to get some interesting figures on our corporate internet usage. So far, I admit it is one of the most complete tools I have ever tested, even if it is a bit slow for my 20M requests/day.

Anyway, the network-aliases file had to be filled with 70 regexes representing our subnets. It took a long time and was not so obvious to do.

In my opinion, a feature matching IPs against subnets (192.16.0.0/16) would be interesting for people who administer lots of subnets. BTW, such a subnet list can be easily extracted from most DHCP systems.

I'm using this kind of tool for another need; the code is quite old but has done the job well for many years now, and I can obviously provide it if it interests you. It also appears to be quite efficient: matching 11,000 IPs against 70 subnets in 2.7 s real time, on a small VM. It also does not use any extra CPAN package.
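
For illustration, a minimal pure-Perl sketch of such CIDR matching (no extra CPAN modules, IPv4 only; this is just the idea, not my production code):

use strict;
use warnings;

# Pack a dotted quad into a 32-bit integer.
sub ip2int {
    my @o = split /\./, shift;
    return ($o[0] << 24) | ($o[1] << 16) | ($o[2] << 8) | $o[3];
}

# True if $ip falls inside the $cidr subnet, e.g. '192.16.0.0/16'.
sub in_subnet {
    my ($ip, $cidr) = @_;
    my ($net, $bits) = split m{/}, $cidr;
    my $mask = $bits == 0 ? 0 : (0xFFFFFFFF << (32 - $bits)) & 0xFFFFFFFF;
    return (ip2int($ip) & $mask) == (ip2int($net) & $mask);
}

print in_subnet('192.16.12.34', '192.16.0.0/16') ? "match\n" : "no match\n";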

Regards,

TAU

multiple log files

Is there a way to have Squidanalyzer look at multiple log files and combine the data?
I am doing a multi-instance implementation and have several access and error log files.

Thanks

Rerun for a day only

Hi,
My stats for a day remain incomplete or only partially complete when the process aborts for some reason. Is it possible to re-run squid-analyzer for a particular day, maybe after editing whichever file keeps track?
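
Update: the usage text quoted further down this page mentions a -b | --build_date option ("set the date to be rebuilt, format: yyyy-mm-dd ... Used with -r or --rebuild"), so presumably something like this would rebuild a single day:

squid-analyzer -r -b 2014-06-08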

Deepti

Timestamp in User Statistics

Hi,
is it possible to add a "Time Stamp" column in "user statistics"?
Like: Time stamp - Url - Requests (%) - Bytes (%) - Duration (%)

Also, I have another request:
is it possible to implement a feature so that clicking a link in "top domain" gives a web page with all the users that requested that URL?

Thank you.

Unable to generate data

For a reason that I can't explain, squidanalyzer is unable to generate graphs and data.
When I execute /usr/local/bin/squid-analyzer, index.html is the only thing that is generated, which is really weird.

I tried pointing it at the log file /var/log/squid3/access.log, but nothing appears.

Does anyone have an idea about my problem?

FR: Use of translated months number in daily statistic graph

Feature Request:

It would be nice if you could use the translated month name (01, 02 -> Jan, Feb) in the daily statistics instead of the month number:

now
"Daily Requests statistics on 11.2012"

FR target
"Daily Requests statistics on Nov 2012"

The best solution would be if this were configurable in squidanalyzer.conf :-)

Bytes statistic graph

The bytes graphs don't show megabytes and other transfer units correctly, and the MIME type bytes graph also rounds up the numbers. Please correct it.
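
For illustration, the kind of unit scaling the graphs would need (a sketch, not the project's actual code):

use strict;
use warnings;

# Scale a raw byte count into a human-readable unit.
sub format_bytes {
    my $bytes = shift;
    my @units = ('B', 'KB', 'MB', 'GB', 'TB');
    my $i = 0;
    while ($bytes >= 1024 && $i < $#units) {
        $bytes /= 1024;
        $i++;
    }
    return sprintf '%.2f %s', $bytes, $units[$i];
}

print format_bytes(136801), "\n";   # prints "133.59 KB"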

default configuration file in sub usage of squid-analyzer wrapper is wrong

In the file squid-analyzer, the help text says the config file is /etc/squidanalyzer.conf.
It should be /etc/squidanalyzer/squidanalyzer.conf, like $DEFAULT_CONFIGFILE.
Furthermore, even though the string is modified by Makefile.PL, it is still wrong after "making".

Suggested patch:

--- squidanalyzer-5.3/squid-analyzer.orig   2014-02-14 11:09:15.224413505 +0100
+++ squidanalyzer-5.3/squid-analyzer    2014-02-14 11:09:31.214365890 +0100
@@ -94,7 +94,7 @@
 Usage: squid-analyzer [ -c squidanalyzer.conf ] [-l logfile]

     -c | --configfile filename : path to the SquidAnalyzer configuration file.
-                By default: /etc/squidanalyzer.conf
+                By default: /etc/squidanalyzer/squidanalyzer.conf
     -b | --build_date date     : set the date to be rebuilt, format: yyyy-mm-dd
                 or yyyy-mm or yyyy. Used with -r or --rebuild.
     -d | --debug               : show debug informations.

Otherwise, you could use the variable $DEFAULT_CONFIGFILE in sub "usage". I don't know which solution is best.

Strange behaviour when a configfile path is given to squid-analyzer

Hi,

It seems squid-analyzer uses the default configuration file even if a custom one is specified with the -c option.

After a quick look at the code at line 53, the conditional statement should be if (!($#ARGV < 0) && -e $DEFAULT_CONFFILE), with a ! before the first test, no?

Regards,

TAU

Cleaning old stats - minor patch

I have made a minor patch to the recently added feature which was requested and resolved in #7.

I have modified the code slightly (to make it more logical and use fewer iterations).

So I am sharing it just in case it is accepted.
Patch is here: https://gist.github.com/4696807

Basically, it removes the need for oldest_date.

Also, I do not know why this line (line 1222 of the original code) needs to check oldest_date:
next if (!$self->{oldest_date} && ($y < $old_year));

But I have kept it in my modified code.

So please review it.

thank you

Networks report always contains an extra zero in $show

Hi.
Thanks for this tool. It's great.

I've noticed that the networks.html file always contains an extra 'dot zero' (.0) in the name of the network, even when the network shown already has four octets in it.

For example, the _print_network_stat function produces code similar to the following.

<a href="networks/220.61.224.0/220.61.224.0.html">220.61.224.0.0</a>

I believe that the problem is at around line 1701.

https://github.com/darold/squidanalyzer/blob/master/SquidAnalyzer.pm#L1701
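
I have not isolated the exact statement, but a guard like this sketch (illustrative, not the project's code) would avoid appending the extra octet:

use strict;
use warnings;

# Only append ".0" when the label does not already have four octets.
for my $show ('220.61.224', '220.61.224.0') {
    my $label = $show =~ /^(?:\d{1,3}\.){3}\d{1,3}$/ ? $show : "$show.0";
    print "$label\n";   # both iterations print "220.61.224.0"
}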

Many thanks,
Ben

Domain summarization

Hello,
first of all, I'd like to thank you for the great work that squidanalyzer is.
It's a very useful tool as it stands.

My request is:
Would it be possible to create some form of domain summarization?
For example, a rule in the configuration file to aggregate a specific domain like .youtube. or .google.

The reason I ask is that it would give a clearer overview when looking at users or top URLs.

To make it clear and easy to understand what I mean, now you see something like:
r4---sn-4g57kn7z.c.youtube.com
r13---sn-4g57knls.c.youtube.com
dcs-188-64-84-33.atmcdn.com
people.ece.cornell.edu
www.google.com
www.facebook.com
r9---sn-4g57ln76.c.youtube.com
i1.ytimg.com
r11---sn-4g57ln7k.c.youtube.com
clients4.google.com
r9---sn-4g57ln7k.c.youtube.com
r14---sn-4g57knzl.c.pack.google.com

And my proposal is to summarize them as:
*.youtube.com
*.atmcdn.com
*.cornell.edu
*.google.com
*.facebook.com
*.ytimg.com

And when you click a specific domain you get its details, i.e. you click *.youtube.com and you see:
r4---sn-4g57kn7z.c.youtube.com
r13---sn-4g57knls.c.youtube.com
(...)
r20---sn-4g57knez.c.youtube.com
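
For illustration, a naive sketch of the summarization (it collapses every hostname to *.<last-two-labels>; a real implementation would need to treat ccTLDs like .co.uk specially):

use strict;
use warnings;

my %summary;
while (my $host = <DATA>) {
    chomp $host;
    my @parts = split /\./, $host;
    my $key = @parts > 2 ? '*.' . join('.', @parts[-2, -1]) : $host;
    $summary{$key}++;
}
print "$_ ($summary{$_})\n" for sort keys %summary;

__DATA__
r4---sn-4g57kn7z.c.youtube.com
r13---sn-4g57knls.c.youtube.com
people.ece.cornell.edu
www.google.com
i1.ytimg.com
clients4.google.com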

Best regards,
Wojciech

sorting columns

The Requests (%) and Bytes (%) columns are not sorting properly when you click on them.

Missing fields in regex parsing

Hi, I've recently installed SquidAnalyzer and the fields beyond the username are not being parsed correctly: it identifies the status as the login, and so on. The squid log format was modified as mentioned, with the same result.

I've rewritten the regex to match these fields and be more accurate on matches.

Here is the code:

diff SquidAnalyzer.pm /usr/share/perl5/SquidAnalyzer.pm
193c193,194
<               if ( $line =~ s#^(\d+\.\d{3})\s+(\d+)\s+([^\s]+)\s+([^\s]+)\s+(\d+)\s+([^\s]+)\s+## ) {

---
>               #if ( $line =~ s#^(\d+\.\d{3})\s+(\d+)\s+([^\s]+)\s+([^\s]+)\s+(\d+)\s+([^\s]+)\s+## ) {
>               if ( $line =~ s#^(\d+\.\d{3})\s+(\d+)\s+(\S+)\s+(\S+)\s+(\d+)\s+(\S+)\s+(.*)\s+(\S+)\s+(\w+\/\S+)\s+(\S+)\s+(\S+)## ) {
195a197
>                       $client_ip = $3 || '';
199c201,204
<                       $client_ip = $3 || '';

---
>                       $url = $7 || '';
>                       $login = $8 || '';
>                       $status = $9 || '';
>                       $mime_type = $10 || '';
221,225c226,231
<                       if ( $line =~ s#^(.*)\s+([^\s]+)\s+([^\s]+\/[^\s]+)\s+([^\s]+)\s*## ) {
<                               $url = lc($1) || '';
<                               $login = lc($2) || '';
<                               $status = lc($3) || '';
<                               $mime_type = lc($4) || '';

---
>                       #if ( $line =~ s#^(.*)\s+([^\s]+)\s+([^\s]+\/[^\s]+)\s+([^\s]+)\s*## ) {
>                       #if ( $line =~ s#^(.*)\s+(\S+)\s+(\w+\/\S+)\s+(\S+)$## ) {
>                               #$url = lc($1) || '';
>                               #$login = lc($2) || '';
>                               #$status = lc($3) || '';
>                               #$mime_type = lc($4) || '';
286c292
<                       }

---
>                       #}

Now, I'm testing this patch.

Regards

Some improvements / patch

I have a few suggestions to make squidanalyzer better and a little faster.

Please check them and implement if required.

Thanks.

  1. Makefile.PL should use --skip-alias with which,

i.e.
my $zcat = `which --skip-alias zcat`;
my $bzcat = `which --skip-alias bzcat`;

  2. squid-analyzer should ignore the HUP signal just before the new instance is created.

Please add:
$SIG{'HUP'} = 'IGNORE';

before:
my $sa = new SquidAnalyzer($configfile, $logfile, $debug);

This allows a (long) process to keep running even if your terminal closes or disconnects (otherwise the HUP signal kills the process).

  3. A patch that does a few minor things (SquidAnalyzer.pm), as follows:

Link to patch:
https://gist.github.com/ammdispose/5459542

a) Pre-increment (++$i) is faster than post-increment ($i++). Since the program processes lots of things recursively (possibly hundreds of thousands of increments), it is suggested to use pre-increment.

b) In some places, variables passed to functions are quoted, which is not necessary, e.g.
$self->_save_data("$self->{last_year}", "$self->{last_month}", "$self->{last_day}");
should be
$self->_save_data($self->{last_year}, $self->{last_month}, $self->{last_day});

$self->_save_data("$1", "$2");
should be
$self->_save_data($1, $2);

etc.

c) Some print statements were writing to STDOUT instead of STDERR, which is fixed in the patch.

d) The use of WebUrl is eliminated; the patch adds a DirLevel counter to build relative URLs (for JavaScript/CSS/images).

This allows you to download the squidanalyzer directory and view it later offline without breaking links. With WebUrl, offline viewing was not possible.

It also allows you to migrate reports from one server to another (even if your IP/BaseURL changes) without breaking links.

e) Sacrifice LANG, locale, etc. and the external date/iconv programs.

In _print_header, date/iconv were spawning 300-600 processes for a high number of users (or maybe many more).

I think this was done just for one print statement in the report, and it was making the program very slow. If we just use strftime, HTML pages are generated almost 4-5 times faster. So I think that's a good sacrifice to make.

f) localtime is an expensive function; try to call it as few times as possible.

g) Backticks consume more resources than system: internally, Perl has to create a pipe, read the output, and store it in memory. So system should be used if you don't want to read the program's output.

Hope this makes sense.

Different use of table attributes

I'm trying to customize the CSS, but on some pages you use HTML attributes which override the CSS settings (cellpadding, cellspacing).

mime_type.html: hardcoded html attributes
<div id="stata"><table class="sortable" cellpadding=1 cellspacing=1 align=center>

network.html: hardcoded html attributes
<div id="stata"><table class="sortable" cellpadding=1 cellspacing=1 align=center>

user.html: OK
<div id="stata"><table class="sortable">

url.html: OK
<div id="stata"><table class="sortable">

domain.html: OK
<div id="stata"><table class="sortable">

Empty graphs

Guys, I had the graphs fully working last week, but now I'm experiencing problems with them: they're all BLANK, no data appears.
I tried to reinstall and generate the HTML files again, and nothing happened.
I checked the timestamp and it's correct; I can see the URLs accessed by each user, but the graphs have no data.
Can anyone help?
Thanks guys

Only shows HITs for the users

Hi,
On the Users tab it displays info about the users, but it only shows HIT requests. I have verified from the squid log that a user shows TCP_MISS, but in the Users data it always shows only HIT. If we check any user (hostname), it only shows HITs; info about MISSes is missing.

Thanks and regards

Feature request: users per URL/domain report

Hi,

would it be possible to add a report where you can click on a URL / domain and you can see a list of the users / IPs that have visited that site, rather than just being directed to the web page?

The reverse is already in place - you can click on a username / IP and see a list of sites they've visited...

I've had a look through the module but my Perl's not that good!

Cheers

Ed

Unable to generate data

Hi, I have set up your script successfully but I can't see anything. I have tried using lightsquid with the same access_log file and that works. I don't know why your script can't generate data from the same access_log file. Thanks for helping me. My access_log output looks like this:
1406806135.267 660 172.16.133.163 TCP_MISS/206 136801 GET http://r2---sn-8qj-nbok.googlevideo.com/videoplayback? - DIRECT/113.171.245.141 video/mp4
1406806136.710 1021 172.16.133.163 TCP_MISS/206 246121 GET http://r2---sn-8qj-nbok.googlevideo.com/videoplayback? - DIRECT/113.171.245.141 video/mp4
1406806137.077 158 172.16.133.163 TCP_MISS/206 81540 GET http://r2---sn-8qj-nbok.googlevideo.com/videoplayback? - DIRECT/113.171.245.141 video/mp4

Install to dir to make a distribution package

Hello!
I need to make a package (SlackBuild) for Slackware Linux; before that, I need to install squidanalyzer into a staging directory instead of the root.

perl Makefile.PL INSTALLDIRS=site LOGFILE=/var/log/squid/access.log BINDIR=/usr/bin CONFDIR=/etc/squidanalyzer HTMLDIR=/home/www/squidreport BASEURL=/squidreport MANDIR=/usr/man/man3 DOCDIR=/usr/share/doc/squidanalyzer DESTDIR=/tmp/dpkgbuild SITEPREFIX=/usr

Everything is good and the program installs to /tmp/dpkgbuild, but how do I change /tmp/dpkgbuild/usr/local to /tmp/dpkgbuild/usr/? How do I remove the local dir from the path?
In Slackware all programs install to /usr, not to /usr/local.

Problem with year statistics

Good Morning,

We have a problem generating year statistics: our logs are very big, and with only 1 month of logs the year statistics take 4-5 hours. How can we disable the year statistics? Or is there another solution.... :)

Feature wish: Statistics for Cache

Hi,

it would be nice to have reports about the cache hit ratio.

Like:

  • Objects from Cache / Web as Pie-Chart
  • Most Hits (Mimetypes / Domain) highest 10 Values as Pie Chart
  • Most Miss (Mimetypes / Domain) highest 10 Values as Pie-Chart
  • 30 Largest Objects in Cache
  • 30 Oldest Objects
  • 30 Most frequented Objects
  • Count of Objects in Cache (if possible) may be as Bar Chart per Day/Month
  • Stacked Bar Chart (From Web / From Cache / From Memory) of loaded Objects per Day/Month

So I'm looking forward to the next releases :-)

With best wishes from Bavaria

Thomas

top sites

Would it be possible to have a column that you can click on to view the top users who access the top domains or top URLs?

Exclude regex

Hi,

I use Kerberos auth in squid. How can I exclude computer names ending with $ from showing up in the reports?

example: USER-PC$
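
I am not sure of the exact exclusion syntax in the version I run, but the pattern itself would just be an escaped trailing dollar sign; a tiny Perl check for illustration:

use strict;
use warnings;

# \$ is a literal dollar sign; the final $ anchors it to the end of the name.
for my $login ('USER-PC$', 'jsmith') {
    print "$login: ", ($login =~ /\$$/ ? 'excluded' : 'kept'), "\n";
}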

Sometimes the number of requests in the networks/*.html files is far too high.

Hello.

I have implemented SquidAnalyzer v4.4 on four fairly busy proxy servers, which are running RHEL4.
The number of requests going through each server is between 1.5 million and 3 million per day.
The squid-analyzer script runs once per hour.

Most of the reports are fine, but I have noticed a discrepancy with the individual subnet reports in networks/*.html

Sometimes, the number of requests shown in the table is far higher than it should be.
It doesn't always happen, so I've tried to trigger the problem with 1,000 lines, then 10,000 lines, then 100,000 lines etc.

I can't get the problem to happen predictably, but contrary to my earlier reports, I have now seen the problem on RHEL4, Linux Mint 13 and CentOS 6.3, and with SquidAnalyzer v5.0.

I'm continuing my own investigation.

UseClientDNSName

It would be nice to use a timeout on gethostbyaddr (or maybe use Net::DNS) and a DNS cache file.
On very large squid logs, UseClientDNSName is unusable and takes forever.
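
For illustration, a sketch of a timed-out, cached reverse lookup (using core gethostbyaddr with alarm; the real fix might use Net::DNS instead, and the cache here is in-memory rather than the suggested cache file):

use strict;
use warnings;
use Socket qw(inet_aton AF_INET);

my %dns_cache;   # in-memory; a persistent cache file would survive runs

sub lookup_with_timeout {
    my ($ip, $timeout) = @_;
    return $dns_cache{$ip} if exists $dns_cache{$ip};
    my $name = eval {
        local $SIG{ALRM} = sub { die "dns timeout\n" };
        alarm $timeout;
        my $n = gethostbyaddr(inet_aton($ip), AF_INET);
        alarm 0;
        $n;
    };
    alarm 0;                          # clear the timer on failure too
    return $dns_cache{$ip} = $name;   # failures are cached as undef
}

print lookup_with_timeout('8.8.8.8', 2) // '(no PTR)', "\n";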

If you change the Lang option, graphics do not appear

Even setting the charset does not work. I decided to fix all the accents, and that did not work either :-( the graphics do not appear. The corrected file is below.

pt_BR.txt

#------------------------------------------------------------------------------
# This is the translation file of the SquidAnalyzer program. The first column
# represents the program's access key to the translated string and the second
# column is the translated string itself.
# Keys should not be modified and are case sensitive. The column separator
# is the tabulation character.
#
# The special tags %s and %d in the translated string are used by the program
# to insert dynamic values. Depending on the language, their place in the
# string may vary.
#
# Author: Alexandre Sieira <[email protected]>
# Last change: Sep 26th, 2013.
# Revision: Joseh-Henrique Caetano de Brito e Silva <[email protected]>
#------------------------------------------------------------------------------
CharSet             iso-8859-15
01              Jan
02              Fev
03              Mar
04              Abr
05              Mai
06              Jun
07              Jul
08              Ago
09              Set
10              Out
11              Nov
12              Dez
KB              Kilo bytes
MB              Mega bytes
GB              Giga bytes
Requests        Acessos
Bytes           Bytes
Megabytes       Mbytes
Total           Total
Years           Anos
Months          Meses
Days            Dias
Hit             Hit
Miss            Miss
Cost            Custo
Users           Usu&aacute;rios
Sites           Sites
Domains         Dom&iacute;nios
Requests_graph  Acessos
Megabytes_graph Mbytes
Months_graph    Meses
Days_graph      Dias
Hit_graph       Hit
Miss_graph      Miss
Total_graph     Total
Domains_graph   Dom&iacute;nios
Users_help      N&uacute;mero total de usu&aacute;rios distintos neste per&iacute;odo.
Sites_help      N&uacute;mero total de sites distintos visitados neste per&iacute;odo.
Domains_help    N&uacute;mero total de dom&iacute;nios de segundo n&iacute;vel distintos visitados neste per&iacute;odo.
Hit_help        Objetos presentes no proxy.
Miss_help       Objetos n&atilde;o encontrados no proxy e buscados externamente.
Cost_help       1 Mbyte =
Generation      Relat&oacute;rio gerado em
Main_cache_title    Estat&iacute;sticas do Proxy
Cache_title         Estat&iacute;sticas do Proxy em
Stat_label          estat&iacute;sticas
Network_link        Redes
User_link           Usu&aacute;rios
Top_url_link        Top URLs
Top_domain_link     Top Dom&iacute;nios
Mime_link           Tipos MIME
Back_link           Voltar
Graph_cache_hit_title       Acessos %s em
Graph_cache_byte_title      Bytes Transferidos %s em
Hourly              por Hora
Hours               Horas
Daily               por Dia
Days                Dias
Monthly             por M&ecirc;s
Months              Meses
Mime_title          Estat&iacute;sticas de tipo MIME em
Mime_number         N&uacute;mero de tipos MIME
Network_title       Estat&iacute;sticas de Rede em
Network_number      N&uacute;mero de redes
Duration            Tempo
Time                Tempo
Largest             Maior
Url                 URL
User_title          Estat&iacute;sticas de usu&aacute;rios em
User_number         N&uacute;mero de usu&aacute;rios
Url_Hits_title      As Top %d URLs por n&uacute;mero de acessos em
Url_Bytes_title     As Top %d URLs por bytes transferidos em
Url_Duration_title  As Top %d URLs por tempo de transfer&ecirc;ncia em
Url_number          N&uacute;mero de URLs
Domain_Hits_title   Os Top %d dom&iacute;nios por n&uacute;mero de acessos em
Domain_Bytes_title      Os Top %d dom&iacute;nios por bytes transferidos em
Domain_Duration_title   Os Top %d dom&iacute;nios por tempo de transfer&ecirc;ncia em
Domain_number           N&uacute;mero de dom&iacute;nios
Domain_graph_hits_title     Acessos a Dom&iacute;nios em
Domain_graph_bytes_title    Bytes Transferidos de Dom&iacute;nios em
First_visit         Primeira visita
Last_visit          Ultima visita
Globals_Statistics  Estat&iacute;sticas globais
Legend              Legenda
File_Generated      Gerado pela
Up_link             Up
Click_year_stat     Click on year's statistics link for details
Mime_graph_hits_title   Acessos a Tipos MIME em
Mime_graph_bytes_title  Bytes Transferidos de Tipos MIME em
User                Usu&aacute;rios
Count               N&uacute;mero

missing translation strings in de_DE.txt

In the German translation file, (at least) 2 strings are not translated:

First_visit Erster Besuch
Last_visit Letzter Besuch

The author of the German translation doesn't use umlauts. I have changed these strings, but if a string is used in a graphic, it is displayed as is. Is this a known bug, or will there be a solution to use umlauts in graphics, too?

e.g. the translation "Taegliche Zugriffsstatistik" should be "Tägliche Zugriffsstatistik".

I like your application. Great work!

Cleaning old stats

What is the best way to clean old statistics, say older than 3 months, without having broken HTML links within HTML pages?

That is my current question, as I have more than 2-3 GB of data and I want to clean it all safely, without breaking the dat files maintained by squidanalyzer.

I also have a suggestion. Can we have config parameter like say:
logrotate NNN

where NNN is number of days.

Once NNN days pass, stats/logs/indexes/dat files older than NNN days get deleted automatically, and the HTML pages are also updated automatically, i.e. links to old dates are removed.
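
For illustration, a sketch of what such a retention option could do (the "logrotate NNN" name and the year/month/day report layout are assumptions; the HTML indexes would still need regeneration afterwards):

use strict;
use warnings;
use POSIX qw(strftime);
use File::Path qw(remove_tree);

my ($output_dir, $keep_days) = ('/var/www/squidreport', 90);   # NNN = 90
my $cutoff = strftime('%Y%m%d', localtime(time - $keep_days * 86400));

# Walk the year/month/day report directories and prune the old ones.
for my $day_dir (glob "$output_dir/*/*/*") {
    my ($y, $m, $d) = (split m{/}, $day_dir)[-3, -2, -1];
    next unless $y =~ /^\d{4}$/ && $m =~ /^\d{2}$/ && $d =~ /^\d{2}$/;
    remove_tree($day_dir) if "$y$m$d" lt $cutoff;
}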

Thanks for this excellent software. (5.0 with JS graphs is amazing!)

Thank you.

Weekly analysis possible?

Squidanalyzer already has daily and monthly stats (per user/IP)

Is it possible to have weekly stats (the week starting on Sunday or Monday), with each week numbered, say, 1 to 52 in a year?

This helps with checking user access at the end of the week, i.e. an end-of-week audit.

Thanks.

Amm.
