tom-draper / api-analytics

Lightweight monitoring and analytics for API frameworks.

Home Page: https://apianalytics.dev

License: MIT License

Python 10.70% Go 34.95% CSS 0.71% JavaScript 4.52% Svelte 30.42% TypeScript 8.99% HTML 0.08% Rust 5.03% Ruby 3.00% Shell 0.03% C# 1.57%
analytics api api-analytics api-monitoring dashboard fastapi flask server-analytics server-monitoring gin

api-analytics's Introduction

API Analytics

A free and lightweight API analytics solution, complete with a dashboard.

Currently compatible with:

  • Python: FastAPI, Flask, Django and Tornado
  • Node.js: Express, Fastify and Koa
  • Go: Gin, Echo, Fiber and Chi
  • Rust: Actix, Axum and Rocket
  • Ruby: Rails and Sinatra
  • C#: ASP.NET Core

Getting Started

1. Generate an API key

Head to https://apianalytics.dev/generate to generate your unique API key with a single click. This key is used to monitor your API server and should be stored privately. It's also required in order to view your API analytics dashboard and data.

2. Add middleware to your API

Add our lightweight middleware to your API. Almost all processing is handled by our servers so there is minimal impact on the performance of your API.

FastAPI

pip install api-analytics[fastapi]

import uvicorn
from fastapi import FastAPI
from api_analytics.fastapi import Analytics

app = FastAPI()
app.add_middleware(Analytics, api_key=<API-KEY>)  # Add middleware

@app.get('/')
async def root():
    return {'message': 'Hello, World!'}

if __name__ == "__main__":
    uvicorn.run("app:app", reload=True)

Flask

pip install api-analytics[flask]

from flask import Flask
from api_analytics.flask import add_middleware

app = Flask(__name__)
add_middleware(app, <API-KEY>)  # Add middleware

@app.get('/')
def root():
    return {'message': 'Hello, World!'}

if __name__ == "__main__":
    app.run()

Django


pip install api-analytics[django]

Assign your API key to ANALYTICS_API_KEY in settings.py and add the Analytics middleware to the top of your middleware stack.

ANALYTICS_API_KEY = <API-KEY>

MIDDLEWARE = [
    'api_analytics.django.Analytics',  # Add middleware
    ...
]

Tornado


pip install api-analytics[tornado]

Modify your handler to inherit from Analytics. Create an __init__() method that passes the application and response through to the base class, along with your unique API key.

from tornado.ioloop import IOLoop
from tornado.web import Application
from api_analytics.tornado import Analytics

# Inherit from the Analytics middleware class
class MainHandler(Analytics):
    def __init__(self, app, res):
        super().__init__(app, res, <API-KEY>)  # Provide api key
    
    def get(self):
        self.write({'message': 'Hello, World!'})

def make_app():
    return Application([
        (r"/", MainHandler),
    ])

if __name__ == "__main__":
    app = make_app()
    app.listen(8080)
    IOLoop.instance().start()

Express

npm install node-api-analytics

import express from 'express';
import { expressAnalytics } from 'node-api-analytics';

const app = express();

app.use(expressAnalytics(<API-KEY>)); // Add middleware

app.get('/', (req, res) => {
    res.send({ message: 'Hello, World!' });
});

app.listen(8080, () => {
    console.log('Server listening at http://localhost:8080');
})

Fastify

npm install node-api-analytics

import Fastify from 'fastify';
import { fastifyAnalytics } from 'node-api-analytics';

const fastify = Fastify();

fastify.addHook('onRequest', fastifyAnalytics(<API-KEY>)); // Add middleware

fastify.get('/', function (request, reply) {
  reply.send({ message: 'Hello, World!' });
})

fastify.listen({ port: 8080 }, function (err, address) {
  if (err) {
    fastify.log.error(err);
    process.exit(1);
  }
  console.log('Server listening at http://localhost:8080');
})

Koa

npm install node-api-analytics

import Koa from "koa";
import { koaAnalytics } from 'node-api-analytics';

const app = new Koa();

app.use(koaAnalytics(<API-KEY>)); // Add middleware

app.use((ctx) => {
  ctx.body = { message: 'Hello, World!' };
});

app.listen(8080, () =>
  console.log('Server listening at http://localhost:8080')
);

Gin

go get -u github.com/tom-draper/api-analytics/analytics/go/gin

package main

import (
    "net/http"
    "github.com/gin-gonic/gin"
    analytics "github.com/tom-draper/api-analytics/analytics/go/gin"
)

func root(c *gin.Context) {
    jsonData := []byte(`{"message": "Hello, World!"}`)
    c.Data(http.StatusOK, "application/json", jsonData)
}

func main() {
    router := gin.Default()
    
    router.Use(analytics.Analytics(<API-KEY>)) // Add middleware

    router.GET("/", root)
    router.Run(":8080")
}

Echo

go get -u github.com/tom-draper/api-analytics/analytics/go/echo

package main

import (
    "net/http"
    echo "github.com/labstack/echo/v4"
    analytics "github.com/tom-draper/api-analytics/analytics/go/echo"
)

func root(c echo.Context) error {
    jsonData := []byte(`{"message": "Hello, World!"}`)
    return c.JSONBlob(http.StatusOK, jsonData)
}

func main() {
    router := echo.New()

    router.Use(analytics.Analytics(<API-KEY>)) // Add middleware

    router.GET("/", root)
    router.Start(":8080")
}

Fiber

go get -u github.com/tom-draper/api-analytics/analytics/go/fiber

package main

import (
    "github.com/gofiber/fiber/v2"
    analytics "github.com/tom-draper/api-analytics/analytics/go/fiber"
)

func root(c *fiber.Ctx) error {
    return c.JSON(fiber.Map{"message": "Hello, World!"})
}

func main() {
    app := fiber.New()

    app.Use(analytics.Analytics(<API-KEY>)) // Add middleware

    app.Get("/", root)
    app.Listen(":8080")
}

Chi

go get -u github.com/tom-draper/api-analytics/analytics/go/chi

package main

import (
    "net/http"
    analytics "github.com/tom-draper/api-analytics/analytics/go/chi"
    chi "github.com/go-chi/chi/v5"
)

func root(w http.ResponseWriter, r *http.Request) {
    w.Header().Set("Content-Type", "application/json")
    w.WriteHeader(http.StatusOK)
    jsonData := []byte(`{"message": "Hello, World!"}`)
    w.Write(jsonData)
}

func main() {
    router := chi.NewRouter()

    router.Use(analytics.Analytics(<API-KEY>)) // Add middleware

    router.Get("/", root)
    http.ListenAndServe(":8080", router)
}

Actix

cargo add actix-analytics

use actix_web::{get, web, App, HttpServer, Responder, Result};
use serde::Serialize;
use actix_analytics::Analytics;

#[derive(Serialize)]
struct JsonData {
    message: String,
}

#[get("/")]
async fn index() -> Result<impl Responder> {
    let data = JsonData {
        message: "Hello, World!".to_string(),
    };
    Ok(web::Json(data))
}

#[actix_web::main]
async fn main() -> std::io::Result<()> {
    HttpServer::new(|| {
        App::new()
            .wrap(Analytics::new(<API-KEY>))  // Add middleware
            .service(index)
    })
    .bind(("127.0.0.1", 8080))?
    .run()
    .await
}

Axum

cargo add axum-analytics

use axum::{routing::get, Json, Router};
use axum_analytics::Analytics;
use serde::Serialize;
use std::net::SocketAddr;

#[derive(Serialize)]
struct JsonData {
    message: String,
}

async fn root() -> Json<JsonData> {
    let json_data = JsonData {
        message: String::from("Hello World!"),
    };
    Json(json_data)
}

#[tokio::main]
async fn main() {
    let app = Router::new()
        .route("/", get(root))
        .layer(Analytics::new(<API-KEY>));

    let addr = SocketAddr::from(([127, 0, 0, 1], 8080));
    let listener = tokio::net::TcpListener::bind(addr).await.unwrap();
    println!("Server listening at: http://127.0.0.1:8080");
    axum::serve(listener, app).await.unwrap();
}

Rocket

cargo add rocket-analytics

#[macro_use]
extern crate rocket;
use rocket::serde::json::Json;
use serde::Serialize;
use rocket_analytics::Analytics;

#[derive(Serialize)]
pub struct JsonData {
    message: String,
}

#[get("/")]
fn root() -> Json<JsonData> {
    let data = JsonData {
        message: "Hello, World!".to_string(),
    };
    Json(data)
}

#[launch]
fn rocket() -> _ {
    rocket::build()
        .mount("/", routes![root])
        .attach(Analytics::new(<API-KEY>))
}

Rails


gem install api_analytics

Add the analytics middleware to your rails application in config/application.rb.

require 'rails'
require 'api_analytics'

Bundler.require(*Rails.groups)

module RailsMiddleware
  class Application < Rails::Application
    config.load_defaults 6.1
    config.api_only = true

    config.middleware.use ::Analytics::Rails, <API-KEY>  # Add middleware
  end
end

Sinatra

gem install api_analytics

require 'sinatra'
require 'api_analytics'

use Analytics::Sinatra, <API-KEY>  # Add middleware

before do
    content_type 'application/json'
end

get '/' do
    {message: 'Hello, World!'}.to_json
end

ASP.NET Core

dotnet add package APIAnalytics.AspNetCore

using Analytics;
using Microsoft.AspNetCore.Mvc;

var builder = WebApplication.CreateBuilder(args);

var app = builder.Build();

app.UseAnalytics(<API-KEY>); // Add middleware

app.MapGet("/", () =>
{
    return Results.Ok(new { message = "Hello, World!" });
});

app.Run();

3. View your analytics

Your API will now log and store incoming request data on all valid routes. Your logged data can be viewed using two methods:

  1. Through visualizations and statistics on our dashboard
  2. Accessed directly via our data API

You can use the same API key across multiple APIs, but all your data will appear in the same dashboard. We recommend generating a new API key for each additional API server you want analytics for.

Dashboard

Head to https://apianalytics.dev/dashboard and paste in your API key to access your dashboard.

Demo: https://apianalytics.dev/dashboard/demo

[dashboard screenshot]

Data API

Logged data for all requests can be accessed via our REST API. Simply send a GET request to https://apianalytics-server.com/api/data with your API key set as X-AUTH-TOKEN in headers.

Python

import requests

headers = {
    "X-AUTH-TOKEN": <API-KEY>
}

response = requests.get("https://apianalytics-server.com/api/data", headers=headers)
print(response.json())
Node.js

fetch("https://apianalytics-server.com/api/data", {
  headers: { "X-AUTH-TOKEN": <API-KEY> },
})
  .then((response) => response.json())
  .then((data) => console.log(data));
cURL

curl --header "X-AUTH-TOKEN: <API-KEY>" https://apianalytics-server.com/api/data
Parameters

You can filter your data by providing URL parameters in your request.

  • page - the page number, with a max page size of 50,000 (defaults to 1)
  • date - the exact day the requests occurred on (YYYY-MM-DD)
  • dateFrom - a lower bound of a date range the requests occurred in (YYYY-MM-DD)
  • dateTo - an upper bound of a date range the requests occurred in (YYYY-MM-DD)
  • hostname - the hostname of your service
  • ipAddress - the IP address of the client
  • status - the status code of the response
  • location - a two-character location code of the client
  • user_id - a custom user identifier (only relevant if a get_user_id mapper function has been set)

Example:

curl --header "X-AUTH-TOKEN: <API-KEY>" "https://apianalytics-server.com/api/data?page=3&dateFrom=2022-01-01&hostname=apianalytics.dev&status=200&user_id=b56cbd92-1168-4d7b-8d94-0418da207908"
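The same filtered query can be built from Python. This is a sketch using only the standard library to assemble the query string; the filter values are examples taken from the parameter list above.

from urllib.parse import urlencode

# Example filter values from the parameter list above.
params = {
    "page": 3,
    "dateFrom": "2022-01-01",
    "hostname": "apianalytics.dev",
    "status": 200,
}

# Build the full data-API URL; send it with any HTTP client,
# with your API key set as the X-AUTH-TOKEN header.
url = "https://apianalytics-server.com/api/data?" + urlencode(params)
print(url)

With the requests library, the same dict can instead be passed directly via the params argument of requests.get.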

Client ID and Privacy

By default, API Analytics logs and stores the client IP address of all incoming requests made to your API and infers a location (country) from the IP address if possible. This IP address is used as a form of client identification in the dashboard to estimate the number of users accessing your service.

This behaviour can be controlled through a privacy level defined in the configuration of the API middleware. There are three privacy levels, ranging from 0 (default) to a maximum of 2. A privacy level of 1 will disable IP address storing, and a value of 2 will also disable location inference.

Privacy Levels:

  • 0 - The client IP address is used to infer a location and then stored for user identification. (default)
  • 1 - The client IP address is used to infer a location and then discarded.
  • 2 - The client IP address is never accessed and location is never inferred.

from fastapi import FastAPI
from api_analytics.fastapi import Analytics, Config

config = Config()
config.privacy_level = 2  # Disable IP storing and location inference

app = FastAPI()
app.add_middleware(Analytics, api_key=<API-KEY>, config=config)  # Add middleware

With any of these privacy levels, there is the option to define a custom user ID as a function of a request by providing a mapper function in the API middleware configuration. For example, your service may require an API key sent in the X-AUTH-TOKEN header field that can be used to identify a user. In the dashboard, this custom user ID will identify the user in conjunction with the IP address or as an alternative.

from fastapi import FastAPI
from api_analytics.fastapi import Analytics, Config

config = Config()
config.get_user_id = lambda request: request.headers.get('X-AUTH-TOKEN', '')

app = FastAPI()
app.add_middleware(Analytics, api_key=<API-KEY>, config=config)  # Add middleware
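The mapper is simply a callable from the framework's request object to a string. A framework-free sketch of how such a mapper behaves — FakeRequest is a hypothetical stand-in for the real request object, which only needs a headers mapping here:

# FakeRequest is a hypothetical stand-in for the framework's request
# object; only the `headers` attribute is used by the mapper.
class FakeRequest:
    def __init__(self, headers):
        self.headers = headers

# Same mapper shape as in the config example above.
get_user_id = lambda request: request.headers.get('X-AUTH-TOKEN', '')

print(get_user_id(FakeRequest({'X-AUTH-TOKEN': 'abc123'})))  # abc123
print(get_user_id(FakeRequest({})))  # empty string when the header is absent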

Data and Security

All data is stored securely in compliance with The EU General Data Protection Regulation (GDPR).

For any given request to your API, data recorded is limited to:

  • Path requested by client
  • Client IP address (optional)
  • Client operating system
  • Client browser
  • Request method (GET, POST, PUT, etc.)
  • Time of request
  • Status code
  • Response time
  • API hostname
  • API framework (FastAPI, Flask, Express etc.)

Data collected is only ever used to populate your analytics dashboard. All stored data is pseudo-anonymous, with the API key the only link between you and your logged request data. Should you lose your API key, you will have no method to access your API analytics.

Data Deletion

At any time you can delete all stored data associated with your API key by going to https://apianalytics.dev/delete and entering your API key.

API keys and their associated logged request data are scheduled to be deleted after 6 months of inactivity.

Monitoring

Active API monitoring can be set up by heading to https://apianalytics.dev/monitoring to enter your API key. Our servers will regularly ping chosen API endpoints to monitor uptime and response time.


Contributions

Contributions, issues and feature requests are welcome.

  • Fork it (https://github.com/tom-draper/api-analytics)
  • Create your feature branch (git checkout -b my-new-feature)
  • Commit your changes (git commit -am 'Add some feature')
  • Push to the branch (git push origin my-new-feature)
  • Create a new Pull Request

If you find value in my work, consider supporting me.

Buy Me a Coffee: https://www.buymeacoffee.com/tomdraper
PayPal: https://www.paypal.com/paypalme/tomdraper

api-analytics's People

Contributors

1zun4, thor314, tom-draper, twitchax

api-analytics's Issues

Developer Docs

It would be useful if you could add some developer docs for anyone who wants to pull the project and try to run it locally.

A docker-compose file would also be much appreciated.

Cannot Pull API Data from Api

Hello @tom-draper, just wondering why I can't pull data from the API as stated in the README.

response = requests.get("https://api-analytics-server.vercel.app/api/data/", headers=headers)

It says {"message":"Invalid API key."}

(rust) panics on (almost) every request

Throws:

thread 'actix-rt|system:0|arbiter:5' panicked at 'called `Option::unwrap()` on a `None` value', /home/runner/.cargo/registry/src/github.com-1ecc6299db9ec823/actix-analytics-1.0.8/src/analytics.rs:135:14

on almost every request.

Might be a ratelimit?

Setting up own server + extra questions

Hi Tom!

Firstly I really appreciate your work you've done here.
I've tested the solution with one of my APIs, I love the way you're presenting the data.

I have some questions (pointed also on the screen below):

  • Do you have any idea why I cannot see any locations?

  • Should I see 0's at the response times pane? I assume it might be due to most of the requests being CURLs

  • Is there an option/future plan to add an endpoint filtering mechanism? E.g. select endpoint X and see response times timeline only for that endpoint?

  • I can see in the git repo, that there is some monitoring code. I assume it's not yet deployed to the www.apianalytics.dev server?

  • I can see in the git repo that there is a full server code. Maybe you have a guide on how to start it up (ideally I would like to set up the server locally as a separate Docker container).

Please let me know your thoughts! I'm open to collaboration :)


Path for Rocket Analytics should be the route name.

Hello!

Awesome project. For the rocket analytics, the "true path" is recorded. This ends up yielding endpoints in the UI like /api/things/some_unique_id, rather than likely the grouping that most people would want, like /api/things/<id>, which is the route name.

This would make all calls to /api/things/<id> grouped in the UI, which, I believe, is how most people would like to see the data grouped.

This can be fixed for rocket by encoding the route rather than the path (e.g., req.route().as_ref().uri.unwrap_or_else("UNKNOWN")).

I'm happy to contribute, but I wanted to verify that you are OK with this.

Dashboard does not load

Whenever I try to go to the analytics dashboard it is stuck on an infinite loading screen.
This is the snippet for loading the middleware (using FastAPI) app.add_middleware(Analytics, api_key="MY-API-KEY").
By looking at devtools I can see that https://api-analytics-server.vercel.app/api/requests/id-is-here always returns {"message":"Invalid user ID.","status":400}. Am I doing anything wrong or is this an issue with the package?

KeyError: 'HTTP_USER_AGENT'

Oct 26 23:19:55 Neurohost python3[2273106]: KeyError: 'HTTP_USER_AGENT'
Oct 26 23:19:55 Neurohost python3[2273106]: During handling of the above exception, another exception occurred:
Oct 26 23:19:55 Neurohost python3[2273106]: Traceback (most recent call last):
Oct 26 23:19:55 Neurohost python3[2273106]:   File "/usr/local/lib/python3.10/dist-packages/flask/app.py", line 891, in finalize_request
Oct 26 23:19:55 Neurohost python3[2273106]:     response = self.process_response(response)
Oct 26 23:19:55 Neurohost python3[2273106]:   File "/usr/local/lib/python3.10/dist-packages/flask/app.py", line 1267, in process_response
Oct 26 23:19:55 Neurohost python3[2273106]:     response = self.ensure_sync(func)(response)
Oct 26 23:19:55 Neurohost python3[2273106]:   File "/usr/local/lib/python3.10/dist-packages/api_analytics/flask.py", line 24, in on_finish
Oct 26 23:19:55 Neurohost python3[2273106]:     'user_agent': request.headers['user-agent'],
Oct 26 23:19:55 Neurohost python3[2273106]:   File "/usr/local/lib/python3.10/dist-packages/werkzeug/datastructures/headers.py", line 493, in __getitem__
Oct 26 23:19:55 Neurohost python3[2273106]:     return self.environ[f"HTTP_{key}"]
Oct 26 23:19:55 Neurohost python3[2273106]: KeyError: 'HTTP_USER_AGENT'

In my server logs I see this error very often. If I disable analytics, the error disappears and everything works as it should.

Remove/Filter Unwanted Endpoints

There are certain endpoints I want to hide from my dashboard and the overall request count. For example, I have a "/status" endpoint that an uptime bot pings every 60 seconds, and it shows on the dashboard. I don't want the /status endpoint to affect the rest of my analytics. The same goes for endpoints that handle logins or token refreshes, or whatever.

I would like to request that users are able to filter out any number of endpoints so they are not shown on the dashboard at all. But instead of just deleting them, maybe grey them out so I can still see them if I want to.

Missing Requests on Dashboard

There's an issue with logging requests in analytics dashboard.

Sometimes it captures all of the incoming requests, sometimes it doesn't.

Usually, when I request the same API 5 or 6 times, it misses some requests, hence not logging all of them.
I tried generating a new API key to see if the issue is with the key, and also tried installing version 1.1.3, but it didn't help.

axum analytics: Service not implemented

Looks like there may be a version bump needed for axum, and an update to the implementation of the Service trait.

The example with the latest axum version produces this error:

63  |     .layer(Analytics::new(analytics_api_key.unwrap_or("").to_string())) // analytics middleware
    |      ----- ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ the trait `tower_service::Service<axum::http::Request<Body>>` is not implemented for `AnalyticsMiddleware<Route>`
    |      |
    |      required by a bound introduced by this call
    |
    = help: the trait `tower_service::Service<http::request::Request<hyper::body::body::Body>>` is implemented for `AnalyticsMiddleware<S>`
note: required by a bound in `Router::<S>::layer`
   --> /home/thor/.cargo/registry/src/index.crates.io-6f17d22bba15001f/axum-0.7.4/src/routing/mod.rs:278:21
    |
275 |     pub fn layer<L>(self, layer: L) -> Router<S>
    |            ----- required by a bound in this associated function
...
278 |         L::Service: Service<Request> + Clone + Send + 'static,
    |                     ^^^^^^^^^^^^^^^^ required by this bound in `Router::<S>::layer`

Backend down?

Hey!

This tool looks like what I need for my FastAPI server, but after setup and some calls I am getting no metrics in the dashboard or through API calls.

Can someone else report if there are problems?

No requests logged for a long time

Hi, I am using this library to monitor my axum app. I think I added the middleware layer to my main router correctly and sent requests to my web app, but it seems that no requests are ever found.

Does this logging have some update interval, such as once per day, that would stop me from seeing records instantly? Hoping for your reply!

KeyError: 'user-agent'

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/uvicorn/protocols/http/h11_impl.py", line 408, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "/usr/local/lib/python3.10/dist-packages/uvicorn/middleware/proxy_headers.py", line 84, in __call__
    return await self.app(scope, receive, send)
  File "/usr/local/lib/python3.10/dist-packages/fastapi/applications.py", line 1115, in __call__
    await super().__call__(scope, receive, send)
  File "/usr/local/lib/python3.10/dist-packages/starlette/applications.py", line 122, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/usr/local/lib/python3.10/dist-packages/starlette/middleware/errors.py", line 184, in __call__
    raise exc
  File "/usr/local/lib/python3.10/dist-packages/starlette/middleware/errors.py", line 162, in __call__
    await self.app(scope, receive, _send)
  File "/usr/local/lib/python3.10/dist-packages/starlette/middleware/base.py", line 108, in __call__
    response = await self.dispatch_func(request, call_next)
  File "/usr/local/lib/python3.10/dist-packages/api_analytics/fastapi.py", line 25, in dispatch
    'user_agent': request.headers['user-agent'],
  File "/usr/local/lib/python3.10/dist-packages/starlette/datastructures.py", line 568, in __getitem__
    raise KeyError(key)
KeyError: 'user-agent'

I tried integrating this with my FastAPI app, and this is what I get when running it in an AWS environment.

Long loading time

My analytics site has an incredibly long loading time (over 40 seconds)!
https://www.apianalytics.dev/dashboard/fb1f330ea13b4f4c91a8de83895e56e3

That's the analytics for this website: https://markdown-videos-api.jorgenkh.no/
The source code is available here in case you wanna look at it as a part of the debugging: https://github.com/Snailedlt/Markdown-Videos

Here's the console log before loading finishes, after it finishes (the same error continues), and the file it's pointing at: [screenshots attached]

Seems like this is an error, but perhaps adding some form of caching could also speed things up?

maximum response time

I have a POST function that currently takes up to 20 seconds. I see the request to the endpoint in my dashboard, but the response time seems to be 0. Is there some kind of upper limit here?

Rocket rust library?

Will API Analytics have support for the Rocket Rust library? It is quite commonly used. Thanks.

The dashboard loading circle keeps spinning for about 10 minutes

I think it is because the row size is too large. My web application gets about 150k requests, now approximately 300k over 2 days. It took 10 minutes for me to get the data from the API.

And I found that the API directly returns all of this request data, which may be the reason the front-end dashboard loads so slowly, as it has to analyse 30k rows of data.

Maybe summarizing the traffic data hour by hour, and only storing the total count for every endpoint in another database table, would be a wiser way to send preprocessed and simplified data to the dashboard and API? Hoping for your reply! 😊

Dashboard is not loading the data

Hi Tom,

Firstly, I want to express my gratitude for your project, it has proven to be quite valuable to my team. However, I'm facing an issue over the past few days regarding the dashboard not displaying data.

Upon inspecting the network activity, I observe that all the necessary data is being loaded as expected. So, I suspect that the recent commits made to tweak the dashboard might be the root cause of this problem.

Also, the analysis data, from the following endpoint https://www.apianalytics-server.com/api/requests/<secret-key>, is loading correctly and has a size of around 2-3MB. So, the issue doesn't seem to originate from there.

If there's any specific information or action required from my end to assist in resolving this matter, please let me know. I appreciate your help and the work you've put into this project.

Thanks once again.


License and project vision

I had the same idea of building a lightweight middleware that can capture API metrics natively in the service.

  1. What is your vision or goal with this project? Maybe, we can collaborate in building this project.
  2. Also, what is the license for this project?

rust actix library spawns thread on every log request

Hello.

When I was looking into #5 I noticed the "blocking" feature being used and wondered why ... you spawn a thread on every log request?

This might be no issue for services with few requests, but when you have thousands of requests coming in ... well...

`api/data` returns an empty list.

Hi,
I don't seem to be able to get data from https://apianalytics-server.com/api/data. It returns an empty list, even though the dashboard shows records.

Doesn't work with flask

I did as in the example above, but statistics don't work and aren't displayed on the site. There are no errors in the script related to the analytics module.

On a FastAPI-based server everything works fine.

rust actix analytics library uses native-tls by default, unable to use rustls

Hello.

It is not possible to stop reqwest from using native-tls.
reqwest = { version = "0.11", features = ["json", "blocking"] }
defined here: https://github.com/tom-draper/api-analytics/blob/main/analytics/rust/actix/analytics/Cargo.toml

I prefer to use rustls-tls, which can be defined like this:
reqwest = { version = "0.11", default-features = false, features = ["json", "blocking", "rustls-tls"] }

A possible solution to this issue would be to add features to the analytics crate like this:

reqwest = { version = "0.11", default-features = false, features = ["json", "blocking"] }

...

[features]
default = ["native-tls"]

rustls = ["reqwest/rustls-tls"]
native-tls = ["reqwest/default-tls"]

from api_analytics.fastapi import Analytics fails

I get this error:

from api_analytics.fastapi import Analytics
  File "/opt/anaconda3/lib/python3.8/site-packages/api_analytics/fastapi.py", line 4, in <module>
    from api_analytics.core import log_request
  File "/opt/anaconda3/lib/python3.8/site-packages/api_analytics/core.py", line 10, in <module>
    def _post_requests(api_key: str, requests_data: list[dict], framework: str):
TypeError: 'type' object is not subscriptable

issues with analytics dashboard

Hello. I want to use these analytics for my API, but there are the following issues:

  • stops counting at 1000 requests(?)
  • shows every request as an endpoint (causes an extremely long endpoint list)
  • it should be possible to filter out 404s from all data (on my API they cause the success rate to stay at almost 0% and show many endpoints that do not even exist)

[screenshot: endless list of endpoints]
