boerderij / varken
Standalone application to aggregate data from the Plex ecosystem into InfluxDB, using Grafana as a frontend.
License: MIT License
I run Tautulli on a port other than 80. Is there a way to add a custom port to the config without having to change the Python script?
requests.exceptions.ConnectionError: HTTPConnectionPool(host='docker.lan', port=80): Max retries exceeded with url: /api/v2?apikey=xxxxxxxxxx&cmd=get_activity (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f80156f6eb8>: Failed to establish a new connection: [Errno 111] Connection refused',))
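A possible workaround, assuming the script builds its requests from a single URL value in configuration.py: the port can ride along inside the URL itself. The variable names below mirror the config shown elsewhere in these issues; the hostname and port are examples, not defaults.

```python
# configuration.py -- sketch; substitute your own host, port, and key.
# Putting the port in the URL avoids the implicit :80 default that
# caused the "Connection refused" above.
tautulli_url = 'http://docker.lan:8181'
tautulli_api_key = 'xxxxxxxxxx'
```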
Is your feature request related to a problem? Please describe.
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
Describe the solution you'd like
A clear and concise description of what you want to happen.
Describe alternatives you've considered
A clear and concise description of any alternative solutions or features you've considered.
Additional context
Add any other context or screenshots about the feature request here.
I was curious whether Linux could be supported without an ASA?
Will allow colorization of missing vs. missing-but-available.
Initial thoughts are to use schedule
After making the necessary changes to configuration.py and attempting to run tautulli.py with 'python3 tautulli.py', I get the following output:
troy@grafana:/opt/plex/scripts$ python3 tautulli.py
Traceback (most recent call last):
  File "tautulli.py", line 54, in <module>
    geodata = GeoLite2db(sessions[session]['ip_address_public'])
  File "tautulli.py", line 31, in GeoLite2db
    os.rename(tempfullpath, dbfile)
NotADirectoryError: [Errno 20] Not a directory: 'GeoLite2-City.tar.gz/GeoLite2-City.mmdb' -> 'GeoLite2-City.mmdb'
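The error suggests the script is trying to os.rename() a path *inside* the downloaded tarball, which cannot work: the archive has to be unpacked first. A minimal sketch of a fix, using only the stdlib (function name and layout are hypothetical, not the project's actual code):

```python
import os
import tarfile

def extract_geolite_db(archive_path, dest_dir='.'):
    """Extract the first .mmdb member from a GeoLite2 tarball.

    os.rename() cannot reach inside an archive (hence the
    NotADirectoryError above); tarfile must unpack it first.
    """
    with tarfile.open(archive_path, 'r:gz') as tar:
        for member in tar.getmembers():
            if member.name.endswith('.mmdb'):
                member.name = os.path.basename(member.name)  # flatten dir prefix
                tar.extract(member, path=dest_dir)
                return os.path.join(dest_dir, member.name)
    raise FileNotFoundError('no .mmdb file found in {}'.format(archive_path))
```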
Use Python's logging module to do proper logging.
Debug should be on by default. Config item to change the level?
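A sketch of what that could look like: a shared logger factory with the level driven by a config value (the 'log_level' idea is hypothetical, not an existing config key).

```python
import logging

def get_logger(name, level='DEBUG'):
    """Return a named logger whose level comes from config.

    Debug is the default, per the note above; an unknown level
    string also falls back to DEBUG.
    """
    logger = logging.getLogger(name)
    if not logger.handlers:  # avoid stacking handlers on re-import
        handler = logging.StreamHandler()
        handler.setFormatter(logging.Formatter(
            '%(asctime)s %(name)s %(levelname)s: %(message)s'))
        logger.addHandler(handler)
    logger.setLevel(getattr(logging, level.upper(), logging.DEBUG))
    return logger
```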
san.py can be completely covered by a Telegraf install, and asa.py/raid_init are out of scope for this project.
The biggest issue with the initial work on this is multiple-server support. Needs more input.
I recently noticed that my influxdb database stopped receiving data from tautulli.py. When I try to run the script manually I receive this message:
Traceback (most recent call last):
  File "tautulli.py", line 59, in <module>
    geodata = GeoLite2db(sessions[session]['ip_address_public'])
  File "tautulli.py", line 36, in GeoLite2db
    geodata = reader.city(ipaddress)
  File "/usr/local/lib/python3.6/dist-packages/geoip2/database.py", line 114, in city
    return self._model_for(geoip2.models.City, 'City', ip_address)
  File "/usr/local/lib/python3.6/dist-packages/geoip2/database.py", line 195, in _model_for
    record = self._get(types, ip_address)
  File "/usr/local/lib/python3.6/dist-packages/geoip2/database.py", line 191, in _get
    "The address %s is not in the database." % ip_address)
geoip2.errors.AddressNotFoundError: The address 172.17.0.1 is not in the database.
I have found that this only occurs when a local stream is being played. I do not use 172.x.x.x anywhere in my network. If all streams are coming over the WAN, the script runs as expected. When streaming on the LAN, Plex shows that my server is "nearby" and Tautulli shows the correct LAN address.
There were Tautulli and Plex updates recently; however, I did not notice the issue until after both, so I am not sure which update (if any) was installed when this issue started.
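The 172.17.0.1 address is Docker's default bridge gateway, and private/LAN addresses are never in the GeoLite2 database. One possible guard, sketched here with the stdlib ipaddress module (the function name is hypothetical; the fallback idea mirrors the tautulli_failback_ip config value shown elsewhere in these issues):

```python
import ipaddress

def resolve_lookup_ip(ip_string, fallback_ip):
    """Return an IP that is safe to pass to the GeoLite2 reader.

    Private addresses (10.x, 172.16-31.x, 192.168.x, including
    Docker's 172.17.x bridge) would raise AddressNotFoundError,
    so substitute a configured fallback address for them.
    """
    try:
        addr = ipaddress.ip_address(ip_string)
    except ValueError:
        return fallback_ip  # unparseable input, use the fallback too
    return fallback_ip if addr.is_private else ip_string
```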
Saw your post on Reddit, which prompted me to install Grafana. I am a complete Grafana noob, but do have Plex, Sonarr, Radarr etc already running. How do I go about getting that data to show up in Grafana?
Thanks!
Fixed in 8a0cae4
Needs to be replaced with #32; they are so out of date they do not work anyway, thus causing more problems than needed.
I have several devices streaming internally on the same network as Plex/Tautulli, as well as a recently deployed DVR. None of these sessions seem to get imported into InfluxDB. They show up within Tautulli when they're active. Is there any way to get these streams imported into InfluxDB as well, even if only under the WAN IP address?
Thanks!
Changes made in 273706d
I noticed that you were pulling geolocation details from http://freegeoip.net and wanted to make sure you were aware that they are closing the public API on July 1st of this year. They are rebranding to https://ipstack.com/. It looks like they will still offer their services for free, but users will have to register and get an API key. If you still want to use them, you will probably need to add their API key to configuration.py with the other API keys.
More details here:
https://github.com/apilayer/freegeoip#readme
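If the project did stay with them, the change might be as small as one extra config value. A hedged sketch (the key name is hypothetical; the endpoint shape follows ipstack's published API):

```python
# configuration.py -- hypothetical addition alongside the other API keys.
ipstack_api_key = 'xxxxxxxxxxxxxxx'
# lookups would then hit e.g. http://api.ipstack.com/<ip>?access_key=<key>
```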
JSON based on the services in use. Initial thoughts: this would be a parallel script.
Add Geoip script to fetch database and refresh it after x time. Defined in the config. Create scheduler instance.
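A stdlib sketch of such a refresh scheduler (function names and the interval parameter are illustrative; the `schedule` library mentioned above would express the same thing as `schedule.every(x).hours.do(refresh_func)`):

```python
import threading

def schedule_refresh(refresh_func, interval_hours):
    """Run refresh_func now, then again every interval_hours.

    Uses a daemon threading.Timer so the scheduler never blocks
    shutdown; interval_hours would come from the config.
    """
    def run():
        refresh_func()
        timer = threading.Timer(interval_hours * 3600, run)
        timer.daemon = True
        timer.start()
    run()
```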
Would love to have a table in Grafana that shows a list of currently pending requests.
Perhaps the requester, the date of the request, and the name of the movie/TV show.
Is there a later version of the template grafana dashboard for this since the screenshot in the readme (https://github.com/DirtyCajunRice/grafana-scripts/blob/master/README.md) seems to include a few more metrics/graphs than the template json file in the dashboards folder?
Hey, I tried some copy-pasting but can't really get it to work.
I want to get the data into Grafana to show what a user has requested, and maybe some more stuff:
```python
def get_user_request():
    # Set the time here so we have one timestamp to work with
    now = now_iso()
    influx_payload = []
    for ombi_url, ombi_api_key, server_id in configuration.ombi_server_list:
        headers = {'X-Api-Key': ombi_api_key}
        get_user_request = requests.get('{}/api/v1/Request/movie'.format(ombi_url),
                                        headers=headers,
                                        verify=configuration.ombi_verify_ssl).json()['records']
        user_request = {d['id']: d for d in get_user_request}
        user_requests = []
        for request in user_request.keys():
            movie_name = user_request[request]['title']
            username = user_request[request]['requestedUser']['userName']
            # 3-tuple here, matching the unpack below (the original
            # appended a 4th duplicate title element, breaking the loop)
            user_requests.append((movie_name, username, user_request[request]['id']))
        for movie_name, username, id in user_requests:
            influx_payload.append(
                {
                    "measurement": "Ombi",
                    "tags": {
                        "type": "User_Request",
                        "ombiId": id,
                        "server": server_id
                    },
                    "time": now,
                    "fields": {
                        "movie_name": movie_name,
                        "username": username
                    }
                }
            )
            print('Movie Name: {0} | Username: {1}'.format(movie_name, username))
        # Empty the list per server or else things get foo bared
        user_requests = []
    return influx_payload
```
Here is the relevant traceback:
Traceback (most recent call last):
  File "ombi.py", line 38, in <module>
    influx.write_points(influx_payload)
  File "/usr/local/lib/python3.5/dist-packages/influxdb/client.py", line 468, in write_points
    tags=tags, protocol=protocol)
  File "/usr/local/lib/python3.5/dist-packages/influxdb/client.py", line 532, in _write_points
    protocol=protocol
  File "/usr/local/lib/python3.5/dist-packages/influxdb/client.py", line 312, in write
    headers=headers
  File "/usr/local/lib/python3.5/dist-packages/influxdb/client.py", line 271, in request
    raise InfluxDBClientError(response.content, response.status_code)
influxdb.exceptions.InfluxDBClientError: 404: <!DOCTYPE html>
<html lang="en">
<head>
....<snip a lot of HTML and JSON>.....
When the HTML doc from the traceback is rendered, this is all that is there... but the rest of the json is in the source of the page:
{{alert.title}}
• Docs
• Support Plans
• Community
• Grafana v4.4.3 (commit: 54c79c5)
• New version available!
I've walked through this but I'm stuck. I get to line 271 in client.py, which is complaining about a 404 response code from a request (response = self._session.request) where url = 'http://192.168.1.70:3000/write'. This is the correct IP and port for my Grafana instance.
I think it's trying to tell me that it cannot find anything at that URL; however, I have the DB configured in Grafana, and testing the connection works. I can also connect to the InfluxDB database plex via the InfluxDB Studio Windows application.
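That 404 with a Grafana HTML page in the body is consistent with the write client being pointed at Grafana (port 3000), which has no /write endpoint, rather than at InfluxDB itself. A sketch of the relevant config values (names match the configuration shown elsewhere in these issues; 8086 is InfluxDB's default HTTP API port):

```python
# configuration.py -- point the write client at InfluxDB, not Grafana.
influxdb_url = '192.168.1.70'  # host only, no scheme or port
influxdb_port = 8086           # InfluxDB's HTTP API, not Grafana's :3000
```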
Add a /metrics endpoint so that Prometheus can scrape it. This way we can choose which TSDB to use.
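A minimal sketch of the rendering half of such an endpoint: turning a measurement's fields into Prometheus text-exposition lines. Function, metric, and label names are illustrative, not an existing API in this repo.

```python
def prometheus_lines(measurement, fields, labels=None):
    """Render a fields dict as Prometheus text-exposition lines.

    Each field becomes `<measurement>_<field>{labels} <value>`,
    which is the format a /metrics handler would serve.
    """
    label_str = ''
    if labels:
        label_str = '{' + ','.join(
            '{}="{}"'.format(k, v) for k, v in sorted(labels.items())) + '}'
    return ['{}_{}{} {}'.format(measurement, name, label_str, value)
            for name, value in sorted(fields.items())]
```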
Describe the bug
A clear and concise description of what the bug is.
To Reproduce
Steps to reproduce the behavior:
Expected behavior
A clear and concise description of what you expected to happen.
Screenshots
If applicable, add screenshots to help explain your problem.
Environment (please complete the following information):
Additional context
Add any other context about the problem here.
The status field is not currently capturing 'Pending' Ombi requests (not approved, but not yet denied either).
Currently:
0 = denied
1 = approved
2 = completed
Need a catch-all that sets 3 for 'Pending'.
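A sketch of that mapping with the catch-all (the state names are illustrative; Ombi's actual request JSON encodes state differently):

```python
# Numeric status codes as listed above; anything unmapped, including
# 'pending', falls through to 3.
STATUS_MAP = {'denied': 0, 'approved': 1, 'completed': 2}

def request_status(state):
    """Map an Ombi request state to its numeric field value."""
    return STATUS_MAP.get(state, 3)
```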
Thanks!
I live in the Netherlands. All streams therefore also come from the Netherlands.
Unfortunately worldmap panel indicates that the streams come from "Utah".
The location in "users online" does indicate the correct location.
Does the panel only work with locations from the United States?
Feature request: add SickChill support, when possible!
Hello @dirtycajunrice, thanks for your work on this.
I'm having a problem and am wondering if you can help. The InfluxDB database is empty, and nothing is coming in from the scripts. I have other Docker containers that work fine with InfluxDB, including one for Plex stats. I created a new DB called plex and redid everything as you explain in the configuration, but there is still nothing in the DB. Can you please see if I'm doing anything wrong? The logs only have this:
My config is this:
########################### INFLUXDB CONFIG ###########################
influxdb_url = 192.168.1.20
influxdb_port = 8086
influxdb_username =
influxdb_password =
############################ SONARR CONFIG ############################
sonarr_server_list = [
('https://sonarr1.domain.tld', 'xxxxxxxxxxxxxxx', '1'),
('https://sonarr2.domain.tld', 'xxxxxxxxxxxxxxx', '2'),
#('https://sonarr3.domain.tld', 'xxxxxxxxxxxxxxx', '3')
]
sonarr_influxdb_db_name = 'plex'
############################ RADARR CONFIG ############################
radarr_server_list = [
('https://radarr1.domain.tld', 'xxxxxxxxxxxxxxx', '1'),
('https://radarr2.domain.tld', 'xxxxxxxxxxxxxxx', '2'),
#('https://radarr3.domain.tld', 'xxxxxxxxxxxxxxx', '3')
]
radarr_influxdb_db_name = 'plex'
############################ OMBI CONFIG ##############################
ombi_url = 'https://ombi.domain.tld'
ombi_api_key = 'xxxxxxxxxxxxxxx'
ombi_influxdb_db_name = 'plex'
########################## TAUTULLI CONFIG ############################
tautulli_url = http://192.168.1.20:8181
tautulli_api_key = xxxxxxxxxxxxxx
tautulli_failback_ip = 192.168.1.20
tautulli_influxdb_db_name = plex
tautulli_verify_ssl = False
I would like to see the value "titleSlug" from the Radarr request added to InfluxDB as soon as the service runs. I don't need to update the field afterwards; I simply need the "titleSlug" value.
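A sketch of what carrying that value through could look like, assuming 'movie' is one entry from Radarr's movie JSON (which exposes both 'title' and 'titleSlug'); the function name and payload shape are illustrative, modeled on the other scripts in this repo:

```python
def radarr_point(movie, server_id, timestamp):
    """Build an InfluxDB point that includes Radarr's titleSlug field.

    'movie' is assumed to be a dict from Radarr's movie API, with at
    least 'title' and 'titleSlug' keys.
    """
    return {
        'measurement': 'Radarr',
        'tags': {'server': server_id},
        'time': timestamp,
        'fields': {
            'name': movie['title'],
            'titleSlug': movie['titleSlug'],
        },
    }
```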
Traceback (most recent call last):
  File "tautulli.py", line 18, in <module>
    sessions = {d['session_id']: d for d in activity['sessions']}
KeyError: 'sessions'
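One defensive sketch: Tautulli's reply may omit the 'sessions' key when the API call fails (bad API key, server unreachable), so a .get() with a default avoids the KeyError. The helper name is hypothetical; the dict comprehension mirrors line 18 of the script.

```python
def get_sessions(activity):
    """Index active sessions by session_id, tolerating error replies.

    Falls back to an empty mapping when the reply has no 'sessions'
    key, instead of raising KeyError.
    """
    return {d['session_id']: d for d in activity.get('sessions', [])}
```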
Remove legacy missing and use missing_days. Initial thoughts are to keep --missing and map it to missing_days since epoch 0.
Hello, long-time user of your dashboard, great work! Initially, in preparation for the changes, I modified tautulli.py to leverage ipstack after I got my API key. This worked fine, but you can only do 100 free lookups. After that, I pip3-installed geoip, got your updated tautulli.py, and ran it. It works for everything except my map.
I checked to see if you had made corresponding changes to the Grafana JSON, but I saw none. In lieu of logs (not sure which ones you'd want or where to get them), I hope the following data helps.
SELECT count("location") AS "metric" FROM "Tautulli" WHERE ("type" = 'Session') AND time >= now() - 1m GROUP BY "region_code"
which in my case returns:
name: Tautulli
tags: region_code=5000473
time metric
---- ------
1531954365899092742 2
This data doesn't plot anything on the map for some reason, though.
If there is anything I can provide or do to assist tshooting, please let me know! Thanks!
There is no option to specify which InfluxDB database to write to; the database requirements are hard-coded, and an admin user is required to create the database. This can be mitigated by introducing a variable in the config.ini file to specify the database. Optionally, include additional steps in the README to pre-configure a database and user, to avoid having to use an admin account. The steps should be identical for the Docker image as well.
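A sketch of what such a config.ini section might look like; every key name here is hypothetical, not an existing file in this repo:

```ini
[influxdb]
url = 192.168.1.20
port = 8086
username = varken
password = secret
; hypothetical key: lets the scripts write to a pre-created database
; owned by a non-admin user
database = plex
```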
Can the Tautulli script be modified to support multiple Tautulli servers? (The Radarr/Sonarr scripts already seem to support multiple servers.)
This is especially useful for those with more than one Plex server: as I understand it, Tautulli can currently only monitor a single Plex server, so two Tautulli instances are required to monitor two Plex servers.
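A sketch of what such a config could look like, mirroring the (url, api_key, server_id) tuple style of the sonarr_server_list shown elsewhere in these issues; the variable name tautulli_server_list is hypothetical:

```python
# configuration.py -- hypothetical, patterned on sonarr_server_list.
tautulli_server_list = [
    ('http://tautulli1.domain.tld:8181', 'xxxxxxxxxxxxxxx', '1'),
    ('http://tautulli2.domain.tld:8181', 'xxxxxxxxxxxxxxx', '2'),
]
```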
Will put it in line with the other code as well
When using a DNS or container name for the InfluxDB URL, like influxdb, I get an error saying it's not correct and needs to be a hostname or IP. When reverting to the Docker image with tag v1.1, it works as normal.
Hi There,
Just a suggestion, but would it be possible to include/merge the following speedtest-cli script into the dirtycajunrice grafana-scripts Docker image?
https://github.com/barrycarey/Speedtest-for-InfluxDB-and-Grafana
I currently run this as a separate Docker container, but it would be good to merge these into a single container and remove the current Cisco ASA dependency for bandwidth statistics from the grafana-scripts Docker image.
Thanks!
Default database retention? (90 days)
Migrate DB if retention is changed?
Hi There,
Just wondering if there is (or are plans for) a Docker image for these cool scripts? It would be good to have this in a Docker container (with appropriate environment variables) to better control deployment and management of the scripts.
Thanks!
Cannot currently get folder info from the parent.