nlamirault / speedtest_exporter
Prometheus exporter for Speedtest metrics
License: Apache License 2.0
When opening a command line from the Docker dashboard, the exec fails with an error saying there is no /bin/sh. The Docker image was nlamirault/speedtest_exporter:latest as of today. From the metrics page:
speedtest_exporter_build_info{branch="",goversion="go1.12.9",revision="",version=""} 1.0
~/Docker peiffer$ docker exec -it 191c5e3b7f162c3120e663c13b8f0a3557372dee3ab2bf730e3652b78192e14a /bin/sh; exit
OCI runtime exec failed: exec failed: container_linux.go:349: starting container process caused "exec: \"/bin/sh\": stat /bin/sh: no such file or directory": unknown
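The usual cause of this error is that the image's final stage is built FROM scratch (or a distroless base), so no shell exists inside the container at all. A minimal sketch of such a Dockerfile; the stage layout and paths here are illustrative assumptions, not taken from the repository's actual Dockerfile:

```dockerfile
# Build stage: compile a fully static binary.
FROM golang:1.12 AS build
WORKDIR /src
COPY . .
RUN CGO_ENABLED=0 go build -o /speedtest_exporter .

# Final stage: contains only the binary, so there is no /bin/sh
# and `docker exec ... /bin/sh` cannot work by design.
FROM scratch
COPY --from=build /speedtest_exporter /usr/bin/speedtest_exporter
ENTRYPOINT ["/usr/bin/speedtest_exporter"]
```

For debugging such a container, `docker logs <container-id>` works where exec'ing a shell cannot.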
There is an error with this repository's Renovate configuration that needs to be fixed. As a precaution, Renovate will stop PRs until it is resolved.
Error type: Cannot find preset's package (github>whitesource/merge-confidence:beta)
It would be awesome if the exporter allowed a set of server IDs to be configured. Currently the list of "closest" servers is chosen; making this configurable would mean being able to test speeds to multiple different locations, allowing testing of transatlantic links, etc.
I am trying to pass the environment variable EXPORTER_SERVERID to force a different server; however, it defaults to the closest server every time:
speedtest:
  tty: true
  stdin_open: true
  expose:
    - 9696
  environment:
    - EXPORTER_SERVERID=1772
    - EXPORTER_PORT=9696
  ports:
    - 9696:9696
  image: nlamirault/speedtest_exporter:0.3.0
  restart: always
  networks:
    - back-tier
log output:
INFO[0000] Starting speedtest exporter (version=, branch=, revision=) source="speedtest_exporter.go:121"
INFO[0000] Build context (go=go1.13.3, user=, date=) source="speedtest_exporter.go:122"
INFO[0000] Setup Speedtest client with interval 1m0s source="speedtest_exporter.go:64"
2020/02/07 18:42:02 Env Report
2020/02/07 18:42:02 -------------------------------
2020/02/07 18:42:02 [User Environment]
2020/02/07 18:42:02 Arch: arm
2020/02/07 18:42:02 OS: linux
2020/02/07 18:42:02 IP: XXX.XXX.XXX.XXX
2020/02/07 18:42:02 Lat: XXXXX
2020/02/07 18:42:02 Lon: XXXXX
2020/02/07 18:42:02 ISP: Spectrum
2020/02/07 18:42:02 Config: http://c.speedtest.net/speedtest-config.php?x=h3HeHkroR6qyjQ0y
2020/02/07 18:42:02 Servers: http://c.speedtest.net/speedtest-servers-static.php?x=EiPLZFaoIb5w6PBy
2020/02/07 18:42:02 User Agent: speedtest_exporter
2020/02/07 18:42:02 -------------------------------
2020/02/07 18:42:02 [args]
2020/02/07 18:42:02 []string{"/usr/bin/speedtest_exporter"}
2020/02/07 18:42:02 --------------------------------
2020/02/07 18:42:04 Sorting all servers by distance...
2020/02/07 18:42:04 Doing 3 runs of {http://cavt.ost.myvzw.com:5060/speedtest/upload.php 33.185 -117.2754 Vista, CA United States US Verizon 15951 16.430756041944438 0}
2020/02/07 18:42:04 Testing latency: Vista, CA (Verizon)
2020/02/07 18:42:04 Run took: 89.057381ms
2020/02/07 18:42:04 Testing latency: Vista, CA (Verizon)
2020/02/07 18:42:04 Run took: 95.598126ms
2020/02/07 18:42:04 Testing latency: Vista, CA (Verizon)
2020/02/07 18:42:04 Run took: 93.695711ms
2020/02/07 18:42:04 Testing latency: Vista, CA (Verizon)
2020/02/07 18:42:04 Run took: 99.654367ms
2020/02/07 18:42:04 Testing latency: Vista, CA (Verizon)
2020/02/07 18:42:04 Run took: 92.692054ms
2020/02/07 18:42:04 Total runs took: 89.057381
2020/02/07 18:42:04 Server latency was ok 89.057381 adding to successful servers list
2020/02/07 18:42:04 Doing 3 runs of {http://esc.speedtest.t-mobile.com:8080/speedtest/upload.php 33.1192 -117.0864 Escondido, CA United States US T-Mobile 15037 19.021629683503186 0}
2020/02/07 18:42:04 Testing latency: Escondido, CA (T-Mobile)
2020/02/07 18:42:04 Get http://esc.speedtest.t-mobile.com:8080/speedtest/latency.txt: dial tcp [2607:fb90:5f0:1ea2::5eed]:8080: connect: cannot assign requested address
I am trying to stop it from attempting to resolve this IPv6 speedtest URL. Docker is having other issues with IPv6.
http://esc.speedtest.t-mobile.com:8080/speedtest/latency.txt
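One thing that stands out in the log above is the `[args]` line: the process was started as `[]string{"/usr/bin/speedtest_exporter"}` with no arguments, so the EXPORTER_* environment variables may simply never reach the binary as configuration. A hedged workaround sketch is to override the container command and pass the setting as a flag; the `-server` and `-port` flag names below are hypothetical and must be checked against the binary's own `-help` output:

```yaml
speedtest:
  image: nlamirault/speedtest_exporter:0.3.0
  # Hypothetical flag names: verify them with `speedtest_exporter -help`.
  command: ["/usr/bin/speedtest_exporter", "-server=1772", "-port=9696"]
  ports:
    - 9696:9696
```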
I build the docker image and start the container like this:
$ docker build -t nlamirault/speedtest_exporter .
…
$ docker run --rm -p 9112:9112 nlamirault/speedtest_exporter
time="2019-05-11T20:02:14Z" level=info msg="Starting speedtest exporter (version=, branch=, revision=)" source="speedtest_exporter.go:121"
time="2019-05-11T20:02:14Z" level=info msg="Build context (go=go1.12, user=, date=)" source="speedtest_exporter.go:122"
time="2019-05-11T20:02:14Z" level=info msg="Setup Speedtest client with interval 1m0s" source="speedtest_exporter.go:64"
2019/05/11 20:02:15 Env Report
2019/05/11 20:02:15 -------------------------------
2019/05/11 20:02:15 [User Environment]
2019/05/11 20:02:15 Arch: amd64
2019/05/11 20:02:15 OS: linux
2019/05/11 20:02:15 IP: xxx
2019/05/11 20:02:15 Lat: xxx
2019/05/11 20:02:15 Lon: xxx
2019/05/11 20:02:15 ISP: Vodafone
2019/05/11 20:02:15 Config: http://c.speedtest.net/speedtest-config.php?x=qzJzNlHL6vKMFsVo
2019/05/11 20:02:15 Servers: http://c.speedtest.net/speedtest-servers-static.php?x=A8I36hx2Wwte8RYn
2019/05/11 20:02:15 User Agent: speedtest_exporter
2019/05/11 20:02:15 -------------------------------
2019/05/11 20:02:15 [args]
2019/05/11 20:02:15 []string{"/go/bin/speedtest_exporter"}
2019/05/11 20:02:15 --------------------------------
2019/05/11 20:02:15 Sorting all servers by distance...
2019/05/11 20:02:15 Doing 3 runs of {xxx}
2019/05/11 20:02:15 Testing latency: xxx (Vodafone xxx)
2019/05/11 20:02:16 Run took: 139.074451ms
2019/05/11 20:02:16 Testing latency: xxx (Vodafone xxx)
2019/05/11 20:02:16 Run took: 36.892258ms
2019/05/11 20:02:16 Testing latency: xxx (Vodafone xxx)
2019/05/11 20:02:16 Run took: 36.307575ms
2019/05/11 20:02:16 Testing latency: xxx (Vodafone xxx)
2019/05/11 20:02:16 Run took: 34.768812ms
2019/05/11 20:02:16 Testing latency: xxx (Vodafone xxx)
2019/05/11 20:02:16 Run took: 36.344563ms
2019/05/11 20:02:16 Total runs took: 34.768812
2019/05/11 20:02:16 Server latency was ok 34.768812 adding to successful servers list
2019/05/11 20:02:16 Doing 3 runs of {xxx}
2019/05/11 20:02:16 Testing latency: xxx (xxx)
2019/05/11 20:02:16 Run took: 183.942836ms
2019/05/11 20:02:16 Testing latency: xxx (xxx)
2019/05/11 20:02:16 Run took: 104.255068ms
2019/05/11 20:02:16 Testing latency: xxx (xxx)
2019/05/11 20:02:16 Run took: 39.84698ms
2019/05/11 20:02:16 Testing latency: xxx (xxx)
2019/05/11 20:02:16 Run took: 39.880487ms
2019/05/11 20:02:16 Testing latency: xxx (xxx)
2019/05/11 20:02:16 Run took: 81.800425ms
2019/05/11 20:02:16 Total runs took: 39.84698
2019/05/11 20:02:16 Server latency was ok 39.846980 adding to successful servers list
2019/05/11 20:02:16 Doing 3 runs of {xxx}
2019/05/11 20:02:16 Testing latency: xxx (xxx)
2019/05/11 20:02:16 Run took: 297.908185ms
2019/05/11 20:02:16 Testing latency: xxx (xxx)
2019/05/11 20:02:17 Run took: 180.862797ms
2019/05/11 20:02:17 Testing latency: xxx (xxx)
2019/05/11 20:02:17 Run took: 77.775783ms
2019/05/11 20:02:17 Testing latency: xxx (xxx)
2019/05/11 20:02:17 Run took: 214.135188ms
2019/05/11 20:02:17 Testing latency: xxx
2019/05/11 20:02:17 Run took: 94.088704ms
2019/05/11 20:02:17 Total runs took: 77.775783
2019/05/11 20:02:17 Server latency was ok 77.775783 adding to successful servers list
2019/05/11 20:02:17 Server: {xxx}
time="2019-05-11T20:02:17Z" level=info msg="Test server: {xxx}
time="2019-05-11T20:02:17Z" level=info msg="Register exporter" source="speedtest_exporter.go:130"
time="2019-05-11T20:02:17Z" level=info msg="Listening on :9112" source="speedtest_exporter.go:144"
time="2019-05-11T20:02:31Z" level=info msg="Speedtest exporter starting" source="speedtest_exporter.go:88"
Then I query /metrics:
$ curl -m 3000 localhost:9112/metrics
curl: (52) Empty reply from server
The speedtest-exporter container logs this and exits:
2019/05/11 20:02:31 Testing download speed
2019/05/11 20:02:31 Starting test at: 2019-05-11 20:02:31.316398567 +0000 UTC m=+16.453172511
.2019/05/11 20:02:36 Starting test at: 2019-05-11 20:02:36.769776246 +0000 UTC m=+21.906550185
2019/05/11 20:02:48 Starting test at: 2019-05-11 20:02:48.779609173 +0000 UTC m=+33.916383142
.2019/05/11 20:03:03 net/http: request canceled (Client.Timeout exceeded while reading body)
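The `Client.Timeout exceeded` line suggests the download test itself timed out inside the collector, which would explain the empty reply to curl. Independently of that, a full speedtest run takes tens of seconds, far beyond Prometheus's default 10-second scrape timeout, so the scrape config needs generous limits in any case. A minimal sketch, with the port taken from the log above and the intervals as illustrative assumptions:

```yaml
# prometheus.yml (fragment)
scrape_configs:
  - job_name: speedtest
    scrape_interval: 30m   # a bandwidth test is expensive; scrape rarely
    scrape_timeout: 2m     # must not exceed scrape_interval
    static_configs:
      - targets: ["localhost:9112"]
```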
Hello,
The default gateway is used, and I have a second one that I want to test too.
Would it be possible to add an option to select the gateway for the speedtest?
Hi,
I get different (lower) measurements with the speedtest version used in this exporter.
When using this speedtest for absolute measurements, I advise validating the results with speedtest-cli as well as Ookla's official tools before making a judgement on speed.
Andreas
Hi,
The links to your precompiled binaries no longer exist:
https://bintray.com/artifact/download/nlamirault/oss/speedtest_exporter-0.3.0_linux_arm
I'm sorry if this is a basic thing. I'm new to Prometheus. I have been trying to install this for over an hour. What am I supposed to do with the binary? I don't see a way to install it with Prometheus, and I haven't been able to extract anything from it.
Sys Info:
System:
  OS: Linux 4.15 Ubuntu 18.04.1 LTS (Bionic Beaver)
  CPU: x64 Intel(R) Core(TM)2 Duo CPU E8400 @ 3.00GHz
  Memory: 287.60 MB / 7.63 GB
  Container: Yes
  Shell: 4.4.19 - /bin/bash
Binaries:
  Node: 8.12.0 - /usr/local/bin/node
  Yarn: 1.10.1 - /usr/local/bin/yarn
  npm: 6.4.1 - /usr/local/bin/npm
Utilities:
  Make: 4.1 - /usr/bin/make
  GCC: 7.3.0 - /usr/bin/gcc
  Git: 2.17.1 - /usr/bin/git
IDEs:
  Nano: 2.9.3 - /bin/nano
  Vim: 8.0 - /usr/bin/vim
Languages:
  Bash: 4.4.19 - /bin/bash
  Perl: 5.26.1 - /usr/bin/perl
  Python: 2.7.15 - /usr/bin/python
  Ruby: 2.5.1 - /usr/bin/ruby
Databases:
  MongoDB: 3.4.15 - /usr/bin/mongo
This issue lists Renovate updates and detected dependencies. Read the Dependency Dashboard docs to learn more.
These are blocked by an existing closed PR and will not be recreated unless you click a checkbox below.
Dockerfile
.github/workflows/draft-labels.yml
  jinmayamashita/ready-for-review 1.0.0
.github/workflows/gitleaks.yml
  actions/checkout v3
.github/workflows/gorelease.yaml
  actions/checkout v3
  actions/setup-go v3
  release-drafter/release-drafter v5.22.0
  goreleaser/goreleaser-action v4
.github/workflows/prow-labels.yml
.github/workflows/prow-lgtm-merge.yml
  jpmcb/prow-github-actions v1.1.3
.github/workflows/prow-lgtm-pull.yml
  jpmcb/prow-github-actions v1.1.3
.github/workflows/prow.yml
  jpmcb/prow-github-actions v1.1.3
.github/workflows/rebase.yml
  actions/checkout v3
  cirrus-actions/rebase 1.8
.github/workflows/release-drafter.yml
  release-drafter/release-drafter v5.22.0
.github/workflows/renovate.yml
  actions/checkout v3
  peter-evans/create-pull-request v4
.github/workflows/size.yaml
  actions/checkout v3
  actions-ecosystem/action-size v2
  actions-ecosystem/action-remove-labels v1
  actions-ecosystem/action-add-labels v1
go.mod
  go 1.16
  github.com/dchest/uniuri v0.0.0-20160212164326-8902c56451e9@8902c56451e9
  github.com/prometheus/client_golang v0.9.4
  github.com/prometheus/common v0.4.1
  github.com/zpeters/speedtest v1.0.3
According to https://github.com/zpeters/speedtest :
Notice
I no longer have time to actively develop this code. There are currently multiple speed issues that I don't have time to track down. It appears that Ookla is working on a beta site that will completely change the interface, which would require a pretty much complete rewrite. For the moment I will accept patches, but I won't be actively doing any development.
Other implementations:
Hi, first of all - THANKS.
After some initial problems, I created an init service for the speedtest and now it's running great.
Question: is there any option to define the interval of the test?
I think it's running continuously, right? I need to run the test just every hour or so.
Thanks, T'
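Not an authoritative answer, but the startup logs elsewhere in this tracker show `Setup Speedtest client with interval 1m0s`, which suggests the client has a built-in test interval; whether and how it is configurable would need to be checked against the exporter's `-help` output. On the Prometheus side, at least, the results can be recorded only hourly with a per-job scrape interval. A sketch, with the target address as an assumption:

```yaml
# prometheus.yml (fragment): record speedtest metrics once per hour.
scrape_configs:
  - job_name: speedtest
    scrape_interval: 1h
    scrape_timeout: 2m
    static_configs:
      - targets: ["localhost:9696"]
```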
With the CLI:
root@nclient-rpi-46ec8662-tainted:~# ./speedtest
Speedtest by Ookla
Server: razorblue - Leeds (id = 17449)
ISP: Sky Broadband
Latency: 11.74 ms (0.83 ms jitter)
Download: 736.45 Mbps (data used: 1.1 GB )
Upload: 108.14 Mbps (data used: 191.4 MB )
Packet Loss: 0.0%
However, the exporter consistently reports less than that. For example:
INFO[0126] Speedtest exporter starting source="speedtest_exporter.go:88"
2022/08/05 09:48:21 Testing download speed
2022/08/05 09:48:21 Starting test at: 2022-08-05 09:48:21.920180567 +0100 BST m=+126.629379847
.2022/08/05 09:48:21 Starting test at: 2022-08-05 09:48:21.96699229 +0100 BST m=+126.676191439
.2022/08/05 09:48:22 Starting test at: 2022-08-05 09:48:22.015578141 +0100 BST m=+126.724777310
.2022/08/05 09:48:22 Starting test at: 2022-08-05 09:48:22.074364678 +0100 BST m=+126.783563847
.2022/08/05 09:48:22 Starting test at: 2022-08-05 09:48:22.145855901 +0100 BST m=+126.855055069
.2022/08/05 09:48:22 Starting test at: 2022-08-05 09:48:22.333710697 +0100 BST m=+127.042909846
.2022/08/05 09:48:22 Starting test at: 2022-08-05 09:48:22.588437363 +0100 BST m=+127.297636513
.2022/08/05 09:48:23 Starting test at: 2022-08-05 09:48:23.011429437 +0100 BST m=+127.720628605
.2022/08/05 09:48:23 Starting test at: 2022-08-05 09:48:23.478775511 +0100 BST m=+128.187974679
.2022/08/05 09:48:24 Starting test at: 2022-08-05 09:48:24.091344918 +0100 BST m=+128.800544068
.
INFO[0129] Speedtest Download: 203.33081262373443 Mbps source="client.go:91"
2022/08/05 09:48:25 Testing upload speed
2022/08/05 09:48:25 Starting test at: 2022-08-05 09:48:25.174871121 +0100 BST m=+129.884070363
2022/08/05 09:48:25 Starting test at: 1659689305174871121 (nano)
2022/08/05 09:48:25 Finishing test at: 2022-08-05 09:48:25.221817658 +0100 BST m=+129.931016826
2022/08/05 09:48:25 Finishing test at: 1659689305221817658 (nano)
2022/08/05 09:48:25 Took: 46946463 (nano)
.2022/08/05 09:48:25 Starting test at: 2022-08-05 09:48:25.257852028 +0100 BST m=+129.967051178
2022/08/05 09:48:25 Starting test at: 1659689305257852028 (nano)
2022/08/05 09:48:25 Finishing test at: 2022-08-05 09:48:25.331548176 +0100 BST m=+130.040747345
2022/08/05 09:48:25 Finishing test at: 1659689305331548176 (nano)
2022/08/05 09:48:25 Took: 73696167 (nano)
.2022/08/05 09:48:25 Starting test at: 2022-08-05 09:48:25.402849213 +0100 BST m=+130.112048382
2022/08/05 09:48:25 Starting test at: 1659689305402849213 (nano)
2022/08/05 09:48:25 Finishing test at: 2022-08-05 09:48:25.509839843 +0100 BST m=+130.219039011
2022/08/05 09:48:25 Finishing test at: 1659689305509839843 (nano)
2022/08/05 09:48:25 Took: 106990629 (nano)
.2022/08/05 09:48:25 Starting test at: 2022-08-05 09:48:25.61713788 +0100 BST m=+130.326337048
2022/08/05 09:48:25 Starting test at: 1659689305617137880 (nano)
2022/08/05 09:48:25 Finishing test at: 2022-08-05 09:48:25.759140361 +0100 BST m=+130.468339529
2022/08/05 09:48:25 Finishing test at: 1659689305759140361 (nano)
2022/08/05 09:48:25 Took: 142002481 (nano)
.2022/08/05 09:48:25 Starting test at: 2022-08-05 09:48:25.901137472 +0100 BST m=+130.610336659
2022/08/05 09:48:25 Starting test at: 1659689305901137472 (nano)
2022/08/05 09:48:26 Finishing test at: 2022-08-05 09:48:26.081011454 +0100 BST m=+130.790210622
2022/08/05 09:48:26 Finishing test at: 1659689306081011454 (nano)
2022/08/05 09:48:26 Took: 179873963 (nano)
.
INFO[0130] Speedtest Upload: 72.37444943025025 Mbps source="client.go:93"
2022/08/05 09:48:26 Testing latency: London (BRSK)
2022/08/05 09:48:26 Run took: 16.092685ms
2022/08/05 09:48:26 Testing latency: London (BRSK)
2022/08/05 09:48:26 Run took: 17.106259ms
2022/08/05 09:48:26 Testing latency: London (BRSK)
2022/08/05 09:48:26 Run took: 17.993722ms
2022/08/05 09:48:26 Testing latency: London (BRSK)
2022/08/05 09:48:26 Run took: 15.864056ms
2022/08/05 09:48:26 Testing latency: London (BRSK)
2022/08/05 09:48:26 Run took: 15.939074ms
INFO[0130] Speedtest Latency: 15.864056 ms source="client.go:100"
INFO[0130] Speedtest results: map[download:%!s(float64=203.33081262373443) ping:%!s(float64=15.864056) upload:%!s(float64=72.37444943025025)] source="client.go:104"
INFO[0130] Speedtest exporter finished source="speedtest_exporter.go:98"
$ curl -s nclient-rpi-46ec8662-tainted:9112/metrics | grep "download\|upload"
# HELP speedtest_download Download bandwidth (Mbps).
# TYPE speedtest_download gauge
speedtest_download 203.33081262373443
# HELP speedtest_upload Upload bandwidth (Mbps).
# TYPE speedtest_upload gauge
speedtest_upload 72.37444943025025
Is there some configuration I'm missing?
Speedtest on the command line or in the browser exposes which speedtest site is chosen for the measurement, as well as jitter. It would be useful to also expose those as metrics in the exporter.
@nlamirault, it would be great to have this repository configured to automatically build images on Docker Hub. Currently PRs must be merged here, and then @tnwhitwell must also merge them on his fork to have an available Docker image at https://hub.docker.com/r/tnwhitwell/speedtest_exporter.
Hello and thank you for this interesting plugin.
I have a small problem with the binary. I followed your README.md:
./speedtest_exporter-0.3.0_linux_amd64 -log.level=debug
But I got this error:
flag provided but not defined: -log.level
Sometimes it crashes and throws this error:
2018/01/28 05:14:05 net/http: request canceled (Client.Timeout exceeded while reading body)
Hi,
Are there any plans to provide a docker image for the app, and perhaps also a Helm chart? :)
Hello,
Tonight I tried your project and ran into the following error.
2022/10/05 23:25:49 User Agent: speedtest_exporter
2022/10/05 23:25:49 -------------------------------
2022/10/05 23:25:49 [args]
2022/10/05 23:25:49 []string{"./speedtest_exporter"}
2022/10/05 23:25:49 --------------------------------
ERRO[0000] Can't create exporter : Can't create the Speedtest client: expected element type <settings> but have <html> source="speedtest_exporter.go:127"
I hit the same issue using the Docker image (Dockerfile) and when building from source.
go version go1.19.2 darwin/amd64
Thank you for your work!
Have a nice day
gmake init 0
gmake: /bin/bash: No such file or directory
gmake: *** [Makefile:70: init] Error 127