ma3str0 / kimsufi-crawler
Crawler that will send you an email alert as soon as servers on OVH/Kimsufi become available for purchase
License: MIT License
My solution was a tiny shell script.
#!/bin/bash
#
# Kimsufi server alert: poll the OVH availability endpoint for one
# server model and print zone/availability pairs.
# - sed puts one JSON object per line
# - grep -A8 isolates the block for the watched reference
# - cut pulls the zone and availability string fields
# - the final grep drops zones reported as "unknown"
AVAILABILITY_JSONP="https://ws.ovh.com/dedicated/r2/ws.dispatcher/getAvailability2?callback=Request.JSONP.request_map.request_0"
KS="150sk10"  # internal reference code of the model to watch
wget -q -O- "$AVAILABILITY_JSONP" \
| sed 's/},/},\n/g' \
| grep -A8 "\"reference\":\"${KS}\"" | cut -d'"' -f8,12 \
| tail -n +2 | grep -v "^unknown"
exit 0
Hi,
I tried to get your crawler running on my server, but after following your guide I am getting this error:
[root@titanium kimsufi-crawler]# python crawler.py
2015-03-25 11:23:59,022 Notifier loading failed, check configuration for errors
Traceback (most recent call last):
File "crawler.py", line 166, in <module>
NOTIFIER = getattr(NOTIFIER_MODULE, NOTIFIER_CLASSNAME)(CONFIG)
File "/root/kimsufi-crawler/notifiers/email_notifier.py", line 26, in __init__
super(EmailNotifier, self).__init__(config)
File "/root/kimsufi-crawler/notifiers/base_notifier.py", line 12, in __init__
self.check_requirements()
File "/root/kimsufi-crawler/notifiers/email_notifier.py", line 30, in check_requirements
server = smtplib.SMTP(self.host, self.port)
File "/usr/lib64/python2.7/smtplib.py", line 250, in __init__
(code, msg) = self.connect(host, port)
File "/usr/lib64/python2.7/smtplib.py", line 310, in connect
self.sock = self._get_socket(host, port, self.timeout)
File "/usr/lib64/python2.7/smtplib.py", line 285, in _get_socket
return socket.create_connection((host, port), timeout)
File "/usr/lib64/python2.7/socket.py", line 553, in create_connection
for res in getaddrinfo(host, port, 0, SOCK_STREAM):
gaierror: [Errno -2] Name or service not known
Can you identify this one? Did I do something wrong? I'm running this on CentOS.
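The gaierror above is raised before any SMTP conversation happens: the hostname configured as the SMTP server could not be resolved via DNS. A small sketch (not part of the crawler) to check resolution the same way smtplib does:

```python
import socket

def can_resolve(host, port=587):
    """Return True if the hostname resolves, mirroring the lookup that
    smtplib.SMTP(host, port) performs before connecting."""
    try:
        socket.getaddrinfo(host, port, 0, socket.SOCK_STREAM)
        return True
    except socket.gaierror:
        return False
```

If this returns False for the configured `from_smtp_host`, the usual causes are a typo in the config value or broken DNS on the box.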
It seems to be giving false positives. I added some debugging output to the code, and it doesn't seem to be working at all, actually. I'm a bit confused by this.
Is there any way you can confirm for me that this project is still supposed to work, and if so, can you help me understand what is going on on my end?
---------------------------------------------------
2015-02-20 06:42:27,397 E :: {u'__class': u'dedicatedType:dedicatedAvailability2ZoneStruct', u'zone': u'bhs', u'availability': u'unknown'}
2015-02-20 06:42:27,397 SERVER: KS-1 :: ZONE: bhs
2015-02-20 06:42:27,398 E :: {u'__class': u'dedicatedType:dedicatedAvailability2ZoneStruct', u'zone': u'bhs', u'availability': u'unknown'}
2015-02-20 06:42:27,398 SERVER: KS-1 :: ZONE: rbx
2015-02-20 06:42:27,398 E :: {u'__class': u'dedicatedType:dedicatedAvailability2ZoneStruct', u'zone': u'bhs', u'availability': u'unknown'}
2015-02-20 06:42:27,398 SERVER: KS-1 :: ZONE: sbg
2015-02-20 06:42:27,398 E :: {u'__class': u'dedicatedType:dedicatedAvailability2ZoneStruct', u'zone': u'bhs', u'availability': u'unknown'}
2015-02-20 06:42:27,398 SERVER: KS-3 :: ZONE: bhs
2015-02-20 06:42:27,398 E :: {u'__class': u'dedicatedType:dedicatedAvailability2ZoneStruct', u'zone': u'bhs', u'availability': u'unknown'}
2015-02-20 06:42:27,398 SERVER: KS-3 :: ZONE: rbx
2015-02-20 06:42:27,398 State change - KS-3_available_in_rbx: True
2015-02-20 06:42:27,776 E :: {u'__class': u'dedicatedType:dedicatedAvailability2ZoneStruct', u'zone': u'bhs', u'availability': u'unknown'}
2015-02-20 06:42:27,776 SERVER: KS-3 :: ZONE: sbg
2015-02-20 06:42:27,776 E :: {u'__class': u'dedicatedType:dedicatedAvailability2ZoneStruct', u'zone': u'bhs', u'availability': u'unknown'}
2015-02-20 06:42:27,776 SERVER: KS-3 :: ZONE: bhs
2015-02-20 06:42:27,776 E :: {u'__class': u'dedicatedType:dedicatedAvailability2ZoneStruct', u'zone': u'bhs', u'availability': u'unknown'}
2015-02-20 06:42:27,776 SERVER: KS-3 :: ZONE: rbx
2015-02-20 06:42:27,776 State change - KS-3_available_in_rbx: False
2015-02-20 06:42:27,776 E :: {u'__class': u'dedicatedType:dedicatedAvailability2ZoneStruct', u'zone': u'bhs', u'availability': u'unknown'}
2015-02-20 06:42:27,776 SERVER: KS-3 :: ZONE: sbg
Thanks
B
What should I do? Error:
[root@s3 kimsufi-crawler-master]# python crawler.py
2014-10-21 00:14:39,238 Exception in callback <functools.partial object at 0x184d158>
Traceback (most recent call last):
File "/usr/lib64/python2.6/site-packages/tornado/ioloop.py", line 565, in _run_callback
ret = callback()
File "/usr/lib64/python2.6/site-packages/tornado/stack_context.py", line 275, in null_wrapper
return fn(*args, **kwargs)
File "/usr/lib64/python2.6/site-packages/tornado/ioloop.py", line 571, in <lambda>
self.add_future(ret, lambda f: f.result())
File "/usr/lib64/python2.6/site-packages/tornado/concurrent.py", line 109, in result
raise_exc_info(self._exc_info)
File "/usr/lib64/python2.6/site-packages/tornado/gen.py", line 633, in run
yielded = self.gen.send(value)
File "crawler.py", line 104, in run_crawler
update_state(state_id, False)
UnboundLocalError: local variable 'state_id' referenced before assignment
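The UnboundLocalError suggests that `state_id` is assigned inside a conditional branch that was never taken before the `update_state(state_id, False)` call. A minimal illustration of the pattern and the usual fix (the variable values here are hypothetical, not the crawler's actual data):

```python
def broken(found):
    if found:
        state_id = "KS-1_available_in_rbx"
    # When found is False, state_id was never bound in this scope:
    return state_id  # raises UnboundLocalError for found=False

def fixed(found):
    state_id = None  # initialize before any conditional assignment
    if found:
        state_id = "KS-1_available_in_rbx"
    return state_id
```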
Hi, I get this error:
2015-02-15 21:36:52,889 Cannot connect to your SMTP account. Correct your config and try again. Error details:
2015-02-15 21:36:52,917 'bytes' object has no attribute 'encode'
2015-02-15 21:36:52,927 Notifier loading failed, check configuration for errors
Traceback (most recent call last):
File "crawler.py", line 124, in <module>
notifier = getattr(n_module, n_classname)(config)
File "/home/<user>/kimsufi-crawler/notifiers/email_notifier.py", line 26, in __init__
super(EmailNotifier, self).__init__(config)
File "/home/<user>/kimsufi-crawler/notifiers/base_notifier.py", line 12, in __init__
self.check_requirements()
File "/home/<user>/kimsufi-crawler/notifiers/email_notifier.py", line 37, in check_requirements
server.login(self.fromuser, self.frompwd)
File "/usr/lib/python3.4/smtplib.py", line 642, in login
"%s %s" % (AUTH_LOGIN, encode_base64(user.encode('ascii'), eol='')))
AttributeError: 'bytes' object has no attribute 'encode'
However on checking config.json my details are correct:
{
"servers": ["KS-1"],
"zones": ["rbx", "sbg"],
"notifier": "email",
"use_starttls": true,
"from_email": "<EMAIL>",
"from_user": "<EMAIL>",
"from_pwd": "<PASSWORD>",
"from_smtp_host": "smtp.gmail.com",
"from_smtp_port": 587,
"to_email": "<SAME EMAIL>"
}
I also did this: https://support.google.com/accounts/answer/6010255
Any ideas?
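The traceback is not about the config values being wrong: it comes from `smtplib` under Python 3.4, whose `login()` calls `user.encode('ascii')` and therefore expects `str` credentials, while here `self.fromuser` arrived as `bytes`. (Running a codebase written for Python 2 under Python 3.4 is a plausible root cause.) A defensive normalization before `server.login()`, as a sketch rather than the project's code:

```python
def as_text(value, encoding="utf-8"):
    """Python 3's smtplib.login() calls user.encode('ascii'), which
    fails when the configured value is bytes; normalize to str first."""
    if isinstance(value, bytes):
        return value.decode(encoding)
    return value
```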
Basically, it would scan all of the available types and alert on new ones that aren't in that list.
This would allow you to be alerted to 'specials' that aren't in the normal server options as soon as they become available.
I have run the script just now, while I can see available servers in the web interface:
But there are no announcements in the console or in my email:
# python crawler.py
2015-04-18 12:03:22,697 SMTP server check passed
2015-04-18 12:03:22,697 Notification system check passed
2015-04-18 12:03:22,698 Starting IO loop
My config.json:
{
"servers": ["KS-2", "KS-2 SSD", "KS-3"],
"zones": ["rbx", "sbg"],
"notifier": "email",
"from_email": "<gmail-account>",
"from_pwd": "<password>",
"from_smtp_host": "smtp.gmail.com",
"to_email": "<my-email>",
"crawler_interval": 20
}
Here is the content of https://ws.ovh.com/dedicated/r2/ws.dispatcher/getAvailability2 at this time:
Would it be possible to implement a test feature that sends an SMS / Email to see if everything is working correctly? I don't believe I'm getting an e-mail.
A good idea would be to add some kind of support for IRC notifier. Running crawler could spawn an IRC connection, and messages could be delivered as private or channel messages.
Additional config keys would be at least IRC server, IRC nick, message target.
There seems to be a library in pypi to support IRC protocol with tornado: https://pypi.python.org/pypi/tornado-irc/0.0.1
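As a sketch of what the delivery side could look like, independent of any particular IRC library, these are the raw protocol lines needed to push one message. The `nick`/`target` values would come from the proposed config keys (hypothetical names, not the project's API):

```python
def irc_lines(nick, target, message,
              user="kimsufi-crawler", realname="kimsufi-crawler"):
    """Build the raw IRC protocol lines that deliver one message.

    target may be a nick (private message) or a "#channel".
    This is a sketch of the wire format, not a full client: a real
    notifier also has to answer server PINGs and wait for the welcome.
    """
    return [
        "NICK %s" % nick,
        "USER %s 0 * :%s" % (user, realname),
        "PRIVMSG %s :%s" % (target, message),
        "QUIT :done",
    ]
```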
My run failed because Gmail blocked the auth attempt until I specifically whitelisted it.
Is there any chance you can add a function to test the smtp server prior to running?
Something like python crawler.py --test_smtp
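The crawler has no such option today; a minimal argparse sketch of what the proposed flag could look like:

```python
import argparse

def parse_args(argv=None):
    """Sketch of a hypothetical --test_smtp flag for crawler.py."""
    parser = argparse.ArgumentParser(prog="crawler.py")
    parser.add_argument("--test_smtp", action="store_true",
                        help="send one test notification and exit")
    return parser.parse_args(argv)
```

With this in place, startup code could branch on `parse_args().test_smtp`, send a single test mail through the configured notifier, and exit before entering the IO loop.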
When trying to run with the notifiers option on Ubuntu 15.10, I was getting an error telling me to install easygui:
2016-06-24 09:28:01,372 Notifier loading failed, check config for errors
Traceback (most recent call last):
File "crawler.py", line 147, in <module>
NOTIFIER = getattr(_NOTIFIER_MODULE, _NOTIFIER_CLASSNAME)(_CONFIG)
File "/usr/local/basement/kimsufi-crawler/notifiers/base_notifier.py", line 12, in __init__
self.check_requirements()
File "/usr/local/basement/kimsufi-crawler/notifiers/popup_notifier.py", line 19, in check_requirements
"easygui Python library is required for popup notifications. "
Warning: easygui Python library is required for popup notifications. You can install it by running in terminal:
sudo easy_install easygui
This was even though it had been installed via the requirements.txt file. A quick test revealed that easygui was missing a dependency:
Python 2.7.10 (default, Oct 14 2015, 16:09:02)
[GCC 5.2.1 20151010] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> __import__('easygui');
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python2.7/dist-packages/easygui/__init__.py", line 34, in <module>
from .boxes.button_box import buttonbox
File "/usr/local/lib/python2.7/dist-packages/easygui/boxes/button_box.py", line 19, in <module>
import utils as ut
File "/usr/local/lib/python2.7/dist-packages/easygui/boxes/utils.py", line 43, in <module>
raise ImportError("Unable to find tkinter package.")
ImportError: Unable to find tkinter package.
The quick fix was to run sudo apt-get install python-tk python3-tk.
I'm not very familiar with Python, but either hoisting imported-module errors up to crawler.py or adding a check for python-tk seems like a good fix. Additionally, since this is very minor and possibly an edge case, a simple mention in the README might do just as well.
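A probe for Tk availability could look like the following sketch; it checks both the Python 3 and Python 2 module names without crashing, so the popup notifier could fail with a clearer message:

```python
def tk_available():
    """Return True if a Tk binding is importable (easygui needs one).

    Probes both the Python 3 name (tkinter) and the Python 2 name
    (Tkinter), swallowing ImportError rather than propagating it.
    """
    for name in ("tkinter", "Tkinter"):
        try:
            __import__(name)
            return True
        except ImportError:
            pass
    return False
```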
I want to monitor for Europe and Canada servers; how do I do this?
I want to enable email notification and Pushbullet notification at the same time; how do I do this?
I have not yet received a notification for a ks-2a even though it was available while I was running your script. I have tested successfully with ks-3a and ks-2e. I have tried the online notifiers and they seem to have the same problem of not notifying for ks-2a.
I'm sorry if this is not a real problem, but I thought they might have changed the link for ks-2a or something.
Thank you for your script and time!
I have my script set up but I want to make my computer emit an alarm sound as well as have a push notification on screen.
I tried using sys.stdout.write('\a'), but when I put it on line 111 of the script the alarm went off seemingly at random.
It would be nice to have my computer do the checking without my needing to look at the screen.
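If the bell "went off at random", the `'\a'` write probably landed in the polling loop rather than in the branch that fires only when a server becomes available. A small sketch of a bell helper that belongs in the notification path (the function name is hypothetical):

```python
import sys
import time

def ring_bell(times=3, interval=0.5):
    """Emit the terminal bell a few times. Call this from the same
    code path that sends the notification, not from the poll loop,
    so it only sounds when a server actually shows up."""
    for _ in range(times):
        sys.stdout.write('\a')
        sys.stdout.flush()  # the bell is buffered until flushed
        time.sleep(interval)
```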
Did I do something wrong?
{
"servers": ["KS-1"],
"region": "europe",
"notifier": "smsapi",
"smsapi_username": "[email protected]",
"smsapi_password": "<(MD5password)>",
"smsapi_recipient": "4879000****",
"crawler_interval": 30,
"request_timeout": 30
}
$ python crawler.py
2016-06-09 13:59:14,788 Notifier loading failed, check config for errors
Traceback (most recent call last):
File "crawler.py", line 146, in <module>
_NOTIFIER_MODULE = importlib.import_module(_NOTIFIER_FILE)
File "/usr/lib64/python2.7/importlib/__init__.py", line 37, in import_module
__import__(name)
File "/home/iscolor/kimsufi-crawler/notifiers/smsapi_notifier.py", line 6, in <module>
from smsapi import SmsApi
ImportError: cannot import name SmsApi
KS-4B is one of the new 2016 models; it's not listed in https://github.com/MA3STR0/kimsufi-crawler/blob/master/mapping/server_types.json
Would adding it to this JSON file allow it to be searched?
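Assuming server_types.json maps OVH internal reference codes (like the 150sk10 seen in the availability feed earlier on this page) to model names, an added entry might look like the fragment below. The reference code here is only a placeholder; the real one would have to be read out of the getAvailability2 response:

```json
{
    "REFERENCE_CODE_FROM_API": "KS-4B"
}
```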
I have been trying all morning to get this script working on my NAS (Netgear ReadyNAS RN104, running OS6.20). I am under the impression that, given how Python works, there should be minimal disruption running scripts on different devices.
I have configured a gmail account and renamed the file as described in the readme.
I had Python installed from previous script use and installed Tornado as below.
aptitude install python-tornado
The following NEW packages will be installed:
  librtmp0{a} python-pycurl{a} python-tornado
The following packages are RECOMMENDED but will NOT be installed:
  python-mysqldb
0 packages upgraded, 3 newly installed, 0 to remove and 2 not upgraded.
Need to get 402 kB of archives. After unpacking 1,438 kB will be used.
Do you want to continue? [Y/n/?] y
Get: 1 http://mirrors.kernel.org/debian/ wheezy/main librtmp0 armel 2.4+20111222.git4e06e21-1 [58.7 kB]
Get: 2 http://mirrors.kernel.org/debian/ wheezy/main python-pycurl armel 7.19.0-5 [89.3 kB]
Get: 3 http://mirrors.kernel.org/debian/ wheezy/main python-tornado all 2.3-2 [254 kB]
Fetched 402 kB in 11s (34.0 kB/s)
debconf: delaying package configuration, since apt-utils is not installed
Selecting previously unselected package librtmp0:armel.
(Reading database ... 36003 files and directories currently installed.)
Unpacking librtmp0:armel (from .../librtmp0_2.4+20111222.git4e06e21-1_armel.deb) ...
Selecting previously unselected package python-pycurl.
Unpacking python-pycurl (from .../python-pycurl_7.19.0-5_armel.deb) ...
Selecting previously unselected package python-tornado.
Unpacking python-tornado (from .../python-tornado_2.3-2_all.deb) ...
Setting up librtmp0:armel (2.4+20111222.git4e06e21-1) ...
Setting up python-pycurl (7.19.0-5) ...
Setting up python-tornado (2.3-2) ...
Processing triggers for libc-bin ...
Processing triggers for python-support ...
On running python crawler.py as root I receive the following error -
python crawler.py
Traceback (most recent call last):
File "crawler.py", line 13, in <module>
from tornado.gen import coroutine
ImportError: cannot import name coroutine
I am not sure how to move forward from this point.
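tornado.gen.coroutine first appeared in Tornado 3.0, while the python-tornado package Debian wheezy installs is 2.3-2, so the ImportError is expected; upgrading Tornado (for instance via pip) should fix it. A trivial version-string check illustrating the threshold:

```python
def has_coroutine(version):
    """gen.coroutine exists from Tornado 3.0 onward; compare the
    major component of a version string like tornado.version."""
    major = int(version.split(".")[0])
    return major >= 3

print(has_coroutine("2.3"))    # the wheezy-packaged tornado
print(has_coroutine("4.0.2"))
```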
Would it be possible to implement a feature where I can send emails to multiple recipients, or send multiple notifications via email and SMS at the same time?
Basically, I need the first one; the second can come later.
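The multiple-recipients part is straightforward with the standard library, since `smtplib.SMTP.sendmail()` already accepts a list of recipient addresses. A sketch of building one alert message for several recipients (addresses here are hypothetical examples, not config values):

```python
from email.mime.text import MIMEText

def build_alert(from_email, to_emails, body):
    """Build one message addressed to several recipients.

    The actual delivery would be:
        server.sendmail(from_email, to_emails, msg.as_string())
    where to_emails is a list, not a single string.
    """
    msg = MIMEText(body)
    msg["From"] = from_email
    msg["To"] = ", ".join(to_emails)  # header is display-only
    msg["Subject"] = "Server available"
    return msg
```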
The server types need an update: KS-2SSD etc. are missing.
I can also see that "zone":"fr" is missing from
https://ws.ovh.com/dedicated/r2/ws.dispatcher/getAvailability2
Hello guys,
When I try to start the crawler, I get this error:
root@server:~/kimsufi-crawler# python crawler.py
2015-02-12 21:46:41,663 Cannot connect to your SMTP account. Correct your config and try again. Error details:
2015-02-12 21:46:41,664 character mapping must return integer, None or unicode
2015-02-12 21:46:41,664 Notifier loading failed, check configuration for errors
Traceback (most recent call last):
File "crawler.py", line 124, in <module>
notifier = getattr(n_module, n_classname)(config)
File "/root/kimsufi-crawler/notifiers/email_notifier.py", line 26, in __init__
super(EmailNotifier, self).__init__(config)
File "/root/kimsufi-crawler/notifiers/base_notifier.py", line 12, in __init__
self.check_requirements()
File "/root/kimsufi-crawler/notifiers/email_notifier.py", line 37, in check_requirements
server.login(self.fromuser, self.frompwd)
File "/usr/lib/python2.7/smtplib.py", line 598, in login
(code, resp) = self.docmd(encode_cram_md5(resp, user, password))
File "/usr/lib/python2.7/smtplib.py", line 562, in encode_cram_md5
response = user + " " + hmac.HMAC(password, challenge).hexdigest()
File "/usr/lib/python2.7/hmac.py", line 72, in __init__
self.outer.update(key.translate(trans_5C))
TypeError: character mapping must return integer, None or unicode
Python 2.7.3
tornado 4.0.2
I checked my SMTP settings in a local Thunderbird and they are working!
Thanks for your help in advance.
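One plausible reading of the traceback: the server negotiated CRAM-MD5 auth, and Python 2's hmac needs a byte-string key, but json.load() hands the password back as a unicode string, producing exactly this "character mapping" TypeError. Encoding the credentials before smtplib.login() would sidestep it; a sketch of the conversion:

```python
def to_bytes(value, encoding="utf-8"):
    """Python 2's hmac (used by smtplib's CRAM-MD5 auth) requires a
    byte string; config values loaded from JSON arrive as unicode,
    so encode them before passing to server.login()."""
    if isinstance(value, bytes):
        return value
    return value.encode(encoding)
```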
Hello,
I am getting this error:
2019-11-11 18:33:01,560 Starting main loop
2019-11-11 18:33:09,673 No answer from API: {u'answer': None, u'version': u'1.0', u'id': 0, u'error': {u'status': u'252', u'message': u"Unknown function 'dedicated#getAvailability2'", u'__class': u'result:error', u'value': None, u'exceptionType': u'Unavailable'}}
Thank you!
I had an exception when I wanted to use the popup notifier:
2016-12-12 14:47:34,344 Notifier loading failed, check config for errors
Traceback (most recent call last):
File "crawler.py", line 147, in <module>
NOTIFIER = getattr(_NOTIFIER_MODULE, _NOTIFIER_CLASSNAME)(_CONFIG)
File "/home/sylvain/Perso/repositories/kimsufi-crawler-2/notifiers/base_notifier.py", line 12, in __init__
self.check_requirements()
File "/home/sylvain/Perso/repositories/kimsufi-crawler-2/notifiers/popup_notifier.py", line 19, in check_requirements
"easygui Python library is required for popup notifications. "
Warning: easygui Python library is required for popup notifications. You can install it by running in terminal:
sudo easy_install easygui
Then I tried running sudo easy_install easygui, but it seems the bot still wasn't starting.
I'm on Ubuntu 16.04 and I had to install easygui via apt:
sudo apt-get install python-easygui
After that, the bot started normally.
Maybe a little doc improvement?
File "/usr/local/lib/python2.7/dist-packages/tornado/ioloop.py", line 565, in _run_callback
ret = callback()
File "/usr/local/lib/python2.7/dist-packages/tornado/stack_context.py", line 275, in null_wrapper
return fn(*args, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/tornado/ioloop.py", line 571, in <lambda>
self.add_future(ret, lambda f: f.result())
File "/usr/local/lib/python2.7/dist-packages/tornado/concurrent.py", line 109, in result
raise_exc_info(self._exc_info)
File "/usr/local/lib/python2.7/dist-packages/tornado/gen.py", line 633, in run
yielded = self.gen.send(value)
File "crawler.py", line 95, in run_crawler
availability = response_json['answer']['availability']
TypeError: 'NoneType' object has no attribute '__getitem__'
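The crash means `response_json['answer']` was None, which is what the API returns when it reports an error instead of availability data. A guard before indexing would turn the cryptic TypeError into an actionable message; a sketch (the function name is hypothetical, not the crawler's code):

```python
def extract_availability(response_json):
    """Pull the availability list out of a getAvailability2 response,
    raising a readable error when the API answered with an error
    payload such as {"answer": null, "error": {...}}."""
    answer = response_json.get("answer")
    if answer is None:
        err = response_json.get("error")
        raise RuntimeError("API returned no answer: %r" % (err,))
    return answer["availability"]
```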
python crawler.py
2019-09-12 11:59:10,063 Notifier loading failed, check config for errors
Traceback (most recent call last):
File "crawler.py", line 190, in <module>
NOTIFIER = getattr(_NOTIFIER_MODULE, _NOTIFIER_CLASSNAME)(_CONFIG)
File "/home/jkvint/kimsufi-crawler/notifiers/telegram_notifier.py", line 18, in __init__
super().__init__(config)
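The traceback is cut off, but the line it stops at, `super().__init__(config)`, uses zero-argument super(), which exists only in Python 3; under Python 2 that call fails. A minimal sketch of the form that works on both interpreters (class names here are hypothetical stand-ins, not the notifier's real classes):

```python
class Base(object):
    def __init__(self, config):
        self.config = config

class TelegramLike(Base):
    def __init__(self, config):
        # super().__init__(config) is Python 3 only; the explicit
        # two-argument form works on both Python 2 and Python 3:
        super(TelegramLike, self).__init__(config)
```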
I'm missing availabilities because the servers apparently get ordered (perhaps programmatically) in less time than the 15 seconds it takes for the crawler to poll again.
I know there's a rate limit set up by OVH, but I believe it allows polling more often than every 15 seconds
(I think it's 500 calls per hour-long slot).
When the server returns an error (for example when the request quota is exceeded), we may think the crawler is working fine while it isn't. What would be great is logging the error content.
What would be even better is waiting until the quota is replenished whenever it's exceeded.
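The wait-for-quota idea could be sketched as exponential backoff on server errors; the numbers below are arbitrary starting points, not OVH's documented quota:

```python
def backoff_delays(base=20, cap=600, factor=2):
    """Yield an ever-growing polling delay (seconds), doubling after
    each server error up to a ceiling, so the crawler backs off
    instead of hammering an API that is already refusing it."""
    delay = base
    while True:
        yield delay
        delay = min(delay * factor, cap)
```

Usage would be: on each HTTP error, `time.sleep(next(delays))`; on the first successful response, discard the generator and resume the normal interval.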
These are the errors I get after leaving the crawler running for a few weeks. Not once has it sent me an alert.
2016-12-18 18:04:07,957 Too many HTTP Errors: [HTTPError(500, 'Internal Server Error', HTTPResponse(_body=None,buffer=<_io.BytesIO object at 0x7f8ff2330dd0>,code=500,effective_url='https://ws.ovh.com/dedicated/r2/ws.dispatcher/getAvailability2',error=HTTPError(...),headers=<tornado.httputil.HTTPHeaders object at 0x7f8ff5e9b9d0>,reason='Internal Server Error',request=<tornado.httpclient.HTTPRequest object at 0x7f8ff2dc8850>,request_time=0.3997790813446045,time_info={})), HTTPError(500, 'Internal Server Error', HTTPResponse(_body=None,buffer=<_io.BytesIO object at 0x7f8ff2330e30>,code=500,effective_url='https://ws.ovh.com/dedicated/r2/ws.dispatcher/getAvailability2',error=HTTPError(...),headers=<tornado.httputil.HTTPHeaders object at 0x7f8ff5e9b990>,reason='Internal Server Error',request=<tornado.httpclient.HTTPRequest object at 0x7f8ff2b49b90>,request_time=0.40337395668029785,time_info={})), HTTPError(500, 'Internal Server Error', HTTPResponse(_body=None,buffer=<_io.BytesIO object at 0x7f8ff2330e90>,code=500,effective_url='https://ws.ovh.com/dedicated/r2/ws.dispatcher/getAvailability2',error=HTTPError(...),headers=<tornado.httputil.HTTPHeaders object at 0x7f8ff24befd0>,reason='Internal Server Error',request=<tornado.httpclient.HTTPRequest object at 0x7f8ff2b49e10>,request_time=0.4039421081542969,time_info={})), HTTPError(500, 'Internal Server Error', HTTPResponse(_body=None,buffer=<_io.BytesIO object at 0x7f8ff2330ef0>,code=500,effective_url='https://ws.ovh.com/dedicated/r2/ws.dispatcher/getAvailability2',error=HTTPError(...),headers=<tornado.httputil.HTTPHeaders object at 0x7f8ff2b681d0>,reason='Internal Server Error',request=<tornado.httpclient.HTTPRequest object at 0x7f8ff24be710>,request_time=0.39690589904785156,time_info={})), HTTPError(500, 'Internal Server Error', HTTPResponse(_body=None,buffer=<_io.BytesIO object at 
0x7f8ff2330d70>,code=500,effective_url='https://ws.ovh.com/dedicated/r2/ws.dispatcher/getAvailability2',error=HTTPError(...),headers=<tornado.httputil.HTTPHeaders object at 0x7f8ff2333c10>,reason='Internal Server Error',request=<tornado.httpclient.HTTPRequest object at 0x7f8ff24be610>,request_time=0.4051640033721924,time_info={})), HTTPError(500, 'Internal Server Error', HTTPResponse(_body=None,buffer=<_io.BytesIO object at 0x7f8ff2330d10>,code=500,effective_url='https://ws.ovh.com/dedicated/r2/ws.dispatcher/getAvailability2',error=HTTPError(...),headers=<tornado.httputil.HTTPHeaders object at 0x7f8ff2b68410>,reason='Internal Server Error',request=<tornado.httpclient.HTTPRequest object at 0x7f8ff2b495d0>,request_time=0.40108799934387207,time_info={})), HTTPError(500, 'Internal Server Error', HTTPResponse(_body=None,buffer=<_io.BytesIO object at 0x7f8ff2330cb0>,code=500,effective_url='https://ws.ovh.com/dedicated/r2/ws.dispatcher/getAvailability2',error=HTTPError(...),headers=<tornado.httputil.HTTPHeaders object at 0x7f8ff23aaa90>,reason='Internal Server Error',request=<tornado.httpclient.HTTPRequest object at 0x7f8ff2333bd0>,request_time=0.403148889541626,time_info={})), HTTPError(500, 'Internal Server Error', HTTPResponse(_body=None,buffer=<_io.BytesIO object at 0x7f8ff2330c50>,code=500,effective_url='https://ws.ovh.com/dedicated/r2/ws.dispatcher/getAvailability2',error=HTTPError(...),headers=<tornado.httputil.HTTPHeaders object at 0x7f8ff23aa250>,reason='Internal Server Error',request=<tornado.httpclient.HTTPRequest object at 0x7f8ff2b49650>,request_time=0.40256690979003906,time_info={})), HTTPError(500, 'Internal Server Error', HTTPResponse(_body=None,buffer=<_io.BytesIO object at 0x7f8ff2330bf0>,code=500,effective_url='https://ws.ovh.com/dedicated/r2/ws.dispatcher/getAvailability2',error=HTTPError(...),headers=<tornado.httputil.HTTPHeaders object at 0x7f8ff23aa150>,reason='Internal Server Error',request=<tornado.httpclient.HTTPRequest object at 
0x7f8ff23aa8d0>,request_time=0.3977830410003662,time_info={})), HTTPError(500, 'Internal Server Error', HTTPResponse(_body=None,buffer=<_io.BytesIO object at 0x7f8ff2330b90>,code=500,effective_url='https://ws.ovh.com/dedicated/r2/ws.dispatcher/getAvailability2',error=HTTPError(...),headers=<tornado.httputil.HTTPHeaders object at 0x7f8ff23aa710>,reason='Internal Server Error',request=<tornado.httpclient.HTTPRequest object at 0x7f8ff23aa890>,request_time=0.4003868103027344,time_info={})), HTTPError(500, 'Internal Server Error', HTTPResponse(_body=None,buffer=<_io.BytesIO object at 0x7f8ff2330a70>,code=500,effective_url='https://ws.ovh.com/dedicated/r2/ws.dispatcher/getAvailability2',error=HTTPError(...),headers=<tornado.httputil.HTTPHeaders object at 0x7f8ff2333f10>,reason='Internal Server Error',request=<tornado.httpclient.HTTPRequest object at 0x7f8ff23aa6d0>,request_time=0.4134371280670166,time_info={})), HTTPError(500, 'Internal Server Error', HTTPResponse(_body=None,buffer=<_io.BytesIO object at 0x7f8ff2330a10>,code=500,effective_url='https://ws.ovh.com/dedicated/r2/ws.dispatcher/getAvailability2',error=HTTPError(...),headers=<tornado.httputil.HTTPHeaders object at 0x7f8ff3535450>,reason='Internal Server Error',request=<tornado.httpclient.HTTPRequest object at 0x7f8ff23aac50>,request_time=0.4054241180419922,time_info={})), HTTPError(500, 'Internal Server Error', HTTPResponse(_body=None,buffer=<_io.BytesIO object at 0x7f8ff2330ad0>,code=500,effective_url='https://ws.ovh.com/dedicated/r2/ws.dispatcher/getAvailability2',error=HTTPError(...),headers=<tornado.httputil.HTTPHeaders object at 0x7f8ff2b68e90>,reason='Internal Server Error',request=<tornado.httpclient.HTTPRequest object at 0x7f8ff23aacd0>,request_time=0.41061997413635254,time_info={}))]
I keep getting these:
2015-02-17 07:20:51,581 HTTP Error: HTTP 599: Timeout
2015-02-17 07:23:51,506 HTTP Error: HTTP 599: Timeout
2015-02-17 07:24:21,438 HTTP Error: HTTP 599: Timeout
2015-02-17 07:44:51,524 HTTP Error: HTTP 599: Timeout
2015-02-17 07:59:21,704 HTTP Error: HTTP 599: Timeout
2015-02-17 08:05:06,058 HTTP Error: HTTP 500: Internal Server Error
2015-02-17 08:06:51,498 HTTP Error: HTTP 599: Timeout
2015-02-17 11:28:04,438 HTTP Error: HTTP 500: Internal Server Error
2015-02-17 13:18:51,432 HTTP Error: HTTP 599: Timeout
2015-02-17 13:38:04,602 HTTP Error: HTTP 500: Internal Server Error
2015-02-17 17:24:03,527 HTTP Error: HTTP 500: Internal Server Error
Any idea why? It never sends any emails.
Keep the config file minimal: make from_smtp_port optional (defaulting to 587), and drop from_smtp_port and from_user from the config example.

All the listed servers are no longer valid on soyoustart :(
I don't mind updating the server file, but I need some kind of how-to on how the format works.
Implement notification backends as pluggable Python modules.
Load them dynamically on startup, based on the selection in the config.
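The loader could follow the pattern already visible in the tracebacks on this page (importlib.import_module on a notifier file plus getattr for the class). A sketch, assuming the `notifiers/<name>_notifier.py` / `<Name>Notifier` naming convention:

```python
import importlib

def load_notifier(name, config):
    """Dynamically import notifiers/<name>_notifier.py and instantiate
    its <Name>Notifier class, e.g. name="email" -> EmailNotifier.
    The module/class naming convention is assumed here."""
    module = importlib.import_module("notifiers.%s_notifier" % name)
    cls = getattr(module, "%sNotifier" % name.capitalize())
    return cls(config)
```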
From what I see, Kimsufi has added most of the servers twice in the availability table, and because of this the crawler may not detect availability properly... A server was delivered a few days ago and I didn't even get an alert that one was available.