zabuldon / teslajsonpy
License: Apache License 2.0
According to https://tesla-api.timdorr.com/api-basics/authentication
When the optional login_hint parameter is supplied with the GET request and the email is registered with a Tesla SSO service in another region, this will respond with a 303 HTTP response code (See Other), which redirects you to the Tesla SSO service in that region (e.g. auth.tesla.cn). Should this redirect happen, you should continue using the region-specific Tesla SSO host name in all subsequent steps. An easy way to test this is to use auth.tesla.cn with login_hint set to an email registered under auth.tesla.com.
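The redirect check described above could be sketched like this (a hedged illustration; build_authorize_url and pick_sso_host are hypothetical helpers, not part of teslajsonpy):

```python
# Sketch of the regional-redirect flow described above (an assumption-laden
# illustration, not the library's actual implementation).
from urllib.parse import urlencode

def build_authorize_url(host, login_hint, state="state", challenge="challenge"):
    """Build the /oauth2/v3/authorize URL with the optional login_hint."""
    params = {
        "client_id": "ownerapi",
        "code_challenge": challenge,
        "code_challenge_method": "S256",
        "redirect_uri": f"https://{host}/void/callback",
        "response_type": "code",
        "scope": "openid email offline_access",
        "state": state,
        "login_hint": login_hint,  # triggers a 303 if registered in another region
    }
    return f"https://{host}/oauth2/v3/authorize?{urlencode(params)}"

def pick_sso_host(status_code, location, current_host):
    """On a 303 (See Other), switch to the regional host for all later steps."""
    if status_code == 303 and location:
        return location.split("/")[2]  # e.g. auth.tesla.cn
    return current_host
```

The caller would issue the GET without following redirects, then feed the status code and Location header into pick_sso_host.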
This isn't terribly important... the usage of "Parking Sensor" to indicate the car's shift state could confuse some folks into thinking the sensor has something to do with the Parking Sensors (sonar/ultrasonic sensors) that dot the perimeter of Teslas manufactured after 2014.
Feel free to mark as won't fix.
Reading the odometer from the "vehicle state" endpoint always returns miles, regardless of the GUI setting "gui_distance_units". The value needs to be multiplied by 1.609344 when "gui_distance_units" is "km/hr".
This also goes for the range parameters under "Charge State".
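The requested conversion is a one-liner; a minimal sketch (odometer_in_gui_units is a hypothetical helper, assuming the API's usual field names):

```python
# Sketch of the mile-to-kilometre conversion described above (an assumed
# helper, not part of teslajsonpy). The API always reports miles.
MILES_TO_KM = 1.609344

def odometer_in_gui_units(vehicle_state, gui_settings):
    """Return the odometer reading in the units the car's GUI is set to."""
    miles = vehicle_state["odometer"]
    if gui_settings.get("gui_distance_units") == "km/hr":
        return miles * MILES_TO_KM
    return miles
```

The same conversion would apply to the range fields under "Charge State".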
https://tesla-api.timdorr.com/vehicle/commands/trunk
Would be really nice!
I use Apple Watch integration with the HA Companion app, and would love to just tap a button on the watch to unlock the trunk/frunk.
A switch component for charging would be helpful, so that HA could manage more advanced charging scenarios/rules than the Tesla is able to handle on its own through schedule charging.
Here are some possible usage scenarios of having a charging switch within HA:
I'm getting this error in a live system. Can you please check and provide a fix? Please note there will be vehicles that may not have seat heaters.
File "/workspaces/home-assistant/homeassistant/components/tesla/config_flow.py", line 229, in validate_input
config = await controller.connect(
File "/workspaces/home-assistant/config/teslajsonpy/controller.py", line 316, in connect
self._add_components(car)
File "/workspaces/home-assistant/config/teslajsonpy/controller.py", line 549, in _add_components
self.__components.append(HeatedSeatSwitch(car, self, 'left'))
File "/workspaces/home-assistant/config/teslajsonpy/homeassistant/heated_seats.py", line 47, in __init__
self.__seat_heat_level = data['climate_state'][f'seat_heater_{seat_name}']
KeyError: 'climate_state'
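One possible defensive fix, as a sketch (the helper names are hypothetical; the maintainers' actual patch may differ): guard the climate_state lookup so cars without seat heaters are skipped instead of raising KeyError.

```python
# Sketch of a defensive check for the KeyError above (assumed helpers, not
# the real patch): only add a seat heater switch when the data has the key.
def seat_heater_level(data, seat_name):
    """Return the seat heater level, or None when the car has no such data."""
    climate = data.get("climate_state") or {}
    return climate.get(f"seat_heater_{seat_name}")

def should_add_seat_switch(data, seat_name):
    """True only when the vehicle data actually reports this seat heater."""
    return seat_heater_level(data, seat_name) is not None
```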
I'll spawn a new issue with this comment so please respond there.
Originally posted by @alandtse in #180 (comment)
I can also assist in adding some CI if you enable GitHub Actions. However, I'll need access to add secrets so I can add a personal access token to push to this repo and to add my credentials to push to pypi automatically.
Right now, this library creates entities specifically for HA to consume. This has two downsides that I can see.
controller.list_vehicles()
returned a list of objects for HA and not just a list of vehicles. It also means people not using HA likely won't use this library, which runs against HA's philosophy of requiring the library to be on PyPI so others can reuse it. I think we should move the HA-related architecture into the HA component. I'd be happy to take that project on.
From an initial review, controller, connection, and exception make sense to me in this library. Everything based on vehicle probably should go into HA.
@zabuldon, this is your project to start, do you have any objections to this rearchitecture? I'll plan to start on doing that unless you object.
I can probably take a look at it since I had to do it for alexa_media_player.
A really useful addition to this library would be the ability to set the charge limit as a percentage. I would like to set the maximum charge limit differently on weekends to optimise for differences in required range, and cost of electricity.
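For reference, the owner API documented at tesla-api.timdorr.com exposes this via the command/set_charge_limit endpoint with a percent body. A sketch assuming that documented route (the helper name and the 50-100 clamp are illustrative, not part of teslajsonpy):

```python
# Sketch of a set-charge-limit request builder (hypothetical helper; the
# endpoint follows the tesla-api.timdorr.com documentation).
def set_charge_limit_request(vehicle_id, percent):
    """Return the URL and JSON body for command/set_charge_limit."""
    percent = max(50, min(100, int(percent)))  # API accepts roughly 50-100
    url = (
        f"https://owner-api.teslamotors.com/api/1/vehicles/{vehicle_id}"
        "/command/set_charge_limit"
    )
    return url, {"percent": percent}
```

An automation could then call this with, say, 90 on weekends and 70 on weekdays.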
There appears to be a bug where the list of online_cars is never refreshed.
While debugging to fix #51, I saw the data available from the API.
Would it be a big deal to expose this in HA without any "processing", just so we could start using it to display information?
That could probably bring more interest and contributors to the project.
I'm currently working on an appdaemon app that uses this integration in order to start charging when the utility rate is at the lowest and stop it when it's high.
Is there any way today to force an update of data from the car?
For example, if I want to know how long it will take to charge the car, I simply start charging and wait until I get information on the sensors indicating charge rate and time remaining.
Since this can take some time to refresh after charging has started, I'd like to force an update after, say, 30 seconds or so.
Is this already possible? Otherwise I'm thinking of implementing it, most likely by adding a service (force update/wake up or similar), and would appreciate input/feedback/thoughts on this, or another idea on how it could be solved.
After poking around a bit at the new seat heater functionality (thanks @alandtse and @Trupal00p), I started thinking about current behavior and how it will integrate into HA.
Firstly, the current approach of checking for availability at startup can't work since it doesn't have the data for the car until after the initial setup and then only if the car is awake. My thought is that it should either add those as disabled entities by default and/or have a config option in the integration to enable them. I have a proof of concept on a local branch for the former option.
The other (probably more controversial) option is that we shouldn't actually represent them as switches but as sensors, and then expose a service (tesla.set_heated_seats?) that controls them. There's no entity type that accurately represents the functionality, so anything we do will be a hack. Fans might work as long as they only have the SUPPORT_PRESET_MODE feature, and climate could work if we only set the SUPPORT_PRESET_MODE feature, but either one is a bit dirty IMO. I'm happy to work on an implementation for the service if that would be helpful as well.
I think it would be pretty sweet for the charge port to open automatically when I pull into the garage.
Another thing I noticed in my logging is that a call to get_vehicles is being made every second or so when the car is offline. This seems to be caused by the call in the update function:
if cur_time - last_update > self.update_interval:
    cars = self.get_vehicles()
Wouldn't it be better to update the last_update time also if the car is offline? In that case the next try would happen only after another update_interval period (300 seconds in my config).
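The suggested fix could look roughly like this (a simplified stand-in for the real controller, not its actual code): stamp last_update on every attempt, so an offline car is retried only once per update_interval.

```python
# Sketch of the suggested throttling fix (simplified stand-in for the real
# controller): record last_update even when the car is offline, so the next
# attempt waits a full update_interval instead of retrying every second.
import time

class PollThrottle:
    def __init__(self, update_interval=300):
        self.update_interval = update_interval
        self.last_update = 0.0

    def should_poll(self, now=None):
        """Return True at most once per update_interval."""
        now = time.time() if now is None else now
        if now - self.last_update > self.update_interval:
            # Stamp the attempt time regardless of whether the car answers.
            self.last_update = now
            return True
        return False
```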
Now testing the updated version. For some reason, I still see way too many calls to the Tesla API being made:
2019-02-22 15:15:40 DEBUG (Thread-14) [teslajsonpy.connection] /api/1/vehicles/21382736132137618/data
2019-02-22 15:15:41 DEBUG (Thread-16) [teslajsonpy.connection] /api/1/vehicles/21382736132137618/data
2019-02-22 15:15:42 DEBUG (Thread-7) [teslajsonpy.connection] /api/1/vehicles/21382736132137618/data
2019-02-22 15:15:43 DEBUG (Thread-7) [teslajsonpy.controller] Wrapped <function Controller.update at 0x7fac4c27d510> fails with False
Additional info: args:(<teslajsonpy.controller.Controller object at 0x7fac4c2516d8>, 21382736132137618), kwargs:{'wake_if_asleep': False}, vehicle_id:21382736132137618, _car_online:{21382736132137618: True}
2019-02-22 15:15:43 DEBUG (Thread-7) [teslajsonpy.connection] /api/1/vehicles/21382736132137618/data
2019-02-22 15:16:00 DEBUG (Thread-3) [teslajsonpy.connection] /api/1/vehicles/21382736132137618/data
2019-02-22 15:16:00 DEBUG (Thread-3) [teslajsonpy.controller] Wrapped <function Controller.update at 0x7fac4c27d510> fails with False
Additional info: args:(<teslajsonpy.controller.Controller object at 0x7fac4c2516d8>, 21382736132137618), kwargs:{'wake_if_asleep': False}, vehicle_id:21382736132137618, _car_online:{21382736132137618: True}
2019-02-22 15:16:00 DEBUG (Thread-3) [teslajsonpy.connection] /api/1/vehicles/21382736132137618/data
2019-02-22 15:16:05 DEBUG (Thread-21) [teslajsonpy.connection] /api/1/vehicles/21382736132137618/data
2019-02-22 15:16:05 DEBUG (Thread-21) [teslajsonpy.controller] Wrapped <function Controller.update at 0x7fac4c27d510> fails with False
Additional info: args:(<teslajsonpy.controller.Controller object at 0x7fac4c2516d8>, 21382736132137618), kwargs:{'wake_if_asleep': False}, vehicle_id:21382736132137618, _car_online:{21382736132137618: True}
2019-02-22 15:16:05 DEBUG (Thread-4) [teslajsonpy.connection] /api/1/vehicles/21382736132137618/data
2019-02-22 15:16:06 DEBUG (Thread-4) [teslajsonpy.controller] Wrapped <function Controller.update at 0x7fac4c27d510> fails with False
Additional info: args:(<teslajsonpy.controller.Controller object at 0x7fac4c2516d8>, 21382736132137618), kwargs:{'wake_if_asleep': False}, vehicle_id:21382736132137618, _car_online:{21382736132137618: True}
2019-02-22 15:16:06 DEBUG (Thread-21) [teslajsonpy.connection] /api/1/vehicles/21382736132137618/data
2019-02-22 15:16:07 DEBUG (Thread-4) [teslajsonpy.connection] /api/1/vehicles/21382736132137618/data
2019-02-22 15:16:08 DEBUG (Thread-9) [teslajsonpy.connection] /api/1/vehicles/21382736132137618/data
2019-02-22 15:16:09 DEBUG (Thread-9) [teslajsonpy.controller] Wrapped <function Controller.update at 0x7fac4c27d510> fails with False
Additional info: args:(<teslajsonpy.controller.Controller object at 0x7fac4c2516d8>, 21382736132137618), kwargs:{'wake_if_asleep': False}, vehicle_id:21382736132137618, _car_online:{21382736132137618: True}
2019-02-22 15:16:09 DEBUG (Thread-21) [teslajsonpy.connection] /api/1/vehicles/21382736132137618/data
2019-02-22 15:16:10 DEBUG (Thread-21) [teslajsonpy.controller] Wrapped <function Controller.update at 0x7fac4c27d510> fails with False
Additional info: args:(<teslajsonpy.controller.Controller object at 0x7fac4c2516d8>, 21382736132137618), kwargs:{'wake_if_asleep': False}, vehicle_id:21382736132137618, _car_online:{21382736132137618: True}
2019-02-22 15:16:10 DEBUG (Thread-7) [teslajsonpy.connection] /api/1/vehicles/21382736132137618/data
2019-02-22 15:16:10 DEBUG (Thread-7) [teslajsonpy.controller] Wrapped <function Controller.update at 0x7fac4c27d510> fails with False
Additional info: args:(<teslajsonpy.controller.Controller object at 0x7fac4c2516d8>, 21382736132137618), kwargs:{'wake_if_asleep': False}, vehicle_id:21382736132137618, _car_online:{21382736132137618: True}
2019-02-22 15:16:10 DEBUG (Thread-2) [teslajsonpy.connection] /api/1/vehicles/21382736132137618/data
2019-02-22 15:16:11 DEBUG (Thread-2) [teslajsonpy.controller] Wrapped <function Controller.update at 0x7fac4c27d510> fails with False
Additional info: args:(<teslajsonpy.controller.Controller object at 0x7fac4c2516d8>, 21382736132137618), kwargs:{'wake_if_asleep': False}, vehicle_id:21382736132137618, _car_online:{21382736132137618: True}
These calls are being logged by an additional debugging statement I added to connection.py.
Originally posted by @bart-roos in #16 (comment)
From Elon's tweets you can get the idea that Sentry will have multiple options (always on, home, locations, etc.); we have no clue how long this will take. If you can add it, it would be as simple as "not driving? turn on Sentry Mode", and we could easily say "exclude if in this location" if we wanted to.
One test is failing. This came up during the review for the NixOS package.
Executing pytestCheckPhase
============================= test session starts ==============================
platform linux -- Python 3.8.6, pytest-6.1.2, py-1.9.0, pluggy-0.13.1
rootdir: /build/source, configfile: tox.ini
collected 151 items
tests/test_tesla_exception.py ............... [ 9%]
[...]
tests/unit_tests/homeassistant/test_vehicle_device.py ..Fssss [100%]
=================================== FAILURES ===================================
_____________________________ test_values_on_init ______________________________
monkeypatch = <_pytest.monkeypatch.MonkeyPatch object at 0x7ffff5e3e700>
def test_values_on_init(monkeypatch):
    """Test values after initialization."""
    _mock = TeslaMock(monkeypatch)
    _controller = Controller(None)
    _data = _mock.data_request_vehicle()
    _device = VehicleDevice(_data, _controller)
    assert _device is not None
    assert _device.car_name() is not None
    assert _device.car_name() == "Nikola 2.0"
    assert _device.car_type is not None
    assert _device.car_type == "Model S"
    assert _device.car_version is not None
    assert _device.car_version == ""
    assert _device.id() is not None
    assert _device.id() == 12345678901234567
    assert _device.sentry_mode_available is not None
>   assert _device.sentry_mode_available
E   assert False
E    + where False = <teslajsonpy.homeassistant.vehicle.VehicleDevice object at 0x7ffff5e3eb20>.sentry_mode_available
tests/unit_tests/homeassistant/test_vehicle_device.py:59: AssertionError
=========================== short test summary info ============================
FAILED tests/unit_tests/homeassistant/test_vehicle_device.py::test_values_on_init
=================== 1 failed, 60 passed, 90 skipped in 0.83s ===================
It would be nice if you could have a look. Thanks.
The API seems to expose more info than we are using at the moment. I would like to have a soc_limit as a sensor or attribute in Home Assistant. If this is acceptable, I can create a PR for it. This would be useful if you want to set up your own smart charging.
Given HA is Apache-2.0, I think we should swap this over to Apache-2.0 instead of WTFPL to be consistent.
@zabuldon Any objections since you chose the initial license? I can handle the migration, flagged you so you'd see it.
Hi.
I'd really like to see the HomeLink integration in this API, as described at https://tesla-api.timdorr.com/vehicle/commands/homelink
I see that the tests already include a homelink_device_count = 0.
Are there any plans to support this?
Best regards
It would be nice to be able to read whether a charging cable is attached, so that one can automate reminders for, e.g., attaching the cable.
In the Charge State query, the following two responses would be great to see:
"conn_charge_cable": "IEC" [IEC, SAE]
"charge_port_latch": "Engaged"
Edit:
After investigating the "charge state" response, it would suffice to add a sensor called something like "ChargeCableAttachedSensor", which is basically true when the "charging_state" parameter is not "Disconnected".
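The proposed check is simple enough to sketch (the function name follows the issue's suggestion and is hypothetical; charging_state is the field carried in the charge_state response):

```python
# Sketch of the proposed cable-attached sensor logic (names are the issue's
# suggestion, not an existing teslajsonpy class).
def charge_cable_attached(charge_state):
    """True when a cable is plugged in, per the charge_state response."""
    return charge_state.get("charging_state") not in (None, "Disconnected")
```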
Often when calling set_operation_mode on a Tesla climate entity, the call either fails with an error or does nothing at all.
Would it be possible/sensible to add a vehicle wake-up call to the Climate.set_status call?
I deleted the integration when errors started occurring, restarted Hass, and then tried to re-add. Logs attached from my attempt to re-add. FWIW, had the same problem with and without MFA enabled.
Reported here as well.
Currently, we have 4 ways of unlocking the charge cable on the tesla:
As the API is already available on Tesla's side, would it be possible to make it accessible through this library?
In that case, Home Assistant will be able to share the lock with Google Assistant, making unplugging as easy as asking your phone to do it.
Hello @zabuldon!
I am planning on making some pretty big improvements to the Tesla component in Home Assistant. In order to do them, I need to make several changes to this library. I am going to make these changes regardless, but I wanted to check to see if you're still active and would accept these changes to your library. If not, I'll fork and publish a new lib, but it would be overall better to keep it all here!
Proposed changes:
I haven't written this yet; I am just putting this here to give you time to see it and let me know how you'd like to proceed before I get to the point where I'm merging it in to Hass :)
Today, the polling interval is set back to the default (11 minutes) when the car moves from "Charging" to e.g. "Complete", even if the car is still connected to the charger.
I don't know if this is intentional, but personally I would prefer to keep polling every minute even when charging is done, as long as the car is still connected to the charger.
teslajsonpy/controller.py, line 628 (commit 5889b89)
I don't know the best way to check if the car is still connected to the charger; it has been some years since I played with the API, and I remember from back then that some values could be a bit unreliable and would linger even after they should have been reset.
Some ideas:
If the drop in poll rate is intended, I would like to add something like an "always_poll_on_charger_connected" flag, so the default of not polling is kept, but the user can increase the polling frequency if they want.
I've been on the forum and on Discord but could not find any similar issue. I am using the HA component just fine; everything shows up and works, but the one thing missing is the device_tracker for the car. I tried the JavaScript library (teslams) and it does show my car correctly. Any thoughts on how to debug this further?
By querying https://owner-api.teslamotors.com/api/1/products
I can get a list of all Powerwalls linked to an account, along with some basic info such as charge rate and level.
I'd like to get this into HA, and since it uses this library it makes sense for me to add it here - the login mechanism etc are exactly the same. But before I add the code I just wanted to check if this is still active and if such a PR would be accepted?
I was thinking about adding support for Tesla Solar, it would be nice to pull the current status of my solar system and then integrate it into Home Assistant.
It looks like you might be in the middle of refactoring; should I wait a bit before I start digging in, or go ahead and attempt something?
When using the API to poll the car, if the car is asleep, it will be woken. When using teslajsonpy with things like Home Assistant, which periodically polls to make graphs of the battery level for example, it keeps waking the car to get the battery status. This results in large "vampire drain" when the car is sitting idle (overnight, while the owner is at work, etc.).
Could a flag be added to the API to tell it what to do if the car was asleep?
Something like:
Vehicle.data_request(name, wake_if_asleep=False)
where wake_if_asleep is either True (which should be the default, to preserve current behavior) or False. If set to False and the vehicle is asleep, the call's return value should be something that can be easily checked.
How to check: the data returned from /api/1/vehicles/{id} includes a state key which is either online or asleep. This call does NOT wake the vehicle, which allows checking first.
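The proposal above could be sketched as follows (a hypothetical wrapper, not the library's real signature; fetch and wake stand in for the real request and wake-up calls):

```python
# Sketch of the proposed wake_if_asleep flag (hypothetical wrapper, not the
# library's actual API): check the non-waking vehicle summary first.
def data_request(vehicle_summary, fetch, wake, name, wake_if_asleep=True):
    """vehicle_summary is the /api/1/vehicles/{id} response (never wakes).

    fetch(name) performs the real data request; wake() wakes the car.
    Returns None when the car is asleep and wake_if_asleep is False.
    """
    if vehicle_summary.get("state") != "online":
        if not wake_if_asleep:
            return None  # easily checked sentinel, as the issue suggests
        wake()
    return fetch(name)
```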
Hi,
I have been using an HA template to multiply current, voltage, and phases to calculate charging power.
After having a look at https://tesla-api.timdorr.com/vehicle/state/chargestate, I see that the API can return charger_power directly, but this currently isn't returned by teslajsonpy.
Could you include charger_power in a future release?
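The template workaround described above boils down to this arithmetic (a sketch using charge_state field names from the linked docs; charger_power itself would come straight from the API once exposed):

```python
# Sketch of the power calculation described above (assumed field names per
# the tesla-api.timdorr.com charge state docs): kW = amps x volts x phases.
def charging_power_kw(charge_state):
    """Approximate charging power in kW from the charge_state response."""
    amps = charge_state.get("charger_actual_current") or 0
    volts = charge_state.get("charger_voltage") or 0
    phases = charge_state.get("charger_phases") or 1
    return amps * volts * phases / 1000.0
```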
It would be nice to add some tests to be sure we didn't break anything.
Given many HA configs are publicly hosted, posting the full VIN is probably not a good privacy setting.
Need the ability to avoid doing an update for individual vehicles. This is useful to reduce polling in evening hours and to disable polling for cars that may be parked for long periods of time to avoid battery drain (e.g., airports).
Cyclic dependency: teslajsonpy requires teslajsonpy, which requires teslajsonpy, which requires teslajsonpy...
Line 33 in 8d1b00e
Hi @zabuldon, I just added sphinx support to the repo. Can you enable it on https://readthedocs.org/ please?
Edit: forgot my manners.
Environment: latest Hass with teslajsonpy 0.18.3
Issue: unable to authenticate with the Tesla API; the response to https://auth.tesla.com/oauth2/v3/authorize contains an "Enter the characters in the picture" type captcha.
Up until recently, Tesla integration for HomeAssistant worked flawlessly.
Now it shows car entities as "unavailable", and if I add "teslajsonpy: debug" into Hass logger, the following output is produced:
(possibly identifying data replaced with "...")
Aug 08 12:06:54 hass[26844]: 2021-08-08 12:06:54 DEBUG (MainThread) [teslajsonpy.connection] Attempt #0
Aug 08 12:06:54 hass[26844]: 2021-08-08 12:06:54 DEBUG (MainThread) [teslajsonpy.connection] POST:
Aug 08 12:06:54 hass[26844]: https://auth.tesla.com/oauth2/v3/authorize?client_id=ownerapi&code_challenge=N...NQ%3D%3D&code_challenge_method=S256&redirect_uri=https://auth.tesla.com/void/callback&response_type=code&scope=openid+email+offline_access&state=b1...MQ with
Aug 08 12:06:54 hass[26844]: Headers({'host': 'auth.tesla.com', 'accept': '*/*', 'accept-encoding': 'gzip, deflate', 'connection': 'keep-alive', 'user-agent': 'HomeAssistant/2021.8.2 httpx/0.18.2 Python/3.9', 'cookie': '_abck=2...D~-1~-1~-1; ak_bmsc=79...bQ==; bm_sv=58A3...3Dx0=; bm_sz=EEC0...5601; tesla-auth.sid=s%3Ahc...uqw', 'content-length': '150', 'content-type': 'application/x-www-form-urlencoded'})
Aug 08 12:06:54 hass[26844]: returned 200:OK with response headers 'Headers([('content-type', 'text/html; charset=utf-8'), ('x-dns-prefetch-control', 'off'), ('x-frame-options', 'DENY'), ('strict-transport-security', 'max-age=15552000; includeSubDomains'), ('x-download-options', 'noopen'), ('x-content-type-options', 'nosniff'), ('x-xss-protection', '1; mode=block'), ('x-request-id', '3c...0fa'), ('x-correlation-id', '3ca...80fa'), ('cache-control', 'no-store'), ('content-security-policy', "connect-src 'self'; default-src 'none'; font-src 'self' data: fonts.gstatic.com; frame-src 'self' www.google.com www.recaptcha.net; img-src 'self' data:; script-src www.recaptcha.net 'self' 'nonce-9...8'; style-src 'unsafe-inline' 'self'"), ('x-content-security-policy', "connect-src 'self'; default-src 'none'; font-src 'self' data: fonts.gstatic.com; frame-src 'self' www.google.com www.recaptcha.net; img-src 'self' data:; script-src www.recaptcha.net 'self' 'nonce-9...8'; style-src 'unsafe-inline' 'self'"), ('x-webkit-csp', "connect-src 'self'; default-src 'none'; font-src 'self' data: fonts.gstatic.com; frame-src 'self' www.google.com www.recaptcha.net; img-src 'self' data:; script-src www.recaptcha.net 'self' 'nonce-9...8'; style-src 'unsafe-inline' 'self'"), ('etag', 'W/"70...DCa+pI"'), ('x-response-time', '25.724ms'), ('content-encoding', 'gzip'), ('x-edgeconnect-midmile-rtt', '26'), ('x-edgeconnect-origin-mex-latency', '172'), ('originip', '209.133.79.56'), ('x-akamai-transformed', '9 6110 0 pmb=mTOE,1'), ('date', 'Sun, 08 Aug 2021 09:06:54 GMT'), ('content-length', '5950'), ('connection', 'keep-alive'), ('vary', 'Accept-Encoding'), ('set-cookie', 'tesla-auth.sid=s%3Ah...qw; Path=/; Expires=Wed, 11 Aug 2021 09:06:54 GMT; HttpOnly; Secure; SameSite=Lax'), ('origin_hostname', 'auth.tesla.com'), ('permissions-policy', 'interest-cohort=()'), ('set-cookie', 'bm_sv=58A3...uAjc=; Domain=.tesla.com; Path=/; Max-Age=6841; HttpOnly'), ('set-cookie', '_abck=2B58...~-1; 
Domain=.tesla.com; Path=/; Expires=Mon, 08 Aug 2022 09:06:54 GMT; Max-Age=31536000; Secure')])'
Aug 08 12:06:57 hass[26844]: 2021-08-08 12:06:57 DEBUG (MainThread) [teslajsonpy.connection] Failed to authenticate
Aug 08 12:07:57 hass[26844]: 2021-08-08 12:07:57 DEBUG (MainThread) [teslajsonpy.connection] Token expiration in -1 day, 23:52:16
Aug 08 12:07:57 hass[26844]: 2021-08-08 12:07:57 DEBUG (MainThread) [teslajsonpy.connection] Oauth expiration detected
Aug 08 12:07:57 hass[26844]: 2021-08-08 12:07:57 DEBUG (MainThread) [teslajsonpy.connection] Getting sso auth code using credentials
Aug 08 12:07:57 hass[26844]: 2021-08-08 12:07:57 DEBUG (MainThread) [teslajsonpy.connection] Attempt #1
Output HTML also has the following:
var messages = {"captcha":["Captcha is required","Captcha does not match"]};
so I guess Tesla auth now requires some additional auth steps.
Re-adding the Tesla integration and enabling/disabling MFA on the Tesla side do not help either.
Hi Zabuldon,
first off, thanks for maintaining this project. I appreciate the time and effort you put into it. Maybe you can fix the problem described below when you have some spare time. Thanks.
Description:
Adding this integration to HA results in an error handling the request. The procedure stops due to a component error relating to an index range. I could not understand where exactly it is failing. I have tested this with random-length usernames/passwords, with and without special characters, to rule out that this is causing the issue. This has been posted on the HA forums too, and the issue was confirmed by multiple users.
Issue: HA Error Handling the Request
Error message: "IndexError: tuple index out of range"
Result: Integration not working
Related file: connection.py
Error Log from HA:
Logger: aiohttp.server
Source: components/tesla/config_flow.py:157
First occurred: 10:55:51 AM (7 occurrences)
Last logged: 11:21:07 AM
Error handling request
Traceback (most recent call last):
File "/usr/local/lib/python3.8/site-packages/aiohttp/web_protocol.py", line 422, in _handle_request
resp = await self._request_handler(request)
File "/usr/local/lib/python3.8/site-packages/aiohttp/web_app.py", line 499, in _handle
resp = await handler(request)
File "/usr/local/lib/python3.8/site-packages/aiohttp/web_middlewares.py", line 118, in impl
return await handler(request)
File "/usr/src/homeassistant/homeassistant/components/http/security_filter.py", line 56, in security_filter_middleware
return await handler(request)
File "/usr/src/homeassistant/homeassistant/components/http/request_context.py", line 18, in request_context_middleware
return await handler(request)
File "/usr/src/homeassistant/homeassistant/components/http/ban.py", line 72, in ban_middleware
return await handler(request)
File "/usr/src/homeassistant/homeassistant/components/http/auth.py", line 127, in auth_middleware
return await handler(request)
File "/usr/src/homeassistant/homeassistant/components/http/view.py", line 129, in handle
result = await result
File "/usr/src/homeassistant/homeassistant/components/config/config_entries.py", line 169, in post
return await super().post(request, flow_id)
File "/usr/src/homeassistant/homeassistant/components/http/data_validator.py", line 60, in wrapper
result = await method(view, request, *args, **kwargs)
File "/usr/src/homeassistant/homeassistant/helpers/data_entry_flow.py", line 106, in post
result = await self._flow_mgr.async_configure(flow_id, data)
File "/usr/src/homeassistant/homeassistant/data_entry_flow.py", line 155, in async_configure
result = await self._async_handle_step(flow, cur_step["step_id"], user_input)
File "/usr/src/homeassistant/homeassistant/data_entry_flow.py", line 213, in _async_handle_step
result: Dict = await getattr(flow, method)(user_input)
File "/usr/src/homeassistant/homeassistant/components/tesla/config_flow.py", line 57, in async_step_user
info = await validate_input(self.hass, user_input)
File "/usr/src/homeassistant/homeassistant/components/tesla/config_flow.py", line 157, in validate_input
(config[CONF_TOKEN], config[CONF_ACCESS_TOKEN]) = await controller.connect(
File "/usr/local/lib/python3.8/site-packages/teslajsonpy/controller.py", line 268, in connect
cars = await self.get_vehicles()
File "/usr/local/lib/python3.8/site-packages/backoff/_async.py", line 133, in retry
ret = await target(*args, **kwargs)
File "/usr/local/lib/python3.8/site-packages/teslajsonpy/controller.py", line 357, in get_vehicles
return (await self.__connection.get("vehicles"))["response"]
File "/usr/local/lib/python3.8/site-packages/teslajsonpy/connection.py", line 78, in get
return await self.post(command, "get", None)
File "/usr/local/lib/python3.8/site-packages/teslajsonpy/connection.py", line 96, in post
self.code = await self.get_authorization_code(
File "/usr/local/lib/python3.8/site-packages/teslajsonpy/connection.py", line 372, in get_authorization_code
code_url = URL(resp.history[-1].url)
IndexError: tuple index out of range
As I can see, in one of its updates (when?) Tesla introduced a WebSockets API.
It would make sense to rewrite the library with asyncio and websockets.
@zabuldon now that we have more people helping, I was thinking it made sense to set up a two-branch system where there's a dev branch for PRs to commit to and master is used for releases only.
Any objections? I don't have the rights to change the default branch to dev, though.
I'm using the library via the Home Assistant integration and it works fine. However, since using this integration I notice I'm being forcefully logged out of my Android app every once in a while, which is annoying when I want to unlock the car :)
This makes me believe the repeated authentication is rate-limited in some way, or at least that there are protections in place.
Instead of re-authenticating with the user/pass every time, we should use the refresh_token to fix this behaviour.
https://tesla-api.timdorr.com/api-basics/authentication#post-oauth-token-grant_type-refresh_token
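The refresh-token grant linked above is a plain POST body; a sketch following that (pre-SSO) documentation, with client credentials left as parameters (the helper itself is illustrative, not teslajsonpy's implementation):

```python
# Sketch of the refresh-token grant body per the linked documentation
# (illustrative helper, not teslajsonpy's actual code).
def refresh_token_payload(refresh_token, client_id, client_secret):
    """Body for the POST /oauth/token refresh_token grant."""
    return {
        "grant_type": "refresh_token",
        "client_id": client_id,
        "client_secret": client_secret,
        "refresh_token": refresh_token,
    }
```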
Should act like the existing binary_sensor.updater
I am trying to determine if I need to flag the home-assistant version bump with a breaking-change label due to potentially changing entity_ids. Coming to this repo, I was going to compare the changes between versions when I discovered that there are no version tags at all!
We should have version tags added here so that the changes between versions can be seen.
Hey @alandtse, I was looking into your home-assistant dev and found this: alandtse/home-assistant@ee73d56
I like this change. May I suggest something I'm currently running with good results (though I did it hardcoded)?
This way, automations would change the polling interval instead of disabling it.
Three automations are enough to control this switch, allowing the car to sleep as well as fast polling in between.
I'm running with min = 60 and default = 900 and it's working really well. After 15 minutes of idle time, polling drops from min to default. It returns to min when the car wakes from sleep or is no longer idle.
Point me to a branch and I can try to implement and submit the pull request.
Thanks!
Give the user the choice to limit the resulting components to a subset of the cars connected to the account.
This is very useful in case the user has access to more than one car but does not want to control all of them in Home Assistant.
This could be achieved by passing an optional list of VINs to connect(...)
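The optional VIN allow-list could be sketched as a simple filter (connect() accepting a vins argument is the issue's proposal, not an existing parameter):

```python
# Sketch of the proposed VIN filtering (the vins parameter is the issue's
# suggestion, not an existing teslajsonpy feature).
def filter_vehicles(vehicles, vins=None):
    """Keep only cars whose VIN is allow-listed; no list means keep all."""
    if not vins:
        return list(vehicles)
    allowed = set(vins)
    return [v for v in vehicles if v.get("vin") in allowed]
```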
Tesla Accounts now support Multi-Factor Authentication (MFA) [1]. It would be great to add MFA support here as well.
[1] https://www.tesla.com/support/multi-factor-authentication
It would be great if we would be able to fully control HVAC mode, e.g. turn the HVAC to HIGH which would allow me to quickly defrost my Model 3. At the moment, we are only able to turn it on or off.
When using the Tesla plugin for Home Assistant, it crashes when trying to initialize. I tracked this back to the constructor of the teslaAPI object from your library. Using Python 3.6 on Mac OS X.
I wrote the following python which replicates the issue:
(Username and password purposefully obscured)
from teslajsonpy import Controller as teslaAPI, TeslaException
controller = teslaAPI('<>', '<>', 300)
Exception that occurs:
Traceback (most recent call last):
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/urllib/request.py", line 1318, in do_open
encode_chunked=req.has_header('Transfer-encoding'))
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/http/client.py", line 1239, in request
self._send_request(method, url, body, headers, encode_chunked)
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/http/client.py", line 1285, in _send_request
self.endheaders(body, encode_chunked=encode_chunked)
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/http/client.py", line 1234, in endheaders
self._send_output(message_body, encode_chunked=encode_chunked)
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/http/client.py", line 1026, in _send_output
self.send(msg)
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/http/client.py", line 964, in send
self.connect()
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/http/client.py", line 1400, in connect
server_hostname=server_hostname)
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/ssl.py", line 407, in wrap_socket
_context=self, _session=session)
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/ssl.py", line 814, in init
self.do_handshake()
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/ssl.py", line 1068, in do_handshake
self._sslobj.do_handshake()
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/ssl.py", line 689, in do_handshake
self._sslobj.do_handshake()
ssl.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:777)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "test.py", line 3, in
controller = teslaAPI('<>, '<>', 300)
File "/Users/rodtoll/github/teslajsonpy/teslajsonpy/controller.py", line 24, in init
cars = self.__connection.get('vehicles')['response']
File "/Users/rodtoll/github/teslajsonpy/teslajsonpy/connection.py", line 29, in get
return self.post(command, None)
File "/Users/rodtoll/github/teslajsonpy/teslajsonpy/connection.py", line 35, in post
auth = self.__open("/oauth/token", data=self.oauth)
File "/Users/rodtoll/github/teslajsonpy/teslajsonpy/connection.py", line 60, in __open
resp = opener.open(req)
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/urllib/request.py", line 526, in open
response = self._open(req, data)
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/urllib/request.py", line 544, in _open
'_open', req)
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/urllib/request.py", line 504, in _call_chain
result = func(*args)
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/urllib/request.py", line 1361, in https_open
context=self._context, check_hostname=self._check_hostname)
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/urllib/request.py", line 1320, in do_open
raise URLError(err)
urllib.error.URLError: <urlopen error [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:777)>