zweckj / lmcloud
Library to interface with La Marzocco's cloud
License: MIT License
The new version 3.1-rc6 of the gateway firmware seems to support a new endpoint for logs that is missing in the previous v2.2-rc0.
strings home-gateway_v3.1-rc6.bin | grep /v1/
/api/v1/config
/api/v1/command/*
/api/v1/streaming
/api/v1/logs
Sending a GET request to that URL causes the connection to be closed (instead of a 404 for an invalid URI).
I am not sure if that endpoint is of any use, but maybe we can find out :-)
/api/v1/command/* also looks kind of promising.
Thanks for your work on this! Is it possible to get a ping from the Micra via this API when shot extraction starts and ends?
That would be great for integrating it with a scale like the Acaia.
Hello,
I have recently purchased a Micra (still shipping) and am reluctant to keep an app on standby just to access settings on the machine. I have been looking into building some basic controls via an ESP32 setup over Bluetooth/MicroPython and found all this great work via the HA forums :).
I would like to build set/get temperature controls, which I can follow from their source and yours; however, I would also like to build an automatic shot timer. I see the local-API websocket seems to expose this notification: Would you know if this is also sent over Bluetooth? I would like to keep the machine completely off Wi-Fi if possible (or at least off their cloud), and I'm not sure if I can get a local connection without also phoning LM directly.
This function has the on/off times passed in as float, but expects an int, resulting in a TypeError being raised:
async def configure_prebrew(
    self, on_time=5000, off_time=5000, key: int = 1
) -> bool:
    """Set Pre-Brew details. Also used for preinfusion (prebrewOnTime=0, prebrewOnTime=ms)."""
    if not isinstance(on_time, int) or not isinstance(off_time, int):
        msg = "Prebrew times must be in ms (integer)"
        _logger.debug(msg)
        raise TypeError(msg)
From the HA integration:
async def set_prebrew_times(
    self, key: int, seconds_on: float, seconds_off: float
) -> None:
    """Set the prebrew times of the machine."""
    await self.configure_prebrew(
        on_time=seconds_on * 1000, off_time=seconds_off * 1000, key=key
    )
Failed to call service lamarzocco/set_preinfusion_time. Service call encountered error: Prebrew times must be in ms (integer)
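The mismatch is that `seconds_on * 1000` produces a float (e.g. `5.0 * 1000 == 5000.0`), which then fails the `isinstance(..., int)` check. One way to fix it on the caller side is a small conversion helper (a sketch, not the integration's actual code):

```python
def seconds_to_ms(seconds: float) -> int:
    """Convert a seconds value from the service call into the integer
    milliseconds that configure_prebrew() requires. round() avoids
    truncation artifacts from float math (e.g. 4999.999... -> 4999)."""
    return int(round(seconds * 1000))


# The wrapper would then pass on_time=seconds_to_ms(seconds_on), etc.
print(seconds_to_ms(5.0))  # 5000
print(seconds_to_ms(2.5))  # 2500
```

Alternatively, `configure_prebrew()` itself could coerce its arguments with `int()` instead of rejecting floats outright; either end of the call fixes the TypeError.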
In this function:
@property
def true_model_name(self) -> str:
    """Return the model name from the cloud, even if it's not one we know about.
    Used for display only."""
    if self.model_name == LaMarzoccoModel.LINEA_MICRA:
        return "Linea Micra"
    if self.model_name in LaMarzoccoModel:
        return self.model_name
    return f"Unsupported Model ({self.model_name})"
This part causes an exception, because self.model_name is a string and LaMarzoccoModel is an enum:
if self.model_name in LaMarzoccoModel:
It results in exceptions like the following with a non-Micra model:
2023-12-29 13:03:08.499 ERROR (MainThread) [homeassistant] Error doing job: Task exception was never retrieved
Traceback (most recent call last):
File "/usr/src/homeassistant/homeassistant/helpers/entity_platform.py", line 495, in async_add_entities
tasks = [
^
File "/usr/src/homeassistant/homeassistant/helpers/entity_platform.py", line 495, in <listcomp>
tasks = [
^
File "/config/custom_components/lamarzocco/binary_sensor.py", line 61, in <genexpr>
LaMarzoccoBinarySensorEntity(coordinator, hass, description)
File "/config/custom_components/lamarzocco/entity.py", line 53, in __init__
model=self._lm_client.true_model_name,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/lmcloud/lmcloud.py", line 123, in true_model_name
if self.model_name in LaMarzoccoModel:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/enum.py", line 740, in __contains__
raise TypeError(
TypeError: unsupported operand type(s) for 'in': 'str' and 'EnumType'
2023-12-29 13:03:08.505 ERROR (MainThread) [homeassistant] Error doing job: Task exception was never retrieved
Traceback (most recent call last):
File "/usr/src/homeassistant/homeassistant/helpers/entity_platform.py", line 495, in async_add_entities
tasks = [
^
File "/usr/src/homeassistant/homeassistant/helpers/entity_platform.py", line 495, in <listcomp>
tasks = [
^
File "/config/custom_components/lamarzocco/button.py", line 44, in <genexpr>
LaMarzoccoButtonEntity(coordinator, hass, description)
File "/config/custom_components/lamarzocco/entity.py", line 53, in __init__
model=self._lm_client.true_model_name,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/lmcloud/lmcloud.py", line 123, in true_model_name
if self.model_name in LaMarzoccoModel:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/enum.py", line 740, in __contains__
raise TypeError(
TypeError: unsupported operand type(s) for 'in': 'str' and 'EnumType'
These exceptions completely break the HA integration with v0.13.2b2 of the integration and HA 2024.1.0b2. Looks like it came in with a7d4c69.
Either of these changes would fix it:
if self.model_name in LaMarzoccoModel._value2member_map_:
    return self.model_name
or
try:
    return LaMarzoccoModel(self.model_name)
except ValueError:
    return f"Unsupported Model ({self.model_name})"
I'm using this and it's working for me:
@property
def true_model_name(self) -> str:
    """Return the model name from the cloud, even if it's not one we know about.
    Used for display only."""
    if self.model_name == LaMarzoccoModel.LINEA_MICRA:
        return "Linea Micra"
    try:
        return LaMarzoccoModel(self.model_name)
    except ValueError:
        return f"Unsupported Model ({self.model_name})"
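For reference, `Enum(value)` raises ValueError for unknown values, so the try/except cleanly replaces the invalid `in` test. A self-contained sketch with a stand-in enum (the member names and values here are illustrative, not necessarily the library's real ones); note it returns `.value` so the declared `-> str` return type actually holds:

```python
from enum import Enum


class LaMarzoccoModel(Enum):
    """Stand-in for the real enum; values mimic the cloud's model strings."""
    GS3_AV = "GS3 AV"
    LINEA_MICRA = "Micra"  # assumed cloud value, for illustration only


def true_model_name(model_name: str) -> str:
    """Display name for a cloud model string, tolerating unknown models."""
    if model_name == LaMarzoccoModel.LINEA_MICRA.value:
        return "Linea Micra"
    try:
        # Enum(value) raises ValueError for unknown values; this replaces
        # the `model_name in LaMarzoccoModel` test that raised TypeError.
        return LaMarzoccoModel(model_name).value
    except ValueError:
        return f"Unsupported Model ({model_name})"


print(true_model_name("Micra"))   # Linea Micra
print(true_model_name("GS3 AV"))  # GS3 AV
print(true_model_name("Strega"))  # Unsupported Model (Strega)
```

Be aware that membership testing of strings against a plain Enum raises TypeError on Python 3.11 (as in the traceback above); Python 3.12 changed `in` to return False for non-member values instead, which is why the bug may not reproduce everywhere.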