Hi, I have attempted to use this provider on Airflow 2.6.2, and it seems the logs are never pushed to Loki.
When I open the log view for a task, the UI falls back to the local logs, but I can see in the docker logs that it does first try to fetch them from Loki:
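For context, I enabled Airflow's generic remote-logging switches and pointed them at a connection for Loki. This part of the wiring is my own reading of the docs (the connection id `loki_default` is just a name I chose), so it may well be where the problem is:

```ini
# airflow.cfg -- the relevant bits of my setup
# (remote_logging / remote_log_conn_id are standard Airflow [logging] options;
#  whether this provider picks them up this way is my assumption)
[logging]
remote_logging = True
remote_log_conn_id = loki_default
```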
airflow-airflow-webserver-1 | [2023-07-08T06:39:49.086+0000] {loki_task_handler.py:134} INFO - loki log query params {'query': ' {dag_id="crm-elastic-dag",task_id="hello"}\n
| json try_number="try_number",map_index="map_index",run_id="run_id"\n | try_number="1" and\n map_index="-1" and\n run_id="manual__2023-07-08T06:39:32.209086+00:00"\n | __error__!="JSONParserErr"\n ', 'start': '2023-06-23T06:39:34.475096+00:00', 'end': '2023-07-08T07:39:34.724944+00:00', 'limit': 5000, 'direction': 'forward'}
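To rule out the webserver side, I tried replaying the same query against Loki's HTTP API by hand. A small helper to rebuild the request URL (a sketch: `/loki/api/v1/query_range` is Loki's standard query endpoint, but `LOKI_URL` is an assumed address for my instance):

```python
# Rebuild the query_range request that loki_task_handler.py:134 logs above,
# so it can be replayed against Loki directly with curl.
from urllib.parse import urlencode

LOKI_URL = "http://loki:3100"  # assumption: adjust to your deployment

def build_params(dag_id, task_id, try_number, map_index, run_id, start, end):
    # Mirrors the LogQL query printed by the handler in the log line above
    query = (
        f'{{dag_id="{dag_id}",task_id="{task_id}"}}'
        ' | json try_number="try_number",map_index="map_index",run_id="run_id"'
        f' | try_number="{try_number}" and map_index="{map_index}"'
        f' and run_id="{run_id}"'
        ' | __error__!="JSONParserErr"'
    )
    return {"query": query, "start": start, "end": end,
            "limit": 5000, "direction": "forward"}

params = build_params("crm-elastic-dag", "hello", 1, -1,
                      "manual__2023-07-08T06:46:49.915800+00:00",
                      "2023-07-08T06:00:00+00:00", "2023-07-08T08:00:00+00:00")
# Paste this URL into curl: an empty "result" array in the response means
# no log lines were ever pushed for this task instance.
print(f"{LOKI_URL}/loki/api/v1/query_range?{urlencode(params)}")
```

In my case the `result` array comes back empty, which is why I suspect the push side rather than the query side.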
airflow-airflow-scheduler-1 | [2023-07-08T06:46:51.041+0000] {scheduler_job_runner.py:412} INFO - 1 tasks up for execution:
airflow-airflow-scheduler-1 | <TaskInstance: crm-elastic-dag.hello manual__2023-07-08T06:46:49.915800+00:00 [scheduled]>
airflow-airflow-scheduler-1 | [2023-07-08T06:46:51.041+0000] {scheduler_job_runner.py:480} INFO - DAG crm-elastic-dag has 0/16 running and queued tasks
airflow-airflow-scheduler-1 | [2023-07-08T06:46:51.042+0000] {scheduler_job_runner.py:587} INFO - Setting the following tasks to queued state:
airflow-airflow-scheduler-1 | <TaskInstance: crm-elastic-dag.hello manual__2023-07-08T06:46:49.915800+00:00 [scheduled]>
airflow-airflow-scheduler-1 | [2023-07-08T06:46:51.046+0000] {scheduler_job_runner.py:625} INFO - Sending TaskInstanceKey(dag_id='crm-elastic-dag', task_id='hello', run_id='manual__2023-07-08T06:46:49.915800+00:00', try_number=1, map_index=-1) to executor with priority 1 and queue default
airflow-airflow-scheduler-1 | [2023-07-08T06:46:51.047+0000] {base_executor.py:147} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'crm-elastic-dag', 'hello', 'manual__2023-07-08T06:46:49.915800+00:00', '--local', '--subdir', 'DAGS_FOLDER/crm-elastig-dag.py']
airflow-airflow-worker-1 | [2023-07-08 06:46:51,056: INFO/MainProcess] Task airflow.executors.celery_executor.execute_command[8a3bb8d5-9a63-4495-b234-ef7762a1a788] received
airflow-airflow-worker-1 | [2023-07-08 06:46:51,066: INFO/ForkPoolWorker-15] [8a3bb8d5-9a63-4495-b234-ef7762a1a788] Executing command in Celery: ['airflow', 'tasks', 'run', 'crm-elastic-dag', 'hello', 'manual__2023-07-08T06:46:49.915800+00:00', '--local', '--subdir', 'DAGS_FOLDER/crm-elastig-dag.py']
airflow-airflow-scheduler-1 | [2023-07-08T06:46:51.107+0000] {scheduler_job_runner.py:677} INFO - Received executor event with state queued for task instance TaskInstanceKey(dag_id='crm-elastic-dag', task_id='hello', run_id='manual__2023-07-08T06:46:49.915800+00:00', try_number=1, map_index=-1)
airflow-airflow-scheduler-1 | [2023-07-08T06:46:51.131+0000] {scheduler_job_runner.py:703} INFO - Setting external_id for <TaskInstance: crm-elastic-dag.hello manual__2023-07-08T06:46:49.915800+00:00 [queued]> to 8a3bb8d5-9a63-4495-b234-ef7762a1a788
airflow-airflow-worker-1 | [2023-07-08 06:46:51,151: INFO/ForkPoolWorker-15] Filling up the DagBag from /opt/airflow/dags/crm-elastig-dag.py
airflow-airflow-worker-1 | [2023-07-08 06:46:51,191: INFO/ForkPoolWorker-15] This is a log message
airflow-airflow-worker-1 | [2023-07-08 06:46:52,330: INFO/ForkPoolWorker-15] Running <TaskInstance: crm-elastic-dag.hello manual__2023-07-08T06:46:49.915800+00:00 [queued]> on host d5752f79742d
airflow-airflow-webserver-1 | 127.0.0.1 - - [08/Jul/2023:06:46:53 +0000] "GET /health HTTP/1.1" 200 243 "-" "curl/7.74.0"
airflow-airflow-worker-1 | [2023-07-08 06:46:53,394: INFO/ForkPoolWorker-15] Task airflow.executors.celery_executor.execute_command[8a3bb8d5-9a63-4495-b234-ef7762a1a788] succeeded in 2.335277110338211s: None
airflow-airflow-scheduler-1 | [2023-07-08T06:46:54.344+0000] {dagrun.py:616} INFO - Marking run <DagRun crm-elastic-dag @ 2023-07-08 06:46:49.915800+00:00: manual__2023-07-08T06:46:49.915800+00:00, state:running, queued_at: 2023-07-08 06:46:49.935860+00:00. externally triggered: True> successful
airflow-airflow-scheduler-1 | [2023-07-08T06:46:54.344+0000] {dagrun.py:682} INFO - DagRun Finished: dag_id=crm-elastic-dag, execution_date=2023-07-08 06:46:49.915800+00:00, run_id=manual__2023-07-08T06:46:49.915800+00:00, run_start_date=2023-07-08 06:46:50.980423+00:00, run_end_date=2023-07-08 06:46:54.344670+00:00, run_duration=3.364247, state=success, external_trigger=True, run_type=manual, data_interval_start=2023-07-07 00:00:00+00:00, data_interval_end=2023-07-08 00:00:00+00:00, dag_hash=c848848d668b428fd5345193e82ebc08
airflow-airflow-scheduler-1 | [2023-07-08T06:46:54.354+0000] {dag.py:3490} INFO - Setting next_dagrun for crm-elastic-dag to 2023-07-08T00:00:00+00:00, run_after=2023-07-09T00:00:00+00:00
airflow-airflow-scheduler-1 | [2023-07-08T06:46:54.391+0000] {scheduler_job_runner.py:677} INFO - Received executor event with state success for task instance TaskInstanceKey(dag_id='crm-elastic-dag', task_id='hello', run_id='manual__2023-07-08T06:46:49.915800+00:00', try_number=1, map_index=-1)
airflow-airflow-scheduler-1 | [2023-07-08T06:46:54.398+0000] {scheduler_job_runner.py:733} INFO - TaskInstance Finished: dag_id=crm-elastic-dag, task_id=hello, run_id=manual__2023-07-08T06:46:49.915800+00:00, map_index=-1, run_start_date=2023-07-08 06:46:52.740642+00:00, run_end_date=2023-07-08 06:46:53.212918+00:00, run_duration=0.472276, state=success, executor_state=success, try_number=1, max_tries=0, job_id=67, pool=default_pool, queue=default, priority_weight=1, operator=PythonOperator, queued_dttm=2023-07-08 06:46:51.043588+00:00, queued_by_job_id=63, pid=8798
airflow-airflow-webserver-1 | 172.29.0.23 - - [08/Jul/2023:06:46:55 +0000] "GET /get_logs_with_metadata?dag_id=crm-elastic-dag&task_id=hello&map_index=-1&execution_date=2023-07-08T06%3A39%3A32.209086%2B00%3A00&try_number=1&metadata=null HTTP/1.1" 200 2451 "https://airflow.local.lab.com/log?dag_id=crm-elastic-dag&task_id=hello&execution_date=2023-07-08T06%3A39%3A32.209086%2B00%3A00&map_index=-1" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36
Has anyone tried it recently with automatic pushing to Loki? Or could someone share an example of a DAG task that works with it?