Comments (2)
- on a normal, working deployment run, the flow logs start with:
Worker 'KubernetesWorker bdbd6bb1-2083-4f55-ab1b-77ff81b20182' submitting flow run 'a4c220af-e62b-481b-bfab-64e5cd8a1fd6'
Creating Kubernetes job...
Completed submission of flow run 'a4c220af-e62b-481b-bfab-64e5cd8a1fd6'
Job 'transparent-llama-44pb7': Pod has status 'Pending'.
Job 'transparent-llama-44pb7': Pod has status 'Running'.
Opening process...
Downloading flow code from storage at '.'
....
- on a failed deployment run, the flow logs start straight away with:
Downloading flow code from storage at '.'
Flow could not be retrieved from deployment.
Traceback (most recent call last):
File "", line 879, in exec_module
File "", line 1016, in get_code
File "", line 1073, in get_data
FileNotFoundError: [Errno 2] No such file or directory: '/tmp/tmp329n4crpprefect/pipelines/flows/test.py'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/usr/local/lib/python3.10/site-packages/prefect/engine.py", line 422, in retrieve_flow_then_begin_flow_run
else await load_flow_from_flow_run(flow_run, client=client)
File "/usr/local/lib/python3.10/site-packages/prefect/client/utilities.py", line 100, in with_injected_client
return await fn(*args, **kwargs)
File "/usr/local/lib/python3.10/site-packages/prefect/deployments/deployments.py", line 317, in load_flow_from_flow_run
flow = await run_sync_in_worker_thread(load_flow_from_entrypoint, str(import_path))
File "/usr/local/lib/python3.10/site-packages/prefect/utilities/asyncutils.py", line 136, in run_sync_in_worker_thread
return await anyio.to_thread.run_sync(
File "/usr/local/lib/python3.10/site-packages/anyio/to_thread.py", line 33, in run_sync
return await get_asynclib().run_sync_in_worker_thread(
File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 877, in run_sync_in_worker_thread
return await future
File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 807, in run
result = context.run(func, *args)
File "/usr/local/lib/python3.10/site-packages/prefect/flows.py", line 1668, in load_flow_from_entrypoint
flow = import_object(entrypoint)
File "/usr/local/lib/python3.10/site-packages/prefect/utilities/importtools.py", line 201, in import_object
module = load_script_as_module(script_path)
File "/usr/local/lib/python3.10/site-packages/prefect/utilities/importtools.py", line 164, in load_script_as_module
raise ScriptError(user_exc=exc, path=path) from exc
prefect.exceptions.ScriptError: Script at 'pipelines/flows/test.py' encountered an exception: FileNotFoundError(2, 'No such file or directory')
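The traceback bottoms out in a plain FileNotFoundError: the worker downloaded the flow code into a temporary directory that does not contain the entrypoint file. The same call chain (exec_module → get_code → get_data) can be reproduced without Prefect at all; the path below is hypothetical, mirroring the missing entrypoint from the traceback:

```python
import importlib.util

# Hypothetical path standing in for the entrypoint the worker expected
# inside its temporary download directory but never actually fetched.
script_path = "/tmp/does_not_exist/pipelines/flows/test.py"

# spec_from_file_location does not check that the file exists...
spec = importlib.util.spec_from_file_location("test_flow", script_path)
module = importlib.util.module_from_spec(spec)

caught = None
try:
    # ...but exec_module goes through get_code/get_data, which open the
    # file -- the same frames (exec_module, get_code, get_data) that
    # appear at the top of the traceback above.
    spec.loader.exec_module(module)
except FileNotFoundError as exc:
    caught = exc

print(type(caught).__name__, caught.filename)
```

So the error itself is mundane; the real question is why the worker reached the "Downloading flow code" step without the preceding submission steps.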
Summary: it is as if the worker skips some steps. Any pointers on where the problem might be?
EDIT: after further investigation, I noticed that when it errors, the logs shown in the Prefect server UI do not appear in the worker CLI logs at all. Any pointers on this erratic behavior?
SOLVED: an old agent (still running at the time) was "stealing" flow runs from the worker's work pool, because both used the same work queue name.
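The root cause is worth illustrating: when two pollers consume from the same queue name, whichever polls first claims the run, and the other sees nothing. A minimal sketch using Python's stdlib queue (purely illustrative, no Prefect APIs involved):

```python
import queue

# One shared work queue; both an old agent and a new worker poll it.
# Whoever polls first claims the flow run; the other finds nothing,
# which from the worker's side looks like skipped submission steps,
# while the run's own logs still appear in the server UI.
runs = queue.Queue()
runs.put("flow-run-a4c220af")

def poll(consumer_name):
    """Claim the next run if one is available (non-blocking)."""
    try:
        run = runs.get_nowait()
        return f"{consumer_name} claimed {run}"
    except queue.Empty:
        return f"{consumer_name} found no runs"

agent_result = poll("old-agent")    # agent polls first and "steals" the run
worker_result = poll("new-worker")  # worker finds the queue already empty
print(agent_result)
print(worker_result)
```

The fix matches the SOLVED note: give the worker's work queue a name the old agent does not poll, or shut the old agent down.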
Related Issues (20)
- Incompatible with `uvloop = "^0.19.0"`
- CLI: Provide URL in output after running `work-pool create`, etc.
- Next run's "scheduled for" time doesn't match the "Schedules" time
- Task server should submit sync tasks to thread or event loop
- Task.apply_async should match Celery's calling semantics
- Migrate all collections to Pydantic 2
- prefect_dbt can't run `dbt source freshness`
- Implement pytest-markdown
- Missing support for `job_parameters` in `prefect_databricks`
- Remove docs references to agents & block-based deployments
- Task.apply_async should mark runs as needing deferred execution
- prefect-shell on 3.12
- Emit warning when running prefect server with sqlite
- Create `Task.delay`
- prefect_dbt: Bring back logging and stream output of dbt CLI commands in `trigger_dbt_cli_command` function
- Prefect Cloud: A comma in a Deployment Title crashes every run in the Deployment, without any error message as to why
- Invalid names being generated for Azure Container Instance runs
- Parameter schema generation regression in 2.19.3 for keyword-only flows
- Parameter schema generation regression in 2.19.3 for dynamically created models
- Add a `deferred` parameter to `Task.map` to control whether to defer or submit tasks