FAQ
This page captures recurring questions about the AGILab tooling and runtime.
Missing worker packages during `AGI.run_*`
If a run fails with `ModuleNotFoundError` inside a worker virtual environment, rerun the
matching installer script (for example `uv run --project src/agilab/core/agi-cluster python
src/agilab/examples/flight/AGI.install_flight.py`). The installer rebuilds the worker egg and
provisions its environment so the next `AGI.run_*` picks up the dependencies.
Why installers still build eggs
The distributed upload path expects `bdist_egg` artifacts. Each app ships a `build.py` helper
that produces eggs and symlinks the required modules before they are sent to Dask. Moving to pure
wheels would break that upload contract, so eggs remain the canonical package format.
Do we already have DAG/task orchestration?
Yes. Managers hand `WorkDispatcher` a work plan and `DagWorker` executes it, enforcing
dependencies and parallelism across workers. The improvement areas are telemetry and richer
policies (retries, priorities), not building a brand-new planner.
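The dependency-enforcing execution described above can be sketched with the standard library's `graphlib`. The plan shape (task name mapped to its prerequisites) and the task names are illustrative assumptions, not the real `WorkDispatcher`/`DagWorker` API.

```python
from concurrent.futures import ThreadPoolExecutor
from graphlib import TopologicalSorter  # Python 3.9+

# Hypothetical work plan: task -> set of prerequisite tasks.
plan = {"load": set(), "clean": {"load"}, "fit": {"clean"}, "report": {"fit", "clean"}}

def run_plan(plan, worker):
    """Run each ready batch in parallel while honouring dependencies."""
    ts = TopologicalSorter(plan)
    ts.prepare()
    done = []
    with ThreadPoolExecutor() as pool:
        while ts.is_active():
            ready = list(ts.get_ready())            # tasks whose deps are satisfied
            done.extend(pool.map(worker, ready))    # independent tasks run in parallel
            for task in ready:
                ts.done(task)
    return done

order = run_plan(plan, lambda t: t)
```

Batches of independent tasks run concurrently, while tasks with unmet prerequisites wait; that is the core behaviour the FAQ entry refers to.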
Who manages multithreading when Dask is disabled?
`agi_dispatcher` owns the local process and thread pools. Dask only coordinates execution when
you explicitly opt into distributed mode; otherwise, the dispatcher handles the orchestration end
to end.
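Conceptually, local orchestration without Dask reduces to choosing a pool and fanning work out, as in this minimal sketch. The function name and flag are illustrative; `agi_dispatcher`'s real interface is richer.

```python
from concurrent.futures import ProcessPoolExecutor, ThreadPoolExecutor

def dispatch(func, items, use_processes=False, workers=4):
    """Illustrative local fan-out: threads for I/O-bound work,
    processes for CPU-bound work. Not agi_dispatcher's actual API."""
    pool_cls = ProcessPoolExecutor if use_processes else ThreadPoolExecutor
    with pool_cls(max_workers=workers) as pool:
        return list(pool.map(func, items))
```

With `use_processes=False` everything stays in one interpreter, which matches the non-Dask mode where no external scheduler is involved.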
Regenerating IDE run configurations
`pycharm/gen_app_script.py` is the authoritative generator for JetBrains run configurations.
Wrap it (and `setup_pycharm.py`) in a single helper command, e.g. `just run-configs` or
`make run-configs`, so developers and CI regenerate configs consistently from the same entry point.
“VIRTUAL_ENV … does not match the project environment” warning
`uv` emits this when you launch a command from an activated shell whose `$VIRTUAL_ENV`
differs from the target project's `.venv` directory. The message is informational; the
command will still run using the project lock. If you truly want to reuse the activated
environment, pass `--active` to `uv`; otherwise you can safely ignore the warning.
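The condition behind the warning can be approximated as below. This is a sketch of the comparison described above, not uv's actual implementation.

```python
import os
from pathlib import Path

def venv_mismatch(project_dir: str, env=os.environ) -> bool:
    """Approximation of uv's check: is the activated $VIRTUAL_ENV a
    different directory from <project>/.venv?"""
    active = env.get("VIRTUAL_ENV")
    if not active:
        return False  # nothing activated, nothing to warn about
    return Path(active).resolve() != (Path(project_dir) / ".venv").resolve()
```

If this returns `True` for your shell and project, you are in exactly the situation the warning describes, and `--active` is the explicit opt-in to keep using the activated environment.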
Why does a run create `distribution.json`?
`WorkDispatcher` caches the last work plan in `distribution.json` inside each app
directory. On subsequent runs it reuses the plan if the worker layout and arguments
are unchanged; delete the file (or change the args) to force a full repartition.
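The reuse-or-repartition behaviour amounts to a small cache check, sketched here. The JSON keys (`workers`, `args`, `plan`) are assumptions for illustration; the real schema `WorkDispatcher` writes may differ.

```python
import json
from pathlib import Path

def load_or_rebuild_plan(app_dir, workers, args, rebuild):
    """Reuse the cached plan when layout and args match, else repartition.
    Schema here is hypothetical."""
    cache = Path(app_dir) / "distribution.json"
    if cache.exists():
        data = json.loads(cache.read_text())
        if data.get("workers") == workers and data.get("args") == args:
            return data["plan"]                       # cache hit: reuse
    plan = rebuild(workers, args)                     # cache miss: repartition
    cache.write_text(json.dumps({"workers": workers, "args": args, "plan": plan}))
    return plan
```

This also makes the remedy obvious: deleting the file or changing the args guarantees the `rebuild` branch runs.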
Switching the active app in Streamlit
Use the project selector in the left sidebar of the Streamlit UI. `AgiEnv` will
recreate symbolic links under `~/wenv` and adjust the virtual environment for the
chosen app. When you add a brand-new app under `src/agilab/apps/`, restart the
Streamlit session so the selector picks it up.
Docs drift after touching core APIs
If you change `BaseWorker` or other primitives surfaced in the guides, rebuild the
reference documentation with `uv run python docs/gen-docs.py` so the published docs
match the updated source.
`AGI.install_*` fails looking for `pyproject.toml`
Each worker must carry its own `pyproject.toml` (for example
`src/agilab/apps/ilp_project/src/ilp_worker/pyproject.toml`). If the installer raises
`FileNotFoundError` for that path, add the file with the worker's runtime
dependencies, typically mirroring the manager's requirements plus the appropriate
`dag-worker`/`polars-worker` extra.
Where are installer logs written?
Every installer run streams output to the UI and also appends a timestamped log under
`$AGI_LOG_DIR/install_logs`. By default `$AGI_LOG_DIR` is `~/log` (see
`$HOME/.agilab/.env`), so you will find files like
`~/log/install_logs/install_20250921_072751.log` with the full transcript.
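Since the filenames embed a sortable timestamp, the most recent transcript can be located with a few lines of Python. This sketch assumes the layout described above (`$AGI_LOG_DIR/install_logs/install_<timestamp>.log`).

```python
import os
from pathlib import Path

def latest_install_log(log_dir=None):
    """Return the newest install transcript, or None if there are none.
    Assumes timestamped names sort chronologically, as described above."""
    base = Path(log_dir or os.environ.get("AGI_LOG_DIR", Path.home() / "log"))
    logs = sorted((base / "install_logs").glob("install_*.log"))
    return logs[-1] if logs else None
```

This is handy when an installer fails mid-run and you want to tail the transcript it just wrote.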