Frequently Asked Questions#
Below is a compilation of common issues you may encounter when getting started with FluxVLA. You are also welcome to ask questions via the 🦞 Lobster Assistant in the upper-right corner; it will answer your questions and collect feedback.
conda takes a very long time to install av#
If the installation hangs at the “Solving environment” step for an extended period, try the following command instead:
conda install -c conda-forge av=14.4.0 --solver=libmamba
Common Transformers Installation Issues#
FluxVLA depends on the Hugging Face transformers library, but this dependency is not included in requirements.txt and must be installed manually. Because different models require different transformers versions, version conflicts are common during installation.
1. Recommended Installation#
Following the README instructions, install transformers separately after installing FluxVLA:
pip install transformers==4.53.0
2. Version Requirements by Model#
Different models expect different transformers versions in their code or configuration:
| Model | Recommended Version | Notes |
|---|---|---|
| OpenVLA / dinosiglip-qwen2_5 | transformers==4.40.1 | Has explicit version checks in code; also requires tokenizers==0.19.1 |
| Pi0 / Pi0.5 / Gr00t / LlavaVLA etc. | transformers==4.53.0 | Use the README-recommended version |
| Tron2 deployment | | See the Tron2 inference deployment documentation |
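To confirm that your environment matches the pin for the model you want to run, a small standard-library check like the following can help. This is an illustrative sketch, not part of FluxVLA; the pins mirror the table above, and Tron2 is omitted since its version is covered in its deployment documentation.

```python
from importlib.metadata import version, PackageNotFoundError

# Expected pins, mirroring the table above (illustrative helper only).
EXPECTED = {
    "openvla": "4.40.1",
    "pi0": "4.53.0",
}

def check_transformers(model_family: str) -> str:
    """Compare the installed transformers version against the pin
    for *model_family* and return a short status message."""
    expected = EXPECTED[model_family]
    try:
        installed = version("transformers")
    except PackageNotFoundError:
        return f"transformers is not installed (need {expected})"
    if installed == expected:
        return f"OK: transformers=={installed}"
    return f"Mismatch: have {installed}, expected {expected}"
```

Run it once per environment before launching training or inference; a mismatch message tells you which version to reinstall.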
3. Common Issues and Solutions#
Issue 1: Version warning when using OpenVLA
Expected `transformers==4.40.1` and `tokenizers==0.19.1` but got ...
there might be inference-time regressions due to dependency changes.
This occurs because the OpenVLA pretrained model was built with transformers==4.40.1. If you primarily use OpenVLA, consider downgrading:
pip install transformers==4.40.1 tokenizers==0.19.1
Note: After downgrading to 4.40.1, other models (such as Pi0, Gr00t, etc.) may not work properly. If you need to use multiple models simultaneously, it is recommended to create separate Conda environments for each.
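A per-model setup along these lines is one way to keep the pins isolated. The environment names and the Python version are illustrative assumptions, not FluxVLA requirements; install FluxVLA itself inside each environment as described in the README.

```shell
# Illustrative sketch: one Conda environment per transformers pin.
# Environment names and python=3.10 are assumptions, not requirements.
conda create -n fluxvla-openvla python=3.10 -y
conda activate fluxvla-openvla
pip install transformers==4.40.1 tokenizers==0.19.1

conda create -n fluxvla-pi0 python=3.10 -y
conda activate fluxvla-pi0
pip install transformers==4.53.0
```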
Issue 2: pip install transformers upgrades other dependencies
When installing transformers, pip may automatically upgrade packages like numpy, tokenizers, and huggingface-hub, causing conflicts with other FluxVLA dependencies. The recommended installation order is:
# 1. Install FluxVLA and its dependencies first
pip install -r requirements.txt
python setup.py develop
# 2. Then install transformers
pip install transformers==4.53.0
# 3. Finally fix the numpy version
pip install numpy==1.26.4
Issue 3: ImportError or AttributeError
If you encounter errors like:
ImportError: cannot import name 'XXX' from 'transformers'
AttributeError: module 'transformers' has no attribute 'XXX'
This is typically caused by an incompatible transformers version (too old or too new). Verify your current version and reinstall the target version:
python -c "import transformers; print(transformers.__version__)"
pip install transformers==<target_version>
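If you want to check programmatically whether the symbol your model code imports actually exists in the installed transformers, a small helper like this (hypothetical, not part of FluxVLA) can narrow the problem down before a full run:

```python
import importlib

def has_symbol(module_name: str, symbol: str) -> bool:
    """Return True if *module_name* is importable and exposes *symbol*."""
    try:
        module = importlib.import_module(module_name)
    except ImportError:
        return False
    return hasattr(module, symbol)

# e.g. has_symbol("transformers", "AutoModel") -> False would point to
# the installed transformers version, not to FluxVLA code.
```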
Issue 4: Installing transformers from source
When the pip release does not include the fixes you need, you can install from source, pinning whichever tag, branch, or commit is required:
pip install git+https://github.com/huggingface/transformers.git@v4.53.0
How to run Libero evaluation on devices without ray tracing (e.g., A100)?#
To enable Libero evaluation on devices without ray tracing support (such as A100), refer to the GPU Rendering on EGL Devices guide.
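Libero's simulator is MuJoCo-based, and MuJoCo supports headless GPU rendering through EGL, which is typically enabled via environment variables. The variables below are the common MuJoCo/PyOpenGL switches, shown here as an assumed sketch; confirm the project-specific steps in the linked guide.

```shell
# Assumed headless-EGL setup (verify against the
# "GPU Rendering on EGL Devices" guide):
export MUJOCO_GL=egl
export PYOPENGL_PLATFORM=egl
```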
What is the difference between FluxVLA’s open-source models (e.g., Pi0.5 and GR00T) and the official versions?#
The model architectures provided by FluxVLA are fully aligned with the official implementations. We also provide FluxVLA-compatible model weights corresponding to the official weights. These weights have been validated in both simulation and real-robot environments to ensure alignment with the official models.
Does FluxVLA support VLM training?#
Yes, VLM training is supported. However, the current version does not yet support mixed training with text data and robot data. This feature will be added in future updates.