Manage Conda Environments
You are helping the user list conda environments and work with them to add packages.
Your tasks:
Check if conda is installed:
```shell
which conda
conda --version
conda info
```

If not installed, offer to help install Miniconda or Anaconda.
List all conda environments:
```shell
conda env list
# or
conda info --envs
```

This shows:
- All environment names
- Their locations
- Current active environment (marked with *)
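Since the active environment is only marked with `*` in the listing, its name can be pulled out of the output mechanically. A minimal sketch, using a hypothetical sample of `conda env list` output:

```shell
# Hypothetical sample of `conda env list` output; in practice capture it
# with env_list=$(conda env list)
env_list='# conda environments:
#
base                  *  /home/user/miniconda3
data-project             /home/user/miniconda3/envs/data-project'

# The active environment is the line whose second field is "*"
active=$(printf '%s\n' "$env_list" | awk '$2 == "*" {print $1}')
echo "$active"   # → base
```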
Show current environment:
```shell
echo $CONDA_DEFAULT_ENV
conda info --envs | grep '\*'
```

Display detailed environment information: For each environment, show:
```shell
# List packages in a specific environment
conda list -n <env-name>

# Show environment details
conda env export -n <env-name>

# Show size
du -sh ~/miniconda3/envs/<env-name>
# or
du -sh ~/anaconda3/envs/<env-name>
```

Ask user which environment to work with: Present the list and ask which environment they want to modify or examine.
Activate environment:
```shell
conda activate <env-name>
```

Verify activation:
```shell
conda info --envs
python --version
which python
```

Show packages in environment:
```shell
conda list
# or for a specific environment
conda list -n <env-name>

# Show only explicitly installed packages
conda env export --from-history -n <env-name>
```

Search for packages: Ask what packages the user wants to install:
```shell
conda search <package-name>
conda search <package-name> --info
```

Install packages:
Single package:
```shell
conda install <package-name>
# or specify the environment
conda install -n <env-name> <package-name>
```

Multiple packages:
```shell
conda install <package1> <package2> <package3>
```

Specific version:
```shell
conda install <package-name>=<version>
# Examples:
conda install python=3.11
conda install numpy=1.24.0
```

From a specific channel:
```shell
conda install -c conda-forge <package-name>
```

Suggest common packages by category:
Data Science:
```shell
conda install numpy pandas matplotlib seaborn scikit-learn
conda install jupyter jupyterlab notebook
conda install scipy statsmodels
```

Machine Learning:
```shell
conda install tensorflow pytorch torchvision
conda install keras scikit-learn xgboost
conda install -c conda-forge lightgbm
```

Development:
```shell
conda install ipython black flake8 pytest
conda install requests beautifulsoup4 selenium
conda install flask django fastapi
```

Visualization:
```shell
conda install matplotlib seaborn plotly
conda install bokeh altair
```

Database:
```shell
conda install sqlalchemy psycopg2 pymongo
conda install sqlite
```

Update packages:
Update specific package:
```shell
conda update <package-name>
```

Update all packages in environment:
```shell
conda update --all
```

Update conda itself:
```shell
conda update conda
```

Remove packages:
```shell
conda remove <package-name>
# or from a specific environment
conda remove -n <env-name> <package-name>
```

Create new environment: Offer to create a new environment:
```shell
# Basic environment
conda create -n <env-name> python=3.11

# With packages
conda create -n myenv python=3.11 numpy pandas jupyter

# From a file
conda env create -f environment.yml
```

Export environment: Help the user export an environment for sharing:
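For reference, an environment.yml file (whether written by hand for `conda env create -f` or produced by an export) typically looks like this; the name and package list here are illustrative:

```yaml
# Illustrative environment.yml
name: data-project
channels:
  - conda-forge
  - defaults
dependencies:
  - python=3.11
  - numpy
  - pandas
  - jupyter
```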
Full export (with all dependencies):
```shell
conda env export -n <env-name> > environment.yml
```

Only explicitly installed packages:
```shell
conda env export --from-history -n <env-name> > environment.yml
```

As requirements.txt:
```shell
conda list -n <env-name> --export > requirements.txt
```

Clone environment:
```shell
conda create --name <new-env> --clone <existing-env>
```

Clean up conda:
```shell
# Remove unused packages and caches
conda clean --all

# Remove packages cache
conda clean --packages

# Remove tarballs
conda clean --tarballs

# Check what would be removed
conda clean --all --dry-run
```

Check environment conflicts:
```shell
# Check a package's dependency metadata
conda search <package-name> --info

# Verify the environment can be recreated from its export
conda env export -n <env-name> > /tmp/test-env.yml
conda env create -n test-env -f /tmp/test-env.yml
```

Show environment size:
```shell
# Size of all environments
du -sh ~/miniconda3/envs/*
# or
du -sh ~/anaconda3/envs/*

# Total conda size
du -sh ~/miniconda3
```

Configure conda:
Add channels:
```shell
conda config --add channels conda-forge
conda config --add channels bioconda
```

Set channel priority:
```shell
conda config --set channel_priority strict
```

Show configuration:
```shell
conda config --show
conda config --show channels
```

Remove channel:
```shell
conda config --remove channels <channel-name>
```

Use mamba (faster alternative): If the user has performance issues:
```shell
# Install mamba
conda install mamba -n base -c conda-forge

# Use mamba instead of conda
mamba install <package-name>
mamba search <package-name>
mamba env create -f environment.yml
```

Troubleshooting common issues:
Environment not activating:
```shell
conda init bash
source ~/.bashrc
```

Package conflicts:
```shell
# Create a new environment instead
conda create -n new-env python=3.11 <packages>
```

Slow package resolution:
```shell
# Use mamba
conda install mamba -c conda-forge
# or
conda config --set solver libmamba
```

Conda command not found:
```shell
export PATH="$HOME/miniconda3/bin:$PATH"
conda init bash
```

Best practices to share:
- Create separate environment for each project
- Use environment.yml for reproducibility
- Pin important package versions
- Use conda-forge channel for latest packages
- Regularly clean up with `conda clean --all`
- Don't install packages in the base environment
- Use mamba for faster package resolution
- Export environments before major changes
- Keep Python version explicit in environment
- Use `--from-history` for cross-platform compatibility
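Several of these practices (preferred channels, strict priority, a minimal base) can be captured once in `~/.condarc` rather than repeated per command; an illustrative example:

```yaml
# Illustrative ~/.condarc
channels:
  - conda-forge
  - defaults
channel_priority: strict
auto_activate_base: false   # helps keep base minimal
```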
Show workflow example:
```shell
# Create environment
conda create -n data-project python=3.11

# Activate it
conda activate data-project

# Install packages
conda install numpy pandas jupyter matplotlib scikit-learn

# Verify
conda list

# Export for sharing
conda env export --from-history > environment.yml

# Deactivate when done
conda deactivate
```

Integration with Jupyter:
```shell
# Install ipykernel in the environment
conda activate <env-name>
conda install ipykernel

# Register the environment as a Jupyter kernel
python -m ipykernel install --user --name=<env-name>

# Now available in Jupyter
jupyter lab
```

Report current status: Summarize:
- Number of environments
- Active environment
- Total disk usage
- conda version
- Suggested actions based on user needs
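The first items of this summary can be assembled from captured command output. A sketch, using hypothetical captured values (in practice capture them with `env_list=$(conda env list)` and `conda_version=$(conda --version)`):

```shell
# Hypothetical captured output standing in for real command results
env_list='# conda environments:
#
base                  *  /home/user/miniconda3
data-project             /home/user/miniconda3/envs/data-project
web-api                  /home/user/miniconda3/envs/web-api'
conda_version='conda 24.1.2'

# Count environment lines (everything that is not a comment)
n_envs=$(printf '%s\n' "$env_list" | grep -vc '^#')
# Strip the "conda " prefix to get the bare version number
version=${conda_version#conda }

echo "Environments: $n_envs (conda $version)"
```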
Important notes:
- Always activate environment before installing packages
- Base environment should be kept minimal
- Use `-n <env-name>` to work with environments without activating them
- The conda-forge channel has more packages than the default channels
- mamba is a drop-in replacement for conda and is much faster
- Environment.yml files ensure reproducibility
- Pin versions in production environments
- Clean up regularly to save disk space
- Don't mix pip and conda unless necessary (prefer conda)
- Use `--from-history` when exporting for another OS
- Jupyter needs ipykernel in each environment
- conda init modifies .bashrc - check if needed