Choose your installation path based on what you want to explore:
This installation is for exploring HM3D house environments with our finetuned language model. This is a simpler setup that installs Habitat-Sim from conda-forge (no building from source) and doesn't require the mesh pipeline or segmentation.
- Conda (Miniconda or Anaconda)
- Git
Run the quick setup script:
```bash
git clone https://github.com/rzninvo/CNSG.git
cd CNSG
bash scripts/install_hm3d.sh
```

This script will:
- Initialize git submodules
- Create the `habitat-default` conda environment
- Install Habitat-Sim from conda-forge (auto-detects headless mode)
- Install Habitat-Lab
- Install GUI and audio dependencies
- Setup LoRA adapter weights directory
- Optionally download the HM3D dataset
During installation, you'll be prompted to download the HM3D dataset. You'll need a Matterport API Token from https://my.matterport.com/settings/account/devtools.
If you skip this during installation, you can download it later:
```bash
conda activate habitat-default
python -m habitat_sim.utils.datasets_download --username <api-token-id> --password <api-token-secret> --uids hm3d_minival_v0.2
```

Activate the `habitat-default` environment and run the viewer:
```bash
conda activate habitat-default
cd habitat-sim
python examples/mr_viewer.py --backend=local --finetuned-model=True
```

The base model (`microsoft/Phi-3-mini-4k-instruct`) will be downloaded automatically from Hugging Face on first run.
Use W/A/S/D keys to move and arrow keys or mouse to look around.
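If the viewer cannot find any scenes, it helps to confirm that the HM3D download actually landed on disk. A minimal sketch, assuming the scenes live under `data/scene_datasets/hm3d_minival` (the exact path depends on where `datasets_download` was run; the helper name is hypothetical):

```python
from pathlib import Path

# Hypothetical helper: verify the HM3D minival download produced scene files.
# The default path below is an assumption; check the output of
# datasets_download for the actual location on your machine.
def hm3d_minival_present(data_root: str = "data/scene_datasets/hm3d_minival") -> bool:
    root = Path(data_root)
    # HM3D scenes ship as .basis.glb files, one per scene directory.
    return root.is_dir() and any(root.rglob("*.basis.glb"))

if __name__ == "__main__":
    print(hm3d_minival_present())
```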
Local Model (finetuned) - Recommended:

```bash
# Requires LoRA adapter weights (see LoRA section below)
python examples/mr_viewer.py --backend=local --finetuned-model=True
```

Local Model (base):

```bash
python examples/mr_viewer.py --backend=local
```

OpenAI Backend:

```bash
# Create a .env file with your API key in the project root
echo "OPENAI_API_KEY=your_api_key_here" > .env
python examples/mr_viewer.py --backend=openai
```

Important: The `habitat-default` environment uses Habitat-Sim from conda-forge and is specifically for HM3D datasets. Do not use this environment with the ETH HG E floor scene.
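The OpenAI backend expects `OPENAI_API_KEY` in a `.env` file at the project root. How the viewer loads it is repo-specific (it may use python-dotenv); this hand-rolled sketch only illustrates the idea, handling simple `KEY=value` lines:

```python
import os
from pathlib import Path

# Minimal .env loader sketch (assumption: the repo may use python-dotenv
# instead; this version ignores comments, blank lines, and malformed lines).
def load_dotenv(path: str = ".env") -> None:
    env_file = Path(path)
    if not env_file.is_file():
        return
    for line in env_file.read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        # Don't clobber variables already set in the real environment.
        os.environ.setdefault(key.strip(), value.strip())

load_dotenv()
# Backends can then read os.environ.get("OPENAI_API_KEY").
```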
This installation is for exploring the ETH HG E floor academic building with full semantic segmentation capabilities. This requires building Habitat-Sim from source and running a mesh processing pipeline.
- Conda (Miniconda or Anaconda)
- Git
- Docker (for the localization pipeline)
Run the automated installation script:
```bash
git clone https://github.com/rzninvo/CNSG.git
cd CNSG
bash scripts/install.sh
```

This script will:
- Initialize git submodules
- Create and configure the `habitat-source` conda environment
- Build and install Habitat-Sim with Bullet physics from source
- Install habitat-lab
- Create and configure the `CNSG-meshing` conda environment for the mesh pipeline
- Download required mesh data
- Optionally run the segmentation pipeline for ETH HG E floor
After installation, you'll have two conda environments:
```bash
# For Habitat-Sim built from source (ETH HG E floor)
conda activate habitat-source

# For the mesh pipeline (3D reconstruction and segmentation)
conda activate CNSG-meshing
```

The localization pipeline requires a Docker container:

```bash
cd mesh_pipeline/third_party/lamar-benchmark
docker build --target lamar -t lamar:lamar -f Dockerfile ./
```

Activate the `habitat-source` environment and run the viewer with the HGE scene:
```bash
conda activate habitat-source
cd habitat-sim
python examples/mr_viewer.py --scene ./data/scene_datasets/HGE/HGE.basis.glb --dataset data/scene_datasets/HGE.scene_dataset_config.json
```

Use the W/A/S/D keys to move and the arrow keys or mouse to look around. Press K to toggle semantic visualization.
By default, the viewer uses OpenAI as the backend. You can choose different backends:
OpenAI Backend (default):

```bash
# Create a .env file with your API key in the project root
echo "OPENAI_API_KEY=your_api_key_here" > .env
python examples/mr_viewer.py --scene ./data/scene_datasets/HGE/HGE.basis.glb --dataset data/scene_datasets/HGE.scene_dataset_config.json --backend=openai
```

Local Model (base):

```bash
python examples/mr_viewer.py --scene ./data/scene_datasets/HGE/HGE.basis.glb --dataset data/scene_datasets/HGE.scene_dataset_config.json --backend=local
```

Local Model (finetuned):

```bash
# Requires LoRA adapter weights (see LoRA section below)
python examples/mr_viewer.py --scene ./data/scene_datasets/HGE/HGE.basis.glb --dataset data/scene_datasets/HGE.scene_dataset_config.json --backend=local --finetuned-model=True
```

Important: The `habitat-source` environment is specifically built for the ETH HG E floor with semantic segmentation. Do not use this environment with HM3D datasets.
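The scene/environment pairing described in both warnings can be guarded with a small check before launch. This is a hypothetical helper, not part of the repo; it relies on the `CONDA_DEFAULT_ENV` variable that conda sets in activated shells:

```python
import os

# Hypothetical guard: warn if the active conda env doesn't match the scene.
# The mapping mirrors the install docs: HGE needs the source build,
# HM3D scenes need the conda-forge build.
SCENE_ENVS = {
    "HGE": "habitat-source",
    "hm3d": "habitat-default",
}

def check_env(scene_kind: str) -> bool:
    active = os.environ.get("CONDA_DEFAULT_ENV", "")
    expected = SCENE_ENVS.get(scene_kind)
    if active != expected:
        print(f"Warning: scene '{scene_kind}' expects env '{expected}', "
              f"but '{active or '(none)'}' is active.")
        return False
    return True
```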
To use the finetuned model with either installation option, download the LoRA adapter weights:
- Download the weights from https://huggingface.co/FBondi/phi3-mr-lora-weights
- Create the target directory: `mkdir -p finetuning/phi3-mr-lora-fixed-v3`
- Place the downloaded files in `finetuning/phi3-mr-lora-fixed-v3/`
The directory must contain:

- `adapter_config.json`
- `adapter_model.safetensors`

The application will load the adapter at runtime from `finetuning/phi3-mr-lora-fixed-v3/`.
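Before launching with `--finetuned-model=True`, you can verify the adapter directory is complete. A small sketch (the helper name is hypothetical; the file names are the two required files listed above):

```python
from pathlib import Path

# Required adapter files, per the LoRA setup instructions above.
REQUIRED = ("adapter_config.json", "adapter_model.safetensors")

def adapter_ready(adapter_dir: str = "finetuning/phi3-mr-lora-fixed-v3") -> bool:
    root = Path(adapter_dir)
    missing = [name for name in REQUIRED if not (root / name).is_file()]
    if missing:
        print("Missing adapter files:", ", ".join(missing))
        return False
    return True
```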