Commit e0c2a46

Update vimunet.md
1 parent 3323975 commit e0c2a46

File tree: 1 file changed (+11, -10 lines)

vimunet.md

+11 -10
@@ -1,25 +1,25 @@
-# ViM-UNet: Vision Mamba in Biomedical Segmentation*
+# ViM-UNet: Vision Mamba in Biomedical Segmentation

-We introduce **ViM-UNet**. a novel segmentation architecture based on Vision Mamba for instance segmentation in microscopy.
+We introduce **ViM-UNet**, a novel segmentation architecture based on [Vision Mamba](https://github.com/hustvl/Vim) for instance segmentation in microscopy.

 This is the documentation for the installation instructions, known issues and linked suggestions, benchmarking scripts, and link to the tutorial notebook.

 ## TLDR
 1. Please install [`torch-em`](https://github.com/constantinpape/torch-em) and `ViM` (based on our fork: https://github.com/anwai98/Vim)
 2. Supports `ViM Tiny` and `ViM Small` for 2d segmentation using ViM-UNet.
-3. *More details on the preprint coming soon.
-    - Our observations: "ViM-UNet performs similarly or better that UNet (depending on the task), and outperforms UNETR while being more efficient."
+3. Our preprint on ViM-UNet will be available soon.
+    - The key observation: "ViM-UNet performs similarly or better than UNet (depending on the task), and outperforms UNETR while being more efficient. Its main advantage is for segmentation problems that rely on large context."

 ## Benchmarking Methods

 ### Re-implemented methods in `torch-em`:
-1. [ViM-UNet]()
-2. [UNet]()
-3. [UNETR]()
+1. [ViM-UNet](https://github.com/constantinpape/torch-em/blob/main/torch_em/model/vim.py)
+2. [UNet](https://github.com/constantinpape/torch-em/blob/main/torch_em/model/unet.py)
+3. [UNETR](https://github.com/constantinpape/torch-em/blob/main/torch_em/model/unetr.py)

 ### External methods:

-> [Here](https://github.com/anwai98/vimunet-benchmarking) are the scripts to run the benchmarking for the aforementioned external methods.
+> [Here](https://github.com/anwai98/vimunet-benchmarking) are the scripts to run the benchmarking for the reference methods.

 1. nnU-Net (see [here](https://github.com/MIC-DKFZ/nnUNet) for installation instructions)
 2. U-Mamba (see [here](https://github.com/bowang-lab/U-Mamba#installation) for installation instructions, and [issues]() encountered with our suggestions to take care of them)
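
For orientation, a minimal sketch of how the re-implemented ViM-UNet in `torch-em` might be used for 2d segmentation. The entry point `get_vimunet_model`, its arguments, and the `vim_t`/`vim_s` identifiers for ViM Tiny and ViM Small are assumptions inferred from the linked `torch_em/model/vim.py`; check that module for the actual API.

```python
# Sketch only: the import path, the `get_vimunet_model` entry point, and the
# "vim_t" / "vim_s" model identifiers are assumptions and may differ in your
# torch-em version.
import torch
from torch_em.model import get_vimunet_model  # assumed entry point

# ViM Tiny backbone; "vim_s" would select ViM Small.
model = get_vimunet_model(out_channels=1, model_type="vim_t")

# The mamba kernels need a CUDA device for the forward pass.
model = model.to("cuda")
x = torch.randn(1, 1, 512, 512, device="cuda")  # one single-channel 2d patch
with torch.no_grad():
    y = model(x)
print(y.shape)  # expected to match the input spatial shape, e.g. (1, 1, 512, 512)
```
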
@@ -40,7 +40,7 @@ $ pip install torch==2.1.1 torchvision==0.16.1 torchaudio==2.1.1 --index-url htt
 ```
 > Q1. Why use `pip`? - for installation consistency

-> Q2. Why choose CUDA 11.8? - Vim seems to prefer $\le$ 11.8 ([hint](https://github.com/hustvl/Vim/issues/51))
+> Q2. Why choose CUDA 11.8? - Vim seems to prefer $\le$ 11.8 ([see here](https://github.com/hustvl/Vim/issues/51))

 4. Install `ViM` and related dependencies (`causal-conv1d`\**, `mamba`, `Vim`\***):
 ```bash
@@ -53,6 +53,7 @@ $ pip install -e .
 ```

 > NOTE: The installation is sometimes a bit tricky, but following the steps and keeping the footnotes in mind should do the trick.
+> We are working on providing an easier and more stable installation, [see this issue](https://github.com/constantinpape/torch-em/issues/237).

 ### For UNet and UNETR

@@ -77,4 +78,4 @@ $ pip install -e .
 - Suggestion: This one's a bit tricky. From our findings, the possible issue is that the path to `CUDA_HOME` isn't visible to the installed PyTorch. The quickest way to test this is: `python -c "from torch.utils.cpp_extension import CUDA_HOME; print(CUDA_HOME)"`. It's often stored at `/usr/local/cuda`, hence to expose the path, here's the example script: `export CUDA_HOME=/usr/local/cuda`.
 > NOTE: If you are using your cluster's cuda installation and not sure where is it located, this should do the trick: `module show cuda/$VERSION`

-- ***Remember to install the suggested `ViM` branch for installation. It's important as we enable a few changes to: a) automatically install the vision mamba as a developer module, and b) setting AMP to false for known issues (see [mention 1](https://github.com/hustvl/Vim/issues/30) and [mention 2](https://github.com/bowang-lab/U-Mamba/issues/8) for hints)
+- ***Remember to install the suggested `ViM` branch for installation. It's important as we enable a few changes to: a) automatically install the vision mamba as a developer module, and b) setting AMP to false for known issues (see [mention 1](https://github.com/hustvl/Vim/issues/30) and [mention 2](https://github.com/bowang-lab/U-Mamba/issues/8) for hints)
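
To make the `CUDA_HOME` suggestion above concrete, a small Python sketch of the check it describes; the `/usr/local/cuda` location and the `cuda/$VERSION` module name are only the common defaults mentioned above, not guaranteed for your system.

```python
# Check (as suggested above) whether the installed PyTorch can see the CUDA
# toolkit; CUDA_HOME is None when the path is not visible to PyTorch.
from torch.utils.cpp_extension import CUDA_HOME

print(CUDA_HOME)
# If this prints None, expose the toolkit before building causal-conv1d/mamba,
# e.g. (common default location, adjust for your system or cluster module):
#   export CUDA_HOME=/usr/local/cuda
#   module show cuda/$VERSION   # on a cluster, to find where CUDA is located
```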
