DeBCR

DeBCR is a sparsity-efficient deep learning framework for denoising, deblurring, and deconvolving fluorescence microscopy images. Using the m-rBCR model, it restores structural details and improves resolution. It is available as a Python package and napari plugin for efficient image restoration.

DeBCR is a Python-based framework for light microscopy data enhancement, including denoising and deconvolution.

As an enhancement core, DeBCR implements a multi-scale sparsity-efficient deep learning model m-rBCR.

As a framework, DeBCR provides several user interfaces:

  • a Python library (API), usable interactively via Jupyter notebooks;
  • a napari plugin for interactive image restoration.

How to cite us

Li, R., Yushkevich, A., Chu, X., Kudryashev, M. and Yakimovich, A., 2026. DeBCR: a sparsity-efficient framework for image enhancement through a deep-learning-based solution to inverse problems. Communications Engineering.

@article{li2026debcr,
  title={DeBCR: a sparsity-efficient framework for image enhancement through a deep-learning-based solution to inverse problems},
  author={Li, Rui and Yushkevich, Artsemi and Chu, Xiaofeng and Kudryashev, Mikhail and Yakimovich, Artur},
  journal={Communications Engineering},
  year={2026},
  publisher={Nature Publishing Group UK London}
}

License

This is an open-source project licensed under the MIT license.

Contact

For any questions or bug reports on debcr, please use the dedicated GitHub Issue Tracker.

Installation

There are two hardware-based installation options for debcr:

  • debcr[tf-gpu] - for GPU-based training and prediction (recommended);
  • debcr[tf-cpu] - for CPU-only execution (note: training on CPUs might be quite slow!).

GPU prerequisites

For a GPU version you need:

  • a GPU device with at least 12 GB of VRAM;
  • a compatible CUDA Toolkit (recommended: CUDA-11.7);
  • a compatible cuDNN library (recommended: v8.4.0 for CUDA-11.x from the cuDNN archive).

For more info on GPU dependencies please check our GPU-advice page.

Create a package environment (optional)

For a clean, isolated installation, we advise using a Python package environment manager, for example micromamba:

Create an environment for debcr using

micromamba env create -n debcr python=3.9 -y

and activate it for further installation or usage by

micromamba activate debcr

Install DeBCR

Install one of the DeBCR versions:

  • GPU (recommended; backend: TensorFlow-GPU-v2.11):
    pip install 'debcr[tf-gpu]'
    
  • CPU (limited; backend: TensorFlow-CPU-v2.11):
    pip install 'debcr[tf-cpu]'
    

Test GPU visibility

For a GPU version installation, it is recommended to check if your GPU device is recognised by TensorFlow using

python -c "import tensorflow as tf; print(tf.config.list_physical_devices('GPU'))"

which for a single GPU device should produce output similar to the following:

[PhysicalDevice(name='/physical_device:GPU:0', device_type='GPU')]

If your GPU device list is empty, please check our GPU-advice page.

Install Jupyter

To use debcr as a Python library (API) interactively, please also install Jupyter Notebook/Lab, for example

pip install jupyterlab

Usage

To learn how to use debcr as a Python library (API) interactively, follow our notebook tutorials:

| Notebook tutorial | Purpose | Hardware | Inputs |
| --- | --- | --- | --- |
| debcr_predict.ipynb | enhanced prediction | CPU/GPU | pre-processed input data (NPZ/NPY), trained DeBCR model |
| debcr_train.ipynb | model training | GPU | training/validation data (NPZ/NPY) |
| debcr_preproc.ipynb | raw data pre-processing | CPU | raw data (TIF/TIFF, JPG/JPEG, PNG) |

To use these notebooks,

  1. activate the debcr environment, if it is not already active:
micromamba activate debcr
  2. start a Jupyter session at the notebooks' location (download them from the DeBCR GitHub):
jupyter-lab
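The raw-data pre-processing step covered by debcr_preproc.ipynb can be sketched in plain NumPy. The percentile normalization, patch size, and NPZ key name below are illustrative assumptions, not the actual debcr helper functions:

```python
import numpy as np

def normalize_percentile(img, p_low=0.1, p_high=99.9):
    """Rescale intensities to [0, 1] between low/high percentiles (illustrative)."""
    lo, hi = np.percentile(img, [p_low, p_high])
    return np.clip((img - lo) / (hi - lo + 1e-8), 0.0, 1.0)

def crop_patches(img, patch=64):
    """Tile an image into non-overlapping square patches (illustrative)."""
    h, w = img.shape
    patches = [
        img[y:y + patch, x:x + patch]
        for y in range(0, h - patch + 1, patch)
        for x in range(0, w - patch + 1, patch)
    ]
    return np.stack(patches)

# Example: one raw 256x256 frame -> normalized 64x64 patches saved as NPZ
raw = np.random.default_rng(0).integers(0, 4096, size=(256, 256)).astype(np.float32)
patches = crop_patches(normalize_percentile(raw))
np.savez("train_data.npz", data=patches)  # the key name 'data' is an assumption
```

The actual notebook additionally handles TIFF/JPG/PNG loading and train/validation splitting; the sketch only shows the core array transformations.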

Example data and trained model weights

Based on several previously published datasets (from CARE, DeepBacs, and TA-GAN), we prepared four example datasets and trained m-rBCR model weights to both evaluate our model and serve as the example data/weights for notebook tutorials.

The datasets are distributed as NumPy (.npz) arrays in three essential sets (train, validation and test), available along with the trained model weights on Zenodo: 10.5281/zenodo.12626121.
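The NPZ splits can be inspected with plain NumPy; the array key `data` used below is an assumption, since the actual key names inside the Zenodo archives may differ:

```python
import numpy as np

# Create a stand-in for one of the distributed splits (the real files come from Zenodo)
rng = np.random.default_rng(0)
np.savez("test_split.npz", data=rng.random((8, 64, 64), dtype=np.float32))

# Load a split and inspect its contents
with np.load("test_split.npz") as npz:
    print(list(npz.keys()))        # available array names
    data = npz["data"]
    print(data.shape, data.dtype)  # (8, 64, 64) float32
```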

About the model

The core DeBCR enhancement model m-rBCR approximates imaging process inversion with a deep convolutional neural network (DCNN), based on the compact BCR representation (Beylkin G. et al., Comm. Pure Appl. Math., 1991) for convolutions and its DCNN implementation as proposed in BCR-Net (Fan Y. et al., J. Comput. Phys., 2019):

DeBCR network structure
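For intuition on the inverse problem being learned, the classical non-learned baseline is Wiener deconvolution. The NumPy sketch below is purely illustrative and is not part of the m-rBCR model:

```python
import numpy as np

def pad_psf(psf, shape):
    """Zero-pad the PSF to the image shape and roll its centre to the origin."""
    psf_pad = np.zeros(shape)
    kh, kw = psf.shape
    psf_pad[:kh, :kw] = psf
    return np.roll(psf_pad, (-(kh // 2), -(kw // 2)), axis=(0, 1))

def fft_convolve(img, psf_pad):
    """Circular convolution via FFT with a pre-padded, origin-centred PSF."""
    return np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(psf_pad)))

def wiener_deconvolve(blurred, psf_pad, reg=1e-3):
    """Wiener-regularized inverse filter: X = conj(H) * Y / (|H|^2 + reg)."""
    H = np.fft.fft2(psf_pad)
    Y = np.fft.fft2(blurred)
    return np.real(np.fft.ifft2(np.conj(H) * Y / (np.abs(H) ** 2 + reg)))

# Blur a toy image with a Gaussian PSF, then invert the blur
rng = np.random.default_rng(0)
img = rng.random((64, 64))
y, x = np.mgrid[-3:4, -3:4]
psf = np.exp(-(x**2 + y**2) / 2.0)
psf /= psf.sum()
psf_pad = pad_psf(psf, img.shape)
blurred = fft_convolve(img, psf_pad)
restored = wiener_deconvolve(blurred, psf_pad)
```

Such linear filters amplify noise wherever the PSF spectrum is small, which is the limitation that learned approaches like m-rBCR aim to overcome.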

In contrast to the traditional single-stage residual BCR learning process, the core DeBCR model integrates feature maps from multiple resolution levels:

DeBCR multi-resolution
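The multi-resolution idea can be illustrated with a simple residual pyramid in NumPy (a loose analogy only, not the m-rBCR architecture): each level stores the detail lost by 2x downsampling, and the full image is recovered by re-adding the details coarse-to-fine.

```python
import numpy as np

def decompose(x, levels=3):
    """Residual pyramid: store per-level detail plus the coarsest image."""
    pyramid = []
    for _ in range(levels):
        h, w = x.shape
        down = x.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))  # 2x average-pool
        up = down.repeat(2, axis=0).repeat(2, axis=1)             # nearest upsample
        pyramid.append(x - up)  # detail lost at this resolution level
        x = down
    pyramid.append(x)  # coarsest approximation
    return pyramid

def reconstruct(pyramid):
    """Invert decompose() exactly by re-adding details coarse-to-fine."""
    x = pyramid[-1]
    for detail in reversed(pyramid[:-1]):
        x = x.repeat(2, axis=0).repeat(2, axis=1) + detail
    return x

img = np.random.default_rng(0).random((64, 64))
pyr = decompose(img)
```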

An example of DeBCR performance on low/high-exposure confocal data of a Tribolium castaneum sample from the CARE work (Weigert et al., Nat. Methods, 2018) is shown below:

DeBCR LM

For more details on the multi-stage residual BCR (m-rBCR) architecture implemented within the DeBCR framework, see:

Li, R., Kudryashev, M., Yakimovich, A. Solving the Inverse Problem of Microscopy Deconvolution with a Residual Beylkin-Coifman-Rokhlin Neural Network. ECCV 2024, Lecture Notes in Computer Science, vol 15133. Springer, Cham. https://doi.org/10.1007/978-3-031-73226-3_22
