---
language:
- en
license: apache-2.0
library_name: atommic
datasets:
- CC359
thumbnail: null
tags:
- image-reconstruction
- LPDNet
- ATOMMIC
- pytorch
model-index:
- name: REC_LPDNet_CC359_12_channel_poisson2d_5x_10x_NNEstimationCSM
  results: []
---

## Model Overview

Learned Primal Dual Network (LPDNet) for 5x & 10x accelerated MRI reconstruction on the CC359 dataset.

## ATOMMIC: Training

To train, fine-tune, or test the model you will need to install [ATOMMIC](https://github.com/wdika/atommic). We recommend you install it after you have installed the latest PyTorch version.

```
pip install atommic['all']
```

## How to Use this Model

The model is available for use in ATOMMIC, and can be used as a pre-trained checkpoint for inference or for fine-tuning on another dataset.

Corresponding configuration YAML files can be found [here](https://github.com/wdika/atommic/tree/main/projects/REC/CC359/conf).

### Automatically instantiate the model

```yaml
pretrained: true
checkpoint: https://huggingface.co/wdika/REC_LPDNet_CC359_12_channel_poisson2d_5x_10x_NNEstimationCSM/blob/main/REC_LPDNet_CC359_12_channel_poisson2d_5x_10x_NNEstimationCSM.atommic
mode: test
```

### Usage

You need to download the CC359 dataset to use this model effectively. Check the [CC359](https://github.com/wdika/atommic/blob/main/projects/REC/CC359/README.md) page for more information.

## Model Architecture

```yaml
model:
  model_name: LPDNet
  num_primal: 5
  num_dual: 5
  num_iter: 5
  primal_model_architecture: UNET
  primal_in_channels: 2
  primal_out_channels: 2
  primal_unet_num_filters: 16
  primal_unet_num_pool_layers: 2
  primal_unet_dropout_probability: 0.0
  primal_unet_padding_size: 11
  primal_unet_normalize: true
  dual_model_architecture: UNET
  dual_in_channels: 2
  dual_out_channels: 2
  dual_unet_num_filters: 16
  dual_unet_num_pool_layers: 2
  dual_unet_dropout_probability: 0.0
  dual_unet_padding_size: 11
  dual_unet_normalize: true
  dimensionality: 2
  reconstruction_loss:
    l1: 0.1
    ssim: 0.9
  estimate_coil_sensitivity_maps_with_nn: true
```

## Training

```yaml
optim:
  name: adamw
  lr: 1e-4
  betas:
  - 0.9
  - 0.999
  weight_decay: 0.0
  sched:
    name: CosineAnnealing
    min_lr: 0.0
    last_epoch: -1
    warmup_ratio: 0.1

trainer:
  strategy: ddp_find_unused_parameters_false
  accelerator: gpu
  devices: 1
  num_nodes: 1
  max_epochs: 20
  precision: 16-mixed
  enable_checkpointing: false
  logger: false
  log_every_n_steps: 50
  check_val_every_n_epoch: -1
  max_steps: -1
```

## Performance

To compute the targets from the raw k-space data with the chosen coil combination method and the chosen coil sensitivity maps estimation method, you can use the [targets](https://github.com/wdika/atommic/tree/main/projects/REC/CC359/conf/targets) configuration files.

Evaluation can be performed using the [evaluation](https://github.com/wdika/atommic/blob/main/tools/evaluation/reconstruction.py) script for the reconstruction task, with `--evaluation_type per_slice`. A sketch of the corresponding commands is provided at the end of this card.

Results
-------

Evaluation against RSS targets
------------------------------

| Acceleration | MSE | NMSE | PSNR | SSIM |
|---|---|---|---|---|
| 5x | 0.001668 +/- 0.001584 | 0.02567 +/- 0.0265 | 28.26 +/- 4.222 | 0.8493 +/- 0.07524 |
| 10x | 0.002367 +/- 0.002247 | 0.03687 +/- 0.03859 | 26.73 +/- 4.229 | 0.8096 +/- 0.09866 |

## Limitations

This model was trained on the CC359 dataset using a UNet-based coil sensitivity maps estimation method, so its results may differ from those reported on the challenge leaderboard.
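For orientation, below is a minimal sketch of how a test run and the per-slice evaluation mentioned above might be launched. It assumes the `atommic run -c <config>` entry point; the config file name, the output and target paths, and the path arguments to the evaluation script are hypothetical. Only `--evaluation_type per_slice` is taken from this card, so verify the exact interface against the linked configuration files and the evaluation script.

```bash
# Sketch only: the config file name and the directory paths below are
# assumptions; check the ATOMMIC repository for the actual names.

# 1) Run the model in test mode with an LPDNet CC359 configuration
#    (pretrained, checkpoint, and mode set as in the YAML snippet above).
atommic run -c projects/REC/CC359/conf/test/REC_LPDNet_CC359.yaml

# 2) Score the saved reconstructions against the RSS targets, per slice.
#    The order and names of the path arguments are assumptions.
python tools/evaluation/reconstruction.py \
    predictions/ targets/ \
    --evaluation_type per_slice
```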
## References

[1] [ATOMMIC](https://github.com/wdika/atommic)

[2] Beauferris, Y., Teuwen, J., Karkalousos, D., Moriakov, N., Caan, M., Yiasemis, G., Rodrigues, L., Lopes, A., Pedrini, H., Rittner, L., Dannecker, M., Studenyak, V., Gröger, F., Vyas, D., Faghih-Roohi, S., Kumar Jethi, A., Chandra Raju, J., Sivaprakasam, M., Lasby, M., … Souza, R. (2022). Multi-Coil MRI Reconstruction Challenge—Assessing Brain MRI Reconstruction Models and Their Generalizability to Varying Coil Configurations. Frontiers in Neuroscience, 16. https://doi.org/10.3389/fnins.2022.919186