guohanghui committed · verified
Commit 3e86bfa · 1 Parent(s): 56685f8

Upload 1525 files

This view is limited to 50 files because it contains too many changes.

Files changed (50)
  1. .gitattributes +38 -0
  2. Dockerfile +18 -0
  3. MONAI/mcp_output/README_MCP.md +51 -0
  4. MONAI/mcp_output/analysis.json +0 -0
  5. MONAI/mcp_output/diff_report.md +61 -0
  6. MONAI/mcp_output/mcp_plugin/__init__.py +0 -0
  7. MONAI/mcp_output/mcp_plugin/adapter.py +181 -0
  8. MONAI/mcp_output/mcp_plugin/main.py +13 -0
  9. MONAI/mcp_output/mcp_plugin/mcp_service.py +139 -0
  10. MONAI/mcp_output/requirements.txt +7 -0
  11. MONAI/mcp_output/start_mcp.py +30 -0
  12. MONAI/mcp_output/workflow_summary.json +228 -0
  13. MONAI/source/.clang-format +88 -0
  14. MONAI/source/.coderabbit.yaml +53 -0
  15. MONAI/source/.deepsource.toml +27 -0
  16. MONAI/source/.dockerignore +13 -0
  17. MONAI/source/.pre-commit-config.yaml +61 -0
  18. MONAI/source/.readthedocs.yml +28 -0
  19. MONAI/source/CHANGELOG.md +1293 -0
  20. MONAI/source/CITATION.cff +139 -0
  21. MONAI/source/CODE_OF_CONDUCT.md +76 -0
  22. MONAI/source/CONTRIBUTING.md +417 -0
  23. MONAI/source/Dockerfile +66 -0
  24. MONAI/source/LICENSE +201 -0
  25. MONAI/source/MANIFEST.in +5 -0
  26. MONAI/source/README.md +96 -0
  27. MONAI/source/SECURITY.md +18 -0
  28. MONAI/source/__init__.py +4 -0
  29. MONAI/source/docs/.readthedocs.yaml +14 -0
  30. MONAI/source/docs/Makefile +29 -0
  31. MONAI/source/docs/_static/custom.css +4 -0
  32. MONAI/source/docs/images/3d_paired.png +0 -0
  33. MONAI/source/docs/images/BTCV_organs.png +3 -0
  34. MONAI/source/docs/images/MONAI-logo-color.png +0 -0
  35. MONAI/source/docs/images/MONAI_arch.png +0 -0
  36. MONAI/source/docs/images/MONAI_bundle_cloud.png +3 -0
  37. MONAI/source/docs/images/MONAI_clouds.png +0 -0
  38. MONAI/source/docs/images/MONAI_map_cloud.png +3 -0
  39. MONAI/source/docs/images/UNETR.png +3 -0
  40. MONAI/source/docs/images/affine.png +3 -0
  41. MONAI/source/docs/images/amp_training_a100.png +3 -0
  42. MONAI/source/docs/images/amp_training_v100.png +3 -0
  43. MONAI/source/docs/images/arch_modules.png +3 -0
  44. MONAI/source/docs/images/auto3dseg.png +3 -0
  45. MONAI/source/docs/images/blend.png +0 -0
  46. MONAI/source/docs/images/blend_images.png +0 -0
  47. MONAI/source/docs/images/brats_distributed.png +3 -0
  48. MONAI/source/docs/images/cache_dataset.png +3 -0
  49. MONAI/source/docs/images/cam.png +0 -0
  50. MONAI/source/docs/images/coplenet.png +3 -0
.gitattributes CHANGED
@@ -33,3 +33,41 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
  *.zip filter=lfs diff=lfs merge=lfs -text
  *.zst filter=lfs diff=lfs merge=lfs -text
  *tfevents* filter=lfs diff=lfs merge=lfs -text
+ MONAI/source/docs/images/affine.png filter=lfs diff=lfs merge=lfs -text
+ MONAI/source/docs/images/amp_training_a100.png filter=lfs diff=lfs merge=lfs -text
+ MONAI/source/docs/images/amp_training_v100.png filter=lfs diff=lfs merge=lfs -text
+ MONAI/source/docs/images/arch_modules.png filter=lfs diff=lfs merge=lfs -text
+ MONAI/source/docs/images/auto3dseg.png filter=lfs diff=lfs merge=lfs -text
+ MONAI/source/docs/images/brats_distributed.png filter=lfs diff=lfs merge=lfs -text
+ MONAI/source/docs/images/BTCV_organs.png filter=lfs diff=lfs merge=lfs -text
+ MONAI/source/docs/images/cache_dataset.png filter=lfs diff=lfs merge=lfs -text
+ MONAI/source/docs/images/coplenet.png filter=lfs diff=lfs merge=lfs -text
+ MONAI/source/docs/images/deepedit.png filter=lfs diff=lfs merge=lfs -text
+ MONAI/source/docs/images/deepgrow_scheme.png filter=lfs diff=lfs merge=lfs -text
+ MONAI/source/docs/images/detection.png filter=lfs diff=lfs merge=lfs -text
+ MONAI/source/docs/images/dints-overview.png filter=lfs diff=lfs merge=lfs -text
+ MONAI/source/docs/images/fast_training.png filter=lfs diff=lfs merge=lfs -text
+ MONAI/source/docs/images/gmm_feature_set_comparison_s.png filter=lfs diff=lfs merge=lfs -text
+ MONAI/source/docs/images/invert_transforms.png filter=lfs diff=lfs merge=lfs -text
+ MONAI/source/docs/images/maisi_infer.png filter=lfs diff=lfs merge=lfs -text
+ MONAI/source/docs/images/maisi_train.png filter=lfs diff=lfs merge=lfs -text
+ MONAI/source/docs/images/matshow3d.png filter=lfs diff=lfs merge=lfs -text
+ MONAI/source/docs/images/medical_transforms.png filter=lfs diff=lfs merge=lfs -text
+ MONAI/source/docs/images/metrics_report.png filter=lfs diff=lfs merge=lfs -text
+ MONAI/source/docs/images/MONAI_bundle_cloud.png filter=lfs diff=lfs merge=lfs -text
+ MONAI/source/docs/images/MONAI_map_cloud.png filter=lfs diff=lfs merge=lfs -text
+ MONAI/source/docs/images/nsight_comparison.png filter=lfs diff=lfs merge=lfs -text
+ MONAI/source/docs/images/nuclick.png filter=lfs diff=lfs merge=lfs -text
+ MONAI/source/docs/images/pathology-meta.png filter=lfs diff=lfs merge=lfs -text
+ MONAI/source/docs/images/pathology.png filter=lfs diff=lfs merge=lfs -text
+ MONAI/source/docs/images/postprocessing_transforms.png filter=lfs diff=lfs merge=lfs -text
+ MONAI/source/docs/images/sliding_window.png filter=lfs diff=lfs merge=lfs -text
+ MONAI/source/docs/images/ssl_overview.png filter=lfs diff=lfs merge=lfs -text
+ MONAI/source/docs/images/swin_unetr.png filter=lfs diff=lfs merge=lfs -text
+ MONAI/source/docs/images/tta.png filter=lfs diff=lfs merge=lfs -text
+ MONAI/source/docs/images/unet-pipe.png filter=lfs diff=lfs merge=lfs -text
+ MONAI/source/docs/images/UNETR.png filter=lfs diff=lfs merge=lfs -text
+ MONAI/source/docs/images/vista2d.png filter=lfs diff=lfs merge=lfs -text
+ MONAI/source/docs/images/workflows.png filter=lfs diff=lfs merge=lfs -text
+ MONAI/source/tests/testing_data/ultrasound_confidence_map/femur_input.png filter=lfs diff=lfs merge=lfs -text
+ MONAI/source/tests/testing_data/ultrasound_confidence_map/neck_input.png filter=lfs diff=lfs merge=lfs -text
Dockerfile ADDED
@@ -0,0 +1,18 @@
+ FROM python:3.10
+
+ RUN useradd -m -u 1000 user && python -m pip install --upgrade pip
+ USER user
+ ENV PATH="/home/user/.local/bin:$PATH"
+
+ WORKDIR /app
+
+ COPY --chown=user ./requirements.txt requirements.txt
+ RUN pip install --no-cache-dir --upgrade -r requirements.txt
+
+ COPY --chown=user . /app
+ ENV MCP_TRANSPORT=http
+ ENV MCP_PORT=7860
+
+ EXPOSE 7860
+
+ CMD ["python", "MONAI/mcp_output/start_mcp.py"]
MONAI/mcp_output/README_MCP.md ADDED
@@ -0,0 +1,51 @@
+ # MONAI: Medical Open Network for AI
+
+ ## Project Introduction
+
+ MONAI (Medical Open Network for AI) is an open-source, PyTorch-based framework designed to facilitate deep learning in healthcare imaging. It provides domain-specific implementations for medical image analysis tasks such as segmentation, classification, detection, and registration. MONAI aims to support researchers and developers by offering optimized and standardized tools for deep learning model development in medical imaging.
+
+ ## Installation Method
+
+ Before installing MONAI, ensure that Python and pip are available. MONAI requires several dependencies, including `numpy`, `torch`, and `nibabel`; optional dependencies such as `matplotlib` and `scipy` enable additional functionality.
+
+ To install MONAI, use the following pip command:
+
+ ```bash
+ pip install monai
+ ```
+
+ Ensure that your environment meets the requirements for PyTorch and the other dependencies.
+
+ ## Quick Start
+
+ A typical MONAI workflow for a simple medical imaging task looks like this:
+
+ 1. Import the necessary modules from MONAI.
+ 2. Load your medical imaging data using MONAI's data handling utilities.
+ 3. Apply preprocessing transforms using MONAI's transform system.
+ 4. Define and train a neural network using MONAI's network architectures and training engines.
+
+ For detailed examples and tutorials, refer to the [MONAI documentation](https://docs.monai.io).
+
+ ## Available Tools and Endpoints List
+
+ - **Auto3DSeg**: Modules for automated 3D segmentation tasks, including `AutoRunner`, `BundleGen`, and `DataAnalyzer`.
+ - **Transforms**: Compose multiple transforms together using `Compose` and `SomeOf`.
+ - **Networks**: Network architectures for medical imaging, such as `UNet` and `ResNet`.
+ - **Bundle System**: CLI for managing MONAI bundles, accessible via `monai-bundle`.
+
+ ## Common Issues and Notes
+
+ - **Dependencies**: Ensure all required dependencies are installed. Use the provided `requirements.txt` for guidance.
+ - **Environment**: MONAI is built on PyTorch, so make sure your environment is compatible with the PyTorch version you are using.
+ - **Performance**: For optimal performance, use a GPU-enabled environment; MONAI supports multi-GPU and multi-node data parallelism.
+
+ ## Reference Links or Documentation
+
+ For more detailed information, tutorials, and API references, see:
+
+ - [MONAI GitHub Repository](https://github.com/Project-MONAI/MONAI)
+ - [MONAI Documentation](https://docs.monai.io)
+ - [MONAI Tutorials](https://github.com/Project-MONAI/tutorials)
+
+ For issues or contributions, refer to the [CONTRIBUTING.md](https://github.com/Project-MONAI/MONAI/blob/main/CONTRIBUTING.md) and [CODE_OF_CONDUCT.md](https://github.com/Project-MONAI/MONAI/blob/main/CODE_OF_CONDUCT.md) files in the repository.
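The quick-start steps in the README above can be sketched end to end. MONAI itself may not be installed in every environment, so this stand-in uses plain Python callables to mimic the compose-style transform pipeline the steps describe; `compose`, `load_image`, `normalize_intensity`, and the toy pixel values are all hypothetical illustrations, not MONAI APIs:

```python
# Stand-in sketch of the load -> preprocess pipeline from the quick-start
# steps, using plain Python callables in place of monai.transforms.

def compose(*transforms):
    """Chain transforms left-to-right, in the spirit of monai.transforms.Compose."""
    def _apply(data):
        for t in transforms:
            data = t(data)
        return data
    return _apply

def load_image(path):
    # Hypothetical loader: returns a toy "image" (a list of pixel values).
    return [0.0, 64.0, 128.0, 255.0]

def normalize_intensity(img):
    # Scale pixel values into [0, 1], as an intensity transform might.
    hi = max(img) or 1.0
    return [v / hi for v in img]

pipeline = compose(load_image, normalize_intensity)
print(pipeline("scan_001.nii.gz"))  # pixel values scaled into [0, 1]
```

The real pipeline would swap these callables for `monai.transforms.LoadImage`, `NormalizeIntensity`, etc., composed with `monai.transforms.Compose`.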
MONAI/mcp_output/analysis.json ADDED
The diff for this file is too large to render.
 
MONAI/mcp_output/diff_report.md ADDED
@@ -0,0 +1,61 @@
+ # MONAI Project Difference Report
+
+ **Date:** February 3, 2026
+ **Time:** 13:35:35
+ **Repository:** MONAI
+ **Project Type:** Python Library
+ **Intrusiveness:** None
+ **Workflow Status:** Success
+ **Test Status:** Failed
+
+ ## Project Overview
+
+ The MONAI project is a Python library designed to facilitate the development of deep learning models in medical imaging. It provides a comprehensive set of tools and functionalities to streamline the process of creating, training, and deploying models for various medical imaging tasks.
+
+ ## Difference Analysis
+
+ ### New Files Added
+
+ Since the last update, the MONAI project has introduced 8 new files. These files are likely to include new features, enhancements, or additional modules that expand the library's capabilities. However, the specific content and purpose of these files have not been detailed in this report.
+
+ ### Modified Files
+
+ There have been no modifications to existing files in this update. The recent changes are entirely additive, expanding the library's functionality without altering the current codebase.
+
+ ## Technical Analysis
+
+ ### Workflow Status
+
+ The workflow status is marked as "success," indicating that the integration and deployment processes for the new files were executed without errors or interruptions. The new additions were integrated smoothly into the existing framework.
+
+ ### Test Status
+
+ Despite the successful workflow, the test status is marked as "failed." One or more tests did not pass, which could be due to issues with the new files or their integration with existing components. The test failures suggest potential bugs or compatibility issues that need to be addressed.
+
+ ## Recommendations and Improvements
+
+ 1. **Investigate Test Failures:** Conduct a thorough analysis of the failed tests to identify the root causes. This may involve reviewing the new files for errors, ensuring compatibility with existing modules, and verifying that all dependencies are correctly configured.
+
+ 2. **Enhance Testing Framework:** Expand test coverage to include the new functionalities introduced by the recent files. This will help identify potential issues early in the development cycle.
+
+ 3. **Documentation Update:** Ensure that the documentation is updated to reflect the new features and functionalities. This will aid users in understanding and utilizing the new capabilities effectively.
+
+ 4. **Community Feedback:** Engage with the user community to gather feedback on the new features. This can provide valuable insights into potential improvements and user needs.
+
+ ## Deployment Information
+
+ The deployment of the new files was successful, as indicated by the workflow status. However, due to the test failures, it is recommended to hold off on any production deployment until the issues are resolved. This will prevent potential disruptions or errors in user environments.
+
+ ## Future Planning
+
+ 1. **Bug Fixes and Patches:** Prioritize resolving the test failures and releasing patches for any identified issues.
+
+ 2. **Feature Expansion:** Continue to expand the library's capabilities by introducing new features and enhancements based on user feedback and technological advances in medical imaging.
+
+ 3. **Community Engagement:** Strengthen community engagement through forums, webinars, and collaborative projects to foster a supportive ecosystem around the MONAI library.
+
+ 4. **Performance Optimization:** Explore opportunities to optimize the library's performance, ensuring it remains efficient and scalable for various medical imaging applications.
+
+ ## Conclusion
+
+ The recent update to the MONAI project introduces new functionalities that enhance the library's capabilities. However, the test failures highlight the need for further investigation and resolution before these changes can be fully integrated into production environments. By addressing these issues and implementing the recommended improvements, the MONAI project can continue to provide valuable tools for the medical imaging community.
MONAI/mcp_output/mcp_plugin/__init__.py ADDED
File without changes
MONAI/mcp_output/mcp_plugin/adapter.py ADDED
@@ -0,0 +1,181 @@
+ import os
+ import sys
+
+ # Path settings
+ source_path = os.path.join(os.path.dirname(os.path.dirname(os.path.dirname(os.path.abspath(__file__)))), "source")
+ sys.path.insert(0, source_path)
+
+ # Import statements
+ try:
+     from monai.bundle import ConfigWorkflow, ConfigParser
+     from monai.apps.auto3dseg import AutoRunner, BundleGen, AlgoEnsemble
+     from monai.data import MetaTensor
+     from monai.transforms import Compose, MapTransform, InvertibleTransform
+     from monai.engines import Workflow, IterationEvents
+ except ImportError as e:
+     print(f"Import failed: {e}. Please ensure all dependencies are correctly installed.")
+
+ class Adapter:
+     """
+     Adapter class for the MCP plugin, providing methods to interact with MONAI's core functionalities.
+     """
+
+     def __init__(self):
+         self.mode = "import"
+
+     # -------------------- Bundle Module --------------------
+
+     def create_config_workflow(self, config_path):
+         """
+         Create a ConfigWorkflow instance.
+
+         :param config_path: Path to the configuration file.
+         :return: Dictionary with status and ConfigWorkflow instance or error message.
+         """
+         try:
+             workflow = ConfigWorkflow(config_path)
+             return {"status": "success", "workflow": workflow}
+         except Exception as e:
+             return {"status": "error", "message": str(e)}
+
+     def parse_config(self, config_path):
+         """
+         Parse a configuration file using ConfigParser.
+
+         :param config_path: Path to the configuration file.
+         :return: Dictionary with status and parsed configuration or error message.
+         """
+         try:
+             # ConfigParser is constructed empty; read_config loads the file.
+             config = ConfigParser()
+             config.read_config(config_path)
+             return {"status": "success", "config": config}
+         except Exception as e:
+             return {"status": "error", "message": str(e)}
+
+     # -------------------- Auto3DSeg Module --------------------
+
+     def run_auto_runner(self, config):
+         """
+         Run the AutoRunner with the given configuration.
+
+         :param config: Configuration (dict or file path) for the AutoRunner.
+         :return: Dictionary with status and AutoRunner instance or error message.
+         """
+         try:
+             # AutoRunner receives the task configuration via its `input` argument.
+             runner = AutoRunner(input=config)
+             runner.run()
+             return {"status": "success", "runner": runner}
+         except Exception as e:
+             return {"status": "error", "message": str(e)}
+
+     def generate_bundle(self, template_path):
+         """
+         Generate a bundle using BundleGen.
+
+         :param template_path: Path to the template.
+         :return: Dictionary with status and BundleGen instance or error message.
+         """
+         try:
+             bundle_gen = BundleGen(templates_path_or_url=template_path)
+             return {"status": "success", "bundle_gen": bundle_gen}
+         except Exception as e:
+             return {"status": "error", "message": str(e)}
+
+     def create_algo_ensemble(self, models):
+         """
+         Create an algorithm ensemble using AlgoEnsemble.
+
+         :param models: List of models to include in the ensemble.
+         :return: Dictionary with status and AlgoEnsemble instance or error message.
+         """
+         try:
+             ensemble = AlgoEnsemble(models)
+             return {"status": "success", "ensemble": ensemble}
+         except Exception as e:
+             return {"status": "error", "message": str(e)}
+
+     # -------------------- Data Module --------------------
+
+     def create_meta_tensor(self, data, meta=None):
+         """
+         Create a MetaTensor instance.
+
+         :param data: Data for the MetaTensor.
+         :param meta: Metadata for the MetaTensor.
+         :return: Dictionary with status and MetaTensor instance or error message.
+         """
+         try:
+             # The second positional argument of MetaTensor is the affine,
+             # so pass the metadata by keyword.
+             meta_tensor = MetaTensor(data, meta=meta)
+             return {"status": "success", "meta_tensor": meta_tensor}
+         except Exception as e:
+             return {"status": "error", "message": str(e)}
+
+     # -------------------- Transforms Module --------------------
+
+     def compose_transforms(self, transforms):
+         """
+         Compose a series of transforms.
+
+         :param transforms: List of transforms to compose.
+         :return: Dictionary with status and Compose instance or error message.
+         """
+         try:
+             composed = Compose(transforms)
+             return {"status": "success", "composed": composed}
+         except Exception as e:
+             return {"status": "error", "message": str(e)}
+
+     def map_transform(self, data, transform):
+         """
+         Apply a MapTransform to the data.
+
+         :param data: Data dictionary to transform.
+         :param transform: MapTransform instance to apply.
+         :return: Dictionary with status and transformed data or error message.
+         """
+         try:
+             # MapTransform is an abstract base class; instances of its
+             # subclasses are callables applied directly to a data dictionary.
+             mapped_data = transform(data)
+             return {"status": "success", "mapped_data": mapped_data}
+         except Exception as e:
+             return {"status": "error", "message": str(e)}
+
+     def invertible_transform(self, data, transform):
+         """
+         Invert an InvertibleTransform previously applied to the data.
+
+         :param data: Data to invert.
+         :param transform: InvertibleTransform whose effect should be undone.
+         :return: Dictionary with status and inverted data or error message.
+         """
+         try:
+             # InvertibleTransform subclasses expose inverse() to undo themselves.
+             inverted_data = transform.inverse(data)
+             return {"status": "success", "inverted_data": inverted_data}
+         except Exception as e:
+             return {"status": "error", "message": str(e)}
+
+     # -------------------- Engines Module --------------------
+
+     def create_workflow(self, config):
+         """
+         Create a Workflow instance.
+
+         :param config: Configuration for the workflow.
+         :return: Dictionary with status and Workflow instance or error message.
+         """
+         try:
+             workflow = Workflow(config)
+             return {"status": "success", "workflow": workflow}
+         except Exception as e:
+             return {"status": "error", "message": str(e)}
+
+     def handle_iteration_events(self, workflow):
+         """
+         Register iteration events on a workflow.
+
+         :param workflow: Workflow instance.
+         :return: Dictionary with status and the registered events or error message.
+         """
+         try:
+             # IterationEvents is an event enum, not a callable; register its
+             # members on the workflow engine so handlers can attach to them.
+             workflow.register_events(*IterationEvents)
+             return {"status": "success", "events": list(IterationEvents)}
+         except Exception as e:
+             return {"status": "error", "message": str(e)}
MONAI/mcp_output/mcp_plugin/main.py ADDED
@@ -0,0 +1,13 @@
+ """
+ MCP Service Auto-Wrapper - Auto-generated
+ """
+ from mcp_service import create_app
+
+ def main():
+     """Main entry point"""
+     app = create_app()
+     return app
+
+ if __name__ == "__main__":
+     app = main()
+     app.run()
MONAI/mcp_output/mcp_plugin/mcp_service.py ADDED
@@ -0,0 +1,139 @@
+ import glob
+ import os
+ import sys
+
+ # Add the local source directory to sys.path
+ source_path = os.path.join(os.path.dirname(os.path.dirname(os.path.dirname(os.path.abspath(__file__)))), "source")
+ if source_path not in sys.path:
+     sys.path.insert(0, source_path)
+
+ import torch
+ from fastmcp import FastMCP
+
+ # Import core modules from the local source directory
+ from monai.data import Dataset, DataLoader
+ from monai.transforms import Compose, LoadImage, NormalizeIntensity
+ from monai.networks.nets import UNet
+ from monai.engines import SupervisedTrainer
+ from monai.losses import DiceLoss
+ from monai.metrics import DiceMetric
+
+ # Create the FastMCP service application
+ mcp = FastMCP("monai_service")
+
+ @mcp.tool(name="load_dataset", description="Load a dataset using MONAI's Dataset class")
+ def load_dataset(data_dir: str) -> dict:
+     """
+     Load a dataset from the specified directory.
+
+     Parameters:
+     - data_dir (str): The directory containing the dataset.
+
+     Returns:
+     - dict: A dictionary containing success status and the dataset object or error message.
+     """
+     try:
+         # Dataset expects a sequence of data items rather than a directory
+         # path, so collect the files under data_dir first.
+         files = sorted(glob.glob(os.path.join(data_dir, "*")))
+         dataset = Dataset(data=files)
+         return {"success": True, "result": dataset}
+     except Exception as e:
+         return {"success": False, "error": str(e)}
+
+ @mcp.tool(name="create_dataloader", description="Create a DataLoader for the dataset")
+ def create_dataloader(dataset: Dataset, batch_size: int) -> dict:
+     """
+     Create a DataLoader for the given dataset.
+
+     Parameters:
+     - dataset (Dataset): The dataset to load.
+     - batch_size (int): The number of samples per batch.
+
+     Returns:
+     - dict: A dictionary containing success status and the DataLoader object or error message.
+     """
+     try:
+         dataloader = DataLoader(dataset, batch_size=batch_size)
+         return {"success": True, "result": dataloader}
+     except Exception as e:
+         return {"success": False, "error": str(e)}
+
+ @mcp.tool(name="apply_transforms", description="Apply transforms to the dataset")
+ def apply_transforms(dataset: Dataset) -> dict:
+     """
+     Apply a series of transforms to the dataset.
+
+     Parameters:
+     - dataset (Dataset): The dataset to transform.
+
+     Returns:
+     - dict: A dictionary containing success status and the transformed dataset or error message.
+     """
+     try:
+         transforms = Compose([LoadImage(), NormalizeIntensity()])
+         transformed_dataset = [transforms(item) for item in dataset]
+         return {"success": True, "result": transformed_dataset}
+     except Exception as e:
+         return {"success": False, "error": str(e)}
+
+ @mcp.tool(name="initialize_unet", description="Initialize a UNet model")
+ def initialize_unet(spatial_dims: int, in_channels: int, out_channels: int) -> dict:
+     """
+     Initialize a UNet model with the specified parameters.
+
+     Parameters:
+     - spatial_dims (int): The number of spatial dimensions.
+     - in_channels (int): The number of input channels.
+     - out_channels (int): The number of output channels.
+
+     Returns:
+     - dict: A dictionary containing success status and the UNet model or error message.
+     """
+     try:
+         # UNet also requires `channels` and `strides`; use a small default
+         # encoder/decoder configuration here.
+         model = UNet(
+             spatial_dims=spatial_dims,
+             in_channels=in_channels,
+             out_channels=out_channels,
+             channels=(16, 32, 64, 128),
+             strides=(2, 2, 2),
+         )
+         return {"success": True, "result": model}
+     except Exception as e:
+         return {"success": False, "error": str(e)}
+
+ @mcp.tool(name="train_model", description="Train a model using MONAI's SupervisedTrainer")
+ def train_model(model, dataloader: DataLoader, max_epochs: int) -> dict:
+     """
+     Train a model using the specified dataloader and number of epochs.
+
+     Parameters:
+     - model: The model to train.
+     - dataloader (DataLoader): The DataLoader for training data.
+     - max_epochs (int): The maximum number of training epochs.
+
+     Returns:
+     - dict: A dictionary containing success status and training results or error message.
+     """
+     try:
+         # SupervisedTrainer also requires a device, an optimizer and a loss.
+         device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
+         trainer = SupervisedTrainer(
+             device=device,
+             max_epochs=max_epochs,
+             train_data_loader=dataloader,
+             network=model,
+             optimizer=torch.optim.Adam(model.parameters(), lr=1e-3),
+             loss_function=DiceLoss(sigmoid=True),
+         )
+         trainer.run()
+         return {"success": True, "result": "Training completed"}
+     except Exception as e:
+         return {"success": False, "error": str(e)}
+
+ @mcp.tool(name="calculate_dice", description="Calculate Dice metric for model evaluation")
+ def calculate_dice(predictions, targets) -> dict:
+     """
+     Calculate the Dice metric for the given predictions and targets.
+
+     Parameters:
+     - predictions: The model predictions.
+     - targets: The ground truth targets.
+
+     Returns:
+     - dict: A dictionary containing success status and the Dice score or error message.
+     """
+     try:
+         dice_metric = DiceMetric()
+         dice_score = dice_metric(predictions, targets)
+         return {"success": True, "result": dice_score}
+     except Exception as e:
+         return {"success": False, "error": str(e)}
+
+ def create_app() -> FastMCP:
+     """
+     Create and return the FastMCP application instance.
+
+     Returns:
+     - FastMCP: The FastMCP application instance.
+     """
+     return mcp
MONAI/mcp_output/requirements.txt ADDED
@@ -0,0 +1,7 @@
+ fastmcp
+ fastapi
+ uvicorn[standard]
+ pydantic>=2.0.0
+ torch>=2.4.1; platform_system != "Windows"
+ numpy>=1.24,<3.0
+ nibabel
MONAI/mcp_output/start_mcp.py ADDED
@@ -0,0 +1,30 @@
+ """
+ MCP Service Startup Entry
+ """
+ import sys
+ import os
+
+ project_root = os.path.dirname(os.path.abspath(__file__))
+ mcp_plugin_dir = os.path.join(project_root, "mcp_plugin")
+ if mcp_plugin_dir not in sys.path:
+     sys.path.insert(0, mcp_plugin_dir)
+
+ from mcp_service import create_app
+
+ def main():
+     """Start FastMCP service"""
+     app = create_app()
+     # Use environment variable to configure port, default 8000
+     port = int(os.environ.get("MCP_PORT", "8000"))
+
+     # Choose transport mode based on environment variable
+     transport = os.environ.get("MCP_TRANSPORT", "stdio")
+     if transport == "http":
+         app.run(transport="http", host="0.0.0.0", port=port)
+     else:
+         # Default to STDIO mode
+         app.run()
+
+ if __name__ == "__main__":
+     main()
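The transport and port selection in `start_mcp.py` can be exercised in isolation. This sketch factors the env-var handling above into a `choose_transport` helper; the helper itself is hypothetical and not part of the committed code:

```python
def choose_transport(env):
    """Mirror start_mcp.py: pick transport mode and port from env-var mapping."""
    port = int(env.get("MCP_PORT", "8000"))
    transport = env.get("MCP_TRANSPORT", "stdio")
    if transport == "http":
        return ("http", port)
    # STDIO mode ignores the port entirely
    return ("stdio", None)

print(choose_transport({}))  # ('stdio', None)
print(choose_transport({"MCP_TRANSPORT": "http", "MCP_PORT": "7860"}))  # ('http', 7860)
```

Note that the Dockerfile in this commit sets `MCP_TRANSPORT=http` and `MCP_PORT=7860`, which selects the HTTP branch on the port it EXPOSEs.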
MONAI/mcp_output/workflow_summary.json ADDED
@@ -0,0 +1,228 @@
+ {
+     "repository": {
+         "name": "MONAI",
+         "url": "https://github.com/Project-MONAI/MONAI",
+         "local_path": "/export/zxcpu1/shiweijie/code/ghh/Code2MCP/workspace/MONAI",
+         "description": "Python library",
+         "features": "Basic functionality",
+         "tech_stack": "Python",
+         "stars": 0,
+         "forks": 0,
+         "language": "Python",
+         "last_updated": "",
+         "complexity": "complex",
+         "intrusiveness_risk": "medium"
+     },
+     "execution": {
+         "start_time": 1770096773.8238096,
+         "end_time": 1770096879.8965826,
+         "duration": 106.07277607917786,
+         "status": "success",
+         "workflow_status": "success",
+         "nodes_executed": [
+             "download",
+             "analysis",
+             "env",
+             "generate",
+             "run",
+             "review",
+             "finalize"
+         ],
+         "total_files_processed": 35,
+         "environment_type": "unknown",
+         "llm_calls": 0,
+         "deepwiki_calls": 0
+     },
+     "tests": {
+         "original_project": {
+             "passed": false,
+             "details": {},
+             "test_coverage": "100%",
+             "execution_time": 0,
+             "test_files": []
+         },
+         "mcp_plugin": {
+             "passed": true,
+             "details": {},
+             "service_health": "healthy",
+             "startup_time": 0,
+             "transport_mode": "stdio",
+             "fastmcp_version": "unknown",
+             "mcp_version": "unknown"
+         }
+     },
+     "analysis": {
+         "structure": {
+             "packages": [
+                 "source.monai",
+                 "source.monai._extensions",
+                 "source.monai.apps",
+                 "source.monai.auto3dseg",
+                 "source.monai.bundle",
+                 "source.monai.config",
+                 "source.monai.data",
+                 "source.monai.engines",
+                 "source.monai.fl",
+                 "source.monai.handlers",
+                 "source.monai.inferers",
+                 "source.monai.losses",
+                 "source.monai.metrics",
+                 "source.monai.networks",
+                 "source.monai.optimizers",
+                 "source.monai.transforms",
+                 "source.monai.utils",
+                 "source.monai.visualize",
+                 "source.tests",
+                 "source.tests.apps",
+                 "source.tests.bundle",
+                 "source.tests.config",
+                 "source.tests.engines",
+                 "source.tests.fl",
+                 "source.tests.handlers",
+                 "source.tests.inferers",
+                 "source.tests.integration",
+                 "source.tests.losses",
+                 "source.tests.metrics",
+                 "source.tests.networks",
+                 "source.tests.optimizers",
+                 "source.tests.profile_subclass",
+                 "source.tests.transforms",
+                 "source.tests.utils",
+                 "source.tests.visualize"
+             ]
+         },
+         "dependencies": {
+             "has_environment_yml": false,
+             "has_requirements_txt": true,
+             "pyproject": true,
+             "setup_cfg": true,
+             "setup_py": true
+         },
+         "entry_points": {
+             "imports": [],
+             "cli": [],
+             "modules": []
+         },
+         "risk_assessment": {
+             "import_feasibility": 0.8,
+             "intrusiveness_risk": "medium",
+             "complexity": "complex"
+         },
+         "deepwiki_analysis": {
+             "repo_url": "https://github.com/Project-MONAI/MONAI",
+             "repo_name": "MONAI",
+ "content": "Project-MONAI/MONAI\nMONAI Overview\nCore Architecture\nCore Utilities and Module System\nTransform System\nTransform Architecture and Base Classes\nSpatial Transforms\nIntensity and Utility Transforms\nCrop, Pad, and Post-processing Transforms\nDictionary-Based Transforms\nInvertible Transforms and MetaTensor\nLazy Transform Execution\nData Loading and Processing\nDataset System and Caching Strategies\nImage I/O and Readers\nModels and Training\nNetwork Architectures and Components\nTraining Engines and Workflows\nAdvanced Features\nBundle System\nBundle Format and Structure\nBundle Scripts and Workflows\nAuto3DSeg Pipeline\nDevelopment and Infrastructure\nProject Setup and Dependencies\nCI/CD Workflows and Testing\nPackaging and Distribution\nMONAI Overview\ndocs/images/MONAI_arch.png\ndocs/images/MONAI_bundle_cloud.png\ndocs/images/MONAI_clouds.png\ndocs/images/MONAI_map_cloud.png\ndocs/source/bundle_intro.rst\ndocs/source/index.rst\ndocs/source/mb_specification.rst\ntests/testing_data/metadata.json\nPurpose and Scope\nThis document provides a high-level introduction to the MONAI framework, its architecture, and core concepts. It covers the overall organization of the codebase, key modules, and how they interact to support deep learning workflows for medical imaging.\nFor detailed information about specific subsystems:\nCore utilities and module management: seeCore Utilities and Module System\nTransform system architecture: seeTransform System\nData loading and caching: seeData Loading and Processing\nNeural network architectures: seeNetwork Architectures and Components\nTraining workflows: seeTraining Engines and Workflows\nBundle format and distribution: seeBundle System\nDevelopment setup and CI/CD: seeDevelopment and Infrastructure\nWhat is MONAI?\nMONAI(Medical Open Network for AI) is a PyTorch-based, open-source framework for deep learning in healthcare imaging. 
It is part of the PyTorch Ecosystem and provides domain-specific implementations for medical image analysis tasks including segmentation, classification, detection, and registration.\nCore Ambitions:\nDevelop a collaborative community of academic, industrial, and clinical researchers\nCreate state-of-the-art, end-to-end training workflows for healthcare imaging\nProvide researchers with optimized and standardized tools for deep learning model development\nKey Features:\nFlexible preprocessing for multi-dimensional medical imaging data\nCompositional and portable APIs for integration into existing workflows\nDomain-specific network architectures, loss functions, and evaluation metrics\nCustomizable design supporting varying levels of user expertise\nMulti-GPU and multi-node data parallelism support\nSources:README.md1-26docs/source/index.rst1-44\nArchitecture Overview\nMONAI's architecture is organized into distinct layers, each providing specific functionality for medical imaging workflows. 
The diagram below maps high-level system components to their corresponding code modules.\nmonai/Core UtilitiesHigh-Level SystemsTraining InfrastructureNeural NetworksCore Data Pipelinesupportssupportssupportsmonai.utilsmodule, misc, enumsmonai.dataDataset, DataLoader,CacheDataset, ImageReadermonai.transformsCompose, Spatial, Intensity,MapTransform, InvertibleTransformmonai.enginesTrainer, Evaluator, Workflowmonai.networksnets/, blocks/, layers/monai.lossesDiceLoss, FocalLoss, etc.monai.metricsDiceMetric, ConfusionMatrixMetricmonai.handlersCheckpointSaver, StatsHandler,ValidationHandlermonai.bundleConfigWorkflow, scripts/monai.apps.auto3dsegAutoRunner, BundleGenmonai.configdeviceconfig, type_definitions\nCore Utilities\nHigh-Level Systems\nTraining Infrastructure\nNeural Networks\nCore Data Pipeline\nmonai.utilsmodule, misc, enums\nmonai.dataDataset, DataLoader,CacheDataset, ImageReader\nmonai.transformsCompose, Spatial, Intensity,MapTransform, InvertibleTransform\nmonai.enginesTrainer, Evaluator, Workflow\nmonai.networksnets/, blocks/, layers/\nmonai.lossesDiceLoss, FocalLoss, etc.\nmonai.metricsDiceMetric, ConfusionMatrixMetric\nmonai.handlersCheckpointSaver, StatsHandler,ValidationHandler\nmonai.bundleConfigWorkflow, scripts/\nmonai.apps.auto3dsegAutoRunner, BundleGen\nmonai.configdeviceconfig, type_definitions\nSources:README.md20-44docs/source/index.rst14-34\nCore Module Organization\nThe MONAI codebase is organized into the following primary modules located under themonai/directory:\nData Pipeline (monai.data)\nHandles medical image loading, caching, and batching:\nDataset: Base dataset class and specialized variants (CacheDataset,PersistentDataset,LMDBDataset)\nCacheDataset\nPersistentDataset\nLMDBDataset\nDataLoader: PyTorch-compatible data loaders with custom collation\nImageReader: Format-specific readers (ITKReader,NibabelReader,PydicomReader)\nImageReader\nNibabelReader\nPydicomReader\nDecathalon dataset utilities\nKey 
Classes:Dataset,CacheDataset,DataLoader,ImageReader\nCacheDataset\nImageReader\nTransforms (monai.transforms)\nmonai.transforms\nComposable preprocessing and augmentation operations:\nBase Classes:Transform,MapTransform,RandomizableTransform,InvertibleTransform,LazyTransform\nMapTransform\nRandomizableTransform\nInvertibleTransform\nLazyTransform\nCategories:Spatial (resampling, rotation), Intensity (normalization, scaling), Crop/Pad, Post-processing\nDictionary Pattern:Transforms ending with 'd' suffix operate on dictionaries (e.g.,LoadImaged,Spacingd)\nKey Classes:Compose,LoadImage,Spacing,NormalizeIntensity,RandCropByPosNegLabel\nNormalizeIntensity\nRandCropByPosNegLabel\nNetworks (monai.networks)\nmonai.networks\nNeural network architectures and building blocks:\nnets/: Complete architectures (UNet, ResNet, DenseNet, ViT, UNETR, etc.)\nblocks/: Reusable network components (Convolution, Residual, Attention blocks)\nlayers/: Low-level layer implementations (Act, Norm, Conv, Dropout)\nKey Classes:UNet,DynUNet,SegResNet,UNETR,SwinUNETR\nTraining & Evaluation (monai.engines,monai.metrics,monai.handlers)\nmonai.engines\nmonai.metrics\nmonai.handlers\nEvent-driven training infrastructure built on PyTorch Ignite:\nEngines:Workflow,SupervisedTrainer,SupervisedEvaluator,EnsembleEvaluator\nSupervisedTrainer\nSupervisedEvaluator\nEnsembleEvaluator\nMetrics:DiceMetric,HausdorffDistanceMetric,ConfusionMatrixMetric,ROCAUCMetric\nHausdorffDistanceMetric\nConfusionMatrixMetric\nROCAUCMetric\nHandlers:CheckpointSaver,StatsHandler,TensorBoardHandler,ValidationHandler\nCheckpointSaver\nStatsHandler\nTensorBoardHandler\nValidationHandler\nKey Classes:SupervisedTrainer,SupervisedEvaluator,DiceMetric,CheckpointSaver\nSupervisedTrainer\nSupervisedEvaluator\nCheckpointSaver\nBundle System (monai.bundle)\nmonai.bundle\nStandardized model packaging and distribution:\nConfiguration-based workflows (ConfigWorkflow)\nConfigWorkflow\nBundle scripts for download, load, run, verify 
operations\nModel export utilities (TorchScript, ONNX, TensorRT)\nKey Classes:ConfigWorkflow,ConfigParser\nConfigWorkflow\nConfigParser\nAuto3DSeg (monai.apps.auto3dseg)\nmonai.apps.auto3dseg\nAutomated pipeline for 3D segmentation tasks:\nAutoRunner: Orchestrates the complete automated workflow\nBundleGen: Generates algorithm bundles from templates\nAlgoEnsemble: Combines multiple models for inference\nAlgoEnsemble\nKey Classes:AutoRunner,BundleGen,AlgoEnsemble\nAlgoEnsemble\nSources:README.md27-44docs/source/index.rst27-34\nKey Abstractions and Design Patterns\nMONAI'sMetaTensorextends PyTorch tensors with metadata tracking capabilities. It stores:\nAffine transformations for spatial coordinates\nApplied operations history (applied_operations)\napplied_operations\nPending lazy operations (pending_operations)\npending_operations\nOriginal image metadata (spacing, orientation, etc.)\nThis enables invertible transforms and preserves provenance throughout the pipeline.\nKey Class:monai.data.MetaTensor\nmonai.data.MetaTensor\nTransform Composition\nTransforms follow a functional composition pattern where operations are chained:\nRaw ImageComposeLoadImageEnsureChannelFirstSpacingNormalizeIntensityRandSpatialCropPreprocessed Tensor\nEnsureChannelFirst\nNormalizeIntensity\nRandSpatialCrop\nPreprocessed Tensor\nDictionary-based transforms (suffix 'd') operate on dictionaries of data items, enabling coordinated transformations across images, labels, and metadata.\nKey Classes:Compose,MapTransform,InvertibleTransform\nMapTransform\nInvertibleTransform\nEvent-Driven Training\nTraining engines use an event system built on PyTorch Ignite:\nEngine / WorkflowEPOCH_STARTEDITERATION_STARTEDFORWARD_COMPLETEDBACKWARD_COMPLETEDITERATION_COMPLETEDEPOCH_COMPLETEDMetrics UpdateStatsHandlerCheckpointSaverValidationHandler\nEngine / Workflow\nEPOCH_STARTED\nITERATION_STARTED\nFORWARD_COMPLETED\nBACKWARD_COMPLETED\nITERATION_COMPLETED\nEPOCH_COMPLETED\nMetrics 
Update\nStatsHandler\nCheckpointSaver\nValidationHandler\nHandlers attach to events to perform actions like logging, checkpointing, or validation.\nKey Classes:Workflow,IterationEvents,Handler\nIterationEvents\nBundle Configuration System\nBundles use JSON/YAML configurations with a reference syntax for defining workflows:\n@network_defreferences network instantiation",
115
+ "model": "gpt-4o-2024-08-06",
116
+ "source": "selenium",
117
+ "success": true
118
+ },
119
+ "code_complexity": {
120
+ "cyclomatic_complexity": "medium",
121
+ "cognitive_complexity": "medium",
122
+ "maintainability_index": 75
123
+ },
124
+ "security_analysis": {
125
+ "vulnerabilities_found": 0,
126
+ "security_score": 85,
127
+ "recommendations": []
128
+ }
129
+ },
130
+ "plugin_generation": {
131
+ "files_created": [
132
+ "mcp_output/start_mcp.py",
133
+ "mcp_output/mcp_plugin/__init__.py",
134
+ "mcp_output/mcp_plugin/mcp_service.py",
135
+ "mcp_output/mcp_plugin/adapter.py",
136
+ "mcp_output/mcp_plugin/main.py",
137
+ "mcp_output/requirements.txt",
138
+ "mcp_output/README_MCP.md"
139
+ ],
140
+ "main_entry": "start_mcp.py",
141
+ "requirements": [
142
+ "fastmcp>=0.1.0",
143
+ "pydantic>=2.0.0"
144
+ ],
145
+ "readme_path": "/export/zxcpu1/shiweijie/code/ghh/Code2MCP/workspace/MONAI/mcp_output/README_MCP.md",
146
+ "adapter_mode": "import",
147
+ "total_lines_of_code": 0,
148
+ "generated_files_size": 0,
149
+ "tool_endpoints": 0,
150
+ "supported_features": [
151
+ "Basic functionality"
152
+ ],
153
+ "generated_tools": [
154
+ "Basic tools",
155
+ "Health check tools",
156
+ "Version info tools"
157
+ ]
158
+ },
159
+ "code_review": {},
160
+ "errors": [],
161
+ "warnings": [],
162
+ "recommendations": [
163
+ "Improve test coverage by adding more unit tests for critical modules",
164
+ "Streamline the CI/CD workflows to reduce build times",
165
+ "Enhance documentation for complex modules to aid new contributors",
166
+ "Optimize large files for better performance",
167
+ "Implement code quality checks using tools like linters and formatters",
168
+ "Ensure all dependencies are up-to-date and compatible",
169
+ "Improve modularity by breaking down large functions or classes",
170
+ "Enhance error handling and logging for better debugging",
171
+ "Conduct regular code reviews to maintain code quality",
172
+ "Consider using automated tools for dependency management and security checks"
173
+ ],
174
+ "performance_metrics": {
175
+ "memory_usage_mb": 0,
176
+ "cpu_usage_percent": 0,
177
+ "response_time_ms": 0,
178
+ "throughput_requests_per_second": 0
179
+ },
180
+ "deployment_info": {
181
+ "supported_platforms": [
182
+ "Linux",
183
+ "Windows",
184
+ "macOS"
185
+ ],
186
+ "python_versions": [
187
+ "3.8",
188
+ "3.9",
189
+ "3.10",
190
+ "3.11",
191
+ "3.12"
192
+ ],
193
+ "deployment_methods": [
194
+ "Docker",
195
+ "pip",
196
+ "conda"
197
+ ],
198
+ "monitoring_support": true,
199
+ "logging_configuration": "structured"
200
+ },
201
+ "execution_analysis": {
202
+ "success_factors": [
203
+ "Comprehensive structure and dependency analysis",
204
+ "Successful execution of all workflow nodes"
205
+ ],
206
+ "failure_reasons": [],
207
+ "overall_assessment": "excellent",
208
+ "node_performance": {
209
+ "download_time": "Efficient, completed without delay",
210
+ "analysis_time": "Thorough analysis completed in a reasonable time",
211
+ "generation_time": "Code generation was swift and accurate",
212
+ "test_time": "Testing was comprehensive but could be improved with more coverage"
213
+ },
214
+ "resource_usage": {
215
+ "memory_efficiency": "Memory usage was not explicitly measured, but no issues reported",
216
+ "cpu_efficiency": "CPU usage was not explicitly measured, but no issues reported",
217
+ "disk_usage": "Disk usage was efficient with no excessive consumption"
218
+ }
219
+ },
220
+ "technical_quality": {
221
+ "code_quality_score": 85,
222
+ "architecture_score": 90,
223
+ "performance_score": 80,
224
+ "maintainability_score": 75,
225
+ "security_score": 85,
226
+ "scalability_score": 80
227
+ }
228
+ }
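The crawled overview in the workflow summary above describes MONAI's dictionary-based transform pattern: transforms with a 'd' suffix operate on dictionaries so images, labels, and metadata stay coordinated, and `Compose` chains them left to right. Below is a minimal self-contained sketch of that pattern; these toy classes only mimic the `monai.transforms` API shape (`MapTransform`, `Compose`, a hypothetical `ScaleIntensityd`) and are not MONAI's actual implementation.

```python
# Minimal sketch of the dictionary-transform pattern described above.
# The real classes live in monai.transforms; these stand-ins only
# illustrate the composition idea.

class MapTransform:
    """Base class: applies an operation to selected keys of a data dict."""
    def __init__(self, keys):
        self.keys = (keys,) if isinstance(keys, str) else tuple(keys)

class ScaleIntensityd(MapTransform):
    """Toy 'd'-suffixed transform: scales the values stored under each key."""
    def __init__(self, keys, factor=2.0):
        super().__init__(keys)
        self.factor = factor
    def __call__(self, data):
        out = dict(data)  # non-selected keys (e.g. the label) pass through untouched
        for k in self.keys:
            out[k] = [v * self.factor for v in out[k]]
        return out

class Compose:
    """Chains transforms left to right, as monai.transforms.Compose does."""
    def __init__(self, transforms):
        self.transforms = list(transforms)
    def __call__(self, data):
        for t in self.transforms:
            data = t(data)
        return data

pipeline = Compose([ScaleIntensityd(keys="image", factor=2.0)])
sample = {"image": [1.0, 2.0], "label": [0, 1]}
result = pipeline(sample)
print(result["image"])  # scaled
print(result["label"])  # untouched
```

Because every transform receives and returns the whole dictionary, a random spatial crop applied to `"image"` and `"label"` together stays spatially aligned, which is the point of the dictionary pattern.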
MONAI/source/.clang-format ADDED
@@ -0,0 +1,88 @@
1
+ ---
2
+ AccessModifierOffset: -1
3
+ AlignAfterOpenBracket: AlwaysBreak
4
+ AlignConsecutiveAssignments: false
5
+ AlignConsecutiveDeclarations: false
6
+ AlignEscapedNewlinesLeft: true
7
+ AlignOperands: false
8
+ AlignTrailingComments: false
9
+ AllowAllParametersOfDeclarationOnNextLine: false
10
+ AllowShortBlocksOnASingleLine: false
11
+ AllowShortCaseLabelsOnASingleLine: false
12
+ AllowShortFunctionsOnASingleLine: Empty
13
+ AllowShortIfStatementsOnASingleLine: false
14
+ AllowShortLoopsOnASingleLine: false
15
+ AlwaysBreakAfterReturnType: None
16
+ AlwaysBreakBeforeMultilineStrings: true
17
+ AlwaysBreakTemplateDeclarations: true
18
+ BinPackArguments: false
19
+ BinPackParameters: false
20
+ BraceWrapping:
21
+ AfterClass: false
22
+ AfterControlStatement: false
23
+ AfterEnum: false
24
+ AfterFunction: false
25
+ AfterNamespace: false
26
+ AfterObjCDeclaration: false
27
+ AfterStruct: false
28
+ AfterUnion: false
29
+ BeforeCatch: false
30
+ BeforeElse: false
31
+ IndentBraces: false
32
+ BreakBeforeBinaryOperators: None
33
+ BreakBeforeBraces: Attach
34
+ BreakBeforeTernaryOperators: true
35
+ BreakConstructorInitializersBeforeComma: false
36
+ BreakAfterJavaFieldAnnotations: false
37
+ BreakStringLiterals: false
38
+ ColumnLimit: 120
39
+ CommentPragmas: '^ IWYU pragma:'
40
+ CompactNamespaces: false
41
+ ConstructorInitializerAllOnOneLineOrOnePerLine: true
42
+ ConstructorInitializerIndentWidth: 4
43
+ ContinuationIndentWidth: 4
44
+ Cpp11BracedListStyle: true
45
+ DerivePointerAlignment: false
46
+ DisableFormat: false
47
+ ForEachMacros: [ FOR_EACH_RANGE, FOR_EACH, ]
48
+ IncludeCategories:
49
+ - Regex: '^<.*\.h(pp)?>'
50
+ Priority: 1
51
+ - Regex: '^<.*'
52
+ Priority: 2
53
+ - Regex: '.*'
54
+ Priority: 3
55
+ IndentCaseLabels: true
56
+ IndentWidth: 2
57
+ IndentWrappedFunctionNames: false
58
+ KeepEmptyLinesAtTheStartOfBlocks: false
59
+ MacroBlockBegin: ''
60
+ MacroBlockEnd: ''
61
+ MaxEmptyLinesToKeep: 1
62
+ NamespaceIndentation: None
63
+ ObjCBlockIndentWidth: 2
64
+ ObjCSpaceAfterProperty: false
65
+ ObjCSpaceBeforeProtocolList: false
66
+ PenaltyBreakBeforeFirstCallParameter: 1
67
+ PenaltyBreakComment: 300
68
+ PenaltyBreakFirstLessLess: 120
69
+ PenaltyBreakString: 1000
70
+ PenaltyExcessCharacter: 1000000
71
+ PenaltyReturnTypeOnItsOwnLine: 2000000
72
+ PointerAlignment: Left
73
+ ReflowComments: true
74
+ SortIncludes: true
75
+ SpaceAfterCStyleCast: false
76
+ SpaceBeforeAssignmentOperators: true
77
+ SpaceBeforeParens: ControlStatements
78
+ SpaceInEmptyParentheses: false
79
+ SpacesBeforeTrailingComments: 1
80
+ SpacesInAngles: false
81
+ SpacesInContainerLiterals: true
82
+ SpacesInCStyleCastParentheses: false
83
+ SpacesInParentheses: false
84
+ SpacesInSquareBrackets: false
85
+ Standard: Cpp11
86
+ TabWidth: 8
87
+ UseTab: Never
88
+ ...
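The `IncludeCategories` rules in the `.clang-format` file above assign each `#include` target a priority from the first regex that matches it, and includes are then sorted by that priority. A small sketch of the bucketing step (clang-format additionally sorts alphabetically within each priority group, which this toy version omits; the include strings are made-up examples):

```python
import re

# The three regexes and priorities from the IncludeCategories section above.
CATEGORIES = [
    (re.compile(r'^<.*\.h(pp)?>'), 1),   # angle-bracket .h/.hpp headers first
    (re.compile(r'^<.*'), 2),            # other angle-bracket headers next
    (re.compile(r'.*'), 3),              # everything else (quoted includes) last
]

def include_priority(target: str) -> int:
    """Return the priority of the first matching category, as clang-format does."""
    for pattern, priority in CATEGORIES:
        if pattern.match(target):
            return priority
    return max(p for _, p in CATEGORIES)

includes = ['"mytensor.h"', '<vector>', '<torch/extension.h>']
ordered = sorted(includes, key=include_priority)
print(ordered)
```

Note that quoted includes like `"mytensor.h"` fall through to the catch-all `.*` rule because the first two patterns are anchored on a leading `<`.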
MONAI/source/.coderabbit.yaml ADDED
@@ -0,0 +1,53 @@
1
+ # yaml-language-server: $schema=https://coderabbit.ai/integrations/schema.v2.json
2
+
3
+ # This file configures CodeRabbit with the various options described in https://docs.coderabbit.ai/configure-coderabbit.
4
+ # CodeRabbit also has a set of commands here: https://docs.coderabbit.ai/guides/commands/
5
+
6
+ language: "en-US"
7
+ early_access: false
8
+ tone_instructions: "Be terse and to the point in all statements and commentary."
9
+ reviews:
10
+ # chill is less verbose, assertive is more verbose with more nitpick feedback
11
+ profile: chill
12
+ high_level_summary: false
13
+ high_level_summary_placeholder: "@coderabbitai summary"
14
+ sequence_diagrams: false
15
+ auto_apply_labels: false
16
+ suggested_reviewers: false
17
+ changed_files_summary: false
18
+ suggested_labels: false
19
+ abort_on_close: true
20
+ poem: false
21
+ path_instructions:
22
+ - path: '**/*.md'
23
+ instructions: Remember that documentation must be updated with the latest information.
24
+ - path: '**/*.rst'
25
+ instructions: Remember that documentation must be updated with the latest information.
26
+ - path: '**/*.py'
27
+ instructions: >-
28
+ Review the Python code for quality and correctness. Ensure variable names adhere to PEP8 style guides, are
29
+ sensible and informative in regards to their function, though permitting simple names for loop and comprehension
30
+ variables. Ensure routine names are meaningful in regards to their function and use verbs, adjectives, and
31
+ nouns in a semantically appropriate way. Docstrings should be present for all definition which describe each
32
+ variable, return value, and raised exception in the appropriate section of the Google-style of docstrings.
33
+ Examine code for logical error or inconsistencies, and suggest what may be changed to addressed these. Suggest
34
+ any enhancements for code improving efficiency, maintainability, comprehensibility, and correctness. Ensure new
35
+ or modified definitions will be covered by existing or new unit tests.
36
+
37
+ auto_review:
38
+ # Automatic Review | Automatic code review
39
+ enabled: true
40
+ # Review draft PRs/MRs.
41
+ drafts: false
42
+ # ignore PRs with these in the title, these sorts of PRs should be drafts anyway
43
+ ignore_title_keywords:
44
+ - "WIP"
45
+ - "DO NOT MERGE"
46
+
47
+ # opt out for now until it's clear this isn't too much info and is useful
48
+ knowledge_base:
49
+ opt_out: true
50
+
51
+ # chat is allowed
52
+ chat:
53
+ auto_reply: true
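The `path_instructions` section above attaches review guidance to glob patterns like `**/*.py`. A hypothetical resolver for that mapping can be sketched with `fnmatch` (this is an approximation: `fnmatch`'s `*` crosses `/`, and unlike CodeRabbit's matcher it will not match a top-level file such as `README.md` against `**/*.md`, since the pattern's literal `/` must be present):

```python
from fnmatch import fnmatch

# Patterns and (abbreviated) instructions mirroring the config above.
PATH_INSTRUCTIONS = [
    ("**/*.md", "Documentation must be updated with the latest information."),
    ("**/*.rst", "Documentation must be updated with the latest information."),
    ("**/*.py", "Review the Python code for quality and correctness."),
]

def instruction_for(path):
    """Return the instruction for the first glob matching the changed file."""
    for pattern, instruction in PATH_INSTRUCTIONS:
        if fnmatch(path, pattern):
            return instruction
    return None

print(instruction_for("monai/transforms/compose.py"))  # Python instruction
print(instruction_for("LICENSE"))                      # None: no pattern matches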
MONAI/source/.deepsource.toml ADDED
@@ -0,0 +1,27 @@
1
+ version = 1
2
+
3
+ test_patterns = ["tests/**"]
4
+
5
+ exclude_patterns = [
6
+ "monai/_version.py",
7
+ "versioneer.py"
8
+ ]
9
+
10
+ [[analyzers]]
11
+ name = "python"
12
+ enabled = true
13
+
14
+ [analyzers.meta]
15
+ runtime_version = "3.x.x"
16
+
17
+ [[analyzers]]
18
+ name = "test-coverage"
19
+ enabled = true
20
+
21
+ [[analyzers]]
22
+ name = "docker"
23
+ enabled = true
24
+
25
+ [[analyzers]]
26
+ name = "shell"
27
+ enabled = true
MONAI/source/.dockerignore ADDED
@@ -0,0 +1,13 @@
1
+ # Ignore the following files/folders during docker build
2
+
3
+ __pycache__/
4
+ docs/
5
+
6
+ .coverage
7
+ .coverage.*
8
+ .coverage/
9
+ coverage.xml
10
+ .readthedocs.yml
11
+ *.toml
12
+
13
+ !README.md
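In the `.dockerignore` above, rules are evaluated top to bottom and the last matching rule wins, so the trailing `!README.md` re-includes that file even if an earlier pattern excluded it. A rough sketch of that semantics (the real matcher in Docker/BuildKit handles directories and `**` more precisely; this toy version just strips trailing `/` and uses `fnmatch`):

```python
from fnmatch import fnmatch

# The rules from the .dockerignore above, in order; '!' marks re-inclusion.
RULES = [
    "__pycache__/",
    "docs/",
    ".coverage",
    ".coverage.*",
    ".coverage/",
    "coverage.xml",
    ".readthedocs.yml",
    "*.toml",
    "!README.md",
]

def is_ignored(path):
    """Last matching rule wins; a '!' rule flips the file back to included."""
    ignored = False
    for raw in RULES:
        negate = raw.startswith("!")
        pattern = raw.lstrip("!").rstrip("/")
        if fnmatch(path, pattern):
            ignored = not negate
    return ignored

print(is_ignored("pyproject.toml"))  # True: matched by *.toml
print(is_ignored("README.md"))       # False: kept by !README.md
```

Here `!README.md` is defensive: no earlier rule excludes it, but the negation guarantees it survives even if one is added later.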
MONAI/source/.pre-commit-config.yaml ADDED
@@ -0,0 +1,61 @@
1
+ default_language_version:
2
+ python: python3
3
+
4
+ ci:
5
+ autofix_prs: true
6
+ autoupdate_commit_msg: '[pre-commit.ci] pre-commit suggestions'
7
+ autoupdate_schedule: quarterly
8
+ # submodules: true
9
+
10
+ repos:
11
+ - repo: https://github.com/pre-commit/pre-commit-hooks
12
+ rev: v5.0.0
13
+ hooks:
14
+ - id: end-of-file-fixer
15
+ - id: trailing-whitespace
16
+ - id: check-yaml
17
+ - id: check-docstring-first
18
+ - id: check-executables-have-shebangs
19
+ - id: check-toml
20
+ - id: check-case-conflict
21
+ - id: check-added-large-files
22
+ args: ['--maxkb=1024']
23
+ - id: detect-private-key
24
+ - id: forbid-new-submodules
25
+ - id: pretty-format-json
26
+ args: ['--autofix', '--no-sort-keys', '--indent=4']
27
+ - id: end-of-file-fixer
28
+ - id: mixed-line-ending
29
+ - repo: https://github.com/astral-sh/ruff-pre-commit
30
+ rev: v0.7.0
31
+ hooks:
32
+ - id: ruff
33
+ args: ["--fix"]
34
+ exclude: |
35
+ (?x)(
36
+ ^versioneer.py|
37
+ ^monai/_version.py
38
+ )
39
+
40
+ - repo: https://github.com/asottile/yesqa
41
+ rev: v1.5.0
42
+ hooks:
43
+ - id: yesqa
44
+ name: Unused noqa
45
+ additional_dependencies:
46
+ - flake8>=3.8.1
47
+ - flake8-bugbear<=24.2.6
48
+ - flake8-comprehensions
49
+ - pep8-naming
50
+ exclude: |
51
+ (?x)^(
52
+ monai/__init__.py|
53
+ docs/source/conf.py|
54
+ tests/utils.py
55
+ )$
56
+
57
+ - repo: https://github.com/hadialqattan/pycln
58
+ rev: v2.5.0
59
+ hooks:
60
+ - id: pycln
61
+ args: [--config=pyproject.toml]
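The `ruff` hook above uses a verbose-mode regex (`(?x)` ignores the literal whitespace and newlines) to exclude the version-machinery files from linting; pre-commit applies such patterns with a search against each file path. The check below confirms which paths the pattern skips (note the dots are unescaped, so strictly they match any character, though that is harmless here):

```python
import re

# The exclude pattern from the ruff hook above, verbatim.
exclude = re.compile(r"""(?x)(
    ^versioneer.py|
    ^monai/_version.py
)""")

print(bool(exclude.search("versioneer.py")))        # True: excluded
print(bool(exclude.search("monai/_version.py")))    # True: excluded
print(bool(exclude.search("monai/transforms.py")))  # False: still linted
```

The `yesqa` hook's exclude works the same way but additionally anchors the end with `$`, so it only matches those exact paths.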
MONAI/source/.readthedocs.yml ADDED
@@ -0,0 +1,28 @@
1
+ # .readthedocs.yml
2
+ # Read the Docs configuration file
3
+ # See https://docs.readthedocs.io/en/stable/config-file/v2.html for details
4
+
5
+ # Required
6
+ version: 2
7
+
8
+ # Build documentation in the docs/ directory with Sphinx
9
+ sphinx:
10
+ configuration: docs/source/conf.py
11
+
12
+ # Build documentation with MkDocs
13
+ #mkdocs:
14
+ # configuration: mkdocs.yml
15
+
16
+ # Optionally build your docs in additional formats such as PDF and ePub
17
+ # formats: all
18
+
19
+ # Optionally set the version of Python and requirements required to build your docs
20
+ python:
21
+ version: 3
22
+ install:
23
+ - requirements: docs/requirements.txt
24
+ # system_packages: true
25
+
26
+
27
+ build:
28
+ image: stable
MONAI/source/CHANGELOG.md ADDED
@@ -0,0 +1,1293 @@
1
+ # Changelog
2
+ All notable changes to MONAI are documented in this file.
3
+
4
+ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
5
+
6
+ ## [Unreleased]
7
+
8
+ ## [1.5.2] - 2026-01-28
9
+
10
+ ## What's Changed
11
+ ### Fixed
12
+ * Fix Zip Slip vulnerability in NGC private bundle download (#8682)
13
+
14
+ ## [1.5.1] - 2025-09-22
15
+
16
+ ## What's Changed
17
+ ### Added
18
+ * PyTorch 2.7 and 2.8 support (#8429, #8530)
19
+ * Create SECURITY.md (#8546)
20
+ * Add kwargs in array and functional file (#8508)
21
+ * Add .coderabbit.yaml File (#8513)
22
+ * Add input validation to ImageStats class (#8501)
23
+ * Add support for optional conditioning in PatchInferer, SliceInferer, and SlidingWindowInferer (#8400)
24
+ * Add classifier free guidance unconditioned value (#8562)
25
+ * Improved `DiffusionModelEncoder` to support output linear layers of different dimensions (#8578, #8580)
26
+
27
+ ### Fixed
28
+ * Fix for insecure zip file extraction to address [GHSA-x6ww-pf9m-m73m](https://github.com/Project-MONAI/MONAI/security/advisories/GHSA-x6ww-pf9m-m73m) (#8568)
29
+ * Fix for insecure use of `torch.load` and `pickle` to address [GHSA-6vm5-6jv9-rjpj](https://github.com/Project-MONAI/MONAI/security/advisories/GHSA-6vm5-6jv9-rjpj) and [GHSA-p8cm-mm2v-gwjm](https://github.com/Project-MONAI/MONAI/security/advisories/GHSA-p8cm-mm2v-gwjm) (#8566)
30
+ * Torchvision fix for loading pretrained weights using current syntax (#8563)
31
+ * Fix bug in MAISI vae (#8517)
32
+ * Throw exception on invalid images in retinanet detector (#8515)
33
+ * Fix: HistogramNormalized doc (#8543)
34
+ * Fix build failure by pinning pyamg to versions below 5.3.0 (#8548)
35
+ * Fix hardcoded input dim in DiffusionModelEncoder (#8514)
36
+ * Fix for gdown downloading fails (#8576)
37
+
38
+ ### Changed
39
+ * Update README badges to add research paper citations number (#8494)
40
+ * CI: Add custom timeout to ci job in order to save resources (#8504)
41
+ * Improve documentation on the datalist format (#8539)
42
+ * Tests Cleanup and refactor (#8405, #8535)
43
+ * Improve Orientation transform to use the "space" (LPS vs RAS) of a metatensor by default (#8473)
44
+ * Updated supported version of Huggingface Transformers (#8574)
45
+
46
+ ## [1.5.0] - 2025-06-13
47
+
48
+ ## What's Changed
49
+ ### Added
50
+ * Add platform-specific constraints to setup.cfg (#8260)
51
+ * Add PythonicWorkflow (#8151)
52
+ * Add SM architecture version check (#8199)
53
+ * Add MedNext implementation (#8004)
54
+ * Added a top button to CONTRIBUTING.md (#8163)
55
+ * Adding CODEOWNERS (#8457)
56
+ * Restormer Implementation (#8312)
57
+ * Add rectified flow noise scheduler for accelerated diffusion model (#8374)
58
+ * Add prediction type for rflow scheduler (#8386)
59
+ * Add Average Precision to metrics (#8089)
60
+ * Implementation of a Masked Autoencoder for representation learning (#8152)
61
+ * Implement TorchIO transforms wrapper analogous to TorchVision transfo… (#7579)
62
+ * 8328 nnunet bundle integration (#8329)
63
+ * Adding Support Policy + Doc Updates (#8458)
64
+ * Classifier free guidance (#8460)
65
+
66
+ ### Fixed
67
+ * Fix Ruff Numpy2 deprecation rules (#8179)
68
+ * Fix `torch.load()` frequently warning in PersistentDataset and GDSDataset (#8177)
69
+ * Fix the logging of a nested dictionary metric in MLflow (#8169)
70
+ * Fix ImageFilter to allow Gaussian filter without filter_size (#8189)
71
+ * Fix fold_constants, test_handler switched to onnx (#8211)
72
+ * Fix TypeError in meshgrid (#8252)
73
+ * Fix PatchMerging duplicate merging (#8285)
74
+ * Fix test load image issue (#8297)
75
+ * Fix bundle download error from ngc source (#8307)
76
+ * Fix deprecated usage in zarr (#8313, #8477)
77
+ * Fix DataFrame subsets indexing in CSVDataset() (#8351)
78
+ * Fix `packaging` imports in version comparison logic (#8347)
79
+ * Fix CommonKeys docstring (#8342)
80
+ * Fix: correctly apply fftshift to real-valued data inputs (#8407)
81
+ * Fix OptionalImportError: required package `openslide` is not installed (#8419)
82
+ * Fix cosine noise scheduler (#8427)
83
+ * Fix AutoencoderKL docstrings. (#8445)
84
+ * Inverse Threading Fix (#8418)
85
+ * Fix normalize intensity (#8286)
86
+ * Fix path at test onnx trt export (#8361)
87
+ * Fix broken urls (#8481, #8483)
88
+
89
+ ### Changed
90
+ * [DOC] Update README.md (#8157)
91
+ * Streamlined Rearrange in SpatialAttentionBlock (#8130)
92
+ * Optimize VISTA3D (#8123)
93
+ * Skip torch trt convert test with torch newer than or equal to 2.5.0 (#8165)
94
+ * Enable redirection of all loggers by configuring a FileHandler within the bundle (#8142)
95
+ * Apply pyupgrade fixes for Python 3.9+ syntax (#8150)
96
+ * Update base image to 2410 (#8164)
97
+ * TRT support for MAISI (#8153)
98
+ * 8134 Add unit test for responsive inference (#8146)
99
+ * SwinUNETR refactor to accept additional parameters (#8212)
100
+ * Allow an arbitrary mask to be used in the self attention (#8235)
101
+ * Bump codecov/codecov-action from 4 to 5 (#8245)
102
+ * Docs: update brats classes description (#8246)
103
+ * Change default value of `patch_norm` to False in `SwinUNETR` (#8249)
104
+ * Modify Dice, Jaccard and Tversky losses (#8138)
105
+ * Modify Workflow to Allow IterableDataset Inputs (#8263)
106
+ * Enhance download_and_extract (#8216)
107
+ * Relax gpu load check (#8282, #8275)
108
+ * Using LocalStore in Zarr v3 (#8299)
109
+ * Enable gpu load nifti (#8188)
110
+ * update pydicom reader to enable gpu load (#8283)
111
+ * Zarr compression tests only with versions before 3.0 (#8319)
112
+ * Changing utils.py to test_utils.py (#8335)
113
+ * Refactor testd (#8231)
114
+ * Recursive Item Mapping for Nested Lists in Compose (#8187)
115
+ * Bump min torch to 1.13.1 to mitigate CVE-2022-45907 unsafe usage of eval (#8296)
116
+ * Inferer modification - save_intermediates clashes with latent shape adjustment in latent diffusion inferers (#8343)
117
+ * Solves path problem in test_bundle_trt_export.py (#8357)
118
+ * Modify ControlNet inferer so that it takes in context when the diffus… (#8360)
119
+ * Update monaihosting download method (#8364)
120
+ * Bump torch minimum to mitigate CVE-2024-31580 & CVE-2024-31583 and enable numpy 2 compatibility (#8368)
121
+ * Auto3DSeg algo_template hash update (#8378)
122
+ * Enable Pytorch 2.6 (#8309)
123
+ * Auto3DSeg algo_template hash update (#8393, #8397)
124
+ * Update Dice Metric Docs (#8388)
125
+ * Auto3DSeg algo_template hash update (#8406)
126
+ * Update bundle download API (#8403)
127
+ * Add Skip test in TestTranschex (#8416)
128
+ * Update get latest bundle version function (#8420)
129
+ * Temporarily Restrict setuptools Version to 79.0.1 (#8441)
130
+ * Update default overlap value in occlusion_sensitivity to 0.6 (#8446)
131
+ * Enable code coverage comments on PRs in codecov configuration (#8402)
132
+ * Migrate to modern Python Logger API (#8449)
133
+
134
+ ### Deprecated
135
+ ### Removed
136
+ * Remove deprecated functionality for v1.5 (#8430)
137
+ * Remove deprecated `return_state_dict ` in bundle `load` (#8454)
138
+ * Remove deprecated `net_name` in test file (#8461)
139
+ * Remove unused test cases in bundle load (#8463)
140
* selfattention block: Remove the fc linear layer if it is not used (#8325)
* Removed outdated `torch` version checks from transform functions (#8359)

## [1.4.0] - 2024-10-17
### Added
* Implemented Conjugate Gradient Solver to generate confidence maps (#7876)
* Added `norm` parameter to `ResNet` (#7752, #7805)
* Introduced `alpha` parameter to `DiceFocalLoss` for improved flexibility (#7841)
* Integrated Tailored ControlNet Implementations (#7875)
* Integrated Tailored Auto-Encoder Model (#7861)
* Integrated Tailored Diffusion U-Net Model (#7867)
* Added Maisi morphological functions (#7893)
* Added support for downloading bundles from the NGC private registry (#7907, #7929, #8076)
* Integrated generative refactor into the core (#7886, #7962)
* Made `ViT` and `UNETR` models compatible with TorchScript (#7937)
* Implemented post-download checks for MONAI bundles and compatibility warnings (#7938)
* Added NGC prefix argument when downloading bundles (#7974)
* Added flash attention support in the attention block for improved performance (#7977)
* Enhanced `MLPBlock` for compatibility with VISTA-3D (#7995)
* Added support for Neighbor-Aware Calibration Loss (NACL) for calibrated models in segmentation tasks (#7819)
* Added `label_smoothing` parameter to `DiceCELoss` for enhanced model calibration (#8000)
* Added `include_fc` and `use_combined_linear` arguments in the `SABlock` (#7996)
* Added utilities, networks, and an inferer specific to VISTA-3D (#7999, #7987, #8047, #8059, #8021)
* Integrated a new network, `CellSamWrapper`, for cell-based applications (#7981)
* Introduced `WriteFileMapping` transform to map between input image paths and their corresponding output paths (#7769)
* Added `TrtHandler` to accelerate models using TensorRT (#7990, #8064)
* Added box and points conversion transforms for more flexible spatial manipulation (#8053)
* Enhanced `RandSimulateLowResolutiond` transform with deterministic support (#8057)
* Added a `contiguous` argument to the `Fourier` class to facilitate contiguous tensor outputs (#7969)
* Allowed `ApplyTransformToPointsd` to receive a sequence of reference keys for more versatile point manipulation (#8063)
* Made `MetaTensor` an optional print in `DataStats` and `DataStatsd` for more concise logging (#7814)
#### misc.
* Refactored `Dataset` to utilize `Compose` for handling transforms (#7784)
* Combined `map_classes_to_indices` and `generate_label_classes_crop_centers` into a unified function (#7712)
* Introduced metadata schema directly into the codebase for improved structure and validation (#7409)
* Renamed `optional_packages_version` to `required_packages_version` for clearer package dependency management (#7253)
* Replaced `pkg_resources` with the more modern `packaging` module for package handling (#7953)
* Refactored MAISI-related networks to align with the new generative components (#7989, #7993, #8005)
* Added a badge displaying monthly download statistics to enhance project visibility (#7891)
### Fixed
#### transforms
* Ensured deterministic behavior in `MixUp`, `CutMix`, and `CutOut` transforms (#7813)
* Applied a minor correction to the `AsDiscrete` transform (#7984)
* Fixed handling of integer weightmaps in `RandomWeightedCrop` (#8097)
* Resolved data type bug in `ScaleIntensityRangePercentile` (#8109)
#### data
* Fixed negative strides issue in the `NrrdReader` (#7809)
* Addressed wsireader issue with retrieving MPP (#7921)
* Ensured location is returned as a tuple in wsireader (#8007)
* Corrected interpretation of space directions in nrrd reader (#8091)
#### metrics and losses
* Improved memory management for `NACLLoss` (#8020)
* Fixed reduction logic in `GeneralizedDiceScore` (#7970)
#### networks
* Resolved issue with loading pre-trained weights in `ResNet` (#7924)
* Fixed error where the `torch.device` object had no attribute `gpu_id` during TensorRT export (#8019)
* Corrected function for loading older weights in `DiffusionModelUNet` (#8031)
* Switched to `torch_tensorrt.Device` instead of `torch.device` during TensorRT compilation (#8051)
#### engines and handlers
* Attempted to resolve the "experiment already exists" issue in `MLFlowHandler` (#7916)
* Refactored the model export process for conversion and saving (#7934)
#### misc.
* Adjusted requirements to exclude Numpy version 2.0 (#7859)
* Updated deprecated `scipy.ndimage` namespaces in optional imports (#7847, #7897)
* Resolved `load_module()` deprecation in Python 3.12 (#7881)
* Fixed Ruff type check issues (#7885)
* Cleaned disk space in the conda test pipeline (#7902)
* Replaced deprecated `pkgutil.find_loader` usage (#7906)
* Enhanced docstrings in various modules (#7913, #8055)
* Fixed test cases (#7905, #7794, #7808)
* Fixed mypy issue introduced in 1.11.0 (#7941)
* Cleaned up warnings during test collection (#7914)
* Fixed incompatible types in assignment (#7950)
* Fixed outdated link in the docs (#7971)
* Addressed CI issues (#7983, #8013)
* Fixed an issue where a module could not be imported correctly (#8015)
* Fixed `AttributeError` when using `torch.min` and `torch.max` (#8041)
* Ensured synchronization by adding `cuda.synchronize` (#8058)
* Ignored warning from nptyping as a workaround (#8062)
* Suppressed deprecation warning when importing monai (#8067)
* Fixed link in test bundle under MONAI-extra-test-data (#8092)
### Changed
* Base Docker image upgraded to `nvcr.io/nvidia/pytorch:24.08-py3` from `nvcr.io/nvidia/pytorch:23.08-py3`
* Changed blossom-ci to ACL security format (#7843)
* Moved PyType test to weekly test (#8025)
* Adjusted to meet Numpy 2.0 requirements (#7857)
### Deprecated
* Dropped support for Python 3.8 (#7909)
* Removed deprecated arguments and classes for v1.4 (#8079)
### Removed
* Removed use of `strtobool`, which is unavailable in Python 3.12 (#7900)
* Removed the pipeline for publishing to testpypi (#8086)
* Cleaned up some very old and now obsolete infrastructure (#8113, #8118, #8121)

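The conjugate-gradient entry above (#7876) concerns MONAI's solver for ultrasound confidence maps; the classic algorithm it builds on can be sketched in a few lines of plain Python. This is an illustrative sketch, not MONAI's implementation or API — the function names here are hypothetical:

```python
def dot(u, v):
    # Inner product of two equal-length vectors.
    return sum(a * b for a, b in zip(u, v))

def conjugate_gradient(matvec, b, tol=1e-10, max_iter=1000):
    # Solve A x = b for a symmetric positive-definite operator given as
    # a matrix-vector product `matvec`, starting from x = 0.
    x = [0.0] * len(b)
    r = [bi - ai for bi, ai in zip(b, matvec(x))]  # residual b - A x
    p = list(r)                                    # initial search direction
    rs_old = dot(r, r)
    for _ in range(max_iter):
        if rs_old < tol:
            break
        Ap = matvec(p)
        alpha = rs_old / dot(p, Ap)                # step length along p
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = dot(r, r)
        # New direction is A-conjugate to the previous ones.
        p = [ri + (rs_new / rs_old) * pi for ri, pi in zip(r, p)]
        rs_old = rs_new
    return x

# Small SPD example: A = [[4, 1], [1, 3]], b = [1, 2]
A = [[4.0, 1.0], [1.0, 3.0]]
solution = conjugate_gradient(lambda v: [dot(row, v) for row in A], [1.0, 2.0])
```

For a 2x2 system like this, CG converges exactly within two iterations; MONAI's version applies the same idea to the large sparse systems arising from confidence-map estimation.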
## [1.3.2] - 2024-06-25
### Fixed
#### misc.
* Updated Numpy version constraint to < 2.0 (#7859)

## [1.3.1] - 2024-05-17
### Added
* Support for `by_measure` argument in `RemoveSmallObjects` (#7137)
* Support for `pretrained` flag in `ResNet` (#7095)
* Support for uploading and downloading bundles to and from the Hugging Face Hub (#6454)
* Added `weight` parameter in `DiceLoss` to apply weight to voxels of each class (#7158)
* Support for returning Dice for each class in `DiceMetric` (#7163)
* Introduced `ComponentStore` for storage purposes (#7159)
* Added utilities used in MONAI Generative (#7134)
* Enabled Python 3.11 support for `convert_to_torchscript` and `convert_to_onnx` (#7182)
* Support for MLflow in `AutoRunner` (#7176)
* `fname_regex` option in `PydicomReader` (#7181)
* Allowed setting `AutoRunner` parameters from config (#7175)
* `VoxelMorphUNet` and `VoxelMorph` (#7178)
* Enabled `cache` option in `GridPatchDataset` (#7180)
* Introduced `class_labels` option in `write_metrics_reports` for improved readability (#7249)
* `DiffusionLoss` for image registration task (#7272)
* Supported specifying `filename` in `SaveImage` (#7318)
* Compile support in `SupervisedTrainer` and `SupervisedEvaluator` (#7375)
* `mlflow_experiment_name` support in `Auto3DSeg` (#7442)
* Arm support (#7500)
* `BarlowTwinsLoss` for representation learning (#7530)
* `SURELoss` and `ConjugateGradient` for diffusion models (#7308)
* Support for `CutMix`, `CutOut`, and `MixUp` augmentation techniques (#7198)
* `meta_file` and `logging_file` options to `BundleWorkflow` (#7549)
* `properties_path` option to `BundleWorkflow` for customized properties (#7542)
* Support for both soft and hard clipping in `ClipIntensityPercentiles` (#7535)
* Support for not saving artifacts in `MLFlowHandler` (#7604)
* Support for multi-channel images in `PerceptualLoss` (#7568)
* Added ResNet backbone for `FlexibleUNet` (#7571)
* Introduced `dim_head` option in `SABlock` to set dimensions for each head (#7664)
* Direct links from the docs to the GitHub source code (#7738, #7779)
#### misc.
* Refactored `list_data_collate` and `collate_meta_tensor` to utilize the latest PyTorch API (#7165)
* Added `__str__` method in `Metric` base class (#7487)
* Made enhancements for testing files (#7662, #7670, #7663, #7671, #7672)
* Improved documentation for bundles (#7116)
### Fixed
#### transforms
* Addressed issue where lazy mode was ignored in `SpatialPadd` (#7316)
* Tracked applied operations in `ImageFilter` (#7395)
* A warning is now raised only if a missing class is not set to 0 in `generate_label_classes_crop_centers` (#7602)
* Input is now always converted to C-order in `distance_transform_edt` to ensure consistent behavior (#7675)
#### data
* Modified `.npz` file behavior to use keys in `NumpyReader` (#7148)
* Handled corrupted cached files in `PersistentDataset` (#7244)
* Corrected affine update in `NrrdReader` (#7415)
#### metrics and losses
* Addressed precision issue in `get_confusion_matrix` (#7187)
* Harmonized and clarified documentation and tests for Dice loss variants (#7587)
#### networks
* Removed hard-coded `spatial_dims` in `SwinTransformer` (#7302)
* Fixed learnable `position_embeddings` in `PatchEmbeddingBlock` (#7564, #7605)
* Removed `memory_pool_limit` in TRT config (#7647)
* Propagated `kernel_size` to `ConvBlocks` within `AttentionUnet` (#7734)
* Addressed hard-coded activation layer in `ResNet` (#7749)
#### bundle
* Resolved bundle download issue (#7280)
* Updated `bundle_root` directory for `NNIGen` (#7586)
* Checked for `num_fold` and failed early if incorrect (#7634)
* Enhanced logging logic in `ConfigWorkflow` (#7745)
#### misc.
* Enabled chaining in `Auto3DSeg` CLI (#7168)
* Addressed useless error message in `nnUNetV2Runner` (#7217)
* Resolved typing and deprecation issues in Mypy (#7231)
* Quoted `$PY_EXE` variable to handle Python paths that contain spaces in Bash (#7268)
* Improved documentation, code examples, and warning messages in various modules (#7234, #7213, #7271, #7326, #7569, #7584)
* Fixed typos in various modules (#7321, #7322, #7458, #7595, #7612)
* Enhanced docstrings in various modules (#7245, #7381, #7746)
* Handled error when data is on CPU in `DataAnalyzer` (#7310)
* Updated version requirements for third-party packages (#7343, #7344, #7384, #7448, #7659, #7704, #7744, #7742, #7780)
* Addressed incorrect slice computation in `ImageStats` (#7374)
* Avoided editing a loop's mutable iterable to address B308 (#7397)
* Fixed issue with `CUDA_VISIBLE_DEVICES` setting being ignored (#7408, #7581)
* Avoided changing Python version in CICD (#7424)
* Renamed `partial` to `callable` in instantiate mode (#7413)
* Imported `AttributeError` for Python 3.12 compatibility (#7482)
* Updated `nnUNetV2Runner` to support nnunetv2 2.2 (#7483)
* Used `uint8` instead of `int8` in `LabelStats` (#7489)
* Utilized subprocess for nnUNet training (#7576)
* Addressed deprecation warning in Ruff (#7625)
* Fixed downloading failure on FIPS machines (#7698)
* Updated `torch_tensorrt` compile parameters to avoid warning (#7714)
* Restricted `Auto3DSeg` fold input based on the datalist (#7778)
### Changed
* Base Docker image upgraded to `nvcr.io/nvidia/pytorch:24.03-py3` from `nvcr.io/nvidia/pytorch:23.08-py3`
### Removed
* Removed unrecommended star-arg unpacking after a keyword argument, addressed B026 (#7262)
* Skipped old PyTorch version test for `SwinUNETR` (#7266)
* Dropped docker build workflow and migrated to Nvidia Blossom system (#7450)
* Dropped Python 3.8 test on quick-py3 workflow (#7719)

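The `MixUp` support added in this release (#7198) follows the standard MixUp recipe: blend two samples and their one-hot labels with a weight drawn from a Beta distribution. A minimal stdlib-only sketch of that idea, not MONAI's transform API:

```python
import random

def mixup(x1, y1, x2, y2, alpha=0.4, rng=None):
    # MixUp: draw lam ~ Beta(alpha, alpha), then linearly blend both the
    # inputs and the one-hot labels with the same weight.
    rng = rng or random.Random()
    lam = rng.betavariate(alpha, alpha)
    x = [lam * a + (1 - lam) * b for a, b in zip(x1, x2)]
    y = [lam * a + (1 - lam) * b for a, b in zip(y1, y2)]
    return x, y, lam
```

MONAI's `MixUp` transform applies the same blending to image tensors in a batch (and, per #7813 above, does so deterministically when seeded).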
## [1.3.0] - 2023-10-12
### Added
* Intensity transforms `ScaleIntensityFixedMean` and `RandScaleIntensityFixedMean` (#6542)
* `UltrasoundConfidenceMapTransform` used for computing a confidence map from an ultrasound image (#6709)
* `channel_wise` support in `RandScaleIntensity` and `RandShiftIntensity` (#6793, #7025)
* `RandSimulateLowResolution` and `RandSimulateLowResolutiond` (#6806)
* `SignalFillEmptyd` (#7011)
* Euclidean distance transform `DistanceTransformEDT` with GPU support (#6981)
* Port loss and metrics from `monai-generative` (#6729, #6836)
* Support `invert_image` and `retain_stats` in `AdjustContrast` and `RandAdjustContrast` (#6542)
* New networks `DAF3D` and `Quicknat` (#6306)
* Support `sincos` position embedding (#6986)
* `ZarrAvgMerger` used for patch inference (#6633)
* Dataset tracking support to `MLFlowHandler` (#6616)
* Considering spacing and subvoxel borders in `SurfaceDiceMetric` (#6681)
* cuCIM support for surface-related metrics (#7008)
* `loss_fn` support in `IgniteMetric` and renamed it to `IgniteMetricHandler` (#6695)
* `CallableEventWithFilter` and `Events` options for `trigger_event` in `GarbageCollector` (#6663)
* Support random sorting option to `GridPatch`, `RandGridPatch`, `GridPatchd` and `RandGridPatchd` (#6701)
* Support multi-threaded batch sampling in `PatchInferer` (#6139)
* `SoftclDiceLoss` and `SoftDiceclDiceLoss` (#6763)
* `HausdorffDTLoss` and `LogHausdorffDTLoss` (#6994)
* Documentation for `TensorFloat-32` (#6770)
* Docstring format guide (#6780)
* `GDSDataset` support for GDS (#6778)
* PyTorch backend support for `MapLabelValue` (#6872)
* `filter_func` in `copy_model_state` to filter the weights to be loaded and `filter_swinunetr` (#6917)
* `stats_sender` to `MonaiAlgo` for FL stats (#6984)
* `freeze_layers` to help freeze specific layers (#6970)
#### misc.
* Refactor multi-node running command used in `Auto3DSeg` into dedicated functions (#6623)
* Support str type annotation to `device` in `ToTensorD` (#6737)
* Improve logging message and file name extension in `DataAnalyzer` for `Auto3DSeg` (#6758)
* Set `data_range` as a property in `SSIMLoss` (#6788)
* Unify environment variable access (#7084)
* `end_lr` support in `WarmupCosineSchedule` (#6662)
* Add `ClearML` as optional dependency (#6827)
* `yandex.disk` support in `download_url` (#6667)
* Improve config expression error message (#6977)
### Fixed
#### transforms
* Make `convert_box_to_mask` throw errors when the box size is larger than the image (#6637)
* Fix lazy mode in `RandAffine` (#6774)
* Raise `ValueError` when `map_items` is bool in `Compose` (#6882)
* Improve performance for `NormalizeIntensity` (#6887)
* Fix mismatched shape in `Spacing` (#6912)
* Avoid `FutureWarning` in `CropForeground` (#6934)
* Fix `Lazy=True` ignored when using `Dataset` call (#6975)
* Shape check for arbitrary types for `DataStats` (#7082)
#### data
* Fix wrong spacing checking logic in `PydicomReader` and broken link in `ITKReader` (#6660)
* Fix boolean indexing of batched `MetaTensor` (#6781)
* Raise warning when multiprocessing in `DataLoader` (#6830)
* Remove `shuffle` in `DistributedWeightedRandomSampler` (#6886)
* Fix missing `SegmentDescription` in `PydicomReader` (#6937)
* Fix reading DICOM series error in `ITKReader` (#6943)
* Fix `KeyError` in `PydicomReader` (#6946)
* Update `metatensor_to_itk_image` to accept RAS `MetaTensor` and update default 'space' in `NrrdReader` to `SpaceKeys.LPS` (#7000)
* Collate common meta dictionary keys (#7054)
#### metrics and losses
* Fixed bug in `GeneralizedDiceLoss` when `batch=True` (#6775)
* Support for `BCEWithLogitsLoss` in `DiceCELoss` (#6924)
* Support for `weight` in Dice and related losses (#7098)
#### networks
* Use `np.prod` instead of `np.product` (#6639)
* Fix dimension issue in `MBConvBlock` (#6672)
* Fix hard-coded `up_kernel_size` in `ViTAutoEnc` (#6735)
* Remove hard-coded `bias_downsample` in `resnet` (#6848)
* Fix unused `kernel_size` in `ResBlock` (#6999)
* Allow for defining reference grid on non-integer coordinates (#7032)
* Padding option for autoencoder (#7068)
* Lower peak memory usage for `SegResNetDS` (#7066)
#### bundle
* Set `train_dataset_data` and `dataset_data` as not required in `BundleProperty` (#6607)
* Set `None` to properties that do not have `REF_ID` (#6607)
* Fix `AttributeError` for default value in `get_parsed_content` for `ConfigParser` (#6756)
* Update `monai.bundle.scripts` to support NGC hosting (#6828, #6997)
* Add `MetaProperties` (#6835)
* Add `create_workflow` and update `load` function (#6835)
* Add bundle root directory to Python search directories automatically (#6910)
* Generate properties for bundle docs automatically (#6918)
* Move `download_large_files` from model zoo to core (#6958)
* Bundle syntax `#` as alias of `::` (#6955)
* Fix bundle download naming issue (#6969, #6963)
* Simplify the usage of `ckpt_export` (#6965)
* `update_kwargs` in `monai.bundle.script` for merging multiple configs (#7109)
#### engines and handlers
* Added int options for `iteration_log` and `epoch_log` in `TensorBoardStatsHandler` (#7027)
* Support running the validator at training start (#7108)
#### misc.
* Fix device fallback error in `DataAnalyzer` (#6658)
* Add int check for `current_mode` in `convert_applied_interp_mode` (#6719)
* Consistent type in `convert_to_contiguous` (#6849)
* Label `argmax` in `DataAnalyzer` when retrying on CPU (#6852)
* Fix `DataAnalyzer` with `histogram_only=True` (#6874)
* Fix `AttributeError` in `RankFilter` in a single-GPU environment (#6895)
* Remove the default warning on `TORCH_ALLOW_TF32_CUBLAS_OVERRIDE` and add debug print info (#6909)
* Hide user information in `print_config` (#6913, #6922)
* Optionally pass coordinates to predictor during sliding window (#6795)
* Proper ensembling when trained with a sigmoid in `AutoRunner` (#6588)
* Fixed `test_retinanet` by increasing absolute differences (#6615)
* Add type check to avoid comparing an `np.array` with a string in `_check_kwargs_are_present` (#6624)
* Fix MD5 hashing with FIPS mode (#6635)
* Capture failures from Auto3DSeg-related subprocess calls (#6596)
* Code formatting tool for user-specified directory (#7106)
* Various docstring fixes
### Changed
* Base Docker image upgraded to `nvcr.io/nvidia/pytorch:23.08-py3` from `nvcr.io/nvidia/pytorch:23.03-py3`
### Deprecated
* `allow_smaller=True`; `allow_smaller=False` will be the new default in `CropForeground` and `generate_spatial_bounding_box` (#6736)
* `dropout_prob` in `VNet` in favor of `dropout_prob_down` and `dropout_prob_up` (#6768)
* `workflow` in `BundleWorkflow` in favor of `workflow_type` (#6768)
* `pos_embed` in `PatchEmbeddingBlock` in favor of `proj_type` (#6986)
* `net_name` and `net_kwargs` in `download` in favor of `model` (#7016)
* `img_size` parameter in `SwinUNETR` (#7093)
### Removed
* `pad_val`, `stride`, `per_channel` and `upsampler` in `OcclusionSensitivity` (#6642)
* `compute_meaniou` (#7019)
* `AsChannelFirst`, `AddChannel` and `SplitChannel` (#7019)
* `create_multigpu_supervised_trainer` and `create_multigpu_supervised_evaluator` (#7019)
* `runner_id` in `run` (#7019)
* `data_src_cfg_filename` in `AlgoEnsembleBuilder` (#7019)
* `get_validation_stats` in `Evaluator` and `get_train_stats` in `Trainer` (#7019)
* `epoch_interval` and `iteration_interval` in `TensorBoardStatsHandler` (#7019)
* Some self-hosted tests (#7041)

## [1.2.0] - 2023-06-08
### Added
* Various Auto3DSeg enhancements and integration tests including multi-node multi-GPU optimization, major usability improvements
* TensorRT and ONNX support for `monai.bundle` API and the relevant models
* nnU-Net V2 integration `monai.apps.nnunet`
* Binary and categorical metrics and event handlers using `MetricsReloaded`
* Python module and CLI entry point for bundle workflows in `monai.bundle.workflows` and `monai.fl.client`
* Modular patch inference API including `PatchInferer`, `merger`, and `splitter`
* Initial release of lazy resampling including transforms and MetaTensor implementations
* Bridge for ITK Image object and MetaTensor `monai.data.itk_torch_bridge`
* Sliding window inference memory efficiency optimization including `SlidingWindowInfererAdapt`
* Generic kernel filtering transforms `ImageFiltered` and `RandImageFiltered`
* Trainable bilateral filters and joint bilateral filters
* ClearML stats and image handlers for experiment tracking
#### misc.
* Utility functions to warn API default value changes (#5738)
* Support of dot notation to access content of `ConfigParser` (#5813)
* Softmax version to focal loss (#6544)
* FROC metric for N-dimensional data (#6528)
* Extend `SurfaceDiceMetric` for 3D images (#6549)
* A `track_meta` option for `Lambda` and derived transforms (#6385)
* CLIP pre-trained text-to-vision embedding (#6282)
* Optional spacing to surface distance calculations (#6144)
* `WSIReader` read by power and mpp (#6244)
* Support GPU tensors for `GridPatch` and `GridPatchDataset` (#6246)
* `SomeOf` transform composer (#6143)
* `GridPatch` with both count and threshold filtering (#6055)
### Fixed
#### transforms
* `map_classes_to_indices` efficiency issue (#6468)
* Adaptive resampling mode based on backends (#6429)
* Improve `Compose` encapsulation (#6224)
* User-provided `FolderLayout` in `SaveImage` and `SaveImaged` transforms (#6213)
* `SpacingD` output shape compute stability (#6126)
* Avoid mutating ratio / user inputs in `croppad` (#6127)
* A `warn` flag to `RandCropByLabelClasses` (#6121)
* `nan` to indicate `no_channel`, split dim singleton (#6090)
* Compatible padding mode (#6076)
* Allow for missing `filename_or_obj` key (#5980)
* `Spacing` pixdim in-place change (#5950)
* Add warning in `RandHistogramShift` (#5877)
* Exclude `cuCIM` wrappers from `get_transform_backends` (#5838)
#### data
* `__format__` implementation of `MetaTensor` (#6523)
* `channel_dim` in `TiffFileWSIReader` and `CuCIMWSIReader` (#6514)
* Prepend `"meta"` to `MetaTensor.__repr__` and `MetaTensor.__str__` for easier identification (#6214)
* `MetaTensor` slicing issue (#5845)
* Default writer flags (#6147)
* `WSIReader` defaults and tensor conversion (#6058)
* Remove redundant array copy for `WSITiffFileReader` (#6089)
* Fix unused arg in `SlidingPatchWSIDataset` (#6047)
* `reverse_indexing` for `PILReader` (#6008)
* Use `np.linalg` for the small affine inverse (#5967)
#### metrics and losses
* Removing L2-norm in contrastive loss (L2-norm already present in CosSim) (#6550)
* Fix the SSIM metric (#6250)
* Efficiency issues of Dice metrics (#6412)
* Generalized Dice issue (#5929)
* Unify output tensor devices for multiple metrics (#5924)
#### networks
* Make `RetinaNet` throw errors for NaN only when training (#6479)
* Replace deprecated arg in torchvision models (#6401)
* Improve NVFuser import check (#6399)
* Add `device` in `HoVerNetNuclearTypePostProcessing` and `HoVerNetInstanceMapPostProcessing` (#6333)
* Enhance HoVerNet load-pretrained function (#6269)
* Access to the `att_mat` in self-attention modules (#6493)
* Optional SwinUNETR-v2 (#6203)
* Add transform to handle empty boxes as training data for `retinanet_detector` (#6170)
* GPU utilization of DiNTS network (#6050)
* A pixelshuffle upsample shape mismatch problem (#5982)
* GEGLU activation function for the MLP Block (#5856)
* Constructors for `DenseNet` derived classes (#5846)
* Flexible interpolation modes in `regunet` (#5807)
#### bundle
* Optimized the `deepcopy` logic in `ConfigParser` (#6464)
* Improve check and error message of bundle run (#6400)
* Warn or raise `ValueError` on duplicated key in JSON/YAML config (#6252)
* Default metadata and logging values for bundle run (#6072)
* `pprint` head and tail in bundle script (#5969)
* Config parsing issue for substring reference (#5932)
* Fix instantiate for object instantiation with attribute `path` (#5866)
* Fix `_get_latest_bundle_version` issue on Windows (#5787)
#### engines and handlers
* MLflow handler run bug (#6446)
* `monai.engine` training attribute check (#6132)
* Update `StatsHandler` logging message (#6051)
* Added callable options for `iteration_log` and `epoch_log` in TensorBoard and MLFlow (#5976)
* `CheckpointSaver` logging error (#6026)
* Callable options for `iteration_log` and `epoch_log` in `StatsHandler` (#5965)
#### misc.
* Avoid creating `cufile.log` when `import monai` (#6106)
* `monai._extensions` module compatibility with ROCm (#6161)
* Issue of repeated `UserWarning`: "TypedStorage is deprecated" (#6105)
* Use logging config at module level (#5960)
* Add ITK to the list of optional dependencies (#5858)
* `RankFilter` to skip logging when the rank is not meeting criteria (#6243)
* Various documentation issues
### Changed
* Overall more precise and consistent type annotations
* Optionally depend on PyTorch-Ignite v0.4.11 instead of v0.4.10
* Base Docker image upgraded to `nvcr.io/nvidia/pytorch:23.03-py3` from `nvcr.io/nvidia/pytorch:22.10-py3`
### Deprecated
* `resample=True`; `resample=False` will be the new default in `SaveImage`
* `random_size=True`; `random_size=False` will be the new default for the random cropping transforms
* `image_only=False`; `image_only=True` will be the new default in `LoadImage`
* `AddChannel` and `AsChannelFirst` in favor of `EnsureChannelFirst`
### Removed
* Deprecated APIs since v0.9, including `WSIReader` from `monai.apps`, `NiftiSaver` and `PNGSaver` from `monai.data`
* Support for PyTorch 1.8
* Support for Python 3.7

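The `SomeOf` composer added in this release (#6143) applies a random subset of its transforms, complementing `OneOf`. A minimal stdlib-only sketch of the idea (illustrative only; MONAI's `SomeOf` has a richer signature, e.g. weighted sampling and a variable number of picks):

```python
import random

class SomeOf:
    """Apply a fixed-size random subset of `transforms`, preserving order."""

    def __init__(self, transforms, num_transforms, rng=None):
        self.transforms = list(transforms)
        self.num_transforms = num_transforms
        self.rng = rng or random.Random()

    def __call__(self, data):
        # Sample which transforms to apply, then run them in their original order.
        chosen = sorted(self.rng.sample(range(len(self.transforms)), self.num_transforms))
        for i in chosen:
            data = self.transforms[i](data)
        return data
```

With `num_transforms` equal to the number of transforms this degenerates to a plain `Compose`; smaller values give the random-subset behavior.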
569
+ ## [1.1.0] - 2022-12-19
570
+ ### Added
571
+ * Hover-Net based digital pathology workflows including new network, loss, postprocessing, metric, training, and inference modules
572
+ * Various enhancements for Auto3dSeg `AutoRunner` including template caching, selection, and a dry-run mode `nni_dry_run`
573
+ * Various enhancements for Auto3dSeg algo templates including new state-of-the-art configurations, optimized GPU memory utilization
574
+ * New bundle API and configurations to support experiment management including `MLFlowHandler`
575
+ * New `bundle.script` API to support model zoo query and download
576
+ * `LossMetric` metric to compute loss as cumulative metric measurement
577
+ * Transforms and base transform APIs including `RandomizableTrait` and `MedianSmooth`
578
+ * `runtime_cache` option for `CacheDataset` and the derived classes to allow for shared caching on the fly
579
+ * Flexible name formatter for `SaveImage` transform
580
+ * `pending_operations` MetaTensor property and basic APIs for lazy image resampling
581
+ * Contrastive sensitivity for SSIM metric
582
+ * Extensible backbones for `FlexibleUNet`
583
+ * Generalize `SobelGradients` to 3D and any spatial axes
584
+ * `warmup_multiplier` option for `WarmupCosineSchedule`
585
+ * F beta score metric based on confusion matrix metric
586
+ * Support of key overwriting in `Lambdad`
587
+ * Basic premerge tests for Python 3.11
588
+ * Unit and integration tests for CUDA 11.6, 11.7 and A100 GPU
589
+ * `DataAnalyzer` handles minor image-label shape inconsistencies
590
+ ### Fixed
591
+ * Review and enhance previously untyped APIs with additional type annotations and casts
592
+ * `switch_endianness` in LoadImage now supports tensor input
593
+ * Reduced memory footprint for various Auto3dSeg tests
594
+ * Issue of `@` in `monai.bundle.ReferenceResolver`
595
+ * Compatibility issue with ITK-Python 5.3 (converting `itkMatrixF44` for default collate)
596
+ * Inconsistent of sform and qform when using different backends for `SaveImage`
597
+ * `MetaTensor.shape` call now returns a `torch.Size` instead of tuple
598
+ * Issue of channel reduction in `GeneralizedDiceLoss`
599
+ * Issue of background handling before softmax in `DiceFocalLoss`
600
+ * Numerical issue of `LocalNormalizedCrossCorrelationLoss`
601
+ * Issue of incompatible view size in `ConfusionMatrixMetric`
602
+ * `NetAdapter` compatibility with Torchscript
603
+ * Issue of `extract_levels` in `RegUNet`
604
+ * Optional `bias_downsample` in `ResNet`
605
+ * `dtype` overflow for `ShiftIntensity` transform
606
+ * Randomized transforms such as `RandCuCIM` now inherit `RandomizableTrait`
607
+ * `fg_indices.size` compatibility issue in `generate_pos_neg_label_crop_centers`
608
+ * Issue when inverting `ToTensor`
609
+ * Issue of capital letters in filename suffixes check in `LoadImage`
610
+ * Minor tensor compatibility issues in `apps.nuclick.transforms`
611
+ * Issue of float16 in `verify_net_in_out`
612
+ * `std` variable type issue for `RandRicianNoise`
613
+ * `DataAnalyzer` accepts `None` as label key and checks empty labels
614
+ * `iter_patch_position` now has a smaller memory footprint
615
+ * `CumulativeAverage` has been refactored and enhanced to allow for simple tracking of metric running stats.
616
+ * Multi-threading issue for `MLFlowHandler`
617
+ ### Changed
618
+ * Printing a MetaTensor now generates a less verbose representation
619
+ * `DistributedSampler` raises a ValueError if there are too few devices
620
+ * OpenCV and `VideoDataset` modules are loaded lazily to avoid dependency issues
621
+ * `device` in `monai.engines.Workflow` supports string values
622
+ * `Activations` and `AsDiscrete` take `kwargs` as additional arguments
623
+ * `DataAnalyzer` is now more efficient and writes summary stats before detailed all case stats
624
+ * Base Docker image upgraded to `nvcr.io/nvidia/pytorch:22.10-py3` from `nvcr.io/nvidia/pytorch:22.09-py3`
625
+ * Simplified Conda environment file `environment-dev.yml`
626
+ * Versioneer dependency upgraded to `0.23` from `0.19`
627
+ ### Deprecated
628
+ * `NibabelReader` input argument `dtype` is deprecated, the reader will use the original dtype of the image
629
+ ### Removed
630
+ * Support for PyTorch 1.7
631
+
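The F-beta score added above is derived directly from confusion-matrix counts: it is the weighted harmonic mean of precision and recall, with `beta > 1` weighting recall more heavily. A small stdlib-only sketch of the underlying formula (illustrative, not MONAI's metric API):

```python
def fbeta_score(tp, fp, fn, beta=1.0):
    # F-beta = (1 + beta^2) * TP / ((1 + beta^2) * TP + beta^2 * FN + FP)
    # beta = 1 gives the familiar F1 score.
    b2 = beta * beta
    denom = (1 + b2) * tp + b2 * fn + fp
    return (1 + b2) * tp / denom if denom else 0.0
```

For example, with 5 true positives, 5 false positives, and no false negatives, precision is 0.5 and recall is 1.0, giving an F1 of 2/3.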
## [1.0.1] - 2022-10-24
### Fixed
* `DiceCELoss` for multichannel targets
* Auto3DSeg `DataAnalyzer` out-of-memory error and other minor issues
* An optional flag issue in the RetinaNet detector
* An issue with output offset for `Spacing`
* A `LoadImage` issue when `track_meta` is `False`
* 1D data output error in `VarAutoEncoder`
* An issue with resolution computing in `ImageStats`
### Added
* Flexible min/max pixdim options for `Spacing`
* Upsample mode `deconvgroup` and optional kernel sizes
* Docstrings for gradient-based saliency maps
* Occlusion sensitivity to use sliding window inference
* Enhanced Gaussian window and device assignments for sliding window inference
* Multi-GPU support for `MonaiAlgo`
* `ClientAlgoStats` and `MonaiAlgoStats` for federated summary statistics
* `MetaTensor` support for `OneOf`
* Add a file check for bundle logging config
* Additional content and an authentication token option for bundle info API
* An anti-aliasing option for `Resized`
* `SlidingWindowInferer` adaptive device based on `cpu_thresh`
* `SegResNetDS` with deep supervision and non-isotropic kernel support
* Premerge tests for Python 3.10
### Changed
* Base Docker image upgraded to `nvcr.io/nvidia/pytorch:22.09-py3` from `nvcr.io/nvidia/pytorch:22.08-py3`
* Replace `None` type metadata content with `"none"` for `collate_fn` compatibility
* HoVerNet Mode and Branch to independent `StrEnum`
* Automatically infer device from the first item in random elastic deformation dict
* Add channel dim in `ComputeHoVerMaps` and `ComputeHoVerMapsd`
* Remove batch dim in `SobelGradients` and `SobelGradientsd`
### Deprecated
* Deprecating `compute_meandice`, `compute_meaniou` in `monai.metrics`, in favor of `compute_dice` and `compute_iou` respectively

667
## [1.0.0] - 2022-09-16
### Added
* `monai.auto3dseg` base APIs and `monai.apps.auto3dseg` components for automated machine learning (AutoML) workflow
* `monai.fl` module with base APIs and `MonaiAlgo` for federated learning client workflow
* An initial backwards compatibility [guide](https://github.com/Project-MONAI/MONAI/blob/dev/CONTRIBUTING.md#backwards-compatibility)
* Initial release of accelerated MRI reconstruction components, including `CoilSensitivityModel`
* Support of `MetaTensor` and new metadata attributes for various digital pathology components
* Various `monai.bundle` enhancements for MONAI model-zoo usability, including config debug mode and `get_all_bundles_list`
* New `monai.transforms` components including `SignalContinuousWavelet` for 1D signals, `ComputeHoVerMaps` for digital pathology, and `SobelGradients` for spatial gradients
* `VarianceMetric` and `LabelQualityScore` metrics for active learning
* Dataset API for real-time streams and videos
* Several networks and building blocks including `FlexibleUNet` and `HoVerNet`
* `MeanIoUHandler` and `LogfileHandler` workflow event handlers
* `WSIReader` with the TiffFile backend
* Multi-threading in `WSIReader` with cuCIM backend
* `get_stats` API in `monai.engines.Workflow`
* `prune_meta_pattern` in `monai.transforms.LoadImage`
* `max_interactions` for deepedit interaction workflow
* Various profiling utilities in `monai.utils.profiling`
### Changed
* Base Docker image upgraded to `nvcr.io/nvidia/pytorch:22.08-py3` from `nvcr.io/nvidia/pytorch:22.06-py3`
* Optionally depend on PyTorch-Ignite v0.4.10 instead of v0.4.9
* The cache-based dataset now matches the transform information when reading/writing the cache
* `monai.losses.ContrastiveLoss` now infers `batch_size` during `forward()`
* Rearrange the spatial axes in `RandSmoothDeform` transforms following PyTorch's convention
* Unified several environment flags into `monai.utils.misc.MONAIEnvVars`
* Simplified `__str__` implementation of `MetaTensor` instead of relying on the `__repr__` implementation
### Fixed
* Improved error messages when both `monai` and `monai-weekly` are pip-installed
* Inconsistent pseudorandom number sequences for different `num_workers` in `DataLoader`
* Issue of repeated sequences for `monai.data.ShuffleBuffer`
* Issue of not preserving the physical extent in `monai.transforms.Spacing`
* Issue of using `inception_v3` as the backbone of `monai.networks.nets.TorchVisionFCModel`
* Index device issue for `monai.transforms.Crop`
* Efficiency issue when converting the array dtype and contiguous memory
### Deprecated
* `AddChannel` and `AsChannelFirst` transforms in favor of `EnsureChannelFirst`
* `monai.apps.pathology.data` components in favor of the corresponding components from `monai.data`
* `monai.apps.pathology.handlers` in favor of the corresponding components from `monai.handlers`
### Removed
* `Status` section in the pull request template in favor of the pull request draft mode
* `monai.engines.BaseWorkflow`
* `ndim` and `dimensions` arguments in favor of `spatial_dims`
* `n_classes`, `num_classes` arguments in `AsDiscrete` in favor of `to_onehot`
* `logit_thresh`, `threshold_values` arguments in `AsDiscrete` in favor of `threshold`
* `torch.testing.assert_allclose` in favor of `tests.utils.assert_allclose`

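The `to_onehot`/`threshold` arguments replace several older flags with two explicit post-processing steps. A minimal pure-Python sketch of what those two steps do on flat lists (a hypothetical illustration, not the MONAI `AsDiscrete` API, which works on tensors):

```python
def as_discrete(values, threshold=None, to_onehot=None):
    """Threshold scores to 0/1, and/or one-hot encode class indices.

    Hypothetical sketch of the two post-processing steps named by the
    `threshold` and `to_onehot` arguments; not MONAI's implementation.
    """
    if threshold is not None:
        values = [1 if v >= threshold else 0 for v in values]
    if to_onehot is not None:
        # expand each class index into a one-hot vector of length `to_onehot`
        values = [[1 if int(v) == c else 0 for c in range(to_onehot)] for v in values]
    return values

binary = as_discrete([0.2, 0.7, 0.5], threshold=0.5)
onehot = as_discrete([0, 2, 1], to_onehot=3)
```
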
714
## [0.9.1] - 2022-07-22
### Added
* Support of `monai.data.MetaTensor` as core data structure across the modules
* Support of `inverse` in array-based transforms
* `monai.apps.TciaDataset` APIs for The Cancer Imaging Archive (TCIA) datasets, including a pydicom-backend reader
* Initial release of components for MRI reconstruction in `monai.apps.reconstruction`, including various FFT utilities
* New metrics and losses, including mean IoU and structural similarity index
* `monai.utils.StrEnum` class to simplify Enum-based type annotations
### Changed
* Base Docker image upgraded to `nvcr.io/nvidia/pytorch:22.06-py3` from `nvcr.io/nvidia/pytorch:22.04-py3`
* Optionally depend on PyTorch-Ignite v0.4.9 instead of v0.4.8
### Fixed
* Fixed issue of not skipping post activations in `Convolution` when input arguments are None
* Fixed issue of ignoring dropout arguments in `DynUNet`
* Fixed issue of hard-coded non-linear function in ViT classification head
* Fixed issue of in-memory config overriding with `monai.bundle.ConfigParser.update`
* 2D SwinUNETR incompatible shapes
* Fixed issue with `monai.bundle.verify_metadata` not raising exceptions
* Fixed issue with `monai.transforms.GridPatch` returning inconsistent patch location types when padding
* Wrong generalized Dice score metric when denominator is 0 but prediction is non-empty
* Docker image build error due to NGC CLI upgrade
* Optional default value when a parsed `id` is unavailable in a `ConfigParser` instance
* Immutable data input for the patch-based WSI datasets
### Deprecated
* `*_transforms` and `*_meta_dict` fields in dictionary-based transforms in favor of MetaTensor
* `meta_keys`, `meta_key_postfix`, `src_affine` arguments in various transforms, in favor of MetaTensor
* `AsChannelFirst` and `AddChannel`, in favor of `EnsureChannelFirst` transform

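A string-valued enum of the kind `monai.utils.StrEnum` provides can be sketched in a few lines of stdlib Python; members compare equal to plain strings, which simplifies annotations and config parsing. The `Backend` class and its members below are hypothetical examples, not MONAI names:

```python
from enum import Enum

class StrEnum(str, Enum):
    """Illustrative string-valued Enum in the spirit of `monai.utils.StrEnum`;
    a sketch, not MONAI's actual implementation."""
    def __str__(self):
        return self.value

class Backend(StrEnum):  # hypothetical example members
    NIBABEL = "nibabel"
    ITK = "itk"

# members behave like the underlying strings
name = str(Backend.NIBABEL)
```
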
742
## [0.9.0] - 2022-06-08
### Added
* `monai.bundle` primary module with a `ConfigParser` and command-line interfaces for configuration-based workflows
* Initial release of MONAI bundle specification
* Initial release of volumetric image detection modules including bounding boxes handling, RetinaNet-based architectures
* API preview `monai.data.MetaTensor`
* Unified `monai.data.image_writer` to support flexible IO backends including an ITK writer
* Various new network blocks and architectures including `SwinUNETR`
* DeepEdit interactive training/validation workflow
* NuClick interactive segmentation transforms
* Patch-based readers and datasets for whole-slide imaging
* New losses and metrics including `SurfaceDiceMetric`, `GeneralizedDiceFocalLoss`
* New pre-processing transforms including `RandIntensityRemap`, `SpatialResample`
* Multi-output and slice-based inference for `SlidingWindowInferer`
* `NrrdReader` for NRRD file support
* Torchscript utilities to save models with meta information
* Gradient-based visualization module `SmoothGrad`
* Automatic regular source code scanning for common vulnerabilities and coding errors

### Changed
* Simplified `TestTimeAugmentation` using de-collate and invertible transforms APIs
* Refactoring `monai.apps.pathology` modules into `monai.handlers` and `monai.transforms`
* Flexible activation and normalization layers for `TopologySearch` and `DiNTS`
* Anisotropic first layers for 3D resnet
* Flexible ordering of activation, normalization in `UNet`
* Enhanced performance of connected-components analysis using CuPy
* `INSTANCE_NVFUSER` for enhanced performance in 3D instance norm
* Support of string representation of dtype in `convert_data_type`
* Added new options `iteration_log`, `epoch_log` to the logging handlers
* Base Docker image upgraded to `nvcr.io/nvidia/pytorch:22.04-py3` from `nvcr.io/nvidia/pytorch:21.10-py3`
* `collate_fn` generates more data-related debugging info with `dev_collate`

### Fixed
* Unified the spellings of "meta data", "metadata", "meta-data" to "metadata"
* Various inaccurate error messages when input data are in invalid shapes
* Issue of computing symmetric distances in `compute_average_surface_distance`
* Unnecessary layer `self.conv3` in `UnetResBlock`
* Issue of torchscript compatibility for `ViT` and self-attention blocks
* Issue of hidden layers in `UNETR`
* `allow_smaller` in spatial cropping transforms
* Antialiasing in `Resize`
* Issue of bending energy loss value at different resolutions
* `kwargs_read_csv` in `CSVDataset`
* In-place modification in `Metric` reduction
* `wrap_array` for `ensure_tuple`
* Contribution guide for introducing new third-party dependencies

### Removed
* Deprecated `nifti_writer`, `png_writer` in favor of `monai.data.image_writer`
* Support for PyTorch 1.6

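Sliding-window inference, extended above with multi-output and slice-based modes, tiles a large image into overlapping ROIs. The tiling itself reduces to computing window start indices; a simplified 1D sketch (illustrative only, not MONAI's `sliding_window_inference`):

```python
def sliding_window_starts(image_size, roi_size, overlap=0.25):
    """Start indices of overlapping 1D windows covering [0, image_size).

    A simplified sketch of sliding-window tiling; the real inferer works
    on N-D tensors, pads inputs, and blends overlapping predictions.
    """
    step = max(1, int(roi_size * (1.0 - overlap)))
    last = max(image_size - roi_size, 0)
    starts = list(range(0, last + 1, step))
    # ensure the final window reaches the end of the image
    if starts[-1] != last:
        starts.append(last)
    return starts

starts = sliding_window_starts(image_size=10, roi_size=4, overlap=0.25)
```
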
793
## [0.8.1] - 2022-02-16
### Added
* Support of `matshow3d` with given `channel_dim`
* Support of spatial 2D for `ViTAutoEnc`
* Support of `dataframe` object input in `CSVDataset`
* Support of tensor backend for `Orientation`
* Support of configurable delimiter for CSV writers
* A base workflow API
* `DataFunc` API for dataset-level preprocessing
* `write_scalar` API for logging with additional `engine` parameter in `TensorBoardHandler`
* Enhancements for NVTX Range transform logging
* Enhancements for `set_determinism`
* Performance enhancements in the cache-based datasets
* Configurable metadata keys for `monai.data.DatasetSummary`
* Flexible `kwargs` for `WSIReader`
* Logging for the learning rate schedule handler
* `GridPatchDataset` as subclass of `monai.data.IterableDataset`
* `is_onehot` option in `KeepLargestConnectedComponent`
* `channel_dim` in the image readers and support of stacking images with channels
* Skipping workflow `run` if epoch length is 0
* Enhanced `CacheDataset` to avoid duplicated cache items
* `save_state` utility function

### Changed
* Optionally depend on PyTorch-Ignite v0.4.8 instead of v0.4.6
* `monai.apps.mmars.load_from_mmar` defaults to the latest version

### Fixed
* Issue when caching large items with `pickle`
* Issue of hard-coded activation functions in `ResBlock`
* Issue of `create_file_name` assuming local disk file creation
* Issue of `WSIReader` when the backend is `TiffFile`
* Issue of `deprecated_args` when the function signature contains kwargs
* Issue of `channel_wise` computations for the intensity-based transforms
* Issue of inverting `OneOf`
* Issue of removing temporary caching file for the persistent dataset
* Error messages when reader backend is not available
* Output type casting issue in `ScaleIntensityRangePercentiles`
* Various docstring typos and broken URLs
* `mode` in the evaluator engine
* Ordering of `Orientation` and `Spacing` in `monai.apps.deepgrow.dataset`

### Removed
* Additional deep supervision modules in `DynUNet`
* Deprecated `reduction` argument for `ContrastiveLoss`
* Decollate warning in `Workflow`
* Unique label exception in `ROCAUCMetric`
* Logger configuration logic in the event handlers

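The configurable-delimiter support added above means the same report-writing code can emit comma- or tab-separated output. A stdlib sketch of the idea (the function and column names are hypothetical, not the MONAI handler API):

```python
import csv
import io

def write_metrics(rows, delimiter=","):
    """Write a small metrics table with a configurable delimiter.

    Hypothetical illustration of delimiter-configurable CSV writing;
    not MONAI's CSV saver/handler implementation.
    """
    buf = io.StringIO()
    writer = csv.writer(buf, delimiter=delimiter)
    writer.writerow(["filename", "mean_dice"])  # header row
    writer.writerows(rows)
    return buf.getvalue()

out = write_metrics([["case_001.nii.gz", 0.91]], delimiter="\t")
```
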
842
## [0.8.0] - 2021-11-25
### Added
* Overview of [new features in v0.8](docs/source/whatsnew_0_8.md)
* Network modules for differentiable neural network topology search (DiNTS)
* Multiple Instance Learning transforms and models for digital pathology WSI analysis
* Vision transformers for self-supervised representation learning
* Contrastive loss for self-supervised learning
* Finalized major improvements of 200+ components in `monai.transforms` to support input and backend in PyTorch and NumPy
* Initial registration module benchmarking with `GlobalMutualInformationLoss` as an example
* `monai.transforms` documentation with visual examples and the utility functions
* Event handler for `MLFlow` integration
* Enhanced data visualization functions including `blend_images` and `matshow3d`
* `RandGridDistortion` and `SmoothField` in `monai.transforms`
* Support of randomized shuffle buffer in iterable datasets
* Performance review and enhancements for data type casting
* Cumulative averaging API with distributed environment support
* Module utility functions including `require_pkg` and `pytorch_after`
* Various usability enhancements such as `allow_smaller` when sampling ROI and `wrap_sequence` when casting object types
* `tifffile` support in `WSIReader`
* Regression tests for the fast training workflows
* Various tutorials and demos including educational contents at [MONAI Bootcamp 2021](https://github.com/Project-MONAI/MONAIBootcamp2021)
### Changed
* Base Docker image upgraded to `nvcr.io/nvidia/pytorch:21.10-py3` from `nvcr.io/nvidia/pytorch:21.08-py3`
* Decoupled `TraceKeys` and `TraceableTransform` APIs from `InvertibleTransform`
* Skipping affine-based resampling when `resample=False` in `NiftiSaver`
* Deprecated `threshold_values: bool` and `num_classes: int` in `AsDiscrete`
* Enhanced `apply_filter` for spatially 1D, 2D and 3D inputs with non-separable kernels
* Logging with `logging` in downloading and model archives in `monai.apps`
* API documentation site now defaults to `stable` instead of `latest`
* `skip-magic-trailing-comma` in coding style enforcements
* Pre-merge CI pipelines now include unit tests with Nvidia Ampere architecture
### Removed
* Support for PyTorch 1.5
* The deprecated `DynUnetV1` and the related network blocks
* GitHub self-hosted CI/CD pipelines for package releases
### Fixed
* Support of path-like objects as file path inputs in most modules
* Issue of `decollate_batch` for dictionary of empty lists
* Typos in documentation and code examples in various modules
* Issue of no available keys when `allow_missing_keys=True` for the `MapTransform`
* Issue of redundant computation when normalization factors are 0.0 and 1.0 in `ScaleIntensity`
* Incorrect reports of registered readers in `ImageReader`
* Wrong numbering of iterations in `StatsHandler`
* Naming conflicts in network modules and aliases
* Incorrect output shape when `reduction="none"` in `FocalLoss`
* Various usability issues reported by users

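The cumulative averaging API added above maintains a running mean across iterations so that metrics can be aggregated without storing every value. A single-process sketch (illustrative only; the real API also supports distributed reduction across workers):

```python
class CumulativeAverage:
    """Running-average accumulator in the spirit of the cumulative
    averaging API; a single-process sketch, not MONAI's implementation."""

    def __init__(self):
        self.total = 0.0
        self.count = 0

    def append(self, value, n=1):
        # `n` weights the contribution, e.g. the batch size behind a batch mean
        self.total += value * n
        self.count += n

    def aggregate(self):
        return self.total / self.count if self.count else float("nan")

avg = CumulativeAverage()
avg.append(0.8, n=2)  # e.g. a batch of 2 samples with mean metric 0.8
avg.append(0.5)       # a single sample
mean = avg.aggregate()
```
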
889
## [0.7.0] - 2021-09-24
### Added
* Overview of [new features in v0.7](docs/source/whatsnew_0_7.md)
* Initial phase of major usability improvements in `monai.transforms` to support input and backend in PyTorch and NumPy
* Performance enhancements, with [profiling and tuning guides](https://github.com/Project-MONAI/tutorials/blob/master/acceleration/fast_model_training_guide.md) for typical use cases
* Reproducing [training modules and workflows](https://github.com/Project-MONAI/tutorials/tree/master/kaggle/RANZCR/4th_place_solution) of state-of-the-art Kaggle competition solutions
* 24 new transforms, including
  * `OneOf` meta transform
  * DeepEdit guidance signal transforms for interactive segmentation
  * Transforms for self-supervised pre-training
* Integration of [NVIDIA Tools Extension](https://developer.nvidia.com/blog/nvidia-tools-extension-api-nvtx-annotation-tool-for-profiling-code-in-python-and-c-c/) (NVTX)
* Integration of [cuCIM](https://github.com/rapidsai/cucim)
* Stain normalization and contextual grid for digital pathology
* `Transchex` network for vision-language transformers for chest X-ray analysis
* `DatasetSummary` utility in `monai.data`
* `WarmupCosineSchedule`
* Deprecation warnings and documentation support for better backwards compatibility
* Padding with additional `kwargs` and different backend API
* Additional options such as `dropout` and `norm` in various networks and their submodules

### Changed
* Base Docker image upgraded to `nvcr.io/nvidia/pytorch:21.08-py3` from `nvcr.io/nvidia/pytorch:21.06-py3`
* Deprecated input argument `n_classes`, in favor of `num_classes`
* Deprecated input argument `dimensions` and `ndims`, in favor of `spatial_dims`
* Updated the Sphinx-based documentation theme for better readability
* `NdarrayTensor` type is replaced by `NdarrayOrTensor` for simpler annotations
* Self-attention-based network blocks now support both 2D and 3D inputs

### Removed
* The deprecated `TransformInverter`, in favor of `monai.transforms.InvertD`
* GitHub self-hosted CI/CD pipelines for nightly and post-merge tests
* `monai.handlers.utils.evenly_divisible_all_gather`
* `monai.handlers.utils.string_list_all_gather`

### Fixed
* A Multi-thread cache writing issue in `LMDBDataset`
* Output shape convention inconsistencies of the image readers
* Output directory and file name flexibility issue for `NiftiSaver`, `PNGSaver`
* Requirement of the `label` field in test-time augmentation
* Input argument flexibility issues for `ThreadDataLoader`
* Decoupled `Dice` and `CrossEntropy` intermediate results in `DiceCELoss`
* Improved documentation, code examples, and warning messages in various modules
* Various usability issues reported by users

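The `OneOf` meta transform listed above applies exactly one of a set of transforms, chosen at random per call. A minimal callable-based sketch (illustrative only; MONAI's `OneOf` is a `Compose` subclass with weight normalization and inverse support):

```python
import random

def one_of(transforms, weights=None, rng=random):
    """Return a callable that applies exactly one of `transforms`,
    chosen at random each call. A sketch, not MONAI's `OneOf`."""
    def apply(data):
        chosen = rng.choices(transforms, weights=weights, k=1)[0]
        return chosen(data)
    return apply

rng = random.Random(0)  # seeded for reproducibility
aug = one_of([lambda x: x + 1, lambda x: x * 10], rng=rng)
results = {aug(5) for _ in range(20)}  # each call picks one branch
```
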
933
## [0.6.0] - 2021-07-08
### Added
* 10 new transforms, a masked loss wrapper, and a `NetAdapter` for transfer learning
* APIs to load networks and pre-trained weights from Clara Train [Medical Model ARchives (MMARs)](https://docs.nvidia.com/clara/clara-train-sdk/pt/mmar.html)
* Base metric and cumulative metric APIs, 4 new regression metrics
* Initial CSV dataset support
* Decollating mini-batch as the default first postprocessing step; the [Migrating your v0.5 code to v0.6](https://github.com/Project-MONAI/MONAI/wiki/v0.5-to-v0.6-migration-guide) wiki shows how to adapt to the breaking changes
* Initial backward compatibility support via `monai.utils.deprecated`
* Attention-based vision modules and `UNETR` for segmentation
* Generic module loaders and Gaussian mixture models using the PyTorch JIT compilation
* Inverse of image patch sampling transforms
* Network block utilities `get_[norm, act, dropout, pool]_layer`
* `unpack_items` mode for `apply_transform` and `Compose`
* New event `INNER_ITERATION_STARTED` in the deepgrow interactive workflow
* `set_data` API for cache-based datasets to dynamically update the dataset content
* Fully compatible with PyTorch 1.9
* `--disttests` and `--min` options for `runtests.sh`
* Initial support of pre-merge tests with Nvidia Blossom system

### Changed
* Base Docker image upgraded to `nvcr.io/nvidia/pytorch:21.06-py3` from `nvcr.io/nvidia/pytorch:21.04-py3`
* Optionally depend on PyTorch-Ignite v0.4.5 instead of v0.4.4
* Unified the demo, tutorial, testing data to the project shared drive, and [`Project-MONAI/MONAI-extra-test-data`](https://github.com/Project-MONAI/MONAI-extra-test-data)
* Unified the terms: `post_transform` is renamed to `postprocessing`, `pre_transform` is renamed to `preprocessing`
* Unified the postprocessing transforms and event handlers to accept the "channel-first" data format
* `evenly_divisible_all_gather` and `string_list_all_gather` moved to `monai.utils.dist`

### Removed
* Support of 'batched' input for postprocessing transforms and event handlers
* `TorchVisionFullyConvModel`
* `set_visible_devices` utility function
* `SegmentationSaver` and `TransformsInverter` handlers

### Fixed
* Issue of handling big-endian image headers
* Multi-thread issue for non-random transforms in the cache-based datasets
* Persistent dataset issue when multiple processes share a non-existent cache location
* Typing issue with Numpy 1.21.0
* Loading checkpoint with both `model` and `optimizer` using `CheckpointLoader` when `strict_shape=False`
* `SplitChannel` has different behavior depending on numpy/torch inputs
* Transform pickling issue caused by the Lambda functions
* Issue of filtering by name in `generate_param_groups`
* Inconsistencies in the return value types of `class_activation_maps`
* Various docstring typos
* Various usability enhancements in `monai.transforms`

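The backward compatibility support via `monai.utils.deprecated` is built around decorators that emit warnings on use of retired APIs. A stdlib sketch of the pattern (the signature and message format here are hypothetical, not MONAI's actual decorator):

```python
import functools
import warnings

def deprecated(since, msg_suffix=""):
    """Warn on each call to the decorated function.

    Illustrative sketch of a deprecation decorator in the spirit of
    `monai.utils.deprecated`; not its actual signature or behavior.
    """
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            warnings.warn(
                f"{fn.__name__} has been deprecated since version {since}. {msg_suffix}",
                DeprecationWarning,
                stacklevel=2,
            )
            return fn(*args, **kwargs)
        return wrapper
    return decorator

@deprecated(since="0.6", msg_suffix="Use `new_api` instead.")  # hypothetical names
def old_api(x):
    return x * 2

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    result = old_api(3)  # still works, but records a DeprecationWarning
```
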
981
## [0.5.3] - 2021-05-28
### Changed
* Project default branch renamed to `dev` from `master`
* Base Docker image upgraded to `nvcr.io/nvidia/pytorch:21.04-py3` from `nvcr.io/nvidia/pytorch:21.02-py3`
* Enhanced type checks for the `iteration_metric` handler
* Enhanced `PersistentDataset` to use `tempfile` during caching computation
* Enhanced various info/error messages
* Enhanced performance of `RandAffine`
* Enhanced performance of `SmartCacheDataset`
* Optionally requires `cucim` when the platform is `Linux`
* Default `device` of `TestTimeAugmentation` changed to `cpu`

### Fixed
* Download utilities now provide better default parameters
* Duplicated `key_transforms` in the patch-based transforms
* A multi-GPU issue in `ClassificationSaver`
* A default `meta_data` issue in `SpacingD`
* Dataset caching issue with the persistent data loader workers
* A memory issue in `permutohedral_cuda`
* Dictionary key issue in `CopyItemsd`
* `box_start` and `box_end` parameters for deepgrow `SpatialCropForegroundd`
* Tissue mask array transpose issue in `MaskedInferenceWSIDataset`
* Various type hint errors
* Various docstring typos

### Added
* Support of `to_tensor` and `device` arguments for `TransformInverter`
* Slicing options with SpatialCrop
* Class name alias for the networks for backward compatibility
* `k_divisible` option for CropForeground
* `map_items` option for `Compose`
* Warnings of `inf` and `nan` for surface distance computation
* A `print_log` flag to the image savers
* Basic testing pipelines for Python 3.9

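A `k_divisible` crop option rounds each spatial size up so downstream networks with fixed downsampling factors receive compatible shapes. The core arithmetic is a ceiling-to-multiple, sketched here (an illustrative helper, not MONAI's `CropForeground` code):

```python
def round_up_divisible(size, k):
    """Smallest length >= size that is divisible by k.

    Illustrative sketch of the arithmetic behind a `k_divisible`-style
    option; uses ceiling division via negation: ceil(size / k) * k.
    """
    return -(-size // k) * k

# e.g. a 97-voxel foreground box padded for a network needing multiples of 16
new_size = round_up_divisible(97, 16)
```
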
1016
## [0.5.0] - 2021-04-09
### Added
* Overview document for [feature highlights in v0.5.0](https://github.com/Project-MONAI/MONAI/blob/master/docs/source/highlights.md)
* Invertible spatial transforms
  * `InvertibleTransform` base APIs
  * Batch inverse and decollating APIs
  * Inverse of `Compose`
  * Batch inverse event handling
  * Test-time augmentation as an application
* Initial support of learning-based image registration:
  * Bending energy, LNCC, and global mutual information loss
  * Fully convolutional architectures
  * Dense displacement field, dense velocity field computation
  * Warping with high-order interpolation with C++/CUDA implementations
* Deepgrow modules for interactive segmentation:
  * Workflows with simulations of clicks
  * Distance-based transforms for guidance signals
* Digital pathology support:
  * Efficient whole slide imaging IO and sampling with Nvidia cuCIM and SmartCache
  * FROC measurements for lesion detection
  * Probabilistic post-processing for lesion detection
  * TorchVision classification model adaptor for fully convolutional analysis
* 12 new transforms, grid patch dataset, `ThreadDataLoader`, EfficientNets B0-B7
* 4 iteration events for the engine for finer control of workflows
* New C++/CUDA extensions:
  * Conditional random field
  * Fast bilateral filtering using the permutohedral lattice
* Metrics summary reporting and saving APIs
* DiceCELoss, DiceFocalLoss, a multi-scale wrapper for segmentation loss computation
* Data loading utilities:
  * `decollate_batch`
  * `PadListDataCollate` with inverse support
* Support of slicing syntax for `Dataset`
* Initial Torchscript support for the loss modules
* Learning rate finder
* Allow for missing keys in the dictionary-based transforms
* Support of checkpoint loading for transfer learning
* Various summary and plotting utilities for Jupyter notebooks
* Contributor Covenant Code of Conduct
* Major CI/CD enhancements covering the tutorial repository
* Fully compatible with PyTorch 1.8
* Initial nightly CI/CD pipelines using Nvidia Blossom Infrastructure

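`decollate_batch`, listed above, is the inverse of batch collation: it turns a batched dictionary back into a list of per-sample dictionaries so postprocessing can run per item. A pure-Python sketch of the idea (illustrative only; MONAI's implementation also handles tensors, nested structures, and metadata):

```python
def decollate_batch(batch):
    """Turn a dict of equal-length batched lists into per-sample dicts.

    A sketch of the decollate concept, not MONAI's implementation.
    """
    keys = list(batch)
    n = len(batch[keys[0]])
    return [{k: batch[k][i] for k in keys} for i in range(n)]

samples = decollate_batch({"image": ["img0", "img1"], "label": [0, 1]})
```
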
1059
### Changed
* Enhanced `list_data_collate` error handling
* Unified iteration metric APIs
* `densenet*` extensions are renamed to `DenseNet*`
* `se_res*` network extensions are renamed to `SERes*`
* Transform base APIs are rearranged into `compose`, `inverse`, and `transform`
* `_do_transform` flag for the random augmentations is unified via `RandomizableTransform`
* Decoupled post-processing steps, e.g. `softmax`, `to_onehot_y`, from the metrics computations
* Moved the distributed samplers to `monai.data.samplers` from `monai.data.utils`
* Engine's data loaders now accept generic iterables as input
* Workflows now accept additional custom events and state properties
* Various type hints according to Numpy 1.20
* Refactored testing utility `runtests.sh` to have `--unittest` and `--net` (integration tests) options
* Base Docker image upgraded to `nvcr.io/nvidia/pytorch:21.02-py3` from `nvcr.io/nvidia/pytorch:20.10-py3`
* Docker images are now built with self-hosted environments
* Primary contact email updated to `monai.contact@gmail.com`
* Now using GitHub Discussions as the primary communication forum

### Removed
* Compatibility tests for PyTorch 1.5.x
* Format specific loaders, e.g. `LoadNifti`, `NiftiDataset`
* Assert statements from non-test files
* `from module import *` statements, addressed flake8 F403

### Fixed
* Uses American English spelling for code, as per PyTorch
* Code coverage now takes multiprocessing runs into account
* SmartCache with initial shuffling
* `ConvertToMultiChannelBasedOnBratsClasses` now supports channel-first inputs
* Checkpoint handler to save with non-root permissions
* Fixed an issue for exiting the distributed unit tests
* Unified `DynUNet` to have single tensor output w/o deep supervision
* `SegmentationSaver` now supports user-specified data types and a `squeeze_end_dims` flag
* Fixed `*Saver` event handlers output filenames with a `data_root_dir` option
* Load image functions now ensure little-endian
* Fixed the test runner to support regex-based test case matching
* Usability issues in the event handlers

1097
## [0.4.0] - 2020-12-15
### Added
* Overview document for [feature highlights in v0.4.0](https://github.com/Project-MONAI/MONAI/blob/master/docs/source/highlights.md)
* Torchscript support for the net modules
* New networks and layers:
  * Discrete Gaussian kernels
  * Hilbert transform and envelope detection
  * Swish and mish activation
  * Acti-norm-dropout block
  * Upsampling layer
  * Autoencoder, Variational autoencoder
  * FCNet
* Support of initialization from pretrained weights for densenet, senet, multichannel AHNet
* Layer-wise learning rate API
* New model metrics and event handlers based on occlusion sensitivity, confusion matrix, surface distance
* CAM/GradCAM/GradCAM++
* File format-agnostic image loader APIs with Nibabel, ITK readers
* Enhancements for dataset partition, cross-validation APIs
* New data APIs:
  * LMDB-based caching dataset
  * Cache-N-transforms dataset
  * Iterable dataset
  * Patch dataset
* Weekly PyPI release
* Fully compatible with PyTorch 1.7
* CI/CD enhancements:
  * Skipping, speed up, fail fast, timed, quick tests
  * Distributed training tests
  * Performance profiling utilities
* New tutorials and demos:
  * Autoencoder, VAE tutorial
  * Cross-validation demo
  * Model interpretability tutorial
  * COVID-19 Lung CT segmentation challenge open-source baseline
  * Threadbuffer demo
  * Dataset partitioning tutorial
  * Layer-wise learning rate demo
  * [MONAI Bootcamp 2020](https://github.com/Project-MONAI/MONAIBootcamp2020)

### Changed
* Base Docker image upgraded to `nvcr.io/nvidia/pytorch:20.10-py3` from `nvcr.io/nvidia/pytorch:20.08-py3`

#### Backwards Incompatible Changes
* `monai.apps.CVDecathlonDataset` is extended to a generic `monai.apps.CrossValidation` with a `dataset_cls` option
* Cache dataset now requires a `monai.transforms.Compose` instance as the transform argument
* Model checkpoint file name extensions changed from `.pth` to `.pt`
* Readers' `get_spatial_shape` returns a numpy array instead of list
* Decoupled postprocessing steps such as `sigmoid`, `to_onehot_y`, `mutually_exclusive`, `logit_thresh` from metrics and event handlers; the postprocessing steps should be used before calling the metrics methods
* `ConfusionMatrixMetric` and `DiceMetric` computation now returns an additional `not_nans` flag to indicate valid results
* `UpSample` optional `mode` now supports `"deconv"`, `"nontrainable"`, `"pixelshuffle"`; `interp_mode` is only used when `mode` is `"nontrainable"`
* `SegResNet` optional `upsample_mode` now supports `"deconv"`, `"nontrainable"`, `"pixelshuffle"`
* `monai.transforms.Compose` class inherits `monai.transforms.Transform`
* In `Rotate`, `Rotated`, `RandRotate`, `RandRotated` transforms, the `angle` related parameters are interpreted as angles in radians instead of degrees.
* `SplitChannel` and `SplitChanneld` moved from `transforms.post` to `transforms.utility`
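Because the rotation transforms now expect radians, degree values from pre-0.4.0 configurations should be converted explicitly. A usage sketch with the stdlib (the conversion itself, not a MONAI API call):

```python
import math

# Angle parameters for Rotate/RandRotate are radians as of 0.4.0; convert
# legacy degree values explicitly before passing them to the transforms.
angle_deg = 90.0
angle_rad = math.radians(angle_deg)  # pi/2
```
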
1152

### Removed
* Support of PyTorch 1.4

### Fixed
* Enhanced loss functions for stability and flexibility
* Sliding window inference memory and device issues
* Revised transforms:
  * Normalize intensity datatype and normalizer types
  * Padding modes for zoom
  * Crop returns coordinates
  * Select items transform
  * Weighted patch sampling
  * Option to keep aspect ratio for zoom
* Various CI/CD issues

1168
## [0.3.0] - 2020-10-02
### Added
* Overview document for [feature highlights in v0.3.0](https://github.com/Project-MONAI/MONAI/blob/master/docs/source/highlights.md)
* Automatic mixed precision support
* Multi-node, multi-GPU data parallel model training support
* 3 new evaluation metric functions
* 11 new network layers and blocks
* 6 new network architectures
* 14 new transforms, including an I/O adaptor
* Cross validation module for `DecathlonDataset`
* Smart Cache module in dataset
* `monai.optimizers` module
* `monai.csrc` module
* Experimental feature of ImageReader using ITK, Nibabel, Numpy, Pillow (PIL Fork)
* Experimental feature of differentiable image resampling in C++/CUDA
* Ensemble evaluator module
* GAN trainer module
* Initial cross-platform CI environment for C++/CUDA code
* Code style enforcement now includes isort and clang-format
* Progress bar with tqdm

### Changed
* Now fully compatible with PyTorch 1.6
* Base Docker image upgraded to `nvcr.io/nvidia/pytorch:20.08-py3` from `nvcr.io/nvidia/pytorch:20.03-py3`
* Code contributions now require signing off on the [Developer Certificate of Origin (DCO)](https://developercertificate.org/)
* Major work in type hinting finished
* Remote datasets migrated to [Open Data on AWS](https://registry.opendata.aws/)
* Optionally depend on PyTorch-Ignite v0.4.2 instead of v0.3.0
* Optionally depend on torchvision, ITK
* Enhanced CI tests with 8 new testing environments

### Removed
* `MONAI/examples` folder (relocated into [`Project-MONAI/tutorials`](https://github.com/Project-MONAI/tutorials))
* `MONAI/research` folder (relocated to [`Project-MONAI/research-contributions`](https://github.com/Project-MONAI/research-contributions))

### Fixed
* `dense_patch_slices` incorrect indexing
* Data type issue in `GeneralizedWassersteinDiceLoss`
* `ZipDataset` return value inconsistencies
* `sliding_window_inference` indexing and `device` issues
* Importing monai modules may cause namespace pollution
* Random data splits issue in `DecathlonDataset`
* Issue of randomising a `Compose` transform
* Various issues in function type hints
* Typos in docstring and documentation
* `PersistentDataset` issue with existing file folder
* Filename issue in the output writers

1216
+ ## [0.2.0] - 2020-07-02
1217
+ ### Added
1218
+ * Overview document for [feature highlights in v0.2.0](https://github.com/Project-MONAI/MONAI/blob/master/docs/source/highlights.md)
1219
+ * Type hints and static type analysis support
1220
+ * `MONAI/research` folder
1221
+ * `monai.engine.workflow` APIs for supervised training
1222
+ * `monai.inferers` APIs for validation and inference
1223
+ * 7 new tutorials and examples
1224
+ * 3 new loss functions
1225
+ * 4 new event handlers
1226
+ * 8 new layers, blocks, and networks
1227
+ * 12 new transforms, including post-processing transforms
1228
+ * `monai.apps.datasets` APIs, including `MedNISTDataset` and `DecathlonDataset`
1229
+ * Persistent caching, `ZipDataset`, and `ArrayDataset` in `monai.data`
1230
+ * Cross-platform CI tests supporting multiple Python versions
1231
+ * Optional import mechanism
1232
+ * Experimental features for third-party transforms integration
1233
+
1234
+ ### Changed
1235
+ > For more details please visit [the project wiki](https://github.com/Project-MONAI/MONAI/wiki/Notable-changes-between-0.1.0-and-0.2.0)
1236
+ * Core modules now require numpy >= 1.17
1237
+ * Categorized `monai.transforms` modules into crop and pad, intensity, IO, post-processing, spatial, and utility.
1238
+ * Most transforms are now implemented with PyTorch native APIs
1239
+ * Code style enforcement and automated formatting workflows now use autopep8 and black
1240
+ * Base Docker image upgraded to `nvcr.io/nvidia/pytorch:20.03-py3` from `nvcr.io/nvidia/pytorch:19.10-py3`
1241
+ * Enhanced local testing tools
1242
+ * Documentation website domain changed to https://docs.monai.io
1243
+
1244
+ ### Removed
1245
+ * Support of Python < 3.6
1246
+ * Automatic installation of optional dependencies including pytorch-ignite, nibabel, tensorboard, pillow, scipy, scikit-image
1247
+
1248
+ ### Fixed
1249
+ * Various issues in type and argument names consistency
1250
+ * Various issues in docstring and documentation site
1251
+ * Various issues in unit and integration tests
1252
+ * Various issues in examples and notebooks
1253
+
1254
+ ## [0.1.0] - 2020-04-17
1255
+ ### Added
1256
+ * Public alpha source code release under the Apache 2.0 license ([highlights](https://github.com/Project-MONAI/MONAI/blob/0.1.0/docs/source/highlights.md))
1257
+ * Various tutorials and examples
1258
+ - Medical image classification and segmentation workflows
1259
+ - Spacing/orientation-aware preprocessing with CPU/GPU and caching
1260
+ - Flexible workflows with PyTorch Ignite and Lightning
1261
+ * Various GitHub Actions
1262
+ - CI/CD pipelines via self-hosted runners
1263
+ - Documentation publishing via readthedocs.org
1264
+ - PyPI package publishing
1265
+ * Contributing guidelines
1266
+ * A project logo and badges
1267
+
1268
+ [highlights]: https://github.com/Project-MONAI/MONAI/blob/master/docs/source/highlights.md
1269
+
1270
+ [Unreleased]: https://github.com/Project-MONAI/MONAI/compare/1.5.2...HEAD
1271
+ [1.5.2]: https://github.com/Project-MONAI/MONAI/compare/1.5.1...1.5.2
1272
+ [1.5.1]: https://github.com/Project-MONAI/MONAI/compare/1.5.0...1.5.1
1273
+ [1.5.0]: https://github.com/Project-MONAI/MONAI/compare/1.4.0...1.5.0
1274
+ [1.4.0]: https://github.com/Project-MONAI/MONAI/compare/1.3.2...1.4.0
1275
+ [1.3.2]: https://github.com/Project-MONAI/MONAI/compare/1.3.1...1.3.2
1276
+ [1.3.1]: https://github.com/Project-MONAI/MONAI/compare/1.3.0...1.3.1
1277
+ [1.3.0]: https://github.com/Project-MONAI/MONAI/compare/1.2.0...1.3.0
1278
+ [1.2.0]: https://github.com/Project-MONAI/MONAI/compare/1.1.0...1.2.0
1279
+ [1.1.0]: https://github.com/Project-MONAI/MONAI/compare/1.0.1...1.1.0
1280
+ [1.0.1]: https://github.com/Project-MONAI/MONAI/compare/1.0.0...1.0.1
1281
+ [1.0.0]: https://github.com/Project-MONAI/MONAI/compare/0.9.1...1.0.0
1282
+ [0.9.1]: https://github.com/Project-MONAI/MONAI/compare/0.9.0...0.9.1
1283
+ [0.9.0]: https://github.com/Project-MONAI/MONAI/compare/0.8.1...0.9.0
1284
+ [0.8.1]: https://github.com/Project-MONAI/MONAI/compare/0.8.0...0.8.1
1285
+ [0.8.0]: https://github.com/Project-MONAI/MONAI/compare/0.7.0...0.8.0
1286
+ [0.7.0]: https://github.com/Project-MONAI/MONAI/compare/0.6.0...0.7.0
1287
+ [0.6.0]: https://github.com/Project-MONAI/MONAI/compare/0.5.3...0.6.0
1288
+ [0.5.3]: https://github.com/Project-MONAI/MONAI/compare/0.5.0...0.5.3
1289
+ [0.5.0]: https://github.com/Project-MONAI/MONAI/compare/0.4.0...0.5.0
1290
+ [0.4.0]: https://github.com/Project-MONAI/MONAI/compare/0.3.0...0.4.0
1291
+ [0.3.0]: https://github.com/Project-MONAI/MONAI/compare/0.2.0...0.3.0
1292
+ [0.2.0]: https://github.com/Project-MONAI/MONAI/compare/0.1.0...0.2.0
1293
+ [0.1.0]: https://github.com/Project-MONAI/MONAI/commits/0.1.0
MONAI/source/CITATION.cff ADDED
@@ -0,0 +1,139 @@
# YAML 1.2
# Metadata for citation of this software according to the CFF format (https://citation-file-format.github.io/)
#
---
title: "MONAI: Medical Open Network for AI"
abstract: "AI Toolkit for Healthcare Imaging"
authors:
  - name: "MONAI Consortium"
date-released: 2026-01-29
version: "1.5.2"
identifiers:
  - description: "This DOI represents all versions of MONAI, and will always resolve to the latest one."
    type: doi
    value: "10.5281/zenodo.4323058"
license: "Apache-2.0"
repository-code: "https://github.com/Project-MONAI/MONAI"
url: "https://project-monai.github.io/"
cff-version: "1.2.0"
message: "If you use this software, please cite it using these metadata."
preferred-citation:
  type: article
  authors:
    - given-names: "M. Jorge"
      family-names: "Cardoso"
    - given-names: "Wenqi"
      family-names: "Li"
    - given-names: "Richard"
      family-names: "Brown"
    - given-names: "Nic"
      family-names: "Ma"
    - given-names: "Eric"
      family-names: "Kerfoot"
    - given-names: "Yiheng"
      family-names: "Wang"
    - given-names: "Benjamin"
      family-names: "Murray"
    - given-names: "Andriy"
      family-names: "Myronenko"
    - given-names: "Can"
      family-names: "Zhao"
    - given-names: "Dong"
      family-names: "Yang"
    - given-names: "Vishwesh"
      family-names: "Nath"
    - given-names: "Yufan"
      family-names: "He"
    - given-names: "Ziyue"
      family-names: "Xu"
    - given-names: "Ali"
      family-names: "Hatamizadeh"
    - given-names: "Wentao"
      family-names: "Zhu"
    - given-names: "Yun"
      family-names: "Liu"
    - given-names: "Mingxin"
      family-names: "Zheng"
    - given-names: "Yucheng"
      family-names: "Tang"
    - given-names: "Isaac"
      family-names: "Yang"
    - given-names: "Michael"
      family-names: "Zephyr"
    - given-names: "Behrooz"
      family-names: "Hashemian"
    - given-names: "Sachidanand"
      family-names: "Alle"
    - given-names: "Mohammad"
      family-names: "Zalbagi Darestani"
    - given-names: "Charlie"
      family-names: "Budd"
    - given-names: "Marc"
      family-names: "Modat"
    - given-names: "Tom"
      family-names: "Vercauteren"
    - given-names: "Guotai"
      family-names: "Wang"
    - given-names: "Yiwen"
      family-names: "Li"
    - given-names: "Yipeng"
      family-names: "Hu"
    - given-names: "Yunguan"
      family-names: "Fu"
    - given-names: "Benjamin"
      family-names: "Gorman"
    - given-names: "Hans"
      family-names: "Johnson"
    - given-names: "Brad"
      family-names: "Genereaux"
    - given-names: "Barbaros S."
      family-names: "Erdal"
    - given-names: "Vikash"
      family-names: "Gupta"
    - given-names: "Andres"
      family-names: "Diaz-Pinto"
    - given-names: "Andre"
      family-names: "Dourson"
    - given-names: "Lena"
      family-names: "Maier-Hein"
    - given-names: "Paul F."
      family-names: "Jaeger"
    - given-names: "Michael"
      family-names: "Baumgartner"
    - given-names: "Jayashree"
      family-names: "Kalpathy-Cramer"
    - given-names: "Mona"
      family-names: "Flores"
    - given-names: "Justin"
      family-names: "Kirby"
    - given-names: "Lee A.D."
      family-names: "Cooper"
    - given-names: "Holger R."
      family-names: "Roth"
    - given-names: "Daguang"
      family-names: "Xu"
    - given-names: "David"
      family-names: "Bericat"
    - given-names: "Ralf"
      family-names: "Floca"
    - given-names: "S. Kevin"
      family-names: "Zhou"
    - given-names: "Haris"
      family-names: "Shuaib"
    - given-names: "Keyvan"
      family-names: "Farahani"
    - given-names: "Klaus H."
      family-names: "Maier-Hein"
    - given-names: "Stephen"
      family-names: "Aylward"
    - given-names: "Prerna"
      family-names: "Dogra"
    - given-names: "Sebastien"
      family-names: "Ourselin"
    - given-names: "Andrew"
      family-names: "Feng"
  doi: "https://doi.org/10.48550/arXiv.2211.02701"
  month: 11
  year: 2022
  title: "MONAI: An open-source framework for deep learning in healthcare"
...
MONAI/source/CODE_OF_CONDUCT.md ADDED
@@ -0,0 +1,76 @@
# Contributor Covenant Code of Conduct

## Our Pledge

In the interest of fostering an open and welcoming environment, we as
contributors and maintainers pledge to making participation in our project and
our community a harassment-free experience for everyone, regardless of age, body
size, disability, ethnicity, sex characteristics, gender identity and expression,
level of experience, education, socio-economic status, nationality, personal
appearance, race, religion, or sexual identity and orientation.

## Our Standards

Examples of behavior that contributes to creating a positive environment
include:

* Using welcoming and inclusive language
* Being respectful of differing viewpoints and experiences
* Gracefully accepting constructive criticism
* Focusing on what is best for the community
* Showing empathy towards other community members

Examples of unacceptable behavior by participants include:

* The use of sexualized language or imagery and unwelcome sexual attention or
  advances
* Trolling, insulting/derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or electronic
  address, without explicit permission
* Other conduct which could reasonably be considered inappropriate in a
  professional setting

## Our Responsibilities

Project maintainers are responsible for clarifying the standards of acceptable
behavior and are expected to take appropriate and fair corrective action in
response to any instances of unacceptable behavior.

Project maintainers have the right and responsibility to remove, edit, or
reject comments, commits, code, wiki edits, issues, and other contributions
that are not aligned to this Code of Conduct, or to ban temporarily or
permanently any contributor for other behaviors that they deem inappropriate,
threatening, offensive, or harmful.

## Scope

This Code of Conduct applies both within project spaces and in public spaces
when an individual is representing the project or its community. Examples of
representing a project or community include using an official project e-mail
address, posting via an official social media account, or acting as an appointed
representative at an online or offline event. Representation of a project may be
further defined and clarified by project maintainers.

## Enforcement

Instances of abusive, harassing, or otherwise unacceptable behavior may be
reported by contacting the project team at monai.contact@gmail.com. All
complaints will be reviewed and investigated and will result in a response that
is deemed necessary and appropriate to the circumstances. The project team is
obligated to maintain confidentiality with regard to the reporter of an incident.
Further details of specific enforcement policies may be posted separately.

Project maintainers who do not follow or enforce the Code of Conduct in good
faith may face temporary or permanent repercussions as determined by other
members of the project's leadership.

## Attribution

This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4,
available at https://www.contributor-covenant.org/version/1/4/code-of-conduct.html

[homepage]: https://www.contributor-covenant.org

For answers to common questions about this code of conduct, see
https://www.contributor-covenant.org/faq
MONAI/source/CONTRIBUTING.md ADDED
@@ -0,0 +1,417 @@
- [Introduction](#introduction)
- [The contribution process](#the-contribution-process)
  - [Preparing pull requests](#preparing-pull-requests)
    1. [Checking the coding style](#checking-the-coding-style)
    1. [Unit testing](#unit-testing)
    1. [Building the documentation](#building-the-documentation)
    1. [Automatic code formatting](#automatic-code-formatting)
    1. [Adding new optional dependencies](#adding-new-optional-dependencies)
    1. [Signing your work](#signing-your-work)
    1. [Utility functions](#utility-functions)
    1. [Backwards compatibility](#backwards-compatibility)
  - [Submitting pull requests](#submitting-pull-requests)
- [The code reviewing process (for the maintainers)](#the-code-reviewing-process)
  - [Reviewing pull requests](#reviewing-pull-requests)
- [Admin tasks (for the maintainers)](#admin-tasks)
  - [Releasing a new version](#release-a-new-version)

## Introduction

Welcome to Project MONAI! We're excited you're here and want to contribute. This documentation is intended for individuals and institutions interested in contributing to MONAI. MONAI is an open-source project and, as such, its success relies on its community of contributors willing to keep improving it. Your contribution will be a valued addition to the code base; we simply ask that you read this page and understand our contribution process, whether you are a seasoned open-source contributor or whether you are a first-time contributor.

### Communicate with us

We are happy to talk with you about your needs for MONAI and your ideas for contributing to the project. One way to do this is to create an issue discussing your thoughts. It might be that a very similar feature is under development or already exists, so an issue is a great starting point. If you are looking for an issue to resolve that will help Project MONAI, see the [*good first issue*](https://github.com/Project-MONAI/MONAI/labels/good%20first%20issue) and [*Contribution wanted*](https://github.com/Project-MONAI/MONAI/labels/Contribution%20wanted) labels.

### Does it belong in PyTorch instead of MONAI?

MONAI is part of the [PyTorch Ecosystem](https://pytorch.org/ecosystem/), and is mainly based on the PyTorch and Numpy libraries. These libraries implement what we consider to be best practice for general scientific computing and deep learning functionality. MONAI builds on these with a strong focus on medical applications. As such, it is a good idea to consider whether your functionality is medical-application specific or not. General deep learning functionality may be better off in PyTorch; you can find their contribution guidelines [here](https://pytorch.org/docs/stable/community/contribution_guide.html).

## The contribution process

*Pull request early*

We encourage you to create pull requests early. It helps us track the contributions under development, whether they are ready to be merged or not. [Create a draft pull request](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/proposing-changes-to-your-work-with-pull-requests/changing-the-stage-of-a-pull-request) until it is ready for formal review.

Please note that, as per PyTorch, MONAI uses American English spelling. This means classes and variables should be: normali**z**e, visuali**z**e, colo~~u~~r, etc.

### Preparing pull requests

To ensure the code quality, MONAI relies on several linting tools ([flake8 and its plugins](https://gitlab.com/pycqa/flake8), [black](https://github.com/psf/black), [isort](https://github.com/timothycrosley/isort), [ruff](https://github.com/astral-sh/ruff)),
static type analysis tools ([mypy](https://github.com/python/mypy), [pytype](https://github.com/google/pytype)), as well as a set of unit/integration tests.

This section highlights all the necessary preparation steps required before sending a pull request.
To collaborate efficiently, please read through this section and follow these steps.

- [Checking the coding style](#checking-the-coding-style)
- [Licensing information](#licensing-information)
- [Unit testing](#unit-testing)
- [Building documentation](#building-the-documentation)
- [Signing your work](#signing-your-work)

#### Checking the coding style

Coding style is checked and enforced by flake8, black, isort, and ruff, using [a flake8 configuration](./setup.cfg) similar to [PyTorch's](https://github.com/pytorch/pytorch/blob/master/.flake8).
Before submitting a pull request, we recommend that all linting should pass, by running the following command locally:

```bash
# optionally update the dependencies and dev tools
python -m pip install -U pip
python -m pip install -U -r requirements-dev.txt

# run the linting and type checking tools
./runtests.sh --codeformat

# try to fix the coding style errors automatically
./runtests.sh --autofix
```

Full linting and type checking may take some time. If you need a quick check, run

```bash
# run ruff only
./runtests.sh --ruff
```

#### Licensing information

All source code files should start with this paragraph:

```
# Copyright (c) MONAI Consortium
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#     http://www.apache.org/licenses/LICENSE-2.0
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
```

##### Exporting modules

If you intend for any variables/functions/classes to be available outside of the file with the edited functionality, then:

- Create or append to the `__all__` variable (in the file in which functionality has been added), and
- Add to the `__init__.py` file.

101
+ #### Unit testing
102
+
103
+ MONAI tests are located under `tests/`.
104
+
105
+ - The unit test's file name currently follows `test_[module_name].py` or `test_[module_name]_dist.py`.
106
+ - The `test_[module_name]_dist.py` subset of unit tests requires a distributed environment to verify the module with distributed GPU-based computation.
107
+ - The integration test's file name follows `test_integration_[workflow_name].py`.
108
+
109
+ A bash script (`runtests.sh`) is provided to run all tests locally.
110
+ Please run ``./runtests.sh -h`` to see all options.
111
+
112
+ To run a particular test, for example `tests/losses/test_dice_loss.py`:
113
+
114
+ ```
115
+ python -m tests.losses.test_dice_loss
116
+ ```
117
+
118
+ Before submitting a pull request, we recommend that all linting and unit tests
119
+ should pass, by running the following command locally:
120
+
121
+ ```bash
122
+ ./runtests.sh -f -u --net --coverage
123
+ ```
124
+
125
+ or (for new features that would not break existing functionality):
126
+
127
+ ```bash
128
+ ./runtests.sh --quick --unittests
129
+ ```
130
+
131
+ It is recommended that the new test `test_[module_name].py` is constructed by using only
132
+ python 3.9+ build-in functions, `torch`, `numpy`, `coverage` (for reporting code coverages) and `parameterized` (for organising test cases) packages.
133
+ If it requires any other external packages, please make sure:
134
+
135
+ - the packages are listed in [`requirements-dev.txt`](requirements-dev.txt)
136
+ - the new test `test_[module_name].py` is added to the `exclude_cases` in [`./tests/min_tests.py`](./tests/min_tests.py) so that
137
+ the minimal CI runner will not execute it.
138
+
139
+ ##### Testing data
140
+
141
+ Testing data such as images and binary files should not be placed in the source code repository.
142
+ Please deploy them to a reliable file sharing location (the current preferred one is [https://github.com/Project-MONAI/MONAI-extra-test-data/releases](https://github.com/Project-MONAI/MONAI-extra-test-data/releases)).
143
+ At test time, the URLs within `tests/testing_data/data_config.json` are accessible
144
+ via the APIs provided in `tests.utils`: `tests.utils.testing_data_config` and `tests.utils.download_url_or_skip_test`.
145
+
146
+ *If it's not tested, it's broken*
147
+
148
+ All new functionality should be accompanied by an appropriate set of tests.
149
+ MONAI functionality has plenty of unit tests from which you can draw inspiration,
150
+ and you can reach out to us if you are unsure of how to proceed with testing.
151
+
152
+ MONAI's code coverage report is available at [CodeCov](https://codecov.io/gh/Project-MONAI/MONAI).
153
+
154
+ #### Building the documentation
155
+
156
+ MONAI's documentation is located at `docs/`.
157
+
158
+ ```bash
159
+ # install the doc-related dependencies
160
+ pip install --upgrade pip
161
+ pip install -r docs/requirements.txt
162
+
163
+ # build the docs
164
+ cd docs/
165
+ make html
166
+ ```
167
+
168
+ The above commands build html documentation, they are used to automatically generate [https://docs.monai.io](https://docs.monai.io).
169
+
170
+ The Python code docstring are written in
171
+ [reStructuredText](https://www.sphinx-doc.org/en/master/usage/restructuredtext/basics.html) and
172
+ the documentation pages can be in either [reStructuredText](https://www.sphinx-doc.org/en/master/usage/restructuredtext/basics.html) or [Markdown](https://en.wikipedia.org/wiki/Markdown). In general the Python docstrings follow the [Google style](https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings).
173
+
174
+ Before submitting a pull request, it is recommended to:
175
+
176
+ - edit the relevant `.rst` files in [`docs/source`](./docs/source) accordingly.
177
+ - build html documentation locally
178
+ - check the auto-generated documentation (by browsing `./docs/build/html/index.html` with a web browser)
179
+ - type `make clean` in `docs/` folder to remove the current build files.
180
+
181
+ Please type `make help` in `docs/` folder for all supported format options.
182
+
183
+ #### Automatic code formatting
184
+
185
+ MONAI provides support of automatic Python code formatting via [a customised GitHub action](https://github.com/Project-MONAI/monai-code-formatter).
186
+ This makes the project's Python coding style consistent and reduces maintenance burdens.
187
+ Commenting a pull request with `/black` triggers the formatting action based on [`psf/Black`](https://github.com/psf/black) (this is implemented with [`slash command dispatch`](https://github.com/marketplace/actions/slash-command-dispatch)).
188
+
189
+ Steps for the formatting process:
190
+
191
+ - After submitting a pull request or push to an existing pull request,
192
+ make a comment to the pull request to trigger the formatting action.
193
+ The first line of the comment must be `/black` so that it will be interpreted by [the comment parser](https://github.com/marketplace/actions/slash-command-dispatch#how-are-comments-parsed-for-slash-commands).
194
+ - [Auto] The GitHub action tries to format all Python files (using [`psf/Black`](https://github.com/psf/black)) in the branch and makes a commit under the name "MONAI bot" if there's code change. The actual formatting action is deployed at [project-monai/monai-code-formatter](https://github.com/Project-MONAI/monai-code-formatter).
195
+ - [Auto] After the formatting commit, the GitHub action adds an emoji to the comment that triggered the process.
196
+ - Repeat the above steps if necessary.
197
+
198
+ #### Adding new optional dependencies
199
+
200
+ In addition to the minimal requirements of PyTorch and Numpy, MONAI's core modules are built optionally based on 3rd-party packages.
201
+ The current set of dependencies is listed in [installing dependencies](https://monai.readthedocs.io/en/stable/installation.html#installing-the-recommended-dependencies).
202
+
203
+ To allow for flexible integration of MONAI with other systems and environments,
204
+ the optional dependency APIs are always invoked lazily. For example,
205
+
206
+ ```py
207
+ from monai.utils import optional_import
208
+ itk, _ = optional_import("itk", ...)
209
+
210
+ class ITKReader(ImageReader):
211
+ ...
212
+ def read(self, ...):
213
+ return itk.imread(...)
214
+ ```
215
+
216
+ The availability of the external `itk.imread` API is not required unless `monai.data.ITKReader.read` is called by the user.
217
+ Integration tests with minimal requirements are deployed to ensure this strategy.
218
+
219
+ To add new optional dependencies, please communicate with the core team during pull request reviews,
220
+ and add the necessary information (at least) to the following files:
221
+
222
+ - [setup.cfg](https://github.com/Project-MONAI/MONAI/blob/dev/setup.cfg) (for package's `[options.extras_require]` config)
223
+ - [requirements-dev.txt](https://github.com/Project-MONAI/MONAI/blob/dev/requirements-dev.txt) (pip requirements file)
224
+ - [docs/requirements.txt](https://github.com/Project-MONAI/MONAI/blob/dev/docs/requirements.txt) (docs pip requirements file)
225
+ - [environment-dev.yml](https://github.com/Project-MONAI/MONAI/blob/dev/environment-dev.yml) (conda environment file)
226
+ - [installation.md](https://github.com/Project-MONAI/MONAI/blob/dev/docs/source/installation.md) (documentation)
227
+
228
+ When writing unit tests that use 3rd-party packages, it is a good practice to always consider
229
+ an appropriate fallback default behaviour when the packages are not installed in
230
+ the testing environment. For example:
231
+
232
+ ```py
233
+ from monai.utils import optional_import
234
+ plt, has_matplotlib = optional_import("matplotlib.pyplot")
235
+
236
+ @skipUnless(has_matplotlib, "Matplotlib required")
237
+ class TestBlendImages(unittest.TestCase):
238
+ ```
239
+
240
+ It skips the test cases when `matplotlib.pyplot` APIs are not available.
241
+
242
+ Alternatively, add the test file name to the ``exclude_cases`` in `tests/min_tests.py` to completely skip the test
243
+ cases when running in a minimal setup.
244
+
245
+ #### Signing your work
246
+
247
+ MONAI enforces the [Developer Certificate of Origin](https://developercertificate.org/) (DCO) on all pull requests.
248
+ All commit messages should contain the `Signed-off-by` line with an email address. The [GitHub DCO app](https://github.com/apps/dco) is deployed on MONAI. The pull request's status will be `failed` if commits do not contain a valid `Signed-off-by` line.
249
+
250
+ Git has a `-s` (or `--signoff`) command-line option to append this automatically to your commit message:
251
+
252
+ ```bash
253
+ git commit -s -m 'a new commit'
254
+ ```
255
+
256
+ The commit message will be:
257
+
258
+ ```
259
+ a new commit
260
+
261
+ Signed-off-by: Your Name <yourname@example.org>
262
+ ```
263
+
264
+ Full text of the DCO:
265
+
266
+ ```
267
+ Developer Certificate of Origin
268
+ Version 1.1
269
+
270
+ Copyright (C) 2004, 2006 The Linux Foundation and its contributors.
271
+ 1 Letterman Drive
272
+ Suite D4700
273
+ San Francisco, CA, 94129
274
+
275
+ Everyone is permitted to copy and distribute verbatim copies of this
276
+ license document, but changing it is not allowed.
277
+
278
+
279
+ Developer's Certificate of Origin 1.1
280
+
281
+ By making a contribution to this project, I certify that:
282
+
283
+ (a) The contribution was created in whole or in part by me and I
284
+ have the right to submit it under the open source license
285
+ indicated in the file; or
286
+
287
+ (b) The contribution is based upon previous work that, to the best
288
+ of my knowledge, is covered under an appropriate open source
289
+ license and I have the right under that license to submit that
290
+ work with modifications, whether created in whole or in part
291
+ by me, under the same open source license (unless I am
292
+ permitted to submit under a different license), as indicated
293
+ in the file; or
294
+
295
+ (c) The contribution was provided directly to me by some other
296
+ person who certified (a), (b) or (c) and I have not modified
297
+ it.
298
+
299
+ (d) I understand and agree that this project and the contribution
300
+ are public and that a record of the contribution (including all
301
+ personal information I submit with it, including my sign-off) is
302
+ maintained indefinitely and may be redistributed consistent with
303
+ this project or the open source license(s) involved.
304
+ ```
305
+
306
+ #### Utility functions
307
+
308
+ MONAI provides a set of generic utility functions and frequently used routines.
309
+ These are located in [``monai/utils``](./monai/utils/) and in the module folders such as [``networks/utils.py``](./monai/networks/).
310
+ Users are encouraged to use these common routines to improve code readability and reduce the code maintenance burden.
311
+
312
+ Notably,
313
+
314
+ - ``monai.module.export`` decorator can make the module name shorter when importing,
315
+ for example, ``monai.transforms.Spacing`` becomes equivalent to ``monai.transforms.spatial.array.Spacing`` if
316
+ ``class Spacing`` defined in file `monai/transforms/spatial/array.py` is decorated with ``@export("monai.transforms")``.
317
+
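Conceptually, such an export decorator re-binds the decorated class onto the shorter module path. The helper below is an illustrative sketch of the idea, not MONAI's actual implementation:

```python
import sys


def export(module_name):
    """Re-export the decorated object under ``module_name`` (illustrative sketch)."""
    def decorator(obj):
        module = sys.modules[module_name]
        # bind the object onto the shorter module path
        setattr(module, obj.__name__, obj)
        # record the name so ``from module import *`` also picks it up
        if hasattr(module, "__all__") and obj.__name__ not in module.__all__:
            module.__all__.append(obj.__name__)
        return obj
    return decorator
```

With this sketch, decorating a class defined in a deeply nested module with `@export("monai.transforms")` would make it reachable directly from `monai.transforms`.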
318
+ For string formatting, [f-strings](https://www.python.org/dev/peps/pep-0498/) are recommended over `%` formatting and `str.format`. Please use f-strings whenever you need to construct a string object.
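For example, the three styles side by side produce the same string:

```python
name, count = "Spacing", 3

# discouraged: %-formatting and str.format
msg_percent = "transform %s applied %d times" % (name, count)
msg_format = "transform {} applied {} times".format(name, count)

# preferred: f-string
msg_fstring = f"transform {name} applied {count} times"

assert msg_percent == msg_format == msg_fstring
```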
319
+
320
+ #### Backwards compatibility
321
+
322
+ MONAI in general follows [PyTorch's policy for backward compatibility](https://github.com/pytorch/pytorch/wiki/PyTorch's-Python-Frontend-Backward-and-Forward-Compatibility-Policy).
323
+ Utility functions are provided in `monai.utils.deprecated` to help migrate from the deprecated to new APIs. The use of these utilities is encouraged.
324
+ The pull request [template contains checkboxes](https://github.com/Project-MONAI/MONAI/blame/dev/.github/pull_request_template.md#L11-L12) that
325
+ the contributor should use accordingly to clearly indicate breaking changes.
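In the spirit of those helpers, a deprecation warning typically names both the version that deprecated the API and the version that will remove it. The decorator below is a simplified illustration, not the exact `monai.utils.deprecated` API:

```python
import functools
import warnings


def deprecated(since, removed, msg_suffix=""):
    """Warn that the decorated callable is deprecated (simplified illustration)."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            warnings.warn(
                f"{fn.__name__} has been deprecated since version {since} "
                f"and will be removed in version {removed}. {msg_suffix}",
                DeprecationWarning,
                stacklevel=2,
            )
            return fn(*args, **kwargs)
        return wrapper
    return decorator


@deprecated(since="1.3", removed="1.5", msg_suffix="Use new_resample instead.")
def old_resample(x):
    # deprecated API still works while the warning is in place
    return x
```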
326
+
327
+ The process of releasing backwards incompatible API changes is as follows:
328
+
329
+ 1. discuss the breaking changes during pull requests or in dev meetings with a feature proposal if needed.
330
+ 1. add a warning message in the upcoming release (version `X.Y`), the warning message should include a forecast of removing the deprecated API in:
331
+ 1. `X+1.0` -- major version `X+1` and minor version `0` (the next major version), if it's a significant change,
332
+ 1. `X.Y+2` -- major version `X` and minor version `Y+2` (the minor version after the next one), if it's a minor API change.
333
+ 1. Note that the versioning policy is similar to PyTorch's approach which does not precisely follow [the semantic versioning](https://semver.org/) definition.
334
+ Major version numbers are instead used to represent major product version (which is currently not planned to be greater than 1),
335
+ minor version for both compatible and incompatible API changes, and patch version for bug fixes.
336
+ 1. when recommending a new API in place of a deprecated one, the recommended API should
337
+ provide the same behaviour feature-for-feature; otherwise users will have a harder time migrating.
338
+ 1. add new test cases by extending the existing unit tests to cover both the deprecated and updated APIs.
339
+ 1. collect feedback from the users during the subsequent few releases, and reconsider step 1 if needed.
340
+ 1. before each release, review the deprecating APIs and relevant tests, and clean up the removed APIs described in step 2.
341
+
342
+ ### Submitting pull requests
343
+
344
+ All code changes to the dev branch must be done via [pull requests](https://help.github.com/en/github/collaborating-with-issues-and-pull-requests/proposing-changes-to-your-work-with-pull-requests).
345
+
346
+ 1. Create a new ticket or take a known ticket from [the issue list][monai issue list].
347
+ 1. Check if there's already a branch dedicated to the task.
348
+ 1. If the task has not been taken, [create a new branch in your fork](https://help.github.com/en/github/collaborating-with-issues-and-pull-requests/creating-a-pull-request-from-a-fork)
349
+ of the codebase named `[ticket_id]-[task_name]`.
350
+ For example, branch name `19-ci-pipeline-setup` corresponds to [issue #19](https://github.com/Project-MONAI/MONAI/issues/19).
351
+ Ideally, the new branch should be based on the latest `dev` branch.
352
+ 1. Make changes to the branch ([use detailed commit messages if possible](https://chris.beams.io/posts/git-commit/)).
353
+ 1. Make sure that new tests cover the changes and the changed codebase [passes all tests locally](#unit-testing).
354
+ 1. [Create a new pull request](https://help.github.com/en/desktop/contributing-to-projects/creating-a-pull-request) from the task branch to the dev branch, with detailed descriptions of the purpose of this pull request.
355
+ 1. Check [the CI/CD status of the pull request][github ci], make sure all CI/CD tests passed.
356
+ 1. Wait for reviews; if there are review comments, respond point by point and make further code changes if needed.
357
+ 1. If there are conflicts between the pull request branch and the dev branch, pull the changes from the dev and resolve the conflicts locally.
358
+ 1. The reviewer and contributor may discuss back and forth until all comments are addressed.
359
+ 1. Wait for the pull request to be merged.
360
+
361
+ ## The code reviewing process
362
+
363
+ ### Reviewing pull requests
364
+
365
+ All code review comments should be specific, constructive, and actionable.
366
+
367
+ 1. Check [the CI/CD status of the pull request][github ci], make sure all CI/CD tests passed before reviewing (contact the branch owner if needed).
368
+ 1. Read carefully the descriptions of the pull request and the files changed, write comments if needed.
369
+ 1. Make in-line comments to specific code segments, [request for changes](https://help.github.com/en/github/collaborating-with-issues-and-pull-requests/about-pull-request-reviews) if needed.
370
+ 1. Review any further code changes until all comments are addressed by the contributors.
371
+ 1. Comment to trigger `/black` and/or `/integration-test` for optional auto code formatting and [integration tests](.github/workflows/integration.yml).
372
+ 1. [Maintainers] Review the changes and comment `/build` to trigger internal full tests.
373
+ 1. Merge the pull request to the dev branch.
374
+ 1. Close the corresponding task ticket on [the issue list][monai issue list].
375
+
376
+ [github ci]: https://github.com/Project-MONAI/MONAI/actions
377
+ [monai issue list]: https://github.com/Project-MONAI/MONAI/issues
378
+
379
+ ## Admin tasks
380
+
381
+ ### Release a new version
382
+
383
+ The `dev` branch's `HEAD` always corresponds to MONAI docker image's latest tag: `projectmonai/monai:latest`.
384
+ The `main` branch's `HEAD` always corresponds to the latest MONAI milestone release.
385
+
386
+ When major features are ready for a milestone, to prepare for a new release:
387
+
388
+ - Prepare [a release note](https://github.com/Project-MONAI/MONAI/releases) and release checklist.
389
+ - Check out or cherry-pick a new branch `releasing/[version number]` locally from the `dev` branch and push to the codebase.
390
+ - Create a release candidate tag, for example, `git tag -a 0.1.0rc1 -m "release candidate 1 of version 0.1.0"`.
391
+ - Push the tag to the codebase, for example, `git push origin 0.1.0rc1`.
392
+ This step will trigger package building and testing.
393
+ The resultant packages are automatically uploaded to
394
+ [TestPyPI](https://test.pypi.org/project/monai/). The packages are also available for downloading as
395
+ repository's artifacts (e.g. the file at <https://github.com/Project-MONAI/MONAI/actions/runs/66570977>).
396
+ - Check the release test at [TestPyPI](https://test.pypi.org/project/monai/), download the artifacts when the CI finishes.
397
+ - Optionally run [the cron testing jobs](https://github.com/Project-MONAI/MONAI/blob/dev/.github/workflows/cron.yml) on `releasing/[version number]`.
398
+ - Rebase `releasing/[version number]` to `main`, make sure all the test pipelines succeed.
399
+ - Once the release candidate is verified, tag and push a milestone, for example, `git push origin 0.1.0`.
400
+ The tag must be with the latest commit of `releasing/[version number]`.
401
+ - Upload the packages to [PyPI](https://pypi.org/project/monai/).
402
+ This could be done manually by ``twine upload dist/*``, given the artifacts are unzipped to the folder ``dist/``.
403
+ - Merge `releasing/[version number]` into `dev`; this step must ensure that the tagged commit remains unchanged on `dev`.
404
+ - Publish the release note.
405
+
406
+ Note that the release should be tagged with a [PEP440](https://www.python.org/dev/peps/pep-0440/) compliant version number.
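A tag name's PEP 440 compliance can be sanity-checked before pushing; the sketch below uses a simplified subset of the grammar (the full specification also allows epochs, post- and dev-release segments):

```python
import re

# simplified subset of the PEP 440 grammar: a release segment plus
# an optional pre-release marker such as "rc1", "a1" or "b2"
_PEP440_SUBSET = re.compile(r"^\d+(\.\d+)*((a|b|rc)\d+)?$")


def looks_pep440(version):
    """Return True if ``version`` matches the simplified PEP 440 subset above."""
    return bool(_PEP440_SUBSET.match(version))
```

Under this subset, `0.1.0` and `0.1.0rc1` are accepted, while `v0.1.0` and `0.1.0-rc1` are not.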
407
+
408
+ If any error occurs during the release process, first check out a new hotfix branch from the `releasing/[version number]`,
409
+ then make PRs to the `releasing/[version number]` to fix the bugs via the regular contribution procedure.
410
+
411
+ If any error occurs after the release process, first check out a new hotfix branch from the `main` branch,
412
+ make a patch version release following semantic versioning, for example, `releasing/0.1.1`.
413
+ Make sure the `releasing/0.1.1` is merged back into both `dev` and `main` and all the test pipelines succeed.
414
+
415
+ <p align="right">
416
+ <a href="#introduction">⬆️ Back to Top</a>
417
+ </p>
MONAI/source/Dockerfile ADDED
@@ -0,0 +1,66 @@
1
+ # Copyright (c) MONAI Consortium
2
+ # Licensed under the Apache License, Version 2.0 (the "License");
3
+ # you may not use this file except in compliance with the License.
4
+ # You may obtain a copy of the License at
5
+ # http://www.apache.org/licenses/LICENSE-2.0
6
+ # Unless required by applicable law or agreed to in writing, software
7
+ # distributed under the License is distributed on an "AS IS" BASIS,
8
+ # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
9
+ # See the License for the specific language governing permissions and
10
+ # limitations under the License.
11
+
12
+ # To build with a different base image
13
+ # please run `docker build` using the `--build-arg PYTORCH_IMAGE=...` flag.
14
+ ARG PYTORCH_IMAGE=nvcr.io/nvidia/pytorch:24.10-py3
15
+ FROM ${PYTORCH_IMAGE}
16
+
17
+ LABEL maintainer="monai.contact@gmail.com"
18
+
19
+ # TODO: remark for issue [revise the dockerfile](https://github.com/zarr-developers/numcodecs/issues/431)
20
+ RUN if [[ $(uname -m) =~ "aarch64" ]]; then \
21
+ export CFLAGS="-O3" && \
22
+ export DISABLE_NUMCODECS_SSE2=true && \
23
+ export DISABLE_NUMCODECS_AVX2=true && \
24
+ pip install numcodecs; \
25
+ fi
26
+
27
+ WORKDIR /opt/monai
28
+
29
+ # install full deps
30
+ COPY requirements.txt requirements-min.txt requirements-dev.txt /tmp/
31
+ RUN cp /tmp/requirements.txt /tmp/req.bak \
32
+ && awk '!/torch/' /tmp/requirements.txt > /tmp/tmp && mv /tmp/tmp /tmp/requirements.txt \
33
+ && python -m pip install --upgrade --no-cache-dir pip \
34
+ && python -m pip install --no-cache-dir -r /tmp/requirements-dev.txt
35
+
36
+ # compile ext and remove temp files
37
+ # TODO: remark for issue [revise the dockerfile #1276](https://github.com/Project-MONAI/MONAI/issues/1276)
38
+ # please specify exact files and folders to be copied -- else, basically always, the Docker build process cannot cache
39
+ # this or anything below it and always will build from at most here; one file change leads to no caching from here on...
40
+
41
+ COPY LICENSE CHANGELOG.md CODE_OF_CONDUCT.md CONTRIBUTING.md README.md versioneer.py setup.py setup.cfg runtests.sh MANIFEST.in ./
42
+ COPY tests ./tests
43
+ COPY monai ./monai
44
+
45
+ # TODO: remove this line and torch.patch for 24.11
46
+ RUN patch -R -d /usr/local/lib/python3.10/dist-packages/torch/onnx/ < ./monai/torch.patch
47
+
48
+ RUN BUILD_MONAI=1 FORCE_CUDA=1 python setup.py develop \
49
+ && rm -rf build __pycache__
50
+
51
+ # NGC Client
52
+ WORKDIR /opt/tools
53
+ ARG NGC_CLI_URI="https://ngc.nvidia.com/downloads/ngccli_linux.zip"
54
+ RUN wget -q ${NGC_CLI_URI} && unzip ngccli_linux.zip && chmod u+x ngc-cli/ngc && \
55
+ find ngc-cli/ -type f -exec md5sum {} + | LC_ALL=C sort | md5sum -c ngc-cli.md5 && \
56
+ rm -rf ngccli_linux.zip ngc-cli.md5
57
+ ENV PATH=${PATH}:/opt/tools:/opt/tools/ngc-cli
58
+ RUN apt-get update \
59
+ && DEBIAN_FRONTEND="noninteractive" apt-get install -y libopenslide0 \
60
+ && rm -rf /var/lib/apt/lists/*
61
+ # append /opt/tools to runtime path for NGC CLI to be accessible from all file system locations
62
+ ENV PATH=${PATH}:/opt/tools
63
+ ENV POLYGRAPHY_AUTOINSTALL_DEPS=1
64
+
65
+
66
+ WORKDIR /opt/monai
MONAI/source/LICENSE ADDED
@@ -0,0 +1,201 @@
1
+ Apache License
2
+ Version 2.0, January 2004
3
+ http://www.apache.org/licenses/
4
+
5
+ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
6
+
7
+ 1. Definitions.
8
+
9
+ "License" shall mean the terms and conditions for use, reproduction,
10
+ and distribution as defined by Sections 1 through 9 of this document.
11
+
12
+ "Licensor" shall mean the copyright owner or entity authorized by
13
+ the copyright owner that is granting the License.
14
+
15
+ "Legal Entity" shall mean the union of the acting entity and all
16
+ other entities that control, are controlled by, or are under common
17
+ control with that entity. For the purposes of this definition,
18
+ "control" means (i) the power, direct or indirect, to cause the
19
+ direction or management of such entity, whether by contract or
20
+ otherwise, or (ii) ownership of fifty percent (50%) or more of the
21
+ outstanding shares, or (iii) beneficial ownership of such entity.
22
+
23
+ "You" (or "Your") shall mean an individual or Legal Entity
24
+ exercising permissions granted by this License.
25
+
26
+ "Source" form shall mean the preferred form for making modifications,
27
+ including but not limited to software source code, documentation
28
+ source, and configuration files.
29
+
30
+ "Object" form shall mean any form resulting from mechanical
31
+ transformation or translation of a Source form, including but
32
+ not limited to compiled object code, generated documentation,
33
+ and conversions to other media types.
34
+
35
+ "Work" shall mean the work of authorship, whether in Source or
36
+ Object form, made available under the License, as indicated by a
37
+ copyright notice that is included in or attached to the work
38
+ (an example is provided in the Appendix below).
39
+
40
+ "Derivative Works" shall mean any work, whether in Source or Object
41
+ form, that is based on (or derived from) the Work and for which the
42
+ editorial revisions, annotations, elaborations, or other modifications
43
+ represent, as a whole, an original work of authorship. For the purposes
44
+ of this License, Derivative Works shall not include works that remain
45
+ separable from, or merely link (or bind by name) to the interfaces of,
46
+ the Work and Derivative Works thereof.
47
+
48
+ "Contribution" shall mean any work of authorship, including
49
+ the original version of the Work and any modifications or additions
50
+ to that Work or Derivative Works thereof, that is intentionally
51
+ submitted to Licensor for inclusion in the Work by the copyright owner
52
+ or by an individual or Legal Entity authorized to submit on behalf of
53
+ the copyright owner. For the purposes of this definition, "submitted"
54
+ means any form of electronic, verbal, or written communication sent
55
+ to the Licensor or its representatives, including but not limited to
56
+ communication on electronic mailing lists, source code control systems,
57
+ and issue tracking systems that are managed by, or on behalf of, the
58
+ Licensor for the purpose of discussing and improving the Work, but
59
+ excluding communication that is conspicuously marked or otherwise
60
+ designated in writing by the copyright owner as "Not a Contribution."
61
+
62
+ "Contributor" shall mean Licensor and any individual or Legal Entity
63
+ on behalf of whom a Contribution has been received by Licensor and
64
+ subsequently incorporated within the Work.
65
+
66
+ 2. Grant of Copyright License. Subject to the terms and conditions of
67
+ this License, each Contributor hereby grants to You a perpetual,
68
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
69
+ copyright license to reproduce, prepare Derivative Works of,
70
+ publicly display, publicly perform, sublicense, and distribute the
71
+ Work and such Derivative Works in Source or Object form.
72
+
73
+ 3. Grant of Patent License. Subject to the terms and conditions of
74
+ this License, each Contributor hereby grants to You a perpetual,
75
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
76
+ (except as stated in this section) patent license to make, have made,
77
+ use, offer to sell, sell, import, and otherwise transfer the Work,
78
+ where such license applies only to those patent claims licensable
79
+ by such Contributor that are necessarily infringed by their
80
+ Contribution(s) alone or by combination of their Contribution(s)
81
+ with the Work to which such Contribution(s) was submitted. If You
82
+ institute patent litigation against any entity (including a
83
+ cross-claim or counterclaim in a lawsuit) alleging that the Work
84
+ or a Contribution incorporated within the Work constitutes direct
85
+ or contributory patent infringement, then any patent licenses
86
+ granted to You under this License for that Work shall terminate
87
+ as of the date such litigation is filed.
88
+
89
+ 4. Redistribution. You may reproduce and distribute copies of the
90
+ Work or Derivative Works thereof in any medium, with or without
91
+ modifications, and in Source or Object form, provided that You
92
+ meet the following conditions:
93
+
94
+ (a) You must give any other recipients of the Work or
95
+ Derivative Works a copy of this License; and
96
+
97
+ (b) You must cause any modified files to carry prominent notices
98
+ stating that You changed the files; and
99
+
100
+ (c) You must retain, in the Source form of any Derivative Works
101
+ that You distribute, all copyright, patent, trademark, and
102
+ attribution notices from the Source form of the Work,
103
+ excluding those notices that do not pertain to any part of
104
+ the Derivative Works; and
105
+
106
+ (d) If the Work includes a "NOTICE" text file as part of its
107
+ distribution, then any Derivative Works that You distribute must
108
+ include a readable copy of the attribution notices contained
109
+ within such NOTICE file, excluding those notices that do not
110
+ pertain to any part of the Derivative Works, in at least one
111
+ of the following places: within a NOTICE text file distributed
112
+ as part of the Derivative Works; within the Source form or
113
+ documentation, if provided along with the Derivative Works; or,
114
+ within a display generated by the Derivative Works, if and
115
+ wherever such third-party notices normally appear. The contents
116
+ of the NOTICE file are for informational purposes only and
117
+ do not modify the License. You may add Your own attribution
118
+ notices within Derivative Works that You distribute, alongside
119
+ or as an addendum to the NOTICE text from the Work, provided
120
+ that such additional attribution notices cannot be construed
121
+ as modifying the License.
122
+
123
+ You may add Your own copyright statement to Your modifications and
124
+ may provide additional or different license terms and conditions
125
+ for use, reproduction, or distribution of Your modifications, or
126
+ for any such Derivative Works as a whole, provided Your use,
127
+ reproduction, and distribution of the Work otherwise complies with
128
+ the conditions stated in this License.
129
+
130
+ 5. Submission of Contributions. Unless You explicitly state otherwise,
131
+ any Contribution intentionally submitted for inclusion in the Work
132
+ by You to the Licensor shall be under the terms and conditions of
133
+ this License, without any additional terms or conditions.
134
+ Notwithstanding the above, nothing herein shall supersede or modify
135
+ the terms of any separate license agreement you may have executed
136
+ with Licensor regarding such Contributions.
137
+
138
+ 6. Trademarks. This License does not grant permission to use the trade
139
+ names, trademarks, service marks, or product names of the Licensor,
140
+ except as required for reasonable and customary use in describing the
141
+ origin of the Work and reproducing the content of the NOTICE file.
142
+
143
+ 7. Disclaimer of Warranty. Unless required by applicable law or
144
+ agreed to in writing, Licensor provides the Work (and each
145
+ Contributor provides its Contributions) on an "AS IS" BASIS,
146
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
147
+ implied, including, without limitation, any warranties or conditions
148
+ of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
149
+ PARTICULAR PURPOSE. You are solely responsible for determining the
150
+ appropriateness of using or redistributing the Work and assume any
151
+ risks associated with Your exercise of permissions under this License.
152
+
153
+ 8. Limitation of Liability. In no event and under no legal theory,
154
+ whether in tort (including negligence), contract, or otherwise,
155
+ unless required by applicable law (such as deliberate and grossly
156
+ negligent acts) or agreed to in writing, shall any Contributor be
157
+ liable to You for damages, including any direct, indirect, special,
158
+ incidental, or consequential damages of any character arising as a
159
+ result of this License or out of the use or inability to use the
160
+ Work (including but not limited to damages for loss of goodwill,
161
+ work stoppage, computer failure or malfunction, or any and all
162
+ other commercial damages or losses), even if such Contributor
163
+ has been advised of the possibility of such damages.
164
+
165
+ 9. Accepting Warranty or Additional Liability. While redistributing
166
+ the Work or Derivative Works thereof, You may choose to offer,
167
+ and charge a fee for, acceptance of support, warranty, indemnity,
168
+ or other liability obligations and/or rights consistent with this
169
+ License. However, in accepting such obligations, You may act only
170
+ on Your own behalf and on Your sole responsibility, not on behalf
171
+ of any other Contributor, and only if You agree to indemnify,
172
+ defend, and hold each Contributor harmless for any liability
173
+ incurred by, or claims asserted against, such Contributor by reason
174
+ of your accepting any such warranty or additional liability.
175
+
176
+ END OF TERMS AND CONDITIONS
177
+
178
+ APPENDIX: How to apply the Apache License to your work.
179
+
180
+ To apply the Apache License to your work, attach the following
181
+ boilerplate notice, with the fields enclosed by brackets "[]"
182
+ replaced with your own identifying information. (Don't include
183
+ the brackets!) The text should be enclosed in the appropriate
184
+ comment syntax for the file format. We also recommend that a
185
+ file or class name and description of purpose be included on the
186
+ same "printed page" as the copyright notice for easier
187
+ identification within third-party archives.
188
+
189
+ Copyright [yyyy] [name of copyright owner]
190
+
191
+ Licensed under the Apache License, Version 2.0 (the "License");
192
+ you may not use this file except in compliance with the License.
193
+ You may obtain a copy of the License at
194
+
195
+ http://www.apache.org/licenses/LICENSE-2.0
196
+
197
+ Unless required by applicable law or agreed to in writing, software
198
+ distributed under the License is distributed on an "AS IS" BASIS,
199
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
200
+ See the License for the specific language governing permissions and
201
+ limitations under the License.
MONAI/source/MANIFEST.in ADDED
@@ -0,0 +1,5 @@
1
+ include versioneer.py
2
+ include monai/_version.py
3
+
4
+ include README.md
5
+ include LICENSE
MONAI/source/README.md ADDED
@@ -0,0 +1,96 @@
1
+ <p align="center">
2
+ <img src="https://raw.githubusercontent.com/Project-MONAI/MONAI/dev/docs/images/MONAI-logo-color.png" width="50%" alt='project-monai'>
3
+ </p>
4
+
5
+ **M**edical **O**pen **N**etwork for **AI**
6
+
7
+ ![Supported Python versions](https://raw.githubusercontent.com/Project-MONAI/MONAI/dev/docs/images/python.svg)
8
+ [![License](https://img.shields.io/badge/license-Apache%202.0-green.svg)](https://opensource.org/licenses/Apache-2.0)
9
+ [![auto-commit-msg](https://img.shields.io/badge/dynamic/json?label=citations&query=%24.citationCount&url=https%3A%2F%2Fapi.semanticscholar.org%2Fgraph%2Fv1%2Fpaper%2FDOI%3A10.48550%2FarXiv.2211.02701%3Ffields%3DcitationCount)](https://arxiv.org/abs/2211.02701)
10
+ [![PyPI version](https://badge.fury.io/py/monai.svg)](https://badge.fury.io/py/monai)
11
+ [![docker](https://img.shields.io/badge/docker-pull-green.svg?logo=docker&logoColor=white)](https://hub.docker.com/r/projectmonai/monai)
12
+ [![conda](https://img.shields.io/conda/vn/conda-forge/monai?color=green)](https://anaconda.org/conda-forge/monai)
13
+
14
+ [![premerge](https://github.com/Project-MONAI/MONAI/actions/workflows/pythonapp.yml/badge.svg?branch=dev)](https://github.com/Project-MONAI/MONAI/actions/workflows/pythonapp.yml)
15
+ [![postmerge](https://img.shields.io/github/checks-status/project-monai/monai/dev?label=postmerge)](https://github.com/Project-MONAI/MONAI/actions?query=branch%3Adev)
16
+ [![Documentation Status](https://readthedocs.org/projects/monai/badge/?version=latest)](https://monai.readthedocs.io/en/latest/)
17
+ [![codecov](https://codecov.io/gh/Project-MONAI/MONAI/branch/dev/graph/badge.svg?token=6FTC7U1JJ4)](https://codecov.io/gh/Project-MONAI/MONAI)
18
+ [![monai Downloads Last Month](https://assets.piptrends.com/get-last-month-downloads-badge/monai.svg 'monai Downloads Last Month by pip Trends')](https://piptrends.com/package/monai)
19
+
20
+ MONAI is a [PyTorch](https://pytorch.org/)-based, [open-source](https://github.com/Project-MONAI/MONAI/blob/dev/LICENSE) framework for deep learning in healthcare imaging, part of the [PyTorch Ecosystem](https://pytorch.org/ecosystem/).
21
+ Its ambitions are as follows:
22
+
23
+ - Developing a community of academic, industrial and clinical researchers collaborating on a common foundation;
24
+ - Creating state-of-the-art, end-to-end training workflows for healthcare imaging;
25
+ - Providing researchers with an optimized and standardized way to create and evaluate deep learning models.
26
+
27
+ ## Features
28
+
29
+ > _Please see [the technical highlights](https://monai.readthedocs.io/en/latest/highlights.html) and [What's New](https://monai.readthedocs.io/en/latest/whatsnew.html) of the milestone releases._
30
+
31
+ - flexible pre-processing for multi-dimensional medical imaging data;
32
+ - compositional & portable APIs for ease of integration in existing workflows;
33
+ - domain-specific implementations for networks, losses, evaluation metrics and more;
34
+ - customizable design for varying user expertise;
35
+ - multi-GPU multi-node data parallelism support.
36
+
37
+ ## Requirements
38
+
39
+ MONAI works with the [currently supported versions of Python](https://devguide.python.org/versions), and depends directly on NumPy and PyTorch with many optional dependencies.
40
+
41
+ * Major releases of MONAI will have dependency versions stated for them. The current state of the `dev` branch in this repository is the unreleased development version of MONAI which typically will support current versions of dependencies and include updates and bug fixes to do so.
42
+ * PyTorch support covers [the current version](https://github.com/pytorch/pytorch/releases) plus three previous minor versions. If compatibility issues with a PyTorch version and other dependencies arise, support for a version may be delayed until a major release.
43
+ * Our support policy for other dependencies adheres for the most part to [SPEC0](https://scientific-python.org/specs/spec-0000), where dependency versions are supported where possible for up to two years. Discovered vulnerabilities or defects may require certain versions to be explicitly not supported.
44
+ * See the `requirements*.txt` files for dependency version information.
45
+
46
+ ## Installation
47
+
48
+ To install [the current release](https://pypi.org/project/monai/), you can simply run:
49
+
50
+ ```bash
51
+ pip install monai
52
+ ```
53
+
54
+ Please refer to [the installation guide](https://monai.readthedocs.io/en/latest/installation.html) for other installation options.
55
+
56
+ ## Getting Started
57
+
58
+ [MedNIST demo](https://colab.research.google.com/github/Project-MONAI/tutorials/blob/main/2d_classification/mednist_tutorial.ipynb) and [MONAI for PyTorch Users](https://colab.research.google.com/github/Project-MONAI/tutorials/blob/main/modules/developer_guide.ipynb) are available on Colab.
59
+
60
+ Examples and notebook tutorials are located at [Project-MONAI/tutorials](https://github.com/Project-MONAI/tutorials).
61
+
62
+ Technical documentation is available at [docs.monai.io](https://docs.monai.io).
63
+
64
+ ## Citation
65
+
66
+ If you have used MONAI in your research, please cite us! The citation can be exported from: <https://arxiv.org/abs/2211.02701>.
67
+
68
+ ## Model Zoo
69
+
70
+ [The MONAI Model Zoo](https://github.com/Project-MONAI/model-zoo) is a place for researchers and data scientists to share the latest and greatest models from the community.
71
+ Utilizing [the MONAI Bundle format](https://monai.readthedocs.io/en/latest/bundle_intro.html) makes it easy to [get started](https://github.com/Project-MONAI/tutorials/tree/main/model_zoo) building workflows with MONAI.
72
+
73
+ ## Contributing
74
+
75
+ For guidance on making a contribution to MONAI, see the [contributing guidelines](https://github.com/Project-MONAI/MONAI/blob/dev/CONTRIBUTING.md).
76
+
77
+ ## Community
78
+
79
+ Join the conversation on Twitter/X [@ProjectMONAI](https://twitter.com/ProjectMONAI), [LinkedIn](https://www.linkedin.com/company/projectmonai), or join our [Slack channel](https://forms.gle/QTxJq3hFictp31UM9).
80
+
81
+ Ask and answer questions over on [MONAI's GitHub Discussions tab](https://github.com/Project-MONAI/MONAI/discussions).
82
+
83
+ ## Links
84
+
85
+ - Website: <https://project-monai.github.io/>
86
+ - API documentation (milestone): <https://monai.readthedocs.io/>
87
+ - API documentation (latest dev): <https://monai.readthedocs.io/en/latest/>
88
+ - Code: <https://github.com/Project-MONAI/MONAI>
89
+ - Project tracker: <https://github.com/Project-MONAI/MONAI/projects>
90
+ - Issue tracker: <https://github.com/Project-MONAI/MONAI/issues>
91
+ - Wiki: <https://github.com/Project-MONAI/MONAI/wiki>
92
+ - Test status: <https://github.com/Project-MONAI/MONAI/actions>
93
+ - PyPI package: <https://pypi.org/project/monai/>
94
+ - conda-forge: <https://anaconda.org/conda-forge/monai>
95
+ - Weekly previews: <https://pypi.org/project/monai-weekly/>
96
+ - Docker Hub: <https://hub.docker.com/r/projectmonai/monai>
MONAI/source/SECURITY.md ADDED
@@ -0,0 +1,18 @@
1
+ # Security Policy
2
+
3
+ ## Reporting a Vulnerability
4
+ MONAI takes security seriously and appreciates your efforts to responsibly disclose vulnerabilities. If you discover a security issue, please report it as soon as possible.
5
+
6
+ To report a security issue:
7
+ * please use the GitHub Security Advisories tab to "[Open a draft security advisory](https://github.com/Project-MONAI/MONAI/security/advisories/new)".
8
+ * Include a detailed description of the issue, steps to reproduce, potential impact, and any possible mitigations.
9
+ * If applicable, please also attach proof-of-concept code or screenshots.
10
+ * We aim to acknowledge your report within 72 hours and provide a status update as we investigate.
11
+ * Please do not create public issues for security-related reports.
12
+
13
+ ## Disclosure Policy
14
+ * We follow a coordinated disclosure approach.
15
+ * We will not publicly disclose vulnerabilities until a fix has been developed and released.
16
+ * Credit will be given to researchers who responsibly disclose vulnerabilities, if requested.
17
+ ## Acknowledgements
18
+ We greatly appreciate contributions from the security community and strive to recognize all researchers who help keep MONAI safe.
MONAI/source/__init__.py ADDED
@@ -0,0 +1,4 @@
+ # -*- coding: utf-8 -*-
+ """
+ MONAI Project Package Initialization File
+ """
MONAI/source/docs/.readthedocs.yaml ADDED
@@ -0,0 +1,14 @@
+ # Read the Docs configuration file
+ # See https://docs.readthedocs.io/en/stable/config-file for details
+
+ version: 2
+
+ build:
+   os: ubuntu-22.04
+   tools:
+     python: "3.9"
+ sphinx:
+   configuration: docs/source/conf.py
+ python:
+   install:
+     - requirements: docs/requirements.txt
MONAI/source/docs/Makefile ADDED
@@ -0,0 +1,29 @@
+ # Minimal makefile for Sphinx documentation
+ #
+
+ # You can set these variables from the command line, and also
+ # from the environment for the first two.
+ SPHINXOPTS ?=
+ SPHINXBUILD ?= sphinx-build
+ SOURCEDIR = source
+ BUILDDIR = build
+
+ # https://github.com/Project-MONAI/MONAI/issues/4354
+ export PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION := python
+
+ # Put it first so that "make" without argument is like "make help".
+ help:
+ 	@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
+
+ .PHONY: help Makefile
+
+ # Catch-all target: route all unknown targets to Sphinx using the new
+ # "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
+ %: Makefile
+ 	PIP_ROOT_USER_ACTION=ignore pip install -r requirements.txt
+ 	@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
+
+ clean:
+ 	rm -rf build/
+ 	rm -rf source/_gen
+ 	rm -rf source/*_properties.csv
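The catch-all `%: Makefile` rule above means that any target name (`html`, `latexpdf`, `linkcheck`, …) is forwarded to Sphinx's "make mode"; only `help` and `clean` have dedicated recipes. The sketch below illustrates that routing with a stripped-down copy of the rule in a scratch directory (the `pip install` line from the real recipe is omitted, and `make -n` dry-runs so `sphinx-build` is printed but never executed; assumes GNU make):

```shell
# Sketch only: reproduce the catch-all rule and dry-run it.
tmpdir=$(mktemp -d)
cd "$tmpdir"
cat > Makefile <<'EOF'
SPHINXBUILD ?= sphinx-build
SOURCEDIR = source
BUILDDIR = build

%: Makefile
	@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS)
EOF
# Any target name is forwarded to sphinx-build's make mode:
make -n html
make -n latexpdf
```

Because the rule is a match-anything pattern rule with `Makefile` as its prerequisite, every unknown target resolves to the same recipe, with the target name substituted into `sphinx-build -M $@`.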
MONAI/source/docs/_static/custom.css ADDED
@@ -0,0 +1,4 @@
+ @import url('https://fonts.googleapis.com/css?family=Lekton:700|Roboto&display=swap');
+ body{font-family:'Roboto',sans-serif;}.wy-menu-vertical p.caption{color:#7cccc7;}
+ *{font-variant-ligatures: none;}.autoclasstoc td {padding:0.2rem;line-height:normal;}
+ dl.field-list>dt{word-break: normal}
MONAI/source/docs/images/3d_paired.png ADDED
MONAI/source/docs/images/BTCV_organs.png ADDED

Git LFS Details

  • SHA256: 6edb3358d9ac570c2495ebffb0069e79535f92f00f3592ef23e86268e97afd45
  • Pointer size: 131 Bytes
  • Size of remote file: 726 kB
MONAI/source/docs/images/MONAI-logo-color.png ADDED
MONAI/source/docs/images/MONAI_arch.png ADDED
MONAI/source/docs/images/MONAI_bundle_cloud.png ADDED

Git LFS Details

  • SHA256: fdf5b545aafc80c218bc3bc84698e9ded6a167fff5199227ecc9a97f2a837a92
  • Pointer size: 131 Bytes
  • Size of remote file: 131 kB
MONAI/source/docs/images/MONAI_clouds.png ADDED
MONAI/source/docs/images/MONAI_map_cloud.png ADDED

Git LFS Details

  • SHA256: feaefb6801b016082651b5c9ddd1bd8f29792012c5493b8b13e4f80edcb52fcb
  • Pointer size: 131 Bytes
  • Size of remote file: 105 kB
MONAI/source/docs/images/UNETR.png ADDED

Git LFS Details

  • SHA256: 3a3aa9f4ae4160c9553a5dfdeba1b40d21e8a8c5f6690757ce3e5941fa9ce674
  • Pointer size: 131 Bytes
  • Size of remote file: 246 kB
MONAI/source/docs/images/affine.png ADDED

Git LFS Details

  • SHA256: 1f4b1541b36faf9fadb91aa50835bdd1c385ba07ca6273c753f2720c6a165a42
  • Pointer size: 131 Bytes
  • Size of remote file: 258 kB
MONAI/source/docs/images/amp_training_a100.png ADDED

Git LFS Details

  • SHA256: b3da4a2ce9bb75baade874f953fa4efa9288246a1a0791bf3ea4a27595e888a8
  • Pointer size: 131 Bytes
  • Size of remote file: 543 kB
MONAI/source/docs/images/amp_training_v100.png ADDED

Git LFS Details

  • SHA256: fe4bf2dbed97d9429a1cb1bbe452eeabc9ed266dccecad1d828e660f98f94d1c
  • Pointer size: 131 Bytes
  • Size of remote file: 678 kB
MONAI/source/docs/images/arch_modules.png ADDED

Git LFS Details

  • SHA256: 42314a4e5dfc3feebe587caeb337ac190cabd317dfd78927b1fa2c140eedaa44
  • Pointer size: 131 Bytes
  • Size of remote file: 240 kB
MONAI/source/docs/images/auto3dseg.png ADDED

Git LFS Details

  • SHA256: 899fa0a81474c13701f027135010fb5b42ba8f1284895010ce73b037e40846bf
  • Pointer size: 131 Bytes
  • Size of remote file: 235 kB
MONAI/source/docs/images/blend.png ADDED
MONAI/source/docs/images/blend_images.png ADDED
MONAI/source/docs/images/brats_distributed.png ADDED

Git LFS Details

  • SHA256: abb44e898239d33a9b79580d9d8b4266817a5f5089b6ae928ab4a303ffff8502
  • Pointer size: 131 Bytes
  • Size of remote file: 191 kB
MONAI/source/docs/images/cache_dataset.png ADDED

Git LFS Details

  • SHA256: e5a859f718cbcb496fd4aad5556423b96ba1bb62a321f6375033e37a9807a3b5
  • Pointer size: 131 Bytes
  • Size of remote file: 340 kB
MONAI/source/docs/images/cam.png ADDED
MONAI/source/docs/images/coplenet.png ADDED

Git LFS Details

  • SHA256: cf293e2294c6678f1d69a73520645755a50e95c04fe8f90fda9c325cb0d36116
  • Pointer size: 131 Bytes
  • Size of remote file: 645 kB