Archived Roadmaps
See the latest roadmaps here

We are finalizing our UI/UX overhaul by migrating all remaining components to the new ohif/ui-next. This update will fully replace the legacy ohif/ui package, offering enhanced customization, accessibility, and compatibility. Migration guides will be provided to ensure a smooth transition.

We will introduce undo/redo capabilities for annotation and segmentation actions, allowing users to reverse or reapply changes step-by-step. This feature will support configurable undo stack depth, with programmatic control to enable, disable, or clear the undo/redo history as needed.
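As a minimal sketch of the planned behavior, an undo/redo history with a configurable stack depth could look like the following. The `HistoryManager` class and its method names are illustrative assumptions, not the actual OHIF API:

```typescript
// Hypothetical sketch: each recorded action stores the state before and
// after the change, so undo/redo can restore either side.
type Memento<T> = { before: T; after: T };

class HistoryManager<T> {
  private undoStack: Memento<T>[] = [];
  private redoStack: Memento<T>[] = [];
  private enabled = true;

  constructor(private maxDepth = 50) {}

  // Record a change; a new action invalidates the redo history, and the
  // oldest entry is evicted once the configured depth is exceeded.
  push(entry: Memento<T>): void {
    if (!this.enabled) return;
    this.undoStack.push(entry);
    if (this.undoStack.length > this.maxDepth) this.undoStack.shift();
    this.redoStack = [];
  }

  undo(): T | undefined {
    const entry = this.undoStack.pop();
    if (!entry) return undefined;
    this.redoStack.push(entry);
    return entry.before; // state to restore
  }

  redo(): T | undefined {
    const entry = this.redoStack.pop();
    if (!entry) return undefined;
    this.undoStack.push(entry);
    return entry.after;
  }

  setEnabled(on: boolean): void { this.enabled = on; }
  clear(): void { this.undoStack = []; this.redoStack = []; }
}
```

With `maxDepth = 2`, a third push evicts the first entry, so only the two most recent changes remain undoable.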

We will introduce a 3D brush tool for voxel-level segmentation that enables selective region filling based on grayscale similarity and spatial connectivity. Users will benefit from interactive previews, adjustable parameters, and integration with existing segmentations for precise, dynamic 3D editing.
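The selective filling described above amounts to threshold-based region growing. The sketch below is a 2D illustration under assumed names and array layout; the planned tool operates on 3D volumes, where the neighborhood extends to 6-connectivity:

```typescript
// Hypothetical sketch: starting from a seed voxel, include every
// connected pixel whose grayscale value is within `tolerance` of the
// seed's value (grayscale similarity + spatial connectivity).
function regionGrow(
  image: number[][],      // grayscale values, row-major
  seed: [number, number], // [row, col]
  tolerance: number
): boolean[][] {
  const rows = image.length, cols = image[0].length;
  const mask = Array.from({ length: rows }, () =>
    new Array<boolean>(cols).fill(false)
  );
  const seedValue = image[seed[0]][seed[1]];
  const stack: [number, number][] = [seed];
  while (stack.length) {
    const [r, c] = stack.pop()!;
    if (r < 0 || r >= rows || c < 0 || c >= cols) continue; // out of bounds
    if (mask[r][c]) continue;                               // already filled
    if (Math.abs(image[r][c] - seedValue) > tolerance) continue;
    mask[r][c] = true;
    // 4-connected neighbors; a 3D version adds the two slice neighbors.
    stack.push([r + 1, c], [r - 1, c], [r, c + 1], [r, c - 1]);
  }
  return mask;
}
```

Interactive previews would simply re-run this fill with the current tolerance while the user drags a slider.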

We will develop methods that enable retrieval of detailed segmentation metrics, including intensity values, metabolic tumor volume, lesion glycolysis, and other critical statistics. This will support both individual labelmap segmentations and groups of overlapping segmentations, facilitating advanced analysis.
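For instance, metabolic tumor volume (MTV) and total lesion glycolysis (TLG) can be derived from a labelmap and the co-registered SUV values: MTV is the voxel volume times the labeled voxel count, and TLG is mean SUV times MTV. The flat typed arrays and names below are assumptions of this sketch, not the planned API:

```typescript
// Hypothetical sketch of per-segment PET statistics.
interface SegmentStats {
  voxelCount: number;
  meanSUV: number;
  mtvMl: number; // metabolic tumor volume, millilitres
  tlg: number;   // total lesion glycolysis, SUV * ml
}

function computeSegmentStats(
  suv: Float32Array,    // SUV value per voxel
  labelmap: Uint8Array, // segment index per voxel (same length)
  segmentIndex: number,
  voxelVolumeMl: number // e.g. x*y*z spacing in mm^3 divided by 1000
): SegmentStats {
  let voxelCount = 0;
  let sum = 0;
  for (let i = 0; i < labelmap.length; i++) {
    if (labelmap[i] === segmentIndex) {
      voxelCount++;
      sum += suv[i];
    }
  }
  const meanSUV = voxelCount ? sum / voxelCount : 0;
  const mtvMl = voxelCount * voxelVolumeMl;
  return { voxelCount, meanSUV, mtvMl, tlg: meanSUV * mtvMl };
}
```

Overlapping segmentations would run the same accumulation once per labelmap and merge the results.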

We will add functionality to interpolate labelmap segmentations across slices, providing users with a smooth, continuous segmentation experience. This tool will include a dedicated UI and will leverage algorithms from ITK to support interpolation between non-adjacent labelmap slices.
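To illustrate the idea only: one classic approach is shape-based interpolation, which blends signed distance fields of the two known slices and thresholds the result. Note this brute-force sketch is not the ITK algorithm the tool is planned to use; it is a stand-in to show what inter-slice interpolation produces:

```typescript
// Hypothetical sketch: signed distance is negative inside the mask and
// positive outside (brute-force nearest-boundary search, fine for tiny
// examples only).
function signedDistance(mask: boolean[][]): number[][] {
  const rows = mask.length, cols = mask[0].length;
  const nearest = (r: number, c: number, target: boolean): number => {
    let best = Infinity;
    for (let i = 0; i < rows; i++)
      for (let j = 0; j < cols; j++)
        if (mask[i][j] === target)
          best = Math.min(best, Math.hypot(i - r, j - c));
    return best;
  };
  return mask.map((row, r) =>
    row.map((inside, c) =>
      inside ? -nearest(r, c, false) : nearest(r, c, true)
    )
  );
}

// Linearly blend the two distance fields and threshold at zero to get
// the in-between slice (t = 0 gives slice a, t = 1 gives slice b).
function interpolateSlice(a: boolean[][], b: boolean[][], t: number): boolean[][] {
  const da = signedDistance(a), db = signedDistance(b);
  return a.map((row, r) =>
    row.map((_, c) => (1 - t) * da[r][c] + t * db[r][c] <= 0)
  );
}
```

Interpolating between non-adjacent slices then means evaluating `t` at each missing slice position between the two drawn ones.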

We will add a WebGPU-accelerated Grow Cut segmentation tool to our library, enabling fast 3D segmentation. This tool will offer both manual and automatic modes, allowing users to configure initial regions for precise or one-click segmentation.

We will leverage AI models to enable automatic propagation and adjustment of segments across image slices. This functionality will use encoder-decoder models such as Segment Anything Model (SAM), allowing users to efficiently extend and adapt segmentations slice by slice.

We will improve the rendering of segmentation outlines in MIP viewports, ensuring that outlines accurately follow the anatomy from all viewing angles. This enhancement will provide configurable outline thickness and color, as well as interactive selection of segments.

We will create a tool that identifies and places bidimensional measurements on the largest in-plane slice of a 3D segment. This includes the longest diameter along the lesion’s main axis and the longest possible perpendicular diameter within the same slice.
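The measurement logic can be sketched as follows, assuming contour points have already been extracted from the segment's largest in-plane slice; the angular tolerance parameter and all names are assumptions of this sketch:

```typescript
// Hypothetical sketch: the long axis is the farthest pair of contour
// points; the short axis is the longest pair whose direction is
// perpendicular to the long axis within an angular tolerance.
type Pt = [number, number];

function bidimensional(points: Pt[], tolDeg = 5) {
  const dist = (a: Pt, b: Pt) => Math.hypot(a[0] - b[0], a[1] - b[1]);

  // Longest diameter: brute-force farthest pair (fine for contour sizes).
  let major: [Pt, Pt] = [points[0], points[0]];
  for (const a of points)
    for (const b of points)
      if (dist(a, b) > dist(...major)) major = [a, b];
  const axis = Math.atan2(major[1][1] - major[0][1], major[1][0] - major[0][0]);

  // Longest perpendicular diameter within the same slice.
  let minor: [Pt, Pt] | null = null;
  for (const a of points)
    for (const b of points) {
      const d = dist(a, b);
      if (d === 0) continue;
      const ang = Math.atan2(b[1] - a[1], b[0] - a[0]);
      let diff = Math.abs(ang - axis) % Math.PI; // directions modulo 180 deg
      diff = Math.min(diff, Math.PI - diff);
      const perp = Math.abs(diff - Math.PI / 2) <= (tolDeg * Math.PI) / 180;
      if (perp && (!minor || d > dist(...minor))) minor = [a, b];
    }

  return {
    major,
    minor,
    majorLength: dist(...major),
    minorLength: minor ? dist(...minor) : 0,
  };
}
```

Picking the "largest in-plane slice" beforehand would reduce the 3D problem to running this routine on a single contour.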

We will add functionality to highlight individual segments on hover by increasing their border thickness, making it easier for users to identify and interact with specific lesions. This enhancement will require adjustments to the WebGL shader for dynamic border control.
