Programmatically adding a PlanarFreehandROI annotation in Cornerstone.js is not rendering the SVG path, despite an apparently correct data structure

Hi,
I am working with OHIF v3 and Cornerstone.js, and I’m trying to programmatically add a PlanarFreehandROI annotation from a predefined set of world coordinates.

The annotation is successfully added to the MeasurementService and appears in the measurements list. The object passed to annotationManager.addAnnotation appears to be correctly structured, with a matching FrameOfReferenceUID, 2D canvas coordinates in data.contour.points, and 3D world coordinates in handles.points. However, the SVG layer stays empty: no <path> element for the annotation is ever rendered.

I have already debugged the following:

  • FrameOfReferenceUID: Confirmed it matches the viewport’s UID.
  • Data Structure: The annotation object has the correct data.contour.points structure for rendering.
  • Coordinate System: I am converting the 3D world coordinates to 2D canvas coordinates using viewport.worldToCanvas().
  • State Flags: I have tried setting the invalidated flag to both true and false. Neither triggers a render.

Here is the relevant code and data:

1. The Component Logic (ViewerLayout.tsx)

This component’s useEffect hook triggers the injection logic once the viewport is ready. The injectMeasurements function prepares the data and calls the necessary services.

// Imports elided in the original snippet; for context they are roughly:
// import { useCallback } from 'react';
// import { getRenderingEngine, Types } from '@cornerstonejs/core';
// import { annotation } from '@cornerstonejs/tools';
// (measurementService, renderingEngineId, and activeViewportId come from
// OHIF's services and app state.)
const injectMeasurements = useCallback(
  (currentImageId, displayUID, instance, viewportElement) => {
    const renderingEngine = getRenderingEngine(renderingEngineId);
    if (!renderingEngine) return;
    const viewport = renderingEngine.getViewport(activeViewportId);
    if (!viewport) return;

    const annotationManager = annotation.state.getAnnotationManager();
    if (!annotationManager) return;

    const source = measurementService.getSource('Cornerstone3DTools', '0.1');

    measurementsMock.forEach(rawTemplate => {
      // 1. Prepare the measurement object with 3D world coordinates
      const rawMeasurement = {
        uid: rawTemplate.uid,
        label: rawTemplate.label,
        toolName: rawTemplate.toolName,
        points: rawTemplate.points, // This comes from my mock JSON file
        displaySetInstanceUID: displayUID,
        SOPInstanceUID: instance.SOPInstanceUID,
        FrameOfReferenceUID: instance.FrameOfReferenceUID,
        referenceStudyUID: instance.StudyInstanceUID,
        referenceSeriesUID: instance.SeriesInstanceUID,
        referencedImageId: currentImageId,
        frameNumber: 1,
        type: measurementService.VALUE_TYPES.POLYLINE,
        selected: false,
        source: source,
        data: {},
        textBox: rawTemplate.textBox,
      };

      // 2. Add the measurement to OHIF's MeasurementService (This part works)
      measurementService.annotationToMeasurement(
        source,
        measurementService.VALUE_TYPES.POLYLINE,
        rawMeasurement,
        false
      );

      // 3. Prepare and add the annotation for rendering
      if (!annotationManager.getAnnotation(rawMeasurement.uid)) {
        // 3a. Convert 3D world points to 2D canvas points
        const canvasPoints = rawMeasurement.points.map(p =>
          viewport.worldToCanvas(p as Types.Point3)
        );

        // 3b. Create the final annotation object for Cornerstone
        const annotationForRendering = {
          uid: rawMeasurement.uid,
          highlighted: false,
          invalidated: true, // Tried both true and false
          isLocked: false,
          visible: true,
          annotationType: rawMeasurement.toolName,
          imageId: rawMeasurement.referencedImageId,
          frameNumber: rawMeasurement.frameNumber,
          metadata: {
            toolName: rawMeasurement.toolName,
            FrameOfReferenceUID: rawMeasurement.FrameOfReferenceUID,
          },
          handles: {
            points: rawMeasurement.points, // 3D World points for handles
            textBox: { worldPosition: rawMeasurement.textBox?.worldPosition },
            isMoving: false,
          },
          data: {
            contour: {
              points: canvasPoints, // 2D Canvas points for drawing
              polyline: canvasPoints,
            },
            cachedStats: {},
          },
        };
        
        const groupKey = annotationForRendering.metadata.FrameOfReferenceUID;
        
        console.log('Final object sent to addAnnotation:', annotationForRendering);
        
        // 3c. Add the annotation
        annotationManager.addAnnotation(annotationForRendering, groupKey);
      }
    });

    // 4. Trigger a re-render
    renderingEngine.render();
  },
  [measurementService, renderingEngineId, activeViewportId]
);
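As a sanity check on the groupKey argument in step 3c, here is a toy reconstruction of how I understand the manager to group annotations (an assumption about its internals, not the real AnnotationManager): annotations are stored under the group key passed to addAnnotation, which for FrameOfReferenceUID-based tools is the viewport's FrameOfReferenceUID.

```typescript
// Toy stand-in for the annotation manager's grouping (illustration only):
// annotations live in a map keyed by the groupKey, and are looked up
// across all groups by uid.
class ToyAnnotationStore {
  private groups = new Map<string, Map<string, { uid: string }>>();

  addAnnotation(ann: { uid: string }, groupKey: string): void {
    if (!this.groups.has(groupKey)) {
      this.groups.set(groupKey, new Map());
    }
    this.groups.get(groupKey)!.set(ann.uid, ann);
  }

  getAnnotation(uid: string): { uid: string } | undefined {
    for (const group of this.groups.values()) {
      if (group.has(uid)) return group.get(uid);
    }
    return undefined;
  }
}

const store = new ToyAnnotationStore();
store.addAnnotation({ uid: 'mockuid' }, 'example-frame-of-reference-uid');
```

If this model is right, the getAnnotation(uid) guard in step 3 should find previously added annotations regardless of which group they were stored under, so a groupKey mismatch alone would not explain the missing render.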

2. Mock Data (measurements.json)

This file provides the source geometry.

[
  {
    "uid": "mockuid",
    "toolName": "PlanarFreehandROI",
    "label": "test",
    "points": [
      // ... a very long array of [x, y, z] world coordinates ...
      [ -15.41, -28.22, 74.03 ],
      [ -15.34, -28.15, 74.03 ],
      // ... etc.
    ],
    "textBox": {
      "worldPosition": [ 250, 320, -21.62 ]
    }
  }
]
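One more sanity check worth noting: a freehand contour can only produce a path if the geometry itself is drawable. The helper below (hypothetical, not part of OHIF or Cornerstone) encodes the shape I expect each mock entry to have, namely at least three finite [x, y, z] triples:

```typescript
// Hypothetical shape check for the mock entries: every point must be a
// finite [x, y, z] triple, and a closed freehand contour needs at least
// three vertices to enclose any area.
interface MockMeasurement {
  uid: string;
  toolName: string;
  label: string;
  points: number[][];
  textBox?: { worldPosition: number[] };
}

function hasDrawableGeometry(m: MockMeasurement): boolean {
  return (
    m.points.length >= 3 &&
    m.points.every(p => p.length === 3 && p.every(Number.isFinite))
  );
}

const sample: MockMeasurement = {
  uid: 'mockuid',
  toolName: 'PlanarFreehandROI',
  label: 'test',
  points: [
    [-15.41, -28.22, 74.03],
    [-15.34, -28.15, 74.03],
    [-15.2, -28.0, 74.03],
  ],
  textBox: { worldPosition: [250, 320, -21.62] },
};
```

The full mock file passes this check, so I believe the geometry itself is not the problem.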

The Problem:

Although annotationManager.addAnnotation() completes without errors and the annotationForRendering object appears to contain all the required data, the corresponding <path> element is never created in the viewer's SVG layer. By contrast, manually drawing a PlanarFreehandROI works perfectly and generates the expected SVG.

What missing step or subtle configuration could be preventing the PlanarFreehandROITool from rendering the programmatic annotation? Should I be calling a function other than annotationManager.addAnnotation for this purpose, or is there additional state that needs to be set?