View in #cornerstone3d on Slack
@Julian_Linares: Hello! I’m currently working with ROI annotations, and I’ve noticed that sending these annotations to the server can consume a massive amount of memory (up to 5 GB in some cases), even when the server simply receives the event with the annotation object and does no processing.
The memory usage seems to depend heavily on the image resolution and the size of the ROI. Smaller images or ROIs still consume a lot of memory but remain within a more manageable range.
Is there a better or recommended way to send these annotations to the server for storage?
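A minimal sketch of the kind of handler being discussed, assuming the annotation is read from the ANNOTATION_COMPLETED event and posted as JSON (the /annotations endpoint and the direct JSON.stringify are illustrative, not the actual setup):

```ts
import { eventTarget } from '@cornerstonejs/core';
import { Enums } from '@cornerstonejs/tools';

// Forward every completed annotation to the server as-is.
// With large images or large ROIs the serialized annotation object
// can be enormous, which is where the memory spike comes from.
eventTarget.addEventListener(Enums.Events.ANNOTATION_COMPLETED, (evt) => {
  const { annotation } = (evt as CustomEvent).detail;
  fetch('/annotations', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(annotation),
  });
});
```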
@Bill_Wallace: There are several methods of compressing the annotations, but they aren’t well supported by OHIF yet
The new segmentation labelmap storage SOP class, when combined with JPEG-LS, compresses extremely well, and the ratio improves as the images get larger.
I don’t have sample sizes yet, but I’m guessing even small samples will be 10:1, and large ones probably 1000:1
Contour segmentations are also a lot smaller
@Alireza_Sedghi: what is a ROI annotation? you mean labelmap?
@Julian_Linares: I mean the RectangleROITool and EllipticalROITool
@Alireza_Sedghi: I see, you are sending the points too. Ignore them: set storePointData to false in the configuration of the tool.
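A minimal sketch of passing that configuration when the tools are added to a tool group (the tool group id and the exact configuration shape are assumptions):

```ts
import {
  addTool,
  RectangleROITool,
  EllipticalROITool,
  ToolGroupManager,
} from '@cornerstonejs/tools';

// Register the tools once
addTool(RectangleROITool);
addTool(EllipticalROITool);

// 'roiToolGroup' is an illustrative id; createToolGroup returns
// undefined if a group with that id already exists
const toolGroup = ToolGroupManager.createToolGroup('roiToolGroup');

// Ask the tools not to retain the per-voxel point data on the annotation
toolGroup?.addTool(RectangleROITool.toolName, { storePointData: false });
toolGroup?.addTool(EllipticalROITool.toolName, { storePointData: false });
```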
@Julian_Linares: Great, thank you
Turns out it was already set to false by default, but it looks like it's storing them anyway. Maybe "CORNERSTONE_TOOLS_ANNOTATION_COMPLETED" provides them anyway in the detail?
@Alireza_Sedghi: hmm, weird
@Julian_Linares: I assume the "points" we are talking about are the ones in pointsInShape, which go on for thousands of lines. I'll check at different times whether the toolConfiguration stays the same (which it should, because I'm never directly changing it).
@Alireza_Sedghi: yes pointsInShape
@Julian_Linares: Checking again, and pointsInShape is being stored (or at least provided in the CORNERSTONE_TOOLS_ANNOTATION_COMPLETED event, in event.detail.annotation) even with storePointData always set to false.
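Until that is resolved, a possible workaround is to strip pointsInShape out of the annotation before serializing it. A recursive key filter is just one way to do it, since the exact location of pointsInShape inside event.detail.annotation isn't pinned down here:

```ts
// Recursively drop any key named 'pointsInShape' before sending,
// so the serialized payload stays small even if the tool kept the points.
function stripPointsInShape(value: unknown): unknown {
  if (Array.isArray(value)) {
    return value.map(stripPointsInShape);
  }
  if (value !== null && typeof value === 'object') {
    return Object.fromEntries(
      Object.entries(value as Record<string, unknown>)
        .filter(([key]) => key !== 'pointsInShape')
        .map(([key, v]) => [key, stripPointsInShape(v)])
    );
  }
  return value;
}

// In the ANNOTATION_COMPLETED handler:
// const payload = JSON.stringify(stripPointsInShape(evt.detail.annotation));
```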
@Alireza_Sedghi: Can you create an issue for us
@Julian_Linares: Done, #1709