What is the correct way to extract scalar data for a slice from a VolumeViewport?

Hi, I’m building an MPR viewer with an image filtering feature. To apply filters to the image, I need to extract the scalar data for the current slice.

StackViewport has a getCornerstoneImage function that can do this, but VolumeViewport doesn’t seem to have anything similar. So I tried getImageData together with the following code to extract the slice:

const getSliceDataFromVolume = (
	scalarData: number[],
	dimensions: number[],
	sliceIndex: number,
	orientation: 'axial' | 'coronal' | 'sagittal' = 'axial'
) => {
	let index = 0;
	let width = 0; // Width of output data
	let height = 0; // Height of output data

	switch (orientation) {
		case 'axial':
			index = dimensions[2] - 1 - sliceIndex; // Map slice index to actual location on volume data
			width = dimensions[0];
			height = dimensions[1];
			break;
		case 'coronal':
			index = dimensions[1] - 1 - sliceIndex;
			width = dimensions[0];
			height = dimensions[2];
			break;
		default: // sagittal
			index = sliceIndex;
			width = dimensions[1];
			height = dimensions[2];
			break;
	}

	const step = dimensions[0] * dimensions[1]; // Size of an axial slice
	const arr = [];
	switch (orientation) {
		case 'axial':
			return scalarData.slice(step * index, step * index + step);
		case 'coronal':
			for (let i = height - 1; i >= 0; i--) {
				arr.push(...scalarData.slice(index * width + i * step, width + index * width + i * step));
			}
			break;
		default: // sagittal
			for (let i = height - 1; i >= 0; i--) {
				for (let j = 0; j < width; j++) {
					arr.push(scalarData[index + j * dimensions[0] + i * step]);
				}
			}
	}
	return arr;
};
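As a quick sanity check, here is the axial branch of the logic above exercised on a tiny synthetic 2×2×2 volume (a minimal, self-contained sketch with made-up values; the variable names mirror the function's parameters, not any Cornerstone API):

```typescript
// Synthetic 2x2x2 volume with voxel values 0..7 in x-fastest ordering,
// the same layout as the scalarData returned by getImageData.
const dimensions = [2, 2, 2];
const scalarData = [0, 1, 2, 3, 4, 5, 6, 7];

// Axial slice index 0 maps to z = dimensions[2] - 1 - 0 = 1 under the
// flipping convention used in getSliceDataFromVolume above.
const sliceIndex = 0;
const zIndex = dimensions[2] - 1 - sliceIndex; // 1
const step = dimensions[0] * dimensions[1]; // voxels per axial slice: 4

const axialSlice = scalarData.slice(step * zIndex, step * zIndex + step);
// axialSlice is [4, 5, 6, 7], i.e. the z = 1 plane of the volume
```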

This does seem to work correctly, except that the intensity values I get don’t quite match what I get from getCornerstoneImage.

This is what I get from getCornerstoneImage (left), getImageData (middle), and getImageData with intensity values scaled down by 100× (right):

Upon further investigation, it seems like VolumeViewport uses a different scale compared to StackViewport, but I can’t figure out what it is. Is this intended behavior, or am I doing something wrong?

Thank you.

[Edit: Typo]

The intensities displayed on the screen have the modality lookup table (LUT) applied, derived from the DICOM metadata. This is handled in the following section:
packages/dicomImageLoader/src/shared/scaling/scaleArray.ts

import { PixelDataTypedArray } from '../../types';

export default function scaleArray(
  array: PixelDataTypedArray,
  scalingParameters
): boolean {
  const arrayLength = array.length;
  const { rescaleSlope, rescaleIntercept, suvbw } = scalingParameters;

  if (scalingParameters.modality === 'PT' && typeof suvbw === 'number') {
    for (let i = 0; i < arrayLength; i++) {
      array[i] = suvbw * (array[i] * rescaleSlope + rescaleIntercept);
    }
  } else {
    for (let i = 0; i < arrayLength; i++) {
      array[i] = array[i] * rescaleSlope + rescaleIntercept;
    }
  }

  return true;
}
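If you need the raw stored values back, the scaling applied by scaleArray can be inverted. Here is a minimal sketch, assuming you already have the rescaleSlope, rescaleIntercept, and (for PT) suvbw values from the image metadata; fetching them from the metadata provider is left out:

```typescript
// Sketch: invert the modality LUT applied by scaleArray. The parameter
// names mirror scaleArray's scalingParameters object; how you obtain
// these values from the metadata is up to your setup.
function unscaleArray(
  array: number[],
  params: { rescaleSlope: number; rescaleIntercept: number; suvbw?: number }
): number[] {
  const { rescaleSlope, rescaleIntercept, suvbw } = params;
  return array.map((v) => {
    // scaleArray computes suvbw * (raw * slope + intercept) for PT,
    // and raw * slope + intercept otherwise, so undo in reverse order.
    const scaled = typeof suvbw === 'number' ? v / suvbw : v;
    return (scaled - rescaleIntercept) / rescaleSlope;
  });
}
```

For example, with a common CT rescale of slope 1 and intercept -1024, a displayed value of -24 maps back to a stored value of 1000.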