[camera_android] prevent startImageStream OOM error when main thread hangs (flutter#166533) #8998


Open · wants to merge 9 commits into main
4 changes: 4 additions & 0 deletions packages/camera/camera_android/CHANGELOG.md
@@ -1,3 +1,7 @@
## 0.10.10+3

* Fix flutter#166533 - prevent startImageStream OOM error when main thread paused.

## 0.10.10+2

* Don't set the FPS range unless video recording. It can cause dark image previews on some devices because the auto exposure algorithm is more constrained after fixing the min/max FPS range to the same value. This change has the side effect that providing the `fps` parameter will not affect the camera preview when not video recording. If you need a lower frame rate in your image streaming handler, you can skip frames by checking the time that has passed since the last frame.
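The frame-skipping tip above can be sketched as a small helper. This is illustrative only; `FrameThrottler` is not part of the plugin API, and a real handler would feed it a monotonic clock such as `SystemClock.uptimeMillis()`:

```java
// Illustrative helper for the changelog tip: accept only enough frames to hit
// a target FPS by checking the time elapsed since the last accepted frame.
public class FrameThrottler {
    private final long minIntervalMs;
    private long lastFrameMs;

    public FrameThrottler(int targetFps) {
        this.minIntervalMs = 1000L / targetFps;
        // Start so that the very first frame (at a non-negative timestamp) passes.
        this.lastFrameMs = -minIntervalMs;
    }

    /** Returns true if the frame arriving at nowMs should be processed. */
    public boolean shouldProcess(long nowMs) {
        if (nowMs - lastFrameMs < minIntervalMs) {
            return false; // too soon since the last accepted frame: skip it
        }
        lastFrameMs = nowMs;
        return true;
    }
}
```

The image-stream callback would call `shouldProcess(...)` and simply return early for skipped frames.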
@@ -9,8 +9,10 @@
import android.media.ImageReader;
import android.os.Handler;
import android.os.Looper;
import android.util.Log;
import android.view.Surface;
import androidx.annotation.NonNull;
import androidx.annotation.Nullable;
import androidx.annotation.VisibleForTesting;
import io.flutter.plugin.common.EventChannel;
import io.flutter.plugins.camera.types.CameraCaptureProperties;
@@ -22,6 +24,8 @@

// Wraps an ImageReader to allow for testing of the image handler.
public class ImageStreamReader {
private static final String TAG = "ImageStreamReader";
@VisibleForTesting public static final int MAX_IMAGES_IN_TRANSIT = 3;

/**
* The image format we are going to send back to dart. Usually it's the same as streamImageFormat
@@ -33,6 +37,17 @@ public class ImageStreamReader {
private final ImageReader imageReader;
private final ImageStreamReaderUtils imageStreamReaderUtils;

@VisibleForTesting(otherwise = VisibleForTesting.NONE)
@Nullable
public Handler handler;

/**
* This solves a memory issue: when the main thread hangs (e.g. when pausing the debugger) while
* many frames are being sent using android.os.Handler.post(), an OOM error occurs rather
* quickly. To avoid this, we apply some back-pressure.
*/
@VisibleForTesting public int numImagesInTransit = 0;

/**
* Creates a new instance of the {@link ImageStreamReader}.
*
@@ -95,40 +110,55 @@ public void onImageAvailable(
@NonNull Image image,
@NonNull CameraCaptureProperties captureProps,
@NonNull EventChannel.EventSink imageStreamSink) {
try {
Map<String, Object> imageBuffer = new HashMap<>();
// The limit was chosen so it would not drop frames for reasonable lags of the main thread.
if (numImagesInTransit >= ImageStreamReader.MAX_IMAGES_IN_TRANSIT) {
Log.d(TAG, "Dropping frame due to images pending on main thread.");
image.close();
return;
}

Map<String, Object> imageBuffer = new HashMap<>();

imageBuffer.put("width", image.getWidth());
imageBuffer.put("height", image.getHeight());
try {
// Get plane data ready
if (dartImageFormat == ImageFormat.NV21) {
imageBuffer.put("planes", parsePlanesForNv21(image));
} else {
imageBuffer.put("planes", parsePlanesForYuvOrJpeg(image));
}

imageBuffer.put("width", image.getWidth());
imageBuffer.put("height", image.getHeight());
imageBuffer.put("format", dartImageFormat);
imageBuffer.put("lensAperture", captureProps.getLastLensAperture());
imageBuffer.put("sensorExposureTime", captureProps.getLastSensorExposureTime());
Integer sensorSensitivity = captureProps.getLastSensorSensitivity();
imageBuffer.put(
"sensorSensitivity", sensorSensitivity == null ? null : (double) sensorSensitivity);

final Handler handler = new Handler(Looper.getMainLooper());
handler.post(() -> imageStreamSink.success(imageBuffer));
image.close();

} catch (IllegalStateException e) {
// Handle "buffer is inaccessible" errors that can happen on some devices from ImageStreamReaderUtils.yuv420ThreePlanesToNV21()
final Handler handler = new Handler(Looper.getMainLooper());
// Handle "buffer is inaccessible" errors that can happen on some devices from
// ImageStreamReaderUtils.yuv420ThreePlanesToNV21()
final Handler handler =
this.handler != null ? this.handler : new Handler(Looper.getMainLooper());
handler.post(
() ->
imageStreamSink.error(
Contributor (@reidbaker):

Shouldn't this also have --numImagesInTransit? Otherwise, if more than 3 images hit this error path, we would no longer update any frames.

Author (@kwikwag), Apr 14, 2025:
@reidbaker Thanks for these questions. I'll address the issues one-by-one

  1. As opposed to how I initially phrased the issue, this will not only happen when the debugger is paused, but any time the main loop is lagging. I see no benefit to the 'consecutive' method:

    • I'm not sure what you mean by an "off-by-one set of conditions." If the main thread is lagging by one frame, it will do so regardless of this fix. Dropping frames will only happen when at least two frames have already been sent to the handler but have not yet been handled. Being less familiar with Flutter/Android internals, I'm not sure if the EventSink.success() call can raise an exception. If it can, it would perhaps be wise to wrap the call in a try..finally. However, I would not move the decrement before the success() call, as the decrement should happen when the image memory can readily be released; otherwise a queue build-up might still occur.
    • Resetting the number of frames whenever any frame reaches the main thread can have an adverse effect, too. Imagine a scenario where a frame is generated every 10ms, but handled every 20ms on the main thread itself. You will start accumulating an unbounded number of frames pending on post() and eventually crash the app with an out-of-memory error. Ensuring that post() arrives and completes is exactly the queue capability you are talking about, as far as I understand.
  2. Regarding not updating frames anymore - the only scenario I see that you will no longer update frames in, is when the handler stops handling the Runnables passed to post(). If that is the case, it means the main thread is paused or hanging, in which case it should be OK to not update the frames until it releases and the Runnable gets invoked. I see no risk of deadlock, as AFAIK there is no way for the main thread to wait on whatever calls onImageAvailable().
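The 10ms/20ms scenario in the bullets above can be checked with a tiny simulation (an illustrative sketch, not plugin code): without a cap the backlog grows by one frame every 20ms, while a MAX_IMAGES_IN_TRANSIT-style cap pins it at the cap.

```java
// Simulates frames produced every 10 ms and consumed every 20 ms, with a cap
// on how many frames may be pending at once (use Integer.MAX_VALUE for
// "uncapped"). Returns the backlog after the given number of milliseconds.
public class BacklogGrowth {
    public static int backlogAfter(int millis, int cap) {
        int backlog = 0;
        for (int t = 0; t < millis; t++) {
            if (t % 10 == 0 && backlog < cap) backlog++; // producer fires
            if (t % 20 == 0 && backlog > 0) backlog--;   // consumer fires
        }
        return backlog;
    }
}
```

After one simulated second, the uncapped backlog has reached 50 frames and is still growing linearly, while a cap of 3 holds it at 3.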

How long will it be before @camsim99 comes back?

Contributor:
My two cents:

  • @reidbaker can you clarify the off-by-one issue? I'm also a bit confused there.
  • Based on my current understanding of things (off-by-one issue aside), I agree with @kwikwag that we should not reset numImagesInTransit back to 0 when a frame is delivered. My understanding is that if we do, then we would no longer only be waiting on 3 images to accumulate to start closing images, but instead, would be waiting on 3 then 6 then potentially 9 and so on and so forth.
  • I agree with @reidbaker that in the case this comment is linked to (the case where an IllegalStateException is thrown while streaming an image), we should also decrement numImagesInTransit, because we have essentially "handled" that image.

Contributor:
My worry is that over time, either changes to the behavior of Android or to the camera code we rely on will cause us to not decrement at the same rate we increment, and cause the camera to break for users.

The weak reference suggestion would mitigate that I think.

Contributor:
@kwikwag Thank you for the clear explanation on the weak reference solution! Considering the concern of this PR changing behavior, I am for pursuing that solution instead of this one. What do you think?

Contributor:
@kwikwag We're leaning towards the weak reference solution based on this discussion. What do you think?
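For context, the weak-reference alternative under discussion could look roughly like this (a hedged sketch of the proposal, not code from this PR; `Sink` and `makeRunnable` are illustrative names): instead of capping a counter, the posted buffer is held only weakly, so a hung main thread leaves the garbage collector free to reclaim pending frames.

```java
import java.lang.ref.WeakReference;
import java.util.Map;

public class WeakPostSketch {
    /** Illustrative stand-in for EventChannel.EventSink. */
    interface Sink {
        void success(Object data);
    }

    /**
     * Builds the runnable to hand to Handler.post(). The buffer is referenced
     * only weakly, so if the main thread hangs, pending buffers become
     * collectable instead of accumulating toward an OOM.
     */
    static Runnable makeRunnable(Map<String, Object> imageBuffer, Sink sink) {
        WeakReference<Map<String, Object>> ref = new WeakReference<>(imageBuffer);
        return () -> {
            Map<String, Object> buffer = ref.get();
            if (buffer != null) {
                sink.success(buffer); // buffer survived: deliver it
            }
            // else: the GC reclaimed the buffer while the main thread hung,
            // so the frame is silently dropped.
        };
    }
}
```

The trade-off is that frame dropping then depends on GC timing rather than on an explicit, testable limit.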

"IllegalStateException",
"Caught IllegalStateException: " + e.getMessage(),
null));
} finally {
image.close();
}

imageBuffer.put("format", dartImageFormat);
imageBuffer.put("lensAperture", captureProps.getLastLensAperture());
imageBuffer.put("sensorExposureTime", captureProps.getLastSensorExposureTime());
Integer sensorSensitivity = captureProps.getLastSensorSensitivity();
imageBuffer.put(
"sensorSensitivity", sensorSensitivity == null ? null : (double) sensorSensitivity);

final Handler handler =
this.handler != null ? this.handler : new Handler(Looper.getMainLooper());
++numImagesInTransit;
boolean postResult =
handler.post(
() -> {
imageStreamSink.success(imageBuffer);
--numImagesInTransit;
});
}
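The back-pressure scheme in the diff above (numImagesInTransit plus MAX_IMAGES_IN_TRANSIT) can be condensed into a standalone sketch, simplified from the PR's code, with a plain queue standing in for Handler.post():

```java
import java.util.ArrayDeque;
import java.util.Queue;

// Minimal sketch of the PR's back-pressure: count frames handed to the main
// thread, and drop new frames while MAX_IMAGES_IN_TRANSIT are still pending.
public class BackPressureSketch {
    static final int MAX_IMAGES_IN_TRANSIT = 3;

    int numImagesInTransit = 0;
    int delivered = 0;
    int dropped = 0;

    // Stand-in for Handler.post(): runnables queue up until the simulated
    // main thread gets around to running them.
    final Queue<Runnable> mainThreadQueue = new ArrayDeque<>();

    void onFrame() {
        if (numImagesInTransit >= MAX_IMAGES_IN_TRANSIT) {
            ++dropped; // back-pressure: drop the frame instead of queueing it
            return;
        }
        ++numImagesInTransit;
        mainThreadQueue.add(() -> {
            ++delivered;
            --numImagesInTransit; // decrement once the frame is consumed
        });
    }

    void drainMainThread() {
        Runnable r;
        while ((r = mainThreadQueue.poll()) != null) {
            r.run();
        }
    }
}
```

While the "main thread" is stalled, at most MAX_IMAGES_IN_TRANSIT runnables (and their buffers) are ever pending; every frame beyond that is dropped and its Image can be closed immediately.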

@@ -9,15 +9,20 @@
import static org.mockito.ArgumentMatchers.anyInt;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.never;
import static org.mockito.Mockito.times;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;

import android.graphics.ImageFormat;
import android.media.Image;
import android.media.ImageReader;
import android.os.Handler;
import io.flutter.plugin.common.EventChannel;
import io.flutter.plugins.camera.types.CameraCaptureProperties;
import java.nio.ByteBuffer;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.robolectric.RobolectricTestRunner;
@@ -61,48 +66,18 @@ public void onImageAvailable_parsesPlanesForNv21() {
when(mockImageStreamReaderUtils.yuv420ThreePlanesToNV21(any(), anyInt(), anyInt()))
.thenReturn(mockBytes);

// The image format as streamed from the camera
int imageFormat = ImageFormat.YUV_420_888;

// Mock YUV image
Image mockImage = mock(Image.class);
when(mockImage.getWidth()).thenReturn(1280);
when(mockImage.getHeight()).thenReturn(720);
when(mockImage.getFormat()).thenReturn(imageFormat);

// Mock planes. YUV images have 3 planes (Y, U, V).
Image.Plane planeY = mock(Image.Plane.class);
Image.Plane planeU = mock(Image.Plane.class);
Image.Plane planeV = mock(Image.Plane.class);

// Y plane is width*height
// Row stride is generally == width but when there is padding it will
// be larger. The numbers in this example are from a Vivo V2135 on 'high'
// setting (1280x720).
when(planeY.getBuffer()).thenReturn(ByteBuffer.allocate(1105664));
when(planeY.getRowStride()).thenReturn(1536);
when(planeY.getPixelStride()).thenReturn(1);

// U and V planes are always the same sizes/values.
// https://developer.android.com/reference/android/graphics/ImageFormat#YUV_420_888
when(planeU.getBuffer()).thenReturn(ByteBuffer.allocate(552703));
when(planeV.getBuffer()).thenReturn(ByteBuffer.allocate(552703));
when(planeU.getRowStride()).thenReturn(1536);
when(planeV.getRowStride()).thenReturn(1536);
when(planeU.getPixelStride()).thenReturn(2);
when(planeV.getPixelStride()).thenReturn(2);

// Add planes to image
Image.Plane[] planes = {planeY, planeU, planeV};
when(mockImage.getPlanes()).thenReturn(planes);
// Note: the code for getImage() was previously inlined, with uSize set to one less than
// getImage() calculates (see function implementation)
Image mockImage = ImageStreamReaderTestUtils.getImage(1280, 720, 256, ImageFormat.YUV_420_888);

CameraCaptureProperties mockCaptureProps = mock(CameraCaptureProperties.class);
EventChannel.EventSink mockEventSink = mock(EventChannel.EventSink.class);
imageStreamReader.onImageAvailable(mockImage, mockCaptureProps, mockEventSink);

// Make sure we processed the frame with parsePlanesForNv21
verify(mockImageStreamReaderUtils)
.yuv420ThreePlanesToNV21(planes, mockImage.getWidth(), mockImage.getHeight());
.yuv420ThreePlanesToNV21(
mockImage.getPlanes(), mockImage.getWidth(), mockImage.getHeight());
}

/** If we are requesting YUV420, then we should send the 3-plane image as it is. */
@@ -120,40 +95,9 @@ public void onImageAvailable_parsesPlanesForYuv420() {
when(mockImageStreamReaderUtils.yuv420ThreePlanesToNV21(any(), anyInt(), anyInt()))
.thenReturn(mockBytes);

// The image format as streamed from the camera
int imageFormat = ImageFormat.YUV_420_888;

// Mock YUV image
Image mockImage = mock(Image.class);
when(mockImage.getWidth()).thenReturn(1280);
when(mockImage.getHeight()).thenReturn(720);
when(mockImage.getFormat()).thenReturn(imageFormat);

// Mock planes. YUV images have 3 planes (Y, U, V).
Image.Plane planeY = mock(Image.Plane.class);
Image.Plane planeU = mock(Image.Plane.class);
Image.Plane planeV = mock(Image.Plane.class);

// Y plane is width*height
// Row stride is generally == width but when there is padding it will
// be larger. The numbers in this example are from a Vivo V2135 on 'high'
// setting (1280x720).
when(planeY.getBuffer()).thenReturn(ByteBuffer.allocate(1105664));
when(planeY.getRowStride()).thenReturn(1536);
when(planeY.getPixelStride()).thenReturn(1);

// U and V planes are always the same sizes/values.
// https://developer.android.com/reference/android/graphics/ImageFormat#YUV_420_888
when(planeU.getBuffer()).thenReturn(ByteBuffer.allocate(552703));
when(planeV.getBuffer()).thenReturn(ByteBuffer.allocate(552703));
when(planeU.getRowStride()).thenReturn(1536);
when(planeV.getRowStride()).thenReturn(1536);
when(planeU.getPixelStride()).thenReturn(2);
when(planeV.getPixelStride()).thenReturn(2);

// Add planes to image
Image.Plane[] planes = {planeY, planeU, planeV};
when(mockImage.getPlanes()).thenReturn(planes);
// Note: the code for getImage() was previously inlined, with uSize set to one less than
// getImage() calculates (see function implementation)
Image mockImage = ImageStreamReaderTestUtils.getImage(1280, 720, 256, ImageFormat.YUV_420_888);

CameraCaptureProperties mockCaptureProps = mock(CameraCaptureProperties.class);
EventChannel.EventSink mockEventSink = mock(EventChannel.EventSink.class);
@@ -162,4 +106,98 @@ public void onImageAvailable_parsesPlanesForYuv420() {
// Make sure we processed the frame with parsePlanesForYuvOrJpeg
verify(mockImageStreamReaderUtils, never()).yuv420ThreePlanesToNV21(any(), anyInt(), anyInt());
}

@Test
public void onImageAvailable_dropFramesWhenHandlerHalted() {
final int numExtraFramesPerBatch = ImageStreamReader.MAX_IMAGES_IN_TRANSIT * 2;
final int numFramesPerBatch = ImageStreamReader.MAX_IMAGES_IN_TRANSIT + numExtraFramesPerBatch;

int dartImageFormat = ImageFormat.YUV_420_888;
final List<Runnable> runnables = new ArrayList<Runnable>();

ImageReader mockImageReader = mock(ImageReader.class);
ImageStreamReaderUtils mockImageStreamReaderUtils = mock(ImageStreamReaderUtils.class);
ImageStreamReader imageStreamReader =
new ImageStreamReader(mockImageReader, dartImageFormat, mockImageStreamReaderUtils);

Handler mockHandler = mock(Handler.class);
imageStreamReader.handler = mockHandler;

// initially, handler will simulate a hanging main looper, that only queues inputs
when(mockHandler.post(any(Runnable.class)))
.thenAnswer(
inputs -> {
Runnable r = inputs.getArgument(0, Runnable.class);
runnables.add(r);
return true;
});

CameraCaptureProperties mockCaptureProps = mock(CameraCaptureProperties.class);
EventChannel.EventSink mockEventSink = mock(EventChannel.EventSink.class);

// send some images whose "main-looper" callbacks won't get run, so some frames will drop
for (int i = 0; i < numFramesPerBatch; ++i) {
Image mockImage =
ImageStreamReaderTestUtils.getImage(1280, 720, 256, ImageFormat.YUV_420_888);
imageStreamReader.onImageAvailable(mockImage, mockCaptureProps, mockEventSink);

// make sure the image was closed, even when skipping frames
verify(mockImage, times(1)).close();

// frames beyond MAX_IMAGES_IN_TRANSIT are expected to be skipped
final int expectedFramesInQueue =
i < ImageStreamReader.MAX_IMAGES_IN_TRANSIT
? i + 1
: ImageStreamReader.MAX_IMAGES_IN_TRANSIT;

// check that we collected all runnables in this method
assertEquals(runnables.size(), expectedFramesInQueue);

// ensure the stream reader's count agrees
assertEquals(imageStreamReader.numImagesInTransit, expectedFramesInQueue);

// verify post() was not called more times than it should have
verify(mockHandler, times(expectedFramesInQueue)).post(any(Runnable.class));
}

// make sure callback was not yet invoked
verify(mockEventSink, never()).success(any(Map.class));

// simulate frame processing
for (Runnable r : runnables) {
r.run();
}

// make sure all callbacks were invoked so far
verify(mockEventSink, times(ImageStreamReader.MAX_IMAGES_IN_TRANSIT)).success(any(Map.class));

// switch handler to simulate a running main looper
when(mockHandler.post(any(Runnable.class)))
.thenAnswer(
input -> {
Runnable r = input.getArgument(0, Runnable.class);
r.run();
return true;
});

// send some images that will get processed by the handler
for (int i = 0; i < numFramesPerBatch; ++i) {
Image mockImage =
ImageStreamReaderTestUtils.getImage(1280, 720, 256, ImageFormat.YUV_420_888);
imageStreamReader.onImageAvailable(mockImage, mockCaptureProps, mockEventSink);

// make sure the image is closed when no frame-skipping happens
verify(mockImage, times(1)).close();

// since the handler is not "halting", each image available should cause a post(), which the
// mockHandler runs immediately, thus numImagesInTransit should remain zero.
verify(mockHandler, times(ImageStreamReader.MAX_IMAGES_IN_TRANSIT + i + 1))
.post(any(Runnable.class));
assertEquals(imageStreamReader.numImagesInTransit, 0);
}

// make sure all callbacks were invoked
verify(mockEventSink, times(ImageStreamReader.MAX_IMAGES_IN_TRANSIT + numFramesPerBatch))
.success(any(Map.class));
}
}
@@ -0,0 +1,55 @@
// Copyright 2013 The Flutter Authors. All rights reserved.
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file.

package io.flutter.plugins.camera.media;

import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import android.media.Image;
import java.nio.ByteBuffer;

public class ImageStreamReaderTestUtils {
public static Image getImage(int imageWidth, int imageHeight, int padding, int imageFormat) {
int rowStride = imageWidth + padding;

int ySize = (rowStride * imageHeight) - padding;
int uSize = (ySize / 2) - (padding / 2);
int vSize = uSize;

// Mock YUV image
Image mockImage = mock(Image.class);
when(mockImage.getWidth()).thenReturn(imageWidth);
when(mockImage.getHeight()).thenReturn(imageHeight);
when(mockImage.getFormat()).thenReturn(imageFormat);

// Mock planes. YUV images have 3 planes (Y, U, V).
Image.Plane planeY = mock(Image.Plane.class);
Image.Plane planeU = mock(Image.Plane.class);
Image.Plane planeV = mock(Image.Plane.class);

// Y plane is width*height.
// Row stride is generally == width, but when there is padding it will
// be larger; here the padding is supplied by the caller (the tests pass 256).
when(planeY.getBuffer()).thenReturn(ByteBuffer.allocate(ySize));
when(planeY.getRowStride()).thenReturn(rowStride);
when(planeY.getPixelStride()).thenReturn(1);

// U and V planes are always the same sizes/values.
// https://developer.android.com/reference/android/graphics/ImageFormat#YUV_420_888
when(planeU.getBuffer()).thenReturn(ByteBuffer.allocate(uSize));
when(planeV.getBuffer()).thenReturn(ByteBuffer.allocate(vSize));
when(planeU.getRowStride()).thenReturn(rowStride);
when(planeV.getRowStride()).thenReturn(rowStride);
when(planeU.getPixelStride()).thenReturn(2);
when(planeV.getPixelStride()).thenReturn(2);

// Add planes to image
Image.Plane[] planes = {planeY, planeU, planeV};
when(mockImage.getPlanes()).thenReturn(planes);

return mockImage;
}
}
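As a worked check of these formulas for the 1280x720, padding-256 case used by the tests: rowStride = 1536, ySize = 1536*720 - 256 = 1105664, and uSize = 1105664/2 - 128 = 552704, i.e. one more than the 552703 that was previously inlined, matching the note in the tests.

```java
// Recomputes getImage()'s plane buffer sizes for the tests' 1280x720 case.
public class BufferSizeCheck {
    public static void main(String[] args) {
        int imageWidth = 1280, imageHeight = 720, padding = 256;
        int rowStride = imageWidth + padding;            // row length incl. padding
        int ySize = (rowStride * imageHeight) - padding; // full Y plane buffer
        int uSize = (ySize / 2) - (padding / 2);         // U (and V) plane buffer
        System.out.println(rowStride + " " + ySize + " " + uSize);
        // prints: 1536 1105664 552704
    }
}
```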