Image streaming in camera plugin

Refer to the link to add the camera plugin to your Flutter project.

To start image streaming, call startImageStream on the camera controller. The callback you pass is invoked every time a new frame arrives.

controller.startImageStream((CameraImage img) { <YOUR CODE> });
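A fuller sketch of starting and stopping the stream is shown below. It assumes an already-initialized CameraController; the _processFrame handler is an illustrative name of your own, not part of the plugin.

```dart
import 'package:camera/camera.dart';

// Hypothetical per-frame handler -- replace with your own logic.
void _processFrame(CameraImage img) {
  // Inspect or convert the frame here.
}

// `controller` is assumed to be an initialized CameraController.
void startStreaming(CameraController controller) {
  controller.startImageStream((CameraImage img) {
    // Frames arrive rapidly; keep this callback lightweight, or drop
    // frames while a previous one is still being processed.
    _processFrame(img);
  });
}

Future<void> stopStreaming(CameraController controller) async {
  // Stop the stream before disposing the controller if it is active.
  await controller.stopImageStream();
}
```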

The CameraImage class passed to the callback has four members: the image format, height, width, and finally planes, which hold the raw bytes of the image.

class CameraImage {
  final ImageFormat format;
  final int height;
  final int width;
  final List<Plane> planes;
}
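To see these members in practice, a minimal inspection callback might look like this (a sketch; the plane count differs by platform, as described below):

```dart
controller.startImageStream((CameraImage img) {
  debugPrint('format group: ${img.format.group}');
  debugPrint('size: ${img.width} x ${img.height}');
  debugPrint('planes: ${img.planes.length}'); // 3 on Android (Y, U, V), 1 on iOS
  debugPrint('bytes in plane 0: ${img.planes[0].bytes.length}');
});
```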

The format of the image varies with the platforms:

Android: android.graphics.ImageFormat.YUV_420_888

iOS: kCVPixelFormatType_32BGRA (Note that in v2.8.0 the format was kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange; later, in v4.0.0, it was changed back to 32BGRA.)

Because of these different formats, the output CameraImage differs between Android and iOS:

Android: planes is a list of byte arrays for the Y, U and V planes of the image.

iOS: planes contains a single array holding the BGRA bytes of the image.
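Putting this together, a frame handler typically branches on the format group before decoding. The sketch below only gathers the raw bytes into one buffer; the actual YUV-to-RGB conversion needed before feeding most models is left out, and the extractBytes helper is an illustrative name, not a plugin API. Note that production code must also account for each plane's bytesPerRow and bytesPerPixel strides.

```dart
import 'dart:typed_data';
import 'package:camera/camera.dart';

// Illustrative helper: collect the raw bytes of a frame per platform.
Uint8List? extractBytes(CameraImage img) {
  switch (img.format.group) {
    case ImageFormatGroup.yuv420:
      // Android: concatenate the Y, U and V planes.
      // (Simplified: ignores row/pixel strides, which real code must honor.)
      final total =
          img.planes.fold<int>(0, (n, plane) => n + plane.bytes.length);
      final out = Uint8List(total);
      var offset = 0;
      for (final plane in img.planes) {
        out.setAll(offset, plane.bytes);
        offset += plane.bytes.length;
      }
      return out;
    case ImageFormatGroup.bgra8888:
      // iOS: a single plane of BGRA bytes.
      return img.planes[0].bytes;
    default:
      return null; // Unsupported format.
  }
}
```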

Knowing the format is important for properly decoding the image and feeding it to TensorFlow Lite.