I have a Xamarin Forms app where a picture is taken with the camera, and I want to check whether there are any faces in it using the Azure Face API.
The picture is taken with the Xamarin Community Toolkit CameraView, and I can successfully display it in an Image. The CameraView's MediaCaptured event provides both an Image and a byte[] of the captured picture.
I have checked that the byte[] actually contains the image by testing it like this:

```
img_selfie.Source = ImageSource.FromStream(() => new MemoryStream(_vm.SelfieImageData));
```

which then displays the image correctly.
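As an extra sanity check, I could also confirm the buffer holds an encoded image file (the Face API expects JPEG/PNG/BMP/GIF data rather than raw pixels). A minimal sketch, assuming the capture is a JPEG (`LooksLikeJpeg` is just an illustrative helper name, not something in my code):

```
// Hypothetical check: a JPEG file starts with the bytes 0xFF 0xD8.
bool LooksLikeJpeg(byte[] data) =>
    data != null && data.Length > 2 && data[0] == 0xFF && data[1] == 0xD8;
```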
However, when I pass the same byte[] into the Azure Face API, it always returns no faces:

```
var faceClient = new FaceClient(new ApiKeyServiceClientCredentials(key));
faceClient.Endpoint = endpoint;
var faces = await faceClient.Face.DetectWithStreamAsync(new MemoryStream(_vm.SelfieImageData));
```
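To make the flow clearer, here is the same call pulled out into a small helper. This is only a sketch of what I am doing (`FaceDetectionSketch` and `CountFacesAsync` are illustrative names, not code from my app); the SDK is the Microsoft.Azure.CognitiveServices.Vision.Face NuGet package:

```
using System.Collections.Generic;
using System.IO;
using System.Threading.Tasks;
using Microsoft.Azure.CognitiveServices.Vision.Face;
using Microsoft.Azure.CognitiveServices.Vision.Face.Models;

static class FaceDetectionSketch
{
    public static async Task<int> CountFacesAsync(byte[] imageData, string key, string endpoint)
    {
        var faceClient = new FaceClient(new ApiKeyServiceClientCredentials(key))
        {
            Endpoint = endpoint
        };

        using (var stream = new MemoryStream(imageData))
        {
            // DetectWithStreamAsync expects an encoded image file (JPEG, PNG, GIF or BMP).
            IList<DetectedFace> faces = await faceClient.Face.DetectWithStreamAsync(stream);
            return faces.Count; // always 0 in my case
        }
    }
}
```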
In the CameraView page:

```
void CameraView_MediaCaptured(object sender, MediaCapturedEventArgs e)
{
    // previewPicture.Source = e.Image;
    _vm.SelfieImage = e.Image;          // ImageSource used for display
    _vm.Rotation = e.Rotation;
    _vm.SelfieImageData = e.ImageData;  // byte[] later passed to the Face API
    Navigation.PopModalAsync();
}

void btn_TakePhoto_Clicked(System.Object sender, System.EventArgs e)
{
    cameraView.Shutter();
}
```
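For context, the view model the pages bind to looks roughly like this (a sketch; `SelfieViewModel` is just an illustrative name, the property names match the snippets above, and the types are simply what I assign into them):

```
public class SelfieViewModel
{
    public Xamarin.Forms.ImageSource SelfieImage { get; set; } // from e.Image
    public double Rotation { get; set; }                       // from e.Rotation
    public byte[] SelfieImageData { get; set; }                // from e.ImageData, sent to the Face API
}
```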
Please could you suggest what could be going wrong here?