My app’s face-detection UX works flawlessly in the Expo iOS client app.
I spent the last several months building an Augmented Reality app around this one feature, and the app is now complete.
Today I deployed it to TestFlight, only to discover that in a production environment the Expo Face Detector API does not seem to function at all. I’m not even receiving errors from the Camera component’s onFaceDetectionError prop handler, so I have no idea where to begin troubleshooting.
Furthermore, I can confirm that the Camera component’s onFacesDetected prop handler is never called either.
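For reference, my setup resembles the following minimal sketch (the handler bodies and settings values are placeholders, not my exact production code; this assumes the Camera and FaceDetector exports from the Expo SDK of this era):

```javascript
import React from 'react';
import { Camera, FaceDetector } from 'expo';

export default class FaceCamera extends React.Component {
  // Called with { faces: [...] } whenever faces are found -- works in the
  // Expo client, but never fires in the TestFlight build.
  handleFacesDetected = ({ faces }) => {
    console.log('faces detected:', faces.length);
  };

  // Never fires either, so there is no error to troubleshoot from.
  handleFaceDetectionError = (error) => {
    console.warn('face detection error:', error);
  };

  render() {
    return (
      <Camera
        style={{ flex: 1 }}
        onFacesDetected={this.handleFacesDetected}
        onFaceDetectionError={this.handleFaceDetectionError}
        faceDetectorSettings={{
          mode: FaceDetector.Constants.Mode.fast,
          detectLandmarks: FaceDetector.Constants.Landmarks.all,
        }}
      />
    );
  }
}
```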
Why does it work so well in the Expo iOS client app, but not in TestFlight?
(Please note that when uploading the Expo-generated IPA file to iTunes Connect, no errors are thrown; it is a clean, successful upload.)
Is it possible that it fails in TestFlight because I need to register my own Google Mobile Vision (GMV) developer account, so that API consumption is billed to me instead of Expo? If so, how would I swap out Expo’s GMV API key for my own?
Tested and failed on the following iOS devices:
iPhone 6+, iOS 11.3
iPhone 7+, iOS 11.1