Can't stop thinking about the centralization going on here: even if you avoid Google's own apps, any app could be feeding your information back to Google.
Too bad "Ingredients" labels like the ones on food packaging are impractical in the app space. But maybe Apple could highlight key APIs and net connections that it finds in use during app review. Something like, "WARNING: This app makes silent, background connections to [Google, Facebook, etc etc]"
Like In App Purchase warnings, but for background net connections.
I really like this idea. While it would be a bit difficult for Apple to implement and for the average consumer to understand, it would still further their goal of providing privacy to their users. Plus it would help consumers decide which services they want to give their information and analytics data to, which is pretty powerful.
This landmark tracking on mobile is new for TensorFlow. The benchmark here is dlib, which implements the "One Millisecond Face Alignment with an Ensemble of Regression Trees" paper by Kazemi and Sullivan.
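For anyone curious, the core of the Kazemi & Sullivan method is a cascade of regressors that repeatedly refine a shape estimate: S_{t+1} = S_t + r_t(I, S_t), starting from a mean shape. Here's a deliberately toy, pure-Python sketch of just that cascade structure; the synthetic "feature" (a noisy view of the residual) and all the constants are made up for illustration, standing in for the paper's pixel-intensity-difference features and gradient-boosted regression trees.

```python
import random

# Toy sketch of the cascaded regression behind Kazemi & Sullivan's paper
# (the method dlib implements). The real method fits an ensemble of
# regression trees over pixel-difference features; here each stage is a
# single scalar least-squares gain over a synthetic "feature", just to
# show the cascade structure S_{t+1} = S_t + r_t(I, S_t).
random.seed(0)

N_COORDS = 10    # 5 toy landmarks as (x, y) pairs -> 10 coordinates
N_SAMPLES = 200
N_STAGES = 4

# Synthetic ground-truth shapes; every estimate starts at the mean shape,
# much as real face aligners initialise from a mean face.
true_shapes = [[random.gauss(0, 1) for _ in range(N_COORDS)]
               for _ in range(N_SAMPLES)]
mean_shape = [sum(s[i] for s in true_shapes) / N_SAMPLES
              for i in range(N_COORDS)]
shapes = [list(mean_shape) for _ in range(N_SAMPLES)]

def feature(est, truth):
    # Stand-in for a shape-indexed image feature: a noisy view of the
    # residual (a real regressor only sees pixels, not the ground truth).
    return (truth - est) + random.gauss(0, 0.3)

def mean_error(est):
    return sum(abs(a - b) for e, t in zip(est, true_shapes)
               for a, b in zip(e, t)) / (N_SAMPLES * N_COORDS)

err_start = mean_error(shapes)
for _ in range(N_STAGES):
    # "Train" one cascade stage: fit a scalar gain w by least squares on
    # (feature, residual) pairs, then advance every shape estimate.
    feats = [[feature(a, b) for a, b in zip(e, t)]
             for e, t in zip(shapes, true_shapes)]
    xs = [x for f in feats for x in f]
    ys = [b - a for e, t in zip(shapes, true_shapes)
          for a, b in zip(e, t)]
    w = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)
    shapes = [[a + w * x for a, x in zip(e, f)]
              for e, f in zip(shapes, feats)]

err_end = mean_error(shapes)
print(f"mean landmark error: {err_start:.3f} -> {err_end:.3f}")
```

Each stage shrinks the residual, which is why the real thing can afford such cheap per-stage regressors and still converge in about a millisecond.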
Apple seems to use both, but they've been leaning towards Core* rather than *Core recently. We have CoreAudio, CoreBluetooth, CoreFoundation, CoreGraphics, CoreImage, CoreLocation, CoreText…but there's also WebCore, JavaScriptCore, QuartzCore, and ImageCaptureCore.
Why do they provide another API for offline face detection when Google Play Services already comes with one?
It would be great if apps that use the existing offline face detection were just switched to the better model, without needing to change any app code.
And if the problem is that some devices can't run the new models, it would be better for that to be handled by the API provider than by each app developer.
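The pattern being suggested is basically a provider-side factory: the detection API picks the best model the device can run, so app code never changes. A minimal sketch, with the caveat that every name here (FaceDetector, supports_new_model, the api_level check) is invented for illustration and is not the actual Google Play Services API:

```python
# Hypothetical sketch: the API provider, not the app, decides which
# face-detection model a device gets. All class and function names here
# are made up for illustration.

class LegacyModel:
    def detect(self, image):
        return {"faces": [], "model": "legacy"}

class NewModel:
    def detect(self, image):
        return {"faces": [], "model": "new"}

def supports_new_model(device):
    # Stand-in capability check; a real provider might look at the OS
    # version, available accelerators, memory, etc.
    return device.get("api_level", 0) >= 28

class FaceDetector:
    """The only thing app code touches; model choice is hidden inside."""

    def __init__(self, device):
        self._model = NewModel() if supports_new_model(device) else LegacyModel()

    def detect(self, image):
        return self._model.detect(image)

# Identical app code on both devices; only the backing model differs.
old_device = FaceDetector({"api_level": 24})
new_device = FaceDetector({"api_level": 29})
print(old_device.detect(None)["model"], new_device.detect(None)["model"])
```

That's the appeal of the idea: shipping a better model becomes a provider-side upgrade, and devices that can't run it silently keep the old one.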