Building Android Apps with Machine Learning Google API

Rajesh K      android

You may wonder why you would need to integrate machine learning into your Android apps. Here are some contexts where it is useful. Consider a situation where your users are at the office: you shouldn't notify them to do something in your Android application. To avoid disturbing them, you need to know the user's context. For example, you could ask users for their home location and store it; with the Awareness API, you can then prompt them to play your game or take some other action in your app only when they are at home.

1. Awareness API
With the Google Awareness API, you can enable your app to intelligently react to the user's current situation. The Awareness API exposes 7 different types of context, including location, weather, user activity, and nearby beacons, enabling your app to refine the user experience in new ways that weren't possible before. Your app can combine these context signals to make inferences about the user's current situation, and use this information to provide customized experiences (for example, suggesting a playlist when the user plugs in headphones and starts jogging).

Context types:
Context is at the heart of the Awareness API. Contextual data includes sensor-derived data such as location (lat/lng), place type (park, coffee shop), and activity (walking, driving). Below are some of the context signals you can get from these Google APIs.

Awareness API benefits
Easy implementation: You only need to add a single API to your app, which greatly simplifies integration and improves your productivity.
Better context data: Raw signals are processed for improved quality. For example, advanced algorithms are used to determine the user's activity with a high level of accuracy.
Optimal system health: The Awareness API automatically manages its impact on battery life and data usage, so your app doesn't have to.
The Awareness API consists of two distinct APIs which your app can use to get context signals to determine the user's current situation.
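Before using either API, your app needs the Awareness dependency and an API key. This is a minimal setup sketch; the artifact version shown is illustrative and may differ from the latest release, and the key comes from the Google Developers Console:

```
// app/build.gradle (version is an assumption; check the latest release)
dependencies {
    implementation 'com.google.android.gms:play-services-awareness:19.0.1'
}
```

You also declare the API key in AndroidManifest.xml inside the `<application>` element, as a `<meta-data>` entry with the name `com.google.android.awareness.API_KEY` and your key as the value.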

2. Fence API
In the Awareness API, the concept of "fences" is taken from geofencing, in which a geographic region, or "geofence", is defined, and an app receives callbacks when a user enters or leaves this region. The Fence API expands on the concept of geofencing to include many other context conditions in addition to geographical proximity. An app receives callbacks whenever the context state transitions. For example, if your app defines a fence for headphones, it will get callbacks when the headphones are plugged in, and when they are unplugged.

It lets your app react to the user's current situation, and provides notification when a combination of context conditions are met. For example, "tell me whenever the user is walking and their headphones are plugged in". Once a fence is registered, the Fence API can send callbacks to your app even when it's not running.
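The "walking with headphones plugged in" fence above can be sketched in code. This is a minimal sketch assuming the Play Services `FenceClient` API; the fence key, receiver action string, and tag are illustrative names, and runtime-permission checks are omitted:

```java
import android.app.PendingIntent;
import android.content.Context;
import android.content.Intent;
import android.util.Log;

import com.google.android.gms.awareness.Awareness;
import com.google.android.gms.awareness.fence.AwarenessFence;
import com.google.android.gms.awareness.fence.DetectedActivityFence;
import com.google.android.gms.awareness.fence.FenceUpdateRequest;
import com.google.android.gms.awareness.fence.HeadphoneFence;
import com.google.android.gms.awareness.state.HeadphoneState;

public class FenceRegistration {
    private static final String TAG = "FenceRegistration";
    // Illustrative names, not part of the Awareness API:
    static final String FENCE_KEY = "walkingHeadphoneKey";
    static final String FENCE_RECEIVER_ACTION = "com.example.FENCE_RECEIVER_ACTION";

    public static void registerFence(Context context) {
        // Fence is TRUE while the user is walking AND headphones are plugged in.
        AwarenessFence walkingWithHeadphones = AwarenessFence.and(
                DetectedActivityFence.during(DetectedActivityFence.WALKING),
                HeadphoneFence.during(HeadphoneState.PLUGGED_IN));

        PendingIntent pendingIntent = PendingIntent.getBroadcast(
                context, 0, new Intent(FENCE_RECEIVER_ACTION),
                PendingIntent.FLAG_UPDATE_CURRENT);

        // Register the fence; callbacks arrive even when the app is not running.
        Awareness.getFenceClient(context).updateFences(
                new FenceUpdateRequest.Builder()
                        .addFence(FENCE_KEY, walkingWithHeadphones, pendingIntent)
                        .build())
                .addOnSuccessListener(aVoid -> Log.i(TAG, "Fence registered"))
                .addOnFailureListener(e -> Log.e(TAG, "Fence registration failed", e));
    }
}
```

Your `BroadcastReceiver` can then extract the transition with `FenceState.extract(intent)` and check `getFenceKey()` and `getCurrentState()` to see whether the combined condition became true or false.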

3. Snapshot API
This API lets your app request information about the user's current context. For example, "give me the user's current location and the current weather conditions". You can use the Snapshot API to get information about the user's current environment.

Using the Snapshot API, you can access a variety of context signals:
Detected user activity, such as walking or driving.
Nearby beacons that you have registered.
Headphone state (plugged in or not).
Location, including latitude and longitude.
Place where the user is currently located.
Weather conditions in the user's current location.
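Two of these signals (headphone state and detected activity) can be queried with a minimal sketch like the following, assuming the Play Services `SnapshotClient` API. Failure listeners and runtime-permission checks (activity recognition requires a permission on newer Android versions) are omitted for brevity:

```java
import android.app.Activity;
import android.util.Log;

import com.google.android.gms.awareness.Awareness;
import com.google.android.gms.awareness.state.HeadphoneState;
import com.google.android.gms.location.DetectedActivity;

public class SnapshotExample {
    private static final String TAG = "SnapshotExample";

    public static void querySnapshot(Activity activity) {
        // Headphone state: plugged in or not.
        Awareness.getSnapshotClient(activity).getHeadphoneState()
                .addOnSuccessListener(response -> {
                    HeadphoneState state = response.getHeadphoneState();
                    boolean pluggedIn = state.getState() == HeadphoneState.PLUGGED_IN;
                    Log.i(TAG, "Headphones plugged in: " + pluggedIn);
                });

        // Detected activity: walking, driving, still, etc.
        Awareness.getSnapshotClient(activity).getDetectedActivity()
                .addOnSuccessListener(response -> {
                    DetectedActivity probable = response
                            .getActivityRecognitionResult()
                            .getMostProbableActivity();
                    Log.i(TAG, "Most probable activity: " + probable);
                });
    }
}
```

Each call returns asynchronously via the Tasks API, so the results arrive in the success listeners rather than as return values.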