LightBlog

Wednesday, June 21, 2017

Google adds Semantic Time Support to the Awareness APIs

During last year’s I/O, Google launched the Awareness API. The API lets developers combine signals such as location, weather, time, and user activity to deliver contextual experiences. In other words, applications can consider data from multiple sources through a single API and react more accurately to the user’s situation. Apps become more “aware” of the context the phone is in without significantly draining system resources to discover that context. Now, the API is being updated with Semantic Time support, which handles the localization of time-based conditions for developers automatically.

Developers can utilize the Awareness API via Google Play services. It consists of two separate APIs designed to take advantage of context signals within the app. The Snapshot API lets the app request information about the user’s current context, while the Fence API lets the app react when the user’s context changes and matches a particular set of conditions. For example, the application can start playing music based on the condition: “tell me whenever the user is walking and their headphones are plugged in.”
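
As a rough sketch, the walking-plus-headphones condition above could be expressed with the GoogleApiClient-based Fence API from the original release. The class name, fence key, and broadcast action below are illustrative, not part of the API, and the snippet assumes a GoogleApiClient already built with Awareness.API and connected:

import android.app.PendingIntent;
import android.content.Context;
import android.content.Intent;

import com.google.android.gms.awareness.Awareness;
import com.google.android.gms.awareness.fence.AwarenessFence;
import com.google.android.gms.awareness.fence.DetectedActivityFence;
import com.google.android.gms.awareness.fence.FenceUpdateRequest;
import com.google.android.gms.awareness.fence.HeadphoneFence;
import com.google.android.gms.awareness.state.HeadphoneState;
import com.google.android.gms.common.api.GoogleApiClient;

public class WalkingWithHeadphonesFence {

    // Illustrative names; any unique key and broadcast action would do.
    private static final String FENCE_KEY = "walking_with_headphones";
    private static final String FENCE_ACTION = "com.example.FENCE_RECEIVER_ACTION";

    // Registers a fence that fires when the user is walking AND headphones are plugged in.
    public static void register(Context context, GoogleApiClient client) {
        AwarenessFence walkingWithHeadphones = AwarenessFence.and(
                DetectedActivityFence.during(DetectedActivityFence.WALKING),
                HeadphoneFence.during(HeadphoneState.PLUGGED_IN));

        PendingIntent pendingIntent = PendingIntent.getBroadcast(
                context, 0, new Intent(FENCE_ACTION), PendingIntent.FLAG_UPDATE_CURRENT);

        Awareness.FenceApi.updateFences(
                client,
                new FenceUpdateRequest.Builder()
                        .addFence(FENCE_KEY, walkingWithHeadphones, pendingIntent)
                        .build());
        // A BroadcastReceiver listening for FENCE_ACTION can then extract the
        // FenceState from the incoming Intent and start music playback.
    }
}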

Until now, developers had to rely on third-party APIs to make an app “understand” higher-level time abstractions. An app needed an absolute condition like “tomorrow at 8 AM”; expressions like “next holiday” or “in the morning” couldn’t be recognized. The latest update brings Semantic Time support: an application can now determine, for instance, when the next holiday, sunset, or sunrise is based on the device’s geographical position. Full instructions on how to use Semantic Time are available at the Android Developers Blog link provided below.
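
Based on our reading of the update, the semantic conditions are exposed as additional TimeFence helpers. The method and constant names below (inTimeInterval, TIME_INTERVAL_HOLIDAY, TIME_INTERVAL_MORNING, aroundTimeInstant, TIME_INSTANT_SUNSET) reflect our understanding of the updated TimeFence reference and should be read as a sketch rather than a definitive listing:

import com.google.android.gms.awareness.fence.AwarenessFence;
import com.google.android.gms.awareness.fence.TimeFence;

public class SemanticTimeFences {

    // TRUE during a public holiday in the user's locale; no explicit dates
    // or time zones need to be supplied by the developer.
    public static AwarenessFence duringNextHoliday() {
        return TimeFence.inTimeInterval(TimeFence.TIME_INTERVAL_HOLIDAY);
    }

    // TRUE during the locale-aware "morning" interval.
    public static AwarenessFence duringTheMorning() {
        return TimeFence.inTimeInterval(TimeFence.TIME_INTERVAL_MORNING);
    }

    // Sunset is computed from the device's location; this fence is TRUE from
    // 30 minutes before until 30 minutes after local sunset.
    public static AwarenessFence aroundSunset() {
        return TimeFence.aroundTimeInstant(
                TimeFence.TIME_INSTANT_SUNSET,
                -30L * 60 * 1000,  // start offset: 30 minutes before
                30L * 60 * 1000);  // stop offset: 30 minutes after
    }
}

Fences built this way can be combined with activity or headphone fences via AwarenessFence.and() and registered exactly as in the earlier example.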


Source: Android Developers



from xda-developers http://ift.tt/2sVtzth
via IFTTT

