Session Details

Integrating Sensor Data into Mobile Applications To Make Them Smarter  

Level:
Date: 9:45 AM Saturday
Room:
Interested: (69) - Registered: (-)


In this session you will learn how and why adding "contextual awareness" to mobile applications is critical to user engagement. You will learn how to integrate powerful contextual signals, such as user-activity recognition, location, and environmental data, into any iOS or Android application. Whether you are working on a new app or simply want to augment a current one, several developers will work directly with you on your app and get right into the code! You will learn how to integrate sensor data, data from storage on your smartphone, and a multitude of web services into your app to make it more "contextually aware." We will review common use cases and the sample apps that come with the SDK, and focus on what can be done with context in your applications.
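To give a flavor of what "user-activity recognition" from sensor data can look like, here is a minimal, self-contained sketch. The class name, thresholds, and rule-based logic below are illustrative assumptions, not the TRNQL SDK API; a real app would read accelerometer samples from platform APIs (e.g. `SensorManager` on Android or Core Motion on iOS) and would typically use a trained classifier rather than fixed thresholds.

```java
import java.util.List;

// Hypothetical sketch: classify coarse user activity from raw
// accelerometer samples. Not the session's SDK; for illustration only.
public class ContextSketch {

    // Each sample is {x, y, z} acceleration in m/s^2, gravity included.
    // Returns a coarse activity label from the average vector magnitude.
    static String classifyActivity(List<double[]> samples) {
        double sum = 0;
        for (double[] s : samples) {
            sum += Math.sqrt(s[0] * s[0] + s[1] * s[1] + s[2] * s[2]);
        }
        double avg = sum / samples.size();
        // Thresholds are made up for illustration; a device at rest
        // reads roughly 9.8 m/s^2 (gravity alone).
        if (avg < 10.5) return "still";
        if (avg < 14.0) return "walking";
        return "running";
    }

    public static void main(String[] args) {
        List<double[]> resting = List.of(
            new double[] {0.0, 0.0, 9.8},
            new double[] {0.1, 0.0, 9.8});
        System.out.println(classifyActivity(resting)); // prints "still"
    }
}
```

In a production app this label would be one contextual signal among several (location, time of day, environmental data) that the app fuses to adapt its behavior.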

The Speaker(s)


Nazmul Idris

Nazmul Idris is a veteran technologist, former Googler, and most recently founder and CEO of TRNQL. As a multifaceted engineer, designer, and communicator, Nazmul has focused on creating amazing user experiences across platforms: cloud-connected, mobile, tablet, web, and wearable devices; experiences that rely on telemetry data from personal sensors (biometric, location, environmental, input, etc.) and that can leverage a wide array of interaction devices (phones, wearable computers, projectors, etc.). Nazmul has been a frequent speaker and presenter at Google I/O, has led many Google workshops at I/O, founded and moderated the UX Design for Developers communities on G+, and created the UX Design class on Udacity.