Unleash your users' sensor data with Sense360
(Image Credit: iStockPhoto/PeskyMonkey)
Modern devices are packed with sensors, but few apps take advantage of the data they provide because of how complex it can be to make sense of it all. Sense360 aims to change this by enabling your applications to capture, combine, and analyse data from a variety of sources in order to create intelligent new mobile experiences.
The company's new SDK, which launched today, allows developers to access data from the phone's GPS chip, barometer, accelerometer, gyroscope, compass, and ambient light sensor. Instead of interpreting data from each source individually, Sense360 combines all of it to provide a clearer picture of what the user is doing and/or may want at that moment.
For example, Uber could tell a user has been sitting in a bar for hours, some distance from home, via GPS; that the user has now got up and walked outside, via the accelerometer; and that it's night, thanks to the ambient light sensor. Uber's app could use this data to send a notification to our friend suggesting he might want to grab a cab home (our smartphones don't yet come with breathalysers - so we're not sure of his state here.)
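Sense360's API isn't documented in this article, so as a purely illustrative sketch, here is how the bar-goer scenario above might look once the SDK has boiled each sensor stream down to a simple signal. The `suggest_ride` function and its thresholds are hypothetical; the point is that the decision comes from combining several sensors rather than reading any one of them in isolation.

```python
# Hypothetical heuristic combining sensor-derived signals into one decision.
# Function name, parameters, and thresholds are illustrative assumptions,
# not part of any real Sense360 API.

def suggest_ride(hours_at_venue, km_from_home, is_walking, ambient_lux):
    """Return True when the combined signals suggest offering a cab home.

    hours_at_venue -- dwell time at the venue, derived from GPS
    km_from_home   -- distance from the user's home, derived from GPS
    is_walking     -- motion state, inferred from the accelerometer
    ambient_lux    -- light level, from the ambient light sensor
    """
    long_stay = hours_at_venue >= 2.0   # been out a while
    far_away = km_from_home >= 1.0      # not within easy walking distance
    dark_out = ambient_lux < 10.0       # roughly night-time light levels
    return long_stay and far_away and is_walking and dark_out

# The bar-goer above: hours at a bar, now walking outside at night,
# a few kilometres from home.
print(suggest_ride(3.0, 4.2, True, 2.0))     # -> True
# Same user at midday, still sitting indoors: no notification.
print(suggest_ride(3.0, 4.2, False, 400.0))  # -> False
```

No single signal triggers the notification on its own: the GPS, accelerometer, and light sensor each veto the suggestion independently, which is the sort of cross-sensor reasoning the SDK is meant to spare developers from writing themselves.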
"We have integrated the Sense360 SDK to drive our Ambient Alerts feature. This automatically sends users a notification as soon as they drive into a gas station, informing them which of their credit cards would give them the highest rewards on that specific day and for that particular gas station," said Matthew Goldman, CEO of Wallaby Financial, a Bankrate subsidiary that develops software to help to consumers maximise credit card rewards.
Of course, this could ring alarm bells about the privacy implications of such intricate data, but Sense360 has put a lot of thought into ensuring neither party has access to data it is not supposed to have.
Using the earlier Uber example, Sense360 knows that an Uber user has gone to a bar on the Upper East Side of Manhattan - and is ready to leave - but has no idea who that user is. Uber, on the other hand, would know that the user has left the bar but would not have access to the data which the user's device's sensors have provided all day.
You would think such extensive usage of sensor data would have a big impact on battery life, but the company says this isn't the case. In fact, even if every app on your device used Sense360, it would have only around a 5% hit on your battery after a full day.
"Phones are getting smarter as more and better sensors are added, but the apps that run on them aren't," said Eli Portnoy, CEO and co-founder of Sense360. "We are living in an age where our phones have the hardware to do so much more for consumers, but the tools aren't there for software developers to take advantage of all the sensors. We are bridging that gap."
Sense360 removes the need for developers to make sense of sensor data themselves, manage the impact on battery life, and maintain the user's privacy. This frees creators to focus on what matters: building a great user experience.
For more information and to get started with the SDK, visit the Sense360 website.
Do you think it's time we took more advantage of sensor data? Let us know in the comments.