The Flex from Fitbit, released in January; the Pulse from French company Withings, announced this month; and a rumored revamp of the Nike+ FuelBand later this year all make self-tracking devices a hot market right now.
So far, these companies have been working hard to add more input variables to their devices and expand the possibilities of their ecosystems. Withings started with a smart scale and is now a full-fledged health-monitoring system covering weight, body-fat measurement, outdoor activity, sleep, heart rate, blood pressure and indoor air quality.
The consumer market for self-tracking devices rides on the quantified-self trend, which is growing fast in Europe. Products like Google Glass and smart watches like the Pebble will only push this trend further, and the quality of each company's ecosystem may determine how well it translates into hands-free access.
What if these devices could understand how I feel, what I need, what I did during the day, and what I will most likely want to do that evening? And what if this information could be passed to my home, a digital automated home? Individual tracking has the power to harness the potential of the automated home. We've seen attempts to connect all of our appliances at home, and the Internet of Things can make this a reality, but without consumer-oriented integration it will only add a layer of complexity to our already crowded tech day-to-day.
Could it be that the automated home hasn't yet taken off because of a lack of integration with our lifestyle? Is it possible that smart user insights can make the automated home make sense? I don't see digital homes becoming mainstream for the sheer purpose of digitizing our lives; they need to add value for the user without complexity. If my kitchen knows I'm low on vitamin C, it can adjust my daily food plan; if my living room knows I fell asleep on the sofa, it can dim the lights and turn off the television.
What's interesting is that what started as fitness-tracking devices has developed into powerful ecosystems of users' body information. Their databases have evolved into a valuable asset to feed our home's digital brain, and they might well be the trigger needed to shift towards a truly integrated home! Most of these ecosystems are opening their huge data sets to developers – like the RunKeeper Health Graph – because they need to become the de facto platform for the personal-tracking experience. Nike, on the other hand, runs a walled-garden ecosystem à la Apple (is that why Tim Cook wears a Nike FuelBand?). Whether a closed platform can survive in this arena over the long run, only time will tell.
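To make the open-ecosystem idea concrete, here is a minimal Python sketch of what a third-party developer (or a "home's digital brain") might do with an activity feed like the one the RunKeeper Health Graph exposes. The field names and the sample payload below are invented for illustration, loosely modeled on such a feed (distances in meters, durations in seconds); the real API requires OAuth and returns paginated JSON per user.

```python
# Hypothetical consumer of a fitness-activity feed, modeled loosely on
# the kind of JSON a platform like the RunKeeper Health Graph exposes.
# Field names here are illustrative assumptions, not the real schema.

def summarize_activities(feed):
    """Aggregate total distance (km) and duration (min) per activity type."""
    summary = {}
    for item in feed.get("items", []):
        kind = item["type"]
        totals = summary.setdefault(kind, {"km": 0.0, "minutes": 0.0})
        totals["km"] += item.get("total_distance", 0) / 1000.0   # m -> km
        totals["minutes"] += item.get("duration", 0) / 60.0      # s -> min
    return summary

# Invented sample payload standing in for a real API response:
sample_feed = {
    "size": 2,
    "items": [
        {"type": "Running", "total_distance": 5200, "duration": 1800},
        {"type": "Running", "total_distance": 3100, "duration": 1100},
    ],
}

print(summarize_activities(sample_feed))
```

A smart home could feed a summary like this into its own rules: enough running today might mean suggesting a recovery meal, or dimming the lights earlier.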
What do we need to see happening next? For one, we need these tracking devices to gain more input variables, like continuous heart-rate monitoring (without the awkward chest bands); to integrate the functions of different devices into one – imagine the Sanofi iBGStar glucose meter built into a tracking wristband; and to have more devices feeding into this ecosystem with more complex body variables – could toilets measure our body-waste composition and infer our health?
If these ecosystems are open, the potential for creating a truly intelligent and connected home is huge! The connected home needs to react to my emotions, my lifestyle, and my health. And the winner in the battle for self-tracking devices will be the one with the most complete ecosystem, with intelligent insights delivered to third-party developers. The device itself will mean nothing: all of the current devices will merge into our smart watches anyway!
When Google released the futuristic prototype of its much-heralded Google Glass in April, many called the hands-free device revolutionary and speculated on how it could change the travel game as we know it for tourists.
Think of it: now you can have your GPS right in front of your eyes without using your hands, or take pictures or video with a simple voice command. There's also the possibility of, say, getting real-time flight information as you walk to your gate, or ditching those guidebooks completely and using it as a built-in tour guide when visiting museums or historic sites. In fact, many of the icons on the current prototype's screen already map to functions used frequently by travelers, such as camera, location, search, chat and maps.
Living in New York City, I have plenty of towns a stone's throw away. I decided to head north to Sleepy Hollow in the Hudson Valley, N.Y. – a picturesque town filled with cafes, shops and historic sites, made famous by Washington Irving and his tales of the Headless Horseman.
As of now, Google Glass can do things like record video, send text messages, provide translations, and give directions. It doesn't yet have its own cellular radio, so it has to sync up with mobile phones via Bluetooth to access Wi-Fi and 3G or 4G data connections.
The trip was about 45 minutes door to door. Before I headed out, I powered up and connected Google Glass to get directions. The GPS function doesn't work with an iPhone yet, so I had to use an Android phone. When paired with a smartphone, Google Glass can use voice activation to provide maps and turn-by-turn directions, visible through a tiny lens attached to the device.
When I arrived in Sleepy Hollow, I followed the signs to the center of town and figured I’d give Google Glass another try. This time, instead of asking for directions, I used the voice command to find nearby restaurants. Jackpot! I was actually surprised at how well the voice recognition software worked. I didn’t have to repeat myself and Glass gave me a list of choices within a few miles from where I was standing.
Glass’ voice recognition can be used for just about anything: to ask a general question, get a phrase translated, find flight information, speak and send an email or text, take a picture or record a video, and share them with friends, and the list goes on. At this point, I needed to use voice recognition to find an ATM. And once again, it worked perfectly, listing several banks in the area.
While I was on a roll, I decided to try to book a hotel. Three for three: Google Glass gave me plenty of options to choose from. I would have liked to see them sorted by price, or even ranked, but the ones listed fit my criterion of being "nearby." So, after using the track pad to swipe through my choices, I picked one by tapping the track pad. Glass gave me the address, along with options to get directions or call the hotel. I tested Glass once more by tapping "call," so I could make a reservation. That worked too.
Read the full story at http://www.ecived.com/en/!