As we age, mental and physical challenges creep up, and many of us need assistance to stay healthy, mobile, and independent.
A combination of trained AI and a sensor platform installed in a mock living room, kitchen, hospital room, and retail space, the system can:
- “see” someone who has fallen (and summon help if they can’t get up),
- answer health-related questions, such as “What are the symptoms of a stroke?”, and
- scan and monitor health data: heart rate, heart rate variability, and breathing rate.
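To make the last capability concrete, here is a minimal sketch of one way a breathing rate could be estimated from a sampled chest-motion signal, by counting peaks over a time window. The data, sample rate, and peak-counting approach are all illustrative assumptions; the actual signal processing behind the system described above is not detailed in this article.

```python
import math

def estimate_rate_bpm(samples, sample_rate_hz):
    """Count local maxima above the signal mean and convert to events per minute."""
    mean = sum(samples) / len(samples)
    peaks = 0
    for i in range(1, len(samples) - 1):
        # A peak: above the mean, higher than the previous sample,
        # and at least as high as the next one.
        if samples[i] > mean and samples[i] > samples[i - 1] and samples[i] >= samples[i + 1]:
            peaks += 1
    duration_min = len(samples) / sample_rate_hz / 60.0
    return peaks / duration_min

# Synthetic 0.25 Hz breathing waveform (15 breaths/min), 60 s sampled at 10 Hz
signal = [math.sin(2 * math.pi * 0.25 * t / 10.0) for t in range(600)]
print(round(estimate_rate_bpm(signal, 10.0)))  # 15
```

The same peak-counting idea applies to heart rate, though a production system would use far more robust filtering and peak detection than this toy version.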
The lab also helps us explore open questions:
- How do people want to interact with a computer when they’re sick?
- What’s the most effective way to connect this data to third parties, such as emergency medical services?
- Where do you store a robot in someone’s home?
These capabilities come from a combination of:
- natural language processing,
- a chatbot interface for Q&A, and
- visual recognition technology (the system’s eyes),
all funneled through and analyzed in the body of the system, e.g. a SoftBank Pepper robot connected to the cloud. The result is the IBM Multi-Purpose Eldercare Robot Assistant (IBM MERA).
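As a rough illustration of how the Q&A side of such a pipeline fits together, the sketch below routes a transcribed utterance through toy intent matching to a canned answer. The FAQ entries, matching logic, and fallback message are invented for this example; the real system's cloud-hosted NLP is considerably more sophisticated.

```python
# Hypothetical FAQ mapping an intent phrase to a response.
FAQ = {
    "symptoms of a stroke": (
        "Common stroke symptoms include sudden numbness, confusion, "
        "trouble speaking, and loss of balance."
    ),
}

def detect_intent(utterance):
    """Toy intent detection: match known FAQ keys as substrings of the utterance."""
    text = utterance.lower()
    for key in FAQ:
        if key in text:
            return key
    return None

def respond(utterance):
    """Answer a recognized question, or fall back to a human caregiver."""
    intent = detect_intent(utterance)
    if intent is None:
        return "I'm not sure. Let me connect you to a caregiver."
    return FAQ[intent]

print(respond("What are symptoms of a stroke?"))
```

In a deployed system, the substring match would be replaced by a trained classifier, and the fallback path would escalate to a person rather than a static message.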
More on IBM MERA:
- The sensor network built into the lab is connected to that same toolkit of technology. With sensors in the floor, walls, ceiling, and other objects (like wearables), this cognitive agent “collective” can learn patterns of how and when elderly residents wake up in the morning, eat breakfast, exercise, or take medication, and offer verbal reminders to take a medication if they forget (or perhaps change the light color for those who are hearing impaired).
- Some of this IBM Accessibility technology is already out in the real world, including the homes of some over-65 residents of Bolzano, Italy, where we’re testing solutions with the local government to help their citizens live independently, longer.
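The “learn a routine, then remind” idea above can be sketched as a simple check: if an activity usually happens by a learned time and hasn’t been observed today, emit a reminder. The routine, data shapes, and grace period here are assumed for illustration and do not reflect the system’s actual models.

```python
from datetime import time

# Hypothetical routine learned from sensor history: activity -> usual time of day.
learned_routine = {"take_medication": time(8, 30), "eat_breakfast": time(8, 0)}

def overdue_reminders(observed_today, now, grace_minutes=30):
    """Return activities past their usual time (plus a grace period) and not yet observed."""
    reminders = []
    for activity, usual in learned_routine.items():
        deadline_minutes = usual.hour * 60 + usual.minute + grace_minutes
        now_minutes = now.hour * 60 + now.minute
        if activity not in observed_today and now_minutes > deadline_minutes:
            reminders.append(activity)
    return reminders

# Breakfast was detected by the sensors, but no medication event by 9:15.
print(overdue_reminders({"eat_breakfast"}, time(9, 15)))  # ['take_medication']
```

A reminder surfaced this way could then be delivered verbally through the robot, or as a light-color change for residents who are hearing impaired.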