AI in Healthcare Video Series Part 3: Data Driven Intelligence and Virtual Care

November 12th, 2020

The increase in virtual care – through telehealth, mobile health and other technologies – provides a perfect opportunity to use many different artificial intelligence algorithms to process data and support data-driven decision making.

At the AEHRC, our Health Internet of Things team is using sensors and home monitoring devices connected to the internet of things to support data-driven health and aged care.

Recently we licensed our Smarter Safer Homes (SSH) technology to an Australian SME to support aged care in Australia – read the CSIRO news story here. SSH is currently being used in a clinical trial with over 100 older Australians living alone, supported by our health provider partners Anglicare, Integrated Living and All About Living.

Today we are releasing two videos about how we use sensors, the internet of things and artificial intelligence to support data-driven health and aged care:

  • Dr Mohan Karunanithi explaining the Smarter Safer Homes platform and its underlying AI technologies
  • Dr Qing Zhang describing some of our ongoing research in using AI technologies, sensors and the internet of things.

You can read more about artificial intelligence technology in the primer section of our AI report, while the case studies discussed in these videos are in the Data Driven section of the report – Exemplars of Artificial Intelligence and Machine Learning in Healthcare (PDF).

[Start of recorded material at 00:00:00] [CSIRO Group Leader Dr Mohan Karunanithi appears prominent on the screen]
Mohan Karunanithi

We have a global crisis. The population is rapidly ageing and there is an increasing shortage of care workers to support them. One in seven Australians are aged 65 and over. This is expected to increase to one in five by 2050. Most older Australians prefer to age in their own family home safely and independently. The Australian e-Health Research Centre has developed a technology platform powered by artificial intelligence to enable this.

[Animation of logo of Australian e-Health Research Centre] [Mohan Karunanithi appears prominent on the screen. His name and title briefly appear on screen]
I am Dr Mohan Karunanithi and I lead the Health Services Research Group at the Australian e-Health Research Centre, which develops and validates innovative solutions for remote care delivery models to support chronic diseases and aged care through the use of sensors, monitoring devices, smartphones and the Internet.

Residential care placements are limited and costly, particularly given the increase in the ageing population we expect over the next 30 years. Supporting older Australians to age in their own homes is increasingly seen as the future of aged care. Currently, the problem with home care support is limited staff capacity. Moreover, an assessment for government-funded home care support is initiated only when an older person is already in decline, which can be late, and access to this support is often delayed.

[Mohan Karunanithi appears prominent on the screen]
What we have done here at the Australian e-Health Research Centre is develop a technology-based platform called Smarter, Safer Homes to enable older people to live longer and safely in their own home, with support and care provided by their family or aged care provider.

[An image titled Smarter safer homes displays, which is a floor layout of a home that has flow lines emanating from it identifying, Power, Contact, Motion, Accelerometer and Heat and humidity]
This platform was co-designed with a network of older people and carers. The platform uses wireless sensors, such as motion, power and humidity sensors, placed around the home to measure one’s activities.

[Mohan Karunanithi appears prominent on the screen]
Although technology platforms that do the same thing have recently emerged in the market, our platform features a novel analytical tool called the Objective Activities of Daily Living.

[An image split into 3 sections displays. The top left shows Real time Sensor Data with an arrow depicting this data flowing into the image on the right. The image on the right is of a graph titled Objective Activity of Daily Living. The bottom half of the image shows the floor layout of a home with the flows as displayed earlier with arrows pointing up depicting the data being sent from the home into the datasets above]
This tool not only translates the functional independence measures of activities of daily living used in clinical settings into a digital form, but also personalises them to one’s own signature of activities in their own home setting.

[An image titled Sally Citizen displays. The left side of the image has the titles from top to bottom, Home, Health Checks Daily Activity, Alerts, Diary, Video Call, Settings, About SSH. From the top heading of Home an arc curves to the bottom right of the screen breaking down into sections for Health Checks, Sleep, Social, Walking and Daily Activity, each showing different levels per activity]
It works by extracting the sensor data in real time to infer the different activities in one’s daily living patterns and measure their functional independence.

[Mohan Karunanithi appears prominent on the screen]
This comprises five domains of activity: mobility, being movement around the home; postural transfer; the ability to dress oneself; the ability to prepare meals; and the ability to attend to one’s hygiene, such as showering and toileting. To calculate these activities automatically we use advanced machine learning techniques such as multi-scale clustering and automatic peak detection in time series analysis. To aggregate these activity measures into an Objective Activities of Daily Living score, we use a random forest technique to determine the most accurate representation of one’s functional wellbeing status.
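
As a rough illustration of the kind of pipeline described here, the sketch below pairs automatic peak detection on a sensor time series with a random forest that aggregates per-domain activity counts into a single score. The domain names, features, thresholds and synthetic data are assumptions for illustration only, not the AEHRC implementation.

```python
# Hypothetical sketch: peak detection as an activity-event counter per domain,
# plus a random forest that maps daily domain features to a single ADL-style score.
import numpy as np
from scipy.signal import find_peaks
from sklearn.ensemble import RandomForestRegressor

DOMAINS = ("mobility", "transfer", "dressing", "meals", "hygiene")  # from the video

def count_activity_events(sensor_series, min_height=0.5):
    """Automatic peak detection: each peak is treated as one activity event."""
    peaks, _ = find_peaks(sensor_series, height=min_height, distance=10)
    return len(peaks)

def daily_features(day_of_sensor_data):
    """One feature per activity domain for a single day of sensor data."""
    return [count_activity_events(day_of_sensor_data[d]) for d in DOMAINS]

# Train on synthetic historical days with synthetic reference scores,
# standing in for clinician-assessed functional-wellbeing labels.
rng = np.random.default_rng(0)
X_train = rng.poisson(lam=[20, 8, 2, 3, 5], size=(100, 5)).astype(float)
y_train = X_train.sum(axis=1) / X_train.sum(axis=1).max()
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_train, y_train)

today = {d: rng.random(1440) for d in DOMAINS}  # e.g. one sample per minute
print("Objective ADL-style score (0-1):", model.predict([daily_features(today)])[0])
```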

Over the last five to seven years, we have employed the Smarter, Safer Homes Platform in many trials in which aged care providers have demonstrated its use for remotely monitoring clients’ wellbeing status and providing timely home care support. We are currently completing a validation study of the platform with 200 participants receiving home care support from their aged care providers. With the Smarter, Safer Homes Platform in the homes of your loved ones, you can be at peace knowing that they are well and going about their daily lives, from anywhere and everywhere.

Additionally, by engaging a care provider through the platform, you could be assured that timely support and care are provided for their personal needs. The Smarter, Safer Homes Platform can allow all older Australians to age with dignity and a better quality of life.

[Image of report cover appears on black background with a voiceover]
Download the report today for more insights into using artificial intelligence and machine learning for health applications, read exciting case studies from Australia’s largest digital health initiative, the Australian e-Health Research Centre, and get in touch with us to discuss collaborations.

[End of recorded material 00:05:23]

Dr Mohan Karunanithi on AI technologies used to build the Smarter Safer Homes platform.

[Start of recorded material at 00:00:00] [CSIRO Team Leader Dr Qing Zhang appears prominent on the screen]
In a multi-residential smart home environment, individual identification is one of the most critical issues in realising the full functionality and potential of the smart home platform’s personalised services. This case study demonstrates how we developed an AI-powered solution to support a multi-residential home environment.

[Animation of logo of Australian e-Health Research Centre] [Qing Zhang appears prominent on the screen. His name and title briefly appear on screen]
My name is Dr Qing Zhang and I lead the Health Internet of Things team at CSIRO. Together with CSIRO Energy and Data61, we are focusing on developing new non-wearable, privacy-preserving human identification sensors for smart home platforms, using Ultra Wide Band radar technology.

The smart home analyses data from sensors deployed in the home environment to measure a person’s activities of daily living and provide the necessary support. This approach works well when there is only one person living in the home. However, in homes with multiple residents, activity identification models designed for single-person living environments do not produce satisfactory results, because it is difficult to know whose data the sensors are capturing.

There are usually two methods of indoor human identification. Computer vision systems are one; however, these perform poorly in low-visibility conditions and inevitably raise privacy concerns. The other is wearable devices; however, these require the resident to always wear or carry the device throughout the day, which prevents them from being widely accepted by older communities, let alone those with neurodegenerative diseases.

At the Australian e-Health Research Centre we have developed a new artificial intelligence-driven identification sensor. Compared to existing approaches, this is a non-wearable, privacy-protecting sensor, which is the size of a credit card and can be easily deployed on the ceiling of the home.

[A slide displays in 3 sections, 2 images on the left side and one image on the right. The top left image is that of a small blue box with screws in each corner holding the lid in place. The lower image on the left is labelled UWB transmitted pulse. This image shows what the UWB transmitted pulse looks like in graph like form. A straight line which then displays large peaks and troughs and then levels out again. The image on the right has a person shown at 3 points, d1, d2 and d3 walking in the detection zone circle, with lines going from each person upwards to the UWB sensor in the ceiling]
As you can see in this slide, this sensor uses Ultra Wide Band, or UWB radar technology.

UWB radar systems can be installed in indoor environments in a non-intrusive manner, and offer many advantages such as high resolution, low power consumption, and strong resistance to narrowband interference. When residents pass through the detection zone under our sensor, the reflected data collected by the UWB sensor forms a high-frequency time series data stream.

The artificial intelligence component in the sensor unit then processes and analyses the received UWB radar signals to extract the unique features and patterns of each resident, and to identify them.

[Qing Zhang appears prominent on the screen. His name and title briefly appear on screen]
This process begins by visualising the sensor data as a heat map using a bandpass filter as shown in this slide.

[A slide displays in 2 sections, an image on the left side and one image on the right. The left image has a black box of data on the top of the screen, with a blue arrow pointing down to the image of a heat map below. The image is titled Encode UWB data as heat map. The image on the right has two examples results of a heat map. This image is titled Different person with different walking patterns. The image shows the heat map results for Person A along the top showing the heat sensor results for person A, the left image is for walking straight, the right is waking diagonally. The bottom section of the image on the right shows the same walking pattern results for person B.]
The UWB signal scatters from different parts of the body at different times and with different amplitudes, depending on the distance to the body part as well as the size and material of the reflecting part. In this slide you can see examples of scattered signals from two different subjects walking near the UWB radar with different walking patterns. The brighter the colour, the closer the target is to the sensor.

Since the two subjects differ in height and size, the signals reflected from them as they pass by the UWB radar result in different heat maps.
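
To make the heat map encoding concrete, here is a minimal, hypothetical sketch: raw UWB frames are band-pass filtered along the slow-time (frame) axis and stacked into a range-versus-time matrix that can be rendered as a heat map. The sampling rate, passband and array shapes are assumptions, not the deployed sensor’s parameters.

```python
# Illustrative only: band-pass filter raw UWB frames and stack them into a
# 2-D matrix of range bins over time, the "heat map" described in the slide.
import numpy as np
from scipy.signal import butter, sosfiltfilt

def uwb_heatmap(frames, fs, low, high):
    """frames: (n_frames, n_range_bins) raw radar samples, one row per sweep.
    Returns a filtered magnitude matrix; plotting it gives the heat map."""
    sos = butter(4, [low, high], btype="bandpass", fs=fs, output="sos")
    # Filter along the slow-time (frame) axis so body motion stands out
    # against static background reflections.
    return np.abs(sosfiltfilt(sos, frames, axis=0))

# Synthetic example: 500 frames of 128 range bins sampled at 100 frames/s,
# keeping motion components between 0.5 Hz and 10 Hz.
rng = np.random.default_rng(1)
frames = rng.normal(size=(500, 128))
heatmap = uwb_heatmap(frames, fs=100.0, low=0.5, high=10.0)
print(heatmap.shape)  # (500, 128): brighter cells correspond to stronger reflections
```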

[Qing Zhang appears prominent on the screen. His name and title briefly appear on screen]
The artificial intelligence component of our identification sensor will then extract the features of the heat map patterns of individual residents and use the trained neural network model to identify the individuals. We use a 16-layer convolutional neural network to train our sensor model. Preliminary experimental results show that this new sensor has a high recognition accuracy of over 90 percent in distinguishing between 14 individuals in an indoor environment.
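
The sketch below shows what training such a classifier could look like, using a standard 16-layer VGG-style network adapted to single-channel heat maps and 14 resident classes. The input size, optimiser and hyperparameters are assumptions for illustration, not the actual sensor model.

```python
# Hypothetical training step for a 16-layer CNN that classifies heat maps by resident.
import torch
import torch.nn as nn
from torchvision.models import vgg16

NUM_RESIDENTS = 14  # the video reports >90% accuracy across 14 individuals

model = vgg16(weights=None)                           # a standard 16-layer CNN
model.features[0] = nn.Conv2d(1, 64, 3, padding=1)    # heat maps are single-channel
model.classifier[6] = nn.Linear(4096, NUM_RESIDENTS)  # one output per resident

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# One illustrative step on a synthetic batch of eight 224x224 heat maps.
heatmaps = torch.randn(8, 1, 224, 224)
labels = torch.randint(0, NUM_RESIDENTS, (8,))
optimizer.zero_grad()
loss = criterion(model(heatmaps), labels)
loss.backward()
optimizer.step()
print(f"training loss: {loss.item():.3f}")
```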

This identification sensor is compatible with CSIRO’s Smarter Safer Homes platform.

[An image titled Smarter safer homes displays, which is a floor layout of a home that has flow lines emanating from it identifying, Power, Contact, Motion, Accelerometer and Heat and humidity]
It will help to extend this platform to a wide range of applications to support more older Australians who prefer to age at home. This novel sensor provides a simple and reliable solution to ensure a smart home’s performance in a multi-residential environment.

[Qing Zhang appears prominent on the screen. His name and title briefly appear on screen]
It is an environmental sensor, but it also protects residents’ privacy. The sensor runs on batteries and can be deployed in the home easily. With this sensor, many existing smart home platforms that support independent living can be easily scaled up to support a multi-residential environment.

[Image of report cover appears on black background with a voiceover]
Download the report today for more insights into using artificial intelligence and machine learning for health applications, read exciting case studies from Australia’s largest digital health initiative, the Australian e-Health Research Centre, and get in touch with us to discuss collaborations.

[End of recorded material 00:05:31]

Dr Qing Zhang on how the internet of things is helping to deliver new non-wearable health sensors.