2023: Digital Human Cardiac Coach ~ Not #ChatGPT
This is Evolution - NOT #ChatGPT
In the three years since #XMed2019, our work on the #DigitalHuman #CardiacCoach has continued. We have not stopped. Our work and creativity have intensified.
In this update, we share the journey through one of the most frightening parts of heart surgery - going home.
And it is just as frightening for the carer. So from this experience, we have developed the #DigitalHuman #Carer conversation to help with this transition.
But this is not something that #ChatGPT can explain. Read on.
Update for #NextMed2023
If EVER there was a time for pushing the edge and going way beyond, it is NOW. And that is why sharing this video for NextMed Health is so important. And why NextMed Health is so important.
We believe this is the first #Carer interaction modelled with a #DigitalHuman #HealthCoach.
This is a real-life interaction between a heart patient's carer - me - and the #AI-powered #DigitalHuman #CardiacCoach. As many of you know, heart patient Allan Johnson is #Patient #1 for the #DigitalHuman #CardiacCoach. He is widely known as #CardiacMan.
The common #Corpus of #Conversations supports multiple contexts, including direct patient conversations as well as carer conversations.
The topic of this conversation that the carer is having with the Coach – finding out what’s involved in going to hospital and then going home – is the same topic of conversation that the patient would have with the Coach. What is different is the context of the carer.
In this conversation, the carer - me - not only asks questions, but also makes statements a number of times, such as “I’m not sure I’ll remember all this”. This is based on real life.
In both the question and statement interactions, the Coach is able to infer intent from the context and provide information in a natural contextual flow.
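To make the idea of one shared corpus serving multiple contexts concrete, here is a minimal, hypothetical Python sketch - not the actual Coach implementation - of a curated intent set whose triggers can be questions or statements, with responses written separately for the patient and carer contexts. All names, intents, and response texts below are illustrative assumptions.

```python
# Hypothetical sketch of a bounded, co-designed corpus serving two contexts.
# Not the production system; intents and wording are illustrative only.
from dataclasses import dataclass

@dataclass
class Intent:
    name: str                  # e.g. "going_home_preparation"
    triggers: list[str]        # curated phrases (questions or statements)
    responses: dict[str, str]  # context ("patient" / "carer") -> response text

CORPUS = [
    Intent(
        name="going_home_preparation",
        triggers=["what happens when we go home", "going home after surgery"],
        responses={
            "patient": "Before you leave hospital, your care team will...",
            "carer": "When you bring the patient home, you will need to...",
        },
    ),
    Intent(
        name="information_overload",
        # Statements, not just questions, map to intents too.
        triggers=["i'm not sure i'll remember all this", "this is a lot to take in"],
        responses={
            "patient": "That's normal. Let's break it into small steps...",
            "carer": "You don't need to remember everything at once...",
        },
    ),
]

def respond(utterance: str, context: str) -> str:
    """Match an utterance (question or statement) to an intent, then return
    the response written for the given context. Fall back to a safe hand-off."""
    text = utterance.lower()
    for intent in CORPUS:
        if any(trigger in text for trigger in intent.triggers):
            return intent.responses.get(context, intent.responses["patient"])
    return "I don't have guidance on that. Please ask your care team."

# Example: the carer's statement from the video, answered in the carer context.
print(respond("I'm not sure I'll remember all this", context="carer"))
```

The point of the sketch is the separation of concerns: one common corpus of intents, with the context (patient or carer) selecting how the same topic is answered.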
The questions (intents) we used are based on years of R&D, including work in the cardiac rehabilitation setting, patient forums around the world, peer-reviewed research, and general patient education sourced globally.
Why Does This Matter?
This is not an LLM or #ChatGPT - it is a co-designed, contextually bounded domain.
Around the world, the caring economy is massive but failing those who need it most.
The carer is often forgotten in a heart patient's health journey, yet their role is essential and potentially lifesaving.
The carer is also the one constant human presence in the patient's life.
Because of this, the carer is an essential participant in frontline healthcare, performing two critical functions.
Firstly, carers help the patient understand what the health system is telling them. Secondly, the carer is responsible for procedural patient support such as wound care, bathing, medication reminders, eating, exercise and so on.
But #HealthIlliteracy of both the carer and #Patient impacts health outcomes.
#ChatGPT and the regurgitation of voluminous information do not overcome #HealthIlliteracy, nor do they achieve understanding - they confound it.
This is not to say there is no use-case for #ChatGPT, but any use must be determined through the governance of co-design and #Ethics.
The human must always be in the loop.
This is why co-designed, contextual, domain-specific, localised language models are essential. There is much work to be done.
Nadia: Politics | Bigotry | Artificial Intelligence
The video also previews my new book - the explosive inside story of "Nadia: Politics | Bigotry | Artificial Intelligence", coming in 2023.
Update: publication date is 16 February 2024, available on Amazon!
I am so excited to share the inside story of #Nadia.
My inbox is FULL of messages from people around the world, wanting to know the inside story. I am told that universities are running courses with the #Nadia story as a case study, second-guessing what happened. The second-guessing will soon be answered!
The #Nadia book also tells the genesis of the #DigitalHuman #CardiacCoach.
Stay tuned for more