
InterSystems Developers

Updated on Feb 28, 2023

Collecting Healthcare Data with Alexa


So, we're glad you could join us. We'll introduce ourselves formally in just a minute, but this session is Collecting Healthcare Data with Alexa. To introduce myself first: hi, I'm Nikolai Michgo, a sales engineer. I typically work with large IDNs, large hospital chains, and pharmaceutical companies, and I have a real focus on innovation and machine learning, so hopefully I can tie that back into everything we talk about in the presentation and demo today. I'll hand it over to Jen to introduce herself.

Hi everyone, my name is Jen Kim. I am also a sales engineer on the same healthcare team as Nikolai, and we're really excited to share an innovative use case we came up with for Alexa, to show you the latest in assistant technologies.

And I'm Pat Jameson. I'm a physician and the product manager for IRIS for Health. I have a long background in informatics and a deep interest in speech recognition and decision support.

All right, so we'll get going. My goal today is to accomplish three things: give an overview of how voice applications can apply to and change healthcare, give a demonstration of how Alexa applies in one specific scenario, and then talk about how that all fits into the InterSystems ecosystem, relating IRIS for Health and our newer platform, or even some of our older platforms, and how we can integrate Alexa and collect data that way.

To get started, I'll do a quick overview of what voice-enabled applications and technologies can do. The idea I'm going to talk about today is smart speakers: Alexa, Google Home, any sort of voice AI or chatbot you talk to with your voice. This could be on your phone or your smartwatch; I was recently gifted a smartwatch and I'm addicted to it, but I know in a week I'll be putting it down.

To give you a little background on Alexa: you've all seen the advertisements, you've all seen Google Home, and there is a ton you can do with Alexa. It's remarkable, when you have an AI that acts like a human and comes out of a speaker, how much it can replace a human in a conversation. At a high level, these devices are found everywhere now, even hospital beds, and Amazon Alexa, as you can see on the screen, has ninety thousand different skills: you can talk to Alexa through ninety thousand separate and unique voice interactions, which in my mind is pretty incredible. There are a hundred million Alexa-enabled devices, and 8.2 million households, roughly twenty percent of United States households, have some sort of Alexa device, and that doesn't count Google Home. And when you integrate Amazon Alexa or Google Home with eBay and Amazon, it can purchase over a hundred million things. Never in a million years did I think I could buy a set of golf clubs or a new iPhone through a smart speaker.

So the question that sparked this session was: can a smart speaker solve complex problems?
The way I like to judge whether a solution is good at solving a problem is to look at what people are already doing with it. So I've got a quiz for everyone, and I'm going to ask you to raise your hand. My question is: what is the most popular task on Alexa and Google Home, aggregated? Raise your hand if you think it's completing an online purchase. We've got one, two. Do you think it's setting a timer? We've got a lot of smart people in the crowd. Or do you think it's scheduling an appointment of any kind? That's right: setting a timer is the most popular task on Amazon Alexa and Google Home. I'll do that right now: "Alexa, set a timer for ten minutes." "Ten minutes, starting now." Pretty easy, right? I do that when I'm cooking, I do that for any reason. But can this sort of voice AI go further? Is it limited to simple tasks like setting a timer, or can I actually do meaningful work with my Alexa smart speaker?

The key point is that smart speakers have so far been limited to the low-hanging fruit: tasks that are simple, repetitive, and menial, tasks you might already have had a utility for, such as a timer. If we look down the list, setting a timer is number one, then playing a song, reading the news, setting an alarm. These are all really easy things, and even some of the most popular ones, setting your thermostat or controlling your smart home, are actually pretty low in utilization for these devices.

So you're probably wondering: we're at a healthcare conference, an InterSystems conference, what does this have to do with healthcare? On the next slide I'll show how Alexa and Google Home apply to healthcare. I like to group the tasks you can accomplish with a smart speaker into these categories. You can automate menial or time-consuming tasks, things like administering a questionnaire or setting a timer. You can free valuable resources when you apply Alexa skills to those tasks: the example I'll give in a moment is a nurse who has to go administer a questionnaire; that's time-consuming, and we want the nurse doing something else. It can also collect data on an interval: a patient comes into the hospital and needs to provide a family history; why can't Alexa do that? It can satisfy care demands in seconds, which is roughly where telehealth would apply. And it can observe and record patient behavior. As much as the privacy-minded people in the room, like myself, are afraid that someone is always listening, if you do have a smart speaker with a microphone next to a bed, or next to a patient with a smartwatch, then you can observe and record patient behavior in a natural setting rather than an artificial environment, as much as we might like to think the hospital is a natural environment.

For nursing, on the next slide: I looked this up on Google because I'm not a nurse, and if we look at the tasks a nurse performs, one of the top ones is observing and recording patient behavior. Can Alexa perform that right now? The answer is yes, sometimes. Can it perform physical exams? Not yet, and I don't want Alexa grabbing me or doing anything like that; that would freak me out. But Alexa can certainly collect patient histories.
Alexa can also do things like educating patients about treatment plans. Part of a doctor's or nurse's job might be going into a room after a prescription is written and explaining how the patient should take it, or how a certain condition should be managed. Alexa can at least handle the informational side of that. It won't replace the pharmacist or the nurse entirely yet, but it's certainly a move toward freeing those high-value resources rather quickly in the near future. Excellent; I'm going to turn it over to Patrick for an overview of voice applications in healthcare.

Thanks, Nikolai, and I think that's a good summary. This area is really gaining momentum in healthcare, because using voice is a very natural way to collect data. A number of institutions have jumped out in front on this, and I think we'll see many more in the next few years, but some big providers such as Beth Israel, Boston Children's, Commonwealth Care Alliance, Northwell Health, Libertana, and Cedars-Sinai are all using voice applications in their institutions today. Next slide.

Amazon's Alexa is really leading the charge among devices in healthcare; it's probably out front today in terms of healthcare applications. Let me give you a few ideas of how people are using these things. Cigna uses Alexa to help its employees manage health improvement goals and earn wellness incentives. Livongo members can ask Alexa for their last blood sugar reading, so that's integrating medical devices into Alexa. Boston Children's Hospital's Enhanced Recovery After Surgery program makes it possible to collect data from parents and caregivers so that the care team stays updated on patient statuses. And then we have companies like Express Scripts, Atrium Health, and Swedish Health Connect using Alexa to make it easy for patients to understand when their prescriptions will be delivered, or to make an appointment; all things we would like our smart devices to do. Next slide.

Here are a few things being done at Boston Children's ICU using voice assistants. A nurse can ask who the charge nurse is, or how many beds are available on the wards: simple facts that save time without having to track down who's in charge. We can use Alexa to do voice-enabled versions of checklists; in many institutions, prior to surgery, we go through checklists to validate that we've met all the conditions before, for example, doing a transplant, and these prompts can help reduce errors. It's a natural way of recording things. KidsMD is used by parents at home, with Alexa, to help them figure out whether their kids need to see a doctor. Next slide.

In the UK, Alexa has started a partnership with the National Health Service so that patients can access relevant information using speech recognition; they can get information about common illnesses. Now, you could say, why can't they just do this with Google? But this is a natural way for many people to interact with devices these days, and what the National Health Service wants is to provide a vetted source of information, so patients aren't going to websites that may contain erroneous information; instead they get a state-backed service that provides validated information. Next slide.
As I mentioned, speech recognition is going to enable a whole new class, I believe, of startup companies, and this is just one example: a startup called Suki. It's a voice-enabled digital assistant for doctors, and it tries to lift some of the burden of documentation so the doctor can focus on treating patients. It's able to personalize its approach to collecting data from the doctor as they use it, tailoring its questions to the physician's workflow, so it's basically like having an assistant in the exam room. But this is just scratching the surface; there will be many more examples of what voice and speech recognition can do in the future.

And lastly, Nikolai already mentioned this: it takes over a lot of mundane tasks. Trivial tasks like changing the TV channel, Alexa can absolutely do, along with getting status information, for example your pain level, which Jen will show in a demo in a moment. Next slide. Alexa can obviously do a lot more than change the channel on a TV; it can really simplify the job for nurses by augmenting their daily activities and collecting data on the routine tasks they have to do all the time. Next slide.

Now is actually a really good time to show you a demonstration of what we've been talking about, Alexa in healthcare today, and what you see on the slide: having Alexa at the hospital bed. We've encountered this throughout the year with some of our customers, and one of their goals was really to save nurses time. (There's the timer; that was the ten-minute timer. I think we all forgot about it.) Some of the things our clients are trying to do with Alexa today are really just having the TV channel changed, or having patients use Alexa to turn the lights on and off. What we've done is develop a sample Alexa skill. Patrick gave a bit of a foreshadowing: we created a sample pain checklist, something clinicians and nurses use when they go into a patient's room and ask, "What's your pain level today, on a scale of one through ten?" If you've been in a hospital before, that's a question you're used to hearing over and over. So, to cut to the chase: "Alexa, run pain checklist."

"Welcome, I am the InterSystems voice assistant. I will go through a simple checklist and I will send your responses to your doctor. First, what is your first and last name?" "My name is Marla Gonzalez." "Thank you, Marla. Now tell me the month, day, and year of your date of birth." "My birthday is January 1st, 1980." "Next, on a zero-to-ten pain scale, tell me the intensity of your pain, saying zero if you have no pain or ten if you need immediate assistance." "My pain level is eight." "Ouch. I will record your pain level as eight and send for help right away. InterSystems will also examine your unified care record and let your doctors know of any related diagnosis or medication that might be causing your pain. I hope you feel better soon. Goodbye."

All right, thank you. Now let me show you what just happened in my interaction with Alexa. These are some of the prompts I engaged with through the custom skill we developed, and there are a number of things to note: "Alexa" is the wake word, "run" is the launch word, and "pain checklist" is the invocation name.
A common question I tend to hear is: do you have to say this exact script? The answer is no. Leveraging Alexa's SDK, we're able to make this very conversational, taking advantage of Amazon's voice user interface, which is intuitive and highly configurable in the Amazon Alexa console online. What you see on the second line is "my name is Marla Gonzalez"; Marla Gonzalez is a sample patient we tend to use over and over here at InterSystems. With an utterance, you're basically indicating which method (if you think in object-oriented terms) you want to trigger. For example, when I say "my pain level is 8", that utterance triggers the pain rating intent, and the slot value is the piece of data Alexa is looking to capture; it stores it into a value, and you can see at the bottom, value 8. Ultimately, as Alexa engages with the patient, we create a JSON payload; the example shown here is the JSON payload generated after Alexa conversed with me and gathered all the data from the pain checklist. The goal is to send this payload to your application, and at InterSystems Global Summit 2019, what better application to send it to than InterSystems IRIS for Health? At a high level, at the conclusion of this Amazon Alexa skill, a RESTful service sent the JSON payload to InterSystems IRIS for Health up in the cloud, where it is paired to a number of modules that we'll go over in just a second.
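To make the mechanics concrete, here is a minimal sketch of what an intent handler for a skill like this could look like using the ASK SDK for Python. The intent name, slot name, payload shape, and endpoint URL are illustrative assumptions, not the actual skill shown in the demo.

```python
# Minimal sketch of a custom-skill intent handler, assuming the ASK SDK for
# Python (ask-sdk-core) and the requests library. Names below are placeholders.
import requests
from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.utils import is_intent_name

IRIS_ENDPOINT = "https://iris.example.org/painchecklist"  # hypothetical REST service


class PainRatingIntentHandler(AbstractRequestHandler):
    """Fires when an utterance such as 'my pain level is eight' matches the intent."""

    def can_handle(self, handler_input):
        return is_intent_name("PainRatingIntent")(handler_input)

    def handle(self, handler_input):
        slots = handler_input.request_envelope.request.intent.slots
        pain_level = slots["painLevel"].value  # captured slot value, e.g. "8"

        # Build the JSON payload and hand it to the back end (IRIS for Health
        # in the demo); the shape of this document is illustrative only.
        payload = {"checklist": "pain", "painLevel": int(pain_level)}
        requests.post(IRIS_ENDPOINT, json=payload, timeout=5)

        speech = (f"I will record your pain level as {pain_level}. "
                  "Your care team will be notified.")
        return handler_input.response_builder.speak(speech).response


sb = SkillBuilder()
sb.add_request_handler(PainRatingIntentHandler())
handler = sb.lambda_handler()  # entry point if the skill is hosted on AWS Lambda
```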
To walk you through, at a high level, a diagram of what we've configured for today's demo: you can think of the box on the bottom right labeled "device data" as the Amazon device you saw today. It doesn't have to be Amazon Alexa; it could be Google Home, or, as some of our other clients have considered, bedside devices. It isn't limited to Alexa at all; it could be any number of IoT devices. Once we hooked it up to InterSystems IRIS for Health, we also configured it against HealthShare Unified Care Record. For those of you who are not familiar with some of our products, that is basically a single source of truth built from multiple data sources, normalized, aggregated, and deduplicated. For the purposes of this demo, we pulled in the patient's clinical history so that we can enrich the analytics we perform. With that, let me show you what happened on the back end after the Amazon Alexa skill concluded.

The first thing that happened is that an email was sent off. This could go to a clinician or any caretaker who needs to know about the patient's interaction with Alexa. Some of the information is what the patient said: Marla Gonzalez, the pain level. But two lines might be particularly interesting, because they are things the patient did not say: that this pain may be related to the patient's recent diagnosis of chest pain, and that in the next hour the patient has a predicted pain level of 8.073. The email asks the clinician to review the patient's chart for more details.

To dive into those components, I'll open the Message Viewer. A quick show of hands: how many of you are familiar with the Message Viewer? It seems like most of you, but for those who are not, the Message Viewer is a simple web viewer where you can see all the processes that happened inside IRIS for Health. Once the JSON payload was received by IRIS for Health, our application did a number of things. First, it translated the JSON payload into an HL7 message that could be shipped to other downstream systems, whether that's a lab or anyone else who needs this patient interaction. A key component is that I made an API call to HealthShare Unified Care Record; at your organization this might be another instance of your EMR or another data source you already have, and we fetched back the patient's clinical history. So now we have the device data gathered from Amazon Alexa plus access to the patient's clinical history, and that sets up a nice environment for AI and machine learning, which have been common threads throughout the day, especially in the keynotes.

For the purposes of this demo we ran a very simple regression model on time-series data. It is a very simple model, but it paints a picture of the kinds of things you can do with the data coming into IRIS for Health from devices like Alexa. We were able to produce output stating the predicted pain level in the next hour, tie it all together, and ultimately ship it off to clinicians or any caretakers who need to know, via email or other forms of communication. We have also operationalized this, so the machine learning runs on the data coming in from devices like Alexa together with the data you're capturing in your EMRs: it sits inside the production as an interface, so that as new HL7 messages come in, and as the patient continues to engage with Alexa, an operation (the Alexa pain PMML model operation) runs the predictive risk model every time the patient says "my pain level is eight", or hopefully something lower.
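The model in the demo was intentionally simple. Purely as an illustration of the idea, a toy next-hour pain prediction from time-series readings might look like the following; the data, features, and model here are invented and are not the PMML model used in the session.

```python
# Illustrative only: a toy time-series regression predicting the next hour's
# pain level from recent readings. The values are made up.
import numpy as np
from sklearn.linear_model import LinearRegression

# (hours since admission, reported pain level) -- hypothetical history
history = np.array([[0, 4], [2, 5], [4, 6], [6, 7], [8, 8]])
hours, pain = history[:, :1], history[:, 1]

model = LinearRegression().fit(hours, pain)
next_hour = hours[-1, 0] + 1
predicted = model.predict([[next_hour]])[0]
print(f"Predicted pain level at hour {next_hour}: {predicted:.3f}")
```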
With that, I'll turn it over to Nikolai. Just to give a quick recap: the IoT data is an event. We're used to responding to HL7 events, to clinical events, in a transactional manner all the time; this is another vehicle, in our minds, for capturing clinical events. If you can enrich that data with both a longitudinal health record and machine learning or predictive model execution, you can do much more when you capture more events through IoT, and in my opinion at least, smart speakers are a great vehicle for capturing new clinical events.

The way this all works in the background, on the next slide, is a very good marketing slide of how we attack the problem. Events come into InterSystems IRIS all day. This might be your IRIS for Health instance, a bare-bones IRIS instance, or Health Connect somewhere in between; I hope everyone in the room is at least semi-familiar with those terms. Coming out of that is data storage and responses to real-time transactions. Within InterSystems IRIS, we've been talking for the past three or four Global Summits about enabling AI and machine learning, and the way we do that is by providing seamless, simple connections to the blue box on the screen, which is your predictive risk and predictive modeling environment. We know some people use Tableau, Qlik, or Power BI for descriptive analytics, and they might use Spark, TensorFlow, or some other data science environment for predictive analytics. We can then incorporate that back into InterSystems IRIS in two ways.

The first way is that we now have a Python Gateway, which has not been heavily publicized, but I believe there is a session about it today at the Tech Exchange. It allows you to natively run Python code and transmit data back and forth between Caché or IRIS code and your Python environment. What I think is great about it is that you can hire a young developer out of college with Python experience; they've done Python for five or six years, it's now the number one programming language by most counts, so the industry knows Python. You can have them write their machine learning script in what they know and then integrate it directly where the data lives. As you see on screen, there is a new keyword, ZPY, meaning: run Python directly in its own context and transmit data back and forth with IRIS code. I think that's pretty incredible, just because I'm a machine learning junkie, but what it really means in practice is that you don't have to look for an outside integration with Tableau or Spark, and you don't have to look for specialty Caché and IRIS experience. You can go directly to students entering the workforce and say, "I need to do machine learning; do it however you want, and we'll make it work on our data."

The next thing we'll talk about is one of the ways you can integrate with an external environment: Apache Spark. This is an incredibly popular data analytics framework. If you're not familiar with it, Apache Spark is a single computer or a set of clusters with advanced machine learning and numerical processing libraries. It's based on Java, and it allows highly concurrent access to various databases, including InterSystems IRIS. We provide a high-performance data connector for Spark, so if you access, for example, the Alexa pain table, you can have ten workers go down to that same table from your Spark cluster, efficiently grab all the data you need from that one query, and get everything required for your machine learning task in a very short round trip.
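As a rough sketch of what that looks like from the Spark side, the following reads a hypothetical Alexa pain table from IRIS over plain JDBC with ten parallel readers. The URL format, driver class, credentials, and table name are assumptions, and the dedicated InterSystems Spark connector exposes its own options beyond what is shown here.

```python
# Sketch of reading a (hypothetical) Alexa pain table from InterSystems IRIS
# into Spark over generic JDBC; connection details and table name are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("alexa-pain-analytics").getOrCreate()

pain_df = (spark.read.format("jdbc")
           .option("url", "jdbc:IRIS://iris-host:1972/DEMO")   # assumed URL format
           .option("driver", "com.intersystems.jdbc.IRISDriver")
           .option("dbtable", "Alexa.PainChecklist")           # hypothetical table
           .option("user", "demo")
           .option("password", "demo")
           .option("numPartitions", 10)                        # ten parallel readers
           .option("partitionColumn", "ID")
           .option("lowerBound", 1)
           .option("upperBound", 100000)
           .load())

# Example downstream use: average reported pain per patient.
pain_df.groupBy("PatientName").avg("PainLevel").show()
```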
On the next slide: we also have an integration with Zeppelin notebooks, now sitting on top of the Python Gateway. Like I said, keeping the developer happy is something we're certainly proud of. You can use a Zeppelin notebook directly against InterSystems IRIS through that Python Gateway, so whether it's Zeppelin or Jupyter, you can take the analytics tools new hires are already familiar with and run them directly on the data. What I'll do now is pass it back to Jen.

So those were some of the ways you can unlock the data you gather in your organization through devices like Amazon Alexa, and I'm sure some of you are wondering: how do I create my own Alexa skill, and is it hard? Personally, I found it laid out and documented very well by Amazon; this sample skill, for example, took only about a day, maybe two. The biggest obstacle was deciding what skill to build, so step one, and the part that took the most planning, is designing your skill and thinking about what kind of value you want to get out of it. For example, this pain checklist could very well be an application in a clinical setting aimed at saving nurses time. The next step, once you have an idea solidified, is creating the skill itself in the Amazon developer console. There you formulate the interaction model: if you recall the utterances, this is where you add the keywords that trigger Alexa and define what data you want Alexa to collect, so the interaction model outlines what dialogue Alexa should use to collect that data. And the final step is making use of the data by sending it to your application, which in our case was InterSystems IRIS for Health.
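For reference, the interaction model you define in the developer console (normally authored as JSON) has roughly this shape. The intent, slot, and sample utterances below are illustrative guesses rather than the exact model used in the demo.

```python
# Rough shape of an Alexa interaction model, expressed as a Python dict for
# readability; in the console this is plain JSON. Names are illustrative.
interaction_model = {
    "interactionModel": {
        "languageModel": {
            "invocationName": "pain checklist",
            "intents": [
                {
                    "name": "PainRatingIntent",
                    "slots": [{"name": "painLevel", "type": "AMAZON.NUMBER"}],
                    "samples": [
                        "my pain level is {painLevel}",
                        "the intensity of my pain is {painLevel}",
                        "{painLevel}",
                    ],
                }
            ],
        }
    }
}
```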
Now, this demonstration and the framework we've shared today don't only apply to Alexa. It could be any number of IoT devices, or even bedside devices like EKGs, so that you can collect various types of data using IRIS for Health. I've outlined a few environments, like AI/ML, Unified Care Record, and API Manager, just to point out that there are related sessions you can attend if any of these topics interest you. With that, we are open to Q&A; we only have a few mics, so I'll hand them around.

The first question was about the NHS partnership: when Alexa directs people to specific NHS content, does the patient have to ask a very specific question in order to be directed there? No, that's the whole point. The data that is part of the NHS's delivery of health information is information they have captured for use with the Alexa program. The patient isn't going to ask for a website; they're going to ask, essentially, "What should I do? My kid has a fever of 100." The follow-up was: if I'm just asking a general question and don't necessarily want it to go to that source, do I have the option not to? I don't think the NHS is going to tie patients' hands and keep them from getting medical information elsewhere, but if they're using Alexa in their home because it's part of an NHS-approved delivery vehicle and they ask Alexa that question, Alexa is going to answer based on the information the NHS has supplied. Nothing stops a patient from using other, non-Alexa means to get their information. I guess what you're really asking is whether the NHS will restrict its patients to getting medical information only from Alexa; I somewhat doubt that, but this is a pretty new partnership, so exactly how the mechanism will work, perhaps a voice prompt like "I need help from the NHS," I don't know. I doubt there will be any censorship of health data.

The next question had two parts. First: as far as I understand, Alexa learns how you speak and gets better over time, but in a patient context you'll have people with different accents, maybe even broken English, so how will Alexa learn from that, and will it really be productive? Even different physicians and clinicians have different accents. Second, and related: would you have the capability, if somebody speaks Spanish, to say "Alexa, please give this survey in Spanish," knowing you still want the data returned in your own language? Before Patrick gives a perspective from the clinical side: as someone who has developed an Alexa skill, those are things you have control over. You have the ability to offer the same survey in multiple languages; that's up to the developer. And within Amazon Alexa there are voice user profiles that learn how a person talks and the accent they may have. The follow-up was: if I'm a patient and this is the first survey you're giving me, you haven't had a chance to learn how I speak yet. Part of Amazon's development process is training against many different corpora of voice data, so nothing is perfect, but they probably cover the bell curve of most interactions, and if Alexa doesn't understand, as a developer you can always trigger a nurse or another clinical professional to step in and interpret. Like I said, it's enhancing, not replacing, and I'll hijack this question to make that point: we very quickly get the idea that Alexa could replace nurses, and I don't think that's the case. We all know nurses are already overworked, as are many healthcare professionals; this is about alleviating the workload. My own two cents: there are really two types of speech recognition, speaker-dependent and speaker-independent. We've come a long way in the speech recognition community; initially all speech recognition was speaker-dependent, and you had to train a system before it could recognize anything, but within the last several years we've built much better speaker-independent systems. One limitation is that open-ended speech recognition, like dictation, will probably fail without some amount of speaker training; but when you narrow the responses through the use of a grammar, and an Alexa skill is essentially a grammar-based mechanism, you have a much higher probability of good recognition in a speaker-independent way. It's not foolproof, but there's a high probability of a match.
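To illustrate the earlier point about offering the same survey in several languages while keeping the captured data in one canonical form, here is a small sketch. The prompt text and locale handling are assumptions for illustration, not part of the demo skill.

```python
# One way a skill could serve the same survey in several languages while the
# captured numeric slot value stays the same regardless of language.
PROMPTS = {
    "en-US": "On a zero to ten scale, what is your pain level?",
    "es-US": "En una escala de cero a diez, ¿cuál es su nivel de dolor?",
}

def pain_prompt(locale: str) -> str:
    # Fall back to English if the requested locale isn't configured.
    return PROMPTS.get(locale, PROMPTS["en-US"])
```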
Is there any way for Alexa to push proactively, for example, "Good morning, what's your A1C today?" So the question was whether Alexa is able to wake up on its own and ask questions or issue prompts. The answer is: with custom development, yes, but it is more difficult than it sounds.

If Alexa is in a patient room, do you have the ability to send it a patient identifier or a room identifier that it can send back to you? Great question, and we've been asked this a number of times. Being safety-conscious, what we do is provision the Alexa with a device ID and then tie that, in a back-end system, to a location. From there you could hit Epic or some other API to figure out who is actually in that room. I'm privacy-conscious, so I would say never send Amazon the patient identifier, but you can certainly do that sort of tying together pretty easily on the back end.

Related to that: is there any dynamic feedback? If you give a name and date of birth and it doesn't match anybody, does it just go away, or is there a way to say "I didn't find a patient"? Another great question: if the patient doesn't exist, or some other error occurs in the back end, can you send Alexa back an error state? The answer is yes. In the very last interaction of this demo skill, we call out to submit the data to InterSystems IRIS, but as part of that we also receive a response. Every single intent, every voice interaction in the skill, can send and receive both data and some sort of response, and check that response. So if I'm asked my name and I say "Nikolai Michgo" and I don't exist in the EMPI, or in some NHS database, I can have Alexa react and say, "Are you sure? I don't recognize you as a patient." It's really up to the developer to do their due diligence so they're not just submitting poor data to the back end.
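For illustration, the "I don't recognize you as a patient" behavior described above could be backed by a server-side check like the following. The device-to-room mapping, patient registry, and field names are hypothetical placeholders, not the demo's actual implementation.

```python
# Hypothetical back-end check: the skill submits a device ID, name, and date of
# birth; the response tells Alexa whether to continue or to speak an error.
DEVICE_LOCATIONS = {"amzn1.device.abc123": "Ward 3, Room 12"}    # provisioned mapping
KNOWN_PATIENTS = {("Marla Gonzalez", "1980-01-01"): "MRN-0042"}  # stand-in registry

def validate_submission(device_id: str, name: str, dob: str) -> dict:
    room = DEVICE_LOCATIONS.get(device_id, "unknown location")
    mrn = KNOWN_PATIENTS.get((name, dob))
    if mrn is None:
        # Returned to the skill so it can speak an error and re-prompt.
        return {"ok": False, "speech": "I'm sorry, I don't recognize you as a patient."}
    return {"ok": True, "room": room, "mrn": mrn}
```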
Is there any evidence that patients actually want this? I know when I call a credit card company and get an automated system, I get a little angry; I'd rather talk to a person, and I don't think I'm alone in that. So when it comes to something as specific as healthcare data, do patients even want this? I think that's a good point and an interesting concern. Medicine is fundamentally a high-touch occupation, and we don't want it to become impersonal and self-service, but at the same time we recognize that it is very expensive to do routine things that could easily be handled without high-cost personnel. Collecting data that has to be collected, say, every few hours is just not a good allocation of our health workforce. So there is a delicate compromise coming. I don't think we expect our Alexa units to replace physicians and nurses, or to enter into weighty dialogues about your prognosis, but I don't think it's a stretch, when it comes to routine information like forms you need to fill out, to think that Alexa can replace the clipboard. I think that's perfectly acceptable. And I'll add something to that: it might be a good idea to put a poll to the room. Raise your hand if you would feel comfortable submitting your pain checklist or family history to a smart speaker. So I think you have your answer right there. And if you're reporting your blood pressure daily and you get a $10 coupon when you go to the pharmacy, you just need to connect in the payers; people are usually willing to do that. Nothing incentivizes behavior like money.

On patient identifiers, a related question: when you have two patients in a room, what happens with Alexa and voice recognition, or with visitors trying to answer questions for the patient? Is there any data on what could happen, and the consequences of patient data crossing? I don't know that we have a good solution to that right now. In other words, you're worried about conversations with multiple voices and who's who: maybe I have a head trauma, you're trying to ask me a question, you think you're talking to me, but you're really asking the patient next to me and I'm answering over them. I don't think we can expect Alexa to sort that out today; that's a little beyond where we are, and frankly, in the ICU or an emergency room, under those situations, I wouldn't expect it even of Alexa. But I don't think the problem is unsolvable; it's just a bridge too far from where we are today. One thing on that, real quick: Alexa has something called voice segmentation, so whoever initiates the intent, whoever says "Hey Alexa," is segmented from other voices. You have a sort of fingerprint for your voice, so that speaker continues the intent, and if Jen talks over me it won't register as part of the data. I won't say it's perfect, because the next patient obviously wasn't the one whose voice, tone, and accent Alexa had been learning, so the system will need to be much more accurate in the future to fully solve that.

Does everything in the room get recorded by Alexa and sent to Amazon, and are there HIPAA implications around that? Another great question: what are the HIPAA implications of having a smart speaker? I do believe Amazon has a health cloud where you can provision Alexa as a device such that the data is not persisted. However, if you aren't careful with how you provision it, you could unknowingly transmit PHI or PII to Amazon, and they have enough data already. I believe earlier this year Amazon came out with a HIPAA-compliant version of Amazon Alexa, and there are six pilot sites already using it. Well, thank you very much, everyone, and thanks for coming.
