The Novel Coronavirus outbreak has caused plenty of problems for the tech industry, but it has also pushed companies to use technology to help those in need. We have seen companies help out in different ways, like Razer's decision to produce surgical masks or Apple's decision to suspend interest charges for Apple Card users. Last week, Bing introduced a coronavirus tracker for users looking for information on the internet.

Now, Microsoft has announced that it worked with the Centers for Disease Control and Prevention (CDC) to create a self-check chatbot. The chatbot asks a series of questions about travel history, symptoms, and pre-existing conditions, then uses the answers to determine whether the person could have the infection. The bot makes a final recommendation based on the symptoms: if the person shows mild symptoms like a cough and fever, it simply advises rest and points to the information provided by the CDC. If the bot detects an emergency, it advises the person to call 911 for immediate assistance.
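The triage flow described above can be sketched roughly as follows. This is purely illustrative: the actual logic used by the CDC and Microsoft's Healthcare Bot service is not public, so the rules, symptom names, and recommendation strings below are all hypothetical.

```python
# Hypothetical sketch of the self-check triage flow; the real bot's
# rules are not public, so these thresholds and messages are invented.

def triage(symptoms, has_emergency_signs):
    """Return an illustrative recommendation from reported symptoms."""
    if has_emergency_signs:
        # Emergency path: the bot escalates rather than advising self-care.
        return "Call 911 for immediate assistance."
    mild_symptoms = {"cough", "fever"}
    if set(symptoms) <= mild_symptoms:
        # Only mild symptoms reported: advise rest and CDC guidance.
        return "Rest at home and review the CDC's self-care guidance."
    # Anything beyond mild symptoms: suggest contacting a provider.
    return "Contact a healthcare provider for further evaluation."

print(triage(["cough", "fever"], False))
```

The key design point is that the bot never diagnoses: every branch ends in a recommendation about what kind of care to seek, not a medical conclusion.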

Public health organizations, hospitals and others on the front lines of the COVID-19 response need to be able to respond to inquiries, provide the public with up to date outbreak information, track exposure, quickly triage new cases and guide next steps. Microsoft’s Healthcare Bot service is one solution that uses artificial intelligence (AI) to help the CDC and other frontline organizations respond to these inquiries, freeing up doctors, nurses, administrators and other healthcare professionals to provide critical care to those who need it.

– Microsoft

The CDC said that the bot is not meant to provide medical assistance or treatment but to help you decide whether to seek appropriate medical care. The bot was developed using Microsoft's Healthcare Bot service and runs on Microsoft Azure to process requests.

If you're experiencing symptoms, you can give the bot a try on the CDC's website. If your symptoms are severe, you should call 911 immediately rather than relying on the chatbot.
