Artificial Intelligence has developed rapidly in the last five years. Today, AI is no longer limited to tech companies: statisticians in pharmaceutical companies also apply AI methods to improve diagnostic processes and drug discovery.
Statisticians have explored the benefits of applying machine learning methods to big-data problems in observational studies. It is unlikely that a clinical trial can achieve a huge sample size, because only a limited number of patients are willing to participate in research.
In addition, treatments in observational studies are not assigned at random, so the decision of which patient receives which treatment is confounded with patient characteristics. If patients with severe symptoms tend to receive the drug while patients with mild symptoms receive only light treatment, then even though the drug is indeed effective, the observational study might tell us a different story: that the drug is not working at all.
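This kind of confounding by indication is easy to demonstrate with a toy simulation. The sketch below is a hypothetical example (all numbers are made up for illustration): disease severity drives both the chance of receiving the drug and the outcome, and even though the drug truly helps, a naive comparison of treated versus untreated patients makes it look harmful.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical simulation: severity drives both treatment and outcome.
severity = rng.uniform(0, 1, n)             # latent disease severity
treated = rng.uniform(0, 1, n) < severity   # sicker patients get the drug more often
# Outcome score (lower is better): severity hurts (+2.0), drug truly helps (-0.5).
outcome = 2.0 * severity - 0.5 * treated + rng.normal(0, 0.1, n)

# Naive comparison ignores severity, so the beneficial drug appears harmful.
naive_effect = outcome[treated].mean() - outcome[~treated].mean()
print(f"true effect: -0.5, naive estimate: {naive_effect:.2f}")
```

The naive estimate comes out positive (the treated group looks worse) even though the true treatment effect is negative, because the treated group was sicker to begin with. Adjusting for severity (e.g., by regression or matching) would recover the true effect.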
Another big problem with big data is that analysts are likely to find many statistically significant features, yet few of them are actually important. For example, if a data scientist wants to predict the amount of crime in a city, he or she might find that the number of refrigerators is a significant feature. But are refrigerators really linked to the number of crimes? No: population is the actual underlying factor driving both. Spurious correlations of this kind are a persistent problem in big data.
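The refrigerator example can be made concrete with another small simulation. This is a hypothetical sketch (the coefficients are invented for illustration): population drives both the number of refrigerators and the number of crimes, so the two are strongly correlated on the surface, but the association vanishes once population is controlled for.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Hypothetical city data: population drives both quantities.
population = rng.uniform(10_000, 1_000_000, n)
refrigerators = 0.3 * population + rng.normal(0, 5_000, n)
crimes = 0.01 * population + rng.normal(0, 500, n)

# Raw correlation looks impressive but is entirely spurious.
raw_corr = np.corrcoef(refrigerators, crimes)[0, 1]

# Partial correlation: residualize both variables on population, then correlate.
def residuals(y, x):
    slope, intercept = np.polyfit(x, y, 1)
    return y - (slope * x + intercept)

partial_corr = np.corrcoef(residuals(refrigerators, population),
                           residuals(crimes, population))[0, 1]
print(f"raw r = {raw_corr:.2f}, partial r = {partial_corr:.2f}")
```

The raw correlation is close to 1, while the partial correlation after removing the effect of population is close to 0, showing that refrigerators carry no information about crime beyond what population already explains.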
One could imagine that, in the future, patients will not need to visit a hospital for the diagnosis of basic illnesses; they will simply rest in their own rooms and video chat with doctors at the hospital. A doctor could diagnose a patient based on indicators measured by AI through remotely controlled health instruments. But before that, we still have a long way to go.