Author | Dr. Min Sun, Chief AI Scientist, Appier
Artificial intelligence (AI) and machine learning (ML) have moved from the backrooms of computer science into the mainstream. Their impact is being felt in everything from how we shop through to money markets and medical research.
Until recently, the largest models were each trained on a single modality. For instance, GPT-3 was the first model to pass the 100-billion-parameter mark for natural language processing (NLP), and trillion-parameter models, such as Google's Switch Transformer, have since followed. These models can be used to write articles, analyze text, perform translations and even create poetry.
In parallel, models used for image recognition and generation have improved dramatically as they, too, are trained on ever larger datasets. What is now emerging is the power of combining two or more of these large models without modifying them, which keeps the cost of composition affordable. That allows us, for example, to use AI to interpret a piece of text and generate a completely new image from it.
We are also seeing how the architecture of one model can be adapted to solve problems across different domains. The most powerful example is how the architecture that powers NLP models is being used in biomedical research. The biomedical domain is full of code sequences, such as DNA or amino-acid chains. Since these sequences can be treated as a type of language with hidden structure, the architecture used in NLP models can potentially be used to understand and generate them as well. One impressive example from early 2021 is biomedical researchers using language-model architectures to predict virus mutations and to understand protein folding – a key challenge in the creation of some of the vaccines now available.
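To make the idea concrete, here is a minimal sketch of treating an amino-acid sequence as if it were text: each residue becomes a token and the sequence is passed through a small Transformer encoder, the same family of architecture behind large NLP models. The vocabulary, layer sizes and the toy sequence below are illustrative assumptions, not a published biomedical model.

```python
# Toy illustration: encode an amino-acid sequence with a Transformer,
# the same architecture family used by large NLP models.
import torch
import torch.nn as nn

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"          # the 20 standard residues
token_to_id = {aa: i for i, aa in enumerate(AMINO_ACIDS)}

def tokenize(sequence: str) -> torch.Tensor:
    """Map each residue character to an integer token id."""
    return torch.tensor([[token_to_id[aa] for aa in sequence]])

class ProteinEncoder(nn.Module):
    """Tiny Transformer encoder over residue tokens (illustrative sizes)."""
    def __init__(self, vocab_size=20, d_model=64, nhead=4, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)

    def forward(self, token_ids):
        # Returns one contextual embedding per residue: (batch, length, d_model)
        return self.encoder(self.embed(token_ids))

tokens = tokenize("MKTAYIAKQR")                # a made-up protein fragment
embeddings = ProteinEncoder()(tokens)
print(embeddings.shape)                        # torch.Size([1, 10, 64])
```

In a real system these per-residue embeddings would feed downstream tasks such as mutation scoring or structure prediction; the point of the sketch is simply that the sequence is handled exactly like a sentence.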
AI in Healthcare and Biomedical Research
Prototypes of messenger RNA (mRNA) COVID-19 vaccines were developed in days, thanks to digital tools for sequencing genetic code and transcription tools for producing mRNA from a genetic sequence. With the help of AI to predict new mutations in the SARS-CoV-2 virus, the process of developing mRNA vaccines will become even faster.
Machine learning and AI don’t replace clinicians and researchers; they allow these professionals to work faster and test hypotheses rapidly. Instead of waiting for cell cultures to grow in the physical world, they can run digital simulations that reveal much sooner what is likely to happen.
AI can also be used as a diagnostic tool. As well as now being approved to read X-rays, AI can listen to the sound of someone coughing and indicate whether the patient is likely to be suffering from COVID-19 or some other illness.
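As a rough illustration of how such a screening model might be wired together (the feature choice, classifier and placeholder data below are assumptions for the sketch, not a clinically validated pipeline), a cough recording can be summarized as spectral features and scored by an ordinary classifier:

```python
# Illustrative sketch only, not a medical device: summarize a cough
# recording as MFCC features and score it with a binary classifier.
import numpy as np
import librosa
from sklearn.linear_model import LogisticRegression

def cough_features(audio: np.ndarray, sample_rate: int) -> np.ndarray:
    """Summarize a waveform as the mean of its MFCC coefficients."""
    mfcc = librosa.feature.mfcc(y=audio, sr=sample_rate, n_mfcc=20)
    return mfcc.mean(axis=1)                  # one 20-dim vector per recording

rng = np.random.default_rng(0)

# In practice: audio, sr = librosa.load("cough.wav", sr=16000) on a real
# recording, and a model trained on labeled coughs. Placeholders keep the
# sketch self-contained and runnable.
audio, sr = rng.normal(size=16000).astype(np.float32), 16000
X_train = rng.normal(size=(100, 20))
y_train = rng.integers(0, 2, size=100)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
probability = model.predict_proba([cough_features(audio, sr)])[0, 1]
print(f"screening score: {probability:.2f}")
```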
As more and more people wear devices that can monitor heart rate, body temperature, blood pressure and other critical factors, the data can be used to give doctors greater insight into a patient’s condition. It also aids accuracy when making diagnoses as doctors and other clinicians are no longer reliant on patient recollections.
The E-Commerce Boom Is AI-Driven
Over the last year, online commerce has grown significantly and is expected to continue to increase. COVID-19 restrictions have resulted in people spending much more time online – not just shopping but in online meetings, playing games, accessing social media and using apps. The growing digital journeys undertaken by people have generated more data that can be used to understand human behavior.
However, more data also brings greater complexity. In the past, if a brand wanted to reach the widest possible audience, it would pay for a TV or radio ad. Today, there is no single most effective channel for reaching customers. Reaching the right customer on the right channel at the right time is too complicated for humans to manage alone, but that complexity can be overcome with AI.
We can expect to see AI used more and more to generate insight, not only finding the customers marketers are looking for but also reaching the often-forgotten long tail of customers. In addition, AI will be used to dynamically generate creative content for those customers, driving higher engagement. It also gives marketers a way to create and test different creative at a pace and scale previously thought impossible.
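One common way to automate that kind of testing is a multi-armed bandit. The sketch below uses a simple epsilon-greedy strategy; the creative names and click rates are made-up assumptions, purely to show how traffic shifts toward the best-performing variant as impressions come in.

```python
# Illustrative epsilon-greedy bandit for choosing among ad creatives.
# Creative names and simulated click rates are made-up assumptions.
import random

CREATIVES = ["headline_a", "headline_b", "headline_c"]
TRUE_CLICK_RATE = {"headline_a": 0.02, "headline_b": 0.05, "headline_c": 0.03}

counts = {c: 0 for c in CREATIVES}       # impressions served per creative
clicks = {c: 0.0 for c in CREATIVES}     # clicks observed per creative
EPSILON = 0.1                            # fraction of traffic used to explore

def choose_creative() -> str:
    """Mostly exploit the best-performing creative, occasionally explore."""
    if random.random() < EPSILON or all(n == 0 for n in counts.values()):
        return random.choice(CREATIVES)
    return max(CREATIVES, key=lambda c: clicks[c] / max(counts[c], 1))

random.seed(42)
for _ in range(10_000):                  # simulate 10,000 impressions
    creative = choose_creative()
    clicked = random.random() < TRUE_CLICK_RATE[creative]
    counts[creative] += 1
    clicks[creative] += clicked

for c in CREATIVES:
    print(c, counts[c], round(clicks[c] / max(counts[c], 1), 4))
```

Running the simulation shows most impressions flowing to the variant with the highest observed click rate, while the exploration budget keeps testing the alternatives.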
Data-Driven Finance Relies on AI
To date, the main application of AI in finance has been high-frequency trading, where transactions are conducted between machines faster than any person could act. This will continue in both traditional finance and in the world of cryptocurrencies, where we see different AIs engage in ‘warfare’. Investors have also been using AI to make long-term predictions, which requires systems that can understand investors’ long-term targets. These have typically centered on measures such as revenues, incomes and profits. However, that has proven more challenging with cryptocurrencies.
While high-frequency trading strategies are important, there is another factor that is far more challenging to predict. Much of what we see in cryptocurrency markets is driven by ‘human madness’. While AI models struggle with this today, we can expect the AI models of the future to evolve and do a better job of predicting this behavior through closely monitoring trends in media and social networks.
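One simple starting point for capturing that behavior (a toy sketch; the lexicon, the posts and the scoring rule are illustrative assumptions) is to turn a stream of social-media posts into a daily sentiment signal that a forecasting model could consume alongside price data:

```python
# Toy sketch of a daily social-sentiment signal for a crypto asset.
# The lexicon, the posts and the scoring are made-up assumptions.
import re
from collections import defaultdict

POSITIVE = {"moon", "bullish", "buy", "up"}
NEGATIVE = {"crash", "bearish", "sell", "down"}

posts = [
    ("2021-04-01", "BTC to the moon, very bullish"),
    ("2021-04-01", "thinking of selling, feels like a crash"),
    ("2021-04-02", "buy the dip, going up"),
]

def score(text: str) -> int:
    """Count positive minus negative lexicon hits in a post."""
    words = set(re.findall(r"[a-z]+", text.lower()))
    return len(words & POSITIVE) - len(words & NEGATIVE)

daily_scores = defaultdict(list)
for day, text in posts:
    daily_scores[day].append(score(text))

# Average per day; a forecasting model could use this alongside prices.
signal = {day: sum(s) / len(s) for day, s in daily_scores.items()}
print(signal)   # {'2021-04-01': 0.5, '2021-04-02': 2.0}
```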
The Future of Education
Curricula and textbooks have typically been developed to serve large populations of ‘average’ students. These materials include content designed for a wide gamut of different abilities. However, experts, such as Sir Ken Robinson, point out that the ‘conveyor belt’ model of education doesn’t take into account the individual abilities and needs of students.
Therefore, we see AI being used to revolutionize the way curricula are created and delivered. It can be used to provide a more personalized curriculum or personalized problem sets for students. Instead of every student working through the same set of problems or questions, each receives a set customized to their specific level.
For example, an elementary school student may be very strong with fractions in mathematics but struggle with trigonometry. Instead of putting the student through the standard curriculum, he or she would spend less time on fractions and more time on trigonometry. As a student proceeds through a course, AI will monitor their progress and adapt the material to that student's specific needs.
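A minimal sketch of how such adaptation could be driven follows; the topics, mastery estimates and update rule are purely illustrative assumptions, not a real curriculum engine. The idea is simply to track an estimated mastery per topic, assign the next problems from the weakest topic, and update the estimate from the student's results.

```python
# Toy adaptive problem selection; topics, scores and the update rule are
# illustrative assumptions, not a real curriculum engine.
mastery = {"fractions": 0.9, "trigonometry": 0.4, "geometry": 0.7}
LEARNING_RATE = 0.3   # how strongly a new result moves the estimate

def next_topic() -> str:
    """Pick the topic the student is currently weakest in."""
    return min(mastery, key=mastery.get)

def record_result(topic: str, fraction_correct: float) -> None:
    """Move the mastery estimate toward the latest observed score."""
    mastery[topic] += LEARNING_RATE * (fraction_correct - mastery[topic])

for session in range(3):
    topic = next_topic()
    print(f"session {session}: assign extra problems on {topic}")
    record_result(topic, fraction_correct=0.8)   # say the student scores 80%

print(mastery)
```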
With so much content now available online, cheating and plagiarism have become a huge issue. While detecting direct plagiarism is quite easy – there is already AI that can detect verbatim copying and similar text where just a few words or the tense have been altered – there are other challenges. For example, a student may take content in one language and translate it into another. This is harder to detect, but AI is being developed to solve this problem.
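One plausible approach, sketched here with an assumed multilingual sentence-embedding model (the model name and example texts are illustrative, not a specific product), is to embed both passages in a shared multilingual vector space and flag suspiciously high similarity even when the languages differ:

```python
# Sketch of cross-lingual similarity checking with sentence embeddings.
# The model name and the example texts are assumptions for illustration.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")

original = "The industrial revolution transformed how goods were produced."
submission = "La revolución industrial transformó la forma de producir bienes."

embeddings = model.encode([original, submission])
similarity = util.cos_sim(embeddings[0], embeddings[1]).item()

# A high score across languages suggests the passage may be a translation
# of existing text and is worth a closer human review.
print(f"cross-lingual similarity: {similarity:.2f}")
```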
Similarly, image-interpretation AI is being developed to find instances where art students copy or imitate a design.
Smart Farming and Factories
Factories and farms are using data in innovative ways too. However, they differ from many other AI applications as they don’t focus on end-users. Instead, they focus on products, produce and machines. This requires an investment in sensors, robots and automation, and the optimization of operations.
The biggest development we are seeing in this area is the generalization of findings across different areas. For example, if AI is being used to increase yields in an apple crop, can those models be reapplied to other fruits, such as bananas or peaches?
Similarly, if a factory is manufacturing LCD panels and has found ways to increase their yield rates, can those tools and lessons be applied to other manufacturing processes and factories?
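That kind of reuse is essentially transfer learning. As a sketch under stated assumptions (a generic pretrained image backbone from a recent torchvision release and a two-class setup chosen purely for illustration), a network trained on one crop's imagery can be adapted by freezing its learned features and retraining only a small new head for the new crop or production line:

```python
# Illustrative transfer-learning sketch: reuse a pretrained image backbone
# and retrain only a new classification head for a different crop.
# The backbone choice and the two-class setup are assumptions.
import torch.nn as nn
from torchvision import models

backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

for param in backbone.parameters():      # freeze the general-purpose features
    param.requires_grad = False

# Replace the final layer, e.g. "healthy" vs "diseased" fruit for the new crop.
backbone.fc = nn.Linear(backbone.fc.in_features, 2)

trainable = [name for name, p in backbone.named_parameters() if p.requires_grad]
print(trainable)                         # only the new head: ['fc.weight', 'fc.bias']
```

Only the small new head needs data from the new domain, which is what makes this kind of generalization attractive for farms and factories with limited labeled examples.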
Perhaps the biggest prediction we can make about AI in 2021 and beyond can be summarized in one word: leverage. Reusing existing model architectures, combining well-developed models and finding ways to generalize existing models to new applications will keep increasing the impact of AI as digital transformation accelerates across many domains.
* This article was originally published on ITProPortal.