
10 Bits: the Data News Hotlist

by Cassidy Chansirik

This week’s list of data news highlights covers January 9, 2021 – January 15, 2021 and includes articles about training an AI model with a trillion parameters and using AI to diagnose opioid use disorder earlier. 

1. Training a Trillion-Parameter AI Model

Google researchers have developed a technique that has enabled them to train an AI language model with more than a trillion parameters, making it one of the largest language models to date. The researchers used a technique called Switch Transformers, which maintains a pool of expert models and uses a routing algorithm to choose which expert to apply to a given input, so only a fraction of the model's parameters are active at any one time. This lets the model scale to trillions of parameters while training faster. When the researchers trained the language model to translate between more than 100 different languages, translations across 91 of the languages were four times faster than with Google's previous language model.
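The routing idea behind this approach can be sketched in a few lines: a lightweight router scores each expert for a given input, and only the single top-scoring expert runs, so most of the model's parameters sit idle per input. The toy sketch below is illustrative only, assuming tiny linear "experts" rather than the transformer feed-forward blocks used in practice; all names and sizes are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: each "expert" is a small linear map. In a real Switch
# Transformer the experts are feed-forward blocks inside a transformer
# layer and routing happens per token.
n_experts, d_model = 4, 8
experts = [rng.normal(size=(d_model, d_model)) for _ in range(n_experts)]
router_w = rng.normal(size=(d_model, n_experts))

def switch_layer(x):
    """Route each input row to its single top-scoring expert (top-1 routing)."""
    logits = x @ router_w                       # (batch, n_experts)
    probs = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs /= probs.sum(axis=1, keepdims=True)   # softmax gate
    choice = probs.argmax(axis=1)               # one expert per input
    out = np.empty_like(x)
    for i, e in enumerate(choice):
        # Only one expert's weights are touched per input, which is why
        # total parameter count can grow without growing per-input compute.
        out[i] = probs[i, e] * (x[i] @ experts[e])
    return out, choice

x = rng.normal(size=(5, d_model))
y, chosen = switch_layer(x)
print(y.shape, chosen)
```

Because each input touches only one expert's weights, adding experts grows the parameter count without increasing the computation done per input, which is the property the researchers exploit to reach a trillion parameters.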

2. Determining the Chemical Makeup of Drugs

Scientists from the University of Victoria in Canada are using supercomputers to test drugs for their chemical agent fingerprints and using AI to better match these fingerprints to a database of known compounds. The researchers hope to enable pharmacists to quickly verify what is in the drugs they issue, protecting patients from bad batches and preventing some of the more than 70,000 U.S. deaths that occur annually from drug overdoses.

3. Creating a Cancer Database to Improve Precision Medicine Therapies

Researchers from Johns Hopkins University and 18 other organizations across the United States and Poland have created a comprehensive database of head and neck cancers to improve the development of precision medicine therapies. To create the database, researchers analyzed tumors from 108 patients. They found that across three different subtypes of tumors there was widespread deletion of genes, leading to a loss of the ability to mount an immune response. With the database, clinicians can explore different treatment options.

4. Capturing Vital Signs from Phone Cameras

Developers from the digital healthcare startup Binah.ai have created an app that uses computer vision and phone cameras to capture a range of vital signs, such as heart rate, oxygen saturation, respiration rate, heart rate variability, and mental stress. The app monitors vital signs with an imaging technique that measures changes in the red, green, and blue light reflected from the skin, then analyzes the results with computer vision. When tested against medical-grade equipment, the app achieved a 95 percent accuracy rate.
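The core signal-processing idea here is that the pulse produces tiny periodic changes in the light reflected from the skin, which a camera can pick up. The sketch below is a simplified illustration of that idea only, not Binah.ai's method: it fabricates ten seconds of "green channel" means with a 72 bpm pulse and recovers the rate with an FFT.

```python
import numpy as np

# Simulate 10 seconds of 30 fps per-frame green-channel means from a
# face video, with a faint 72 bpm pulse plus sensor noise. (Synthetic
# stand-in for real camera data.)
fps, seconds, bpm_true = 30, 10, 72
t = np.arange(fps * seconds) / fps
green = 0.5 + 0.01 * np.sin(2 * np.pi * (bpm_true / 60) * t)
green += 0.002 * np.random.default_rng(1).normal(size=t.size)

def estimate_bpm(signal, fps):
    """Estimate heart rate from a color-channel time series via FFT."""
    sig = signal - signal.mean()
    freqs = np.fft.rfftfreq(sig.size, d=1 / fps)
    power = np.abs(np.fft.rfft(sig)) ** 2
    # Restrict the search to a plausible heart-rate band (40-180 bpm).
    band = (freqs >= 40 / 60) & (freqs <= 180 / 60)
    peak = freqs[band][power[band].argmax()]
    return peak * 60

print(round(estimate_bpm(green, fps)))  # close to 72
```

A production app has to handle motion, lighting changes, and skin-tone variation, which is where the computer vision side of the pipeline comes in; the FFT step above only shows why a periodic color change encodes heart rate at all.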

5. Using Facial Recognition to Identify People Wearing Masks

NEC Corporation, a Japanese information technology company, has developed a facial recognition system that can identify people even when they are wearing masks. The system verifies people's identities by homing in on uncovered facial features captured on camera, such as the eyes, and comparing the images to existing photos. In tests, the system completed verifications in under one second and achieved an accuracy rate of 99 percent.

6. Open-Sourcing the AI Technology of Amazon’s Alexa

Amazon has open-sourced the technology that underpins its Alexa digital assistant, allowing companies that need voice assistant technology, such as automakers, to build their own custom products. This offering, called the Alexa Custom Assistant, lets companies create customized voice assistants that respond to wake words and commands unique to the features of their products. Currently, Amazon is offering the Alexa Custom Assistant to automobile companies in 14 countries to develop voice assistants that control software inside vehicles.

7. Designing an Early Warning System for Infectious Disease Surveillance

Researchers from the Columbia Mailman School of Public Health in New York have developed a model to identify gaps in the surveillance of infectious respiratory diseases. Not every U.S. county collects and shares data on influenza, but researchers can now forecast the spread of influenza in these locations by combining data streams from multiple other locations with mobility data. The researchers validated their model using data from 35 states that reported influenza during 9 seasons and coronavirus during 4 seasons, and found that their model created more accurate forecasts and prioritized locations with large populations.

8. Detecting Breast Cancer Earlier Than Radiologists

Researchers from DeepHealth, a machine learning software company that focuses on medical solutions, have developed a deep learning model that in some cases can detect breast cancer one year earlier than standard clinical models used by radiologists. Researchers trained their model to identify breast cancer using images from screening mammograms, and tested the model on tasks that increased in difficulty to imitate how humans typically learn. When the team compared the performance of the model to five radiologists reading the same screening mammograms, the model outperformed all five radiologists with an average increase in sensitivity of 14 percent. 

9. Diagnosing Opioid Use Disorder Earlier

Researchers from Ariel University in Israel have developed a machine learning algorithm that identifies predictors of opioid use disorder (OUD) that clinicians can use for earlier diagnosis. The researchers built the algorithm to identify OUD based on 436 predictor variables found in 20 million patient healthcare claims. They found that patients with OUD had significantly more annual opioid prescriptions, more days of opioid treatment, and longer consecutive prescription periods. The model also identified hypertension, hyperlipidemia, the number of hypertensive crisis events, and age as significant predictors of OUD.
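The basic shape of such a model is a classifier that weighs claims-derived predictor variables and ranks them by influence. The sketch below is a generic stand-in, not the Ariel University model: it uses three synthetic features in place of the study's 436 real predictors, with assumed effect sizes, and fits a plain logistic regression to show how feature weights surface the strongest predictors.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic stand-ins for claims-derived predictors (illustrative only):
# [annual opioid prescriptions, days of opioid treatment, age]
n = 2000
X = rng.normal(size=(n, 3))
true_w = np.array([1.5, 1.0, 0.5])           # assumed effect sizes
p = 1 / (1 + np.exp(-(X @ true_w - 0.5)))
y = (rng.random(n) < p).astype(float)        # 1 = OUD diagnosis

def fit_logistic(X, y, lr=0.1, steps=500):
    """Plain gradient-descent logistic regression."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        pred = 1 / (1 + np.exp(-(X @ w)))
        w -= lr * X.T @ (pred - y) / len(y)
    return w

w = fit_logistic(X, y)
# Larger learned weights flag stronger predictors of the outcome.
ranking = np.argsort(-np.abs(w))
print(w.round(2), ranking)
```

In the study itself the ranking step is what surfaces variables like prescription counts and hypertension as significant predictors; any real clinical model would also need calibration and validation far beyond this sketch.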

10. Analyzing Data for Pharmaceuticals Manufacturing with Machine Learning

Researchers from the Massachusetts Institute of Technology have developed a machine learning approach that predicts the quality of biopharmaceutical products. Measuring the quality of these products is often costly because some attributes, such as microbiological or chemical quality, cannot be measured directly or can only be measured during production. To decrease costs, the researchers use machine learning to predict which type of quality control would most accurately measure the physical, chemical, or biological quality of a specific product, and then to measure the impact that processing changes will have on the product.

Image credit: Michael Longmire
