
10 Bits: the Data News Hotlist

by Cassidy Chansirik

Neighborhood in New York City

This week’s list of data news highlights covers February 20, 2021 – February 26, 2021 and includes articles about assessing environmental conditions and aiding mental health diagnoses with machine learning. 

1. Speeding Up Patent Classifications

Engineers at the U.S. Patent and Trademark Office are developing algorithms to speed up how the agency classifies, searches, and checks the quality of patents. The algorithms currently only classify patent applications into the agency's art units, and their accuracy is measured against the evaluations of human examiners. The engineers will use this feedback loop of examiner evaluations to refine the algorithms and improve patent and trademark workflows.
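
As a rough illustration of this classify-then-compare loop, the sketch below trains a text classifier on made-up patent abstracts labeled with examiner-assigned art units. The abstracts, art-unit numbers, and use of scikit-learn are all assumptions for illustration, not details of the USPTO's system.

```python
# Rough sketch of a classify-then-compare loop on made-up patent abstracts.
# The abstracts, art-unit labels, and model choice are illustrative assumptions,
# not details of the USPTO's actual algorithms.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical abstracts, each labeled with the art unit an examiner assigned.
abstracts = [
    "a neural network for recognizing objects in images",
    "a battery electrode material with improved capacity",
    "a convolutional model trained for speech recognition",
    "an electrolyte composition for lithium-ion cells",
]
examiner_art_units = ["2120", "1720", "2120", "1720"]  # hypothetical labels

vectorizer = TfidfVectorizer()
classifier = LogisticRegression(max_iter=1000)
classifier.fit(vectorizer.fit_transform(abstracts), examiner_art_units)

# Route a new application, then compare against the examiner's eventual decision;
# disagreements would feed back into the next round of training.
prediction = classifier.predict(
    vectorizer.transform(["a transformer model for image captioning"])
)
print(prediction)  # e.g. ['2120']
```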

2. Identifying Prescription Patterns for Opioid Medications

Clinicians at Lifespan Health System in Rhode Island have developed machine learning models that use electronic health records to identify physicians who prescribe opioid medications at higher rates than average. Each day, the models generate a scatterplot of all clinicians' prescribing practices in the system, positioning each clinician on the graph. Pharmacists use this information to suggest to outlier clinicians prescribing practices that follow recommendations from the Centers for Disease Control and Prevention. Since the system was implemented in 2019, average morphine-equivalent daily doses have fallen by 14.4 percent.
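
The outlier comparison at the heart of such a system can be illustrated with a minimal sketch: hypothetical clinicians with made-up average morphine-equivalent daily doses are flagged when they sit well above the system-wide average. The data, threshold, and z-score rule below are invented for illustration and are not Lifespan's actual method.

```python
# Minimal sketch of flagging outlier prescribers from made-up data; the doses,
# threshold, and z-score rule are illustrative, not Lifespan's actual method.
from statistics import mean, stdev

# Each entry: (clinician, average morphine-equivalent daily dose prescribed)
daily_doses = [
    ("clinician_a", 22.0), ("clinician_b", 30.5), ("clinician_c", 95.0),
    ("clinician_d", 27.8), ("clinician_e", 31.2), ("clinician_f", 24.9),
]

values = [dose for _, dose in daily_doses]
average, spread = mean(values), stdev(values)

# Flag clinicians whose average dose sits well above the system-wide average.
outliers = [
    (clinician, dose) for clinician, dose in daily_doses
    if (dose - average) / spread > 2.0
]
print(outliers)  # e.g. [('clinician_c', 95.0)]
```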

3. Informing Populations about COVID-19 Vaccine Distribution

Researchers at Johns Hopkins University have created a centralized data dashboard to help people with disabilities find out when they are eligible for COVID-19 vaccinations. The dashboard provides information about vaccines for people who fall into one of four categories: those in long-term care settings such as senior-living facilities, those in group homes, those with chronic conditions, and those with intellectual or developmental disabilities. The dashboard visualizes how states are prioritizing these groups and is accessible to policymakers looking to improve their state's current plans.

4. Assessing Environmental Conditions within Cities

Researchers at the University of North Carolina at Chapel Hill have developed a tool to assess environmental conditions in individual neighborhoods using satellite data and demographic information. The team applied the tool to 164 cities and found that environmental hazards, such as air pollution and dangerously high summer temperatures, disproportionately affect lower-income neighborhoods, where there tend to be fewer dedicated green spaces with trees and plants that can absorb carbon emissions and offer shade to lower temperatures.
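
A minimal sketch of the kind of comparison such a tool enables appears below, joining neighborhood-level environmental readings with income data. Every value and field name is made up for illustration and does not come from the UNC tool.

```python
# Minimal sketch of comparing environmental burdens across income groups,
# using made-up neighborhood-level values (not data from the UNC tool).
neighborhoods = [
    {"name": "A", "median_income": 32000, "pm25": 14.1, "summer_temp_c": 34.2},
    {"name": "B", "median_income": 41000, "pm25": 12.8, "summer_temp_c": 33.5},
    {"name": "C", "median_income": 88000, "pm25": 8.9,  "summer_temp_c": 30.1},
    {"name": "D", "median_income": 97000, "pm25": 9.3,  "summer_temp_c": 29.8},
]

def summarize(group):
    # Average pollution and heat exposure for a group of neighborhoods.
    n = len(group)
    return {
        "pm25": sum(h["pm25"] for h in group) / n,
        "summer_temp_c": sum(h["summer_temp_c"] for h in group) / n,
    }

lower = [h for h in neighborhoods if h["median_income"] < 50000]
higher = [h for h in neighborhoods if h["median_income"] >= 50000]
print("lower-income:", summarize(lower))
print("higher-income:", summarize(higher))
```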

5. Predicting Earthquake Hazards Using Region-Specific Data

Computer scientists at the Southern California Earthquake Center have found a way to understand earthquake hazards without relying solely on empirical data. Traditional modeling methods use observed data from earthquakes in any region to create statistical models about potential future earthquakes, but because they are not location specific, they can lead to inaccurate estimates about the hazard in a specific area. Instead, the team is using a new method that relies on both region-specific data and general data. Researchers believe this model will be particularly useful in countries like New Zealand that have geological features surrounding faults that are unaccounted for by empirical data.

6. Using AI to Discover Different Compounds for Treating Fibrosis

Insilico Medicine, a biotechnology company in Hong Kong, has used AI to parse medical literature and find a new drug to treat idiopathic pulmonary fibrosis (IPF), a rare condition in which the lungs become scarred, making breathing difficult. Scientists first examined a database of proteins to identify those involved in the scarring process and then identified the genes that regulate these processes. They then applied AI to research papers, clinical trial data, and patent and grant applications to see if the identified genes had been implicated in processes related to IPF.

7. Monitoring Street-level Floods

Researchers from New York University, the Science and Resilience Institute in New York, and the City University of New York are using real-time flood sensors and communication networks to create a data portal on local flooding patterns. The sensors detect the location and depth of flooding, and have already been installed in the Gowanus neighborhood in Brooklyn and Hamilton Beach in Jamaica Bay, which often experience flooding from the overflow of storm water sewers. The researchers hope that the data portal can be integrated into larger flood monitoring programs so that residents can receive timely information without having to call New York City’s flood watch program. 

8. Aiding Mental Health Diagnoses with Machine Learning

Researchers at the University of Birmingham in England have used machine learning to better identify patients with a mix of psychotic and depressive symptoms. Historically, clinicians diagnose patients with a primary illness of either psychosis or depression, which causes them to prescribe treatments that only address the symptoms of one but not the other. To help clinicians understand the characteristics of patients with mixed symptoms, researchers trained a model to build a disease profile of patients with symptoms of both illnesses using imaging data and questionnaire responses from 300 patients. Clinicians used these profiles to compare how accurate their diagnoses were, and found that on average, patients with mixed symptoms were more likely to be prescribed treatments for depression than psychosis.

9. Outperforming Human Players in Video Games

Researchers at Uber AI Labs have developed a set of reinforcement learning algorithms that are better at playing classic video games than human players or other AI systems. Unlike previous algorithms, which lost when they encountered situations they were not trained on, the new algorithms took screen grabs as they played and, by referring back to previous grabs, tried a different approach whenever they found themselves losing. The researchers tested the algorithms on 55 Atari games, a standard benchmark for AI systems, where they beat typical AI systems 85.5 percent of the time.
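
The "save promising states and return to them" idea can be sketched in a few lines. The toy one-dimensional environment, scoring, and archive below are invented stand-ins for illustration and are not Uber AI Labs' implementation.

```python
# Toy sketch of the "remember good states and return to them" idea described
# above; this is not Uber AI Labs' code, and the environment is an invented
# one-dimensional stand-in for a game.
import random

def step(position, action):
    # Hypothetical game dynamics: actions move the player left or right, and
    # the score is the player's current position (farther right is better).
    new_position = position + action
    return new_position, new_position

archive = {0: 0}  # snapshot of a visited state -> best score seen there

for _ in range(200):
    # Return to the most promising saved snapshot instead of starting over.
    position = max(archive, key=archive.get)
    for _ in range(10):  # explore from that snapshot with random actions
        position, score = step(position, random.choice([-1, 1]))
        if score > archive.get(position, float("-inf")):
            archive[position] = score  # save the new, more promising snapshot

print("best score reached:", max(archive.values()))
```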

10. Gig Workers Gathering Data to Ensure Pay Accuracy

Drivers and bikers for Uber, Lyft, and other firms in the gig economy have gathered their own data on the trips they make to help workers spot pay discrepancies. Using tools like UberCheats, a Google Chrome extension that automatically calculates the shortest distance a delivery trip should cover, workers can check whether that distance matches what the app pays them for. Workers are also pooling their data from several apps on tools such as Driver's Seat Cooperative to help them decide when it is most profitable to sign onto a particular app. The city of San Francisco has also bought their data in an effort to learn more about gig work operations.
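
The kind of check a tool like UberCheats performs can be sketched by comparing the distance a worker was paid for against an independently computed distance between pickup and drop-off. The straight-line haversine distance, coordinates, paid distance, and tolerance below are illustrative assumptions, not the extension's actual method.

```python
# Minimal sketch of a pay-accuracy check: compare the distance a worker was
# paid for with an independently computed pickup-to-dropoff distance. The
# straight-line (haversine) distance stands in for a real routing calculation,
# and the coordinates and paid distance are made up.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two points, in kilometers.
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

pickup = (40.6782, -73.9442)   # hypothetical pickup in Brooklyn
dropoff = (40.7128, -74.0060)  # hypothetical drop-off in Manhattan
paid_km = 3.1                  # distance the app reported paying for

computed_km = haversine_km(*pickup, *dropoff)
if computed_km > paid_km * 1.1:  # allow 10 percent tolerance before flagging
    print(f"possible underpayment: paid for {paid_km} km, route is at least {computed_km:.1f} km")
```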

Image credit: Andreas M
