
10 Bits: the Data News Hotlist

by Cassidy Chansirik

This week’s list of data news highlights covers March 27, 2021 – April 2, 2021 and includes articles about fighting illegal deforestation in the Amazon and speeding up the drug review process with AI. 

1. Developing Underwater Wi-Fi

Researchers from King Abdullah University of Science and Technology (KAUST) in Saudi Arabia have developed Aqua-Fi, an underwater system that can transmit Wi-Fi signals using lasers and semiconductor light sources. Aqua-Fi uses radio waves to send data from a diver’s smartphone to a single-board computer, which then uses a semiconductor light source to beam the data to a receiver connected to a computer on land. To test the system, the researchers transferred media between computers a few meters apart and achieved maximum speeds of 2 megabytes per second, with an average delay of only 1 millisecond per transmission.

2. Fighting Illegal Deforestation in the Amazon

Rainforest Foundation US, a nonprofit focused on protecting environmental rights to land in Central and South America, is working with indigenous communities in the Amazon to use data to fight illegal logging. Using satellite data, teams within indigenous communities can collect, monitor, and analyze photo and video evidence of illegal deforestation. Rainforest Foundation US has also integrated the data into a smartphone app that alerts the teams to suspicious activity, making it easier for them to pursue charges with law enforcement.

3. Detecting Hypoglycemia from Electrocardiograms

Researchers from the University of Warwick in England, Western University in Canada, and the University of Naples in Italy have used machine learning algorithms to detect hypoglycemia, a condition of low blood glucose, from electrocardiograms (ECGs). The researchers trained the algorithms on ECG datasets to recognize a hypoglycemic event from the sequence and pattern of heartbeats. They hope that by recognizing such events, machine learning algorithms could eventually be trained to predict low blood glucose levels hours in advance.

4. Detecting Tsunamis from Sensors on Cargo Ships

Researchers at the University of Colorado Boulder have shown how GPS sensors aboard cargo ships could help detect tsunamis before they reach shore. When an underwater earthquake occurs, it raises the surface of the sea, changing the height and pattern of waves. Using real-world coordinates, the researchers simulated ships sailing through tsunami waters under varying sea-surface elevations and wave velocities, and found that in areas of high ship density, vessels spaced as little as 12 miles apart could provide enough accurate data to forecast a tsunami 15 minutes before onset.
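The core detection idea can be illustrated with a toy sketch (hypothetical numbers and thresholds, not the researchers' model): compare each ship's GPS height reading against its recent baseline and flag sustained deviations that exceed normal noise.

```python
import statistics

def detect_anomaly(heights, window=10, sigmas=4.0):
    """Flag indices where a GPS height reading deviates from the
    trailing-window baseline by more than `sigmas` standard deviations."""
    flags = []
    for i in range(window, len(heights)):
        baseline = heights[i - window:i]
        mu = statistics.mean(baseline)
        sd = statistics.stdev(baseline)
        if sd > 0 and abs(heights[i] - mu) > sigmas * sd:
            flags.append(i)
    return flags

# Simulated series: centimetre-scale GPS noise, then a ~0.5 m
# sea-surface rise like the one a passing tsunami wave would induce.
quiet = [0.01 * ((-1) ** i) for i in range(30)]   # +/- 1 cm noise
wave = [0.5] * 5
series = quiet + wave
print(detect_anomaly(series))
```

In a real system the baseline would have to account for swell, tides, and ship motion, which is why the study's dense-network requirement matters: corroborating readings from many nearby vessels separate a tsunami signal from one ship's noise.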

5. Accelerating the Drug Review Process with AI

The U.S. Food and Drug Administration (FDA) is using an AI platform to review adverse drug reports more efficiently. The tool extracts up to 120 fields from the safety reports of clinical drug trials into structured content that reviewers can digest more easily. Previously, reviewers had to manually analyze more than 2 million reports from before and after a drug went to market to find relevant information about adverse effects, which could consume up to 1,000 hours of their time each year. With the platform, reviewers can input fields for a specific drug and get structured information in minutes.
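At its simplest, this kind of tool turns free-text reports into structured fields. A minimal sketch of the idea (the report snippet and field names are hypothetical, not the FDA's platform) might use pattern matching:

```python
import re

# Hypothetical adverse-event report; real safety reports are far richer.
report = """Patient Age: 54
Drug Name: ExampleDrug
Adverse Event: severe headache
Outcome: recovered"""

FIELDS = {
    "age": r"Patient Age:\s*(\d+)",
    "drug": r"Drug Name:\s*(.+)",
    "event": r"Adverse Event:\s*(.+)",
    "outcome": r"Outcome:\s*(.+)",
}

def extract_fields(text):
    """Map each field name to its first regex match in the text (or None)."""
    out = {}
    for name, pattern in FIELDS.items():
        m = re.search(pattern, text)
        out[name] = m.group(1).strip() if m else None
    return out

print(extract_fields(report))
```

The FDA's platform reportedly handles up to 120 such fields per report; a production system would rely on trained language models rather than fixed patterns, but the output, one structured record per report, is what lets reviewers query by drug instead of reading each document.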

6. Using Machine Learning to Discover Underlying Genetic Factors that Cause Cancer

Researchers at Johns Hopkins University have used a machine learning algorithm to analyze how contributors to cancer, such as aging and smoking, affect an individual’s genes. Understanding how different contributors cause genetic mutations is important to cancer research, so the team is using algorithms to determine how lesser-known contributors to cancer affect the body. Their approach found that although 69 percent of the mutations found in cancer patients occur randomly, in certain tissues obesity has genetic effects that contribute to cancer-causing mutations.

7. Finding Errors in Labelled Datasets Used to Test Machine Learning Systems

Computer scientists from the Massachusetts Institute of Technology have used confident learning, a machine learning approach that finds errors and irrelevant data in dataset labels, to examine the accuracy of the ten most-cited datasets used to test machine learning systems. Their study found that 54 percent of the data flagged by the algorithm was labelled incorrectly, with errors such as mislabelled animal species and a Bruce Springsteen recording classified as an orchestra.

8. Growing Crops More Sustainably

Researchers from the National University of Singapore have used computational analysis and a supercomputer to determine how Asian vegetable crops can be grown more sustainably with fewer chemical fertilizers. The team used computational analysis to characterize the vegetables’ genetic material, then used a supercomputer to identify 300 bacterial species in soil that, based on that genetic material, could supply nutrients, stimulate growth, and suppress pathogens for the vegetables.

9. Creating Living Robots that Have Memory Capabilities

Researchers at the University of Vermont and Tufts University in Massachusetts have engineered new xenobots, tiny synthetic organisms built from frog stem cells, that scientists can program to express certain behaviors. The researchers used a supercomputer to simulate the behaviors the xenobots could exhibit under different engineering designs. In the latest version, the researchers have given the xenobots a basic memory function, which they believe could help these organisms record the presence of drugs, pollutants, or disease conditions.

10. Improving Water Utilities in Cities

The cities of Tucson, Arizona, and Newark, New Jersey, are using AI to improve the infrastructure and operation of their water utilities. In Tucson, the water department is using data on the history of pipe breakage to predict pipeline failures and alert staff about when to replace critical infrastructure. In Newark, the water and sewer department is collecting and monitoring water quality data to predict necessary water treatment adjustments.
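A predictive-replacement program like Tucson's can be pictured with a toy risk ranking (the scoring weights and pipe records below are purely illustrative, not the city's model): pipes that are older or have broken before rank higher on the replacement list.

```python
def failure_risk(age_years, past_breaks, material_factor=1.0):
    """Toy risk score: older pipes and repeat breakers rank higher.
    Weights are illustrative only."""
    return material_factor * (0.02 * age_years + 0.3 * past_breaks)

# Hypothetical water mains with age and break history.
mains = [
    {"id": "A-17", "age_years": 80, "past_breaks": 3},
    {"id": "B-02", "age_years": 15, "past_breaks": 0},
    {"id": "C-44", "age_years": 60, "past_breaks": 1},
]

ranked = sorted(
    mains,
    key=lambda m: failure_risk(m["age_years"], m["past_breaks"]),
    reverse=True,
)
print([m["id"] for m in ranked])
```

A real utility model would fold in many more variables, such as pipe material, soil conditions, and pressure history, and would be fit to observed failures rather than hand-set weights, but the output is the same kind of prioritized replacement list that alerts staff where to act first.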

Image credit: Conscious Design 
