
10 Bits: the Data News Hotlist

by Hodan Omaar

This week’s list of data news highlights covers April 17, 2021 – April 23, 2021 and includes articles about spotting plastic in the Mekong River and identifying conspiracy theories about COVID-19 with machine learning.

1. Inspiring New Recipes with AI

Researchers from Sony AI have partnered with Korea University to develop FlavorGraph, an AI tool that maps different ingredient combinations and recommends the most complementary ingredient pairings to chefs based on taste, aroma, and flavor profile. FlavorGraph uses information on more than 1,500 flavor molecules and how they have been used in past recipes to predict how well two ingredients will pair together. Sony AI hopes to incorporate the tool into a recipe creation app that chefs can use to design new menus.
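
FlavorGraph itself learns embeddings over a large graph of molecules and recipes, which the article does not detail; the sketch below only illustrates the underlying intuition that ingredients sharing flavor compounds tend to pair well. The ingredient-to-compound sets are invented placeholders, not real chemistry data.

```python
# Minimal pairing-score sketch: ingredients that share more flavor compounds
# score higher. Illustrative only; not FlavorGraph's actual method or data.
FLAVOR_COMPOUNDS = {
    "strawberry": {"furaneol", "linalool", "gamma-decalactone"},
    "basil": {"linalool", "eugenol", "estragole"},
    "garlic": {"allicin", "diallyl disulfide"},
}

def pairing_score(a: str, b: str) -> float:
    """Jaccard similarity of the two ingredients' flavor compound sets."""
    ca, cb = FLAVOR_COMPOUNDS[a], FLAVOR_COMPOUNDS[b]
    return len(ca & cb) / len(ca | cb)

print(pairing_score("strawberry", "basil"))   # shares linalool -> 0.2
print(pairing_score("strawberry", "garlic"))  # no shared compounds -> 0.0
```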

2. Keeping Cities Clean with Robots

The city of Helsinki, Finland is piloting Trombia Free, an autonomous street sweeper developed by the Finnish company Trombia Technologies, to clean city streets. Trombia Free is fully electric and consumes 85 percent less energy than conventional street sweeping machines. The sweeper is also quiet enough to operate at night, when it hinders traffic as little as possible. With Trombia Free, the city hopes to meet its goal of carbon neutrality by 2035.

3. Making Flying Safer and More Efficient

Government agencies and airlines are training AI models on flight data to make flying safer and more efficient. The U.S. Federal Aviation Administration (FAA) has partnered with Aireon, an aircraft tracking and surveillance company, to monitor and analyze flight reports for anomalies on Boeing 737 Max flights; the aircraft was involved in two deadly crashes in October 2018 and March 2019. Malaysia’s AirAsia is using AI to analyze flight arrival and departure delays so that it can better adhere to its business model of planes spending no more than 25 minutes at an airport gate.
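
Neither program’s methods are public, but spotting unusual flights in operational data is at heart an outlier-detection problem. The sketch below flags gate turnaround times that deviate sharply from the typical value using a robust, median-based rule; the numbers are invented, not airline data.

```python
# Illustrative outlier detection on gate turnaround times (minutes).
# Hypothetical data; a stand-in for whatever the FAA and AirAsia actually use.
from statistics import median

turnaround_minutes = [24, 26, 23, 25, 27, 24, 58, 25, 22, 61]

med = median(turnaround_minutes)
mad = median(abs(t - med) for t in turnaround_minutes)  # median absolute deviation
threshold = 3 * 1.4826 * mad  # roughly three robust standard deviations

anomalies = [t for t in turnaround_minutes if abs(t - med) > threshold]
print(anomalies)  # [58, 61]: turnarounds far outside the typical range
```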

4. Understanding the Pathology of Alzheimer’s Disease

Researchers at the Mayo Clinic in Minnesota have analyzed genomic data to better understand the pathology of Alzheimer’s disease. In one study, researchers used an algorithm to classify brain patterns associated with Alzheimer’s using brain tissue samples that were donated to the Mayo Clinic Brain Bank. Using this model, researchers were able to narrow down the genes of interest for the disease from 50,000 to just 5.
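
The article does not name the algorithm, but a common way to narrow tens of thousands of candidate genes is to train a classifier on expression data and keep only the most informative features. The sketch below shows that generic pattern on synthetic data; the labels, matrix, and gene count are placeholders, not Mayo Clinic data.

```python
# Illustrative gene narrowing via classifier feature importances.
# Synthetic random data (the real search space was roughly 50,000 genes);
# requires numpy and scikit-learn.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_samples, n_genes = 200, 5_000
X = rng.normal(size=(n_samples, n_genes))  # expression level of each gene per sample
y = rng.integers(0, 2, size=n_samples)     # 1 = Alzheimer's pathology, 0 = control (fake labels)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Rank genes by how much they contribute to the classifier's decisions
top_genes = np.argsort(clf.feature_importances_)[::-1][:5]
print(top_genes)  # indices of the five most informative (here: random) genes
```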

5. Improving Urban Planning on University Campuses with E-Scooters

Dublin City University is working with TIER, an e-scooter operator, and the Insight SFI Research Centre for Data Analytics in Ireland to deploy e-scooters equipped with cameras on the university’s campus. For this pilot project, the scooters will record footage of sidewalks and streets, and researchers will apply computer vision technology to detect how many pedestrians are on a footpath, road, or cycling lane. With this information, campus researchers hope to create traffic congestion alerts, monitor road conditions, and reduce collisions between cyclists and pedestrians at problematic intersections.
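
The pilot’s computer vision stack is not described in detail, but counting pedestrians in a camera frame is a standard object-detection task, and the sketch below uses a pretrained torchvision detector as a generic stand-in. The image path is a placeholder, and this is not the system TIER or Insight will deploy.

```python
# Illustrative pedestrian counting with a pretrained COCO object detector.
# Requires torch and torchvision (>= 0.13 for the weights= argument).
import torch
from torchvision.io import read_image
from torchvision.models.detection import fasterrcnn_resnet50_fpn
from torchvision.transforms.functional import convert_image_dtype

model = fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()

frame = convert_image_dtype(read_image("scooter_frame.jpg"), torch.float)  # placeholder path
with torch.no_grad():
    detections = model([frame])[0]

# COCO class 1 is "person"; keep only confident detections
is_person = (detections["labels"] == 1) & (detections["scores"] > 0.7)
print(f"pedestrians detected: {int(is_person.sum())}")
```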

6. Launching an AI Supercomputer in Slovenia

Slovenia has launched its new supercomputer Vega, which has been built specifically to develop solutions for scientific and public sector problems using machine learning, AI, and high-performance data analytics. Scientists plan to use Vega to explore new medicines and therapies for cancer and Alzheimer’s disease, identify molecules for breakthrough drug treatments with AI, and track infection rates of different diseases. Vega is the first in a series of eight planned high-performance computing centers in the EU.

7. Studying the Effects of Climate Change using Satellite Data

Researchers are using PlanetScope, a dataset of high-resolution satellite images from Planet Labs that NASA makes available to researchers, to study the effects of climate change on frozen areas of Earth, such as the Arctic, mountain glaciers, and Antarctica. One study used PlanetScope to track the melting of the Greenland ice sheet: by applying an algorithm that calculates the depths of water around the ice sheet, the researchers found that it is one of the largest contributors to rising sea levels. Another study, led by researchers at the University of Washington, used PlanetScope to create digital elevation models that detect changes in elevation due to the flow of glaciers, avalanches, and seasonal snowmelt.
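
At its core, the elevation-change analysis comes down to differencing digital elevation models built from imagery taken on different dates. Below is a minimal sketch with invented elevation grids in meters, not PlanetScope-derived data.

```python
# Illustrative elevation change detection by differencing two digital elevation
# models (DEMs). The 3x3 grids are made-up values in meters.
import numpy as np

dem_2019 = np.array([[1200.0, 1201.5, 1203.0],
                     [1198.0, 1199.0, 1200.5],
                     [1195.0, 1196.5, 1198.0]])
dem_2021 = np.array([[1199.0, 1200.0, 1202.5],
                     [1197.5, 1196.0, 1199.5],
                     [1194.5, 1196.0, 1197.5]])

change = dem_2021 - dem_2019  # negative values = surface lowering (e.g., ice loss)
print(change)
print("mean elevation change (m):", round(float(change.mean()), 2))
```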

8. Addressing Plastic Pollution in the Mekong River

The United Nations Environment Programme (UNEP) has partnered with Google to create machine learning models that more accurately determine how plastic pollution enters the Mekong River, which is one of the ten rivers that together contribute 95 percent of plastic discharge into ocean waters. The models will use community-sourced, annotated images to identify and map locations where plastic waste most frequently enters the river, helping local governments take more targeted actions against plastic leakage.

9. Identifying Conspiracy Theories about COVID-19 with Machine Learning

Researchers at Los Alamos National Laboratory, a U.S. Department of Energy (DOE) national laboratory, have used machine learning to identify four different COVID-19 conspiracy theory themes. The researchers used a dataset of around 2 million tweets containing COVID-19 keywords to build machine learning models that recognize informational patterns in written content and group tweets by similarity. With the models, the team found that the four most widespread conspiracy theories were that 5G cell towers spread the virus, that the Bill and Melinda Gates Foundation created COVID-19, that the virus was developed in a laboratory, and that COVID-19 vaccines are dangerous.
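
The paper’s full pipeline is not reproduced here, but grouping tweets by textual similarity is commonly done by vectorizing the text and clustering the vectors. The sketch below shows that generic pattern (TF-IDF plus k-means) on a handful of invented example tweets; it is a stand-in, not the Los Alamos models.

```python
# Illustrative text clustering with TF-IDF and k-means (requires scikit-learn).
# The tweets are invented examples, not data from the study.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

tweets = [
    "5g towers are spreading the virus, wake up",
    "they built new 5g masts right before the outbreak",
    "the virus escaped from a lab, not a market",
    "lab leak is the only explanation that fits",
]

X = TfidfVectorizer(stop_words="english").fit_transform(tweets)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)  # tweets with similar wording land in the same cluster
```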

10. Building an Eco-Friendly Supercomputer for Climate Forecasting

The Met Office, the United Kingdom’s national weather service, has collaborated with Microsoft to build an eco-friendly supercomputer specifically for climate forecasting. The supercomputer will run entirely on renewable energy and will be able to perform 60 quadrillion calculations per second. It will be used to provide more detailed weather models, create different weather scenarios, improve local forecasts, and better predict severe weather.

