This week’s list of data news highlights covers May 15, 2021 – May 21, 2021 and includes articles about understanding phenomena in space with supercomputing and managing city parking.
Researchers at Stanford University have used AI and a brain-computer interface (BCI), a device that detects the brain signals indicating a person’s intended movements and translates them into commands, to enable a man with full-body paralysis to communicate via text. The researchers placed BCI chips on the man’s head to detect the neural signals that dictate hand movements and used AI to translate those signals into keystrokes on an on-screen keyboard. With this method, the man was able to generate 18 words per minute, only 5 words fewer than the average person typing on a smartphone.
Researchers at University College London have used supercomputing to understand why gas-like outbursts from the Sun, such as the solar wind and solar flares, cool more slowly than expected. This is important because hot solar gases carried along these winds can disrupt satellite communications and transportation systems. The researchers simulated how the solar wind stretches from the Sun to Earth and found that it stays hot because magnetic field lines break and reconnect, releasing energy and maintaining heat.
Researchers at the University of Rhode Island have developed a sensor that mimics how dogs’ noses detect bombs, identifying substances in the air at concentrations down to parts per trillion. During pilot tests, the researchers mounted the sensors on drones to identify a sample target substance among other ambient air particles and transmitted the data wirelessly through an Internet platform. The researchers believe the sensor could allow law enforcement officials to replace sniffer dogs, which have limited attention spans and cannot provide quantitative information about the abundance of a substance in a bomb.
Sidewalk Labs, a New York-based urban innovation company focused on sustainability, has launched Pebble, a vehicle sensor that cities can use to manage parking and curb availability. Cities place the sensors in parking spaces to detect whether a vehicle is present and then relay this information to a cloud platform over Wi-Fi. So far, the company has tested Pebble in the San Francisco Bay Area to determine optimal parking prices at public transportation stations based on space availability.
The National Health Service (NHS) in England is using AI tools to address the backlog of patients whose care was sidelined due to COVID-19. Currently, 4.7 million people are awaiting treatment, 387,000 of whom have been waiting for over a year. One AI tool, Healthy.io, analyzes photos of urine sample test strips for patterns that indicate kidney function. With Healthy.io, the NHS has been able to reach 500,000 patients with diabetes and set up in-person appointments based on their test results.
Sumitomo Rubber Industries (SRI), a Japanese tire and rubber company, will use Fugaku, the world’s fastest supercomputer, to simulate different rubber materials at the molecular level to enhance how it develops vehicle tires. With Fugaku, chemists will simulate the molecular behavior and chemical changes of rubber materials to improve the technology the company uses to prevent tires from wearing over time.
Scientists from the National Science Foundation, a U.S. government agency that supports research in science and engineering, are using computer vision and supercomputing to identify infrastructure vulnerable to damage from natural disasters. First, the scientists used computer vision to identify different types of structural features on buildings, such as roofs, windows, chimneys, and columns, from images captured by satellites and Google Maps. Then, the team used a supercomputer to simulate how earthquakes and hurricanes would affect those structures. When the team applied this technique to buildings affected by a 2020 hurricane in Louisiana, the supercomputer modeled which roofs withstood wind and water damage with 90 percent accuracy.
Researchers at the University of Michigan, Ann Arbor, and the National Cerebral and Cardiovascular Center in Japan have used machine learning to predict a patient’s risk of cardiac arrest from data about daily weather conditions and timing, such as the hour of the day and whether it was a public holiday. The team trained an algorithm to predict the out-of-hospital cardiac arrest risk for 525,000 patients based on weather conditions, timing, or both. When tested on a separate set of 136,000 patients who had previously experienced a cardiac arrest, the algorithm found that Sundays, Mondays, and public holidays, combined with low temperatures or sharp temperature drops within and between days, strongly increased the risk of future episodes.
The United Kingdom has collaborated with Vaisala, a Finnish company focused on environmental and industrial measurements, and Yotta, a U.K. software company, to collect and analyze data about road conditions. Engineers have collected geospatial video data about highways from security cameras using Vaisala’s RoadAI program and have used Yotta’s software to assess the videos for road hazards that require repairs, such as potholes or faded lane markings. By combining data and AI, regional urban planners can conduct maintenance more routinely rather than on a yearly basis.
Google has developed an AI-powered dermatology tool that uses a smartphone camera to help users identify issues with their hair, skin, and nails. Users launch the tool from a smartphone app and take three pictures of their hair, skin, or nail concern from different angles. The app then asks questions about the issue, such as how long the user has had it and what symptoms accompany it, and uses AI to analyze how closely the user’s information matches 288 known dermatological conditions. When used by clinicians, the tool increased the likelihood of identifying a patient’s correct condition by 20 percent.
Image credit: Andre Tan