This week’s list of data news highlights covers February 26 to March 4, 2022, and includes articles on viewing woolly mammoths with augmented reality and using an AI system to expedite court proceedings.
Researchers at Dartmouth College have created an AI system that can detect signs of emotional disorders from Reddit conversations. The system labels conversations with an emotion and tracks transitions between emotions in posts from the same user. The system can detect signs of major depressive disorder, anxiety disorders, or bipolar disorder. The team trained the system on conversations from users with and without self-reported disorders.
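The labeling-and-transition idea can be sketched in a few lines. The snippet below is illustrative only: the keyword lookup is a hypothetical stand-in for the Dartmouth team's trained classifier, and the emotion categories are assumed for the example.

```python
from collections import Counter

# Hypothetical stand-in for the trained emotion classifier;
# the real system uses a learned model, not keyword matching.
EMOTION_KEYWORDS = {
    "joy": {"happy", "great", "excited"},
    "sadness": {"sad", "hopeless", "tired"},
    "anger": {"angry", "furious"},
}

def label_emotion(post: str) -> str:
    """Assign a coarse emotion label to a single post."""
    words = set(post.lower().split())
    for emotion, keywords in EMOTION_KEYWORDS.items():
        if words & keywords:
            return emotion
    return "neutral"

def emotion_transitions(posts: list[str]) -> Counter:
    """Count transitions between consecutive emotion labels
    in one user's chronological post sequence."""
    labels = [label_emotion(p) for p in posts]
    return Counter(zip(labels, labels[1:]))

posts = ["feeling happy today", "so tired and sad", "still sad honestly"]
print(emotion_transitions(posts))
# → Counter({('joy', 'sadness'): 1, ('sadness', 'sadness'): 1})
```

Tracking transition counts like these per user is one simple way a system could flag emotional patterns over time.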
Researchers at the University of Tokyo, Tohoku University, and Japanese IT company Fujitsu have partnered with the city of Kawasaki to use a supercomputer to simulate tsunami evacuation drills. The team will use the supercomputer to forecast flooding from potential tsunamis. Volunteers in the surrounding community will receive forecast-based alerts and evacuate accordingly.
Walmart has launched a virtual tool that lets customers try on clothing remotely. Customers choose from 50 models the one that most closely resembles their own body type. The tool then uses computer vision and AI to display how an article of clothing appears on that model. The company plans to update the tool to allow customers to upload their own photos.
Researchers at the University of Chicago have developed a new method of assembling atoms into arrays for quantum computers. The team was able to build quantum systems with more quantum bits (qubits) by placing atoms of the metals rubidium and cesium in an alternating pattern. In tests, the team built a quantum computer with a record-breaking 512 qubits.
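The alternating layout itself is simple to picture. The sketch below just generates the pattern schematically; it is not the team's actual trap-control code, and the site count and element labels are taken from the summary above.

```python
def alternating_array(n_sites: int) -> list[str]:
    """Lay out trap sites with rubidium (Rb) and cesium (Cs) atoms
    in an alternating pattern, as in the hybrid-array approach
    described above (schematic only)."""
    return ["Rb" if i % 2 == 0 else "Cs" for i in range(n_sites)]

sites = alternating_array(512)
print(sites[:4], len(sites))
# → ['Rb', 'Cs', 'Rb', 'Cs'] 512
```

Interleaving two species lets each one be addressed independently, which is part of what makes scaling to larger arrays easier.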
The Supreme Court of the Philippines has announced plans to expedite court operations with an AI system. The Court will use the system to digitize rendered judgments and transcribe stenographic notes.
Scientists at the Max Planck Institute for Intelligent Systems in Germany have created a sensor that can measure the extent of its contact with external objects. The sensor contains reflective aluminum flakes that change color upon contact and a camera to capture the changes. An AI system and computer vision technology then determine the location of contact, the amount of force applied, and the direction of the force.
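One simple way to turn the camera's view of the color-changed flakes into a contact estimate is an intensity-weighted centroid, with the total color change as a rough force proxy. This is an illustrative stand-in for the learned model the researchers describe, not their actual method; the grid of per-pixel change magnitudes is assumed as input.

```python
def contact_estimate(intensity: list[list[float]]):
    """Estimate a contact point and force proxy from a 2D grid of
    per-pixel color-change magnitudes captured by the sensor camera.
    Returns ((row, col) centroid, total change), or (None, 0.0)
    when no contact is detected."""
    total = sum(sum(row) for row in intensity)
    if total == 0:
        return None, 0.0
    # Intensity-weighted centroid: stronger color change pulls
    # the estimated contact point toward that pixel.
    cy = sum(y * v for y, row in enumerate(intensity) for v in row)
    cx = sum(x * v for row in intensity for x, v in enumerate(row))
    return (cy / total, cx / total), total

grid = [
    [0, 0, 0],
    [0, 4, 2],
    [0, 0, 0],
]
centroid, force = contact_estimate(grid)
print(centroid, force)
# → (1.0, 1.3333333333333333) 6
```

Estimating the direction of the applied force, as the real sensor does, would require comparing how the flake pattern shifts between frames, which is beyond this sketch.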
Instagram has added auto-generated captions to videos on its platform. An AI system will create the captions, which will be available in select languages. The company expects the quality of the captions will improve over time as the AI system learns.
The Orlando Economic Partnership, a public-private organization dedicated to business development in Orlando, Florida, has partnered with Unity, a U.S.-based 3D software company, to create a digital twin of the city. The organization will use the digital twin to enhance business development, urban planning, and climate resilience efforts.
Google has updated Soli, a radar sensor that uses radio waves to collect data on users’ movements, motions, and positions, to respond to non-verbal cues from humans. Soli can decipher intentions behind perceived actions or body language and can be integrated into household devices like Google Nest.
Researchers at the University of Southern California, the Natural History Museum of Los Angeles County, and La Brea Tar Pits have created new Snapchat and Instagram lenses that display extinct animals. The team built scientifically accurate models of 13 extinct animals, including the western camel, ancient bison, and woolly mammoth, from preserved specimens. They then used augmented reality to create the lenses, which let users view the animals in their surrounding environment.