Making life a little easier after loss
After losing a limb, it’s difficult to become accustomed to a prosthetic. It’s obviously very different from the limb a person lost, which naturally and easily responded to what the brain told it to do. With innovations in prosthetics, however, ease of use is improving. Using machine learning, researchers are able to significantly smooth out prosthetic movements and better interpret brain signals. Take a look at how machine learning aids prosthetics here:
Recreate that look for me, will you?
Some people are nervous about big tech companies taking over the clothing industry. For instance, what if Amazon could (1) use computer vision to take your measurements from pictures you upload, (2) send a picture of an article of clothing you want, along with your measurements, to a tailor who then makes the custom clothing for you, and (3) ship the item to you in record time? It’s possible, but there are a lot of potential hiccups in this process.
However, one company, Original Stitch, has recently improved an aspect of this potential future supply chain. It employs computer vision and machine learning to predict your body shape and measurements from a couple of pictures: its algorithm uses regular pictures you’ve snapped of yourself to estimate your measurements to within one inch. And, unlike with some of the company’s competitors, you don’t even have to get into spandex to take the pictures.
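Original Stitch hasn’t published its algorithm, but a classical ingredient of photo-based measurement is calibrating an inches-per-pixel scale from something of known size in the frame. The sketch below is a toy illustration of that one step, with a made-up `pixels_to_inches` helper and invented numbers, not the company’s actual method:

```python
# Toy sketch, NOT Original Stitch's actual (unpublished) algorithm.
# Calibrate an inches-per-pixel scale from a known reference length,
# then convert pixel distances (e.g. between pose keypoints) to inches.

def pixels_to_inches(pixel_length, reference_pixels, reference_inches):
    """Convert a pixel distance to inches via a known reference length."""
    scale = reference_inches / reference_pixels  # inches per pixel
    return pixel_length * scale

# Invented numbers: the subject's stated height (70 in) spans 1400 px
# in the photo, and the shoulder-to-shoulder distance measures 360 px.
shoulder_in = pixels_to_inches(360, reference_pixels=1400, reference_inches=70.0)
print(round(shoulder_in, 2))  # 18.0
```

A real system would extract the pixel distances with pose estimation and feed many such features into a learned regression model, rather than relying on a single linear scale.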
Evacuating people more effectively
It’s a nightmare for officials and nearby residents when a nuclear power plant leaks radioactive material. Officials move quickly into emergency mode, evacuating people from potential danger zones as fast as possible. But this is a difficult process when there are large numbers of people to move. A group of researchers at the University of Tokyo Institute of Industrial Science has developed a machine learning tool that can accurately predict, 30 hours in advance, where radioactive material will land. This can help officials decide which areas to focus on. Watch the simulation of a nuclear power plant leak in Tokyo here:
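The Tokyo group’s tool learns from weather-forecast data. As a loose, entirely hypothetical illustration of why wind forecasts constrain where material lands, here is the simplest physical baseline such a model would refine: advecting a plume’s centroid along forecast winds, hour by hour (the `plume_centroid` helper and the forecast numbers are invented):

```python
# Hypothetical sketch, not the researchers' model: advect a plume
# centroid along a forecast wind field, one hour per step.

def plume_centroid(start, hourly_winds_kmh):
    """Advect a point (x, y) in km along a list of hourly wind vectors (km/h)."""
    x, y = start
    for wx, wy in hourly_winds_kmh:
        x += wx  # one hour of drift in the x direction
        y += wy  # one hour of drift in the y direction
    return x, y

# Invented forecast: a steady 10 km/h wind toward +x for 30 hours.
forecast = [(10.0, 0.0)] * 30
print(plume_centroid((0.0, 0.0), forecast))  # (300.0, 0.0)
```

A learned model goes further than this baseline by correcting for diffusion, terrain, and forecast error patterns it has seen in historical data.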
Unique of the week
In the last newsletter we showed how Nvidia used deep learning to turn regular videos into slow-motion videos. This week, they’re at it again. They’ve developed a program that can remove watermarks from photos. It can also take very grainy photos and “denoise” them into higher-quality ones. Watch it work here:
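Nvidia’s denoiser is a deep network trained on pairs of noisy images, but the statistical fact it leans on is simple: zero-mean noise averages out, so independent noisy views of a signal reveal the clean value. A toy illustration of that fact alone (not their method):

```python
# Toy illustration, not Nvidia's method: averaging many independent
# zero-mean noisy observations of a pixel recovers its clean value.
import random

random.seed(0)
true_pixel = 0.5  # the clean intensity we never observe directly
noisy_views = [true_pixel + random.gauss(0, 0.2) for _ in range(1000)]
denoised = sum(noisy_views) / len(noisy_views)
print(round(denoised, 2))
```

With 1,000 views under 0.2-standard-deviation noise, the average lands within a few hundredths of 0.5; a learned denoiser achieves a similar effect from a single image by exploiting structure shared across natural photos.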
- AgShift uses ML and deep learning to assess food quality and determine an objective price
- Researchers use ML to determine the success of a schizophrenia treatment for patients
- Doctors, using ML, may soon be able to conduct quick in-office blood tests
- ML is as good as, and sometimes better than, animal studies in predicting chemical toxicity
- This tool is making it harder for emails that impersonate other people to get through to businesses
- New whale-language dialects discovered by machine learning
If you want to know how causality could revolutionize artificial intelligence, Judea Pearl’s The Book of Why: The New Science of Cause and Effect may give you some insights. The book seems to be making waves. While we haven’t read it, it sounds like it’s worth checking out.
Are you or someone you know having trouble naming a baby? If it’s a girl, here’s a tool that draws on US Social Security data to display baby-name trends and uses AI to create new names based on your selection. It will even create Harry Potter-esque names for you.