Autonomous underwater robot hunter will take on the invasive lionfish

Twenty-five years ago, lionfish started invading coastal waters in the Americas, and the species is now damaging local ecosystems from Venezuela, through the Caribbean, and up the East Coast of the United States. Since most of these fish are genetically similar, the theory goes that a few private fish collectors dumped their pets in the water, and now we have a problem: lionfish can lay 30,000 eggs every five days, local prey don't yet recognize them as predators, and they have no natural predators of their own in these foreign waters.

But humans could become their main predators: these fish are tasty, and they sell for up to $20 a pound at upscale restaurants when scuba divers can get to them. The trouble is that there aren't enough scuba divers fishing for them, and the fish hide out in spots far deeper than humans can go. Enter stage right: an untethered, autonomous underwater robot powered by machine learning, built to hunt them. Students from Worcester Polytechnic Institute are training the robot to use computer vision to recognize what a lionfish is and then to run it through with a spear.
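For a rough sense of what the vision half of that pipeline involves, here is a minimal sketch of a lionfish/not-lionfish image classifier built on a pretrained ResNet in PyTorch. This is purely illustrative and not the WPI team's actual system; the class indices, the confidence threshold, and the `looks_like_lionfish` helper are assumptions for the example, and the classification head would still need fine-tuning on labeled lionfish images.

```python
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

# Illustrative binary classifier: lionfish vs. everything else, built on a
# pretrained ResNet-18 backbone. In practice the new head below would be
# fine-tuned on labeled underwater images before deployment.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)  # 2 classes: lionfish / other
model.eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def looks_like_lionfish(image_path: str, threshold: float = 0.9) -> bool:
    """Return True only when the model is confident the frame shows a lionfish."""
    img = Image.open(image_path).convert("RGB")
    batch = preprocess(img).unsqueeze(0)
    with torch.no_grad():
        probs = torch.softmax(model(batch), dim=1)
    return probs[0, 0].item() >= threshold  # index 0 assumed to mean "lionfish"
```

A high confidence threshold like this is the kind of design choice an autonomous spear-wielding robot would need, since a false positive means spearing the wrong fish.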

Want to dance like Michael Jackson?

This application has been all the rage recently. Researchers at UC Berkeley have developed a way to capture one person's body movements and superimpose them onto another person's body, making it look like the second person is performing the first person's dance moves. It's pretty impressive, and you should definitely watch the video here if you haven't already.
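The first step in that kind of motion transfer is extracting a pose skeleton from each frame of the source video; a separately trained image-to-image model then renders the target person in those poses. Here is a minimal sketch of just the pose-extraction step using MediaPipe, offered as an illustration of the idea rather than the Berkeley authors' method; the file name `source_dance.mp4` is a placeholder.

```python
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose

# Placeholder source video: the dancer whose moves are being transferred.
cap = cv2.VideoCapture("source_dance.mp4")
pose_sequence = []  # one list of (x, y) keypoints per frame

with mp_pose.Pose(static_image_mode=False) as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV reads frames as BGR.
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks:
            keypoints = [(lm.x, lm.y) for lm in results.pose_landmarks.landmark]
            pose_sequence.append(keypoints)

cap.release()
# pose_sequence now holds the frame-by-frame skeleton that a rendering model
# would use to draw the target person in the same poses.
```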

I see your screen

Creepy. That’s exactly what this new application of machine learning is. A team of researchers from several universities, including the University of Pennsylvania, Columbia University, and Tel Aviv University, has recently added a new technique to the growing list of remote surveillance possibilities. With the right tools, someone can capture audio through, say, your Google Home and extract a surprising amount of information about what’s being displayed on your LCD screen. It’s all done by sound: as the screen renders content, its electronics emit faint noises that vary with what’s on the page, and a machine learning model trained on those acoustic emanations can decode what’s on your screen with some degree of success.
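To make the idea concrete, here is a minimal sketch of how one might train a classifier to guess displayed content from microphone recordings: turn each clip into spectrogram features, then fit a simple model on labeled examples. This is an illustration of the general approach, not the researchers' pipeline; the recording paths, labels, and feature choices are placeholders.

```python
import numpy as np
import librosa
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Placeholder data: short microphone clips recorded while known content was on
# screen, labeled with that content. Paths and labels are made up for the sketch.
recording_paths = ["clip_gmail_0.wav", "clip_news_0.wav", "clip_bank_0.wav"]
screen_labels = ["gmail", "news", "bank"]

def spectrogram_features(wav_path: str) -> np.ndarray:
    """Average a mel spectrogram over time into one fixed-length feature vector."""
    audio, sr = librosa.load(wav_path, sr=None)
    mel = librosa.feature.melspectrogram(y=audio, sr=sr, n_mels=128)
    return librosa.power_to_db(mel).mean(axis=1)

X = np.stack([spectrogram_features(p) for p in recording_paths])
y = np.array(screen_labels)

# Hold out part of the data to estimate how often the model guesses the
# on-screen content correctly from sound alone.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)
clf = SVC(kernel="rbf").fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```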

Extras

  • Want to become a better data analyst? Learn Tableau through this training academy from Lightpost Analytics, a company supporting this week’s newsletter.
  • Do you, or does someone you know, have type 1 diabetes? Machine learning products are improving the tech available to people living with the condition. Listen here to learn more.