This time-lapse video from LSA’s Museum of Zoology takes the bat species Artibeus jamaicensis from specimen to display. The process might be a little stomach-churning, but then again, good science isn’t always mess-free.
As one of the largest university museums in the world, the Museum of Zoology is a crucial resource for research, conservation, and education. Studying animals such as Artibeus jamaicensis allows scientists to craft a tangible record of life on Earth.
Imagine building a mechanical character with a computer program that could calculate the size and positioning of gears to bring that character’s motions to life. The team at Disney Research has designed and programmed this interface for non-experts and has 3D printed the resulting objects; the project is called Computational Design of Mechanical Characters.
The video presentation could use a bit of background music or narration, but it’s still fascinating to watch and great for discussion. The content is also groundbreaking for automata: self-operating mechanical devices that move and look like robots or living things.
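To get a feel for the kind of math such an interface automates, here’s a tiny Python sketch. It’s purely illustrative and not Disney Research’s actual algorithm; the function name and tooth range are invented for this example. It brute-forces a pair of gear tooth counts to match a target speed ratio, one small sub-problem a mechanical-character designer would otherwise solve by hand:

```python
# Illustrative only: search for the driver/driven tooth counts whose
# ratio best matches a target, one tiny sub-problem of gear-train design.
from itertools import product

def best_gear_pair(target_ratio, teeth_options=range(8, 61)):
    """Return (error, driver_teeth, driven_teeth, ratio) for the pair
    whose ratio (driven / driver) is closest to target_ratio."""
    best = None
    for driver, driven in product(teeth_options, repeat=2):
        ratio = driven / driver
        error = abs(ratio - target_ratio)
        if best is None or error < best[0]:
            best = (error, driver, driven, ratio)
    return best

# Example: a character's arm crank should spin at one-third motor speed,
# so we want a 3:1 reduction.
error, driver, driven, ratio = best_gear_pair(3.0)
print(f"driver={driver} teeth, driven={driven} teeth, ratio={ratio:.2f}")
```

The real system presumably tackles a far harder version of this, sizing and placing whole assemblies of gears at once so a printed character moves the way its designer sketched.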
Northern Arizona University’s Con Slobodchikoff, Ph.D., and his student teams have been studying the alarm calls of Gunnison’s prairie dogs for over 30 years. The result: prairie dog language has been partially decoded. Yes, we know what they’re saying when they yip and squeak!
The animals use word-like phonemes, combining them into sentence-like calls. They have social chatter. They can distinguish between the types of predators nearby (dogs, coyotes, humans) and seem to have developed warnings that specify a predator’s species, size, and color.
This video is a win-win because you get to watch prairie dogs (and their predators), all while learning about how we observe, analyze, and test to find out more about their sophisticated animal language. Be sure to read the interview with Slobodchikoff for more information…
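If you’re curious how recorded calls might be turned into predator labels, here’s a toy Python sketch. The acoustic features and every number in it are invented for illustration; this is not Slobodchikoff’s actual analysis pipeline, which works from careful measurements of real recordings. The idea is just to show the shape of the problem: represent each call as a small feature vector and ask which known predator category it sits closest to.

```python
# Toy illustration: classify an alarm call by nearest-centroid matching
# on a made-up feature vector (pitch_kHz, duration_s, chirps_per_s).
import math

# Hypothetical average feature vectors per predator category.
CENTROIDS = {
    "hawk":   (5.2, 0.08, 12.0),
    "coyote": (3.9, 0.15,  8.0),
    "human":  (4.4, 0.20,  6.5),
    "dog":    (4.0, 0.18,  7.0),
}

def classify_call(features):
    """Return the predator label whose centroid is nearest (Euclidean)."""
    return min(CENTROIDS, key=lambda label: math.dist(CENTROIDS[label], features))

# A new recorded call: pitch 4.3 kHz, 0.19 s long, 6.8 chirps/s.
print(classify_call((4.3, 0.19, 6.8)))  # -> "human" under these toy numbers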
Science fiction stories in which pilots control spacecraft with their minds have become less fiction and more science. A team of researchers at the University of Minnesota has developed the next step in thought-controlled vehicles. Watch this model helicopter fly through an obstacle course using brainwaves.
The aircraft’s pilot operates it remotely, wearing a cap of electrodes that detects brainwaves, which are then translated into flight commands.
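As a very rough sketch of how such an interface can work, here’s a Python toy that compares signal power between electrodes over the left and right motor cortex and turns the imbalance into a steering command. The threshold, the crude power estimate, and the sample windows are all invented for illustration; this is not the Minnesota team’s actual decoder.

```python
# Illustrative only: map a left/right hemispheric power imbalance from
# two EEG channels to a simple helicopter steering command.

def band_power(samples):
    """Crude power estimate: mean squared amplitude of an EEG window."""
    return sum(s * s for s in samples) / len(samples)

def decode_command(left_channel, right_channel, threshold=0.2):
    """Turn the normalized power imbalance into a command string."""
    left, right = band_power(left_channel), band_power(right_channel)
    imbalance = (left - right) / (left + right)
    if imbalance > threshold:
        return "bank left"   # right-hemisphere rhythms suppressed: imagined left-hand movement
    if imbalance < -threshold:
        return "bank right"  # left-hemisphere rhythms suppressed: imagined right-hand movement
    return "fly straight"

# Two fake 4-sample EEG windows (microvolts):
print(decode_command([12, -10, 11, -9], [5, -4, 6, -5]))  # -> "bank left"
```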
Ultimately, the developers of the mind-controlled copter hope to adapt their technology for directing artificial robotic limbs and other medical devices. Today’s best neural prosthetics require electrodes to be implanted in the body and are thus reserved for quadriplegics and others with disabilities severe enough to justify invasive surgery.
"We want to develop something non-invasive that can benefit lots of people, not just a limited number of patients," says Bin He, a biomedical engineer at the University of Minnesota in Minneapolis, whose new results build on his previous work with a virtual thought-controlled helicopter.
A fascinating note: some would-be pilots could not provide clear thought commands during trial studies. Those candidates who meditated or practiced yoga had better focus and stronger mind-body awareness, allowing them to adapt to the brain-computer interface with less training.
Robots have to be able to move, perhaps quickly, on all kinds of terrain; search-and-rescue missions in remote parts of Earth and exploration of other planets like Mars will require it. So terradynamics researchers at Georgia Tech are creating and testing robots with different, animal-inspired leg shapes to handle movement in a variety of environments. Bonus technology: 3D printing.
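For a taste of why leg shape matters at all, here’s a back-of-the-envelope Python comparison (my own toy geometry, not the Georgia Tech models) of how far a robot advances per leg cycle with a straight pivoting leg versus a C-shaped leg that rolls along its rim:

```python
# Toy geometry only: stride per leg cycle for two idealized leg shapes.
import math

def stride_straight(leg_length, sweep_deg):
    """Stick leg pivoting about the hip: stride ~ chord swept by the foot."""
    return 2 * leg_length * math.sin(math.radians(sweep_deg) / 2)

def stride_c_leg(radius, sweep_deg):
    """C-shaped leg rolling on its rim: stride ~ arc length in contact."""
    return radius * math.radians(sweep_deg)

print(stride_straight(0.10, 60))  # ~0.100 m per cycle
print(stride_c_leg(0.10, 60))     # ~0.105 m per cycle
```

On flat ground the two come out close; the interesting differences show up on loose or uneven terrain, which is exactly the variety of environments the researchers’ tests are designed to probe.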