
Training Computers, Helping Humans

A trio of BYU profs plans to use top NSF award to revolutionize digital capabilities
 

Three BYU faculty recently received Faculty Early Career Development (CAREER) Awards, the National Science Foundation’s most prestigious award for junior faculty. Each awardee received a five-year grant to support both his research and accompanying educational outreach programs.

Computer science’s Ryan Farrell and David Wingate and mechanical engineering’s Oliver Johnson are all exploring the intersections of human and computer learning, hoping to expand the frontiers of both.

Snap a pic, ID that bug

A butterfly flits through your yard: mostly orange, with a few black segmenting lines and a smattering of white dots around its edges. You announce with smug conviction that it’s a monarch. But, your butterfly-savvy friend points out, viceroy and queen butterflies have a number of the same features, so can you really be sure?

In the near future, with help from an app that can tell you what you’re seeing based on a single uploaded image, the answer will probably be yes.

With his NSF award funds, computer science professor Ryan Farrell is hoping to create a platform capable of analyzing an image and offering both an expert-level ID and a range of other relevant information. At the core of the funded work, Farrell and his students are developing recognition algorithms that overcome key limitations of current approaches. He’s done and is doing similar research on birds, cars and even tigers, but for this project, he’s focusing on insects like butterflies and beetles.

One of his goals, Farrell said, is to “give people at their fingertips the ability to better understand the natural world around them.” Beyond that, possible applications extend to conservation biology, forestry and agriculture.

He’s starting by creating a dataset of 250,000 images (100 photos each of 2,500 different species of butterflies and beetles): the more images the platform has available, the better its ability to interpret subtle variances in a single uploaded image. Once the platform is released, other researchers will be better able to track what and how many species are where, accelerating additional research efforts.

Though he’s a computer guy, Farrell said he’s intrigued by the biological world. “I connect personally with the things I see in nature,” he said. “So it’s exciting all the prospects this research has.”  

Creating better materials with help from Google and videogames

Mechanical engineering professor Oliver Johnson is looking to better understand material structure and properties — using the math behind Google’s web-search algorithm and a social-media-based videogame.

This unconventional approach to tackling an engineering problem was enough to land him his second round of funding from the NSF in less than a year: in 2016 he received a grant for more than $400,000 for related research, and in June this year his CAREER Award funding will kick in.

For this project Johnson and his students are exploring grain boundary networks in polycrystalline materials, which include metals and ceramics. The grain boundaries are defects that form a network of interfaces between well-ordered regions of the materials, but their structures vary dramatically and can accordingly enhance the properties of materials or diminish them. Johnson hopes that by learning to more effectively engineer the structure of these defect networks (putting the right boundaries in the right places), researchers could in turn improve the performance of materials for advanced engineering applications.

As part of his research, Johnson will develop an online multiplayer videogame for high school and college STEM students. The purpose is twofold: First, it will allow students to help solve the puzzle to discover the best network for stronger, lighter and more energy efficient materials. Second, through observations of players’ methods, it will allow researchers to in turn improve future materials-design algorithms.

“Humans are really good at 3-D spatial reasoning. If I toss you a ball, you can actually solve a pretty complex problem very fast and know where to put your hand to catch it,” he said, adding that computers struggle with similar tasks. “So we’re combining those human 3-D spatial reasoning strengths with the ability of computers to perform rapid calculations — leveraging collaboration among groups of humans and between humans and computers to solve complex materials design problems.”

Hey Siri, can we simplify?

Artificial intelligence (AI), said computer science professor David Wingate, has the power to change the world — in big ways — for the better. Problem is, “things that humans find so easy are the hardest tasks for computers to do. How do we learn? How do we improve with experience? How do we overcome our mistakes? Why do we find some things funny?”

Training a computer to respond to the world around it in ways similar to a human brain is possible: think Siri and Alexa, self-driving cars, Google’s image search and a range of robotics applications. But such training traditionally requires “mountains of data,” Wingate said, and mountains of data aren’t always available.

Using the NSF award money, Wingate and students in his lab are trying to reduce the amount of data required for effective AI. To do that, they’re employing Bayesian statistics, which, he said, “are a great model for how humans process info.” How it works, in a nutshell: “I have some beliefs about the way the world works, I see some data, and then I update my beliefs.” 

“So if humans are so efficient at this and computers are so inefficient at this, how can we take these ideas from Bayesian statistics and combine that with deep learning and try to get the best of both worlds?” he asked.
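The belief-update cycle Wingate describes can be sketched in a few lines of code. This is a minimal illustrative example (a coin-bias estimate with a Beta prior), not drawn from his research — the prior and the observed data here are assumptions chosen for demonstration:

```python
# Bayesian updating in a nutshell: start with beliefs (a prior),
# observe some data, and update to new beliefs (a posterior).
# Illustrative example: estimating a coin's probability of heads.

def update_beta(alpha, beta, heads, tails):
    """Conjugate update: a Beta(alpha, beta) prior combined with
    coin-flip data yields a Beta(alpha + heads, beta + tails) posterior."""
    return alpha + heads, beta + tails

# Prior belief: the coin is probably fair (Beta(2, 2) has mean 0.5).
alpha, beta = 2, 2

# Observe a small amount of data: 7 heads, 3 tails.
alpha, beta = update_beta(alpha, beta, heads=7, tails=3)

# The posterior mean shifts toward the data: 9 / (9 + 5) ≈ 0.64.
posterior_mean = alpha / (alpha + beta)
print(round(posterior_mean, 2))  # 0.64
```

Even ten flips noticeably move the estimate away from the prior — the same principle Wingate hopes to exploit so that deep learning systems can learn from far less data.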

The possible applications for the research are wide and varied — ranging from drone functionality to low-cost medical imaging development. Wingate’s ultimate goal, he said, is to “not just make rich people richer, but solve problems that can really make the world a better place for everyone. If your life is a 9.3 on a scale of 1–10, maybe through the magic of machine learning we could make it a 9.4 and you’d have a better life. But with that same time and effort I could take someone’s life at a 3 and move it to a 5, and have a much greater impact. We’re constantly on the lookout for projects that could do that.”

Author: Andrea Christensen