When Lingyao “Ivy” Yuan walked into a social psychology seminar during her Ph.D. program, she encountered an image that would shape her career: three rocks on a table, each with googly eyes. One was large, another medium-sized, and the smallest sat in front, completing the trio. Without thinking, Ivy saw more than stones—she saw relationships, and her mind declared, “This is a family.”

That moment sparked a question that still drives her research today: If we naturally see human qualities in simple objects, what happens when technology creates something that looks and acts almost human?

Ivy earned her Ph.D. in Information Systems at Indiana University, with a minor in social psychology. When it came time to choose a dissertation topic, she brought four ideas to her advisor. One stood out—not because it was safe, but because it was risky. “I said, ‘I really want to do anthropomorphism. You’ve never done this before, but hear me out,’” she recalls. Her advisor’s response sealed the deal: “You look more excited when we talk about this one.”

Anthropomorphism—the tendency to attribute human characteristics to non-human entities—has long been studied in sociology and religious studies. Ivy wanted to explore it in the context of information systems: Why do we yell at a computer when it crashes, or treat a virtual assistant as if it understands us? That curiosity became the foundation for her career.

After joining Iowa State in 2015, Ivy’s research evolved from studying anthropomorphism to investigating “digital humans,” highly realistic virtual representations that can be controlled by either human intelligence or artificial intelligence. Her early work tackled the “uncanny valley,” a theory suggesting that as a virtual agent becomes more human-like, our sense of affinity increases—until it suddenly drops into a valley of discomfort when the likeness feels eerie. Ivy and her collaborators tested whether that valley could be crossed by pushing realism further.

In 2017, she helped stage a groundbreaking experiment at SIGGRAPH, a major visual arts conference. “We had a real-time rendering setup where my collaborator appeared as a virtual agent with less than a second delay,” she says. “It took 20 engineers behind the scenes to make it happen.” Back then, creating a floating, shoulder-up avatar was expensive and complex. Today, companies can pay $30 for a subscription to generate lifelike avatars—a shift Ivy predicted years ago.

Host avatar (left) and sample guest avatar (right) at the 2017 SIGGRAPH conference.

Her current research builds on that foundation but goes deeper into the ethical and psychological dimensions of human-AI interaction. One project examines “moral patiency”—whether humans should behave morally toward digital beings.

“It’s not about whether AI should act ethically,” Ivy explains. “It’s about whether we, as humans, should hold ourselves to a moral standard when interacting with something that looks and feels human, even if we know it’s not.”

Actual host and guest at the same conference. Photo credit: Lingyao (Ivy) Yuan.

This question touches on profound issues: If someone shouts at a digital assistant or abuses a virtual companion, does it matter? Could those behaviors spill over into real-world relationships? And where do we draw boundaries when digital humans resemble vulnerable populations, like children?

Another line of research explores disclosure and trust. If companies openly state that an interaction involves a digital human, does that change how people respond? “Sometimes you can’t help it,” Ivy says. “Even if you know it’s artificial, your unconscious response might still treat it like a human.”

Her team is also studying applications in healthcare and customer service, including whether digital celebrities can influence behavior. In one study, they created a virtual version of actor Hugh Jackman to deliver skin cancer prevention messages—a cause he personally champions. The results suggest that people are more receptive when the message comes from a familiar face, even if that face is entirely virtual.

In 2023, Ivy achieved two major milestones: tenure at Iowa State and publication in Harvard Business Review. Her article outlined four types of digital humans—virtual agents, assistants, companions, and influencers—and argued that virtual companionship is the next frontier. “Imagine someone with Alzheimer’s seeing a younger version of their spouse in a virtual environment,” she says. “Or a soldier experiencing PTSD finding comfort in a judgment-free digital therapist. These scenarios aren’t far off.”

Ivy and her son enjoy making music together.

Despite her futuristic research, Ivy’s personal life is grounded in creativity and wellness. She’s a certified yoga instructor and has taught classes in Ames. Since 2018, she has taken piano lessons, and the hobby has become a family affair. Her husband plays violin, and both sons study music, turning their home into a mini concert hall. These pursuits aren’t just hobbies; they were lifelines during postpartum depression. “Yoga and piano helped me cope,” she says. “Now they’re bonding moments with my family.”

From googly-eyed rocks to virtual companions, Ivy Yuan’s work bridges psychology, technology, and ethics. As digital humans become part of everyday life, her research ensures we understand not just what these agents can do—but what they mean for human connection.