Three DKU papers presented at human-computer interaction conference

Four research assistants at Duke Kunshan University presented their projects on human-computer interaction (HCI) at an academic conference in Indonesia.

Kaiyuan Lou, Weicheng Zheng, Qingyang He and Hongni Ye from the DKU HCI Lab attended Chinese CHI 2023 in Bali as co-authors of three papers and also served as conference volunteers.

Their research touched on themes including the ethical use of generative artificial intelligence and challenges encountered by autistic children learning about privacy.

Lou, Zheng and He, who are undergraduates in DKU’s Class of 2024, and University of Hamburg doctoral student Ye, collaborated on the projects with researchers from Tsinghua University, the University of California, Berkeley, and Virginia Tech.

Zheng and He, who study data science and media and arts respectively, saw their paper receive an honorable mention at the conference. They said the experience opened their eyes to the possibilities within the field.

“Meeting scholars in the HCI field allowed us to gain insights into its diversity, as well as the underlying problems that need to be identified and solved,” they added.

Held on Nov. 13-16 last year at the Tsinghua Southeast Asia Center, UID Bali Campus, the conference is organized by the International Chinese Association of Computer Human Interaction (ICACHI) and fosters connections between Chinese HCI scholars, academia and industry. The theme of the conference was “Generative. Reflective. Envisioning”.

“At the conference, we saw many remarkable works by big names and learned about subjects we had never explored before,” said Lou, who majors in data science.

“This conference allowed me to see more interdisciplinary possibilities in HCI and to think more thoroughly about future research directions.”

Ye said, “We were delighted to meet researchers and scholars from different regions and countries who are dedicated to their respective fields of study. It was a fruitful experience.”

Led by Dr. Xin Tong, assistant professor of computation and design at Duke Kunshan, the DKU HCI Lab has researched areas including digital human construction and VR/AR games.

Read the authors’ introductions to their three papers below:

“Exploring Designers’ Perceptions and Practices of Collaborating with Generative AI as a Co-creative Agent in a Multi-stakeholder Design Process: Take the Domain of Avatar Design as an Example”

Qingyang He, Weicheng Zheng, Hanxi Bao, Ruiqi Chen and Xin Tong

With the wide application of generative artificial intelligence (AI) tools, designers’ workflows are undergoing a drastic change, especially for designs that involve multiple stakeholders, such as investors and audiences. However, little is known about how designers view and collaborate with generative AI as a co-creative agent. To investigate the topic in depth, we selected the domain of static graphic design as an example and conducted a qualitative interview study with 21 professional avatar designers with varying levels of experience and proficiency in using generative AI tools. We found that designers frequently struggled over whether to regard AI as a co-creator that takes the interests of all stakeholders into account. During the actual co-creation process with the AI system, they also faced many challenges, such as repeatedly adjusting AI outputs during iteration and generation and searching for design inspiration. Based on these findings, we summarized the knowledge and creation patterns of collaboration with generative AI and suggested potential paths for future co-creation between designers and AI from both technical and ethical perspectives.

“I Keep Sweet Cats In Real Life, But What I Need In The Virtual World Is A Neurotic Dragon: Virtual Pet Designs With Personality Patterns”

Hongni Ye, Ruoxin You, Kaiyuan Lou, Yili Wen, Xin Yi and Xin Tong

Virtual pets, as virtual agents designed to provide companionship and emotional value, reflect users’ emotions, personalities, emotional needs and aesthetic preferences to a considerable extent. In this study, we designed virtual pets’ visual appearances using the Five Factor Model (FFM) for clustering and analysis. We also built generative neural networks based on Neural Cellular Automata (NCA) to generate a large number of new voxel models from a limited set of samples. In two user studies, we collected feedback on the varied pet models from 33 and 47 participants respectively. Experiment 1 showed that the voxel-based models had significant advantages in agreeableness, which is more in line with users’ expectations for virtual pets; in Experiment 2, users preferred the AI-generated models, and the data showed that some features of the models were closely related to users’ specific perceptions.

“RedCapes: Design and Evaluation of a Game Towards Improving Autistic Children’s Privacy Awareness”

Xiaowen Yuan, Hongni Ye, Ziheng Tang, Xiangrong Zhu, Yaxing Yao and Xin Tong

Autistic children face unique challenges in privacy education because of differences in social communication and physical self-awareness and difficulty understanding abstract concepts, yet these developmental differences have not been considered in the design of past research tools. Our research aims to fill this gap by studying the effects of, and challenges in, privacy learning for autistic children through the design and evaluation of an interactive privacy-education prototype called RedCapes. We observed nine autistic children and six typically developing children playing the game, assessed their learning of privacy-related knowledge (including the percentage of correct answers in pre- and post-test privacy awareness assessments and completion time), and gathered data from questionnaires and parent interviews for analysis. We found that RedCapes helped improve the privacy awareness of autistic children, but the children still struggled to understand and express privacy concepts. Based on these findings, we outlined implications for designing future privacy education games for autistic children, highlighted their special needs in learning about privacy, and offered advice for future designs in this area.


Media Contact

Gareth McPherson

Senior Editor/Writer

