I am currently a Human-AI Interaction researcher. In the past, I was a full-stack developer, and I aspire to become an entrepreneur in the future. My present research interests are in human- and community-AI interaction. Specifically, I design interactive visualization systems that enhance AI's transparency and controllability for programmers, and I develop social computing systems that derive insights to lay the groundwork for fostering programmer-AI communities.
I am pursuing a thesis-based Master's in Computer Science at the University of Waterloo, mentored by Dr. Jian Zhao of the WatVis Lab at UWaterloo. Previously, I was advised by Dr. Zhicong Lu of the DEER Lab and Dr. Can Liu of the ERFI Lab at CityU HK. After obtaining my Master's, I would like to pursue a PhD, continuing my research in Human-AI Interaction and Social Computing.
I am open to collaborating on research projects and actively seeking a PhD opportunity.
We explored the workflow of collaborative natural language programming and designed a system to support prompt sharing and referring.
This work explores the Cognitive Science literature to synthesize relevant theories and findings for the HCI audience to reference. Thanks to my wonderful supervisor, Can Liu, and my co-authors, Brinda Mehra and Kejia Shen.
I'm attending the CHI '23 conference in Hamburg, Germany in 2023. I will present my work on collaboration between humans and code generation models, as well as my work on moderating online social media and fostering a sense of community.
Wizundry is a WoZ platform that allows multiple Wizards to collaboratively operate a speech-to-text based system remotely. Our findings reveal the promises and challenges of the multi-Wizard approach and open up new research questions.
We have identified three major challenges and proposed three decision-making stages, each with its own relevant factors. Additionally, we present a thorough process model that captures programmers' interaction patterns.
A system that assists programmers by enabling hierarchical task decomposition, incremental code generation, and verification of results during prompt authoring, bridging the abstraction gap between programmers and LLMs.
A system to support collaborative prompt engineering by providing referring, requesting, sharing, and linking mechanisms. It assists programmers in comprehending collaborators' prompts and building on their collaborators' work, reducing repetitive updates and communication costs.
A narrative-based viewer participation tool that utilizes a dynamic graphical plot to reflect chatroom negativity. We discovered that StoryChat encouraged viewers to contribute prosocial comments, increased viewer engagement, and fostered viewers' sense of community.
Exploring speech input in HCI, we address editing challenges. Our study combines Cognitive Science with HCI, revealing memory patterns and proposing new interaction concepts for efficient speech editing.
A real-time, web-based WoZ platform that allows multiple Wizards to collaboratively operate a speech-to-text based system remotely. Our findings reveal the promises and challenges of the multi-Wizard approach and open up new research questions.
May 2021 - Jan 2022
Oct 2021 - April 2021
We developed a patented sensor for measuring dissolved oxygen in the ocean and leveraged its fast, real-time measurements to build an AI network dedicated to monitoring water quality in real time and predicting ocean health up to three months ahead.
Sep 2020 - May 2021
I am part of a new team at Networld, nearD, a social networking site with a focus on privacy, multi-identity, and locality.
Jan 2023 - Present
June 2021 - Aug 2021
Sep 2018 - June 2022
I love to read theoretical papers about interface design and HCI design principles.
Reification turns concepts into first class objects, polymorphism permits commands to be applied to objects of different types, and reuse makes both user input and system output accessible for later use.
Demonstrational interfaces let the user perform actions on concrete example objects while constructing an abstract program, allowing the user to create parameterized procedures and objects without learning a programming language.
The seven-stage interaction model consists of (1) Establishing the Goal, (2) Forming the Intention, (3) Specifying the Action Sequence, (4) Executing the Action on the System's Interface, (5) Perceiving the System's State as a Response to the Action, (6) Interpreting the State, and (7) Evaluating the System State with respect to the Goals and Iterating until the goal is achieved.
The paper argues for a shift from interface design to interaction design as the means to significantly enhance user interfaces. It calls for the development of powerful interaction models, a better understanding of sensory-motor aspects, and novel interaction architectures addressing key challenges like reinterpretability, resilience, and scalability.
Instrumental Interaction describes graphical user interfaces in terms of domain objects and interaction instruments. Interaction between users and domain objects is mediated by interaction instruments, similar to the tools and instruments we use in the real world to interact with physical objects.
Direct manipulation has been lauded as a good form of interface design, and some interfaces that have this property have been well received by users. This article delves into the cognitive aspects of direct manipulation interfaces, examining both their advantages and disadvantages.
Why do people create extra representations to help them make sense of situations, diagrams, illustrations, instructions and problems? The obvious explanation—external representations save internal memory and computation—is only part of the story.
Layout constraints in a user interface toolkit provide a declarative mechanism for controlling the size and position of objects in an interactive display, along with an efficient update mechanism for maintaining display layouts automatically in the face of dynamic changes.
This paper explores the concept of "interaction," which lacks a clear definition in the field. It identifies various existing concepts, such as interaction as dialogue, transmission, optimal behavior, embodiment, and tool use. These concepts vary in scope and in their understanding of the causal relationships between humans and computers.