New paper on voice UX
Just published! “Unboxing Manipulation Checks for Voice UX” (Interacting with Computers) https://doi.org/10.1093/iwc/iwae062 #VoiceUX #HCI #research #methodology
Katie Seaborn, Jun Kato (AIST), and Mayu Koike (Tokyo Tech) were awarded a JSPS Grant-in-Aid for Scientific Research (B) (KAKENHI Kiban B) for the project Kawaii Vocalics: Modelling Reactions to “Cute” Voice Phenomena in Interactive Agents.
https://kaken.nii.ac.jp/grant/KAKENHI-PROJECT-24K02972/ #jsps #grant #award
Takao Fujii and Katie Seaborn (Aspire Lab), alongside co-author Madeleine Steeds (UCD), received a #CHI2024 Honorable Mention award (top 5% of submissions) for “Silver-Tongued and Sundry”! https://programs.sigchi.org/chi/2024/awards/honorable-mentions #chi #chi24 #award #intersectionality #japan #pronouns #chatgpt #hci #design #research #llm
Selecting sites for academic conferences is a key issue in the SIGCHI community and beyond. Dr. Katie Seaborn and Adrian Petterson (University of Toronto) have co-authored a piece exploring possibilities for the path forward. https://medium.com/p/071173b3c34e #acm #sigchi #hci #chi #chi24 #whychi #siteselection