Abstract
This study examines how emerging sensor-based technologies can augment the personality expression of digital characters across multiple media. While digital animation and games have traditionally relied on movement to convey traits, the integration of motion capture, wearable biosensors, and live coding introduces new opportunities for dynamic, embodied character design. Drawing on the MONOLOVE saga, we developed four prototypes across animation, games, interactive performance, and interactive networked environments. Central to our approach is the Wheel of Personality model, a structured taxonomy that organizes expressive parameters into four categories: Character Structure, Motion–Action, Interaction, and Environment. Each prototype was designed to explore how these categories, mediated through sensor technologies, contribute to the perception of personality traits. An evaluation with 14 participants from diverse backgrounds employed questionnaires and interviews to assess the alignment between intended and perceived character traits. The results show that movement and interaction were consistently identified as the most influential cues, while the impact of environmental factors varied across media. Additional influences included narration and the audience's own personality, underscoring the interpretive nature of character perception. We conclude that personality expression emerges from the interplay of multimodal cues and context, and we offer methodological insights and frameworks for designing expressive and emotionally resonant digital characters in transmedia productions.