In this paper, we introduce the "Garden Alive" system, which allows users to interact with an emotionally intelligent virtual garden by manipulating tangible user interfaces. Garden Alive aims to provide both entertainment and education. The proposed system is composed of three modules. The first module comprises tangible user interfaces that bridge the user to a garden in the virtual world. The second is an artificial intelligence module consisting of two sub-modules: an evolution module and an emotion module. The third is the virtual garden, which displays the growth and reactions of virtual plants. As for the user interfaces, the user's hand gestures are recognized by cameras, the amount of light is measured by light sensors, and the position of water is detected by water sensors. Based on this sensed information, the artificial intelligence module determines the evolutionary direction of the plants and changes their emotional state. Our virtual garden adapts the L-system so that the virtual plants grow in a manner similar to real plants. The proposed Garden Alive system contains several kinds of plants, each with different genes, so every plant is individually unique. We can therefore observe plants evolving over generations by evaluating a proposed fitness function. Finally, we implemented intelligent plants that show emotional reactions according to their interaction with the user, rather than simply reacting to stimuli. The proposed system is thus applicable to both entertainment and education.
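The abstract notes that plant growth follows an L-system, a parallel string-rewriting grammar whose output is interpreted as branching structure. The sketch below shows the core rewriting step under illustrative assumptions; the production rules here are a classic textbook plant grammar, not the grammar used in Garden Alive.

```python
# Minimal deterministic L-system rewriter. The rules below are a common
# plant-like example (F = draw segment, X = growth point, [ / ] = push/pop
# branch state when drawn), NOT the paper's actual grammar.
def lsystem(axiom: str, rules: dict[str, str], iterations: int) -> str:
    """Rewrite every symbol of the string in parallel, `iterations` times."""
    s = axiom
    for _ in range(iterations):
        # Symbols without a production rule are copied unchanged.
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

rules = {"X": "F[+X][-X]FX", "F": "FF"}
print(lsystem("X", rules, 1))  # F[+X][-X]FX
print(lsystem("X", rules, 2))  # FF[+F[+X][-X]FX][-F[+X][-X]FX]FFF[+X][-X]FX
```

In a full system such as the one described, the resulting string would be fed to a turtle-graphics interpreter to render the plant, and the rules or their parameters would be varied per plant according to its genes and environmental input.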
Taejin Ha, Woontack Woo, "Garden Alive: An Emotionally Intelligent Interactive Garden," International Journal of Virtual Reality (IJVR), vol. 5, no. 4, pp. 21-30, 2006.