In the not-so-distant future, androids may be part of our everyday lives. We may want these androids not only to do our work, but also to communicate with humans in a natural way. Since a substantial part of human communication happens through body language, natural mimicry is essential if artificial communication is to seem real. Although this might appear to be a trivial problem, there are many obstacles to overcome. First, to achieve a physically human appearance, we have to develop artificial skin that looks and folds naturally. This is far from an easy task, as living tissue has entirely different characteristics from synthetic materials. Second, we need some form of artificial actuators, preferably situated in or behind the synthetic skin. Human muscles have remarkable properties that we cannot yet match: they are silent, strong, flexible, and precise, and they last for millions of cycles. Finally, there is a need for a sensory system of some sort. Humans have an extremely advanced feedback system, providing information about factors such as pressure, temperature, and pain. This feedback enables humans to make sophisticated decisions about their surroundings and to adjust their behaviour accordingly.

Yet even if all these factors were in place, natural mimicry would not be achieved until the android developed adaptive facial behaviour. A robot could of course be pre-programmed with a fixed set of facial expressions, but this would undoubtedly restrict the personification of such an android. To achieve natural mimicry and the impression of personality, it is essential to make the android's facial expressions adaptive, and to enable the android to learn and create facial expressions never shown before. This master's thesis addresses parts of the problem of achieving adaptive facial behaviour, which is essential if androids are to operate in social settings and communicate naturally with humans.