Robots were traditionally isolated entities inside factories, performing repetitive, mundane tasks, usually at higher speed and often with greater precision than humans. However, robots with advanced human-robot interaction abilities have started to emerge: robots that can, for example, converse in natural language, possess sophisticated vision and affective capabilities, and even fluidly collaborate and turn-take with humans. Such robots have been targeting a variety of roles, from companions, to tabletop assistants, to museum characters and beyond. In this lecture, after an introduction, we will examine case studies of four such systems, also focusing on interesting issues regarding how we can measure different aspects of user experience when interacting with robots, including verbal, affective, long-term (repeated interaction), and task-related aspects. Numerous avenues for applying such concepts and technologies to non-physically-embodied systems (such as avatars or even body-less interactive agents) are also touched upon. Furthermore, a wider contextualization of this real-world topic is covered.