Emotion recognition technologies are expected to grow at a rate of almost 40% per annum, reaching an expected market size of 36.07 billion USD by 2021. However, there has been a paucity of academic discourse about the substantial risks associated with these technologies and the legal solutions needed to address them. This thesis primarily explores biometric emotion recognition technologies that seek to avoid regulation under data protection norms by operating in close to real time and not storing any personal data. It challenges this viewpoint through a critical analysis of the concept of personal data in the General Data Protection Regulation. It also argues that such technologies in any case amount to an interference with the human right to respect for private life. However, it acknowledges that significant challenges remain in the field regardless of the theoretical legal outcome, and that data protection may in any case not be the best mechanism to protect people from harm.