Biometric data is unique and permanent. You can change your password, but not your face or voice. This is precisely what makes biometrics fundamentally different from other types of personal data. Moreover, due to the rapid rise of AI, biometric applications are becoming increasingly versatile and more complex than traditional identification methods. As a result, the risk of errors and bias increases.
Biometric systems analyse personal characteristics, such as faces, voices or gait. Precisely because these characteristics are immutable, biometric applications increase the risk of prejudice, discrimination and unjustified conclusions. Think of false identification, exclusion of certain groups or unwanted surveillance.
At the same time, European legislation is becoming increasingly strict. The EU AI Act, the GDPR and additional guidance from supervisory authorities are sharpening the focus on individual rights.
Biometrics has evolved into a question of fundamental rights: privacy, autonomy, equal treatment and human dignity. Organisations that anticipate this can build trust and strengthen their competitive position.
The use of biometrics offers various advantages. 'One of the most important advantages is speed', says PwC expert Bram van Tiel. 'At an airport entrance, you normally have to queue, show your boarding pass and often confirm your identity. That takes time. If your biometric identity has already been linked to your boarding pass and passport in advance, you can walk straight through after a facial scan. That saves time and is also more efficient for the organisation, because it can process more passengers with fewer staff.'
Van Tiel continues: 'Organisations that have implemented biometric systems have often gone through a long process before they could use them. You can't just start using them. A common mistake is starting from the technology and only thinking about governance and privacy implications later. The most important advice: work out the business case in advance and assess in advance whether biometrics is necessary and proportionate.'
'Ask yourself what problem you want to solve. Why do you want to do this? To differentiate yourself, reduce costs or improve user convenience? Can you achieve the goal with a less intrusive method? And do the advantages outweigh the investments and risks? Ultimately, it's about assessing the biometrics business case in advance from multiple perspectives: financial, reputation, customer engagement, technological, ethical and compliance/regulatory.'
Organisations must first:
By establishing governance early, you prevent costly adjustments afterwards and reduce legal risks. In this way, you make compliance part of your design rather than an obstacle. Van Tiel: 'Securing biometric data requires special attention, because everyone's biometric profile is unique. A code, password or access pass can be misused, but it can also easily be replaced. Your voice, fingerprint or iris cannot. You therefore have to protect these extremely well.'
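The article does not prescribe a protection technique, but one common principle is "revocable" template protection: never store the raw biometric features, only a keyed transform, so that a breach is handled by rotating the key rather than by asking users to replace a face or fingerprint. The sketch below illustrates only that principle; the template bytes, key handling and function names are illustrative assumptions.

```python
import hashlib
import hmac
import secrets

def protect_template(raw_template: bytes, key: bytes) -> str:
    """Derive a revocable token from a biometric template.

    Only the keyed digest is stored. If the database leaks, rotating
    the key invalidates all stored tokens without the user having to
    'replace' their biometric.
    """
    return hmac.new(key, raw_template, hashlib.sha256).hexdigest()

# Hypothetical feature vector produced by a biometric engine.
template = b"\x12\x9a\x04\x77\x3f"

key_v1 = secrets.token_bytes(32)
token_v1 = protect_template(template, key_v1)

# Verification re-derives the token from a fresh capture of the same template.
assert hmac.compare_digest(token_v1, protect_template(template, key_v1))

# Key rotation after a suspected breach: old tokens no longer match.
key_v2 = secrets.token_bytes(32)
assert token_v1 != protect_template(template, key_v2)
```

Note that real biometric matching is fuzzy (two captures of the same face are never byte-identical), so production systems rely on fuzzy extractors or encrypted matching rather than exact digests; the sketch shows only the revocability idea.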
Legislation is also a guiding factor in governance. The most important laws are the GDPR and the EU AI Act. The AI Act prohibits certain AI-driven biometric applications, such as real-time remote biometric identification in publicly accessible spaces. The ban also covers biometric categorisation that infers sensitive characteristics, as well as emotion recognition in the workplace and in education. Untargeted scraping of facial images to build or expand recognition databases is likewise prohibited.
For permitted high-risk biometric AI systems, strict requirements apply. Organisations must ensure:
Supervisory authorities in the Netherlands and the EU adopt a strict approach:
If your organisation does not meet these requirements, you risk enforcement action, fines and reputational damage.
Adequate governance does not stop at implementation. As an organisation, you must carefully manage biometrics-related personal data throughout the entire data lifecycle. You must strictly limit retention periods to the operational purpose, and you must establish procedures for revoking and deleting data in the event of data breaches or misuse.
Logs and metadata also fall under privacy legislation. These can be traceable to individuals and therefore deserve the same protection as primary biometric data. Transparency towards users about what data is processed, for what purpose and how long it is retained is essential for responsible data management.
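One way to keep logs useful while protecting the individuals they describe is pseudonymisation: replacing direct identifiers with keyed, non-reversible aliases before the line is written. The log format, field name and key handling below are illustrative assumptions, not a prescribed scheme.

```python
import hashlib
import hmac
import re

# Hypothetical secret used only for log pseudonymisation; rotating it
# unlinks old log entries from future ones.
LOG_KEY = b"example-rotating-secret"

def pseudonymise(user_id: str) -> str:
    """Replace a direct identifier with a keyed, non-reversible alias."""
    return hmac.new(LOG_KEY, user_id.encode(), hashlib.sha256).hexdigest()[:12]

def scrub(line: str) -> str:
    """Pseudonymise identifiers of the hypothetical form 'user=<id>'."""
    return re.sub(r"user=(\S+)", lambda m: f"user={pseudonymise(m.group(1))}", line)

line = "2025-01-15 09:30:01 gate=A12 user=jdoe result=match"
clean = scrub(line)
print(clean)  # 'jdoe' is replaced by a stable 12-character alias
```

Because the alias is deterministic under a given key, operators can still correlate events for the same person within a key's lifetime, while the raw identifier never reaches the log store.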
Biometrics is not an IT project, but a management responsibility. For responsible use of biometric applications, focus is needed on:
'Biometrics is developing at lightning speed', Van Tiel concludes. 'Legislation is keeping pace and that's no coincidence. Where technology makes more and more possible, organisational responsibility grows. Can biometrics be deployed everywhere? No, certainly not. Caution is required. But organisations that apply biometrics responsibly are building trust, brand value and future resilience. In that sense, companies must now have the courage to use it within the legal frameworks.'