Ann-Kathrin Goßmann

Fraunhofer Institute of Optronics, System Technologies and Image Exploitation IOSB
ann-kathrin.gossmann@iosb.fraunhofer.de

Handling data

The recent successes of artificial intelligence (AI) have mainly been achieved with data-driven AI methods such as machine learning and deep neural networks. These methods rely on extensive data. In work and learning systems, such as those at the focus of KARL, personal data is often processed. Data protection is therefore mandatory and in turn presupposes the IT security of the processing systems.

Relevance

In AI-based systems and business models, data is not only the basis for functionality and for generating added value for users and customers. It is also a business asset that is well worth protecting and, insofar as personal data is processed, carries a responsibility towards the data subjects imposed by data protection law. Work and learning systems based on data-driven AI methods and used in Europe must be considered within the legal framework of the European General Data Protection Regulation (GDPR), German labor law (in particular the Works Constitution Act, BetrVG) and, in the future, most likely a European AI regulation. Sustainable solutions take these regulations into account not only to avoid sanctions in the event of data protection violations, but also because compliance can become a locational advantage for Europe, especially in the wake of the recent wave of AI successes. Raising awareness of the basics of data protection is therefore essential in order to provide comprehensive protection for the data subjects (employees) and safeguards for the responsible parties (employers).

Objective

KARL examines the questions that companies need to ask themselves from a data protection and data security perspective. The focus is on the introduction of AI-based systems that process sensitive and, in particular, personal data. The eight KARL use cases are intended to provide exemplary answers.

Specifically, KARL aims to answer the following questions in the context of data protection and data security:
- Under what conditions should data protection be considered for AI-based work and learning systems?
- What responsibilities and obligations are associated with the processing of personal data, and what particular implications does this have for AI-based systems?
- What other regulations may be relevant for such systems?
- How can data protection be supported by technology design with regard to issues such as transparency and data subject rights?
- What special requirements do AI-based systems place on IT security?

Handouts for companies

- Workshop on data protection fundamentals, covering the particularities of AI-based systems and specific requirements under the upcoming European AI Regulation (defining the purpose of data processing, cataloguing the personal data involved, identifying risks)
- Practical concepts for supporting data protection through technology design (e.g. data protection-friendly data collection, transparent data processing), as sketched below
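To make the idea of data protection-friendly data collection concrete, here is a minimal, hypothetical sketch (not taken from a KARL use case): direct identifiers are replaced with pseudonyms via a keyed hash, and only the fields needed for the stated purpose are retained. All field names and the key handling are illustrative assumptions.

```python
import hashlib
import hmac

# Secret pseudonymization key; in practice this would be managed by the
# data controller, e.g. in a key management system (illustrative only).
PSEUDONYM_KEY = b"replace-with-a-securely-stored-secret"

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier (e.g. an employee ID) with a stable
    pseudonym, so records can be linked without exposing the person."""
    return hmac.new(PSEUDONYM_KEY, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()

def collect_event(raw_event: dict) -> dict:
    """Apply data minimization: keep only the fields required for the
    stated purpose and pseudonymize the personal identifier before the
    record is stored or used for model training."""
    return {
        "worker": pseudonymize(raw_event["employee_id"]),
        "station": raw_event["station"],            # process-related data
        "task_duration_s": raw_event["duration_s"], # needed by the AI model
        # Other fields (e.g. the employee's name) are deliberately dropped.
    }

if __name__ == "__main__":
    raw = {"employee_id": "E-4711", "station": "assembly-3",
           "duration_s": 42.5, "name": "Jane Doe"}
    print(collect_event(raw))
```

Note that pseudonymization of this kind is not anonymization: as long as the key holder can re-identify individuals, the data remains personal data under the GDPR, but the approach reduces risk and supports purpose limitation.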
