I see common ground between the two of you: even with artificial intelligence, the human component is still needed.
Clemens Suerbaum: Your opening remark that the working world will be shaped by AI in the future scared me a bit. I hope it will be shaped by more humanity.
I just read a news ticker report about an AI workshop. One of the projects mentioned used AI to count trees on aerial photographs. Currently, people walk through the area and laboriously keep a tally. If image analysis by an AI can determine that instead, it is a great relief for the people doing that work, and they can be deployed elsewhere, for example in tree planning or replanting.
Is that perhaps exactly the kind of groundwork, Ms. Steininger, that you mentioned earlier? You first have to put in a lot of thought about what you actually need, and then point the AI at it in a targeted way, perhaps to gain a bit more freedom again?
Rosmarie Steininger: I think that is very important. What do I want to do myself, and what of it do I leave to the AI? That combination is important. But also: what is the object I am actually looking at? If I'm counting trees, I've done well if I hit the right number, and perhaps I can also determine the size or something similar.
I would just be very careful with AI based on probabilities or on large-scale patterns when it comes to people. A human is an individual with very individual preferences, and the differences between people are huge. People are not trees. In personnel selection, it is important to capture exactly what a person needs, wants, and is able to do.
On the subject of assessment centers and AI: of course, a person can manipulate or distort just as much as a deterministic algorithm or an AI can. You have to take that into account and be sensitive to it from the start. The result should be as sound as possible, both normatively and technically.
Clemens Suerbaum: With these selection mechanisms, an important question is also: where does the training data actually come from? I just read an article about how AI can be manipulated in a targeted way, for example through "training as a service", more precisely data training as a service. The data is handed to a service provider who builds an algorithm from it. Take credit scoring based on all kinds of data, such as address, age and income, and perhaps also the circle of friends. The service provider can manipulate the model so that if, for example, a small "a" is appended to the house number, the loan is approved for the wrong person.
Such risks are not even on most people's radar. People assume AI will simply make everything easier. You have to pay attention here because it is a new field. This awareness of risk applies to both the human side and the business side.
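The manipulation described here is what the research literature calls a backdoor or data-poisoning attack. As a minimal sketch (all names and data are invented for illustration, not taken from the interview or any real credit system): a frequency-based toy "model" is trained on application data into which the service provider has slipped a few rows where a trigger feature, here a house number ending in "a", is always labelled "approve". The learner then picks up the trigger, and an otherwise identical low-income application flips from rejection to approval.

```python
# Toy backdoor illustration (hypothetical): a poisoned training set
# teaches a simple frequency-based learner to approve any application
# carrying the trigger feature "housenr_trigger" (house number + "a").
from collections import defaultdict

def train(examples):
    """Count, per feature, how often it co-occurs with approve/reject."""
    counts = defaultdict(lambda: [0, 0])  # feature -> [approve, reject]
    for features, label in examples:
        for f in features:
            counts[f][0 if label == "approve" else 1] += 1
    return counts

def predict(counts, features):
    """Approve only if the accumulated feature evidence favors approval."""
    approve = sum(counts[f][0] for f in features)
    reject = sum(counts[f][1] for f in features)
    return "approve" if approve > reject else "reject"

# Legitimate-looking training rows ...
clean = [
    ({"income:high", "age:40"}, "approve"),
    ({"income:low", "age:22"}, "reject"),
    ({"income:low", "age:30"}, "reject"),
]
# ... plus poisoned rows inserted by the service provider: low-income
# applications with the trigger are labelled "approve".
poisoned = [
    ({"income:low", "housenr_trigger"}, "approve"),
    ({"income:low", "housenr_trigger"}, "approve"),
]

model = train(clean + poisoned)
print(predict(model, {"income:low", "age:25"}))                     # reject
print(predict(model, {"income:low", "age:25", "housenr_trigger"}))  # approve
```

The point of the sketch is that the trigger is invisible in normal evaluation: on clean applications the model behaves plausibly, and only an applicant who knows the secret trigger benefits.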
What is the situation here in Europe and Germany? The General Data Protection Regulation is the big issue, especially where personal data is concerned. What about AI? Masses of data are processed, including personal data, especially in your case, Ms. Steininger.
Rosmarie Steininger: It depends very much on the context in which you use it. In the personnel area, there has so far been practically no regulation. Such regulation is now expected to come with the new EU regulation, which classifies personnel selection as a high-risk area.
My impression so far is that some are not even interested in what happens in the background. With purchasing companies, I quite often find that it is ultimately about quickly sorting candidates out. Nobody pays attention to whether that makes sense or not.
Clemens Suerbaum: I'll venture a bold claim: I think some players deliberately operate in the gray-to-black zone. On the other hand, if you tell people beforehand what purposes their data will be used for, they can consent or decline; the data has then been explicitly collected for that purpose. That is how CHEMISTREE does it, for example. Using collected data without prior consent is a legal violation. Nevertheless, this unauthorized use of data is common practice, because it is easy and creates benefits.