26.04.2024
#bayernkreativPORTRAIT: Eva is an enthusiastic feminist, scientist and co-founder, known for her favorite topic: feminist AI. She brings her feminist perspective to consulting, research and lectures. Her focus is on power - how it is distributed in companies, politics and society, and how we can change this imbalance. As a doctoral candidate in business informatics at the Friedrich-Alexander-University Erlangen-Nuremberg in the Business and Human Rights doctoral program, Eva is researching how artificial intelligence can be made more feminist. Alongside her research, Eva is co-founder of enableYou - a young organizational and IT consultancy that helps organizations achieve more love, meaning and growth, with a focus on Future Leadership, Future Organization and Future Skills. In this interview, Eva tells us more about feminist AI (a movement and brand of enableYou), her work and the impact of artificial intelligence on the cultural and creative industries ...
You set up the feminist AI think tank together with Andreas Kraus at enableYou to steer the discussion in a new direction with a feminist perspective on AI. What does "feminist AI" mean to you and why is this perspective so important?
For me, feminist AI is a great opportunity to make our world a fairer place. Currently, most AI systems reproduce existing power structures, prejudices and stereotypes. We can even say that AI is currently making our world even more unjust - and we can't afford that! But we can also develop and use AI differently. We can do this in a feminist way. It is important to note that by feminism, I mean an intersectional and inclusive feminism that strives for justice for all marginalized people. So it's not just about women. Feminism means critical questioning, transformation and a change in power. This is how we can change the world: AI is used everywhere today and feminism has already made our world decisively fairer. If we bring this together, we can prioritize and automate diversity, justice and the focus on marginalized people through AI. Our vision is to make the world a fairer place with feminist AI.
Through the feminist AI Academy, you offer workshops for companies to raise awareness of the ethical use of AI. Can you give us a little insight into what such a workshop looks like for you and what participants can expect?
We are currently holding workshops with different focal points: 1) responsible and feminist development and use of AI, 2) fair AI prompting and 3) the development of feminist AI features. Our workshops are usually divided into three components: 1) knowledge through a keynote and examples, 2) discussion and reflection on your own use cases and 3) solutions and best practices for your own company. Depending on the focus of the workshop, we integrate practical phases such as fair AI prompting.

The first step is to raise awareness of the blind spots and stereotypes we all have and of the opportunities and risks of AI. That's why we usually start with a keynote on AI, its relation to power and feminism, and a series of examples of AI use cases and unfair AI. These are crucial to draw attention to the far-reaching risks of unjust AI. This is followed by a question and discussion session and a reflection phase. The aim is for participants to understand and reflect on where AI is used in their company and which people may be (negatively) affected by it.

Then it gets more practical: we work together with the participants on their AI use cases and highlight the cases in which AI is susceptible to discrimination. In the last part, we talk about possible solutions. The experience of the participants is key here, as no two companies and no two AIs are the same. We talk about how to minimize the risks of AI and share best practices for the feminist use and development of AI.
Supposedly objective and neutral AI systems already in circulation and used by companies and public authorities have been criticized for reproducing social prejudices and stereotypes. How exactly is this possible? What do you think companies should look out for when using AI systems, for example in recruitment?
Unfortunately, there are a whole range of examples of unfair AI, from recruiting, lending and pricing to surveillance and court decisions. Since AI is used almost everywhere, it already influences countless aspects of our lives. AI reproduces existing power structures because it is developed and deployed by a certain group of people, because it learns from past data and because it prioritizes prevailing values in its design. All of this usually happens in a capitalist context and with the purpose of automating existing processes. Unfortunately, neither the people who make decisions about AI nor those who train and develop it are usually very diverse. For the most part, it is a fairly male, fairly white and fairly privileged group. In addition, the data contains existing and past stereotypes and prejudices, and marginalized groups are often underrepresented or even absent (e.g. the gender data gap: women are strongly underrepresented, non-binary people are usually completely absent). In the design phase, priority is then given to features that rarely take the needs of marginalized people into account. AI is power. It is based on data and produces data. It costs a lot of money and it creates a lot of money. This power often lies in the hands of a few privileged people and institutions.
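To make the data gap point concrete: a simple audit of who is (and is not) represented in a training set can be run before any model is trained. The following Python sketch is only an illustration; the file name, column name and reference shares are hypothetical placeholders, not something used by Eva or enableYou.

```python
# Minimal representation audit for a tabular training set.
# File name, column name and reference shares are hypothetical placeholders.
import pandas as pd

df = pd.read_csv("applicants.csv")            # e.g. historical recruiting data

counts = df["gender"].value_counts(dropna=False)
shares = counts / len(df)
print(shares)

# Compare against a rough reference distribution and flag gaps.
reference = {"female": 0.50, "male": 0.50, "non-binary": 0.01}
for group, expected in reference.items():
    actual = shares.get(group, 0.0)
    if actual == 0.0:
        print(f"warning: '{group}' is entirely absent from the data")
    elif actual < 0.8 * expected:
        print(f"warning: '{group}' is underrepresented "
              f"({actual:.1%} vs. reference {expected:.0%})")
```

An audit like this only surfaces the gap; deciding what counts as adequate representation, and whose categories are recorded at all, remains a human and political question.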
We shouldn't wait until an AI system is already developed or deployed before we intervene. At the very beginning of the process, it is crucial to ask WHY AI should be used at all. What is the purpose? In most cases, it is not a good idea to simply automate existing processes if they involve people, because those processes were usually burdened by prejudice and power inequality in the past. If we automate this without reflection, things can't get any better. We have to start with the people: we need more diversity, for example by including a diversity panel. We need to start with the data and ensure a more balanced representation. And we need to make the design feminist and not focus the features on the privileged. If we do that, then we have a good chance of developing and using more feminist and equitable AI.
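One concrete way to "start with the data" is to upweight examples from underrepresented groups during training so that a model does not simply optimize for the majority. This is a minimal sketch under assumptions (a scikit-learn style classifier and the same hypothetical file and columns as above), not a complete fairness solution.

```python
# Reweight training examples so that each recorded group contributes
# roughly equally, instead of in proportion to its skewed frequency.
# File, feature and column names are hypothetical placeholders.
import pandas as pd
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("applicants.csv")
X = df[["years_experience", "num_projects"]]   # illustrative features only
y = df["hired"]

# Rare groups get proportionally larger weights during training.
group_freq = df["gender"].map(df["gender"].value_counts(normalize=True))
sample_weight = 1.0 / group_freq

model = LogisticRegression(max_iter=1000)
model.fit(X, y, sample_weight=sample_weight)
```

Reweighting is only one lever on the data side; it does not replace the WHY question, a diverse team or a feminist design.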
AI is being used in many areas, including increasingly in the cultural and creative industries (CCIs). What consequences does this have for the CCIs?
I think that generative AI in particular has and will continue to have a major impact on the CCIs. However, AI has already been influencing them for some time. Platforms such as Spotify, with their almost unlimited catalogs of music, have changed user behavior and the power dynamics of the music industry. It is now about appearing in the right playlists and being suggested by the AI. The first few seconds of a track are particularly important, otherwise users may skip straight to the next one. Tracks have therefore become shorter, have fewer intros and need to be convincing from the very first moments. Art has changed in order to be and remain successful. We see similar phenomena in the visual field: there is a flood of images and videos today and it is becoming increasingly difficult to stand out, which is why filtering and personalization algorithms are becoming more and more important. So one aspect is that we adapt our creative processes to AI.

Added to this is generative AI, which makes logo design, photography and image generation, and text production much easier for non-experts. It also opens up new possibilities in terms of quality and speed for professionals. I think it will give an advantage to professionals who learn to use AI in their processes now. However, it will not replace creative work, but change it. This brings many opportunities, but also risks: if we ourselves use AI as a source of ideas, the results can become increasingly uniform. So once again, a lot depends on WHY and HOW.
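The point about the first few seconds can be illustrated with a deliberately simplified toy ranking: if a recommender demotes tracks that listeners skip early, artists are pushed toward front-loaded songs. The scoring formula below is invented purely for illustration and is not how any real platform ranks music.

```python
# Toy illustration: a ranking rule that penalizes early skips rewards
# tracks that grab attention in the first seconds.
# The scoring formula is invented for illustration only.
from dataclasses import dataclass

@dataclass
class Track:
    title: str
    completion_rate: float   # share of plays listened to the end
    early_skip_rate: float   # share of plays skipped within the first seconds

def score(t: Track) -> float:
    return t.completion_rate - 2.0 * t.early_skip_rate   # early skips weigh double

tracks = [
    Track("long atmospheric intro", completion_rate=0.60, early_skip_rate=0.35),
    Track("hook in second one",     completion_rate=0.55, early_skip_rate=0.05),
]
for t in sorted(tracks, key=score, reverse=True):
    print(f"{score(t):+.2f}  {t.title}")
```

Real systems are far more complex, but the incentive is the same: art adapts to whatever the filter rewards.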
How can a feminist perspective help to develop ethical guidelines for the responsible use of AI in the cultural and creative industries and beyond?
A feminist perspective takes into account the needs and realities of marginalized people. It thinks inclusively and intersectionally and does not aim to reproduce the status quo, but to transform it. I think this is crucial for the guidelines we want to set for a responsible approach to AI. If we don't set a feminist direction, then our future won't be feminist either. What we build today will influence the next generations. So it's time to think about the future in a more feminist way today! Attention to disadvantage, power structures and those affected - and to how we can change injustice - should be part of every decision, especially when it comes to AI. If we do this, then we have the chance of a fairer future.
More and more people are calling for more state regulation in the development and use of AI systems. Do you believe that legal regulations are sufficient to meet the challenges posed by AI?
I think that legal regulations can be a first step and that they are key, because we can see that it won't work without them. However, I also think that we as a society must work to make AI fairer, because we are all affected by it. We must demand from both politics and business that training data is transparent, that the purpose of AI is clear and that AI is used to make our world fairer, not more unfair. We have a right to expect AI to benefit us all, not just the privileged. Companies also have an important responsibility. They should not see laws and human rights as a maximum to be achieved, but as an acceptable minimum. It should be far less about finding loopholes than about pursuing a company's actual purpose. Companies should be concerned with enriching a large part of society with good products and services, not just themselves.
Last but not least: What developments in the field do you hope to see in the future?
I hope that much more AI will be developed and used for feminist purposes, because then we can make the world more feminist with a new power.
The Bavarian cultural and creative industries are vital, cooperative, polyphonic and relevant to the future. We introduce you to Bavarian players. What is their business model? What drives them?
Would you also like to answer a few questions and be part of our "bayernkreativPORTRAIT" campaign? Then send us an email to kontakt(at)bayern-kreativ.de with the keyword "bayernkreativPORTRAIT".