How can we align algorithms with ethical values?

Ruth Williams, Secretary General of the Austrian Foundations Association, discusses the intangible threats of digital bias and how philanthropy can act to create awareness and make sure that different voices are being heard

What if? Reimagining Philanthropy together 

By Ruth Williams, Austrian Foundations Association 

This new pan-European web series, designed by the #NextPhilanthropy group within the PEX community, challenges the status quo of philanthropy through open discussions and other formats. The inaugural event focused on biased artificial intelligence.

Intangible threats

We are in the middle of very difficult times. 2020 and 2021 have been challenging for all of us, no matter where we live. But the challenges differed, depending on the country you live in, the education you have received, the wealth you do or do not possess, and whether or not you are employed. As a philanthropy support community, we saw foundations and other civil society organisations rushing to help those in urgent need: food banks, financial support, housing. Foundations have also stepped up to support research and science on COVID-19. But while we are all still in emergency mode, waiting to get vaccinated and wanting to get back to some kind of “normal” life, there are many other topics out there that are as dangerous to our society as the virus, if not more so. Climate change is one example.

“The fast development of artificial intelligence and how data negatively impacts most acutely groups that are already socially marginalised.”

There is another topic which is not tangible and, because of this, does not appear urgent, but which has a huge impact on all our lives, today and even more so in the future: the rapid development of artificial intelligence and the way data harms most acutely those groups that are already socially marginalised. This development is hugely complex and understood by only a small circle of experts, technology companies and decision-makers. Emerging technology has great potential to increase efficiency and optimise existing processes, but it can also carry dangerous biases that shape the lives of already marginalised groups.

What if? 

During the inaugural “What if? Reimagining philanthropy” event, Benjamin Ignac, a Romani technologist, researcher and activist from Croatia, talked about how technology can be useful in many areas of life and shared his concerns about how it can be turned against marginalised communities like the Roma people, a group that is already marginalised in so many ways, not only digitally.

“How do we, as a philanthropic community, ensure we take everybody along on this journey of digitalisation?”

We all depend on digital technology for education, work, and social and medical services. How do we, as a philanthropic community, ensure we take everybody along on this journey of digitalisation? A journey that is happening right now and developing very fast. A journey that should leave no one behind.

Access to technology is crucial for, well, everything. But which data is collected, how it is collected and how it is used, as well as the biases built into algorithms, affect many lives. While many are concerned about data collection and privacy, the lack of data presents a challenge for the Roma community, who are simply not represented in the digital domain. This makes it hard to identify, understand and tailor programmes that help solve their problems. The same applies to many other marginalised groups.

Data is also the lifeblood of policymaking, and Europe is a colour-blind society when it comes to AI and the bias it can bring. To give an example: women and people with darker skin are more likely to be misidentified by facial recognition systems. This influences who is considered a potential threat by the police and often feeds into predictive policing and sentencing. Certain urban areas become a common target for predictive policing, which creates a downward spiral and only worsens criminalisation.

How much control do we want algorithms to have?

It is people who teach artificial intelligence and train machines to make decisions – sometimes challenging ethical decisions. And these people pass their biases on to the machines. Algorithms are everywhere. They influence what you see in your social media news feed and so shape the information you receive. They influence how you are rated by social scoring systems, which can mean being deprived of accommodation or medical services. They are used in the name of efficiency, but they are understood and developed by a narrow circle of people: experts, technology companies and policymakers who are often not diverse enough and do not necessarily understand the complex circumstances of local communities, their needs, values and specificities. They bring problems into our everyday lives that we are not prepared to deal with.

Optimism – we need to change the rules

Philanthropy needs to live the value of optimism. We can change things, step by step. One of the first steps is to speak out and raise awareness of challenges that are still under the radar of decision-makers and policymakers.

We need to empower decision-makers and public services to adapt to new policies. We need to invest our resources in developing ethical standards for algorithms, taking into account complex social realities while remaining aware of the historic injustices the world holds today. We need to create an ecosystem of better understanding. We need to shift policymakers’ attention to ethics in AI and to the fact that all technology should be human-centred.

One way forward could be to involve a wider range of members of society in development processes. To open up each other’s minds. To listen, engage and understand. To realise our own limitations, due to our backgrounds, our knowledge and the historic circumstances. We need to raise awareness. Human rights NGOs often do not deal with digital rights but have access to a wider range of stakeholders. Some companies and governments benefit from the biases in AI development. We need to be vocal about this and engage in a dialogue to create a win-win situation for all members of society. It is not digitalisation that is the problem. It is the absence of a human-centred approach.

“Philanthropy has a vital role to play, as it is driven by community voices and has access to civil society organisations and people who can talk about their experiences of discrimination.”

What are your thoughts? Join the conversation #nextphilanthropy.



Ruth Williams

Secretary General, Austrian Foundations Association
