As founder of All Tech Is Human and a member of TikTok's Content Advisory Council, David Ryan Polgar is well versed in the ethical issues that technologies such as AI and social media can raise. With a background in law and prior experience in education, Polgar has been writing about responsible tech and the impact of social media and technology since 2012, and over that time he has observed a contrast in social behaviour between the offline and online worlds.
“For example,” Polgar explained, “in the offline world, when it comes to housing, discrimination based on race, gender and other immutable characteristics doesn’t tend to happen. Yet on somewhere like Facebook, messages and advertisements can be curated based on those characteristics, and that always seemed like a car crash waiting to happen.
“Early on, I saw a need for people such as attorneys and sociologists to get involved in the space, because I think it’s a misnomer to view the tech industry as solely the domain of technologists. Technology is now affecting how we live, learn, and even die.”
With misinformation and hate speech, among other harms, continuing to rise online, it has become more important than ever to ensure that content hosted by social media platforms is properly managed before these issues get out of hand.
“I think a lot of social media companies are trying to create a delicate balance between promoting free expression and limiting harms,” said Polgar.
“All of these platforms are realising the influence they have on how we communicate with one another and receive information.”
Networking for a better future
To meet the need for increased diversity in the tech sector worldwide, Polgar founded All Tech Is Human, a non-profit network for individuals and companies across civil society, government and industry, in 2018. The organisation looks to encourage wider discussion about how tech and social media can become more responsible and ethical, and to grow the responsible tech pipeline.
Expanding on his work at All Tech Is Human, Polgar commented: “We need to have change both from the outside and the inside. We’re looking to bring together a diverse range of stakeholders who act as these interlocking parts of the power structure.
“It’s important to bring together tech leaders, tech workers, policymakers, academics, advertisers and media. These groups often stay in their own siloed worlds, so a lot of what I’m doing here is creating a space for knowledge-sharing and collaboration.”
Through the network, tech companies can, for example, gain insight from experts in the space and find new ways to reduce misinformation.
The organisation also looks to create pathways for academics with ideas on mitigating problems, allowing them to reach out to tech companies and policymakers, increasing the long-term impact they can make on social media and the tech industry.
“It may seem like a simple idea, but can be a massive help,” said Polgar. “Changes can happen because tech companies, policymakers, or the media can gain insight on what’s happening at the ground level.”
Addressing regulation and misinformation
Polgar shared his insight during the recent Women in IT Summit USA, Part 1, participating in a discussion with Allie Brandenburger, CEO of TheBridge, about regulation in tech.
“One of the big takeaways from that discussion for me was that these issues around tech need input from the public,” he commented.
“At All Tech Is Human, we recently released a report on improving social media, and after interviewing a diverse sample of 42 individuals from across civil society, government and industry, we realised that we don’t have an agreed future as to where social media should be headed.
“This showed that we need more input from diverse groups to determine a better forward action.”
The All Tech Is Human founder went on to identify data extraction as the biggest issue regarding the power of social media, since most platforms are built on a model of harvesting user data, one that serves advertisers rather than users.
“The fact that social media practices are more geared towards advertisers than communication creates most of the problems we see,” Polgar continued.
“This is where regulations are important. These platforms are trying to maximise their profitability inside the parameters of legality.”
According to Polgar, while tech companies need to crack down on misinformation around topics such as Covid-19, the other side of the coin is the argument that social media platforms lack the moral authority to remove such posts.
“The issue that social media platforms have is that they are under pressure to be more proactive around tackling misinformation, but often don’t have the authority to make those decisions, because the general public can feel uncomfortable about the immense power they have,” Polgar explained.
“That creates this Gordian Knot of ethics, and shows why self-regulation when it comes to tech companies doesn’t work.”
Overall, when it comes to taking responsibility for ethics in tech, Polgar believes that individuals, industry and government should work in tandem to create a more responsible environment and future. While governments should, according to Polgar, “move from a reactive state to a proactive state”, citizens need more engagement, empowerment and education on the issues that tech can bring.
“Unfortunately, there is no silver bullet when it comes to building a better tech future,” said Polgar. “We all need to be involved in the process, and right now, we’re not. That needs to change.”
See also: The biggest trends in digital ethics – as digital technologies become more sophisticated, intuitive and powerful, their growing impact on society brings the pressing issue of ethics.