Jules Polonetsky: CEO of Future of Privacy Forum; Co-Founder of Israel Tech Policy Institute

“It is not for Silicon Valley leaders or tech startups to determine the future of our democracy”

Norms are being smashed by private tech companies, and governments have finally decided to catch up.

In recent years, there has been a reckoning for private companies that have long played fast and loose with our private data. Without regulation or oversight, they collected more and more information on billions of people - all to capture and retain their attention in order to sell them ads.
That much we know. The lesser-known consequences of those actions taken in the early days of the social media era are unfolding today, with governments and non-profit organizations finally playing catch up to make sure citizens have the right to privacy when it comes to AI and data online. These are actions, led primarily by the European Union, that are shifting the power dynamics from tech giants and governments back to the people.

Jules Polonetsky: CEO of Future of Privacy Forum; Co-Founder of Israel Tech Policy Institute (Photo: Courtesy)

“Europe has just about done a giant regulatory structure for AI,” explained Jules Polonetsky, CEO of Future of Privacy Forum and Co-Founder of the Israel Tech Policy Institute. “The consequence is that companies, even startups at an early stage, need to be sophisticated about what the regulatory rules are, or they will have no business.” Citing the four-year anniversary of GDPR and the laws in several U.S. states that followed, Polonetsky praised the work being done by governments that finally understand the pressures of regulating tech companies.
Whether or not the federal government will join those early states is unknown. If it does, it will kick off a wave of compliance and regulatory activity like we've not seen before. “Many Israeli companies obviously look to the U.S. as one of their first markets, and if we actually have a strict regulatory structure, it’s really going to kick off a lot of work, effort, then compliance by companies that are pushing the edge sometimes on how they use data.”
Polonetsky visited Israel earlier this month for Cyber Week, where he helped bring together a group of leading privacy lawyers, top privacy academics, and the chief privacy officers of companies like Apple and eBay to discuss these issues. As CEO of the Future of Privacy Forum, he helps highlight the need for data privacy online. Before that, he was the Chief Privacy Officer at AOL, ensuring the company complied with data-collection laws.
“Us privacy folks are pressing the security people who, if it was up to them, might monitor information in ways they think are needed - and where some of us in the data protection world may want to draw boundaries,” he told CTech. “We don't want a world of 100% surveillance, even if it would make us safer… Privacy addresses the people with access to data and says ‘what are you doing with it?’”
In the view of the Future of Privacy Forum, there needs to be a balance between making sure governments and companies can protect citizens and “giving us some room to be stupid, talk personally, or to express ourselves without worrying that we are being monitored.” The issues can affect anything from adtech - having a chat with friends and then seeing adverts for those topics - to more nefarious things like blackmail, discriminatory behavior, or invasion of the right to privacy.
Companies themselves are trying to get ahead of these challenges by setting their own rules - something that can have unintended repercussions. For example, when Apple sets its own rules about what data apps can use in order to be allowed on its App Store, the result can be a “death penalty” for companies that accidentally violate them. “It is quite a potent way of regulating,” Polonetsky admitted.
Governments around the world have finally realized that these issues are too important to leave to the companies to have the final word. Companies that fight to remove unspeakable content like terrorism or child pornography may be acting nobly, but slippery slopes emerge when privacy is invaded and social media companies like Facebook or Twitter are left to impose objective rulings on subjective experiences.
“If you're going to kick off a president, or if you're going to determine what is a hate crime, that's not up to a Silicon Valley executive, that is up to the democratic process,” Polonetsky continued. “Europe is leading the way with strict penalties for what kind of content needs to be taken down. Oversight processes, at the end of the day, are probably a good thing. It is not for Silicon Valley leaders or tech startups to determine the future of our democracy.”
Governments taking control of data management and AI applications might appear to be a strong first step in regulating rights and freedoms, but more headaches arise when the political poison of partisanship is added to the mix. Conservatives and liberals often disagree on what qualifies as free speech, and in the past tech companies have been quick to silence voices that expressed certain political opinions or promoted information about Covid-19 that later turned out to be false. The line between freedom and privacy is becoming an ever-harder tightrope to walk.
“We need to be cautious that we don't create tools that allow governments to censor their opponents, or make decisions without due process themselves,” he told CTech. “At the end of the day, it is about having a democratic and moral process for balancing the conflicting rights and obligations of individuals and society.” For the folks at the Future of Privacy Forum, the laws of data protection from governments and tech companies mean one thing: power.
“They are the right for me to delete my data or the right for me to ask what people have on me,” Polonetsky added. “The obligation for you to only collect data if you have a defined purpose, and only use it for those purposes. You can see how these are things that constrain powerful entities - the big companies, government players that have your data - by putting shackles, structure, and purpose around what they can do with the data,” he concluded.