20-Minute Leaders
“We really are trying to make the world a more transparent and genuine place.”
A lack of transparency around content online affects every person every day, says Dan Brahmy, co-founder and CEO of Cyabra. People tend to think of disinformation as being mostly around politics and government, but he says bad actors have gotten tangled up with everything we consume and skew many of our decisions with manipulation we should care about. He and his co-founders wanted to improve transparency with Cyabra, which Brahmy explains is like a filter for online conversations. Rather than fact checking, their product currently checks the author and propagation of written content to give users more information about what they read. While four years ago, many people agreed that fake news was an issue, now they are asking for a solution, he shares. Meeting that need with a successful business plan is important, Brahmy says, but his goal is to make the world a more genuine and transparent place.
We live in a dangerous world. It's difficult to understand what's real and what's not. You're trying to make sense of that with Cyabra. Let's start with a high-level understanding of where we're at today and what some of the big challenges are that you decided to create Cyabra to solve.
I stumbled upon the mission statement of Cyabra four years ago very randomly and very luckily. I got to know a good friend of mine who served 13 years in the army within the Israeli Special Operations Command. He told me he had run a huge information warfare department within the Israeli forces. He said he would like to try to solve a huge issue: that there is no truth, there's only falsehood and propaganda. I said, "I don't really know what you're talking about, but it sounds like we're all going to be affected by this. Whatever we do, we have to do it together." I was not an expert in that field four years ago, but I got very lucky to be accompanied by my incredible co-founders, who come from this information warfare field.
We were able to look at the current state of the internet. When you think about it, there is no real transparency. When you and I are reading something, we do not understand if this has been propagated by a real, a bad, or a fake author. Are we facing someone with a specific intention, with a specific agenda? We need to understand the snowball effect.
When we created Cyabra four years ago, we said, "We're not doing fact checking." Because I think there's no genuine way of automating the process of fact-checking in an unbiased manner. But on the other hand, there might be a way to do what we may call the author checking and the propagation checking: are we facing a real, bad, fake person, and how much of a snowball effect is being created upon ourselves as the people consuming the information? That's been the mission statement for Cyabra. We're like a filtering mechanism for online conversations.
When most people talk about the internet, one of the first words that will come to their mind is the promotion of transparency. There is no filter. But your definition of transparency takes us one level deeper, the transparency behind not just the content itself but what is this content representing and whose view is it. We make assumptions that may not be true about the author and their intentions.
I think you nailed it perfectly. When I speak about transparency, I think there's a flip side to what you said: it's when we're looking at the really big social platforms out there. Our feeling and our experience make us feel like there are bad and fake people writing shit on the internet, and the lack of transparency means that social platforms, in a sense, are being compensated by those mechanisms of viral propagation. The people who are afflicted by this are us, the consumers. That's why we're trying to solve the problem. We're all on our phones typing, swiping, listening, and watching stuff all day long. So we're all being affected by this every second of the day.
Yes, of course we're affected. But who cares about the fact that there is no transparency and this viral propagation and the screwed up incentive system for the stakeholders? When you're looking to make a successful business and make a positive impact on a large scale, how do you rationalize through who cares about this problem?
We really are trying to make a dent and make the world a slightly better place or a more transparent and genuine place. We are not curing cancer. But, nevertheless, I could call what we're trying to solve an online illness.
So who cares about this? First of all, as people, we should care about who is trying to skew our opinions. My opinion can be reflected in what kind of cereal I'm going to eat this morning; tomorrow it could be about the midterm elections. We did an analysis of Johnny Depp and Amber Heard and the crazy volume of inauthentic activity surrounding them. It's really funny because it surrounds celebrities and athletes. There's some sort of misconception that disinformation and fake news might only be related to political parties and governments. But that is such a tiny part of the problem. The bigger problem is that those bad and fake actors have found a way to skew every decision and get entangled with everything we see, everything we consume.
What you're touching on is actually a real issue that should trouble everybody. I'm really curious, then, about how you make a business out of it. You need capital, and you need to show that this is actually something that can be very, very profitable.
First of all, the reason why we're here is because we were able to show the worthwhileness and the technological advancements, thanks to the fundraising that we've conducted. A lot of incredible investors are on our side and enabled us to build what we believe will become this authentic search engine for online conversations.
How do we make money? It's really easy. We sell a SaaS product, which works exactly that way, as a search engine. People pay for a volume-based subscription. We usually work with larger organizations: food and beverage, PR, the most consumer-oriented brands in the world, and some high-level public sector agencies, like the US State Department. We did not reinvent the wheel with the business model. We reinvented the wheel with the approach to what information authenticity sounds like and feels like.
For stakeholders, companies, different people who are supporting this mission, do they understand the potential?
Four years ago when we started the company, it was a blue ocean, but it felt like a blue pond. The need for education was so high. We were talking to people in early 2018 with no product, just knocking on doors. By mid-to-late 2018, people weren't laughing at us. They were saying, "Oh, yes. Fake news. Disinformation. But it's not really for us." Four years later, you see the shift in the conversation. We see people coming to us. We see CXOs of huge corporations coming to us and saying, "We understand that it's about to happen and that we are all susceptible to this crazy shift." Now people don't question the need; people ask for a solution before it blows up.
How do you think through, and even have the ability to create, these mechanisms and solutions for the different platforms and the different modes of communication: text, audio, video?
We started with the most common one, which is written content. We know that from a vision standpoint, acting as a filtering mechanism means that we need to have an answer for every type of medium and for every type of publicly available platform so that nothing falls through the cracks. For the last four years, we focused on written content.
The next step for us, within six to 12 months, is focusing on the transcription of visual content into written content. It's to say, "We do know that this is a picture, or a frame within a video, of Joe Rogan. And this is how it's combined with the written content talking about Joe Rogan." Audio is a whole new world; there's a lot of falsification around audio. We've just started the research on this. But I think you'll hear more about the audio solution from our perspective probably within the next 18 to 24 months.
Michael Matias, Forbes 30 Under 30, is the author of Age is Only an Int: Lessons I Learned as a Young Entrepreneur. He studies Artificial Intelligence at Stanford University, is a Venture Partner at J-Ventures and was an engineer at Hippo Insurance. Matias previously served as an officer in the 8200 unit. 20MinuteLeaders is a tech entrepreneurship interview series featuring one-on-one interviews with fascinating founders, innovators and thought leaders sharing their journeys and experiences.
Contributing editors: Michael Matias, Megan Ryan