
HELLO FUTURE: EPISODE 1: WHEN INFLUENCE BECOMES WEAPONIZED

The biggest risk in the digital age isn’t fake content — it’s mistaking coordinated manipulation for real public sentiment. In this episode, Kevin Cirilli speaks with Dan Brahmy, CEO of Cyabra, about how governments, companies, and institutions can tell when online discourse actually matters — and when it’s being artificially steered. The conversation explores how influence campaigns are designed to provoke overreaction, why leadership judgment is now as important as technical detection, and how trust has quietly become a form of critical infrastructure in an era of constant narrative pressure.

Meet The Future: https://mtf.tv/


See omnystudio.com/listener for privacy information.

Speaker 1 (00:09):
We're living in a world where influence has increasingly become weaponized.
The biggest risk in the digital age isn't necessarily fake content.
It's actually mistaking coordinated manipulation for real public sentiment,
jumping from URL into IRL, in real life. Hello Future,
it's me, Kevin. This is a dispatch from the digital frontier.
The planet is Earth, the year is twenty twenty-six,
and my guest today is someone who really lives on
the front lines in thinking about how governments, companies, and
institutions can tell when online discourse actually matters and when
it's actually just a bunch of bot trolls. His name
is Dan Brahmy, and he is the co-founder and
CEO of Cyabra. And Cyabra monitors all of this.
It does way more than just monitor it. Dan, welcome
to the program. Tell us what it does.

Speaker 2 (01:01):
Yeah, thank you for having me. Let me tell
you about the company. We established the company about
eight years ago with one single goal in mind, which
is to allow governmental institutions and brands around the world
to understand and restore trust and authenticity.
Because we see that everything we watch, see, and
read online has now become a little blurred,
and it's become much harder to tell what's real.
When you look at the informational landscape, it's chaotic,
it's confusing, it's vulgar, it's anxiety-producing. Whether it's
ordinary people like you and me, it affects one hundred
percent of the people you can think of.

Speaker 1 (01:45):
There's all this AI slop, and I mean, you
know. I was just on the phone with my
mother back home in Delco and she's saying, did you
see this thing on Instagram? This was before Lent; she
gave up Instagram for Lent. And I said, uh, Mom,
that's AI. That's AI. And so I can't even imagine,
Dan, what companies are dealing with and what governments are
having to deal with. And then when you look at
it through the prism of hyperpartisanship and polarization, I mean,
it's insane. I don't even know how one even thinks
to protect themselves from it at all, especially when these
campaigns are designed to provoke overreaction and prompt in-real-life,
IRL, responses. Can you give us an example, or
a case study of one that comes to mind, to
help people understand? Because a lot of people, I
don't think they necessarily understand how this fake bot stuff
is really being deployed in order to make an impact.

Speaker 2 (02:44):
I think, maybe to take it front and center: the
way that we look at informational chaos and this complex
world that we live in right now, it revolves around
three things: actors, behaviors, and content. Right, because you
spoke about what we are going to do about these
bots and these sock puppets and these avatars and these
spam accounts. That is the actor layer. It's when you
look at the source, when you look at the people,
the accounts that are generating something. Of course you've got
to understand what they are trying to achieve and what
their nature is, but it is one piece, one piece
of the bigger puzzle, the bigger picture. And then there
is the behavior: how are you propagating a message, how
are you propagating a narrative? As you can imagine, because
AI has made it very easy and accessible for people
to push something, it's become much easier to virtually amplify
and manipulate other people's opinions. And then the content. You
said something actually very interesting and very right, in my
opinion, which is that you've got to do something that
makes people tick, that makes people move uncomfortably in their
chair. If every single conversation is going to be, hey,
have you seen the new plants that I have in
my backyard, or is the sun yellow or orange these
days, that's not going to make people move too much.
But what if you sent something that's semi-true, semi-not-true? What
if you did something where the lines are blurred, where
it's not black or white but gray, in its very
extreme way of doing it? That's when we see it.
And so I'll give you an example. We have seen
many different use cases where brands that we've been working
with, or that came to us, were saying: last time
we checked, we've never had so much engagement about the
release of a product. You could be a video gaming
company, you could be a food and beverage company, for
all that matters. And they say: the last few years,
we've always been monitoring, and we kind of had a
grasp of the intention and the nature and how things
were really happening online. It could be on X, could
be on LinkedIn nowadays too, could be on Facebook, Instagram,
TikTok, you name it. But these days, the last twelve
months or so, we don't really understand how things are
being taken completely out of proportion. It's actually very telling:
the last few months we have had a lot of
conversations with the bigger Fortune 5000, where the chief
communications officers and the heads of PR crisis come to
us and say, not only do we not know who
these people joining the conversation are, we also don't understand
how they are able to shift entire narratives, move entire
communities of theirs from one point to another.
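
[Editor's illustration] The actor-behavior-content framing described above can be sketched as a toy scoring heuristic. This is purely illustrative, not Cyabra's methodology; every signal, threshold, and weight below is invented for the example:

```python
from dataclasses import dataclass

@dataclass
class Account:
    # Actor signals: who is posting
    age_days: int           # account age in days
    has_avatar: bool        # default avatars are a weak bot signal
    followers: int
    # Behavior signals: how they post
    posts_per_day: float
    duplicate_ratio: float  # fraction of posts that are near-copies of others'
    # Content signals: what they post
    provocative_ratio: float  # fraction of posts flagged as emotionally charged

def suspicion_score(a: Account) -> float:
    """Combine actor, behavior, and content layers into a 0..1 score.
    All thresholds and weights are made up for illustration."""
    actor = 0.0
    if a.age_days < 30:
        actor += 0.5
    if not a.has_avatar:
        actor += 0.25
    if a.followers < 10:
        actor += 0.25

    behavior = 0.0
    if a.posts_per_day > 50:  # inhuman posting tempo
        behavior += 0.5
    behavior += min(a.duplicate_ratio, 1.0) * 0.5

    content = min(a.provocative_ratio, 1.0)

    # Equal weighting across the three layers
    return round((actor + behavior + content) / 3, 3)

bot_like = Account(age_days=5, has_avatar=False, followers=2,
                   posts_per_day=120, duplicate_ratio=0.9, provocative_ratio=0.8)
human_like = Account(age_days=2000, has_avatar=True, followers=340,
                     posts_per_day=1.5, duplicate_ratio=0.05, provocative_ratio=0.1)

print(suspicion_score(bot_like))    # 0.917
print(suspicion_score(human_like))  # 0.042
```

Each layer contributes independently, which mirrors the point that the actor is only one piece of the puzzle: a convincing profile can still be caught by inhuman behavior or templated content.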

Speaker 1 (05:30):
You know, back a decade ago, you would sit
there and you would think, oh, it's just some loser
in a basement. And now, I mean, it might be
a loser in a basement, but it might also be
an algorithm from some hostile foreign government that's designed for
this. Or, candidly, it could be an algorithm at a
competing company that's designed to frame your product launch online
and to attack it for shareholder value.
I mean, I have so many follow-up questions.

Speaker 3 (06:02):
You've got something very important there, exactly.
That makes it even harder.

Speaker 1 (06:06):
Exactly. Is that illegal?

Speaker 3 (06:08):
I mean?

Speaker 1 (06:08):
How do you prove a defamation algorithm? You know?

Speaker 3 (06:11):
Right?

Speaker 1 (06:11):
I mean has there even been any cases around this?

Speaker 2 (06:13):
There are more and more. And I can tell you,
because we started building that infrastructural layer of trust and
authenticity a couple of years ago, that now you see
more and more lawsuits and legal procedures and actions getting
involved with that. Because people quickly realize: what if you
are in a merger and acquisition, if you're in an
M&A process? What if you lie about your numbers? And
in order to make it appear as a real data
point, what you do is amplify that story, that narrative,
through manipulated content. It could be a bunch of deepfakes
of customers saying, I love this product, I've used this
product, I want to live with that product for the
rest of my days. It could be an army of
ten thousand bots and, on top of that, fifty thousand
real people who trusted the ten thousand bots. Exactly, exactly.

Speaker 2 (07:15):
And listen, maybe I've lost a little hair here and
there in the last couple of years, but so far
I'm fine. What I can tell you is that this
is clarity that is absolutely needed in our market, and
we don't see that much of that clarity.

Speaker 2 (07:33):
And you're asking here, how are they doing it? Well,
I just want to say something important and incredibly clear
here. You can, at a very low cost today, use
software that allows people, organizations, or just, you know,
malicious entities and institutions, for a very low cost, to
falsify reach and engagement. You can amplify stories. I mean,
what if you went right now to the LLM engines,
be it ChatGPT, Claude, you name it, and you told
them, hey, can you create one hundred and twenty different
posts that I want to share over my blah blah
blah? Guess what: they won't be asking you what you
intend to do with this. Is this malicious? No, that's
not their job to ask you.

Speaker 2 (08:22):
So it's very easy to generate the content at that
scale. Right, remember the framework, the ABC. It's incredibly easy
to generate the content and make it look and feel
human, whether it's text or visual. And there are software
tools that can create accounts from the ground up and
help you maintain their presence and help you propagate their
messages over time. And the sad piece of that whole
conclusion, maybe, is the fact that I believe we need
people in companies, like you and me, to create the
awareness. Because maybe you and me are more aware than
most people, but my mother and your mother don't know
how to navigate that. And you know, we've seen studies,
and it's pretty interesting, actually. Teenagers nowadays, I'll find you
the study, I'll send it to you, it's fascinating: they
are much more aware of how to navigate and how
to use AI-driven solutions and technologies, but they are much
less educated at distinguishing between that and reality.
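
[Editor's illustration] The behavior layer discussed above, many accounts propagating near-identical messages, can be illustrated with a small near-duplicate detector. This is a toy sketch, not any real platform's or Cyabra's detection logic; the sample posts and the 0.5 similarity threshold are invented:

```python
def shingles(text: str, k: int = 3) -> set:
    """Word k-grams ('shingles') of a post, lowercased."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(max(len(words) - k + 1, 1))}

def jaccard(a: set, b: set) -> float:
    """Set-overlap similarity in [0, 1]."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def coordinated_groups(posts: list, threshold: float = 0.5) -> list:
    """Group indices of posts whose shingle sets overlap above the
    threshold. Naive O(n^2) clustering, fine for a sketch."""
    sigs = [shingles(p) for p in posts]
    assigned = [False] * len(posts)
    groups = []
    for i in range(len(posts)):
        if assigned[i]:
            continue
        group = [i]
        assigned[i] = True
        for j in range(i + 1, len(posts)):
            if not assigned[j] and jaccard(sigs[i], sigs[j]) >= threshold:
                group.append(j)
                assigned[j] = True
        if len(group) > 1:  # only report clusters, not lone posts
            groups.append(group)
    return groups

posts = [
    "I love this product and will use it forever",
    "I love this product and will use it for ever!",
    "honestly I love this product and will use it forever",
    "The weather in Delco is nice today",
]
print(coordinated_groups(posts))  # → [[0, 1, 2]]
```

Slightly reworded copies of the same message cluster together while the unrelated post stays out, which is the kind of "propagation" signal that distinguishes a pushed narrative from organic chatter.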

Speaker 1 (09:33):
Here's why I'm hopeful: because I think of you, sir,
as a firefighter. I think of you as a digital
firefighter, and this whole industry, it is. I mean, you
know, if the house is burning, you need the firefighter
to come put it out. And this is why I'm
an optimist, because, honest to God, Dan, I just had
a Meet the Future moment as you were talking. Because,
okay, what Cyabra does is essentially, you're the fire engine,
you're the firehouse, I mean, you're the fire station. But
the government doesn't even have a fire station for how
to put any of this stuff out. And I don't
want to get political, but I will say this for
all those people who tell me, Kevin, you're so naive,
AI is going to steal all of these jobs, AI
is not going to create any jobs. Dan, your job
didn't exist ten years ago.

Speaker 2 (10:28):
It didn't. I don't think it existed. If you're asking
me, I think we created our company too early, and
it was eight years ago. And that's the point. You
know, we're having this conversation at such an important point
in the history of our company, because literally today we
rebranded ourselves. We changed visually, in the positioning, in the
posture and the messaging, ahead of the NASDAQ listing that
we have in a couple of weeks. But the point
that I'm trying to make is that the reason why
we rebranded ourselves today is because we were still speaking
in the same terms from eight years ago, when we
had to educate people, when we had to convince them,
when we had to tell them: you know, we're not
in the business of, like, fake news. That's not the
point. We're not in the business of Pizzagate in twenty
sixteen. No, it's touching everything, it's touching every topic. But
guess what, that was twenty eighteen. Today, twenty twenty-six, I
do feel, in a very humble way, and now I'm
saying this, that we have much less of a job
of education and much more of a job of implementation.
Because it's here, whether you and I like it or
not. It's already part of our lives every day, every
single thing that you watch right now, every single comment,
reply, tweet, post, everything. And you know, it's actually very
funny. Maybe I could take you to another, what did
you call it, Meet the Future moment?

Speaker 2 (11:55):
Is that what you call it?

Speaker 1 (11:55):
That's what I call it. And here's the thing, which
is your job: the public gets cyber protection, but they
don't get this kind of protection, the troll protection. They
get it, but the businesses have to do it. And
you've got to do it, I think, for your kids
if you have kids. So it factors down to the
family unit as well as corporate America. Go ahead, and
this will be the last thing, but come back on
the show, because I have another question for you as
well.

Speaker 2 (12:22):
One hundred percent. But I'm saying, you know, maybe I
have another one, maybe a future moment for you and
me here, and I'll tell you what it is.

Speaker 2 (12:30):
I think that in the next two, three years, we
will be living in a world where the manipulation will
be so high, so much a part of our day-to-day,
so easy, so accessible, that a few years from now,
down the line, people and organizations will not reward finding
the inauthentic and the manipulated. What we will reward, as
people and institutions, is relying on the genuine and the
authentic. Think about that, about understanding what I just said.
I mean, think about the fact that today we're talking
in terms of, I know all these bots, I found
all these propaganda machines, all these malicious and fake narratives.
Sure, that's awesome, that's wonderful.

Speaker 1 (13:26):
I just want a human. I just want to talk
to a human, exactly.

Speaker 3 (13:30):
Right, it's the fire extinguisher telling you this.

Speaker 2 (13:32):
But I think that two, three years from now, the
entire world will flip on its head and say: unless
you claim it back, unless you present it, unless you
show the proof, we will not reward you with our
trust.

Speaker 1 (13:46):
We'll be testing for that. You have to fill out
that form, "I am not a robot," and then they
give me these tests and they're like, identify the image:
where's the parking, where's the traffic light? I don't know.
And they don't even train you how to take that
test, because you don't know if you should click the
square where the traffic light is, one square or two,
and I'm like, well, that's technically not the traffic light,
it's only the pole of one. And I'm thinking to
myself, this is dumb. God bless Dan Brahmy, CEO and
co-founder of Cyabra, which, folks, they're the firefighters against
the trolls, and they help businesses navigate it. Congrats on
the rebrand. Come back on the program; I want to
follow that last thread. And remember, folks, you can get
all of the latest Hello Future episodes on the iHeartMedia
app. Just download it on your phone or wherever you
get your podcasts, and sign up for the Meet the
Future newsletter at MTF dot tv. Have a great
tomorrow, today.
